The sad story of pseudohash criticism
I just had to endure MJD's horrible pseudohash explanation at the Pittsburgh Workshop, billed as "A new, never-before-seen talk on Perl's catastrophic experiment with 'pseudohashes', which wasted everyone's time for nine years between 1998 and 2007".
https://www.youtube.com/watch?v=-HlGQtAuZuY
Watch it; you can fast-forward through it. I honestly had a higher opinion of Marc-Jason.
So let's see what's wrong with this popular but uninformed pseudohash criticism:
The main points are that storing a hash in array slot 0 for run-time lookup is too complicated, that the exists and delete ops now also need to check for arrays / pseudohashes, and that all the pseudohash checks slowed down general hash usage by about 15%, which basically cancelled out the advantage of the faster compile-time array lookup on those pseudohashes, which was also ~15%.
package Critter;
use fields qw(NAME TYPE);
my Critter $h; # compile-time optimization: href NAME => aref 1
$h->{NAME}; # ==> $h->[1]
but:
$key = "NAME"; # defer to run-time lookup of href in aref 0
$h->{$key}; # ==> $h->[ $h->[0]->{$key} ]
So by allowing the slow run-time access, you need to preserve the hash semantics of the array. Still, the compiler knows the type of $h and can still compile that access to a hash lookup of $key in aref 0, i.e. $h->[ $h->[0]->{$key} ].
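For reference, here is a minimal runnable sketch that emulates that layout by hand (pseudohash support is long gone from perl, so the structure is built explicitly; the field values 'Fido' and 'dog' are made up for illustration):

use strict;
use warnings;

# hand-built emulation of the old pseudohash layout: slot 0 holds the
# name-to-index hash, the remaining slots hold the field values
my $h = [ { NAME => 1, TYPE => 2 }, 'Fido', 'dog' ];

# constant key:  $h->{NAME}  ==>  $h->[1]
print $h->[1], "\n";                     # Fido

# computed key:  $h->{$key}  ==>  $h->[ $h->[0]->{$key} ]
my $key = 'TYPE';
print $h->[ $h->[0]->{$key} ], "\n";     # dog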
Same problem with exists and delete.
exists $h->{NAME} with a constant key is constant-foldable at compile time to YES or NO.
delete $h->{NAME} needs to store a sentinel in aref 1, just as real hashes mark deleted entries internally. This only slows down array access for pseudohashes; it should not slow down hash access, or array access for ordinary arrays.
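Here is a sketch of that idea on the same hand-built layout; the sentinel is a stand-in of my own choosing, not whatever marker perl actually used internally:

use strict;
use warnings;

my $h = [ { NAME => 1, TYPE => 2 }, 'Fido', 'dog' ];

# a dedicated reference as the deletion marker (illustrative choice only)
my $DELETED = \my $marker;

# delete $h->{NAME}  ==>  mark slot 1 and hand back the old value
my $old = $h->[1];
$h->[1] = $DELETED;
print "delete returned: $old\n";         # delete returned: Fido

# exists with a constant key could be folded at compile time from the
# field list; with a computed key it degrades to a slot-0 lookup plus a
# sentinel check:
my $key = 'NAME';
my $idx = $h->[0]->{$key};
my $there = defined $idx
         && defined $h->[$idx]
         && !(ref $h->[$idx] && $h->[$idx] == $DELETED);
print $there ? "still there\n" : "deleted\n";   # deleted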
Of course this was not how it was implemented. In good old perl5 fashion, $h was kept as a hash, and all the hash ops were extended to check for pseudohashes at run-time. Yes, at run-time, in the ops.
What should have been done instead was either to reject the pseudohash optimization whenever a run-time key was parsed, maybe with a warning under use warnings, or, if you really don't want to punish the bad behaviour of mixing computed keys with explicitly requested compile-time keys, to compile $h to arrays and not to hashes.
As I said before, perl5 is just badly implemented, but still fixable. use fields could still be a hint for the compiler to turn the hash into an array.
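To make that concrete: use fields already records a per-class field table, so the compiler has everything it needs to go straight to array indices for both access forms. A minimal sketch, assuming fields.pm still keeps its historical 1-based %FIELDS numbering; the compiled-to forms in the comments are my illustration of the proposal, not anything perl ever emitted:

use strict;
use warnings;

package Critter;
use fields qw(NAME TYPE);    # records %Critter::FIELDS = (NAME => 1, TYPE => 2)

package main;
# the declared type is known at compile time, so both forms could be
# compiled to plain array ops, with no per-object hash in slot 0:
#   $h->{NAME}  ==>  $h->[1]                          (constant key, folded)
#   $h->{$key}  ==>  $h->[ $Critter::FIELDS{$key} ]   (computed key)
my $key = 'TYPE';
print "NAME => index $Critter::FIELDS{NAME}\n";    # NAME => index 1
print "$key => index $Critter::FIELDS{$key}\n";    # TYPE => index 2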
Just don’t expect anything from the current maintainers and the old-timers.
Horrible talk.
At the risk of jumping into a battle I barely understand, my understanding is that the talk was criticizing and lamenting the loss of time and productivity that could have gone to other efforts (perhaps even to your suggestion). I don't think MJD was saying that pseudohashes were doomed to failure, just that the experiment ended in failure. I don't see why that opinion causes him to be labelled uninformed.
Maybe someday the extent of your feigned respect will even go so far that you'll bother to spell his fucking name right.
To be fair, mjd is often purposefully provocative and he commits to his idea to get people to think deeply about what they are doing. However, I think he's fairly well behaved in this talk and I don't see the same thing you do.
I think the key part of his talk is that comparing relative speeds of two things in a single perl release isn't the right thing to do if they are both slower than one of the things in a previous release. His philosophical point merely uses this piece of Perl history.
But then, I've given some pretty stupid talks I regret too. :)