A similar problem applies. Many local libraries have Perl books which are ten or more years old and therefore misleading.
Beginners are the most likely to be misled, so I decided to check the availability of "Learning Perl", which is about as basic as it's possible to get.
I checked Portsmouth, Hampshire, Essex and Surrey area libraries (not little local libraries). Portsmouth and Essex had the 3rd edition, Hampshire didn't have it and Surrey had the 5th edition.
The solution turned out to be simple. It was just a question of asking. I asked a member of each library to request the 6th edition. (I didn't bother with Surrey because the 5th edition is probably OK.) In each case the library agreed to put it on order.
I was also able to read the ticket in the Portsmouth copy and verify that it was being issued a couple of times a year, and that updating it is therefore worthwhile.
I don't have any more contacts so I can't request books in any more areas. It would be useful if some other people would request a copy in their area.
This post is about web-based material. Perhaps the answer there is also as simple as asking the hosts to update their sites.
That said, here in Australia most public libraries don't include computing books any more as they suffer an 80%+ theft rate and are outdated too quickly for their cost.
When I was at university, I think they had a standing order for all O'Reilly releases (and those of some other publishers), as they always had the latest books across most categories. I was impressed that they had current editions of BSD books.
PerlMonks is a messy site, though. I've tried to get into it, I really have... so even if it were well SEO'd, it's quite an eyesore.
How about a hackathon to improve some of the sanctioned websites and knit them together more? And perhaps tie them to social media?
Thanks to everyone for the responses. I'll be responding more in detail to each of you tonight.
Meanwhile I'm also thinking about how to resolve this problem. The short of it is: there are tutorials out there, but not all of them are fit to serve as "The Tutorial". I'm thinking about setting up a "Perl Tutorial Hub" site, which would be a wiki linking to and describing other tutorials, including information on how up to date they are. Additionally, since it'll be a wiki (probably ikiwiki), it'll be a prime site on which to build "The Tutorial".
I've already sent out an email to maybe get that onto http://tutorial.perl.com, but I'll have to see how things pan out.
Assuming something like this would be of interest to the Perl community, what would be the best target for links?
I'll start: http://earlruby.org/learn-perl/
Thank you very much!
A how-to listing ways to learn Perl, by Earl C. Ruby III.
Question: does Google have any mechanism for SEO that includes a way for web admins to flag their own sites as "historical" or "outdated" or anything like that? Just curious.
I'll be happy to click on a "promoteperl.com" link or do anything else I can to help Google get their results straightened out. Thanks for bringing this to our attention.
-MC
For example, learn.perl.org doesn't actually use the text "Perl tutorial" on the front page, which is bad if people are searching for a "perl tutorial" rather than trying to "learn perl".
"Perl tutorial" is tucked into a keywords meta tag, but (using an SEO checking tool) "tutorial" is the second most popular term on the Leeds site, with "perl tutorial" the most-used two-word phrase. The "learn" site, in contrast, concentrates on "learn perl", "perl learning" and "install perl".
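To illustrate the difference, here is a minimal sketch (hypothetical markup, not learn.perl.org's actual source) of the phrase buried in metadata versus appearing in visible, indexable content:

```html
<!-- Weak: "perl tutorial" appears only in a keywords meta tag,
     which search engines largely ignore for ranking. -->
<head>
  <meta name="keywords" content="perl tutorial, learn perl">
  <title>Learn Perl</title>
</head>

<!-- Stronger: the phrase appears in visible page text. -->
<body>
  <h1>Perl Tutorial</h1>
  <p>A Perl tutorial for beginners: install Perl and write
     your first script.</p>
</body>
```

The point is simply that the ranking signal comes from the visible text and headings, not from the meta tag.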
So while "learn" has a PageRank of 7 compared to "Leeds" at PR 5, it is very unlikely to turn up in results for "perl tutorial" because that isn't the term it is targeting. Try "learn perl" instead and you'll see it right away.
I am by no means an SEO expert, but it seems that if Google Trends says "Perl tutorial" is what people are searching for, then that is the phrase the "learn" site should be using.
(and of course this applies to any of the other great tutorial sites out there)
Yahoo's bots do look at it, but it doesn't count toward perceived relevancy.
Why? GitHub is an out-of-date snapshot.
…is keyword. So the following reads nicely:

    has $shopping_list is built_by("_build_shopping_list");