As I have previously posted here, YAPC::Asia Tokyo is the world's LARGEST YAPC. Last year we brought 841 people together for the two-day geek-fest. This year we're doing it again at the lovely Keio University Hiyoshi Campus.
Why not come and mingle with the Japanese Perl hackers? If you use tools like Plack, Test::TCP, Parallel::Prefork, Mouse, plenv, etc., then you've got to come talk to their authors! Meanwhile, we'll be even more delighted if you choose to give a presentation. Please submit your talk proposal here.
Should you need more info/help, please contact @lestrrat on twitter. I'll be more than happy to help you get to this conference!
In Ejecting CGI.pm From the Perl Core, chromatic discusses the pending deprecation of a storied module. I want to address a perhaps overlooked aspect of the argument.
Nearly all web UIs are written to support small user groups inside some kind of organizational infrastructure.
Very few websites are moderate to large; most are small, with maybe a few dozen to a hundred or so users. Most are tools that provide some internal organizational function, not media sites serving thousands of viewers or the general public. On the complementary side, most Perl users have some other axe to grind: web development is at best a side interest, or maybe a self-defense skill they picked up out of a need to automate away some part of their job.
I have loved Moose for some time now, but like others, I disliked how heavy it was. Then Moo came along and it was great … until I found myself not availing myself of the type system, because it was harder to get to than in Moose. I was skipping validation more and more.
A recent project came up and I really wanted to do it right, so I tried Toby Inkster’s new Type::Tiny, and may I say, “hats off to you, sir!” The combination of Moo and Type::Tiny brings that Moosey feeling back, while still being light and responsive and even fat-pack-able! Great work to all involved in both projects!
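A minimal sketch of what the combination looks like (the class and attribute names here are my own, not from the post): Type::Tiny constraints from Types::Standard plug straight into Moo's `isa`.

```perl
package Counter;
use Moo;
use Types::Standard qw( Int Str );

has name => (
    is       => 'ro',
    isa      => Str,    # a Type::Tiny object, used directly as the constraint
    required => 1,
);

has count => (
    is      => 'rw',
    isa     => Int,     # rejects non-integers at construction and via the writer
    default => 0,
);

package main;
```

Passing `count => 'oops'` to the constructor now throws a type-constraint error instead of silently storing bad data, which is exactly the validation I had been skipping.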
There is a reason large software companies provide free or relatively cheap licenses for their super-expensive software to students.
When these people finish school and look for a job, they will already know how to use that super-expensive software and not the other one.
So when a company hires them, either it lets the person use this super-expensive program, or it needs to train the person to use the other one. Even if the other one is free of charge, the training cost and the time it takes to get up to speed in that other tool mean that most companies will opt for the software that has a larger pool of knowledgeable users.
With programming languages there is a similar effect.
Yesterday I came across another Perl-related site, called PrePAN, that asks me to log in with some other site's login information. I've seen this before on MetaCPAN and Play Perl and it makes me very uncomfortable. It goes against the security advice that I've been given for decades.
I’m sure most of the readers of this blog will have seen that both Module::Build and CGI.pm are up for removal from the Perl core. I thought I would toss my $0.03 (inflation) in on the matters.
Stratopan is a slick new service for hosting custom CPAN-like repositories in the cloud. I've been doing all the development work for Stratopan on my laptop computer. But the other day, I decided to try running it on the Linode server I rent.
So I logged in to the (nearly pristine) server, fired up cpanm to install all the prerequisite modules, and launched the application. Lo and behold, it was broken! Read on to find out how Stratopan actually saved me from hours of debugging pain...
Today I was looking at an old automated Perl test based on Test::WWW::Mechanize. It was testing a complex form. For each of about 10 test cases, it loaded the form (with a get_ok() call) and then submitted the form with a variety of input.
Now that we run about 25,000 tests in total in the test suite, I’m always looking for ways to speed up the tests. HTTP calls are relatively slow, so systematic ways to slim them down are attractive.
In this case, I found there was a simple change that I could apply that sped up the particular test by about 28%.
Each time the test was calling get_ok() on the page, it was getting back the same result, which is wasteful. I refactored it like this:
my $base_mech = Test::WWW::Mechanize->new;
$base_mech->get_ok($page_to_test);
Then, everywhere we had a get_ok() call to load the page again, I replaced it with this:
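The replacement snippet isn't reproduced here; one plausible version of the refactoring (my sketch, assuming the goal is to reuse the already-fetched page rather than issue another HTTP GET) is to start each test case from a clone of the base mech:

```perl
use Test::WWW::Mechanize;

my $page_to_test = 'http://localhost/form';    # hypothetical URL

# Fetch the page once into a base object...
my $base_mech = Test::WWW::Mechanize->new;
# $base_mech->get_ok($page_to_test);           # one real HTTP call, up front

# ...then begin each of the ~10 test cases from a copy instead of
# calling get_ok() again. clone() copies the mech's state without
# another HTTP round trip.
my $mech = $base_mech->clone;

# ...fill in and submit the form with $mech as before...
```

Whether clone() preserves everything your form-handling code needs (page content, history) depends on your WWW::Mechanize version, so verify the refactored test still exercises what it did before.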
I'm at a conference in Portland this week to learn about a web framework called Symfony2 for my $DAYJOB. The Symfony2 stuff is all co-located with DrupalCon 2013 and one of the rooms across the hall from our workshop was for Drupal sprints. The sign in the picture was posted outside - and it makes it super clear that newbies are welcome, and who they should try to find to get going on a task quickly.
This is an idea we need to steal for Perl hackathons.
CPAN authors should look at the smoke tests for their modules to ensure that they're passing on Perl 5.18. The hash randomization change (and a few others) has bitten many a module that may currently be relying on undefined hash behavior. If you haven't checked your modules recently, you may be in for a surprise.
The problem is significant, especially when compounded by the fact that a single misbehaving module can block many more modules that may depend on it.
As an example (not to finger-point, but just to illustrate): Crypt::DES, Crypt::IDEA, Crypt::Blowfish, and Crypt::Twofish had a single line of XS code that was incompatible with Perl 5.18 (which happened to not be related to hash randomization). As a result, all modules that depended on any of these modules also would fail to install on Perl 5.18. This includes Authen::Passphrase, and Crypt::OpenPGP, as well as dozens of others.
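A minimal illustration of the non-XS kind of breakage (my own example, not code from any of the modules named): anything that assumes keys() returns a stable order passes by accident before 5.18 and fails intermittently after it.

```perl
use strict;
use warnings;

my %config = ( alpha => 1, beta => 2, gamma => 3, delta => 4 );

# FRAGILE: under hash randomization (Perl 5.18+), this order can
# change on every run of the program.
my @unordered = keys %config;

# ROBUST: impose an explicit order instead of relying on hash internals.
my @ordered = sort keys %config;
print join( ',', @ordered ), "\n";    # alpha,beta,delta,gamma
```

Test suites that compare serialized hashes, or golden files generated from hash iteration, are the usual victims; sorting at the point of output is the usual fix.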
Long answer - no, but (a) Git is a powerful distributed source control system for developers and (b) Git's underlying object database is a powerful, fast database (faster than "cp -a" by some accounts). (It might be your next filesystem, though -- read the linked article.)
Git's object database sacrifices some space for ease of manipulation, as each object is a file referred to by its SHA-1 hash. These objects are in a 2-level store, so an object A4F272058... will always be found at A4/272058... This is both fast (as mentioned before) and easy to debug and manipulate. And it is fast (worth mentioning twice), to the point that "git checkout" can be faster than "cp -a" in at least some circumstances.
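The addressing scheme is easy to reproduce yourself: a blob's ID is the SHA-1 of a small header plus the content, and the first two hex digits become the directory name. A sketch using core Digest::SHA (note the on-disk loose object is additionally zlib-compressed, which this skips):

```perl
use strict;
use warnings;
use Digest::SHA qw(sha1_hex);

my $content = "hello\n";

# Git hashes "blob <size>\0<content>", not the raw content.
my $sha = sha1_hex( "blob " . length($content) . "\0" . $content );
print "$sha\n";    # ce013625030ba8dba906f756967f9e9ca394464a

# Loose objects live in the 2-level store: the first two hex digits
# are the directory, the remaining 38 are the file name.
my $path = ".git/objects/" . substr( $sha, 0, 2 ) . "/" . substr( $sha, 2 );
print "$path\n";
```

You can confirm the hash with `echo hello | git hash-object --stdin`, which is handy when debugging a repository by hand.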
Welcome to Perl 5 Porters Weekly, a summary of the email traffic of the
perl5-porters email list.
The topic of the week on P5P was the clean up and release of perl 5.18.0
which was released on 5-18 for North Americans. RJBS, in a separate email,
said blead would be reopened for patches on Tuesday, as 5.19.0 is scheduled
to be released on Monday (May 20).
I'm quite enjoying Test::Class::Moose. It's very easy to use, and it gives you such fine-grained control over your test suite and such powerful reporting capabilities that it's turning out to be far more powerful than I had expected. It's easy enough for beginners, but power users will really appreciate it. There was, however, one major issue I had with it, and it stems from a habit I picked up from Test::Class.
For those who are very familiar with using Test::Class (or if you've read my Test::Class tutorial), you may be used to seeing a base class that looks like this:
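The base class itself isn't reproduced here; a typical Test::Class base class of the kind the tutorial describes looks roughly like this (my reconstruction, not the author's exact code):

```perl
package My::Test::Class;
use strict;
use warnings;
use base 'Test::Class';
use Test::More;

# Fixture methods defined here are inherited by every test subclass,
# so shared setup lives in one place.
sub db_setup : Test(setup) {
    my $self = shift;
    # e.g. reset fixtures before every test method runs
}

package main;
```

A driver script (or an `INIT { Test::Class->runtests }` block in the base class) then runs every loaded subclass, which is the habit that Test::Class::Moose handles differently.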
Nobody understands augment/inner properly, and the whole idea is broken.
The authors of these statements are intelligent, experienced programmers. You can find similar statements all over the web, also made by intelligent, experienced programmers. It certainly sounds like this augment thing is a pretty terrible idea. So why all the hate for augment?