CPAN Testers Summary - June 2012 - Seventeen Seconds
MDK is the man! A big thank you to Mark Keating for a great post about how to donate to the CPAN Testers Fund. I have received some feedback about how to make the fund and the donation process a little more prominent on the websites, and I plan to address that in the coming months. I have also been very encouraged by some of the feedback, and hopefully we shall see more donations and sponsorship in the coming years too. Aside from asking your company if they can donate, or writing on your blog about how CPAN Testers have helped you, you might, if you're so inclined, want to add a note to your README or POD telling users how they can donate. There are plenty of other funds you might want to advertise too, so don't feel restricted to the CPAN Testers Fund. CPAN Testers can also benefit indirectly from the other funds, so it's all good.
A post I forgot to mention last month was the news that CPAN Dependencies now accommodates META.json files in distributions. With the move to use META.json files for version 2 of the Meta Specification, a number of distributions have started adding the file. Some have included both a META.json and a META.yml, but some are now only releasing distributions with a META.json. This change to CPAN Dependencies now means that these distributions can once again be included in the calculations. Thanks to Dave Cantrell for the update.
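If you want to inspect a distribution's prerequisites yourself, here is a minimal sketch of reading the metadata with the CPAN::Meta module, preferring META.json and falling back to META.yml. This is just an illustration of working with the version 2 metadata, not how CPAN Dependencies itself is implemented.

    use strict;
    use warnings;
    use CPAN::Meta;

    # Prefer the version 2 META.json, falling back to the older META.yml
    my ($file) = grep { -e $_ } 'META.json', 'META.yml';
    die "No META file found\n" unless $file;

    my $meta = CPAN::Meta->load_file($file);

    # List the runtime requirements declared by the distribution
    my $reqs = $meta->effective_prereqs->requirements_for('runtime', 'requires');
    my $versions = $reqs->as_string_hash;
    printf "%s => %s\n", $_, $versions->{$_} for sort keys %$versions;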
Another post I should have mentioned last month, but which I only became aware of after that summary went out, was from Vyacheslav Matjukhin. Although his post wasn't particularly related to CPAN Testers, it does make an interesting discussion point: automated testing isn't the only avenue for testing. The automated tests provided in your distribution are only part of the whole picture. Documentation is important, and while we can test that it is correctly formatted, we can't test whether it's understandable. Nor can we test whether the usage and results of the code make sense to another user. You can write tests that confirm the code behaves as you expect, but if a user of your distribution struggles to understand the arguments and results, that doesn't help them. Without a doubt Validation::Class has benefited from the feedback, and while the author and reviewer might not agree on all points, it was a very worthwhile exercise. It's not always easy to get such feedback, but if you can, it is well worth taking advantage of it.
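The formatting side is the easy part to automate. A minimal sketch of a POD check, assuming Test::Pod is available, might look like this in a distribution's t/pod.t:

    use strict;
    use warnings;
    use Test::More;

    # Skip gracefully on systems without a recent Test::Pod
    eval "use Test::Pod 1.22";
    plan skip_all => "Test::Pod 1.22 required for testing POD" if $@;

    # Check every POD file in the distribution for formatting errors
    all_pod_files_ok();

Of course, that only proves the POD parses cleanly; it says nothing about whether the documentation actually makes sense to a reader, which is where human feedback like the review above comes in.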
On the mailing list, Nigel Horne highlighted an issue with the Leaderboard: his numbers didn't appear to be adding up correctly. This was partly due to the trawl through the Metabase to find the missing reports that Amazon had failed to send through, but also due to an issue with the way the leaderboard is generated. The code has now been rewritten, and a new database table created, to better manage all the counts. As of now, the leaderboard numbers should be much more accurate, and should increment as expected. During this process I took the opportunity to map some addresses, and got through 105 mappings, 44 of which were brand new names to the system.
Next month sees YAPC::Europe 2012 taking place in Frankfurt. There are a few testing related talks planned, but I'll cover those in next month's summary. The deadline for talk submissions is 15th July, so there are still a few days left to get yours in. Hope to see you there.