CPAN Testers Summary - July 2011 - Permanent Waves
957,044 ... that's 957,044
That's the number of report submissions we saw during the 31 days of July! It's the biggest monthly total we've ever had. Just over 40,000 more reports and we would have broken the 1 million barrier. Considering it took nine and a half years to reach our first million milestone, the fact that we're now seeing nearly 1 million a month is just staggering. I've stopped posting about passing each million mark, as it's becoming too frequent. You'll have to wait for the 20 millionth report (expected around Christmas 2011 at the current rate) for the next notable post in that regard.
All these new submissions have had one significant effect: they have exposed our stress points. While I had already been working on improving many aspects of the report builder process, I hadn't been looking at the generator side of things, which takes a feed from the Metabase and inserts the reports into a queue for the builder. The generator started to fall behind, and thanks to the great Devel::NYTProf, I ran several profiling passes over the generator code to see where we were exposed. The two noticeable pain points were the Metabase feed and the database communication.
Although we were already aware of it, the current level of submissions has highlighted how poorly Amazon's SimpleDB copes with the volume of throughput we now have. As a consequence, David Golden has been working on redesigning the Metabase to use a NoSQL database that scales much better, to cope with future volumes. In the coming months we'll keep you updated on the progress, and I'm sure David will be posting about significant challenges on his own blog. In David's own words: "I've written some non-Amazon back-end classes and will be testing them at scale over the next week or two. If all looks good, I plan to switch over the servers and migrate the legacy data."
The database communication highlighted several areas where many individual requests were made, when a single large request, with the results cached, would be more efficient. There are still more optimisations to be completed, but the changes I've made have already improved performance. In addition, the database has been reconfigured and updated, which has also helped.
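The kind of change involved can be illustrated with a short sketch. The actual generator is written in Perl, so this Python/SQLite version is purely illustrative, and the table and column names are invented: instead of issuing one query per report, fetch every row needed in a single `IN (...)` query and keep the results in a cache.

```python
import sqlite3

# Toy stand-in for the cpanstats database (schema invented for illustration).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE reports (id INTEGER PRIMARY KEY, state TEXT)")
db.executemany("INSERT INTO reports VALUES (?, ?)",
               [(1, "pass"), (2, "fail"), (3, "pass")])

wanted = [1, 2, 3]

# Before: one round trip to the database per report.
slow = {rid: db.execute("SELECT state FROM reports WHERE id = ?",
                        (rid,)).fetchone()[0]
        for rid in wanted}

# After: one large request, with the results cached in a dict for reuse.
placeholders = ",".join("?" * len(wanted))
cache = dict(db.execute(
    f"SELECT id, state FROM reports WHERE id IN ({placeholders})", wanted))

assert slow == cache  # same data, one round trip instead of N
```

The win grows with the number of reports per batch: each eliminated round trip saves network and query-parsing overhead, which matters when the builder is processing thousands of reports an hour.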
We're currently seeing a 2-3 day lag in the Metabase feed, which is slowly catching up, and the reports builder is now roughly 24 hours behind. It's going to be some time before we are back to near-realtime, but in the short term we should be able to have all pages displaying their reports within a day of their being posted. So that you can tell whether your report should have appeared yet, the Report Status page has been updated. The latest report is the most recent report added to the cpanstats database from the Metabase feed, while the oldest page request tells you how old the queue for page building is (note that not all requests are report submissions). Use both to gauge whether a particular report may still be waiting to be added to the appropriate page.
In other news, I happened to be reading Google+ the other day and spotted a post by Paul Johnson (creator of Devel::Cover), who was able to fix a problem with Devel::Cover thanks to some reports by CPAN Testers. Paul has allowed me to reprint his post:
"CPAN Testers was throwing up a few failures for Devel::Cover. The problem was in a fairly convoluted test whose results were dependent on the number of perl modules being loaded, and that in turn depended on the particular configuration of perl being used.
(The module which might or might not be loaded was Tie::Hash::NamedCapture, which provides the behaviour for %+ (new in 5.10), and is used depending on whether the coverage database is being stored using Data::Dumper or JSON, and that depends (before 5.14) on whether or not a JSON module has been installed.)
I thought I had fixed the problem during the Perl QA hackathon in Amsterdam a couple of months ago, but it turns out that I hadn't. The reason was that the API that I had in place (such as it is) for massaging the test output wasn't quite as comprehensive as I thought it was. So I made it more comprehensive, and now I'm expecting far fewer CPAN Testers failures for the next release.
Big thanks to all the CPAN testers for showing me the bugs in my code."
Thanks to Paul for a great example of how CPAN Testers can help. If you have any stories of how CPAN Testers has helped you, please let me know or post on your blog and send me a link.
And finally... YAPC::Europe will be happening 15-17 August, and two specific CPAN Testers talks are to be featured. The first, "Smoking The Onion - Tales of CPAN Testers", will look at some of the hidden corners of CPAN Testers, which can hopefully improve the experience for both authors and users. The second talk, "How CPAN Testers helped me improve my module" by Léon Brocard, takes a look, from an author's perspective, at how you can use CPAN Testers to help improve your modules. There are several other Test/TAP related talks too, so if you're going, I hope you can make it along to some of the talks. See you there.
Cross-posted from the CPAN Testers Blog.