Leaderboards & Low Hanging Fruit
Back in May there was some discussion on the CPAN Testers Discussion list regarding the statistics available on the CPAN Testers Statistics website. As I was already in the process of cleaning up several pages, some of the ideas looked worth including in the next release. Two have made it, and are now available for those interested in the reporting statistics.
The first is a change to the way the leaderboards are created. We still have the master Tester Leaderboard, but now in addition to that we have leaderboards for each Operating System. It was highlighted that new testers might find it hard to get motivated to be a high-volume tester, as reaching the dizzy heights of the all-time top ten takes a lot of effort. However, when looking on a per-OS basis, it becomes a lot easier to see firstly where a tester can make a significant impact, and secondly which OSes we desperately need reports for. On the Monthly Operating System page it is reasonably easy to see which OSes are getting the most attention, but previously we didn't highlight who was making the contributions or help the less popular OSes gain more reports. Hopefully with the new leaderboards, we can highlight the efforts of some of these testers.
Initially we did think that the new leaderboards might highlight the need for more Windows and Mac OS X testers, but as those leaderboards show, we do already have a steady stream of reports, with the likes of Serguei Trouchelle and Florian Helmberger providing some great coverage of those OSes, although we're happy to have more testers contributing too. However, looking at some of the more specialised OSes, we could do with a few more reports for AIX, Dragonfly BSD, HP-UX, Haiku and VMS (which includes OpenVMS). Although Chris Williams does have Dragonfly BSD well in his sights, it would be good to get a variety of platform environments submitting reports. So if you're looking for an OS to make an impact with, please take a look at the new leaderboards and see what you can help with.
Our second change has been the inclusion of a No Reports page. This page lists all the distributions uploaded to CPAN whose latest version has not received any reports. There are several reasons why a distribution may not have generated any reports, the most common being that it appears on one or more ignore lists. We have listed some of the ignore lists from our high-profile testers on the page, so if one of your distributions is listed, take a look and see whether it's been added to an ignore list. If you believe it shouldn't be on a tester's ignore list, perhaps because you've now resolved the issue that prevented it being tested, please post to the CPAN Testers Discussion list and let the testers know they can now test your distribution.
The list of distributions on the No Reports page is currently being refined, as some distributions do not apply. Distributions that have been repackaged under new names, or are a collection of scripts or data files, will eventually be removed. If you spot any distribution that shouldn't really be there, please let me know.
However, several distributions do deserve to be highlighted on the list, as they are often badly packaged. Although some have been left on the shelf and should perhaps have been deleted from CPAN, many have been uploaded without test files, meta files or suitable make/build files. In some cases the directory structure could do with some attention too. If you're looking for a short project, many of these distributions could do with some TLC. As an aside, you may also wish to look at one of the Failures lists to help distributions with too many fail reports.
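For authors wondering what a "suitable" make/build file looks like, the following is a minimal sketch of a Makefile.PL using ExtUtils::MakeMaker. The module name Foo::Bar, the file paths and the author details are placeholders for illustration only, not taken from any real distribution:

```perl
use strict;
use warnings;
use ExtUtils::MakeMaker;

# A minimal Makefile.PL: enough for CPAN Testers smokers to
# configure, build and test the distribution automatically.
WriteMakefile(
    NAME          => 'Foo::Bar',              # placeholder module name
    VERSION_FROM  => 'lib/Foo/Bar.pm',        # pulls $VERSION from the module
    ABSTRACT_FROM => 'lib/Foo/Bar.pm',        # pulls the abstract from the POD
    AUTHOR        => 'A. N. Author <author@example.com>',
    LICENSE       => 'perl',
    PREREQ_PM     => {
        'Test::More' => 0,                    # needed so the t/*.t tests can run
    },
);
```

Pairing a file like this with a `t/` directory of Test::More tests and a generated META file gives the testers everything they need to produce a report for your distribution.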
My thanks particularly to David Golden and Gabor Szabo for the ideas and suggestions for more statistics. There are more statistics to be added, but if you have any more ideas that we could use, please let me know. Patches (as always) are welcome :)
Cross-posted from the CPAN Testers Blog