Moving PPI to GitHub encourages some new activity
With the increased number of module takeover requests in the second half of the year, I've felt growing pressure to put more serious effort into moving my modules to GitHub.
Over the last few weeks I've been writing wrapper code around svn2git and doing some bulk conversion test runs on the 200 or so modules in my repository for which I am still the most recent distribution publisher.
After hand-labelling Git author emails for the 100 or so users in my repository, the conversion process seems to run fairly well for modules which don't contain any branches (svn2git still seems to misbehave on my repo despite the use of --nobranches).
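For reference, the per-module conversion step looks roughly like the following. This is only a sketch: it assumes the Ruby svn2git gem, a hand-built authors.txt mapping file, and a placeholder repository URL (the real URL and module name will differ).

```shell
# authors.txt maps Subversion usernames to Git identities, one per line:
#   svnuser = Proper Name <email@example.com>

# Convert one module's trunk into a fresh Git repository, skipping
# branch detection (which misbehaves on this repository) via --nobranches.
mkdir PPI && cd PPI
svn2git http://svn.example.org/repo/PPI \
    --authors ../authors.txt \
    --nobranches \
    --verbose
```

The wrapper code mentioned above essentially repeats this step for each module directory, so getting the authors file right once pays off across the whole repository.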
Before I start bulk-creating GitHub repos, I'm testing the result of the conversion on a few important modules that have suffered the most from a lack of releases.
The GitHub PPI repository in particular has seen immediate activity from new contributors following its creation, with merge requests appearing within the first 24 hours.
The goal of these changes is to significantly improve PPI's parsing speed in various pathological cases such as Data::Dumper files and large __DATA__ areas.
These pathological cases are the main reason behind the file size limit, and these early benchmarks are exciting because they indicate we should be able to achieve something close to O(n) parsing performance in almost all cases.
This should allow either an order-of-magnitude increase in, or (more likely) the complete removal of, the maximum file size limit in PPI, which will improve downstream behaviour in many different applications.
The second module to move across to GitHub is Perl::MinimumVersion, another very handy module used by a number of downstream tools.
If anyone has a module they'd like added to this initial import (prior to the much bigger 200-module dump), now is the time to ask.