CPAN builds - assess use of App::cpm #105
Comments
cpm invocation will need these flags at least
What I did back in 2009 was to build off a local minicpan, so everything was ALREADY downloaded. I don't know if that is still being done, or if it's orthogonal to the question asked.
Thanks @csjewell. In this case the downloads aren't an issue (except in the rare case that a module author uploads a new but broken version). The issue is the time taken to build and test the full set of 200+ modules and their dependencies. Restore points are only saved when a stage completes, so there is a lot of rework when a module build fails. Parallel builds speed things up. We could also subdivide the cpan step to get more restore points. However, the advantage of cpm is that it caches the build artefacts, so there is no need to rebuild successfully built modules. This has the greatest potential impact during a development phase, such as the one we just went through, but even for routine builds it can be useful.
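A minimal sketch of what the parallel, cached cpm invocation could look like. The flag names below are taken from App::cpm's documented options but should be treated as assumptions and verified against `cpm --help` for the installed version; `Try::Tiny` is only a placeholder module.

```shell
# --global: install into the target perl itself rather than a local lib,
#           matching what the build step needs
# --workers 8: fetch, build, and test up to 8 distributions in parallel
# --test: cpm skips tests by default; the build step wants them run
# --show-build-log-on-failure: print the failing distribution's build log
cpm install --global --workers 8 --test --show-build-log-on-failure Try::Tiny
```

Build artefacts land under cpm's home directory (default `~/.perl-cpm`), so rerunning the same invocation after a failure should skip distributions that already built successfully.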
The build system currently uses cpanm for the CPAN build step. This works well, but there is no caching of package build results.
If one package install fails, the whole step is flagged as failed and no restore point is generated. That means ~230 or more unnecessary rebuilds, plus their dependencies.
App::cpm retains build products and reuses them. This would greatly speed up CPAN rebuilds.
Hopefully it is just a case of adapting the current CPANMINUS_install_module.pl script, although some arguments might need to be updated. https://github.com/StrawberryPerl/Perl-Dist-Strawberry/tree/master/share/utils
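One way to stage the migration without touching every caller at once would be a small wrapper around the install command. This is only a sketch under assumptions: the wrapper, its fallback policy, and the cpm flags are not part of CPANMINUS_install_module.pl, whose real arguments live in the repository linked above.

```shell
#!/bin/sh
# Hypothetical transition wrapper (not the project's script): prefer cpm when
# it is on PATH, otherwise fall back to the existing cpanm behaviour.
MODULE="$1"
if command -v cpm >/dev/null 2>&1; then
    # flags assumed from App::cpm docs; verify against `cpm --help`
    cpm install --global --workers 8 --test "$MODULE"
else
    cpanm "$MODULE"
fi
```

Once cpm is proven out, the fallback branch can be dropped and the remaining argument differences folded into the install script itself.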