paid work- optimise NGBoost #298
@Geethen You might be interested in using LightGBMLSS, which is an extension of LightGBM to probabilistic forecasting.
Also, if all you want are prediction intervals and you don't need the full conditional density, you should use conformal inference instead: https://cdsamii.github.io/cds-demos/conformal/conformal-tutorial.html.
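For reference, a minimal sketch of the split conformal approach described in the linked tutorial. The `MeanModel` regressor and the split sizes are placeholders; any fitted regressor with `fit`/`predict` would work the same way:

```python
import numpy as np

class MeanModel:
    """Placeholder regressor: predicts the training mean everywhere."""
    def fit(self, X, y):
        self.mu = y.mean()
        return self
    def predict(self, X):
        return np.full(len(X), self.mu)

def split_conformal_interval(model, X_fit, y_fit, X_cal, y_cal, X_test, alpha=0.1):
    """Fit on one split, compute absolute residuals on a held-out
    calibration split, then widen each test point prediction by the
    finite-sample-corrected quantile of those residuals. The resulting
    intervals have distribution-free (1 - alpha) marginal coverage."""
    model.fit(X_fit, y_fit)
    resid = np.abs(y_cal - model.predict(X_cal))
    n = len(resid)
    q = np.quantile(resid, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    pred = model.predict(X_test)
    return pred - q, pred + q
```

Note that the coverage guarantee holds regardless of how good the underlying point predictor is; a poor model simply produces wider intervals.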
@StatMixedML Thank you very much; this seems very useful. I will give it a test drive. @alejandroschuler - I was interested in NGBoost because it has been significantly outperforming LightGBM, XGBoost, etc., on the regression problems I am working on. Nevertheless, thank you for your suggestion.
Hey @alejandroschuler! What do you mean by the full conditional density?
That seems suspicious... If you are just doing standard regression point prediction of the response (i.e. you don't care about the distribution of the response given the features), I wouldn't expect NGBoost to outperform those methods.
The data are assumed to be draws from some unknown distribution. The conditional density fully describes the conditional distribution, so you can build a prediction interval from it (i.e. for each input, you can read off an interval containing any desired fraction of the predicted probability mass).
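Concretely, if the fitted conditional density is Normal with a per-example mean and scale (NGBoost's default distribution), the interval comes straight from the predicted distribution's quantile function. This sketch assumes the fitted parameters are already in hand and uses the standard-library `NormalDist` rather than NGBoost's own objects:

```python
from statistics import NormalDist

def interval_from_density(mu, sigma, alpha=0.05):
    """Central (1 - alpha) prediction interval read off the inverse CDF
    (quantile function) of a predicted conditional Normal(mu, sigma)."""
    d = NormalDist(mu, sigma)
    return d.inv_cdf(alpha / 2), d.inv_cdf(1 - alpha / 2)
```

The same recipe works for any predicted parametric density: evaluate its inverse CDF at alpha/2 and 1 - alpha/2.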
Conformal predictive distributions estimate the full conditional density. Tutorials and more materials can be found here: https://github.com/valeman/awesome-conformal-prediction, https://proceedings.mlr.press/v91/vovk18a.html, https://www.youtube.com/watch?v=FUi5jklGvvo&t=3s. The crepes library in Python, based on conformal predictive distributions, builds the complete CDF for each test object whilst providing guarantees that the density is well calibrated and valid, including being located in the right place: https://github.com/henrikbostrom/crepes
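To make the idea concrete, here is a simplified split-conformal version of a predictive distribution (this is the underlying construction, not the crepes API itself): the predictive CDF at a test object is the empirical distribution of the point prediction shifted by held-out calibration residuals.

```python
import numpy as np

def conformal_predictive_cdf(point_pred, cal_residuals):
    """Build the split-conformal predictive CDF for one test object:
    the empirical distribution of point_pred + r over the calibration
    residuals r. Returns a function y -> estimated P(Y <= y)."""
    grid = np.sort(point_pred + cal_residuals)
    def cdf(y):
        # rank of y among the shifted calibration points, with the
        # usual (n + 1) denominator for finite-sample validity
        return np.searchsorted(grid, y, side="right") / (len(grid) + 1)
    return cdf
```

Because the CDF is assembled from held-out residuals, its calibration does not depend on the underlying model being well specified.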
The company I work for is looking for someone to reduce the amount of time NGBoost takes to train and run inference.
We are willing to pay someone to refactor the algorithm (rates are negotiable). All work done will be made openly available. Depending on the speedups achieved, a research paper could stem from your work (if that is something you are interested in).
Deliverable: A version of NGBoost that trains and predicts at close to the speed of LightGBM (or faster) by leveraging Numba, C++, GPU acceleration, or any other relevant acceleration approaches (perhaps incorporating techniques from LightGBM).
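To illustrate one class of speedup being asked for (this is not NGBoost's actual internals, just a sketch): replacing per-sample Python loops with vectorized array operations. The example uses a hand-derived natural gradient of the Normal negative log-likelihood with respect to (mu, log sigma), whose Fisher information matrix is diag(1/sigma^2, 2):

```python
import numpy as np

def nat_grad_loop(y, mu, log_sigma):
    """Per-sample Python loop: natural gradient of the Normal NLL
    w.r.t. (mu, log_sigma), i.e. the ordinary gradient rescaled by
    the inverse Fisher matrix diag(1/sigma^2, 2)."""
    out = np.empty((len(y), 2))
    for i in range(len(y)):
        s2 = np.exp(2 * log_sigma[i])
        out[i, 0] = mu[i] - y[i]
        out[i, 1] = (1 - (y[i] - mu[i]) ** 2 / s2) / 2
    return out

def nat_grad_vec(y, mu, log_sigma):
    """Identical computation, vectorized over all samples at once;
    the kind of rewrite that removes interpreter overhead entirely."""
    s2 = np.exp(2 * log_sigma)
    return np.stack([mu - y, (1 - (y - mu) ** 2 / s2) / 2], axis=1)
```

Numba's `@njit` or a C++/GPU kernel would attack the same loops from a different angle; the vectorized form above is usually the cheapest first step.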
Link to the company website:
If interested and you have the time available, please email me to discuss options, timelines and payment: [email protected]