Add SOTA optimisers #61
@GilesStrong Can we try https://github.com/jettify/pytorch-optimizer? It provides both RAdam and Ranger, along with a few more optimizers. Since LUMIN uses torch.optim to load optimizers, adding pytorch-optimizer as an installation requirement and using it in ModelBuilder could be a quick solution.
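As a quick illustration of the suggestion, here is a minimal sketch of using one of those optimizers directly; the `torch_optimizer` module name follows the jettify/pytorch-optimizer README, and the toy model is an assumption:

```python
# Minimal sketch: pytorch-optimizer's optimizers are drop-in replacements for
# torch.optim classes (pip install pytorch-optimizer).
import torch
import torch_optimizer

model = torch.nn.Linear(10, 1)          # toy model for illustration
opt = torch_optimizer.RAdam(model.parameters(), lr=1e-3)

loss = model(torch.randn(4, 10)).sum()  # dummy forward pass
loss.backward()
opt.step()
```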
Hi @kiryteo, thanks for looking into this! Since writing this issue, I've tried the new optimisers out in various different tasks, but so far haven't found them to be significantly beneficial. Based on this, I would be hesitant to include them in LUMIN or to add an extra dependency.
I agree, it would add an extra dependency; instead, an example of adding a custom optimiser could be useful. Are you considering a Jupyter notebook, or simply a new section in the README?
I think that in the "Single_Target_Regression_Di-Higgs_mass_prediction.ipynb" example, Ranger could be used as a custom optimiser, with some text emphasising that a custom optimiser is being used. (Either pip-install pytorch-optimizer within the notebook, or copy the Ranger source code and its licence header into the notebook.) A sketch of the first option is below.
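A rough sketch of what the notebook cells might look like under the pip-install option; the `ModelBuilder` import path and the arguments other than `opt_args` are assumptions used purely for illustration:

```python
# Notebook cell 1: install the extra optimiser package.
# !pip install pytorch-optimizer

# Notebook cell 2: build the LUMIN model with Ranger as a custom optimiser.
from functools import partial

import torch_optimizer
from lumin.nn.models.model_builder import ModelBuilder  # import path assumed

cont_feats = ['feat_0', 'feat_1']  # placeholder feature list
opt_args = {'opt': partial(torch_optimizer.Ranger), 'lr': 1e-3}

# Arguments other than opt_args are illustrative placeholders, not LUMIN defaults.
model_builder = ModelBuilder(objective='regression', n_out=1,
                             cont_feats=cont_feats, opt_args=opt_args)
```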
Alright, thanks! I'm trying to add this now, but it seems that the notebook has some issues with uproot, as I get an error when loading the ROOT TTree as a DataFrame.
Thanks for looking into this. Uproot 4 got released with breaking API changes. Changing the pip install line in the first code cell to pin the earlier uproot version should fix it; see the sketch below.
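The exact replacement install line did not survive the page extraction; a plausible pin, assuming the notebook was written against the pre-4 uproot API:

```python
# Replacement for the notebook's first code cell (assumption: pin uproot below
# v4 so the pre-breaking-change API is used).
!pip install "uproot<4"
```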
Thanks, I'll proceed with this!
There was a big kerfuffle in 2019 about some new optimisers: Rectified Adam (Liu et al., 2019), Lookahead (Zhang, Lucas, Hinton, & Ba, 2019), and a combination of both of them, Ranger (which also now includes Gradient Centralization (Yong, Huang, Hua, & Zhang, 2020)).
Having tried these (except the latest version of Ranger), I've not found much improvement compared to Adam, but this was only on one dataset. The performance of Ranger on other datasets, though, looks to be quite good, so perhaps it is useful.
User-defined optimisers can easily be used in LUMIN by passing the partial optimiser to the `opt_args` argument of `ModelBuilder`, e.g. `opt_args = {'eps':1e-08, 'opt':partial(RAdam)}`. It could be useful, however, to include the optimisers in LUMIN, to allow them to be easily used without the user having to include any copied code. These git repos include Apache 2.0-licensed implementations of RAdam and Ranger, so inclusion should be straightforward.
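To make the `opt_args` pattern concrete, here is a self-contained sketch; how `ModelBuilder` consumes the dict internally is an assumption, shown only to illustrate the partial-optimiser mechanism:

```python
# Sketch of the opt_args pattern from this issue; how ModelBuilder uses the
# dict internally is an assumption, shown only to illustrate the mechanism.
from functools import partial

import torch
from torch_optimizer import RAdam  # RAdam from pytorch-optimizer, for illustration

opt_args = {'eps': 1e-08, 'opt': partial(RAdam)}

model = torch.nn.Linear(10, 1)  # placeholder network

# Separate the optimiser class from its keyword arguments, then instantiate:
opt = opt_args.pop('opt')
optimiser = opt(model.parameters(), **opt_args)  # -> RAdam(..., eps=1e-08)
```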