Easy as possible to use #55
Labels
enhancement
New feature or request
Comments
Enhancement
Make Cooper as easy as possible to integrate with a vanilla PyTorch pipeline. In particular, allow the user to populate a `CMPState` and provide it to Cooper without having to create a custom `ConstrainedMinimizationProblem`.

Note that we should not remove the CMP altogether. In particular, extrapolation and alternating updates would still require internal calls to `cmp.closure` and `cmp.defect_fn`.

This would require a major overhaul of the documentation, tutorials, and examples, showcasing that both the previous CMP approach and this new approach are admissible. Love for the documentation has also been requested in #53 and #29.
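To illustrate the proposed "vanilla" flow, here is a minimal sketch in plain Python. The `CMPState` fields (`loss`, `ineq_defect`, `eq_defect`) mirror Cooper's state container, but the `lagrangian` helper and the exact field names here are illustrative assumptions, not Cooper's actual API; the point is only that the user computes the loss and defects themselves and hands them over, with no `ConstrainedMinimizationProblem` subclass in sight.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CMPState:
    """Container for the loss and constraint violations.

    Mimics cooper.CMPState for illustration; field names are assumptions.
    """
    loss: Optional[float] = None
    ineq_defect: Optional[List[float]] = None
    eq_defect: Optional[List[float]] = None

def lagrangian(state: CMPState, multipliers: List[float]) -> float:
    """Hypothetical internal step: Cooper forms the Lagrangian from a
    user-populated CMPState instead of calling a user-defined cmp.closure()."""
    value = state.loss
    # Penalize inequality constraint violations with their multipliers.
    for lam, defect in zip(multipliers, state.ineq_defect or []):
        value += lam * defect
    return value

# The user computes the loss and defects directly, no custom CMP subclass:
state = CMPState(loss=2.0, ineq_defect=[0.5, -0.1])
value = lagrangian(state, multipliers=[1.0, 3.0])
print(value)  # approximately 2.2 (= 2.0 + 1.0 * 0.5 + 3.0 * -0.1)
```

In a real pipeline the floats would be `torch.Tensor`s carrying gradients, and the optimizer step would follow, but the user-facing contract stays this simple.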
Motivation
Users are currently required to implement a custom `ConstrainedMinimizationProblem` with a `closure` method.
This overhead can be detrimental to Cooper attracting ML researchers and practitioners who would otherwise compute the loss and constraint violations themselves.
For minimal integrations with Cooper that do not need "fancy features" like the Augmented Lagrangian or extrapolation, simply asking for the loss and constraint defects makes the user's life easier.
Alternatives
It may be possible to remove the CMP completely. Nonetheless, we would still require a `closure` for an internal call during extrapolation steps and a `defect_fn` for internal use by the `AlternatingConstrainedOptimizer`.