structs for Neural Operators #19

Closed · wants to merge 16 commits

Conversation

ayushinav (Contributor)

fixes #16

ayushinav (Contributor, Author)

I thought overloading Lux.setup and defining model(x, ps, st) would work without changing anything else, but there are other Lux functions that dispatch on Lux.AbstractExplicitLayer, so maybe we should just call model.ch everywhere?

avik-pal (Member) commented Aug 1, 2024

> I thought overloading Lux.setup

That is an invalid overload. It almost certainly won't do what you are expecting it to do.

I don't understand the rest of your question.

ayushinav (Contributor, Author) commented Aug 1, 2024

Instead of moving to structs, it is more convenient to make the FNO a compact layer too. I previously had a struct with the chain ch as its only field, which meant calling model.ch every time; that did not seem very elegant, especially for someone using the neural operators independently. I overloaded Lux.setup and model(x, ps, st), but other Lux functions also take a model, and it would be hard to maintain if we had to overload all of them.
Moving the FNO to a compact layer, on the other hand, works and requires no change in how the models are applied.
To dispatch a function on DeepONet we have CompactLuxLayer{:DeepONet}, and similarly CompactLuxLayer{:FourierNeuralOperator} for the FNO; see the sketch below.
In the future, we would therefore need all the models to be compact layers (or something similar) if we want them to be dispatchable in the same way.
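A minimal sketch of that dispatch pattern, assuming Lux's @compact with its dispatch keyword; the layer sizes and the toy branch–trunk composition are illustrative only, not the package's actual DeepONet:

```julia
using Lux, Random

# Hypothetical branch/trunk networks; all sizes are illustrative only.
function sketch_deeponet(; branch=Dense(8 => 16), trunk=Dense(2 => 16))
    return @compact(branch=branch, trunk=trunk, dispatch=:DeepONet) do input
        u, y = input                  # (sampled input function, eval coordinates)
        b = branch(u)                 # encode the function samples
        t = trunk(y)                  # encode the evaluation coordinates
        @return sum(b .* t; dims=1)   # inner product over the latent dimension
    end
end

model = sketch_deeponet()   # model isa Lux.CompactLuxLayer{:DeepONet}
ps, st = Lux.setup(Random.default_rng(), model)

# Functions can now specialize on the operator type, as described above:
describe(::Lux.CompactLuxLayer{:DeepONet}) = "DeepONet"
describe(::Lux.CompactLuxLayer{:FourierNeuralOperator}) = "FNO"
```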
@avik-pal:

> That is an invalid overload. It almost certainly won't do what you are expecting it to do.
>
> I don't understand the rest of your question.

In the structs setup, I had defined Lux.setup(rng, model) = Lux.setup(rng, model.ch) and similarly model(x, ps, st) = model.ch(x, ps, st), but there are functions such as Lux.Experimental.single_train_step!(...) that take a Lux layer, and it would not be easy to account for all such functions.
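For concreteness, a hedged sketch of the wrapper approach being described here (FNOWrapper is a hypothetical name, not the code in this PR):

```julia
using Lux, Random

struct FNOWrapper{C}
    ch::C   # the wrapped Lux.Chain
end

# Hand-written forwarding, as described above:
Lux.setup(rng::Random.AbstractRNG, m::FNOWrapper) = Lux.setup(rng, m.ch)
(m::FNOWrapper)(x, ps, st) = m.ch(x, ps, st)

# The problem: FNOWrapper does not subtype Lux's abstract layer type, so any
# entry point that is not explicitly forwarded (e.g. the training utilities
# mentioned above) still fails and would need its own overload.
```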

avik-pal (Member) commented Aug 1, 2024

No, that is not the correct way to define the structs discussed in that issue. Go through https://lux.csail.mit.edu/stable/manual/interface
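For reference, a minimal sketch of the container-layer pattern that manual describes, assuming the pre-1.0 Lux interface names in use here (MyOperator is a placeholder):

```julia
using Lux

# Subtyping AbstractExplicitContainerLayer{(:ch,)} tells Lux which field holds
# the inner layer, so setup, parameters, and states are derived automatically
# and no manual overload of Lux.setup is needed.
struct MyOperator{C} <: Lux.AbstractExplicitContainerLayer{(:ch,)}
    ch::C
end

# With a single wrapped field, Lux passes ps/st through directly, so the
# forward pass simply delegates to the inner chain:
(m::MyOperator)(x, ps, st) = m.ch(x, ps, st)
```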

ayushinav (Contributor, Author)

I'll move the docstrings around; right now, I guess they do not correspond to the right function calls.
Also, this might be a good time to convert OperatorKernel into a struct too.

avik-pal (Member) commented Aug 3, 2024

Yes, let's do that.

avik-pal and others added 6 commits on August 9, 2024, among them a dependabot update bumping crate-ci/typos from 1.23.2 to 1.23.5.

Successfully merging this pull request may close issue #16: Make DeepONet and FourierNeuralOperator into structs.