Deeponet #5
Conversation
Also cc @ChrisRackauckas
Also, can you open a PR setting up CI workflows? Copy the ones over from https://github.com/LuxDL/Boltz.jl/. You don't have to set up the documentation workflows; we will put the documentation of the package in the main Lux repo: https://lux.csail.mit.edu/stable/api/Domain_Specific_Modeling/Boltz
Rebase; I have added CI (CPU, CUDA, and AMDGPU).
Thanks, that helps!
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

    @@            Coverage Diff            @@
    ##             main       #5      +/-  ##
    =========================================
    + Coverage   97.05%   97.53%   +0.47%
    =========================================
      Files           7        8       +1
      Lines          68       81      +13
    =========================================
    + Hits           66       79      +13
      Misses          2        2

☔ View full report in Codecov by Sentry.
Looks like it's all good? @avik-pal
Format check is failing; run the formatter and it should be good.
The failing test has the following main lines in the error message:

    ERROR: Method overwriting is not permitted during Module precompilation. Use `__precompile__(false)` to opt-out of precompilation.

and

    ArgumentError: Illegal conversion of a CUDA.DeviceMemory to a Ptr{Float32}

I wasn't completely sure about this and thought about putting it here.
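As far as I understand, the first error appears when a method gets overwritten while a module is being precompiled (for example, because a file defining methods ends up being included twice). A minimal sketch of one way to trigger it; the module and method names below are made up for illustration and are not the code in this PR:

```julia
# Hypothetical module, not from this PR: redefining the same method while the
# module is being precompiled triggers
# "Method overwriting is not permitted during Module precompilation".
module OverwriteDemo

branch(x) = x .+ 1   # first definition
branch(x) = x .+ 2   # overwrites the method above during precompilation

end # module
```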
The first DeepONet function provides feature compatibility with NeuralOperators.jl and uses the more commonly used Dense layers to construct the branch and the trunk nets. Its documentation can be accessed via `help?> DeepONet(;)` or any other matching dispatch. The second function connects any two architectures, provided they have compatible last layers. Its documentation pops up when someone types `help?> DeepONet`. (A usage sketch of both forms follows below.) Also, I had to do a small workaround to get the printing of the architectures to look better.
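To make the difference between the two dispatches concrete, here is a rough usage sketch. The keyword names (`branch`, `trunk`), layer sizes, input shapes, and the call signature below are illustrative assumptions in the style of the NeuralOperators.jl/Lux interface, not the final API of this PR:

```julia
using Lux, Random

# (1) Keyword form, mirroring the NeuralOperators.jl constructor: the branch
#     and trunk nets are built internally from Dense layers given layer sizes.
deeponet_kw = DeepONet(; branch=(64, 32, 32, 16), trunk=(1, 8, 8, 16))

# (2) Two-architecture form: any two Lux models can be wired together, as long
#     as their last layers produce embeddings of matching size (here, 16).
branch_net = Chain(Dense(64 => 32, tanh), Dense(32 => 16))
trunk_net  = Chain(Dense(1 => 8, tanh), Dense(8 => 16))
deeponet_custom = DeepONet(branch_net, trunk_net)

# Parameter/state setup and a forward pass (shapes are illustrative):
# `u` holds sensor values of the input functions, `y` the query locations.
rng = Random.default_rng()
ps, st = Lux.setup(rng, deeponet_custom)
u = rand(Float32, 64, 10)
y = rand(Float32, 1, 40)
out, st = deeponet_custom((u, y), ps, st)
```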
We can make tutorials for both implementations. Putting the ideas down here until we make docs and tutorials, and before we merge them into NeuralOperators.jl.