Describe the bug 🐞
Unclear internal behavior of the first layer in the NOMAD model: its output shape does not match the layer sizes passed to the constructor.
Minimal Reproducible Example 👇
```julia
using Lux, NeuralOperators, Random

nomad = NOMAD(Chain(Dense(8 => 8, σ), Dense(8 => 7)),   # first network: 8 -> 7
              Chain(Dense(8 => 4, σ), Dense(4 => 1)))    # second network: 8 -> 1

rng = Random.default_rng()
ps, st = Lux.setup(rng, nomad)

u = rand(Float32, 8, 10)
y = rand(Float32, 1, 10)

# Call the first internal layer directly and inspect its output shape
v = nomad.model.layers[1](u, ps.layer_1, st.layer_1)[1]
@show size(v)
```
Error & Stacktrace ⚠️
size(v) = (15, 10)
Expected behavior
The output of the first layer is expected to have shape (7, 10), so that it can be concatenated with y and passed to the subsequent network, whose first Dense layer expects 8 input features.
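For reference, a minimal sketch of the shape bookkeeping this would imply, assuming NOMAD concatenates the first network's output with y along the first dimension before the second network (the variable names below are illustrative only, not from the library):

```julia
# Hypothetical shapes only; v_expected stands in for the first network's output.
v_expected = rand(Float32, 7, 10)   # expected first-layer output: (7, 10)
y          = rand(Float32, 1, 10)   # evaluation points: (1, 10)

# Concatenating along dimension 1 would give (8, 10), which matches the
# Dense(8 => 4) at the start of the second chain in the MRE above.
decoder_input = vcat(v_expected, y)
@show size(decoder_input)           # (8, 10)
```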
Environment (please complete the following information):
using Pkg; Pkg.status()
[b2108857] Lux v1.4.1
[ea5c82af] NeuralOperators v0.5.2
versioninfo()
Julia Version 1.11.1
cc @ayushinav