feature requests for RNNs #2514
Would it be possible to add to the list the option to use different initializers for the input matrix and the recurrent matrix? This is provided by both Keras/TF and Flax. This should be as straightforward as:

```julia
function RNNCell((in, out)::Pair, σ = relu;
                 kernel_init = glorot_uniform,
                 recurrent_kernel_init = glorot_uniform,
                 bias = true)
    Wi = kernel_init(out, in)
    U = recurrent_kernel_init(out, out)
    b = create_bias(Wi, bias, size(Wi, 1))
    return RNNCell(σ, Wi, U, b)
end
```

I can also open a quick PR on this if needed.
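For illustration, here is a minimal Python sketch of the same idea: a cell constructor that accepts separate initializers for the input-to-hidden and hidden-to-hidden weight matrices. The names `glorot_uniform` and `make_rnn_cell` are hypothetical stand-ins for the Flux functions above, not part of any library.

```python
import math
import random


def glorot_uniform(rows, cols, rng=random.Random(0)):
    # Glorot/Xavier uniform: limit = sqrt(6 / (fan_in + fan_out)).
    limit = math.sqrt(6.0 / (rows + cols))
    return [[rng.uniform(-limit, limit) for _ in range(cols)]
            for _ in range(rows)]


def make_rnn_cell(in_size, out_size,
                  kernel_init=glorot_uniform,
                  recurrent_kernel_init=glorot_uniform,
                  bias=True):
    # The two weight matrices get their own initializers,
    # mirroring kernel_init / recurrent_kernel_init above.
    Wi = kernel_init(out_size, in_size)            # input-to-hidden, (out, in)
    U = recurrent_kernel_init(out_size, out_size)  # hidden-to-hidden, (out, out)
    b = [0.0] * out_size if bias else None         # zero bias, as in Flux
    return {"Wi": Wi, "U": U, "b": b}


cell = make_rnn_cell(3, 5)
```

Passing, say, `recurrent_kernel_init=orthogonal` while keeping `kernel_init=glorot_uniform` is the typical use case this enables.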
Yes! PR welcome.
Following up on this, should we also have an option to choose the init for the bias?
We don't do it for feedforward layers; if someone wants a non-zero bias, they can just change it manually after construction.
After the redesign in #2500, here is a list of potential improvements for recurrent layers and recurrent cells:
- Bidirectional (for RNN layers, #1790)