Activation functions are key to representing complex nonlinear functions with neural networks, so it is important that we implement a wide variety of them to keep the framework flexible. You can see many activation functions here that should be considered for implementation. So far, ReLU, Softmax, and Tanh are complete, so you can use their source files as a reference.
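The project's actual interface isn't shown in this thread, but a new activation generally needs two pieces: a forward pass and its derivative for backpropagation. Below is a minimal sketch of a Sigmoid activation under that assumption; the class shape and the `forward`/`backward` method names are illustrative, not this project's API, so mirror the completed ReLU/Tanh source files for the real signatures.

```python
import numpy as np

class Sigmoid:
    """Sigmoid activation: squashes inputs into (0, 1).

    Hypothetical sketch -- the forward/backward interface is assumed,
    not taken from this project's source. Follow the pattern in the
    completed ReLU, Softmax, and Tanh files for the real structure.
    """

    def forward(self, x):
        # Cache the output; the derivative below reuses it cheaply.
        self.out = 1.0 / (1.0 + np.exp(-x))
        return self.out

    def backward(self, grad_output):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)),
        # chained with the upstream gradient.
        return grad_output * self.out * (1.0 - self.out)
```

Caching the forward output is the usual design choice here, since the sigmoid derivative can be expressed entirely in terms of that output rather than recomputing the exponential.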