-
ReLU is not differentiable at zero, and its derivative is a step function. Considering the definition of the potential energy, the forces are obtained as derivatives of the energy with respect to the atomic coordinates, so a smooth activation such as tanh keeps the predicted forces continuous.
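To make that concrete, here is a tiny standalone sketch (plain NumPy, not DeePMD-kit code; the one-hidden-unit "energy model" and its weights are made up for illustration). It shows that the force, i.e. the negative derivative of the predicted energy, jumps across a ReLU kink but stays continuous with tanh:

```python
# Toy illustration: the force -dE/dx from a ReLU-based energy model is
# discontinuous at the activation's kink, while a tanh-based model gives
# a continuous force. Weights below are arbitrary, for illustration only.
import numpy as np

def energy(x, act):
    """Hypothetical one-hidden-unit energy model: E(x) = w2 * act(w1 * x + b)."""
    w1, b, w2 = 2.0, -1.0, 0.5
    return w2 * act(w1 * x + b)

def force(x, act, h=1e-6):
    """Force = -dE/dx, estimated with a central finite difference."""
    return (energy(x - h, act) - energy(x + h, act)) / (2.0 * h)

relu = lambda v: np.maximum(v, 0.0)

# The ReLU kink sits where w1 * x + b = 0, i.e. at x = 0.5.
for x in (0.499, 0.501):
    print(f"x={x:.3f}  force(ReLU)={force(x, relu):+.3f}  force(tanh)={force(x, np.tanh):+.3f}")
# ReLU: the force jumps from 0 to -1 across x = 0.5 (discontinuous).
# tanh: the force stays close to -1 on both sides (continuous and smooth).
```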
-
I'm curious about why DeePMD-kit uses tanh as the default activation function instead of ReLU, which is more commonly preferred in the deep learning field. I noticed that DeePMD-kit currently supports various activation functions, including "gelu_tf", "none", "linear", "gelu", "softplus", "relu6", "tanh", "sigmoid", and "relu". Here are my questions:
I would greatly appreciate any insights from the development team or the community regarding these questions. This would be particularly helpful for users like me in choosing the most appropriate activation function for specific use cases.
Thank you in advance for any guidance or information!
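For reference, the activation function is chosen per network in the DeePMD-kit input script. Below is a minimal, hedged sketch of the relevant fragment, written as a Python dict for brevity; the exact key names and the other required keys may differ between versions, so the official documentation should be consulted:

```python
# A hedged sketch (not a complete input file) of where the activation function
# is selected in a DeePMD-kit input.json. All other required keys are omitted,
# and the exact schema may differ between DeePMD-kit versions.
import json

model_fragment = {
    "model": {
        "descriptor": {
            # "tanh" is the default; other supported names such as "gelu",
            # "relu", "softplus", or "sigmoid" can be substituted here.
            "activation_function": "tanh",
        },
        "fitting_net": {
            "activation_function": "tanh",
        },
    }
}

print(json.dumps(model_fragment, indent=2))
```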