Seems that the abs activation is redundant? #13
Hi, I found that the opacities will not be optimized after replacing the activation function with the abs here, due to the absence of the `replace_tensor_to_optimizer` operation. Since the range of the Sigmoid is $$(0, 1)$$ and the opacities are fixed, they will never fall below 0. Therefore, it seems that the abs activation is redundant, as simply returning the same tensor suffices.
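Put differently: every stored opacity $o$ came out of a sigmoid, so

$$o = \sigma(x) = \frac{1}{1 + e^{-x}} \in (0, 1) \quad \Longrightarrow \quad |o| = o,$$

and the abs activation acts as the identity on these fixed values.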
I tested removing the abs with the modification below, and it produced the same metrics:
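A minimal sketch of this kind of change, assuming a 3DGS-style `GaussianModel` whose `setup_functions` assigns `opacity_activation`; the class layout and attribute names here are illustrative assumptions, not the repository's actual code:

```python
# Illustrative sketch only (assumed names, not the repository's actual diff):
# a 3DGS-style GaussianModel where the opacity activation is applied to stored
# opacities that already lie in (0, 1) and are never optimized.
import torch


class GaussianModel:
    def __init__(self, opacities: torch.Tensor):
        self._opacity = opacities  # assumed to be sigmoid outputs, kept fixed
        self.setup_functions()

    def setup_functions(self):
        # Before: abs kept as a failsafe against negative opacities.
        # self.opacity_activation = torch.abs
        # After: identity, since abs(x) == x for any x in (0, 1).
        self.opacity_activation = lambda x: x

    @property
    def get_opacity(self):
        return self.opacity_activation(self._opacity)


# Sanity check: abs and identity agree on sigmoid-ranged opacities.
opacities = torch.sigmoid(torch.randn(5))
model = GaussianModel(opacities)
assert torch.equal(model.get_opacity, torch.abs(opacities))
```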
Is this expected?
Comments

@yzslab You are correct. The abs activation is kept just as a failsafe and we don't expect any performance improvements with it.

Hi, does the … Could you please confirm if this conclusion is still valid?