Loss is negative when applying MCR2 to my own scenario #6
Comments
Hi, I would make sure that Z is normalized before you pass it to the loss function. Another possibility is that the number of classes is not set correctly. I would try those easy fixes first. In general, the R loss should be larger than the R_c loss, and both losses should be positive, so the total loss should be positive as well. Good luck!
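The normalization step suggested above can be sketched in NumPy (a minimal sketch, not code from this repo; the array shapes and names are my own assumptions, with rows as samples as in the usual PyTorch convention):

```python
import numpy as np

# Hypothetical feature batch: 32 samples, 128-dimensional features (rows are samples).
Z = np.random.default_rng(1).standard_normal((32, 128))

# Project each feature vector onto the unit sphere before computing the MCR2 loss;
# the coding-rate comparison assumes features of comparable (unit) scale.
Z_norm = Z / np.linalg.norm(Z, axis=1, keepdims=True)
```

In a PyTorch training loop the equivalent one-liner would be `F.normalize(features, dim=1)` applied right before the loss call.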
Could you please check the value of loss.item()? PyTorch version: 1.9.0+cu111
I will try it when I have time. I am busy with my own experiments, and if I have results I will contact you. Thanks again for your reply.
The loss function is defined as loss = compress_loss - discrimn_loss, so a negative value is correct: the code minimizes the negation of the coding-rate reduction ΔR = R - R_c.
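The sign convention can be checked with a small NumPy sketch of the two coding-rate terms (a sketch under the usual MCR² assumptions: columns of Z are unit-normalized features, eps is the distortion parameter; the function and variable names here are mine, not identifiers from this repo):

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 logdet(I + d/(n*eps^2) * Z Z^T); Z is d x n, columns are features."""
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + d / (n * eps**2) * Z @ Z.T)[1]

def mcr2_loss(Z, labels, eps=0.5):
    """Return (discrimn, compress, loss) with loss = compress - discrimn (minimized)."""
    _, n = Z.shape
    discrimn = coding_rate(Z, eps)
    compress = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]           # features of class c
        compress += (Zc.shape[1] / n) * coding_rate(Zc, eps)
    return discrimn, compress, compress - discrimn

# Random unit-normalized features, 8 dims, 100 samples, 2 classes.
rng = np.random.default_rng(0)
Z = rng.standard_normal((8, 100))
Z /= np.linalg.norm(Z, axis=0, keepdims=True)
labels = np.repeat([0, 1], 50)
R, Rc, loss = mcr2_loss(Z, labels)
```

Since ΔR = R - R_c is non-negative for normalized features (by concavity of logdet), the returned loss R_c - R is at most zero, which is exactly the "negative loss" the original question observed.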
I tried to use this loss in my own CIFAR-100 long-tail scenario, but the loss is negative, meaning that compress_loss_empirical is larger than discrimn_loss_empirical. What is wrong?