
Loss is negative when using MCR2 in my own scenario #6

Open
adf1178 opened this issue May 28, 2021 · 4 comments

Comments

adf1178 commented May 28, 2021

I tried to use this loss in my own CIFAR-100 long-tail scenario, but the loss is negative, meaning that compress_loss_empirical is larger than discrimn_loss_empirical. What is wrong?

@ryanchankh (Owner) commented
Hi, I would first make sure that Z is normalized before you put it through the loss function. Another possibility is that the number of classes is not set correctly. I would try those easy fixes first. In general, the R loss should be bigger than the R_c loss, and both losses should be positive, so the total loss should be positive as well. Good luck!
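(A minimal sketch of the normalization step suggested above, assuming the loss expects each feature vector in Z to have unit L2 norm; the tensor shapes here are illustrative, not taken from the repo.)

```python
import torch
import torch.nn.functional as F

# Raw features from the backbone: (batch, feature_dim).
# Scale-dependent logdet terms can go negative if rows are not unit-norm.
Z = torch.randn(1000, 128)

# Project each feature vector onto the unit sphere before the loss.
Z = F.normalize(Z, p=2, dim=1)

# Every row now has unit L2 norm.
print(torch.allclose(Z.norm(dim=1), torch.ones(1000)))  # True
```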

wetliu commented Jul 18, 2021

Could you please check whether loss.item() is negative when running python3 train_sup.py --arch resnet18 --data cifar10 --fd 128 --epo 500 --bs 1000 --eps 0.5 --gam1 1 --gam2 1 --lr 0.01 --lcr 0.0?

PyTorch version: 1.9.0+cu111


adf1178 commented Jul 18, 2021

I will try it when I have time. I am busy with my own experiments, and if I have results I will contact you. Thanks again for your reply.

zhrli commented Aug 5, 2021

> Could you please check whether loss.item() is negative when running python3 train_sup.py --arch resnet18 --data cifar10 --fd 128 --epo 500 --bs 1000 --eps 0.5 --gam1 1 --gam2 1 --lr 0.01 --lcr 0.0?
>
> PyTorch version: 1.9.0+cu111

The loss function is defined as loss = compress_loss - discrimn_loss, so a negative value is correct: minimizing this quantity is the same as maximizing the rate reduction discrimn_loss - compress_loss, which is non-negative.
