
Using NLL_ours as the loss function results in a negative loss #10

Open
Taxalfer opened this issue Nov 20, 2023 · 3 comments

Comments

@Taxalfer

I wanted to train a new model on my own dataset. When using NLL_ours as the loss function, the loss value gradually becomes negative during training, while training is normal with L2 or AL. I don't know how to solve this. Looking forward to your reply.

@Genshin-Impact-king

Hello, I also want to train a new model on my own dataset, but I noticed that "--dataset_name" only accepts nyu/scannet. Should I load my data with /data/dataloader_custom.py, or are there other steps I should take? Thank you.

@Taxalfer
Author

/data/dataloader_custom.py is only used to load data for test.py. If you want to train on your own data, you may need to write a new dataloader.

@baegwangbin
Owner

Hi, very sorry for the delayed response.

For NLL_ours, it is natural for the loss to become negative. The likelihood is a probability density, which can be greater than 1, so the loss (the negative log-likelihood) can be smaller than 0. There is nothing to worry about.
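As a sanity check, the sign behavior can be reproduced with a plain univariate Gaussian. This is a simplified stand-in for illustration only, not the repo's actual angular likelihood: whenever the predicted uncertainty is small enough that the density at the ground truth exceeds 1, the NLL goes negative.

```python
import math

def gaussian_nll(x, mu, sigma):
    """Negative log-likelihood of a univariate Gaussian density."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

# Confident prediction: density at the mean is 1/(0.1*sqrt(2*pi)) ~ 3.99 > 1,
# so the NLL is negative. With sigma = 1 the density is < 1 and the NLL is positive.
print(gaussian_nll(0.0, 0.0, 0.1))  # ~ -1.38
print(gaussian_nll(0.0, 0.0, 1.0))  # ~ +0.92
```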

For custom datasets, you need to write your own dataloader, as different datasets store their data in different formats (e.g. for the GT surface normals).
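For reference, a training dataloader for a custom dataset might look like the sketch below. The directory layout, filename matching, and normal-map encoding (8-bit RGB mapped back to [-1, 1]) are assumptions; adapt them to however your GT normals are actually stored, and wrap the class with torch.utils.data.DataLoader (converting the arrays to tensors) in your training pipeline.

```python
# Hypothetical layout: RGB images and GT normal maps share filenames
# across two directories, e.g. img/0001.png <-> normal/0001.png.
import os
from glob import glob

import numpy as np
from PIL import Image


class CustomNormalsDataset:
    """Yields (image, normal) pairs as float32 HxWx3 numpy arrays."""

    def __init__(self, img_dir, normal_dir):
        self.img_paths = sorted(glob(os.path.join(img_dir, "*.png")))
        self.normal_dir = normal_dir

    def __len__(self):
        return len(self.img_paths)

    def __getitem__(self, idx):
        img_path = self.img_paths[idx]
        img = np.asarray(Image.open(img_path), dtype=np.float32) / 255.0

        # Assumed encoding: normals stored as 8-bit RGB in [0, 255];
        # map to [-1, 1] and renormalize to unit length.
        normal_path = os.path.join(self.normal_dir, os.path.basename(img_path))
        normal = np.asarray(Image.open(normal_path), dtype=np.float32) / 255.0
        normal = normal * 2.0 - 1.0
        normal /= np.linalg.norm(normal, axis=-1, keepdims=True) + 1e-8
        return img, normal
```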
