Validation scores are >90% even if the prediction is wrong #14

Open
mzouink opened this issue Feb 5, 2024 · 0 comments
Labels: bug (Something isn't working), help wanted (Extra attention is needed)

Comments


mzouink commented Feb 5, 2024

Validation score calculations (e.g. the F1 score) currently use the model's class weighting (foreground/background), so a prediction that misses the foreground entirely can still score above 90% because background pixels dominate.
Related to issue #3.

TODO: compute the score with 50/50 foreground/background weighting, or with a different weighting mechanism.
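
As a rough illustration (a minimal NumPy sketch, not DaCapo's actual metric code, with made-up arrays), this shows how a frequency-weighted F1 stays near 1.0 for a prediction that misses the foreground entirely, while a 50/50 weighting exposes the failure:

```python
import numpy as np

def f1_per_class(pred: np.ndarray, gt: np.ndarray, cls: int) -> float:
    """Binary F1 score for a single class label."""
    tp = np.sum((pred == cls) & (gt == cls))
    fp = np.sum((pred == cls) & (gt != cls))
    fn = np.sum((pred != cls) & (gt == cls))
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom > 0 else 0.0

# Ground truth: mostly background (0) with a small foreground (1) region.
gt = np.zeros((100, 100), dtype=int)
gt[40:45, 40:45] = 1

# A completely wrong prediction: no foreground predicted at all.
pred = np.zeros_like(gt)

scores = np.array([f1_per_class(pred, gt, c) for c in (0, 1)])

# Weighting by class frequency (background-dominated) looks near-perfect ...
freq = np.array([np.mean(gt == 0), np.mean(gt == 1)])
print("frequency-weighted F1:", np.dot(freq, scores))  # ~0.996

# ... while a 50/50 weighting reflects the missing foreground.
print("50/50-weighted F1:", scores.mean())  # ~0.50
```

Equal (or foreground-only) weighting would make the validation score drop for predictions like this one, which is the behavior the TODO above asks for.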

@mzouink added the bug and help wanted labels Feb 5, 2024
@mzouink moved this to Todo in DaCapo Hackathon 2024 Feb 5, 2024
@rhoadesScholar moved this from Todo to Backlog in DaCapo Hackathon 2024 Feb 11, 2024
Projects: DaCapo Hackathon 2024 (Status: Backlog)
Development: No branches or pull requests
1 participant