Calculating the AUC and ROC #370
I'm not sure how you're plotting the ROC curve when you need a threshold to sweep through to change the point at which a label is predicted. Tribuo already supports AUCROC for classifiers which produce probabilities, but KNNTrainer doesn't.
Well, I just calculate the FPR and TPR after each fold and use them to plot my ROC curve, and I pass them to AUCCalculator to get the AUC value, which is computed with the trapezoidal rule. Please tell me if this is not correct and I will change it.
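(A generic sketch of the trapezoidal rule over a set of (FPR, TPR) points, assuming they are sorted by increasing FPR; this is not the AUCCalculator mentioned above, whose implementation isn't shown in the thread.)

```java
// Trapezoidal-rule AUC over (FPR, TPR) points, assumed sorted by increasing FPR.
// A generic sketch, not the AUCCalculator referenced in the comment above.
static double trapezoidalAuc(double[] fpr, double[] tpr) {
    double auc = 0.0;
    for (int i = 1; i < fpr.length; i++) {
        // Width of the step in FPR times the average TPR of the two endpoints.
        auc += (fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2.0;
    }
    return auc;
}
```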
That won't give you an appropriate ROC curve, as it's not on the same data and doesn't represent how changing the classification threshold would change the false positive & true positive rate.
Thanks for that. Can you give some suggestions here?
You'll need to use a model which supports generating probabilities, and then you can use the methods on LabelEvaluation to compute the AUC, or LabelEvaluationUtil to compute the ROC curve itself: https://tribuo.org/learn/4.3/javadoc/org/tribuo/classification/evaluation/LabelEvaluationUtil.html
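(A minimal sketch of that approach. The class, method argument, and label names are placeholders, and logistic regression is used only as one example of a trainer that produces probabilities.)

```java
import org.tribuo.Dataset;
import org.tribuo.Model;
import org.tribuo.classification.Label;
import org.tribuo.classification.evaluation.LabelEvaluation;
import org.tribuo.classification.evaluation.LabelEvaluator;
import org.tribuo.classification.sgd.linear.LogisticRegressionTrainer;

public final class AucRocSketch {
    // Returns the AUC for the named label, trained and evaluated on the given datasets.
    public static double aucFor(Dataset<Label> train, Dataset<Label> test, String positiveLabel) {
        // Logistic regression emits probabilities, unlike the KNN setup discussed above.
        Model<Label> model = new LogisticRegressionTrainer().train(train);

        // Evaluate on held-out data; the evaluation exposes AUC directly.
        LabelEvaluation evaluation = new LabelEvaluator().evaluate(model, test);
        System.out.println("Average AUCROC = " + evaluation.averageAUCROC(false));
        return evaluation.AUCROC(new Label(positiveLabel));
    }
}
```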
Dear Adam, I need your help here. After getting the FPR, TPR, and thresholds like this:

FPR: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.007352941176470588, 0.014705882352941176, 0.022058823529411766, 0.029411764705882353, 0.03676470588235294, 0.04411764705882353, 0.051470588235294115, 0.058823529411764705, 0.0661764705882353, 0.07352941176470588, 0.08088235294117647, 0.08823529411764706, 0.09558823529411764, 0.10294117647058823, 0.11764705882352941, 1.0]

TPR: [0.0, 0.9456521739130435, 0.9565217391304348, 0.967391304347826, 0.9782608695652174, 0.9891304347826086, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]

Threshold: [Infinity, 0.9999999999999585, 0.9999999999900508, 0.9999999999532103, 0.9999999993470261, 0.999999986279678, 0.999999082458228, 0.4672622616731544, 0.010474966475835753, 7.848048691768931E-4, 4.464634619108306E-4, 1.8357524563945583E-4, 7.697946270832445E-5, 1.5905677137563365E-5, 1.258714255136621E-7, 4.6428762209717544E-8, 1.3855807195706487E-8, 8.900923141832403E-9, 7.567814072544735E-9, 7.443858692758792E-9, 2.8687081675940852E-9, 1.3147063911388807E-12, 1.7464080806775956E-82]

when plotting FPR against TPR, how do I get the number of correctly classified points corresponding to the positive label in order to get the ROC?
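(A hedged sketch of where arrays like those come from, assuming LabelEvaluationUtil.ROC exposes fpr, tpr, and thresholds arrays as in the 4.3 javadoc linked above; the method and variable names here are illustrative only.)

```java
import org.tribuo.classification.evaluation.LabelEvaluationUtil;

// labels[i] is true when example i has the positive ground-truth label;
// scores[i] is the model's predicted probability of that positive label.
static void printRocPoints(boolean[] labels, double[] scores) {
    LabelEvaluationUtil.ROC roc = LabelEvaluationUtil.generateROCCurve(labels, scores);
    // The ROC plot is simply tpr against fpr; no separate count of correctly
    // classified points per threshold is needed to draw it.
    for (int i = 0; i < roc.fpr.length; i++) {
        System.out.printf("threshold=%.6g fpr=%.4f tpr=%.4f%n",
                roc.thresholds[i], roc.fpr[i], roc.tpr[i]);
    }
}
```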
That information isn't stored in the ROC class; the number of correctly classified points is stored in your
Dear Tribuo developers,
I am trying to get the TPR and the FPR in order to calculate the AUC and plot the ROC curve, but the results are sometimes unreasonable, since I have high accuracy yet low AUC. Furthermore, here is my code: