Most of this is self-explanatory, but two remarks on the terminology are worth making:
- The prediction determines whether an output is labeled positive or negative
- The match between the prediction and the actual condition determines whether that output is true or false; for example, a false positive is a positive prediction that does not match the actual condition (the sketch after this list makes the four resulting counts concrete)
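As a minimal sketch, here is how the four confusion-matrix counts fall out of those two remarks. The lists y_true and y_pred and their values are made-up toy data for illustration, not output from a real model:

```python
# Counting the four confusion-matrix cells for binary labels
# encoded as 0 (negative) and 1 (positive).
y_true = [1, 1, 1, 0, 0, 0, 1, 0]  # actual conditions (toy data)
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]  # model predictions (toy data)

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # predicted positive, actually positive
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # predicted positive, actually negative
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # predicted negative, actually positive
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # predicted negative, actually negative

print(tp, fp, fn, tn)  # 3 1 1 3
```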
Precision
Precision measures how reliably the model's positive predictions are correct: of all the outputs predicted positive, what fraction is actually positive? If every positive prediction is actually positive, precision is 100%.
Formula: \(\frac{TP}{TP+FP}\)
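As a minimal sketch, reusing the toy counts from the example above (tp = 3, fp = 1):

```python
tp, fp = 3, 1  # toy counts from the sketch above
precision = tp / (tp + fp)
print(precision)  # 0.75: 3 of the 4 positive predictions were actually positive
```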
Recall
Recall measures how well the model finds the actual positives: of all the actual positives, what fraction did the model identify? If the model finds every actual positive, recall is 100%.
Formula: \(\frac{TP}{TP+FN}\)
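Again as a sketch on the same toy counts (tp = 3, fn = 1):

```python
tp, fn = 3, 1  # toy counts from the sketch above
recall = tp / (tp + fn)
print(recall)  # 0.75: the model found 3 of the 4 actual positives
```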
True Positive Rate (TPR)
The true positive rate is the fraction of actual positives that are correctly classified as positive. It is identical to recall and is also known as sensitivity.
Formula: \(\frac{TP}{TP+FN}\)
False Positive Rate (FPR)
The false positive rate is the fraction of actual negatives that are incorrectly classified as positive. TPR and FPR are the two axes of a ROC curve.
Formula: \(\frac{FP}{FP+TN}\)
False Discovery Rate (FDR)
The false discovery rate is the fraction of positive predictions that are wrong; it equals \(1 - \text{precision}\).
Formula: \(\frac{FP}{FP+TP}\)
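All three rates computed on the same toy counts from the first sketch:

```python
tp, fp, fn, tn = 3, 1, 1, 3  # toy counts from the first sketch
tpr = tp / (tp + fn)  # 0.75, identical to recall
fpr = fp / (fp + tn)  # 0.25: 1 of the 4 actual negatives was flagged positive
fdr = fp / (fp + tp)  # 0.25, equal to 1 - precision
print(tpr, fpr, fdr)
```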
F1-Score
The F1-score is the harmonic mean of precision and recall, \(2 \cdot \frac{\text{precision} \cdot \text{recall}}{\text{precision} + \text{recall}}\), which combines both into a single number and simplifies to the count-based form below.
Formula: \(\frac{2TP}{2TP+FP+FN}\)
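A quick sanity check on the toy counts that the harmonic-mean form and the count-based form agree:

```python
tp, fp, fn = 3, 1, 1  # toy counts from the first sketch
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
print(f1)                           # 0.75
print(2 * tp / (2 * tp + fp + fn))  # 0.75: same value directly from the counts
```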