Precision and Recall

July 13, 2020

When evaluating machine learning models, accuracy is usually the go-to metric. However, it is not always the appropriate choice, especially for imbalanced datasets. Use the Precision metric when you need to reduce the number of False Positives, and use the Recall metric when you need to reduce the number of False Negatives. To compute each metric, use the following formulas:

  • Precision = True Positives / (True Positives + False Positives)
  • Recall = True Positives / (True Positives + False Negatives)
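The formulas above can be sketched in plain Python. This is a minimal example with made-up labels (1 = positive, 0 = negative); in practice you could use `precision_score` and `recall_score` from scikit-learn instead:

```python
# Hypothetical example labels: 1 = positive class, 0 = negative class
y_true = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0, 0, 0]

# Count True Positives, False Positives, and False Negatives
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

precision = tp / (tp + fp)  # 3 / (3 + 1) = 0.75
recall = tp / (tp + fn)     # 3 / (3 + 2) = 0.60

print(f"Precision: {precision:.2f}")
print(f"Recall: {recall:.2f}")
```

Note that of the four predicted positives, three are correct (precision 0.75), while the model found only three of the five actual positives (recall 0.60).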

Aim for higher scores on both metrics when evaluating your models, keeping in mind that improving one often comes at the expense of the other.