Explain confusion matrix, accuracy, and precision with an example.

1 Answer


Confusion Matrix:

Consider a binary classification problem where we aim to predict whether an email is spam or not spam. The confusion matrix for this scenario is structured as follows:


                 | Predicted Not Spam | Predicted Spam |
Actual Not Spam  |         TN         |       FP       |
Actual Spam      |         FN         |       TP       |


  • TN (True Negative): Emails correctly predicted as not spam.
  • FP (False Positive): Emails incorrectly predicted as spam (Type I error).
  • FN (False Negative): Emails incorrectly predicted as not spam (Type II error).
  • TP (True Positive): Emails correctly predicted as spam.
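The four cells can be counted directly from paired lists of actual and predicted labels. Here is a minimal sketch with made-up spam-filter results (1 = spam, 0 = not spam); the data values are illustrative, not from the answer:

```python
# Hypothetical spam-filter results: 1 = spam, 0 = not spam.
actual    = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1]
predicted = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]

# Count each cell of the confusion matrix.
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # Type I error
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # Type II error
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)

print(f"TN={tn} FP={fp} FN={fn} TP={tp}")  # TN=5 FP=1 FN=1 TP=3
```

With real data you would typically use `sklearn.metrics.confusion_matrix` instead of counting by hand, but the manual version shows exactly what each cell means.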


Accuracy:

Accuracy measures the overall correctness of the model: the ratio of correctly predicted instances to the total number of instances.

Accuracy = (TP + TN) / (TP + TN + FP + FN)


Precision:

Precision gauges the accuracy of positive predictions, answering the question: of the instances predicted as positive, how many are truly positive?

Precision = TP / (TP + FP)

Recall (Sensitivity or True Positive Rate):

Recall assesses the model's ability to capture all positive instances, answering: of all actual positive instances, how many were predicted correctly?

Recall = TP / (TP + FN)

F1 Score:

The F1 score is the harmonic mean of precision and recall, providing a single balanced measure of the two:

F1 = 2 × (Precision × Recall) / (Precision + Recall)

Together, these metrics offer a comprehensive assessment of a model's performance, which is especially important when the classes are imbalanced and accuracy alone can be misleading.
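All four metrics follow directly from the confusion-matrix counts. A minimal sketch, assuming illustrative counts (TP=3, TN=5, FP=1, FN=1) rather than real data:

```python
# Illustrative confusion-matrix counts (made-up numbers).
tp, tn, fp, fn = 3, 5, 1, 1

accuracy  = (tp + tn) / (tp + tn + fp + fn)                # overall correctness
precision = tp / (tp + fp)                                 # of predicted spam, how much was spam
recall    = tp / (tp + fn)                                 # of actual spam, how much was caught
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} f1={f1:.2f}")
# accuracy=0.80 precision=0.75 recall=0.75 f1=0.75
```

In practice `sklearn.metrics` provides `accuracy_score`, `precision_score`, `recall_score`, and `f1_score`, but the arithmetic above is all they compute in the binary case.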
