What is a confusion matrix?

A confusion matrix is a table used to measure the performance of a machine learning classification model (typically for supervised learning; in unsupervised learning it is usually called a matching matrix) where the output can be two or more classes. Each row of the confusion matrix represents the instances in a predicted class while each column represents the instances in an actual class, or vice versa.

A confusion matrix is also known as an error matrix.

In this article we will look at the various parameters of the confusion matrix and the information we can extract from it. The structure of a confusion matrix is shown in the figure below.

Figure: Structure of a confusion matrix

Now let’s understand what TP, FP, FN, and TN are.

Here we have two classes, Yes and No, and we define the following four counts (a short code sketch after this list shows how to read them off a confusion matrix):

  1. TP-True positive: You predicted the Yes class and its actual class is also Yes.
  2. TN-True negative: You predicted the No class and its actual class is also No.
  3. FP-False positive: You predicted the Yes class but it actually belongs to the No class. This is also called a Type I error.
  4. FN-False negative: You predicted the No class but it actually belongs to the Yes class. This is also called a Type II error.
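To make these four counts concrete, here is a minimal sketch that builds a confusion matrix with scikit-learn and reads TP, FN, FP, and TN out of it. The label lists are made-up illustrative data, not taken from this article.

```python
from sklearn.metrics import confusion_matrix

# Hypothetical actual and predicted labels for a Yes/No classifier
actual    = ["Yes", "Yes", "No", "Yes", "No", "No", "Yes", "No"]
predicted = ["Yes", "No",  "No", "Yes", "Yes", "No", "Yes", "No"]

# With labels=["Yes", "No"], rows are actual classes and columns are predictions:
# [[TP, FN],
#  [FP, TN]]
cm = confusion_matrix(actual, predicted, labels=["Yes", "No"])
tp, fn = cm[0]
fp, tn = cm[1]
print(f"TP={tp}, FN={fn}, FP={fp}, TN={tn}")
```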
So, what are the classification performance metrics that we can calculate from the confusion matrix? Let’s see.
 
By observing the confusion matrix we can calculate the Accuracy, Recall, Precision, and F1-score (or F-measure) of the classification model. Let’s understand them by taking an example confusion matrix.

Figure: Confusion matrix 1

Information we obtain from the above confusion matrix:

  1. There are altogether 165 data points (i.e. observations or objects) and they are classified into two classes, Yes and No.
  2. Our classification model predicted Yes 110 times and No 55 times, but according to the actual classification there are altogether 105 Yes’s and 60 No’s.
The confusion matrix with the above totals filled in is given below,
Figure: Confusion matrix 2
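As a quick sanity check, the counts behind this example (TP = 100, FN = 5, FP = 10, TN = 50, which appear in the calculations below) can be laid out with NumPy and the row and column totals verified:

```python
import numpy as np

# Counts from the worked example: rows = actual (Yes, No), columns = predicted (Yes, No)
cm = np.array([[100,  5],    # actual Yes: 100 predicted Yes (TP), 5 predicted No (FN)
               [ 10, 50]])   # actual No : 10 predicted Yes (FP), 50 predicted No (TN)

print("Total observations:", cm.sum())          # 165
print("Predicted Yes / No:", cm.sum(axis=0))    # [110  55]
print("Actual    Yes / No:", cm.sum(axis=1))    # [105  60]
```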

Now, let’s understand the above metrics in brief.

  • Accuracy: The accuracy of the classification can be obtained by using the formula below,

    Accuracy = (TP + TN) / (TP + TN + FP + FN)
On the basis of the above confusion matrix we can calculate the accuracy of the model as,
Accuracy = (100 + 50) / (100 + 5 + 10 + 50) = 150/165 ≈ 0.91
 
The error of the classification is given as,
Error = 1 - Accuracy = 1 - 0.91 ≈ 0.09
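The same accuracy and error arithmetic, as a minimal sketch using the example counts:

```python
tp, fn, fp, tn = 100, 5, 10, 50

accuracy = (tp + tn) / (tp + tn + fp + fn)   # (100 + 50) / 165
error = 1 - accuracy
print(round(accuracy, 2), round(error, 2))   # 0.91 0.09
```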
  • Precision: Precision tells us, out of all the instances predicted as a given class, how many actually belong to that class. It should be as high as possible. In other words, precision tells us how often the classifier is correct when it predicts a class. It is calculated by using the formula below,
    Precision = TP / (TP + FP)
On the basis of the above confusion matrix we can calculate the precision of the model as,
Precision = 100 / (100 + 10) ≈ 0.91
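The same computation in code, using the example counts:

```python
tp, fp = 100, 10

precision = tp / (tp + fp)   # 100 / 110
print(round(precision, 2))   # 0.91
```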
  • Recall: Recall tells us, when the actual class is Yes, how often our classifier predicts Yes. It can also be defined as, out of all the actual positive instances, how many our classifier predicted correctly. It is calculated by using the formula below,
    Recall = TP / (TP + FN)
On the basis of the above confusion matrix we can calculate the recall of the model as,
Recall = 100 / (100 + 5) ≈ 0.95
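And the corresponding recall, again as a one-line sketch:

```python
tp, fn = 100, 5

recall = tp / (tp + fn)      # 100 / 105
print(round(recall, 2))      # 0.95
```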
  • F-measure (F1-score): The F-measure (F1-score) is the harmonic mean of recall and precision. It is calculated by using the formula below,
    F1-score = (2 * Precision * Recall) / (Precision + Recall)
On the basis of the above confusion matrix we can calculate the F-measure of the model as,
F-measure = (2 * Recall * Precision) / (Recall + Precision)
          = (2 * 0.95 * 0.91) / (0.91 + 0.95)
          ≈ 0.93
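Finally, all four metrics can be cross-checked with scikit-learn. The label arrays below are reconstructed from the counts in the example matrix (TP = 100, FN = 5, FP = 10, TN = 50), not real data:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Reconstruct 165 label pairs from the example counts (1 = Yes, 0 = No)
actual    = np.array([1] * 105 + [0] * 60)                       # 105 actual Yes, 60 actual No
predicted = np.array([1] * 100 + [0] * 5 + [1] * 10 + [0] * 50)  # TP, FN, FP, TN in order

print(round(accuracy_score(actual, predicted), 2))    # 0.91
print(round(precision_score(actual, predicted), 2))   # 0.91
print(round(recall_score(actual, predicted), 2))      # 0.95
print(round(f1_score(actual, predicted), 2))          # 0.93
```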
