What Is A Confusion Matrix?


The most aptly named AI term is actually simple.

A Confusion Matrix is a table that shows how often an AI classifier gets its predictions right and how often it confuses one class for the other: true positives, true negatives, false positives, and false negatives. Here is a simple example of a Confusion Matrix for a model that classifies whether a fruit is an orange or not, evaluated on a sample of 166 fruits.
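If it helps to see the mechanics, the sketch below is a minimal Python example (not from the article) that tallies the four cells of a binary confusion matrix from a list of actual labels and a list of predictions; the handful of fruit labels in it are made up purely for illustration.

```python
# Minimal sketch: tally the four cells of a binary confusion matrix.
# The labels below are made-up illustration data, not the article's 166-fruit sample.
actual    = ["orange", "orange", "not orange", "orange", "not orange", "not orange"]
predicted = ["orange", "orange", "orange",     "orange", "not orange", "not orange"]

tp = fp = tn = fn = 0
for truth, guess in zip(actual, predicted):
    if guess == "orange":
        if truth == "orange":
            tp += 1   # predicted orange, actually orange (true positive)
        else:
            fp += 1   # predicted orange, actually not    (false positive)
    else:
        if truth == "orange":
            fn += 1   # predicted not orange, actually orange (false negative)
        else:
            tn += 1   # predicted not orange, actually not    (true negative)

# Rows are the actual class, columns are the predicted class.
print("                    predicted orange   predicted not orange")
print(f"actual orange              {tp}                    {fn}")
print(f"actual not orange          {fp}                    {tn}")
```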

How Well Did My Model Do?

As you can see from the table, the classifier was pretty accurate overall. It was correct (true positives plus true negatives) 155 times out of 166, or 93.37% of the time. It did very well at spotting fruits that were oranges: about 99% of the actual oranges were identified correctly, with only one miss (a false negative). It was not as good at recognizing fruits that were not oranges: 83.3% of the non-oranges were classified correctly, while 16.7% were mistakenly called oranges (false positives).
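The raw cell counts aren't restated in the prose, but counts of 105 true positives, 1 false negative, 10 false positives, and 50 true negatives are consistent with every figure quoted above (155 correct out of 166, one missed orange, 16.7% false positives among the non-oranges). Treating those counts as a back-calculated assumption, the arithmetic behind the percentages looks like this:

```python
# Assumed cell counts, back-calculated from the percentages quoted above;
# the article's original table may differ.
tp, fn = 105, 1    # actual oranges: 106
fp, tn = 10, 50    # actual non-oranges: 60

accuracy    = (tp + tn) / (tp + tn + fp + fn)   # 155 / 166
recall      = tp / (tp + fn)                    # how well it spots the actual oranges
specificity = tn / (tn + fp)                    # how well it spots the non-oranges

print(f"accuracy:    {accuracy:.2%}")     # 93.37%
print(f"recall:      {recall:.2%}")       # 99.06%
print(f"specificity: {specificity:.2%}")  # 83.33%
```

Recall and specificity are the standard names for the two per-class figures the article describes informally as "predicting when fruits were oranges" and "predicting when they were not."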

Confusion matrices are especially informative when the consequences of a false negative differ from those of a false positive in your use case, since a single overall accuracy number hides that distinction.