AI Blog

by Michele Laurelli

Confusion Matrix

/kənˈfjuːʒən ˈmeɪtrɪks/
Concept
Definition

A table used to evaluate classification model performance by showing true vs predicted classes.

The confusion matrix displays true positives, false positives, true negatives, and false negatives. It enables calculation of metrics like precision, recall, F1-score, and accuracy.
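As a minimal sketch of how these counts and metrics are computed for a binary classifier (the label convention 1 = positive, 0 = negative and the example data are assumptions, not from the original):

```python
def confusion_counts(y_true, y_pred):
    """Return (tp, fp, tn, fn) for binary labels 1 (positive) / 0 (negative)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, tn, fn

def metrics(y_true, y_pred):
    """Derive precision, recall, F1, and accuracy from the four counts."""
    tp, fp, tn, fn = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    accuracy = (tp + tn) / len(y_true)
    return {"precision": precision, "recall": recall,
            "f1": f1, "accuracy": accuracy}

# Hypothetical ground-truth and predicted labels for illustration
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(confusion_counts(y_true, y_pred))  # -> (3, 1, 3, 1)
print(metrics(y_true, y_pred))
```

Here precision = 3/4, recall = 3/4, and accuracy = 6/8; libraries such as scikit-learn provide the same calculations, but the hand-rolled version makes the mapping from matrix cells to metrics explicit.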

Examples

1. Binary classification evaluation
2. Multi-class performance analysis
3. Model error analysis

Michele Laurelli - AI Research & Engineering