by Michele Laurelli
A table used to evaluate a classification model's performance by comparing true classes against predicted classes.
In the binary case, the confusion matrix tabulates true positives, false positives, true negatives, and false negatives; each row typically corresponds to an actual class and each column to a predicted class, though the convention varies by library. From these counts one can derive metrics such as precision, recall, F1-score, and accuracy.
Binary classification evaluation
Multi-class performance analysis
Model error analysis
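The relationship between the four cell counts and the derived metrics can be sketched in plain Python. The function name, the example labels, and the choice of `1` as the positive class are illustrative assumptions, not part of any particular library's API:

```python
def confusion_counts(y_true, y_pred, positive=1):
    # Tally the four cells of a binary confusion matrix.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp, fp, tn, fn

# Illustrative labels: 8 samples, positive class = 1.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

tp, fp, tn, fn = confusion_counts(y_true, y_pred)

# Metrics derived from the four counts.
precision = tp / (tp + fp)              # of predicted positives, how many were right
recall = tp / (tp + fn)                 # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)
accuracy = (tp + tn) / len(y_true)
```

In practice a library routine such as `sklearn.metrics.confusion_matrix` would produce the same counts; the hand-rolled version above only serves to make the arithmetic behind precision, recall, F1, and accuracy explicit.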