AI Blog

by Michele Laurelli

Overfitting

/ˌoʊvərˈfɪtɪŋ/
Concept
Definition

When a model learns training data too well, including noise, resulting in poor generalization to new data.

Overfitting occurs when a model is too complex relative to the size of its training set. Common remedies include regularization, dropout, early stopping, and data augmentation. The goal is to strike a balance between underfitting and overfitting.
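One of the remedies above, early stopping, can be sketched in a few lines: halt training once validation loss stops improving, rather than letting the model keep fitting noise. This is a minimal illustration with an assumed `patience` parameter, not any specific framework's API:

```python
def train_with_early_stopping(val_losses, patience=3):
    """Return the epoch at which training should stop.

    val_losses: validation loss per epoch (here supplied as a plain list
    for illustration; in practice it is computed during training).
    """
    best = float("inf")
    epochs_since_best = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, epochs_since_best = loss, 0
        else:
            epochs_since_best += 1
        if epochs_since_best >= patience:
            return epoch  # validation loss has risen for `patience` epochs
    return len(val_losses) - 1

# Validation loss bottoms out at epoch 2, then creeps upward:
print(train_with_early_stopping([1.0, 0.8, 0.7, 0.72, 0.75, 0.74, 0.9]))  # → 5
```

Stopping at epoch 5 keeps the checkpoint near the validation minimum instead of the final, overfit one.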

Examples

1. A decision tree that memorizes every training example
2. A neural network with too many parameters for the available data
3. A model scoring 100% on training data but only 60% on test data

Michele Laurelli - AI Research & Engineering