by Michele Laurelli
A model overfits when it learns the training data too well, including its noise, and as a result generalizes poorly to new data.
Overfitting typically arises when a model is too complex relative to the size of the training set. Common remedies include regularization, dropout, early stopping, and data augmentation. Striking the right balance between underfitting and overfitting is crucial.
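A minimal sketch of early stopping, one of the remedies listed above: training halts once the validation loss stops improving, so the model cannot keep fitting noise. The toy data, polynomial degree, learning rate, and patience value are all illustrative assumptions, not from the source.

```python
import random

random.seed(0)

# Toy data: y = 2x + noise; a small training set makes overfitting easy.
def make_data(n):
    xs = [random.uniform(-1, 1) for _ in range(n)]
    return [(x, 2 * x + random.gauss(0, 0.3)) for x in xs]

train, val = make_data(20), make_data(20)

# A deliberately over-flexible model: a degree-9 polynomial.
DEGREE = 9
w = [0.0] * (DEGREE + 1)

def predict(w, x):
    return sum(c * x ** i for i, c in enumerate(w))

def loss(w, data):
    return sum((predict(w, x) - y) ** 2 for x, y in data) / len(data)

best_val, best_w, patience = float("inf"), list(w), 0
for epoch in range(2000):
    # One gradient-descent step on the training loss.
    grads = [0.0] * len(w)
    for x, y in train:
        err = predict(w, x) - y
        for i in range(len(w)):
            grads[i] += 2 * err * x ** i / len(train)
    w = [c - 0.05 * g for c, g in zip(w, grads)]

    # Early stopping: keep the weights with the best validation loss,
    # and stop once it has not improved for 50 consecutive epochs.
    v = loss(w, val)
    if v < best_val - 1e-6:
        best_val, best_w, patience = v, list(w), 0
    else:
        patience += 1
        if patience >= 50:
            break

print(f"train loss {loss(best_w, train):.3f}, val loss {best_val:.3f}")
```

The same idea appears in deep-learning frameworks as an early-stopping callback monitoring validation loss with a patience parameter.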
Examples:
- A decision tree that memorizes every training example
- A neural network with too many parameters for the available data
- A model scoring 100% on the training data but only 60% on the test data
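The gap in the last example (perfect training accuracy, much weaker test accuracy) can be reproduced with a toy "memorizing" model. The lookup-table classifier and the noisy dataset below are hypothetical illustrations, analogous to a decision tree grown until every leaf holds a single training point.

```python
import random

random.seed(1)

# Toy labels: class 1 if x > 0, but 20% of labels are flipped (noise).
def make_data(n):
    data = []
    for _ in range(n):
        x = random.uniform(-1, 1)
        y = int(x > 0)
        if random.random() < 0.2:
            y = 1 - y  # label noise
        data.append((x, y))
    return data

train, test = make_data(50), make_data(50)

# "Memorizing" model: a lookup table of exact training examples.
table = dict(train)

def memorizer(x):
    if x in table:
        return table[x]  # recall the memorized (possibly noisy) label
    # Fall back to the nearest memorized point for unseen inputs.
    nearest = min(table, key=lambda t: abs(t - x))
    return table[nearest]

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

print(f"train accuracy: {accuracy(memorizer, train):.0%}")  # 100%: labels recalled verbatim
print(f"test accuracy:  {accuracy(memorizer, test):.0%}")   # noticeably lower
```

Because the table reproduces every noisy training label exactly, training accuracy is perfect, while the memorized noise drags test accuracy well below it.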