AI Blog

by Michele Laurelli

Early Stopping

/ˈɜːrli ˈstɒpɪŋ/
Technique
Definition

A regularization technique that stops training when validation performance stops improving.

Early stopping monitors validation loss during training and stops when it hasn't improved for a set number of epochs (patience). This prevents overfitting by stopping before the model memorizes training data.

Examples

1. Stop training at epoch 50 when validation loss plateaus.
2. Use a patience of 10 epochs before stopping.
3. Restore the best weights after stopping.
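The patience mechanism above can be sketched in plain Python. This is a minimal, framework-free illustration: the validation-loss values are hypothetical, and in practice the loop would also save and reload model weights at the best epoch.

```python
# Hypothetical validation-loss sequence: improves, then plateaus.
val_losses = [0.90, 0.75, 0.62, 0.55, 0.54, 0.56, 0.55, 0.57, 0.56, 0.58]

patience = 3          # epochs to wait for improvement before stopping
best_loss = float("inf")
best_epoch = 0        # epoch whose weights we would restore
wait = 0              # epochs elapsed since the last improvement

for epoch, loss in enumerate(val_losses):
    if loss < best_loss:
        # Validation improved: record it and reset the patience counter.
        best_loss = loss
        best_epoch = epoch
        wait = 0
    else:
        # No improvement this epoch.
        wait += 1
        if wait >= patience:
            print(f"Stopping at epoch {epoch}; "
                  f"restoring weights from epoch {best_epoch}")
            break
```

With this sequence, the loss bottoms out at 0.54 in epoch 4, patience runs out three epochs later, and training stops at epoch 7 with the epoch-4 weights marked for restoration.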

Michele Laurelli - AI Research & Engineering