AI Blog

by Michele Laurelli

Epoch

/ˈɛpɒk/
Training
Definition

One complete pass through the entire training dataset during model training.

Multiple epochs allow the model to learn patterns iteratively. Too few epochs lead to underfitting; too many lead to overfitting. Validation loss is typically monitored across epochs so that training can be stopped early once it stops improving.
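The idea above can be sketched as a minimal epoch-based training loop with early stopping. This is an illustrative toy, not a recipe from the glossary: the one-weight linear model, the synthetic data, and all names (`train`, `patience`, `lr`) are hypothetical.

```python
# Minimal sketch: epochs + early stopping on a toy 1-D linear model y = w * x.
# All names and hyperparameters here are illustrative assumptions.

def train(train_data, val_data, max_epochs=100, patience=5, lr=0.01):
    w = 0.0                      # single weight of the toy model
    best_val_loss = float("inf")
    best_w = w
    epochs_without_improvement = 0

    for epoch in range(1, max_epochs + 1):
        # One epoch = one complete pass through the entire training set.
        for x, y in train_data:
            grad = 2 * (w * x - y) * x   # gradient of squared error
            w -= lr * grad

        # Monitor validation loss at the end of each epoch.
        val_loss = sum((w * x - y) ** 2 for x, y in val_data) / len(val_data)
        if val_loss < best_val_loss - 1e-9:
            best_val_loss, best_w = val_loss, w
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # early stopping: validation loss stopped improving

    return best_w, epoch

# Synthetic data drawn from y = 3x; the loop converges toward w = 3
# and may stop before max_epochs once validation loss plateaus.
train_set = [(x, 3 * x) for x in (0.5, 1.0, 1.5, 2.0)]
val_set = [(x, 3 * x) for x in (0.75, 1.25)]
w, stopped_at = train(train_set, val_set)
```

Returning the best weight seen (rather than the last one) mirrors the common practice of restoring the checkpoint with the lowest validation loss when early stopping fires.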

Examples

1. Training for 100 epochs
2. Early stopping at epoch 45
3. Learning curves over epochs

Michele Laurelli - AI Research & Engineering