AI Blog

by Michele Laurelli

Learning Rate Schedule

/ˈlɜːrnɪŋ reɪt ˈʃɛdjuːl/
Technique
Definition

A strategy for adjusting the learning rate during training to improve convergence and performance.

Most learning rate schedules reduce the learning rate as training progresses; warm restarts instead periodically reset it to a higher value. Common strategies include step decay, exponential decay, cosine annealing, and warm restarts. A smaller learning rate late in training lets the optimizer make finer adjustments to the weights as it approaches a minimum.

Examples

1. Reduce LR on plateau
2. Cosine annealing
3. Warm-up + decay schedule

Michele Laurelli - AI Research & Engineering