by Michele Laurelli
A strategy for adjusting the learning rate during training to improve convergence and performance.
Learning rate schedules reduce the learning rate over time according to a predefined rule. Common strategies include step decay, exponential decay, cosine annealing, and warm restarts. Lowering the rate as training progresses lets the optimizer take large steps early and then fine-tune weights with smaller updates later.
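Two of the simplest strategies mentioned above, step decay and exponential decay, can be sketched as plain functions of the epoch number (function names and default constants here are illustrative, not from any specific library):

```python
import math

def step_decay(lr0, epoch, drop=0.5, epochs_per_drop=10):
    """Multiply the base rate by `drop` every `epochs_per_drop` epochs."""
    return lr0 * drop ** (epoch // epochs_per_drop)

def exponential_decay(lr0, epoch, k=0.05):
    """Smoothly decay the rate: lr = lr0 * exp(-k * epoch)."""
    return lr0 * math.exp(-k * epoch)
```

Step decay produces a staircase (the rate halves every ten epochs with these defaults), while exponential decay shrinks the rate continuously at every epoch.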
Reduce LR on plateau
Cosine annealing
Warm-up + decay schedule
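The three variants listed above can be sketched in self-contained Python; the class and parameter names are illustrative rather than taken from any particular framework:

```python
import math

class PlateauReducer:
    """Reduce-on-plateau: cut the LR when the monitored loss stops improving."""
    def __init__(self, lr=1e-3, factor=0.5, patience=3):
        self.lr, self.factor, self.patience = lr, factor, patience
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, loss):
        if loss < self.best:           # improvement: reset the counter
            self.best = loss
            self.bad_epochs = 0
        else:                          # stagnation: count it, reduce after `patience`
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor
                self.bad_epochs = 0
        return self.lr

def warmup_cosine(step, total_steps, base_lr=1e-3, warmup_steps=100, min_lr=1e-5):
    """Linear warm-up to base_lr, then cosine annealing down to min_lr."""
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

`warmup_cosine` combines the last two ideas in one function: the rate ramps up linearly for the first `warmup_steps` steps (which helps stabilize early training), then follows a half cosine from `base_lr` down to `min_lr` over the remaining steps.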