AI Blog

by Michele Laurelli

Optimization

Definition

The process of adjusting model parameters to minimize the loss function and improve performance.

Optimization uses algorithms like gradient descent, Adam, and RMSprop to update model weights. The goal is to find optimal parameters that minimize prediction error.
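The core update rule behind all of these algorithms can be seen in plain gradient descent. Below is a minimal sketch on a toy quadratic loss; the loss function, learning rate, and step count are illustrative choices, not anything specific to this post.

```python
# Minimal sketch of gradient descent on a toy loss L(w) = (w - 3)^2,
# whose minimum is at w = 3. Hyperparameters are illustrative.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    # Analytic gradient of the toy loss: dL/dw = 2 * (w - 3)
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # update rule: w <- w - lr * dL/dw
    return w

w_opt = gradient_descent(w0=0.0)
print(round(w_opt, 4))  # converges toward the minimizer w = 3
```

Adam and RMSprop refine this same rule by adapting the step size per parameter from running statistics of past gradients.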

Examples

1. Adam optimizer for fast training
2. SGD with momentum
3. Learning rate scheduling
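Two of these examples, SGD with momentum and learning rate scheduling, can be combined in a few lines. The sketch below uses the same toy quadratic loss; the momentum coefficient, decay factor, and schedule interval are illustrative assumptions.

```python
# Sketch of SGD with momentum plus a step learning-rate schedule,
# on the toy loss L(w) = (w - 3)^2. Hyperparameters are illustrative.

def grad(w):
    # Analytic gradient of the toy loss: dL/dw = 2 * (w - 3)
    return 2.0 * (w - 3.0)

def sgd_momentum_with_schedule(w0, lr0=0.1, momentum=0.9,
                               decay=0.5, decay_every=50, steps=200):
    w, velocity = w0, 0.0
    for t in range(steps):
        # Step schedule: halve the learning rate every `decay_every` steps.
        lr = lr0 * (decay ** (t // decay_every))
        # Momentum accumulates an exponentially weighted sum of past gradients.
        velocity = momentum * velocity - lr * grad(w)
        w += velocity
    return w

w_opt = sgd_momentum_with_schedule(0.0)
print(round(w_opt, 3))
```

Momentum damps oscillations and speeds progress along consistent gradient directions, while the decaying learning rate lets the iterate settle close to the minimum instead of overshooting it.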

Michele Laurelli - AI Research & Engineering