AI Blog

by Michele Laurelli

Gradient Descent

/ˈɡreɪdiənt dɪˈsɛnt/
Algorithm
Definition

An optimization algorithm that iteratively adjusts parameters to minimize a loss function by following the gradient.

Gradient descent updates parameters in the direction opposite to the gradient of the loss. Variants include batch, mini-batch, and stochastic gradient descent (SGD), which differ in how much data is used per update; the learning rate controls the step size.
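The update rule can be sketched in a few lines of plain Python. This is a minimal illustration, not a production optimizer: the function name, the toy objective f(w) = (w − 3)², and the chosen learning rate are all assumptions for the example.

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a 1-D function by stepping opposite to its gradient."""
    w = w0
    for _ in range(steps):
        # Core update: move against the gradient, scaled by the learning rate.
        w = w - lr * grad(w)
    return w

# Toy example (assumption for illustration): minimize f(w) = (w - 3)^2,
# whose gradient is 2 * (w - 3). The minimum is at w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```

With a learning rate of 0.1 the iterate contracts toward the minimum geometrically, so after 100 steps `w_star` is essentially 3. A learning rate that is too large would overshoot and diverge, which is why step size matters.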

Examples

1. Training neural network weights
2. Optimizing linear regression
3. Fine-tuning model parameters

Michele Laurelli - AI Research & Engineering