AI Blog

by Michele Laurelli

Batch Gradient Descent

/bætʃ ˈɡreɪdiənt dɪˈsɛnt/
Algorithm
Definition

A gradient descent variant that computes gradients using the entire training dataset in each iteration.

Batch GD provides stable, low-variance gradient estimates, since every update uses the full dataset, but each iteration is computationally expensive for large datasets. For convex problems with a suitably chosen learning rate, it converges to the global minimum.
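As a minimal sketch of the idea, the following NumPy snippet fits a linear regression by computing the mean-squared-error gradient over the entire dataset at every step; the function name and hyperparameters are illustrative, not a standard API.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.5, epochs=1000):
    """Illustrative batch GD for linear least squares: every update
    uses the gradient computed over the ENTIRE dataset."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        residual = X @ w - y                # predictions minus targets, all samples
        grad = (2.0 / n) * (X.T @ residual) # full-batch MSE gradient
        w -= lr * grad                      # one deterministic update per epoch
    return w

# Toy data: y = 1 + 3*x, with the bias folded in as a column of ones
X = np.column_stack([np.ones(50), np.linspace(0.0, 1.0, 50)])
y = X @ np.array([1.0, 3.0])
w = batch_gradient_descent(X, y)
```

Because the loss is convex and the gradient is exact (no sampling noise), the trajectory is deterministic: rerunning the code yields the identical sequence of iterates, which is what makes batch GD convenient for theoretical analysis.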

Examples

1. Full batch training on small datasets

2. Theoretical analysis

3. Deterministic optimization

Michele Laurelli - AI Research & Engineering