AI Blog

by Michele Laurelli

Backpropagation

/ˌbækprɒpəˈɡeɪʃən/
Training
Definition

An algorithm for training neural networks by computing the gradient of the loss function with respect to each weight.

Backpropagation uses the chain rule to efficiently compute gradients layer by layer, from output to input. These gradients guide weight updates during training through gradient descent optimization.
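The layer-by-layer computation described above can be sketched for a one-hidden-layer network. This is a minimal illustration on a toy regression task, not a production implementation; all variable names and hyperparameters (learning rate, layer sizes) are illustrative choices.

```python
import numpy as np

# Toy data: 64 samples, 3 features; target is the sum of the features.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = X.sum(axis=1, keepdims=True)

# Parameters of a one-hidden-layer network (sizes chosen for illustration).
W1 = rng.normal(scale=0.1, size=(3, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.1, size=(8, 1))
b2 = np.zeros((1, 1))
lr = 0.1

losses = []
for step in range(200):
    # Forward pass: cache intermediate activations for the backward pass.
    z1 = X @ W1 + b1
    a1 = np.tanh(z1)                     # hidden activation
    y_hat = a1 @ W2 + b2                 # linear output
    loss = np.mean((y_hat - y) ** 2)     # mean squared error
    losses.append(loss)

    # Backward pass: chain rule applied layer by layer, output to input.
    n = X.shape[0]
    d_yhat = 2 * (y_hat - y) / n         # dL/dy_hat
    dW2 = a1.T @ d_yhat                  # dL/dW2
    db2 = d_yhat.sum(axis=0, keepdims=True)
    d_a1 = d_yhat @ W2.T                 # propagate gradient to hidden layer
    d_z1 = d_a1 * (1 - a1 ** 2)          # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_z1                     # dL/dW1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent update.
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The caching of forward-pass activations (`a1`) is what makes the backward pass efficient: each layer's gradient reuses quantities already computed, rather than re-deriving them from scratch.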

Examples

1. Training deep neural networks
2. Optimizing convolutional layers
3. Fine-tuning language models

Michele Laurelli - AI Research & Engineering