AI Blog

by Michele Laurelli

Batch Normalization

Technique
Definition

Normalizes layer inputs using batch statistics to stabilize and accelerate training.

Normalizes each feature using the batch mean and variance, then scales and shifts the result with learnable parameters (gamma and beta). This reduces internal covariate shift and allows higher learning rates.
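A minimal sketch of the training-time forward pass described above, using NumPy (function and variable names are illustrative, not from any particular library):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalize x of shape (batch, features), training mode."""
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize to ~zero mean, unit variance
    return gamma * x_hat + beta              # learnable scale and shift

# Toy batch: 4 samples, 3 features, deliberately far from zero mean / unit variance
x = 10.0 * np.random.randn(4, 3) + 5.0
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
```

With gamma = 1 and beta = 0, each feature of `y` has zero mean and (approximately) unit variance across the batch; at inference time, frameworks replace the batch statistics with running averages collected during training.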

Examples

1. After conv layers
2. Before activation
3. Stabilizing deep networks

Michele Laurelli - AI Research & Engineering