AI Blog

by Michele Laurelli

Adam Optimizer

/ˈædəm/
Algorithm
Definition

An adaptive learning-rate optimization algorithm that combines momentum with RMSprop-style per-parameter scaling.

Adam (Adaptive Moment Estimation) computes an adaptive learning rate for each parameter from running estimates of the first and second moments of the gradient. It is the default optimizer for many deep learning applications due to its robustness.
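To make the two moment estimates concrete, here is a minimal NumPy sketch of a single Adam step. The function name `adam_step` is hypothetical, and the default hyperparameters (`lr=1e-3`, `beta1=0.9`, `beta2=0.999`, `eps=1e-8`) are the common conventions from the original paper, not values prescribed by this entry.

```python
# Minimal sketch of one Adam update (Kingma & Ba, 2015).
# Hyperparameter defaults follow common convention; adam_step is
# an illustrative name, not a framework API.
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array.

    m, v -- running first and second moment estimates (same shape as param)
    t    -- 1-based step count, used for bias correction
    """
    m = beta1 * m + (1 - beta1) * grad         # first moment (momentum term)
    v = beta2 * v + (1 - beta2) * grad ** 2    # second moment (RMSprop-style term)
    m_hat = m / (1 - beta1 ** t)               # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)               # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter step
    return param, m, v
```

In practice you would call a framework implementation such as PyTorch's `torch.optim.Adam`; the sketch above simply makes the bias-corrected moment estimates explicit.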

Examples

1. Training transformers
2. Deep neural networks
3. Default optimizer in many frameworks

Michele Laurelli - AI Research & Engineering