AI Blog

by Michele Laurelli

Dropout

Technique
Definition

A regularization technique that randomly deactivates neurons during training to prevent overfitting.

During each training iteration, dropout temporarily deactivates a random fraction of neurons (the dropout rate). Because a different subset of units is silenced at every step, the network cannot rely on any single neuron and is forced to learn more robust, redundant representations. At inference time all neurons are kept active, and activations are scaled so their expected magnitude matches what the network saw during training.
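The mechanism can be sketched in a few lines of NumPy. This is the common "inverted dropout" variant, which scales surviving activations during training so that nothing needs to change at inference; the function name and signature are illustrative, not from any particular library.

```python
import numpy as np

def dropout(x, rate=0.5, training=True, rng=None):
    """Inverted dropout sketch: zero out a random fraction `rate` of
    units and scale the survivors by 1/(1 - rate), so the expected
    activation is unchanged and inference needs no adjustment."""
    if not training or rate == 0.0:
        return x  # dropout is a training-only operation
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # keep each unit with prob 1 - rate
    return x * mask / (1.0 - rate)
```

With `rate=0.5`, roughly half the units are zeroed on each call and the rest are doubled, so the expected value of every activation stays the same.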

Examples

1. Dropout with rate 0.5 in dense layers
2. Reducing overfitting in deep networks
3. Improving generalization
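The first example above, dropout with rate 0.5 between dense layers, is how the technique most often appears in practice. A minimal sketch using PyTorch (the layer sizes here are arbitrary, chosen only for illustration):

```python
import torch
import torch.nn as nn

# A small MLP with dropout (rate 0.5) after the hidden dense layer.
# nn.Dropout is active only in train() mode; eval() disables it.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # randomly zeroes 50% of activations during training
    nn.Linear(256, 10),
)

model.train()  # dropout active: each forward pass uses a different mask
model.eval()   # dropout disabled: full network used for inference
```

Note that the framework handles the train/inference distinction automatically: calling `model.eval()` before validation or deployment is all that is needed to turn dropout off.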

Michele Laurelli - AI Research & Engineering