AI Blog

by Michele Laurelli

Hyperparameter

/ˌhaɪpərpəˈræmɪtər/
Concept
Definition

A configuration variable that is set before training and controls the learning process.

Hyperparameters are not learned from the data; they are set by the practitioner before training begins. Examples include the learning rate, batch size, number of layers, and dropout rate. Tuning them well is crucial for good model performance.

Examples

1. Learning rate = 0.001
2. Batch size = 32
3. Number of hidden layers = 5
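The distinction between hyperparameters and learned parameters can be sketched with a toy example. Below, the learning rate and step count are hyperparameters fixed before training, while the weight `w` is learned by gradient descent on a simple loss; all names and values are illustrative, not from the post.

```python
# Hyperparameters: chosen by the practitioner before training starts.
hyperparameters = {
    "learning_rate": 0.1,  # step size for each gradient update
    "num_steps": 100,      # how many update steps to run
}

def train(lr: float, num_steps: int) -> float:
    """Minimize the toy loss (w - 3)^2 with gradient descent."""
    w = 0.0                   # learned parameter, initialized to 0
    for _ in range(num_steps):
        grad = 2 * (w - 3)    # derivative of (w - 3)^2 with respect to w
        w -= lr * grad        # update size is controlled by the learning rate
    return w

w_final = train(hyperparameters["learning_rate"], hyperparameters["num_steps"])
print(round(w_final, 4))  # converges near the optimum w = 3
```

Changing only the hyperparameters changes how (or whether) training succeeds: a much larger learning rate here would make the updates overshoot and diverge, which is why tuning matters.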

Michele Laurelli - AI Research & Engineering