by Michele Laurelli
A configuration variable that is set before training and controls the learning process.
Hyperparameters are not learned from the data; they are set by the practitioner before training begins. Examples include the learning rate, batch size, number of layers, and dropout rate. Tuning them, for instance via grid search, random search, or Bayesian optimization, is crucial for good model performance.
Example values:
Learning rate = 0.001
Batch size = 32
Number of hidden layers = 5
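The distinction above can be sketched in plain Python: hyperparameters are fixed constants chosen before training, while model parameters are updated by the training loop itself. This is a minimal illustrative example (not from the source); the names `LEARNING_RATE`, `BATCH_SIZE`, and `NUM_EPOCHS`, and the tiny linear model, are assumptions chosen to keep it self-contained.

```python
# Hyperparameters: chosen by the practitioner BEFORE training, not learned.
LEARNING_RATE = 0.001   # scales the size of each parameter update
BATCH_SIZE = 32         # number of examples per gradient step
NUM_EPOCHS = 100        # number of full passes over the dataset

def train(data):
    # Model parameters: learned FROM the data via gradient descent.
    w, b = 0.0, 0.0
    for _ in range(NUM_EPOCHS):
        # BATCH_SIZE controls how the data is chunked into mini-batches.
        for i in range(0, len(data), BATCH_SIZE):
            batch = data[i:i + BATCH_SIZE]
            # Gradients of mean squared error for the model y = w*x + b.
            grad_w = sum(2 * (w * x + b - y) * x for x, y in batch) / len(batch)
            grad_b = sum(2 * (w * x + b - y) for x, y in batch) / len(batch)
            # LEARNING_RATE determines how far each update moves w and b.
            w -= LEARNING_RATE * grad_w
            b -= LEARNING_RATE * grad_b
    return w, b

# Tiny synthetic dataset for the relation y = 2x (inputs scaled to [0, 1)).
data = [(i / 64, 2.0 * (i / 64)) for i in range(64)]
w, b = train(data)
```

Changing only the hyperparameters (for example, a larger `LEARNING_RATE` or more epochs) changes how quickly and how well `w` approaches its target value of 2, without changing the model or the data.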