by Michele Laurelli
A layer that applies a non-linear activation function element-wise to its input.
Activation layers introduce non-linearity, enabling networks to learn complex patterns; without them, a stack of linear layers would collapse into a single linear transformation. They are typically placed after linear transformations such as convolution or dense layers. Common choices include ReLU, sigmoid, and tanh.
Examples (see the sketch after this list):
- ReLU activation layer (common in hidden layers of CNNs and MLPs)
- Sigmoid output layer (squashes outputs to (0, 1), e.g. for binary classification probabilities)
- Tanh hidden layer (zero-centered outputs in (-1, 1))
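As a minimal sketch of the idea (not from the original entry), the NumPy code below defines the three activations as element-wise functions and applies one after a dense (linear) layer; the function names, shapes, and random data are illustrative assumptions.

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): passes positives through, zeroes out negatives.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Element-wise squashing to (0, 1); common for probability outputs.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Element-wise squashing to (-1, 1); zero-centered.
    return np.tanh(x)

# A linear transformation followed by an activation layer
# (illustrative shapes: batch of 4 inputs with 3 features, dense layer with 2 units).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))   # input batch
W = rng.standard_normal((3, 2))   # dense-layer weights
b = np.zeros(2)                   # dense-layer biases

z = x @ W + b                     # linear part (dense layer)
h = relu(z)                       # activation applied element-wise
print(h)
```

Because the activation is applied independently to each element, the output keeps the same shape as its input; only the values are transformed non-linearly.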