AI Blog

by Michele Laurelli

Activation Function

/ˌæktɪˈveɪʃən ˈfʌŋkʃən/
Concept
Definition

A mathematical function applied to a neuron's output to introduce non-linearity into the network.

Activation functions enable neural networks to learn complex patterns: without them, any stack of linear layers collapses into a single linear transformation. Common functions include ReLU, sigmoid, tanh, and softmax. The activation function determines how strongly a neuron fires for a given input.

Examples

1. ReLU for hidden layers
2. Sigmoid for binary classification
3. Softmax for multi-class classification
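The four functions named above can be sketched in a few lines of NumPy; this is a minimal illustration, not a production implementation (frameworks such as PyTorch or TensorFlow provide optimized versions):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) elementwise; common default for hidden layers
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes input into (0, 1); used for binary classification
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Tanh: squashes input into (-1, 1); zero-centered variant of sigmoid
    return np.tanh(x)

def softmax(x):
    # Softmax: turns a vector of logits into a probability distribution
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))     # negative inputs are clipped to 0
print(sigmoid(x))  # each value mapped into (0, 1)
print(softmax(x))  # values sum to 1
```

Note the max-subtraction trick in `softmax`: it leaves the result unchanged mathematically but prevents overflow in `np.exp` for large logits.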

Michele Laurelli - AI Research & Engineering