AI Blog

by Michele Laurelli

Self-Supervised Learning

Definition

A learning paradigm in which the model creates its own supervision signal from unlabeled data.

The model predicts parts of the input from other parts (masked language modeling, image inpainting, next-frame prediction). This enables pre-training on massive unlabeled datasets and is the foundation of BERT and GPT.
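The idea of deriving supervision from the data itself can be sketched in a few lines. This is a minimal illustration, not any library's API: `make_masked_example` is a hypothetical helper that masks one token of an unlabeled sentence and uses the original token as the training target, the way masked language modeling does.

```python
import random

def make_masked_example(tokens, mask_token="[MASK]", seed=0):
    """Turn an unlabeled token list into a (masked input, position, target)
    training example. The label is not annotated by a human: it is the
    original token at the masked position, recovered from the data itself."""
    rng = random.Random(seed)
    pos = rng.randrange(len(tokens))
    masked = list(tokens)
    target = masked[pos]          # supervision signal comes for free
    masked[pos] = mask_token
    return masked, pos, target

tokens = "the cat sat on the mat".split()
masked, pos, target = make_masked_example(tokens)
# `masked` has one [MASK]; the model's job is to predict `target` at `pos`.
```

Next-token prediction (GPT-style) is the same trick with a different slicing: the input is `tokens[:-1]` and the targets are `tokens[1:]`, so every position in an unlabeled sequence yields a labeled example.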

Examples

1. BERT: masked language modeling
2. GPT: next-token prediction
3. Image rotation prediction
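The third example works the same way in the image domain: rotate an unlabeled image by a random multiple of 90 degrees and ask the model to predict how many rotations were applied. A minimal sketch, using a plain 2D list in place of a real image and a hypothetical `make_rotation_example` helper:

```python
def rotate90(img):
    """Rotate a 2D grid 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def make_rotation_example(img, k):
    """Self-supervised example: the input is the image rotated k times,
    and the label k is generated for free from the transformation itself."""
    out = img
    for _ in range(k % 4):
        out = rotate90(out)
    return out, k % 4

img = [[1, 2],
       [3, 4]]
x, y = make_rotation_example(img, 1)
# x == [[3, 1], [4, 2]] and y == 1: no human labeling required.
```

A classifier trained to predict `y` from `x` must learn about object orientation and structure, which is exactly the kind of representation that transfers to downstream tasks.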

Michele Laurelli - AI Research & Engineering