AI Blog

by Michele Laurelli

Positional Encoding

/pəˈzɪʃənəl ɪnˈkoʊdɪŋ/
Technique
Definition

A technique for injecting position information into transformer inputs, needed because the attention mechanism has no inherent notion of sequence order.

Positional encodings add a unique pattern to each position in a sequence, allowing the model to distinguish token order. Common methods use fixed sine/cosine functions or learned embeddings.
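As a minimal sketch of the sinusoidal variant, assuming NumPy and an even model dimension (the function name and parameters here are illustrative):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position encodings.

    Even dimensions use sine, odd dimensions use cosine, with wavelengths
    forming a geometric progression, as in "Attention Is All You Need".
    """
    positions = np.arange(seq_len)[:, np.newaxis]           # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]          # (1, d_model/2)
    angle_rates = 1.0 / np.power(10000.0, dims / d_model)   # per-dimension frequency
    angles = positions * angle_rates                        # (seq_len, d_model/2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even indices: sine
    pe[:, 1::2] = np.cos(angles)  # odd indices: cosine
    return pe

# Each row is added to the token embedding at the matching position.
pe = sinusoidal_positional_encoding(seq_len=50, d_model=128)
print(pe.shape)  # (50, 128)
```

Because the encodings are fixed functions of position, they need no training and can extrapolate to sequence lengths beyond those seen during training.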

Examples

1. Sinusoidal positional encoding in the original Transformer (sketched above)
2. Learned positional embeddings (sketched after this list)
3. Relative position encodings
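To contrast with the fixed sinusoidal approach, here is a minimal sketch of learned positional embeddings, assuming PyTorch; the class name and `max_len` parameter are illustrative:

```python
import torch
import torch.nn as nn

class LearnedPositionalEmbedding(nn.Module):
    """Adds a trainable embedding per position, as in BERT/GPT-style models."""

    def __init__(self, max_len: int, d_model: int):
        super().__init__()
        self.pos_emb = nn.Embedding(max_len, d_model)

    def forward(self, token_emb: torch.Tensor) -> torch.Tensor:
        # token_emb: (batch, seq_len, d_model)
        seq_len = token_emb.size(1)
        positions = torch.arange(seq_len, device=token_emb.device)
        return token_emb + self.pos_emb(positions)  # broadcast over batch

x = torch.randn(2, 10, 64)  # a batch of token embeddings
layer = LearnedPositionalEmbedding(max_len=512, d_model=64)
print(layer(x).shape)       # torch.Size([2, 10, 64])
```

The trade-off: learned embeddings can adapt to the data but are limited to the `max_len` positions seen at training time, whereas sinusoidal encodings generalize to any length.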

Michele Laurelli - AI Research & Engineering