by Michele Laurelli
A technique for injecting position information into transformer inputs; it is needed because self-attention is permutation-invariant, so transformers have no inherent notion of sequence order.
Positional encodings add a unique pattern to each position in a sequence, allowing the model to distinguish token order. Common methods use fixed sine/cosine functions or learned embeddings, added to the token embeddings before the first layer.
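As a concrete illustration, the original Transformer defines, for position pos and dimension index i, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). Below is a minimal NumPy sketch of this scheme; the function name and shapes are illustrative, not from the original text.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of fixed sinusoidal encodings.

    Assumes an even d_model for simplicity.
    """
    positions = np.arange(seq_len)[:, np.newaxis]            # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]           # even dims 2i
    angles = positions / np.power(10000.0, dims / d_model)   # pos / 10000^(2i/d)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # sine on even dimensions
    pe[:, 1::2] = np.cos(angles)  # cosine on odd dimensions
    return pe

# The encoding is added to the token embeddings before the first layer:
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```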
Examples:
- Sinusoidal positional encoding, as used in the original Transformer (sketched above)
- Learned positional embeddings (see the sketch after this list)
- Relative position encodings
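Learned positional embeddings replace the fixed sine/cosine table with a trainable one: each position index looks up its own vector, trained together with the rest of the model. A minimal PyTorch sketch, assuming a fixed maximum sequence length; the class and variable names are illustrative.

```python
import torch
import torch.nn as nn

class LearnedPositionalEmbedding(nn.Module):
    """Trainable per-position vectors, added to token embeddings."""

    def __init__(self, max_len: int, d_model: int):
        super().__init__()
        self.pos_emb = nn.Embedding(max_len, d_model)  # one row per position

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # token_embeddings: (batch, seq_len, d_model), with seq_len <= max_len
        seq_len = token_embeddings.size(1)
        positions = torch.arange(seq_len, device=token_embeddings.device)
        return token_embeddings + self.pos_emb(positions)  # broadcasts over batch
```

Relative position encodings take a different route: rather than tagging each token with an absolute position, they inject the pairwise offset between query and key positions directly into the attention computation.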