Positional Encoding

A technique for injecting token position information into a transformer, whose self-attention mechanism is otherwise permutation-invariant and has no inherent sense of word order. Methods include fixed sinusoidal encodings, learned position embeddings, and rotary position embeddings (RoPE), which are widely used in modern LLMs.
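The fixed sinusoidal variant can be sketched as follows (a minimal NumPy sketch; the function name and argument names are illustrative, not from any particular library):

```python
import numpy as np

def sinusoidal_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position encodings."""
    # Each position p and even dimension 2i gets sin(p / 10000^(2i/d_model));
    # the adjacent odd dimension 2i+1 gets the matching cosine.
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]       # (1, d_model // 2)
    angles = pos / (10000 ** (2 * i / d_model))
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angles)              # even dims: sine
    enc[:, 1::2] = np.cos(angles)              # odd dims: cosine
    return enc

pe = sinusoidal_encoding(seq_len=128, d_model=64)
```

In practice this matrix is simply added to the token embeddings before the first transformer layer, so each token's representation carries both content and position.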

Related terms

Transformer · Attention Mechanism · Token