LLM Reference
AI Glossary

Rotary Position Embedding

RoPE

Definition

RoPE (Rotary Position Embedding) is a positional encoding method that encodes absolute positions by applying position-dependent rotation matrices to the query and key vectors, so that attention scores depend only on the relative positions of tokens. This gives Transformers an efficient relative position representation, improves generalization to longer sequences, and has become standard in modern LLMs such as Llama, GPT-NeoX, and PaLM.
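The rotation can be sketched in a few lines of NumPy. This is a minimal illustration of the pairwise-rotation formulation (function name and the choice of base 10000 follow the original RoPE paper's convention; the code itself is a hypothetical sketch, not a library API):

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply rotary position embedding to x of shape (seq_len, d), d even.

    Each pair of dimensions (2i, 2i+1) is rotated by angle m * theta_i,
    where m is the token position and theta_i = base^(-2i/d).
    """
    seq_len, d = x.shape
    half = d // 2
    theta = base ** (-2.0 * np.arange(half) / d)   # (d/2,) per-pair frequencies
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1) token positions
    angles = pos * theta[None, :]                  # (seq_len, d/2) rotation angles
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                # split into dimension pairs
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin             # 2D rotation of each pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

The key property is that the dot product between a rotated query at position m and a rotated key at position n depends only on the offset m - n, which is what makes the encoding "relative" despite rotating by absolute positions.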