
What is: Rotary Position Embedding?

Source: RoFormer: Enhanced Transformer with Rotary Position Embedding
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

Rotary Position Embedding, or RoPE, is a type of position embedding that encodes absolute positional information with a rotation matrix and naturally incorporates explicit relative position dependency into the self-attention formulation. Notably, RoPE comes with valuable properties: it can be expanded to any sequence length, inter-token dependency decays with increasing relative distance, and it can equip linear self-attention with relative position encoding.
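The core idea above can be sketched in a few lines of NumPy: each pair of feature dimensions is rotated by an angle proportional to the token's position, and the relative-position property falls out of the rotation algebra. This is a minimal illustration, not the RoFormer authors' implementation; the function name `rope` and the frequency base `10000` follow common convention but are assumptions here.

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply Rotary Position Embedding to x of shape (seq_len, dim).

    Each pair of dimensions (2i, 2i+1) at position m is rotated by the
    angle m * base^(-2i/dim), encoding absolute position as a rotation.
    """
    seq_len, dim = x.shape
    half = dim // 2
    # Per-pair rotation frequencies: theta_i = base^(-2i/dim)
    freqs = base ** (-2.0 * np.arange(half) / dim)            # (half,)
    angles = np.arange(seq_len)[:, None] * freqs[None, :]     # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                           # paired dims
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin                        # 2D rotation
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# The relative-position property: the dot product between a rotated query
# at position m and a rotated key at position n depends only on (m - n).
rng = np.random.default_rng(0)
q_vec = rng.normal(size=16)
k_vec = rng.normal(size=16)
rq = rope(np.tile(q_vec, (8, 1)))  # same query vector at every position
rk = rope(np.tile(k_vec, (8, 1)))  # same key vector at every position
# Positions (2, 5) and (4, 7) share the relative offset 3:
print(np.isclose(rq[2] @ rk[5], rq[4] @ rk[7]))
```

Because rotations compose by adding angles, the attention score between positions m and n reduces to a function of n - m, which is exactly the relative dependency described above.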