
What is: Linformer?

Source: Linformer: Self-Attention with Linear Complexity
Year: 2020
Data Source: CC BY-SA - https://paperswithcode.com

Linformer is a linear Transformer that utilises a linear self-attention mechanism to tackle the self-attention bottleneck in Transformer models, reducing the complexity of attention from quadratic to linear in the sequence length. The original scaled dot-product attention is decomposed into multiple smaller attentions through linear projections, such that the combination of these operations forms a low-rank factorization of the original attention matrix.
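Below is a minimal PyTorch sketch of a single-head Linformer-style attention layer, intended only to illustrate the idea: learned projections (here named E_proj and F_proj, which are illustrative names, not the authors' reference implementation) compress the keys and values along the sequence dimension from n to a fixed k, so the attention matrix is n × k rather than n × n.

```python
# Minimal sketch of Linformer-style linear self-attention (single head).
# Assumes PyTorch; shapes and names are illustrative.
import math
import torch
import torch.nn as nn


class LinformerSelfAttention(nn.Module):
    def __init__(self, dim: int, seq_len: int, k: int = 256):
        super().__init__()
        self.dim = dim
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        # Learned projections that compress the sequence length n -> k,
        # giving a low-rank factorization of the n x n attention matrix.
        self.E_proj = nn.Linear(seq_len, k, bias=False)  # projects keys
        self.F_proj = nn.Linear(seq_len, k, bias=False)  # projects values
        self.out = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n, dim)
        q, k_, v = self.to_q(x), self.to_k(x), self.to_v(x)
        # Compress along the sequence dimension: (batch, n, dim) -> (batch, k, dim)
        k_ = self.E_proj(k_.transpose(1, 2)).transpose(1, 2)
        v = self.F_proj(v.transpose(1, 2)).transpose(1, 2)
        # Attention scores are (n x k) instead of (n x n): linear in n.
        attn = torch.softmax(q @ k_.transpose(1, 2) / math.sqrt(self.dim), dim=-1)
        return self.out(attn @ v)


x = torch.randn(2, 1024, 64)                          # (batch, n, dim)
layer = LinformerSelfAttention(dim=64, seq_len=1024, k=128)
print(layer(x).shape)                                 # torch.Size([2, 1024, 64])
```

The output shape matches ordinary self-attention; only the cost of forming the attention weights changes, since the softmax is taken over k compressed positions rather than all n positions.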