
What is: Locality Sensitive Hashing Attention?

Source: Reformer: The Efficient Transformer
Year: 2020
Data Source: CC BY-SA - https://paperswithcode.com

LSH Attention, or Locality Sensitive Hashing Attention, replaces standard dot-product attention with an attention mechanism based on locality-sensitive hashing, reducing the complexity from O(L^2) to O(L log L), where L is the length of the sequence. LSH refers to a family of hash functions (an LSH family) that maps data points into buckets so that points near each other land in the same bucket with high probability, while points far from each other are likely to fall into different buckets. LSH Attention was proposed as part of the Reformer architecture.
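Below is a minimal NumPy sketch of the kind of angular LSH bucketing the Reformer paper describes: each vector is projected through a random rotation and assigned to the bucket given by the argmax over the concatenated [xR; -xR], so nearby vectors tend to share a bucket. The function name, toy dimensions, and bucket count here are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def lsh_hash(x, n_buckets, rng):
    """Assign each row of x (shape [seq_len, d_model]) to one of n_buckets
    using a random rotation, as in angular LSH."""
    assert n_buckets % 2 == 0, "n_buckets must be even"
    d_model = x.shape[-1]
    # Random projection: one direction per half-bucket.
    r = rng.standard_normal((d_model, n_buckets // 2))
    rotated = x @ r                                          # [seq_len, n_buckets // 2]
    rotated = np.concatenate([rotated, -rotated], axis=-1)   # [seq_len, n_buckets]
    return np.argmax(rotated, axis=-1)                       # bucket id per position

rng = np.random.default_rng(0)
queries = rng.standard_normal((1024, 64))   # toy sequence of length 1024, d_model = 64
buckets = lsh_hash(queries, n_buckets=16, rng=rng)

# Sorting positions by bucket id groups likely-similar queries/keys together,
# so attention can be computed within fixed-size chunks instead of over the
# full L x L matrix.
order = np.argsort(buckets, kind="stable")
print(buckets[:10], order[:10])
```

In the full scheme, attention is then restricted to positions within the same (or adjacent) chunks after this sort, which is what yields the O(L log L) behaviour in place of the dense O(L^2) attention matrix.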