
What is: LeViT Attention Block?

Source: LeViT: a Vision Transformer in ConvNet's Clothing for Faster Inference
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

The LeViT Attention Block is the attention module used in the LeViT architecture. Its main feature is that positional information is provided within each attention block: relative position information is injected explicitly into the attention mechanism by adding a learned attention bias to the attention maps.
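The idea of the attention bias can be sketched as follows: each attention head keeps a small table of learned biases indexed by the relative offset between a query token and a key token on the feature-map grid, and these biases are added to the scaled dot-product scores before the softmax. This is a minimal NumPy sketch, not the paper's exact implementation; the function name, the shapes, and the offset-indexing scheme are illustrative assumptions.

```python
import numpy as np

def attention_with_position_bias(q, k, bias_table, positions):
    """Attention maps with a LeViT-style per-head relative-position bias.

    q, k:        (heads, N, d) query/key tensors for N tokens
    bias_table:  (heads, num_offsets) learned biases, one per distinct offset
    positions:   (N, 2) integer (x, y) grid coordinates of the tokens
    All names/shapes are illustrative, not the paper's exact API.
    """
    heads, N, d = q.shape
    # Standard scaled dot-product scores: (heads, N, N)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)

    # Index each (query, key) pair by its absolute offset (|dx|, |dy|);
    # LeViT uses symmetric offsets so mirrored pairs share a bias.
    dx = np.abs(positions[:, None, 0] - positions[None, :, 0])
    dy = np.abs(positions[:, None, 1] - positions[None, :, 1])
    idx = dx * (positions[:, 1].max() + 1) + dy  # unique index per offset

    # Inject relative position information into the attention map.
    scores = scores + bias_table[:, idx]

    # Softmax over the key dimension.
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```

Because the bias is added before the softmax, it reweights which spatial offsets each head attends to, giving the block positional awareness without separate positional embeddings at the input.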