
What is: Single-Headed Attention?

Source: Single Headed Attention RNN: Stop Thinking With Your Head
Year: 2019
Data Source: CC BY-SA - https://paperswithcode.com

Single-Headed Attention is the attention module used in the SHA-RNN language model; as the name suggests, it uses a single attention head rather than the many parallel heads of a standard Transformer. The principal design reasons for single-headedness were simplicity (avoiding running out of memory) and scepticism about the benefits of using multiple heads.
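
Below is a minimal sketch of what a single-head attention module can look like in PyTorch. This is a generic illustration of standard scaled dot-product attention with one head, not Merity's exact SHA-RNN code (which simplifies the module further to save memory); the class name `SingleHeadAttention` and its layout are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SingleHeadAttention(nn.Module):
    """A minimal single-head scaled dot-product attention sketch.

    Hypothetical illustration: the actual SHA-RNN attention differs
    in its details, but the core idea is the same -- one head, so no
    splitting of the hidden dimension across parallel heads.
    """

    def __init__(self, dim):
        super().__init__()
        self.scale = dim ** -0.5  # 1/sqrt(d) scaling for dot products
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)

    def forward(self, x, attn_mask=None):
        # x: (batch, seq_len, dim). With a single head there is no
        # reshape into (batch, heads, seq_len, head_dim).
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        if attn_mask is not None:
            # attn_mask: boolean tensor, True where attention is disallowed
            scores = scores.masked_fill(attn_mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        return torch.matmul(weights, v)
```

Because there is only one head, the module avoids the head-splitting bookkeeping and the memory cost of materialising several attention maps per layer, which is part of the simplicity argument made for SHA-RNN.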