
What is: ELECTRA?

Source: ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
Year: 2020
Data Source: CC BY-SA - https://paperswithcode.com

ELECTRA is a Transformer pre-training approach that trains two Transformer models: a generator and a discriminator. The generator, trained as a masked language model, replaces tokens in the input sequence; the discriminator (the ELECTRA contribution) then attempts to identify which tokens in the sequence were replaced by the generator. This pre-training task is called replaced token detection, and it serves as a replacement for masked language modeling over the input.
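As a rough illustration, the sketch below implements a toy version of this joint generator/discriminator training step in PyTorch. The vocabulary size, hidden width, mask probability, and the `TinyEncoder` stand-in for a real Transformer encoder are all illustrative assumptions rather than the paper's configuration; the weighting of 50 on the discriminator loss follows the paper.

```python
# A minimal sketch of ELECTRA-style replaced token detection, assuming a toy
# vocabulary and tiny Transformer encoders (not the paper's architecture).
import torch
import torch.nn as nn

VOCAB_SIZE, HIDDEN, MASK_ID = 1000, 64, 0  # illustrative values

class TinyEncoder(nn.Module):
    """Toy stand-in for a Transformer encoder stack."""
    def __init__(self, layers=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, HIDDEN)
        block = nn.TransformerEncoderLayer(d_model=HIDDEN, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(block, num_layers=layers)

    def forward(self, ids):
        return self.encoder(self.embed(ids))

generator = TinyEncoder()                 # trained as a masked language model
gen_head = nn.Linear(HIDDEN, VOCAB_SIZE)  # predicts the original token ids
discriminator = TinyEncoder()             # the ELECTRA model
disc_head = nn.Linear(HIDDEN, 1)          # original-vs-replaced, per token

def electra_step(ids, mask_prob=0.15):
    # 1. Mask a random subset of the input tokens.
    masked = torch.rand(ids.shape) < mask_prob
    corrupted = ids.masked_fill(masked, MASK_ID)

    # 2. The generator fills in the masked positions (MLM loss on those only).
    gen_logits = gen_head(generator(corrupted))
    mlm_loss = nn.functional.cross_entropy(gen_logits[masked], ids[masked])

    # 3. Sample replacement tokens from the generator; detach so the
    #    discriminator loss does not backprop into the generator.
    samples = torch.distributions.Categorical(logits=gen_logits).sample().detach()
    replaced_input = torch.where(masked, samples, ids)

    # 4. The discriminator labels each token as original (0) or replaced (1).
    #    A sample that happens to match the original token counts as original.
    labels = (replaced_input != ids).float()
    disc_logits = disc_head(discriminator(replaced_input)).squeeze(-1)
    rtd_loss = nn.functional.binary_cross_entropy_with_logits(disc_logits, labels)

    # Combined objective; the paper weights the discriminator term by 50.
    return mlm_loss + 50.0 * rtd_loss

ids = torch.randint(1, VOCAB_SIZE, (8, 128))  # a batch of toy token ids
loss = electra_step(ids)
loss.backward()
```

After pre-training, only the discriminator is kept and fine-tuned on downstream tasks; the generator is discarded.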