What is: Gated Recurrent Unit?

Source: Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Year: 2014
Data Source: CC BY-SA - https://paperswithcode.com

A Gated Recurrent Unit, or GRU, is a gating mechanism for recurrent neural networks. It is similar to an LSTM, but uses only two gates - a reset gate and an update gate - and lacks both an output gate and a separate cell state. With fewer parameters, GRUs are generally easier and faster to train than their LSTM counterparts.
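The two gates can be sketched in a minimal NumPy step function. This is an illustrative implementation under the common convention where the update gate `z` interpolates between the previous hidden state and a candidate state; the parameter names (`Wz`, `Uz`, etc.) and the toy dimensions are assumptions, not taken from the source paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step: two gates (reset, update), no output gate."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)              # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)  # candidate state
    return (1 - z) * h_prev + z * h_tilde               # interpolated new state

# Toy usage with random weights (hypothetical sizes: 4 inputs, 3 hidden units)
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
params = tuple(
    m
    for _ in range(3)  # one (W, U, b) triple per gate/candidate
    for m in (rng.standard_normal((n_hid, n_in)),
              rng.standard_normal((n_hid, n_hid)),
              np.zeros(n_hid))
)
h = np.zeros(n_hid)
for t in range(5):
    h = gru_step(rng.standard_normal(n_in), h, params)
```

Because the new state is a convex combination of the previous state and a `tanh` candidate, the hidden activations stay bounded in (-1, 1), which is one reason gated units train more stably than a plain RNN.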