
What is: Temporal Activation Regularization?

Source: Revisiting Activation Regularization for Language RNNs
Year: 2017
Data Source: CC BY-SA - https://paperswithcode.com

Temporal Activation Regularization (TAR) is a type of slowness regularization for RNNs that penalizes large differences between hidden states at adjacent timesteps, encouraging the hidden representation to change slowly over time. Formally, we minimize:

$$\beta \, L_2\left(h_t - h_{t+1}\right)$$

where $L_2$ is the $L_2$ norm, $h_t$ is the output of the RNN at timestep $t$, and $\beta$ is a scaling coefficient.
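As a concrete illustration, here is a minimal sketch of how the TAR term could be added to a training loss in PyTorch. The toy RNN setup, shapes, and variable names are assumptions for this example, not from the source; note that some implementations use the mean squared difference rather than the $L_2$ norm as written above.

```python
import torch
import torch.nn as nn

# Toy setup: an RNN producing outputs for a batch of sequences.
# outputs has shape (batch, seq_len, hidden_size).
rnn = nn.RNN(input_size=10, hidden_size=32, batch_first=True)
x = torch.randn(8, 20, 10)  # (batch, seq_len, input_size)
outputs, _ = rnn(x)

beta = 2.0  # scaling coefficient for TAR (a hyperparameter)

# Temporal Activation Regularization: penalize beta * L2(h_t - h_{t+1}),
# the norm of the difference between outputs at adjacent timesteps.
diff = outputs[:, 1:] - outputs[:, :-1]        # h_{t+1} - h_t for all t
tar_loss = beta * diff.norm(2, dim=-1).mean()  # averaged over batch and time

# tar_loss would be added to the task loss (e.g. cross-entropy)
# before calling backward().
```

Because the penalty only discourages change between consecutive states, it acts as a smoothness prior on the RNN's trajectory without forcing the activations themselves toward zero.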