
What is: Activation Normalization?

Source: Glow: Generative Flow with Invertible 1x1 Convolutions
Year: 2018
Data Source: CC BY-SA - https://paperswithcode.com

Activation Normalization is a type of normalization used in flow-based generative models; it was introduced in the Glow architecture. An ActNorm layer performs an affine transformation of the activations using a scale and bias parameter per channel, similar to batch normalization. These parameters are initialized such that the post-actnorm activations have zero mean and unit variance per channel, computed on an initial minibatch of data. This is a form of data-dependent initialization. After initialization, the scale and bias are treated as regular trainable parameters that are independent of the data.
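
To make the mechanism concrete, here is a minimal PyTorch sketch of such a layer. It assumes NCHW image tensors; the class name `ActNorm`, the `initialized` buffer, and the small epsilon are illustrative choices, not details prescribed by the paper.

```python
import torch
import torch.nn as nn

class ActNorm(nn.Module):
    """Per-channel affine transform with data-dependent initialization."""

    def __init__(self, num_channels):
        super().__init__()
        # One scale (stored as a log-scale for stability) and one bias
        # per channel, broadcast over batch and spatial dimensions.
        self.log_scale = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        # Flag marking whether the data-dependent init has run.
        self.register_buffer("initialized", torch.tensor(False))

    def forward(self, x):
        # x: (batch, channels, height, width)
        if not self.initialized:
            with torch.no_grad():
                # Per-channel statistics over batch and spatial dims.
                mean = x.mean(dim=(0, 2, 3), keepdim=True)
                std = x.std(dim=(0, 2, 3), keepdim=True)
                # Choose scale = 1/std and bias = -mean/std so that
                # scale * x + bias has zero mean and unit variance
                # per channel on this first minibatch.
                self.log_scale.copy_(-torch.log(std + 1e-6))
                self.bias.copy_(-mean / (std + 1e-6))
            self.initialized.fill_(True)
        # After initialization, log_scale and bias are ordinary
        # trainable parameters, independent of the data.
        return x * self.log_scale.exp() + self.bias
```

In a normalizing flow, this layer also contributes a log-determinant of `height * width * sum(log_scale)` to the change-of-variables objective, since the Jacobian of the per-channel affine transform is diagonal.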