
What is: Virtual Batch Normalization?

Source: Improved Techniques for Training GANs
Year: 2016
Data Source: CC BY-SA - https://paperswithcode.com

Virtual Batch Normalization is a normalization method used for training generative adversarial networks that extends batch normalization. Regular batch normalization causes the output of a neural network for an input example $\mathbf{x}$ to be highly dependent on several other inputs $\mathbf{x}'$ in the same minibatch. To avoid this problem, virtual batch normalization (VBN) normalizes each example $\mathbf{x}$ based on the statistics collected on a reference batch of examples that is chosen once and fixed at the start of training, and on $\mathbf{x}$ itself. The reference batch is normalized using only its own statistics. VBN is computationally expensive because it requires running forward propagation on two minibatches of data, so the authors use it only in the generator network.
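The mechanics can be made concrete with a short sketch. The snippet below is a minimal NumPy illustration, not the authors' implementation: it assumes flat feature vectors of shape (batch, features), a learnable scale/shift pair `gamma` and `beta`, and one common convention for "based on the reference batch and on $\mathbf{x}$ itself", namely blending each example's own contribution into the reference statistics with weight $1/(N_{\text{ref}} + 1)$, where $N_{\text{ref}}$ is the reference batch size.

```python
import numpy as np

class VirtualBatchNorm:
    """Minimal sketch of virtual batch normalization for 2-D inputs."""

    def __init__(self, reference_batch, eps=1e-5):
        # Statistics of the fixed reference batch, computed once at the start.
        self.ref_batch = reference_batch
        self.ref_size = reference_batch.shape[0]
        self.ref_mean = reference_batch.mean(axis=0)
        self.ref_mean_sq = (reference_batch ** 2).mean(axis=0)
        self.eps = eps
        # Learnable affine parameters (kept as plain arrays in this sketch).
        self.gamma = np.ones(reference_batch.shape[1])
        self.beta = np.zeros(reference_batch.shape[1])

    def __call__(self, x):
        # Blend each example into the reference statistics: the example
        # contributes with weight 1 / (ref_size + 1) (an assumed convention),
        # the fixed reference batch with the remaining weight.
        coeff = 1.0 / (self.ref_size + 1.0)
        mean = coeff * x + (1.0 - coeff) * self.ref_mean
        mean_sq = coeff * x ** 2 + (1.0 - coeff) * self.ref_mean_sq
        var = mean_sq - mean ** 2
        return self.gamma * (x - mean) / np.sqrt(var + self.eps) + self.beta

    def normalize_reference(self):
        # The reference batch itself is normalized with its own statistics only.
        var = self.ref_mean_sq - self.ref_mean ** 2
        x_hat = (self.ref_batch - self.ref_mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta


rng = np.random.default_rng(0)
ref = rng.normal(size=(64, 8))   # reference batch, fixed at the start of training
vbn = VirtualBatchNorm(ref)
x = rng.normal(size=(16, 8))     # a new minibatch
y = vbn(x)                       # each row is normalized independently of the others
```

Because the reference statistics are computed once up front, the per-step overhead in a real network comes from the second forward pass needed to propagate the reference batch through every layer, which is why the original paper restricts VBN to the generator.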