
What is: LOGAN?

Source: LOGAN: Latent Optimisation for Generative Adversarial Networks
Year: 2019
Data Source: CC BY-SA - https://paperswithcode.com

LOGAN is a generative adversarial network that uses latent optimisation based on natural gradient descent (NGD). For the Fisher matrix in NGD, the authors use the empirical Fisher F' with Tikhonov damping:

F' = g \cdot g^{T} + \beta I

where g is the gradient of the discriminator score with respect to the latent z, and \beta is the damping coefficient.
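Because F' is a rank-one matrix plus a scaled identity, the natural-gradient direction F'^{-1} g can be obtained in closed form via the Sherman-Morrison identity instead of an explicit matrix inverse. The NumPy sketch below illustrates this; the damping value and latent size are illustrative assumptions, and the paper's implementation may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.standard_normal(256)          # stand-in for the gradient of D(G(z)) w.r.t. z
beta = 5.0                            # Tikhonov damping (illustrative value)

# Explicit damped empirical Fisher: F' = g g^T + beta * I
F = np.outer(g, g) + beta * np.eye(g.size)

# Natural-gradient direction by a direct solve ...
delta_explicit = np.linalg.solve(F, g)

# ... and via Sherman-Morrison: (beta*I + g g^T)^{-1} g = g / (beta + ||g||^2)
delta_closed_form = g / (beta + g @ g)

assert np.allclose(delta_explicit, delta_closed_form)
```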

They also apply Euclidean norm regularisation to the latent update during the optimisation step.
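Putting the two pieces together, one latent-optimisation step could look like the sketch below. The score_grad_fn helper, the number of inner steps, and all coefficient values are assumptions for illustration rather than the paper's settings; the Euclidean-norm regulariser is realised here as an L2 penalty on the accumulated update.

```python
import numpy as np

def optimise_latent(z, score_grad_fn, steps=1, alpha=0.9, beta=5.0, reg_weight=0.1):
    """Refine a latent sample z by damped natural-gradient latent optimisation.

    score_grad_fn(z) stands in for autograd: it should return the gradient of
    the discriminator score D(G(z)) with respect to z.
    """
    delta_z = np.zeros_like(z)
    for _ in range(steps):
        # Gradient of the regularised objective
        #   D(G(z + delta_z)) - (reg_weight / 2) * ||delta_z||^2,
        # where the second term is the Euclidean-norm regulariser on the update.
        g = score_grad_fn(z + delta_z) - reg_weight * delta_z
        # Damped natural-gradient scaling from F' = g g^T + beta * I.
        delta_z = delta_z + alpha * g / (beta + g @ g)
    return z + delta_z

# Toy usage with a quadratic stand-in for the discriminator score.
rng = np.random.default_rng(0)
z0 = rng.uniform(-1.0, 1.0, size=256)
z_star = optimise_latent(z0, score_grad_fn=lambda z: -z)   # score = -||z||^2 / 2
```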

For LOGAN's base architecture, BigGAN-deep is used with a few modifications:

1. The size of the latent source is increased from 128 to 256, to compensate for the randomness of the source lost when optimising z.
2. The uniform distribution U(-1, 1) is used instead of the standard normal distribution N(0, 1) for p(z), to be consistent with the clipping operation.
3. Leaky ReLU (with a slope of 0.2 for the negative part) is used instead of ReLU as the non-linearity, for smoother gradient flow of \frac{\partial f(z)}{\partial z}.
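The distribution, clipping, and activation choices are small enough to sketch directly. The snippet below is a minimal illustration of these modifications in plain NumPy; the placeholder update and all constants other than the latent size, the [-1, 1] support, and the 0.2 slope are assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim = 256                              # enlarged latent size

# p(z) = U(-1, 1) rather than N(0, 1), so clipping keeps z' inside the support.
z = rng.uniform(-1.0, 1.0, size=latent_dim)

def leaky_relu(x, negative_slope=0.2):
    """Leaky ReLU with slope 0.2 on the negative part, replacing ReLU."""
    return np.where(x >= 0.0, x, negative_slope * x)

# After latent optimisation, the refined z' is clipped back into [-1, 1].
delta_z = 0.05 * rng.standard_normal(latent_dim)   # placeholder for the real update
z_prime = np.clip(z + delta_z, -1.0, 1.0)
```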