
What is: Powerpropagation?

Source: Powerpropagation: A sparsity inducing weight reparameterisation
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

Powerpropagation is a weight reparameterisation for neural networks that leads to inherently sparse models. By exploiting the behaviour of gradient descent, it produces weight updates with a "rich get richer" dynamic, leaving low-magnitude parameters largely unaffected by learning. In other words, parameters with larger magnitudes adapt faster to represent the features needed to solve the task, while smaller-magnitude parameters are restricted and are therefore more likely to be irrelevant to the learned solution. Models trained in this manner reach similar performance to standard training, but their weight distribution has markedly higher density at zero, allowing more parameters to be pruned safely.
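
Concretely, the paper reparameterises each weight as w = θ · |θ|^(α−1) with α ≥ 1 (α = 1 recovers standard training). By the chain rule, the gradient with respect to θ picks up a factor of α|θ|^(α−1), which is what suppresses updates to small-magnitude parameters. Below is a minimal PyTorch sketch of such a layer; the class name `PowerpropLinear`, the default α = 2, and the initialisation scheme are illustrative assumptions, not the authors' reference implementation.

```python
# A minimal sketch of a Powerpropagation layer in PyTorch. The layer name,
# the default alpha, and the initialisation are illustrative choices.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PowerpropLinear(nn.Module):
    """Linear layer whose effective weight is w = theta * |theta|^(alpha - 1)."""

    def __init__(self, in_features: int, out_features: int, alpha: float = 2.0):
        super().__init__()
        assert alpha >= 1.0, "alpha = 1 recovers a standard linear layer"
        self.alpha = alpha
        # Initialise the latent parameter so the *effective* weights start
        # from a standard init: sample w0, then take its alpha-th root while
        # preserving sign (an assumed but natural initialisation choice).
        w0 = torch.empty(out_features, in_features)
        nn.init.kaiming_uniform_(w0, a=5 ** 0.5)
        self.theta = nn.Parameter(w0.sign() * w0.abs().pow(1.0 / alpha))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sign-preserving power reparameterisation. The chain rule scales the
        # gradient w.r.t. theta by alpha * |theta|^(alpha - 1), so low-magnitude
        # weights receive small updates ("rich get richer").
        weight = self.theta * self.theta.abs().pow(self.alpha - 1)
        return F.linear(x, weight, self.bias)
```

After training, pruning would proceed by thresholding the magnitude of the effective weights w (equivalently |θ|^α); since the learned distribution concentrates mass near zero, more weights can be removed at a given performance budget than under standard training.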