
What is: Orthogonal Regularization?

Source: Neural Photo Editing with Introspective Adversarial Networks
Year: 2016
Data Source: CC BY-SA - https://paperswithcode.com

Orthogonal Regularization is a regularization technique for convolutional neural networks, introduced with generative modelling as the task in mind. Orthogonality is argued to be a desirable quality in ConvNet filters, partially because multiplication by an orthogonal matrix leaves the norm of the vector or matrix it acts on unchanged. This property is valuable in deep or recurrent networks, where repeated matrix multiplication can cause signals to vanish or explode. To try to maintain orthogonality throughout training, Orthogonal Regularization encourages weights to be orthogonal by pushing them towards the nearest orthogonal manifold. The objective function is augmented with the cost:

$$\mathcal{L}_{ortho} = \sum \left( \left| W W^{T} - I \right| \right)$$

where $\sum$ indicates a sum across all filter banks, $W$ is a filter bank, and $I$ is the identity matrix.
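
As a concrete illustration, below is a minimal PyTorch sketch of this penalty. The function name `orthogonal_regularization` and the weighting factor `beta` are illustrative assumptions, not part of the original paper's code; each convolutional filter bank is flattened to a matrix before forming $W W^{T}$.

```python
import torch
import torch.nn as nn

def orthogonal_regularization(model: nn.Module, beta: float = 1e-4) -> torch.Tensor:
    """Computes sum(|W W^T - I|) over all conv filter banks in `model`.

    `beta` is an assumed weighting factor for the augmented objective.
    """
    penalty = 0.0
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            # Flatten the filter bank to shape (num_filters, fan_in).
            w = module.weight.view(module.weight.size(0), -1)
            gram = w @ w.t()                                   # W W^T
            identity = torch.eye(gram.size(0), device=w.device)
            penalty = penalty + (gram - identity).abs().sum()  # |W W^T - I|
    return beta * penalty
```

During training, the term would simply be added to the task loss, e.g. `loss = task_loss + orthogonal_regularization(model)`, so that gradient descent nudges each filter bank towards orthogonality alongside the main objective.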