
What is: Maxout?

Source: Maxout Networks
Year: 2013
Data Source: CC BY-SA - https://paperswithcode.com

The Maxout Unit is a generalization of the ReLU and the leaky ReLU functions. It is a piecewise linear function that returns the maximum of several affine transformations of the input, and it was designed to be used in conjunction with dropout. Both ReLU and leaky ReLU are special cases of Maxout.

$$f\left(x\right) = \max\left(w^{T}_{1}x + b_{1},\; w^{T}_{2}x + b_{2}\right)$$
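As a concrete illustration, the unit above can be sketched in a few lines of NumPy. This is a minimal sketch, not the paper's implementation: the function name `maxout` and the tensor shapes (`W` stacking `k` affine maps) are choices made here for clarity.

```python
import numpy as np

def maxout(x, W, b):
    """Maxout unit: elementwise max over k affine transforms of x.

    x: input vector, shape (d_in,)
    W: stacked weight matrices, shape (k, d_in, d_out)
    b: stacked biases, shape (k, d_out)
    Returns a vector of shape (d_out,).
    """
    # z[j] = W[j]^T-style affine map of x, one per piece j
    z = np.einsum("i,kio->ko", x, W) + b  # shape (k, d_out)
    # Take the max across the k pieces for each output unit
    return z.max(axis=0)
```

With `k = 2` and the second affine map fixed to zero (`W[1] = 0`, `b[1] = 0`), the unit reduces to `max(W[0].T @ x + b[0], 0)`, i.e. ReLU, which is how ReLU arises as a special case.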

The main drawback of Maxout is that it is computationally expensive: with two linear pieces per unit, it doubles the number of parameters for each neuron.