
What is: Scaled Exponential Linear Unit?

Source: Self-Normalizing Neural Networks
Year: 2017
Data Source: CC BY-SA - https://paperswithcode.com

Scaled Exponential Linear Units, or SELUs, are activation functions that induce self-normalizing properties: activations passed through a network of SELU units converge toward zero mean and unit variance, which removes the need for explicit normalization layers such as batch normalization.

The SELU activation function is given by

$$f(x) = \begin{cases} \lambda x & \text{if } x \geq 0 \\ \lambda\alpha\left(\exp(x) - 1\right) & \text{if } x < 0 \end{cases}$$

with $\alpha \approx 1.6733$ and $\lambda \approx 1.0507$.
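
As a concrete illustration, here is a minimal NumPy sketch of the formula above. The function name `selu` and the high-precision constant values are our choices; the constants are the commonly cited values derived in the paper.

```python
import numpy as np

# Fixed constants from the SELU paper (Klambauer et al., 2017),
# chosen so that activations converge toward zero mean and unit variance.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    """SELU: lambda * x for x >= 0, lambda * alpha * (exp(x) - 1) for x < 0."""
    return LAMBDA * np.where(x >= 0, x, ALPHA * (np.exp(x) - 1))
```

In practice one would typically call a framework implementation such as `torch.nn.SELU` or `tf.keras.activations.selu`, which use the same constants.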