
What is: Parametric Exponential Linear Unit?

Source: Parametric Exponential Linear Unit for Deep Convolutional Neural Networks
Year: 2016
Data Source: CC BY-SA - https://paperswithcode.com

The Parametric Exponential Linear Unit, or PELU, is an activation function for neural networks. It learns a parameterization of the ELU so that each layer in a CNN can learn the proper activation shape.

The PELU has two additional parameters over the ELU:

f(x) = cx, if x > 0
f(x) = α(exp(x/b) − 1), if x ≤ 0

where α, b, and c > 0. Here c changes the slope in the positive quadrant, b controls the scale of the exponential decay, and α controls the saturation in the negative quadrant.

Source: Activation Functions