
What is: SERLU?

Source: Effectiveness of Scaled Exponentially-Regularized Linear Units (SERLUs)
Year: 2018
Data Source: CC BY-SA - https://paperswithcode.com

SERLU, or Scaled Exponentially-Regularized Linear Unit, is a type of activation function. It introduces a bump-shaped response in the region of negative input: the response is approximately zero for large negative inputs, while the dip below zero pushes the output of SERLU towards zero mean statistically.

$$
\text{SERLU}(x) = \begin{cases}
\lambda_{serlu}\, x & \text{if } x \geq 0 \\
\lambda_{serlu}\, \alpha_{serlu}\, x e^{x} & \text{if } x < 0
\end{cases}
$$

where the two parameters $\lambda_{serlu} > 0$ and $\alpha_{serlu} > 0$ remain to be specified.
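
As a concrete illustration, here is a minimal NumPy sketch of the two-branch definition above. The default values of `lambda_serlu` and `alpha_serlu` are arbitrary placeholders, since the text leaves the two parameters unspecified; the paper derives specific positive values for them.

```python
import numpy as np

def serlu(x, lambda_serlu=1.0, alpha_serlu=1.0):
    """Scaled Exponentially-Regularized Linear Unit (sketch).

    lambda_serlu and alpha_serlu are placeholder defaults, not the
    values derived in the paper.
    """
    x = np.asarray(x, dtype=float)
    # Clamp the exponent at 0 so the (discarded) negative branch
    # does not overflow for large positive x inside np.where.
    neg_branch = lambda_serlu * alpha_serlu * x * np.exp(np.minimum(x, 0.0))
    pos_branch = lambda_serlu * x
    return np.where(x >= 0, pos_branch, neg_branch)

print(serlu(np.array([-5.0, -1.0, 0.0, 2.0])))
```

The bump shape follows from the negative branch itself: $x e^{x}$ decays to zero as $x \to -\infty$ and attains its minimum at $x = -1$ (where its derivative $(1 + x)e^{x}$ vanishes), so the response dips below zero near $x = -1$ and flattens out for large negative inputs.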