
What is: Collapsing Linear Unit?

Source: Deeper Learning with CoLU Activation
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

CoLU is an activation function similar to Swish and Mish in its properties. It is defined as: f(x) = \frac{x}{1 - xe^{-(x+e^x)}} It is smooth, continuously differentiable, unbounded above, bounded below, non-saturating, and non-monotonic. In experiments comparing CoLU against other activation functions, CoLU usually performs better than the alternatives on deeper neural networks.
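
For reference, here is a minimal NumPy sketch of this definition. The function name colu and the exponent capping are implementation choices made for this illustration, not details from the paper; the caps only guard against floating-point overflow and do not change the visible outputs.

```python
import numpy as np

def colu(x):
    """Collapsing Linear Unit: f(x) = x / (1 - x * exp(-(x + exp(x)))).

    Hypothetical helper; the caps below are numerical-stability choices
    so np.exp() never overflows, not part of the CoLU definition.
    """
    x = np.asarray(x, dtype=np.float64)
    # For x >> 0, exp(x) would overflow; capping it still makes the
    # exponent hugely negative, so exp(exponent) -> 0 and colu(x) -> x,
    # matching the true limit.
    inner = np.exp(np.minimum(x, 50.0))
    # For x << 0, -(x + exp(x)) ~ -x grows without bound; cap it so
    # exp(exponent) stays finite. The true output there is already ~0.
    exponent = np.minimum(-(x + inner), 500.0)
    return x / (1.0 - x * np.exp(exponent))

# Quick sanity check: CoLU is ~0 for very negative inputs, dips below
# zero in between (non-monotonic), and approaches the identity for
# large positive inputs.
print(colu(np.array([-5.0, -1.0, 0.0, 1.0, 5.0])))
```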