
What is: Rectified Linear Units?

Year: 2000
Data Source: CC BY-SA - https://paperswithcode.com

Rectified Linear Units, or ReLUs, are a type of activation function that is linear in the positive dimension and zero in the negative dimension. The kink in the function at zero is the source of the non-linearity. Linearity in the positive dimension has the attractive property that it prevents gradient saturation (contrast with sigmoid activations, whose gradients vanish for large inputs), although on half of the real line the gradient is zero.

f\left(x\right) = \max\left(0, x\right)
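As a minimal sketch, the function and its gradient can be written in a few lines of NumPy (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def relu(x):
    # Linear for positive inputs, zero for negative inputs.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 where x > 0 and 0 elsewhere; the kink at 0
    # is what makes the function non-linear.
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

The constant gradient of 1 on the positive side is what avoids the saturation seen with sigmoid activations, while the zero gradient on the negative side is the cost noted above.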