
What is: CReLU?

Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
Year: 2016
Data Source: CC BY-SA - https://paperswithcode.com

CReLU, or Concatenated Rectified Linear Units, is an activation function that preserves both positive and negative phase information while enforcing non-saturated non-linearity. Given a layer output $h$, it is computed by applying ReLU to both $h$ and its negation and concatenating the results:

$$\text{CReLU}(h) = \left[\text{ReLU}(h),\ \text{ReLU}(-h)\right]$$
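Below is a minimal sketch of CReLU as a PyTorch module. PyTorch and the channel-dimension default here are illustrative choices, not part of the original paper; the key idea is simply concatenating ReLU of the input with ReLU of its negation, which doubles the channel count.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CReLU(nn.Module):
    """Concatenated ReLU: [ReLU(h), ReLU(-h)] along the channel axis.

    Keeps both the positive and negative phase of the pre-activation,
    at the cost of doubling the number of output channels.
    """

    def __init__(self, dim: int = 1):
        super().__init__()
        self.dim = dim  # channel dimension for NCHW feature maps

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Concatenate the positive and negative phases.
        return torch.cat((F.relu(h), F.relu(-h)), dim=self.dim)

# Usage: a (batch, 16, 8, 8) feature map becomes (batch, 32, 8, 8).
x = torch.randn(2, 16, 8, 8)
print(CReLU()(x).shape)  # torch.Size([2, 32, 8, 8])
```

Because the output has twice as many channels as the input, any layer that follows CReLU must be sized for the doubled channel count.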