
# What is: Tanh Activation?

Year: 2000. Data Source: CC BY-SA, https://paperswithcode.com

Tanh Activation is an activation function used in neural networks, defined as:

$f\left(x\right) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$
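As a sketch, the formula above can be computed directly (using NumPy here, an assumption; any array library works). In practice `np.tanh` is preferred, since naive exponentials can overflow for large inputs, but the direct form matches the definition:

```python
import numpy as np

def tanh(x):
    # f(x) = (e^x - e^{-x}) / (e^x + e^{-x})
    ex, enx = np.exp(x), np.exp(-x)
    return (ex - enx) / (ex + enx)

x = np.array([-2.0, 0.0, 2.0])
print(tanh(x))        # agrees with np.tanh(x); outputs lie in (-1, 1)
```

Note that tanh is zero-centered with outputs in (-1, 1), unlike the sigmoid, whose outputs lie in (0, 1).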

Historically, the tanh function became preferred over the sigmoid function because it gave better empirical performance in multi-layer neural networks. However, like the sigmoid, tanh saturates for large inputs, so it did not solve the vanishing gradient problem, which was later tackled more effectively by ReLU activations.
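The vanishing-gradient behaviour mentioned above follows from the derivative, $\frac{d}{dx}\tanh(x) = 1 - \tanh^2(x)$, which peaks at 1 for $x = 0$ and decays toward zero as $|x|$ grows. A minimal NumPy sketch illustrating the saturation:

```python
import numpy as np

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    t = np.tanh(x)
    return 1.0 - t * t

print(tanh_grad(0.0))   # 1.0: maximal gradient at the origin
print(tanh_grad(5.0))   # near zero: the unit is saturated
```

Backpropagating through many saturated tanh layers multiplies these small factors together, which is why deep tanh networks were hard to train before ReLU.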

Image Source: Junxi Feng