Mish activation function.
```python
tf.keras.activations.mish(
    x
)
```
It is defined as:

```python
def mish(x):
    return x * tanh(softplus(x))
```
where `softplus` is defined as:

```python
def softplus(x):
    return log(exp(x) + 1)
```
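Because mish composes only elementwise primitives, the two definitions above can be sketched directly in NumPy (a minimal illustration for reference, not the TensorFlow implementation; `np.log1p(np.exp(x))` is a naive softplus that can overflow for large positive inputs):

```python
import numpy as np

def mish_np(x):
    # mish(x) = x * tanh(softplus(x)), softplus(x) = log(1 + exp(x))
    # Naive softplus via log1p(exp(x)); fine for the small values used here.
    return x * np.tanh(np.log1p(np.exp(x)))

a = np.array([-3.0, -1.0, 0.0, 1.0], dtype=np.float32)
print(mish_np(a))  # agrees with tf.keras.activations.mish on these inputs
```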
Example:

```python
>>> a = tf.constant([-3.0, -1.0, 0.0, 1.0], dtype=tf.float32)
>>> b = tf.keras.activations.mish(a)
>>> b.numpy()
array([-0.14564745, -0.30340144,  0.        ,  0.86509836], dtype=float32)
```
| Args | |
| --- | --- |
| `x` | Input tensor. |
| Returns |
| --- |
| The mish activation. |
| Reference |
| --- |
| [Misra, 2019](https://arxiv.org/abs/1908.08681) |