Additional activation functions.
Functions
gelu(...)
: Gaussian Error Linear Unit.
hardshrink(...)
: Hard shrink function.
lisht(...)
: LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.
mish(...)
: Mish: A Self Regularized Non-Monotonic Neural Activation Function.
rrelu(...)
: Randomized leaky rectified linear unit function.
snake(...)
: Snake activation to learn periodic functions.
softshrink(...)
: Soft shrink function.
sparsemax(...)
: Sparsemax activation function.
tanhshrink(...)
: Tanh shrink function.
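
These functions are called element-wise on a tensor and can also be passed directly as the `activation` argument of a Keras layer. The snippet below is a minimal usage sketch, assuming this listing corresponds to the `tfa.activations` module of the `tensorflow_addons` package; exact keyword defaults may vary by version.

```python
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, 0.0, 1.0])

# Element-wise activations return a tensor of the same shape as the input.
print(tfa.activations.mish(x))
print(tfa.activations.lisht(x))

# An activation can also be passed by reference to a Keras layer.
layer = tf.keras.layers.Dense(8, activation=tfa.activations.gelu)
```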
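For reference, several of these activations have simple closed-form definitions. The sketch below restates the published formulas with plain TensorFlow ops; it is illustrative only and not the library's implementation (the default `lower`/`upper` thresholds and `frequency` values are assumptions).

```python
import tensorflow as tf

def lisht(x):
    # LiSHT: x * tanh(x)
    return x * tf.math.tanh(x)

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * tf.math.tanh(tf.math.softplus(x))

def tanhshrink(x):
    # Tanh shrink: x - tanh(x)
    return x - tf.math.tanh(x)

def snake(x, frequency=1.0):
    # Snake: x + (1 / frequency) * sin^2(frequency * x)
    return x + tf.math.square(tf.math.sin(frequency * x)) / frequency

def hardshrink(x, lower=-0.5, upper=0.5):
    # Hard shrink: keep x outside [lower, upper], zero inside the band.
    outside = tf.logical_or(x < lower, x > upper)
    return tf.where(outside, x, tf.zeros_like(x))

def softshrink(x, lower=-0.5, upper=0.5):
    # Soft shrink: shift x toward zero by the threshold, zero inside the band.
    return tf.where(x > upper, x - upper,
                    tf.where(x < lower, x - lower, tf.zeros_like(x)))
```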