Built-in activation functions.
Functions
deserialize(...)
: Returns the activation function corresponding to a string identifier.
elu(...)
: Exponential linear unit.
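The ELU formula is f(x) = x for x > 0 and alpha * (e^x - 1) otherwise (alpha defaults to 1.0 in Keras). A minimal scalar sketch of that formula, not the library implementation:

```python
import math

def elu(x, alpha=1.0):
    # Identity for positive inputs; smooth exponential ramp below zero
    # that saturates at -alpha as x -> -inf.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(elu(2.0))   # positive inputs pass through unchanged
print(elu(-1.0))  # e^-1 - 1, approximately -0.632
```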
exponential(...)
: Exponential activation function.
get(...)
: Returns an activation function given a string identifier, a callable, or None.
hard_sigmoid(...)
: Hard sigmoid activation function.
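The hard sigmoid is a piecewise-linear approximation of the sigmoid that avoids computing an exponential. A scalar sketch of the commonly used Keras definition (0 below -2.5, 1 above 2.5, linear in between):

```python
def hard_sigmoid(x):
    # Piecewise-linear stand-in for sigmoid: clamp a line of slope 0.2
    # through (0, 0.5) into the [0, 1] range. Cheaper than exp().
    if x < -2.5:
        return 0.0
    if x > 2.5:
        return 1.0
    return 0.2 * x + 0.5
```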
linear(...)
: Linear activation function.
relu(...)
: Applies the rectified linear unit activation function.
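In its default form ReLU is simply max(x, 0), but the Keras function also exposes knobs for a leaky negative slope, a saturation ceiling, and an activation threshold. A scalar sketch using descriptive keyword names (the exact parameter names vary across Keras versions):

```python
def relu(x, negative_slope=0.0, max_value=None, threshold=0.0):
    # Default case is max(x, 0). Below the threshold the output is a
    # (usually zero) leaky slope; max_value optionally caps the output.
    if x < threshold:
        y = negative_slope * (x - threshold)
    else:
        y = x
    if max_value is not None:
        y = min(y, max_value)
    return y

print(relu(5.0))                       # 5.0
print(relu(-3.0))                      # 0.0
print(relu(10.0, max_value=6.0))       # capped at 6.0
```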
selu(...)
: Scaled Exponential Linear Unit (SELU).
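SELU is ELU multiplied by a fixed scale, with alpha and scale set to specific constants (from Klambauer et al., 2017) chosen so that activations self-normalize toward zero mean and unit variance across deep layers. A scalar sketch of the formula:

```python
import math

# Fixed constants from the SELU derivation; they are not tunable.
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    # Scaled ELU: scale * x for x > 0, scale * alpha * (e^x - 1) otherwise.
    return SELU_SCALE * (x if x > 0 else SELU_ALPHA * (math.exp(x) - 1.0))
```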
serialize(...)
: Returns the name attribute (__name__) of a function.
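Together, get, serialize, and deserialize let activations round-trip through config files as plain strings. A hypothetical miniature registry illustrating the lookup pattern (the real Keras lookup covers every built-in listed on this page):

```python
def linear(x):
    return x

def relu(x):
    return max(x, 0.0)

# Hypothetical registry for illustration only.
_ACTIVATIONS = {"linear": linear, "relu": relu}

def deserialize(name):
    # String identifier -> function.
    return _ACTIVATIONS[name]

def serialize(fn):
    # Function -> its string name, suitable for a saved config.
    return fn.__name__

def get(identifier):
    # Accept a string name, a callable, or None (treated as linear).
    if identifier is None:
        return linear
    if callable(identifier):
        return identifier
    return deserialize(identifier)
```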
sigmoid(...)
: Sigmoid activation function.
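The sigmoid maps any real number into the open interval (0, 1) via 1 / (1 + e^-x). A scalar sketch of the formula:

```python
import math

def sigmoid(x):
    # 1 / (1 + e^-x): monotonic squashing into (0, 1), with sigmoid(0) = 0.5.
    return 1.0 / (1.0 + math.exp(-x))
```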
softmax(...)
: Softmax converts a real vector to a vector of categorical probabilities.
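Softmax exponentiates each element and normalizes by the sum, so the outputs are positive and sum to 1. A sketch using the standard max-subtraction trick for numerical stability (softmax is shift-invariant, so the result is unchanged):

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating so large inputs cannot
    # overflow; then normalize so the outputs sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([1.0, 2.0, 3.0]))  # probabilities, largest for the 3.0 entry
```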
softplus(...)
: Softplus activation function.
softsign(...)
: Softsign activation function.
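Softplus, log(1 + e^x), is a smooth approximation of ReLU; softsign, x / (1 + |x|), saturates toward plus or minus 1 like tanh but polynomially rather than exponentially. Scalar sketches of both formulas:

```python
import math

def softplus(x):
    # log(1 + e^x): smooth, everywhere-positive approximation of ReLU.
    return math.log1p(math.exp(x))

def softsign(x):
    # x / (1 + |x|): tanh-like squashing into (-1, 1), slower saturation.
    return x / (1.0 + abs(x))
```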
swish(...)
: Swish activation function.
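Swish is x * sigmoid(x): smooth and non-monotonic (it dips slightly below zero for small negative inputs) and approximately equal to x for large positive inputs. A scalar sketch of the formula:

```python
import math

def swish(x):
    # x * sigmoid(x), written as a single expression.
    return x / (1.0 + math.exp(-x))

print(swish(0.0))   # exactly 0
print(swish(10.0))  # close to 10: sigmoid(10) is nearly 1
```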
tanh(...)
: Hyperbolic tangent activation function.