Built-in activation functions.
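An activation can be attached to a layer either by its string identifier or as a callable from this module. A minimal sketch of both forms (assuming TensorFlow 2.x is installed; the layer width of 64 is an arbitrary placeholder):

import tensorflow as tf

# Equivalent ways to attach a built-in activation to a layer:
# by string identifier, or as a callable from tf.keras.activations.
layer_by_name = tf.keras.layers.Dense(64, activation="relu")
layer_by_callable = tf.keras.layers.Dense(64, activation=tf.keras.activations.relu)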
Functions
elu(...): Exponential linear unit.
exponential(...): Exponential activation function.
hard_sigmoid(...): Hard sigmoid activation function.
linear(...): Linear activation function.
relu(...): Rectified Linear Unit.
selu(...): Scaled Exponential Linear Unit (SELU).
sigmoid(...): Sigmoid activation function.
softmax(...): The softmax activation function transforms the outputs so that all values are in the range (0, 1) and sum to 1.
softplus(...): Softplus activation function.
softsign(...): Softsign activation function.
tanh(...): Hyperbolic tangent (tanh) activation function.