tf.raw_ops.Elu

Computes the exponential linear function.

Compat aliases for migration

See Migration guide for more details.

tf.compat.v1.raw_ops.Elu

The ELU function is defined as:

  • ELU(x) = x, if x >= 0
  • ELU(x) = exp(x) - 1, if x < 0
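The piecewise definition above can be sketched in plain Python (a reference implementation for illustration, not the actual TensorFlow kernel, which operates element-wise on tensors):

```python
import math

def elu(x):
    # Identity for non-negative inputs; exp(x) - 1 for negative inputs.
    # For large negative x, exp(x) underflows to 0, so the output
    # saturates at -1.
    return x if x >= 0 else math.exp(x) - 1.0

print(elu(1.0))      # 1.0
print(elu(0.0))      # 0.0
print(elu(-1000.0))  # -1.0 (saturated)
```

Unlike ReLU, the negative branch keeps a nonzero gradient (exp(x)) for finite negative inputs, which is the motivation given in the paper cited below.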

Examples:

tf.nn.elu(1.0)
<tf.Tensor: shape=(), dtype=float32, numpy=1.0>
tf.nn.elu(0.0)
<tf.Tensor: shape=(), dtype=float32, numpy=0.0>
tf.nn.elu(-1000.0)
<tf.Tensor: shape=(), dtype=float32, numpy=-1.0>

See Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) (Clevert et al., 2015)

Args

features: A Tensor. Must be one of the following types: half, bfloat16, float32, float64.
name: A name for the operation (optional).

Returns

A Tensor. Has the same type as features.