Rectified Linear Unit.
`tf.keras.activations.relu(x, alpha=0.0, max_value=None, threshold=0)`
With default values, it returns element-wise `max(x, 0)`.

Otherwise, it follows:

- `f(x) = max_value` for `x >= max_value`
- `f(x) = x` for `threshold <= x < max_value`
- `f(x) = alpha * (x - threshold)` otherwise
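For concreteness, here is a short usage sketch; the commented outputs illustrate each parameter's effect (exact numeric formatting may vary between TensorFlow versions):

```python
import tensorflow as tf

foo = tf.constant([-10, -5, 0.0, 5, 10], dtype=tf.float32)

# Default behaviour: element-wise max(x, 0)
print(tf.keras.activations.relu(foo).numpy())
# [ 0.  0.  0.  5. 10.]

# alpha gives the negative section a nonzero slope (leaky ReLU)
print(tf.keras.activations.relu(foo, alpha=0.5).numpy())
# [-5.  -2.5  0.   5.  10. ]

# max_value saturates the output from above
print(tf.keras.activations.relu(foo, max_value=5.0).numpy())
# [0. 0. 0. 5. 5.]

# Inputs that do not exceed the threshold are zeroed (alpha defaults to 0)
print(tf.keras.activations.relu(foo, threshold=5.0).numpy())
# [-0. -0.  0.  0. 10.]
```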
| Arguments | |
|---|---|
| `x` | A tensor or variable. |
| `alpha` | A scalar, slope of the negative section (default `0.`). |
| `max_value` | A float. Saturation threshold. |
| `threshold` | A float. Threshold value for thresholded activation. |
| Returns | |
|---|---|
| A tensor. | |
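As a usage note, the default configuration can be passed to a layer by name (`activation='relu'`). For non-default arguments, one common pattern (a sketch, not the only option) is binding them with `functools.partial`:

```python
import functools
import tensorflow as tf

# Bind a saturation ceiling of 6 to get a ReLU6-style activation;
# the resulting callable is accepted anywhere a Keras activation is.
relu6_like = functools.partial(tf.keras.activations.relu, max_value=6.0)

layer = tf.keras.layers.Dense(64, activation=relu6_like)
```

Note that a wrapped callable like this may not serialize by name the way the built-in string activations do.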