Warning: This project is deprecated. TensorFlow Addons has stopped development; the project will only provide minimal maintenance releases until May 2024. See the full announcement on GitHub.
Module: tfa.activations
Additional activation functions.
Functions
gelu(...): Gaussian Error Linear Unit.
hardshrink(...): Hard shrink function.
lisht(...): LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function.
mish(...): Mish: A Self Regularized Non-Monotonic Neural Activation Function.
rrelu(...): Randomized leaky rectified linear unit function.
snake(...): Snake activation to learn periodic functions.
softshrink(...): Soft shrink function.
sparsemax(...): Sparsemax activation function.
tanhshrink(...): Tanh shrink function.

The sketches below illustrate typical usage: applying the element-wise functions, normalizing logits with sparsemax, and plugging an activation into a Keras layer.
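Most of these functions are element-wise: they take a tensor and return a tensor of the same shape. A minimal sketch, assuming TensorFlow Addons is installed (pip install tensorflow-addons) and that hardshrink/softshrink keep their documented defaults of lower=-0.5 and upper=0.5:

```python
import tensorflow as tf
import tensorflow_addons as tfa

x = tf.constant([-1.0, -0.5, 0.0, 0.5, 1.0])

# Element-wise activations: same shape in, same shape out.
print(tfa.activations.gelu(x))        # Gaussian Error Linear Unit
print(tfa.activations.mish(x))        # x * tanh(softplus(x))
print(tfa.activations.tanhshrink(x))  # x - tanh(x)

# The shrink functions zero out values inside [lower, upper].
print(tfa.activations.hardshrink(x, lower=-0.5, upper=0.5))
print(tfa.activations.softshrink(x, lower=-0.5, upper=0.5))
```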
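sparsemax is the exception: like softmax, it normalizes logits along an axis rather than acting element-wise, but it can assign exactly zero probability to low-scoring entries. A sketch, assuming 2-D logits normalized over the last axis:

```python
import tensorflow as tf
import tensorflow_addons as tfa

logits = tf.constant([[1.0, 2.0, 3.0],
                      [4.0, 1.0, 0.5]])

# Each row sums to 1, like softmax, but small logits can map to
# exactly 0, yielding a sparse probability distribution.
probs = tfa.activations.sparsemax(logits, axis=-1)
print(probs)
```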
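Because each activation is a plain callable, it can be passed directly as the activation argument of a Keras layer. The sketch below also shows rrelu, which behaves differently in training and inference; the lower/upper values shown are the defaults from its function docs and are assumptions here:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Any tensor-in/tensor-out callable works as a Keras activation.
layer = tf.keras.layers.Dense(16, activation=tfa.activations.lisht)
print(layer(tf.random.normal([4, 8])).shape)  # (4, 16)

# rrelu samples its negative slope from [lower, upper] during
# training and uses the mean slope at inference, so pass the
# training flag explicitly.
x = tf.random.normal([4, 8])
y_train = tfa.activations.rrelu(x, lower=0.125, upper=1 / 3, training=True)
y_infer = tfa.activations.rrelu(x, training=False)
```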
Last updated 2023-05-25 UTC.