Update '*var' according to the Adam algorithm.
$$lr_t := \text{learning\_rate} \cdot \sqrt{1 - \beta_2^t} / (1 - \beta_1^t)$$
$$m_t := \beta_1 \cdot m_{t-1} + (1 - \beta_1) \cdot g$$
$$v_t := \beta_2 \cdot v_{t-1} + (1 - \beta_2) \cdot g^2$$
$$\text{variable} := \text{variable} - lr_t \cdot m_t / (\sqrt{v_t} + \epsilon)$$
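Concretely, one step of these formulas for a single parameter element could be sketched in plain Java as follows. This is an illustrative scalar rendering of the math above, not the op's kernel; all names in the sketch are made up.

```java
final class AdamMath {
  // Scalar sketch of one Adam step, following the formulas above.
  // mv[0] holds m_{t-1} on entry and m_t on exit; mv[1] likewise for v.
  // beta1Power = beta1^t and beta2Power = beta2^t at the current step t.
  static float adamStep(float variable, float[] mv, float g,
                        float learningRate, float beta1, float beta2,
                        float beta1Power, float beta2Power, float epsilon) {
    float lrT = learningRate * (float) Math.sqrt(1 - beta2Power) / (1 - beta1Power);
    mv[0] = beta1 * mv[0] + (1 - beta1) * g;       // m_t
    mv[1] = beta2 * mv[1] + (1 - beta2) * g * g;   // v_t
    return variable - lrT * mv[0] / ((float) Math.sqrt(mv[1]) + epsilon);
  }
}
```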
Nested Classes
| Modifier | Class | Description |
|---|---|---|
| class | ApplyAdam.Options | Optional attributes for ApplyAdam |

Constants
| Type | Name | Description |
|---|---|---|
| String | OP_NAME | The name of this op, as known by the TensorFlow core engine |

Public Methods
| Return type | Method and description |
|---|---|
| Output<T> | asOutput() Returns the symbolic handle of the tensor. |
| static <T extends TType> ApplyAdam<T> | create(Scope scope, Operand<T> var, Operand<T> m, Operand<T> v, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options) Factory method to create a class wrapping a new ApplyAdam operation. |
| Output<T> | out() Same as "var". |
| static ApplyAdam.Options | useLocking(Boolean useLocking) |
| static ApplyAdam.Options | useNesterov(Boolean useNesterov) |
Constants
public static final String OP_NAME
The name of this op, as known by the TensorFlow core engine
Public Methods
public Output<T> asOutput ()
Returns the symbolic handle of the tensor.
Inputs to TensorFlow operations are outputs of another TensorFlow operation. This method is used to obtain a symbolic handle that represents the computation of the input.
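Because the generated op classes implement Operand<T> through asOutput(), an ApplyAdam node can be passed directly to downstream ops. A minimal sketch (the helper name and the use of tf.math.mul are illustrative):

```java
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.ApplyAdam;
import org.tensorflow.types.TFloat32;

final class AsOutputSketch {
  // Scales the updated variable produced by an ApplyAdam node; asOutput()
  // is what lets the node be consumed as an Operand<TFloat32> here
  // without any explicit conversion.
  static Operand<TFloat32> scaleUpdated(Ops tf, ApplyAdam<TFloat32> step) {
    return tf.math.mul(step, tf.constant(2.0f));
  }
}
```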
public static ApplyAdam<T> create (Scope scope, Operand<T> var, Operand<T> m, Operand<T> v, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ApplyAdam operation.
Parameters
| Parameter | Description |
|---|---|
| scope | current scope |
| var | Should be from a Variable(). |
| m | Should be from a Variable(). |
| v | Should be from a Variable(). |
| beta1Power | Must be a scalar. |
| beta2Power | Must be a scalar. |
| lr | Scaling factor. Must be a scalar. |
| beta1 | Momentum factor. Must be a scalar. |
| beta2 | Momentum factor. Must be a scalar. |
| epsilon | Ridge term. Must be a scalar. |
| grad | The gradient. |
| options | carries optional attribute values |
Returns
- a new instance of ApplyAdam
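A minimal sketch of wiring this factory method up through the generated op wrappers. It assumes the org.tensorflow:tensorflow-core-api artifact; the variable initializers, hyperparameter values, and the tf.variable convenience overload used to create initialized variables are illustrative assumptions, not prescribed usage.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.ApplyAdam;
import org.tensorflow.types.TFloat32;

public final class ApplyAdamSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // The trainable parameter plus Adam's first- and second-moment
      // slots, all created as variables as the parameter docs require.
      Operand<TFloat32> var = tf.variable(tf.constant(new float[] {1.0f, 2.0f}));
      Operand<TFloat32> m = tf.variable(tf.constant(new float[] {0.0f, 0.0f}));
      Operand<TFloat32> v = tf.variable(tf.constant(new float[] {0.0f, 0.0f}));

      // Scalar hyperparameters. beta1Power and beta2Power are beta1^t and
      // beta2^t for the current step t, which the caller must track.
      Operand<TFloat32> beta1Power = tf.constant(0.9f);
      Operand<TFloat32> beta2Power = tf.constant(0.999f);
      Operand<TFloat32> lr = tf.constant(0.001f);
      Operand<TFloat32> beta1 = tf.constant(0.9f);
      Operand<TFloat32> beta2 = tf.constant(0.999f);
      Operand<TFloat32> epsilon = tf.constant(1e-8f);

      // In a real training loop the gradient would come from tf.gradients;
      // a constant stands in here.
      Operand<TFloat32> grad = tf.constant(new float[] {0.1f, -0.2f});

      ApplyAdam<TFloat32> step = ApplyAdam.create(
          tf.scope(), var, m, v, beta1Power, beta2Power,
          lr, beta1, beta2, epsilon, grad,
          ApplyAdam.useLocking(true));

      // step.out() (equivalently step.asOutput()) is the updated `var`.
    }
  }
}
```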
public static ApplyAdam.Options useLocking (Boolean useLocking)
Parameters
| Parameter | Description |
|---|---|
| useLocking | If `True`, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
public static ApplyAdam.Options useNesterov (Boolean useNesterov)
Parameters
| Parameter | Description |
|---|---|
| useNesterov | If `True`, uses the Nesterov update. |
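For reference, when useNesterov is enabled the first three formulas above are unchanged and only the final update differs; as we read TensorFlow's training kernels, it takes the Nesterov-corrected form

$$\text{variable} := \text{variable} - lr_t \cdot (\beta_1 \cdot m_t + (1 - \beta_1) \cdot g) / (\sqrt{v_t} + \epsilon)$$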