Update '*var' according to the RMSProp algorithm.
Note that in the dense implementation of this algorithm, ms and mom will update even if grad is zero; in this sparse implementation, ms and mom will not update in iterations during which grad is zero.
mean_square = decay * mean_square + (1 - decay) * gradient ** 2
Delta = learning_rate * gradient / sqrt(mean_square + epsilon)

ms <- rho * ms_{t-1} + (1 - rho) * grad * grad
mom <- momentum * mom_{t-1} + lr * grad / sqrt(ms + epsilon)
var <- var - mom
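As a concrete illustration, the following minimal graph-construction sketch wires up the op through the TensorFlow Java API (assuming the `tensorflow-core-api` artifact; the shapes, hyperparameter values, and constant gradient are illustrative, not prescribed by the op):

```java
import org.tensorflow.Graph;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Variable;
import org.tensorflow.op.train.ApplyRmsProp;
import org.tensorflow.types.TFloat32;

public class RmsPropStep {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // State: the trainable parameters plus the two RMSProp accumulators.
      Variable<TFloat32> var = tf.variable(Shape.of(2), TFloat32.class);
      Variable<TFloat32> ms = tf.variable(Shape.of(2), TFloat32.class);
      Variable<TFloat32> mom = tf.variable(Shape.of(2), TFloat32.class);

      // grad is a constant here; in real training it would come from
      // a gradient computation over a loss.
      ApplyRmsProp<TFloat32> update = ApplyRmsProp.create(
          tf.scope(), var, ms, mom,
          tf.constant(0.01f),  // lr: scaling factor (scalar)
          tf.constant(0.9f),   // rho: decay rate (scalar)
          tf.constant(0.0f),   // momentum (scalar)
          tf.constant(1e-7f),  // epsilon: ridge term (scalar)
          tf.constant(new float[] {0.1f, -0.2f})); // grad
    }
  }
}
```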
Nested Classes

| Modifier | Class | Description |
|---|---|---|
| class | `ApplyRmsProp.Options` | Optional attributes for ApplyRmsProp |

Constants

| Type | Constant | Description |
|---|---|---|
| String | `OP_NAME` | The name of this op, as known by the TensorFlow core engine |

Public Methods

| Return type | Method and description |
|---|---|
| `Output<T>` | `asOutput()` Returns the symbolic handle of the tensor. |
| `static <T extends TType> ApplyRmsProp<T>` | `create(Scope scope, Operand<T> var, Operand<T> ms, Operand<T> mom, Operand<T> lr, Operand<T> rho, Operand<T> momentum, Operand<T> epsilon, Operand<T> grad, Options... options)` Factory method to create a class wrapping a new ApplyRmsProp operation. |
| `Output<T>` | `out()` Same as "var". |
| `static ApplyRmsProp.Options` | `useLocking(Boolean useLocking)` |
Inherited Methods
Constants
public static final String OP_NAME
The name of this op, as known by the TensorFlow core engine.
Public Methods
public Output<T> asOutput ()
Returns the symbolic handle of the tensor.
Inputs to TensorFlow operations are outputs of another TensorFlow operation. This method is used to obtain a symbolic handle that represents the computation of the input.
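Because `ApplyRmsProp` implements `Operand<T>`, the instance can be passed directly wherever an input is expected, and `asOutput()` is the explicit form of that conversion. A short sketch, reusing `tf` and `update` from the example above:

```java
// Explicit symbolic handle to the op's single output (the updated var).
Output<TFloat32> handle = update.asOutput();

// Implicit equivalent: the op itself is accepted as an Operand input.
Operand<TFloat32> scaled = tf.math.mul(update, tf.constant(2.0f));
```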
public static ApplyRmsProp<T> create (Scope scope, Operand<T> var, Operand<T> ms, Operand<T> mom, Operand<T> lr, Operand<T> rho, Operand<T> momentum, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ApplyRmsProp operation.
Parameters
| Parameter | Description |
|---|---|
| scope | current scope |
| var | Should be from a Variable(). |
| ms | Should be from a Variable(). |
| mom | Should be from a Variable(). |
| lr | Scaling factor. Must be a scalar. |
| rho | Decay rate. Must be a scalar. |
| momentum | Momentum coefficient (see the update rule above). Must be a scalar. |
| epsilon | Ridge term. Must be a scalar. |
| grad | The gradient. |
| options | carries optional attribute values |
Returns
- a new instance of ApplyRmsProp
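To actually apply a step, the created op is run as a target in a session. A sketch, reusing the graph and operands from the earlier example (assuming `tf.assign` for initialization; `Session.Runner` details vary slightly across `tensorflow-java` versions):

```java
try (Session s = new Session(g)) {
  // Initialize var and zero the two accumulators before the first step.
  s.runner()
      .addTarget(tf.assign(var, tf.constant(new float[] {1f, 2f})))
      .addTarget(tf.assign(ms, tf.constant(new float[] {0f, 0f})))
      .addTarget(tf.assign(mom, tf.constant(new float[] {0f, 0f})))
      .run();

  // One RMSProp step: var, ms, and mom are all updated in place.
  s.runner().addTarget(update).run();
}
```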
public static ApplyRmsProp.Options useLocking (Boolean useLocking)
Parameters
| Parameter | Description |
|---|---|
| useLocking | If `True`, updating of the var, ms, and mom tensors is protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
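For instance, a locked update could be requested like this (an illustrative fragment, reusing the operands from the `create` example above):

```java
// Options are passed as trailing varargs to the factory method.
ApplyRmsProp<TFloat32> lockedUpdate = ApplyRmsProp.create(
    tf.scope(), var, ms, mom,
    tf.constant(0.01f), tf.constant(0.9f), tf.constant(0.0f), tf.constant(1e-7f),
    tf.constant(new float[] {0.1f, -0.2f}),
    ApplyRmsProp.useLocking(true)); // serialize concurrent updates to var, ms, mom
```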