public final class ResourceApplyProximalAdagrad
Update '*var' and '*accum' according to FOBOS with Adagrad learning rate.
accum += grad * grad
prox_v = var - lr * grad * (1 / sqrt(accum))
var = sign(prox_v) / (1 + lr * l2) * max{|prox_v| - lr * l1, 0}
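With the multiplication signs restored, the rule is easy to check by hand. The following scalar walk-through in plain Java uses illustrative values (not part of this op's documentation) to trace one update:

```java
// One proximal-Adagrad step on a scalar, with illustrative values.
double var = 1.0, accum = 0.0, grad = 0.5;
double lr = 0.1, l1 = 0.01, l2 = 0.001;

accum += grad * grad;                                      // accum = 0.25
double proxV = var - lr * grad * (1 / Math.sqrt(accum));   // 1.0 - 0.1 * 0.5 / 0.5 = 0.9
var = Math.signum(proxV) / (1 + lr * l2)
    * Math.max(Math.abs(proxV) - lr * l1, 0);              // ~ 0.8989
```

The L1 term shrinks |prox_v| toward zero (clipping it to exactly zero when it is small enough), and the L2 term scales the result down slightly.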
Nested Classes
class | ResourceApplyProximalAdagrad.Options | Optional attributes for ResourceApplyProximalAdagrad |
Constants
String | OP_NAME | The name of this op, as known by TensorFlow core engine |
Public Methods
static <T extends TType> ResourceApplyProximalAdagrad | create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<T> grad, Options... options) |
static ResourceApplyProximalAdagrad.Options | useLocking(Boolean useLocking) |
Constants
public static final String OP_NAME
The name of this op, as known by TensorFlow core engine
Constant Value: "ResourceApplyProximalAdagrad"
Public Methods
public static <T extends TType> ResourceApplyProximalAdagrad create (Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyProximalAdagrad operation.
Parameters
scope | current scope |
var | Should be from a Variable(). |
accum | Should be from a Variable(). |
lr | Scaling factor. Must be a scalar. |
l1 | L1 regularization. Must be a scalar. |
l2 | L2 regularization. Must be a scalar. |
grad | The gradient. |
options | carries optional attribute values |
Returns
- a new instance of ResourceApplyProximalAdagrad
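The snippet below is a minimal sketch of wiring this op into a graph and running it. It assumes the TensorFlow Java API for resource variables (tf.varHandleOp, tf.assignVariableOp) and Session execution; the shapes, initial values, and hyperparameters are illustrative only:

```java
import org.tensorflow.Graph;
import org.tensorflow.Session;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.AssignVariableOp;
import org.tensorflow.op.core.VarHandleOp;
import org.tensorflow.op.train.ResourceApplyProximalAdagrad;
import org.tensorflow.types.TFloat32;

public class ProximalAdagradExample {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Resource handles; both inputs "should be from a Variable()".
      VarHandleOp varHandle = tf.varHandleOp(TFloat32.class, Shape.of(2));
      VarHandleOp accumHandle = tf.varHandleOp(TFloat32.class, Shape.of(2));

      // Initialize the variable and its accumulator (a small positive
      // accumulator keeps 1 / sqrt(accum) well defined).
      AssignVariableOp initVar =
          tf.assignVariableOp(varHandle, tf.constant(new float[] {1.0f, 2.0f}));
      AssignVariableOp initAccum =
          tf.assignVariableOp(accumHandle, tf.constant(new float[] {0.1f, 0.1f}));

      // One proximal-Adagrad update; lr, l1, l2, and grad are illustrative.
      ResourceApplyProximalAdagrad apply =
          ResourceApplyProximalAdagrad.create(
              tf.scope(),
              varHandle,
              accumHandle,
              tf.constant(0.1f),                       // lr
              tf.constant(0.01f),                      // l1
              tf.constant(0.001f),                     // l2
              tf.constant(new float[] {0.5f, -0.5f}),  // grad
              ResourceApplyProximalAdagrad.useLocking(true));

      try (Session s = new Session(g)) {
        s.runner().addTarget(initVar).addTarget(initAccum).run();  // initialize
        s.runner().addTarget(apply).run();                         // one update
      }
    }
  }
}
```

In a typical program the same op is reached through the grouped Ops API (e.g. tf.train.resourceApplyProximalAdagrad(...)) rather than by calling create directly with a Scope.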
public static ResourceApplyProximalAdagrad.Options useLocking (Boolean useLocking)
Parameters
useLocking | If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |