Retrieve Adagrad embedding parameters.
An op that retrieves optimization parameters from embedding to host memory. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to retrieve updated parameters before saving a checkpoint.
Nested Classes
class | RetrieveTPUEmbeddingAdagradParameters.Options | Optional attributes for RetrieveTPUEmbeddingAdagradParameters
Constants
String | OP_NAME | The name of this op, as known by TensorFlow core engine
Public Methods
Output&lt;TFloat32&gt; | accumulators() | Parameter accumulators updated by the Adagrad optimization algorithm.
static RetrieveTPUEmbeddingAdagradParameters.Options | config(String config) |
static RetrieveTPUEmbeddingAdagradParameters | create(Scope scope, Long numShards, Long shardId, Options... options) | Factory method to create a class wrapping a new RetrieveTPUEmbeddingAdagradParameters operation.
Output&lt;TFloat32&gt; | parameters() | Parameter parameters updated by the Adagrad optimization algorithm.
static RetrieveTPUEmbeddingAdagradParameters.Options | tableId(Long tableId) |
static RetrieveTPUEmbeddingAdagradParameters.Options | tableName(String tableName) |
Constants
public static final String OP_NAME
The name of this op, as known by TensorFlow core engine
Public Methods
public Output<TFloat32> accumulators ()
Parameter accumulators updated by the Adagrad optimization algorithm.
public static RetrieveTPUEmbeddingAdagradParameters create (Scope scope, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new RetrieveTPUEmbeddingAdagradParameters operation.
Parameters
scope | current scope
options | carries optional attribute values
Returns
- a new instance of RetrieveTPUEmbeddingAdagradParameters
public Output<TFloat32> parameters ()
Parameter parameters updated by the Adagrad optimization algorithm.
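The retrieved parameters and accumulators are typically copied into checkpointable variables before saving. A minimal sketch of building this op in a graph follows; it assumes the `org.tensorflow` Java bindings, will not actually execute without a configured TPU system and a preceding ConfigureTPUEmbeddingHost op, and uses the hypothetical table name `"my_table"`:

```java
import org.tensorflow.Graph;
import org.tensorflow.Output;
import org.tensorflow.op.Ops;
import org.tensorflow.op.tpu.RetrieveTPUEmbeddingAdagradParameters;
import org.tensorflow.types.TFloat32;

public class RetrieveAdagradExample {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);
      // Build the retrieval op. A ConfigureTPUEmbeddingHost op must already
      // have set up the embedding table configuration on this host.
      // "my_table" is a hypothetical table name for illustration.
      RetrieveTPUEmbeddingAdagradParameters retrieved =
          RetrieveTPUEmbeddingAdagradParameters.create(
              tf.scope(),
              1L,  // numShards: total number of shards the table is split across
              0L,  // shardId: index of this host's shard
              RetrieveTPUEmbeddingAdagradParameters.tableName("my_table"));
      // Outputs holding the updated Adagrad state for this table shard.
      Output<TFloat32> parameters = retrieved.parameters();
      Output<TFloat32> accumulators = retrieved.accumulators();
      // These outputs would then be assigned into checkpointable variables
      // before writing a checkpoint.
    }
  }
}
```

Note that `tableName` and `tableId` are alternative ways to select the table; exactly one of the two attributes should identify the table being retrieved.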