Load ADAM embedding parameters with debug support.
An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.
Nested Classes
class | LoadTPUEmbeddingADAMParametersGradAccumDebug.Options | Optional attributes for LoadTPUEmbeddingADAMParametersGradAccumDebug |
Constants
String | OP_NAME | The name of this op, as known by TensorFlow core engine |
Public Methods
static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options | config(String config) |
static LoadTPUEmbeddingADAMParametersGradAccumDebug | create(Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> momenta, Operand<TFloat32> velocities, Operand<TFloat32> gradientAccumulators, Long numShards, Long shardId, Options... options) Factory method to create a class wrapping a new LoadTPUEmbeddingADAMParametersGradAccumDebug operation. |
static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options | tableId(Long tableId) |
static LoadTPUEmbeddingADAMParametersGradAccumDebug.Options | tableName(String tableName) |
Inherited Methods
Constants
public static final String OP_NAME
The name of this op, as known by TensorFlow core engine
Public Methods
public static LoadTPUEmbeddingADAMParametersGradAccumDebug create (Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> momenta, Operand<TFloat32> velocities, Operand<TFloat32> gradientAccumulators, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new LoadTPUEmbeddingADAMParametersGradAccumDebug operation.
Parameters
scope | current scope |
parameters | Value of parameters used in the ADAM optimization algorithm. |
momenta | Value of momenta used in the ADAM optimization algorithm. |
velocities | Value of velocities used in the ADAM optimization algorithm. |
gradientAccumulators | Value of gradient_accumulators used in the ADAM optimization algorithm. |
numShards | Number of shards across which the embedding tables are divided. |
shardId | Identifier of the shard this operation loads. |
options | carries optional attribute values |
Returns
- a new instance of LoadTPUEmbeddingADAMParametersGradAccumDebug
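As a usage illustration, the sketch below builds the op in a graph via the factory method above. It assumes the TensorFlow Java bindings (org.tensorflow) and that a ConfigureTPUEmbeddingHost op has already set up the embedding configuration; the table shape, the zero-filled slot values, and the single-shard layout are hypothetical stand-ins for values restored from a checkpoint. Executing the graph additionally requires a TPU runtime, so this only shows graph construction.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.op.tpu.LoadTPUEmbeddingADAMParametersGradAccumDebug;
import org.tensorflow.types.TFloat32;

public final class LoadAdamSlotsExample {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Hypothetical 4x8 embedding table; in practice these values would
      // come from a restored checkpoint rather than zeros.
      float[][] restored = new float[4][8];
      Operand<TFloat32> parameters = tf.constant(restored);
      Operand<TFloat32> momenta = tf.constant(restored);
      Operand<TFloat32> velocities = tf.constant(restored);
      Operand<TFloat32> gradientAccumulators = tf.constant(restored);

      // Install the ADAM parameters and slot variables (including the
      // debug gradient accumulators) into TPU HBM for embedding table 0.
      LoadTPUEmbeddingADAMParametersGradAccumDebug.create(
          tf.scope(),
          parameters, momenta, velocities, gradientAccumulators,
          1L, // numShards: single-shard layout, for illustration only
          0L, // shardId
          LoadTPUEmbeddingADAMParametersGradAccumDebug.tableId(0L));
    }
  }
}
```

The op has no outputs; it is run for its side effect of writing the optimizer state into HBM, typically once before entering the training loop.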