Apply standard lookup ops with tf.tpu.experimental.embedding configs.
```python
tf.tpu.experimental.embedding.serving_embedding_lookup(
    inputs, weights, tables, feature_config
)
```
This function is a utility that allows you to use the tf.tpu.experimental.embedding config objects with standard lookup functions. It is useful when exporting a model that uses tf.tpu.experimental.embedding.TPUEmbedding for serving on CPU, since tf.tpu.experimental.embedding.TPUEmbedding only supports lookups on TPUs and should not be part of your serving graph.
Note that TPU-specific options (such as max_sequence_length) in the configuration objects will be ignored.
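As a minimal sketch of calling this function directly on CPU, the table, feature name, vocabulary size, and ids below are illustrative assumptions rather than part of any trained model:

```python
import tensorflow as tf

# Hypothetical single-table configuration; names and sizes are illustrative.
table = tf.tpu.experimental.embedding.TableConfig(
    vocabulary_size=8, dim=4, name='colors')
feature_config = {
    'color_id': tf.tpu.experimental.embedding.FeatureConfig(table=table)}

# On CPU the embedding tables are ordinary variables keyed by their
# TableConfig, matching the structure of TPUEmbedding.embedding_tables.
tables = {table: tf.Variable(tf.random.uniform([8, 4]))}

ids = {'color_id': tf.constant([0, 3, 7])}
embedded = tf.tpu.experimental.embedding.serving_embedding_lookup(
    ids, None, tables, feature_config)
# embedded has the same structure as ids; embedded['color_id'] holds the
# looked-up embedding rows for the three ids.
```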
In the following example we take a trained model (see the documentation for tf.tpu.experimental.embedding.TPUEmbedding for context) and create a SavedModel with a serving function that performs the embedding lookup and passes the results to your model:
```python
model = model_fn(...)
embedding = tf.tpu.experimental.embedding.TPUEmbedding(
    feature_config=feature_config,
    batch_size=1024,
    optimizer=tf.tpu.experimental.embedding.SGD(0.1))
checkpoint = tf.train.Checkpoint(model=model, embedding=embedding)
checkpoint.restore(...)

@tf.function(input_signature=[{'feature_one': tf.TensorSpec(...),
                               'feature_two': tf.TensorSpec(...),
                               'feature_three': tf.TensorSpec(...)}])
def serve_tensors(embedding_features):
  embedded_features = tf.tpu.experimental.embedding.serving_embedding_lookup(
      embedding_features, None, embedding.embedding_tables,
      feature_config)
  return model(embedded_features)

model.embedding_api = embedding
tf.saved_model.save(model,
                    export_dir=...,
                    signatures={'serving_default': serve_tensors})
```
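The exported model can then be loaded and its serving signature called on CPU. In this sketch the export path, feature shapes, and dtypes are assumptions for illustration and must match the TensorSpecs used above:

```python
# Hypothetical: load the exported model and call its serving signature on CPU.
loaded = tf.saved_model.load('/tmp/exported_model')  # assumed export path
serving_fn = loaded.signatures['serving_default']
outputs = serving_fn(
    feature_one=tf.constant([[0], [1]]),    # shapes/dtypes assumed to match
    feature_two=tf.constant([[2], [3]]),    # the TensorSpecs in the serving
    feature_three=tf.constant([[4], [5]]))  # function's input_signature
```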
| Returns |
|---|
| A nested structure of Tensors with the same structure as `inputs`. |