Defines signatures to support regress and predict serving.
tfr.keras.saved_model.Signatures(
model: tf.keras.Model,
context_feature_spec: Dict[str, Union[tf.io.FixedLenFeature, tf.io.RaggedFeature]],
example_feature_spec: Dict[str, Union[tf.io.FixedLenFeature, tf.io.RaggedFeature]],
mask_feature_name: str
)
This wraps the trained Keras model in two serving functions that can be saved with `tf.saved_model.save` or `model.save`, and loaded with the corresponding signature names. The regress serving signature takes a batch of serialized `tf.Example`s as input, whereas the predict serving signature takes a batch of serialized `ExampleListWithContext` protos as input.
Example usage:
A ranking model can be saved with signatures as follows:
tf.saved_model.save(model, path, signatures=Signatures(model, ...)())
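The one-liner above elides the constructor arguments. A fuller sketch is shown below; the feature specs and mask feature name are hypothetical, chosen only for illustration, and should be replaced by the specs the model was trained with:

```python
import tensorflow as tf
import tensorflow_ranking as tfr

# Hypothetical feature specs; substitute the specs used to train the model.
context_feature_spec = {
    "query_tokens": tf.io.RaggedFeature(dtype=tf.string),
}
example_feature_spec = {
    "document_tokens": tf.io.RaggedFeature(dtype=tf.string),
}

signatures = tfr.keras.saved_model.Signatures(
    model,
    context_feature_spec=context_feature_spec,
    example_feature_spec=example_feature_spec,
    mask_feature_name="example_list_mask",  # hypothetical mask feature name
)
tf.saved_model.save(model, path, signatures=signatures())
```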
For regress serving, scores can be generated using the `REGRESS` signature as follows:
loaded_model = tf.saved_model.load(path)
predictor = loaded_model.signatures[tf.saved_model.REGRESS_METHOD_NAME]
scores = predictor(serialized_examples)[tf.saved_model.REGRESS_OUTPUTS]
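Here `serialized_examples` is a batch of serialized `tf.Example` protos. A minimal sketch of building such a batch, using a hypothetical feature name:

```python
# Hypothetical per-example feature; use the features the model was trained on.
example = tf.train.Example(features=tf.train.Features(feature={
    "document_tokens": tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b"relevant", b"document"])),
}))
serialized_examples = tf.constant([example.SerializeToString()])
```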
For predict serving, scores can be generated using the `PREDICT` signature as follows:
loaded_model = tf.saved_model.load(path)
predictor = loaded_model.signatures[tf.saved_model.PREDICT_METHOD_NAME]
scores = predictor(serialized_elwcs)[tf.saved_model.PREDICT_OUTPUTS]
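Similarly, `serialized_elwcs` is a batch of serialized `ExampleListWithContext` protos. A minimal sketch of building one, assuming the proto definition from `tensorflow_serving.apis.input_pb2` is available and using hypothetical feature names:

```python
from tensorflow_serving.apis import input_pb2

elwc = input_pb2.ExampleListWithContext()
# Hypothetical context and per-example features.
elwc.context.features.feature["query_tokens"].bytes_list.value.append(b"ranking")
for doc_tokens in [b"first document", b"second document"]:
    example = elwc.examples.add()
    example.features.feature["document_tokens"].bytes_list.value.append(doc_tokens)
serialized_elwcs = tf.constant([elwc.SerializeToString()])
```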
Attributes | |
---|---|
`name` | Returns the name of this module as passed or determined in the ctor. |
`name_scope` | Returns a `tf.name_scope` instance for this class. |
`non_trainable_variables` | Sequence of non-trainable variables owned by this module and its submodules. |
`submodules` | Sequence of all sub-modules. Submodules are modules which are properties of this module, or found as properties of modules which are properties of this module (and so on). |
`trainable_variables` | Sequence of trainable variables owned by this module and its submodules. |
`variables` | Sequence of variables owned by this module and its submodules. |
Methods
normalize_outputs
normalize_outputs(
default_key: str, outputs: Union[tf.Tensor, Dict[str, tf.Tensor]]
) -> Dict[str, tf.Tensor]
Returns a dict of Tensors for outputs.
Args | |
---|---|
`default_key` | If `outputs` is a Tensor, use the `default_key` to make a dict. |
`outputs` | Outputs to be normalized. |
Returns | |
---|---|
A dict mapping from str-like key(s) to Tensor(s). |
Raises | |
---|---|
`TypeError` if `outputs` is neither a Tensor nor a dict. |
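As a sketch (assuming a `signatures` object constructed as in the earlier example), a raw score Tensor is wrapped into a dict under the given key:

```python
scores = tf.constant([[0.8, 0.1, 0.3]])
outputs = signatures.normalize_outputs(tf.saved_model.REGRESS_OUTPUTS, scores)
# outputs == {tf.saved_model.REGRESS_OUTPUTS: <tf.Tensor shape=(1, 3) ...>}
```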
predict_tf_function
predict_tf_function() -> Callable[[tf.Tensor], Dict[str, tf.Tensor]]
Makes a TensorFlow function for predict serving.
regress_tf_function
regress_tf_function() -> Callable[[tf.Tensor], Dict[str, tf.Tensor]]
Makes a TensorFlow function for regress serving.
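A sketch of calling the returned functions directly (assuming the `signatures`, `serialized_examples`, and `serialized_elwcs` objects from the earlier examples):

```python
regress_fn = signatures.regress_tf_function()
regress_outputs = regress_fn(serialized_examples)   # dict of output Tensors

predict_fn = signatures.predict_tf_function()
predict_outputs = predict_fn(serialized_elwcs)      # dict of output Tensors
```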
with_name_scope
@classmethod
with_name_scope( method )
Decorator to automatically enter the module name scope.
class MyModule(tf.Module):
  @tf.Module.with_name_scope
  def __call__(self, x):
    if not hasattr(self, 'w'):
      self.w = tf.Variable(tf.random.normal([x.shape[1], 3]))
    return tf.matmul(x, self.w)
Using the above module would produce `tf.Variable`s and `tf.Tensor`s whose names include the module name:
mod = MyModule()
mod(tf.ones([1, 2]))
<tf.Tensor: shape=(1, 3), dtype=float32, numpy=..., dtype=float32)>
mod.w
<tf.Variable 'my_module/Variable:0' shape=(2, 3) dtype=float32,
numpy=..., dtype=float32)>
Args | |
---|---|
`method` | The method to wrap. |
Returns | |
---|---|
The original method wrapped such that it enters the module's name scope. |
__call__
__call__(
serving_default: str = 'regress'
) -> Dict[str, Callable[[tf.Tensor], Dict[str, tf.Tensor]]]
Returns a dict of signatures.
Args | |
---|---|
`serving_default` | Specifies "regress" or "predict" as the serving_default signature. |
Returns | |
---|---|
A dict of signatures. |
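For example, to make the predict signature the serving default when exporting (a sketch, reusing the `signatures` object constructed earlier):

```python
tf.saved_model.save(
    model, path, signatures=signatures(serving_default="predict"))
```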