AutoGraph-based auto-batching frontend.
Modules
ab_type_inference module: Type inference pass on functional control flow graph.
allocation_strategy module: Live variable analysis.
dsl module: Python-embedded DSL frontend for authoring autobatchable IR programs.
gast_util module: Gast compatibility library. Supports 0.2.2 and 0.3.2.
instructions module: Instruction language for auto-batching virtual machine.
lowering module: Lowering the full IR to stack machine instructions.
st module: A stackless auto-batching VM.
stack module: Optimizing stack usage (pushes and pops).
tf_backend module: TensorFlow (graph) backend for auto-batching VM.
vm module: The auto-batching VM itself.
Classes
class Context: Context object for auto-batching multiple Python functions together.
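The sketch below illustrates the intended usage pattern for Context: decorate a recursive, single-example Python function so that one call processes a whole batch. It is a minimal sketch based on TensorFlow Probability's auto-batching examples; the import path, the type_inference contract, and the max_stack_depth keyword are assumptions that may differ between TFP versions.

```python
import tensorflow as tf
import tensorflow_probability as tfp

frontend = tfp.experimental.auto_batching.frontend  # module path assumed from this page

# One Context collects all the functions that should be batched together.
ctx = frontend.Context()

# Assumption: `type_inference` maps the argument type(s) to the return
# type(s); here the result is assumed to have the same type as the input.
@ctx.batch(type_inference=lambda arg: arg)
def fibonacci(n):
  if n > 1:
    return fibonacci(n - 1) + fibonacci(n - 2)
  else:
    return 1

# Assumption: the decorated function accepts a batch of inputs plus a
# `max_stack_depth` bound for the VM's explicit recursion stack.
result = fibonacci(tf.constant([6, 7, 8]), max_stack_depth=15)
```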
Functions
truthy(...): Normalizes Tensor ranks for use in if conditions.
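The entry above says truthy normalizes Tensor ranks so a value can drive an if condition. The snippet below is an illustrative stand-in, not the library's implementation: it assumes the point of the normalization is that AutoGraph-style control flow needs a scalar boolean, so a rank-1, single-element tensor is squeezed down to rank 0.

```python
import tensorflow as tf

def truthy_sketch(x):
  """Hypothetical stand-in for `truthy`: coerce `x` to a scalar boolean.

  Assumption: an `if` condition needs a scalar, so a rank-1 tensor
  holding one element is squeezed down to rank 0 before casting.
  """
  x = tf.convert_to_tensor(x)
  if x.shape.rank == 1:
    x = tf.squeeze(x, axis=0)  # drop the length-1 batch-like dimension
  return tf.cast(x, tf.bool)

# Usage: both calls yield a scalar boolean suitable for an `if` condition.
print(truthy_sketch(tf.constant(True)))
print(truthy_sketch(tf.constant([True])))
```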
Other Members
TF_BACKEND: Instance of tfp.experimental.auto_batching.TensorFlowBackend.