Reduction using NCCL all-reduce.
tf.distribute.NcclAllReduce(
num_packs=1
)
Args
num_packs: Values will be packed in this many splits. num_packs should be greater than 0.

Raises
ValueError: If num_packs is zero or negative.
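As a sketch of typical usage (not shown on this page), the op can be passed as the cross-device communication of a tf.distribute.MirroredStrategy; this assumes a machine with NCCL-capable GPUs:

import tensorflow as tf

# Pack all-reduce values into a single split; requires NCCL-capable GPUs.
nccl_ops = tf.distribute.NcclAllReduce(num_packs=1)

# Hand the op to a mirrored strategy, which routes its reductions through it.
strategy = tf.distribute.MirroredStrategy(cross_device_ops=nccl_ops)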
Methods
batch_reduce
batch_reduce(
reduce_op, value_destination_pairs
)
Reduce PerReplica objects in a batch.
Each first element in value_destination_pairs is reduced to the corresponding second element, which indicates the destinations.
Args
reduce_op: Indicates how per_replica_value will be reduced. Accepted values are tf.distribute.ReduceOp.SUM and tf.distribute.ReduceOp.MEAN.
value_destination_pairs: A list or a tuple of tuples of PerReplica objects (or tensors with a device set if there is only one device) and destinations.

Returns
A list of Mirrored objects.

Raises
ValueError: If value_destination_pairs is not a list or a tuple of tuples of PerReplica objects and destinations.
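A minimal sketch of calling batch_reduce directly; the two-GPU device list, the value_fn helper, and the choice to reuse the per-replica values as destinations are illustrative assumptions, not part of this page:

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy(["GPU:0", "GPU:1"])

def value_fn(ctx):
  # One tensor per replica, e.g. a local partial result.
  return tf.constant(ctx.replica_id_in_sync_group, dtype=tf.float32)

per_replica = strategy.experimental_distribute_values_from_function(value_fn)

nccl_ops = tf.distribute.NcclAllReduce(num_packs=1)
# Sum each per-replica value and mirror the result back onto the same devices.
reduced = nccl_ops.batch_reduce(
    tf.distribute.ReduceOp.SUM,
    [(per_replica, per_replica)])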
broadcast
broadcast(
tensor, destinations
)
Broadcast the tensor to destinations.
Args
tensor: The tensor to broadcast.
destinations: The broadcast destinations.

Returns
A Mirrored object.
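A sketch of broadcasting a constant to the devices of an existing distributed value; the strategy setup and the dist_value placeholder are illustrative assumptions:

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy(["GPU:0", "GPU:1"])
# Any distributed value can serve as a destination; it only supplies device placement.
dist_value = strategy.experimental_distribute_values_from_function(
    lambda ctx: tf.constant(0.0))

nccl_ops = tf.distribute.NcclAllReduce()
# Copy the tensor onto every device that dist_value lives on.
mirrored = nccl_ops.broadcast(tf.constant([1.0, 2.0]), destinations=dist_value)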
reduce
reduce(
reduce_op, per_replica_value, destinations
)
Reduce per_replica_value to destinations.
It runs the reduction operation defined by reduce_op and puts the result on destinations.
Returns
A Mirrored object.

Raises
ValueError: If per_replica_value can't be converted to a PerReplica object.
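A sketch of a single reduce call, mirroring the batch_reduce example above; the device list and value_fn are assumptions, and a machine with two NCCL-capable GPUs is required:

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy(["GPU:0", "GPU:1"])

def value_fn(ctx):
  # e.g. one local gradient per replica.
  return tf.constant(ctx.replica_id_in_sync_group, dtype=tf.float32)

per_replica = strategy.experimental_distribute_values_from_function(value_fn)

nccl_ops = tf.distribute.NcclAllReduce(num_packs=1)
# Average across replicas and place the result on the same devices.
mean_value = nccl_ops.reduce(
    tf.distribute.ReduceOp.MEAN, per_replica, destinations=per_replica)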