Looks up embeddings for the given ids and weights from a list of tensors.
tf.nn.embedding_lookup_sparse(
    params,
    sp_ids,
    sp_weights,
    combiner=None,
    max_norm=None,
    name=None,
    allow_fast_lookup=False
)
This op assumes that there is at least one id for each row in the dense tensor represented by sp_ids (i.e. there are no rows with empty features), and that all the indices of sp_ids are in canonical row-major order.
sp_ids and sp_weights (if not None) are SparseTensors or RaggedTensors with rank of 2. For SparseTensors with left-aligned non-zero entries, which can equivalently be described as RaggedTensors, using RaggedTensors can yield higher performance (see the sketch below).
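As a minimal sketch of both input forms (the parameter sizes, ids, and weights below are illustrative, not taken from the original docs), the same left-aligned lookup can be expressed with either a SparseTensor or a RaggedTensor:

```python
import tensorflow as tf

params = tf.random.normal([10, 4])  # 10 embeddings of dimension 4

# Left-aligned ids/weights as rank-2 SparseTensors...
sp_ids = tf.sparse.SparseTensor(
    indices=[[0, 0], [0, 1], [1, 0]],
    values=tf.constant([1, 3, 2], dtype=tf.int64),
    dense_shape=[2, 2])
sp_weights = tf.sparse.SparseTensor(
    indices=[[0, 0], [0, 1], [1, 0]],
    values=tf.constant([2.0, 0.5, 1.0]),
    dense_shape=[2, 2])

# ...or, equivalently, as rank-2 RaggedTensors (often faster, per the note above).
ragged_ids = tf.ragged.constant([[1, 3], [2]], dtype=tf.int64)
ragged_weights = tf.ragged.constant([[2.0, 0.5], [1.0]], dtype=tf.float32)

out_sparse = tf.nn.embedding_lookup_sparse(params, sp_ids, sp_weights, combiner="sum")
out_ragged = tf.nn.embedding_lookup_sparse(params, ragged_ids, ragged_weights, combiner="sum")
print(tf.reduce_max(tf.abs(out_sparse - out_ragged)).numpy())  # expect ~0.0
```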
It also assumes that all id values lie in the range [0, p0), where p0 is the sum of the size of params along dimension 0.
If len(params) > 1, each element of sp_ids is partitioned between the elements of params according to the "div" partition strategy, which means we assign ids to partitions in a contiguous manner. For instance, 13 ids are split across 5 partitions as: [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10], [11, 12]].

If the id space does not evenly divide the number of partitions, each of the first (max_id + 1) % len(params) partitions will be assigned one more id.
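To make the contiguous assignment concrete, here is a small plain-Python sketch (an illustration only, not a public TF API; div_partition is a hypothetical helper name):

```python
def div_partition(num_ids, num_partitions):
    """Assign ids 0..num_ids-1 to partitions contiguously (the "div" strategy)."""
    # The first (num_ids % num_partitions) partitions receive one extra id.
    base, extra = divmod(num_ids, num_partitions)
    partitions, start = [], 0
    for p in range(num_partitions):
        size = base + (1 if p < extra else 0)
        partitions.append(list(range(start, start + size)))
        start += size
    return partitions

print(div_partition(13, 5))
# [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10], [11, 12]]
```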
| Returns | |
|---|---|
| A dense tensor representing the combined embeddings for the sparse ids. For each row in the dense tensor represented by sp_ids, the op looks up the embeddings for all ids in that row, multiplies them by the corresponding weight, and combines these embeddings as specified. |
In other words, if shape(combined params) = [p0, p1, ..., pm] and shape(sp_ids) = shape(sp_weights) = [d0, d1], then shape(output) = [d0, p1, ..., pm].
For instance, if params is a 10x20 matrix and sp_ids / sp_weights are

    [0, 0]: id 1, weight 2.0
    [0, 1]: id 3, weight 0.5
    [1, 0]: id 0, weight 1.0
    [2, 3]: id 1, weight 3.0

with combiner = "mean", then the output will be a 3x20 matrix where

    output[0, :] = (params[1, :] * 2.0 + params[3, :] * 0.5) / (2.0 + 0.5)
    output[1, :] = (params[0, :] * 1.0) / 1.0
    output[2, :] = (params[1, :] * 3.0) / 3.0
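This example can be reproduced directly. A minimal runnable sketch (the embedding values are random, but the combiner math matches the formula above):

```python
import tensorflow as tf

params = tf.random.normal([10, 20])  # the 10x20 matrix from the example

sp_ids = tf.sparse.SparseTensor(
    indices=[[0, 0], [0, 1], [1, 0], [2, 3]],
    values=tf.constant([1, 3, 0, 1], dtype=tf.int64),
    dense_shape=[3, 4])
sp_weights = tf.sparse.SparseTensor(
    indices=[[0, 0], [0, 1], [1, 0], [2, 3]],
    values=tf.constant([2.0, 0.5, 1.0, 3.0]),
    dense_shape=[3, 4])

output = tf.nn.embedding_lookup_sparse(
    params, sp_ids, sp_weights, combiner="mean")
print(output.shape)  # (3, 20)

# Row 0 is the weighted mean of params[1, :] and params[3, :].
expected_row0 = (params[1, :] * 2.0 + params[3, :] * 0.5) / (2.0 + 0.5)
print(tf.reduce_max(tf.abs(output[0] - expected_row0)).numpy())  # expect ~0.0
```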
| Raises | |
|---|---|
| TypeError | If sp_ids is not a SparseTensor, or if sp_weights is neither None nor a SparseTensor. |
| ValueError | If combiner is not one of {"mean", "sqrtn", "sum"}. |