tensorflow::ops::SoftmaxCrossEntropyWithLogits

#include <nn_ops.h>
Computes softmax cross entropy cost and gradients to backpropagate.
Summary
Inputs are the logits, not probabilities.
Args:
- scope: A Scope object.
- features: batch_size x num_classes matrix.
- labels: batch_size x num_classes matrix. The caller must ensure that each batch of labels represents a valid probability distribution.
Returns:
- Output loss: Per-example loss (batch_size vector).
- Output backprop: Backpropagated gradients (batch_size x num_classes matrix).
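Putting the two outputs in formula form (this follows directly from the op's description above; the notation is mine, not from the original page):

```latex
\mathrm{loss}_i \;=\; -\sum_{j} \mathrm{labels}_{ij}\,\log\!\big(\mathrm{softmax}(\mathrm{features}_i)_j\big),
\qquad
\mathrm{backprop}_{ij} \;=\; \mathrm{softmax}(\mathrm{features}_i)_j - \mathrm{labels}_{ij},
```

where i indexes the batch and j the classes. The second expression is the gradient of the first with respect to the logits, which is why the op can return it without a separate backward pass.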
Constructors and Destructors

SoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels)
Public attributes

backprop
loss
operation
Public functions

SoftmaxCrossEntropyWithLogits

SoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels)