softmax_cross_entropy

brainpy.losses.softmax_cross_entropy(logits, labels)

Computes the softmax cross entropy between sets of logits and labels. Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.

References

[Goodfellow et al., 2016](http://www.deeplearningbook.org/contents/prob.html)

Parameters:
  • logits – unnormalized log probabilities.

  • labels – a valid probability distribution (non-negative, summing to 1), e.g. a one-hot encoding of the correct class for each input.

Returns:

the cross-entropy loss.
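
To make the computation concrete, here is a minimal reference sketch in JAX (which BrainPy builds on) of the quantity this function computes: the cross entropy between the label distribution and the softmax of the logits. The helper name `softmax_cross_entropy_ref` and the example arrays are illustrative only, not part of the BrainPy API.

```python
import jax.numpy as jnp
from jax.nn import log_softmax

def softmax_cross_entropy_ref(logits, labels):
    # Illustrative re-implementation (not the BrainPy source):
    # H(labels, softmax(logits)) = -sum_i labels[i] * log_softmax(logits)[i],
    # reduced over the last (class) axis. Using log_softmax is numerically
    # more stable than computing log(softmax(logits)) in two steps.
    return -jnp.sum(labels * log_softmax(logits, axis=-1), axis=-1)

# Example: three mutually exclusive classes, one-hot label on class 0.
logits = jnp.array([[2.0, 1.0, 0.1]])  # unnormalized log probabilities
labels = jnp.array([[1.0, 0.0, 0.0]])  # a valid probability distribution
print(softmax_cross_entropy_ref(logits, labels))  # ~0.417
```

With a one-hot label, the sum picks out a single term, so the loss reduces to the negative log-softmax probability of the correct class.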