brainpy.losses.softmax_cross_entropy(logits, labels)

Computes the softmax cross entropy between sets of logits and labels. Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.

.. rubric:: References

Goodfellow et al., 2016.

Parameters:

  • logits – unnormalized log probabilities.

  • labels – a valid probability distribution (non-negative, summing to 1), e.g. a one-hot encoding of the correct class for each input.


Returns:

  the cross entropy loss.
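The computation this function performs can be sketched in plain NumPy. This is an illustrative standalone version, not BrainPy's implementation (which operates on BrainPy/JAX arrays); it shows the numerically stable log-softmax followed by the label-weighted negative log likelihood.

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    # Numerically stable log-softmax: subtract the max logit before exponentiating.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Cross entropy: negative sum of label-weighted log probabilities per example.
    return -(labels * log_probs).sum(axis=-1)

# A batch of two 3-class examples with one-hot labels.
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
labels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
loss = softmax_cross_entropy(logits, labels)  # one loss value per example
```

Because the labels here are one-hot, each loss value is simply the negative log probability the softmax assigns to the correct class, so confident correct predictions yield values near zero.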