- brainpy.losses.sigmoid_binary_cross_entropy(logits, labels)
Computes sigmoid cross entropy given logits and multiple class labels. Measures the probability error in discrete classification tasks in which each class is an independent binary prediction and different classes are not mutually exclusive. This may be used for multilabel image classification: for instance, a model may predict that an image contains both a cat and a dog.
[Goodfellow et al, 2016](http://www.deeplearningbook.org/contents/prob.html)
Parameters:
- logits – unnormalized log probabilities.
- labels – the probability for that class.

Returns:
- a sigmoid cross entropy loss.
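As a minimal sketch of the computation this function performs (written here with plain NumPy rather than BrainPy's internals, which are an assumption on my part), the loss for each independent class is `-z * log(sigmoid(x)) - (1 - z) * log(1 - sigmoid(x))`, evaluated in the numerically stable form `max(x, 0) - x*z + log(1 + exp(-|x|))`:

```python
import numpy as np

def sigmoid_binary_cross_entropy(logits, labels):
    """Elementwise sigmoid cross entropy between logits and binary labels.

    Uses the numerically stable identity
        max(x, 0) - x * z + log(1 + exp(-|x|))
    which avoids overflow for large-magnitude logits.
    """
    logits = np.asarray(logits, dtype=float)
    labels = np.asarray(labels, dtype=float)
    return (np.maximum(logits, 0.0)
            - logits * labels
            + np.log1p(np.exp(-np.abs(logits))))

# Each output element is an independent binary loss, so a "cat and dog"
# image simply has a label of 1.0 in both positions.
loss = sigmoid_binary_cross_entropy(
    logits=[0.0, 5.0, -5.0],   # unnormalized log probabilities
    labels=[1.0, 1.0, 0.0],    # target probabilities per class
)
```

A logit of `0.0` with label `1.0` gives `log(2) ≈ 0.693` (the model is maximally uncertain), while a confident correct prediction (`5.0` with label `1.0`) gives a loss near zero.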