# brainpy.losses.cross_entropy_loss#

brainpy.losses.cross_entropy_loss(logits, targets, weight=None, reduction='mean')[source]#

This criterion combines LogSoftmax and NLLLoss in a single function.

It is useful when training a classification problem with C classes. If provided, the optional argument weight should be a 1D Tensor assigning weight to each of the classes. This is particularly useful when you have an unbalanced training set.

The input is expected to contain raw, unnormalized scores for each class. It has to be an array of size either $$(minibatch, C)$$ or $$(d_1, d_2, ..., d_K, minibatch, C)$$ with $$K \geq 1$$ for the K-dimensional case (described below).

This criterion expects a class index in the range $$[0, C-1]$$ as the target for each value of a 1D tensor of size minibatch.

The loss can be described as:

$\text{loss}(x, class) = -\log\left(\frac{\exp(x[class])}{\sum_j \exp(x[j])}\right) = -x[class] + \log\left(\sum_j \exp(x[j])\right)$

or in the case of the weight argument being specified:

$\text{loss}(x, class) = weight[class] \left(-x[class] + \log\left(\sum_j \exp(x[j])\right)\right)$

This criterion can also be used for higher-dimensional inputs, such as 2D images, by providing an input of size $$(d_1, d_2, ..., d_K, minibatch, C)$$ with $$K \geq 1$$, where $$K$$ is the number of extra dimensions, and a target of the corresponding shape.
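The loss formula above can be sketched in plain NumPy. This is a minimal, illustrative implementation of the per-sample (unreduced) loss with an optional class `weight`, not the library's actual code; the log-sum-exp is stabilized by subtracting the row maximum, which leaves the result unchanged.

```python
import numpy as np

def cross_entropy(logits, targets, weight=None):
    """Per-sample cross-entropy from raw logits and integer class targets."""
    # Numerically stable log-softmax: subtract the row maximum first.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # loss(x, class) = -x[class] + log(sum_j exp(x[j]))
    losses = -np.take_along_axis(log_probs, targets[:, None], axis=-1)[:, 0]
    if weight is not None:
        # Rescale each sample's loss by the weight of its target class.
        losses = weight[targets] * losses
    return losses

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.2, 0.3]])
targets = np.array([0, 2])
per_sample = cross_entropy(logits, targets)  # shape (2,), one loss per sample
```

With `weight=None` this matches the first displayed formula; passing a length-`C` weight array reproduces the second.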

Parameters
• logits (jmath.JaxArray) – $$(N, C)$$ where C = number of classes, or $$(d_1, d_2, ..., d_K, N, C)$$ with $$K \geq 1$$ in the case of K-dimensional loss.

• targets (jmath.JaxArray) – $$(N, C)$$ or $$(N)$$ where each value is $$0 \leq \text{targets}[i] \leq C-1$$, or $$(d_1, d_2, ..., d_K, N, C)$$ or $$(d_1, d_2, ..., d_K, N)$$ with $$K \geq 1$$ in the case of K-dimensional loss.

• weight (jmath.JaxArray, optional) – A manual rescaling weight given to each class. If given, it has to be an array of size C.

• reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'.
  - 'none': no reduction is applied.
  - 'mean': the weighted mean of the output is taken.
  - 'sum': the output is summed.

Returns

output – If reduction is 'none', the output has the same size as the target: $$(N)$$, or $$(d_1, d_2, ..., d_K, N)$$ with $$K \geq 1$$ in the case of K-dimensional loss. Otherwise, a scalar.

Return type

scalar or jmath.JaxArray
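The three reduction modes can be illustrated on a vector of per-sample losses. This sketch assumes 'mean' divides the weighted sum by the sum of the applied weights, following the PyTorch convention this docstring mirrors; the per-sample losses and weights below are made-up numbers for illustration.

```python
import numpy as np

losses = np.array([0.25, 1.10, 0.40])   # per-sample losses, i.e. the 'none' output
weights = np.array([1.0, 2.0, 1.0])     # weight[targets[i]] for each sample

none_out = losses                                     # 'none': unreduced vector
sum_out = (weights * losses).sum()                    # 'sum': weighted total
mean_out = (weights * losses).sum() / weights.sum()   # 'mean': weighted mean
```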