multi_margin_loss

brainpy.losses.multi_margin_loss(predicts, targets, margin=1.0, p=1, reduction='mean')

Computes multi-class margin loss, also called multi-class hinge loss.

This loss function is often used in multi-class classification problems. It is a type of hinge loss that tries to ensure the correct class score is greater than the scores of other classes by a margin.

The loss for a single sample with score vector \(x\) (one score per class) and target class index \(y\) is:

\[\ell(x, y) = \sum_{i \neq y} \max(0, x_i - x_y + \text{margin})^p\]

where \(i \in \left\{0, \; \cdots , \; C - 1\right\}\) and \(i \neq y\), with \(C\) the number of classes.

Parameters:
  • predicts – \((N, C)\), where \(C\) is the number of classes.

  • targets – \((N)\), where each value satisfies \(0 \leq \text{targets}[i] \leq C-1\).

  • margin (float, optional) – The margin in the loss formula. Default: \(1.0\).

  • p (int, optional) – The exponent applied to each margin term. Default: \(1\).

  • reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction is applied; 'mean': the sum of the output is divided by the number of elements in the output; 'sum': the output is summed. Default: 'mean'.

Returns:

A scalar representing the multi-class margin loss. If reduction is 'none', an array of shape \((N)\).
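The formula above can be sketched as a small NumPy reference implementation. This is an illustrative re-implementation of the documented loss, not BrainPy's actual code; it assumes the per-sample loss is the plain sum over the non-target classes, with no extra normalization by the number of classes.

```python
import numpy as np

def multi_margin_loss_ref(predicts, targets, margin=1.0, p=1, reduction='mean'):
    """Reference sketch of the multi-class margin (hinge) loss.

    predicts: (N, C) class scores; targets: (N,) integer class labels.
    """
    n = predicts.shape[0]
    # Score of the true class for each sample, shaped (N, 1) for broadcasting.
    correct = predicts[np.arange(n), targets][:, None]
    # Hinge term for every class: max(0, x_i - x_y + margin)^p, shape (N, C).
    losses = np.maximum(0.0, predicts - correct + margin) ** p
    # The sum runs over i != y, so zero out the true-class entries.
    losses[np.arange(n), targets] = 0.0
    per_sample = losses.sum(axis=1)  # shape (N,)
    if reduction == 'none':
        return per_sample
    if reduction == 'sum':
        return per_sample.sum()
    return per_sample.mean()

scores = np.array([[0.2, 0.7, 0.1],
                   [0.9, 0.05, 0.05]])
labels = np.array([1, 0])
# Per-sample losses: sample 0 -> 0.5 + 0.4 = 0.9, sample 1 -> 0.15 + 0.15 = 0.3
print(multi_margin_loss_ref(scores, labels, reduction='none'))  # [0.9 0.3]
print(multi_margin_loss_ref(scores, labels))                    # 0.6 (mean)
```

Note how a well-separated correct class (sample 1) still incurs a small loss whenever any other score comes within `margin` of the true-class score; the loss reaches zero only once every other class is beaten by at least the margin.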