brainpy.losses module
Comparison
- This criterion combines log-softmax and the negative log-likelihood loss in a single criterion.
- Computes the softmax cross-entropy loss.
- Computes the sigmoid cross-entropy loss.
- The negative log likelihood loss.
- Creates a criterion that measures the mean absolute error (MAE) between each element in the logits \(x\) and targets \(y\).
- Computes the L2 loss.
- Huber loss.
- Computes the mean absolute error between x and y.
- Computes the mean squared error between x and y.
- Computes the mean squared logarithmic error between y_true and y_pred.
- Binary logistic loss.
- Multiclass logistic loss.
- Computes sigmoid cross-entropy given logits and multiple class labels.
- Computes the softmax cross-entropy between sets of logits and labels.
- Calculates the log-cosh loss for a set of predictions.
- Computes the CTC loss and CTC forward probabilities.
- Computes the CTC loss.
- Computes the multi-class margin loss, also called the multi-class hinge loss.
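To make the softmax cross-entropy entries above concrete, here is a minimal NumPy sketch of the standard definition. The function name and the max-subtraction trick for a numerically stable log-sum-exp are illustrative, not BrainPy's actual implementation:

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    """Cross-entropy between softmax(logits) and one-hot labels.

    Subtracting the per-row max logit keeps exp() from overflowing
    without changing the result (illustrative reference code).
    """
    logits = logits - logits.max(axis=-1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
    return -(labels * log_probs).sum(axis=-1)

# A confident, correct prediction yields a small loss; uniform
# logits over C classes yield exactly log(C).
logits = np.array([[4.0, 0.0, 0.0]])
labels = np.array([[1.0, 0.0, 0.0]])
loss = softmax_cross_entropy(logits, labels)
```

The sigmoid variants listed above follow the same pattern, but apply an element-wise sigmoid instead of a softmax so that classes are scored independently.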
- This criterion computes the cross-entropy loss between input logits and target.
- The negative log likelihood loss.
- Creates a criterion that measures the mean absolute error (MAE) between each element in the input \(x\) and target \(y\).
- Creates a criterion that measures the mean squared error (squared L2 norm) between each element in the input \(x\) and target \(y\).
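The Huber loss listed above sits between the MAE and MSE criteria: quadratic for small residuals, linear for large ones. A minimal sketch of the standard definition (parameter names are illustrative, not BrainPy's signature):

```python
import numpy as np

def huber_loss(predicts, targets, delta=1.0):
    """Quadratic within |residual| <= delta, linear beyond it."""
    diff = np.abs(predicts - targets)
    return np.where(diff <= delta,
                    0.5 * diff ** 2,          # MSE-like region
                    delta * (diff - 0.5 * delta))  # MAE-like region

# Residual 0.5 falls in the quadratic region; residual 3.0 in the linear one.
losses = huber_loss(np.array([0.5, 3.0]), np.zeros(2))
```

Because the linear branch grows more slowly than the quadratic one, outliers influence the fit less than under plain mean squared error.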
Regularization
- Computes the L2 loss.
- Computes the mean absolute error between x and y.
- Calculates the log-cosh loss for a set of predictions.
- Apply label smoothing.
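The label-smoothing entry above refers to the common regularization trick of blending hard one-hot targets toward a uniform distribution. A minimal NumPy sketch, assuming the usual formulation (the function name and `alpha` parameter are illustrative):

```python
import numpy as np

def smooth_labels(labels, alpha):
    """Blend one-hot labels toward uniform: (1 - alpha) * y + alpha / C."""
    num_classes = labels.shape[-1]
    return (1.0 - alpha) * labels + alpha / num_classes

# With alpha = 0.3 and 3 classes, a one-hot row [1, 0, 0]
# becomes [0.8, 0.1, 0.1]; each row still sums to 1.
smoothed = smooth_labels(np.array([[1.0, 0.0, 0.0]]), alpha=0.3)
```

Smoothed targets keep the cross-entropy loss from pushing logits toward infinity, which tends to improve calibration on held-out data.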