brainpy.losses module

Comparison

cross_entropy_loss

This criterion combines LogSoftmax and NLLLoss in a single class.
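The exact BrainPy signature is not shown here, but the combination of LogSoftmax and NLLLoss can be sketched in plain NumPy (the helper name and the mean reduction are assumptions, not the library API):

```python
import numpy as np

def cross_entropy_sketch(logits, targets):
    # Log-softmax: subtract the row max for numerical stability.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # NLL: negative log-probability of each target class, averaged.
    nll = -log_probs[np.arange(len(targets)), targets]
    return nll.mean()
```

With uniform logits over four classes the loss is log(4), regardless of the target labels.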

cross_entropy_sparse

Computes the softmax cross-entropy loss.

cross_entropy_sigmoid

Computes the sigmoid cross-entropy loss.

nll_loss

The negative log likelihood loss.

l1_loss

Creates a criterion that measures the mean absolute error (MAE) between each element in the logits \(x\) and targets \(y\).
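As a sketch of what an MAE criterion computes (the helper name and `reduction` parameter are illustrative assumptions):

```python
import numpy as np

def l1_sketch(predicts, targets, reduction="mean"):
    # Elementwise absolute error, then mean or sum reduction.
    err = np.abs(predicts - targets)
    return err.mean() if reduction == "mean" else err.sum()
```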

l2_loss

Computes the L2 loss.

huber_loss

Huber loss.
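The Huber loss is quadratic for small residuals and linear for large ones, which makes it less sensitive to outliers than squared error. A minimal sketch, assuming a `delta` threshold and mean reduction (both assumptions, not the documented signature):

```python
import numpy as np

def huber_sketch(predicts, targets, delta=1.0):
    err = np.abs(predicts - targets)
    quadratic = 0.5 * err ** 2                 # used where |err| <= delta
    linear = delta * (err - 0.5 * delta)       # used where |err| >  delta
    return np.where(err <= delta, quadratic, linear).mean()
```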

mean_absolute_error

Computes the mean absolute error between x and y.

mean_squared_error

Computes the mean squared error between x and y.

mean_squared_log_error

Computes the mean squared logarithmic error between y_true and y_pred.
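The squared-log error penalizes relative rather than absolute differences. A one-line sketch of the usual definition (helper name assumed; inputs must be non-negative because of the logarithm):

```python
import numpy as np

def msle_sketch(y_true, y_pred):
    # Squared difference of log(1 + x), averaged over elements.
    return np.mean((np.log1p(y_true) - np.log1p(y_pred)) ** 2)
```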

binary_logistic_loss

Binary logistic loss.

multiclass_logistic_loss

Multiclass logistic loss.

sigmoid_binary_cross_entropy

Computes sigmoid cross entropy given logits and multiple class labels.
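Taking the sigmoid cross entropy directly on logits allows a numerically stable formulation that never exponentiates a large positive number. A sketch of that standard identity (helper name and mean reduction are assumptions):

```python
import numpy as np

def sigmoid_bce_sketch(logits, labels):
    # Stable form of -z*log(sigmoid(x)) - (1-z)*log(1-sigmoid(x)):
    #   max(x, 0) - x*z + log(1 + exp(-|x|))
    x, z = logits, labels
    return (np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))).mean()
```

At zero logits the loss is log(2) for any labels, i.e. the entropy of a fair coin.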

softmax_cross_entropy

Computes the softmax cross entropy between sets of logits and labels.

log_cosh_loss

Calculates the log-cosh loss for a set of predictions.
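Log-cosh behaves like squared error near zero and like absolute error for large residuals. A sketch using the overflow-safe identity log(cosh(e)) = |e| + log1p(exp(-2|e|)) - log(2) (helper name and mean reduction assumed):

```python
import numpy as np

def log_cosh_sketch(predicts, targets):
    err = np.abs(predicts - targets)
    # Stable evaluation of log(cosh(err)), averaged over elements.
    return (err + np.log1p(np.exp(-2.0 * err)) - np.log(2.0)).mean()
```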

ctc_loss_with_forward_probs

Computes CTC loss and CTC forward-probabilities.

ctc_loss

Computes CTC loss.

multi_margin_loss

Computes multi-class margin loss, also called multi-class hinge loss.
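One common formulation of the multi-class hinge loss sums, over all wrong classes j, the amount by which the score s[j] plus a margin exceeds the correct-class score. A single-sample sketch (helper name, `margin` default, and normalization by the class count are assumptions):

```python
import numpy as np

def multi_margin_sketch(scores, target, margin=1.0):
    # max(0, margin - s[target] + s[j]) for every class j != target.
    losses = np.maximum(0.0, margin - scores[target] + scores)
    losses[target] = 0.0  # the correct class contributes nothing
    return losses.sum() / len(scores)
```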

CrossEntropyLoss

This criterion computes the cross entropy loss between input logits and target.

NLLLoss

The negative log likelihood loss.

L1Loss

Creates a criterion that measures the mean absolute error (MAE) between each element in the input \(x\) and target \(y\).

MAELoss

Creates a criterion that measures the mean absolute error (MAE) between each element in the input and target.

MSELoss

Creates a criterion that measures the mean squared error (squared L2 norm) between each element in the input \(x\) and target \(y\).

Regularization

l2_norm

Computes the L2 loss.

mean_absolute

Computes the mean absolute error between x and y.

mean_square

Computes the mean squared error between x and y.

log_cosh

Calculates the log-cosh loss for a set of predictions.

smooth_labels

Apply label smoothing.
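Label smoothing blends one-hot targets toward the uniform distribution, which discourages overconfident predictions. A sketch of the usual rule (helper name and `alpha` parameter are assumptions):

```python
import numpy as np

def smooth_labels_sketch(one_hot, alpha=0.1):
    # (1 - alpha) on the true class plus alpha spread uniformly.
    n_classes = one_hot.shape[-1]
    return one_hot * (1.0 - alpha) + alpha / n_classes
```

The smoothed targets still sum to 1 along the class axis.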