brainpy.losses module#

Comparison#

cross_entropy_loss(predicts, targets[, ...])

This criterion combines LogSoftmax and NLLLoss in a single function.
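To make the "LogSoftmax + NLLLoss" composition concrete, here is a minimal numpy sketch of that combination for a single example (an illustration, not brainpy's actual implementation):

```python
import numpy as np

def cross_entropy(logits, target):
    # Log-softmax, numerically stabilized by shifting the logits,
    # followed by negative log-likelihood of the target class.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[target]

# A uniform 3-class prediction gives loss log(3) ~ 1.0986.
loss = cross_entropy(np.array([0.0, 0.0, 0.0]), target=1)
```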

cross_entropy_sparse(predicts, targets)

Computes the softmax cross-entropy loss.

cross_entropy_sigmoid(predicts, targets)

Computes the sigmoid cross-entropy loss.

l1_loos(logits, targets[, reduction])

Creates a criterion that measures the mean absolute error (MAE) between each element in the logits \(x\) and targets \(y\).

l2_loss(predicts, targets)

Computes the L2 loss.
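One common convention for an L2 loss (assumed here; check the brainpy docs for the exact scaling) is half the sum of squared errors:

```python
import numpy as np

def l2_loss(predicts, targets):
    # Assumed convention: 0.5 * sum of squared differences.
    return 0.5 * np.sum((predicts - targets) ** 2)
```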

huber_loss(predicts, targets[, delta])

Huber loss.
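The Huber loss is quadratic for residuals smaller than delta and linear beyond it, which makes it less sensitive to outliers than squared error. A hypothetical numpy sketch:

```python
import numpy as np

def huber(predicts, targets, delta=1.0):
    # Quadratic for |r| <= delta, linear (with matched value and slope)
    # for |r| > delta.
    r = np.abs(predicts - targets)
    quad = 0.5 * r ** 2
    lin = delta * r - 0.5 * delta ** 2
    return np.where(r <= delta, quad, lin).mean()
```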

mean_absolute_error(x, y[, axis, reduction])

Computes the mean absolute error between x and y.

mean_squared_error(predicts, targets[, ...])

Computes the mean squared error between predicts and targets.

mean_squared_log_error(predicts, targets[, ...])

Computes the mean squared logarithmic error between predicts and targets.

binary_logistic_loss(predicts, targets)

Binary logistic loss.

multiclass_logistic_loss(label, logits)

Multiclass logistic loss.

sigmoid_binary_cross_entropy(logits, labels)

Computes sigmoid cross entropy given logits and multiple class labels.
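Computing sigmoid cross-entropy directly from logits avoids overflow in the exponential. A standard numerically stable form (an illustrative sketch, not necessarily brainpy's implementation) is max(x, 0) - x*z + log(1 + exp(-|x|)):

```python
import numpy as np

def sigmoid_bce(logits, labels):
    # Element-wise stable binary cross-entropy from logits.
    x, z = logits, labels
    return np.maximum(x, 0.0) - x * z + np.log1p(np.exp(-np.abs(x)))
```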

softmax_cross_entropy(logits, labels)

Computes the softmax cross entropy between sets of logits and labels.

log_cosh_loss(predicts, targets)

Calculates the log-cosh loss for a set of predictions.
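log(cosh(e)) behaves like e²/2 for small errors and like |e| - log 2 for large ones, combining the smoothness of squared error with the robustness of absolute error. A stable sketch (assumed form, using softplus to avoid overflow in cosh):

```python
import numpy as np

def log_cosh(predicts, targets):
    # log(cosh(e)) = |e| + log(1 + exp(-2|e|)) - log(2), computed
    # without ever evaluating cosh directly.
    e = np.abs(predicts - targets)
    return np.mean(e + np.log1p(np.exp(-2.0 * e)) - np.log(2.0))
```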

ctc_loss_with_forward_probs(logits, ...[, ...])

Computes CTC loss and CTC forward-probabilities.

ctc_loss(logits, logit_paddings, labels, ...)

Computes CTC loss.

Regularization#

l2_norm(x[, axis])

Computes the L2 norm of x.

mean_absolute(outputs[, axis])

Computes the mean absolute value of outputs.

mean_square(predicts[, axis])

Computes the mean squared value of predicts.

log_cosh(errors)

Calculates the log-cosh loss for a set of predictions.

smooth_labels(labels, alpha)

Apply label smoothing.
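Label smoothing mixes one-hot targets with the uniform distribution, which discourages over-confident predictions. A minimal sketch of the usual formula, (1 - alpha) * labels + alpha / num_classes (assumed here; check the signature for brainpy's exact convention):

```python
import numpy as np

def smooth_labels(labels, alpha):
    # Interpolate between the one-hot labels and a uniform
    # distribution over the last (class) axis.
    num_classes = labels.shape[-1]
    return (1.0 - alpha) * labels + alpha / num_classes

# With alpha=0.1 and 4 classes, the hot entry becomes 0.925
# and the rest 0.025; the row still sums to 1.
smoothed = smooth_labels(np.eye(4)[0], 0.1)
```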