- brainpy.losses.smooth_labels(labels, alpha)[source]#
Apply label smoothing. Label smoothing is often used in combination with a cross-entropy loss. Smoothed labels favour small logit gaps, and it has been shown that this can provide better model calibration by preventing overconfident predictions.
.. rubric:: References
[Müller et al, 2019](https://arxiv.org/pdf/1906.02629.pdf)
- Parameters:
  - labels – one-hot labels to be smoothed.
  - alpha (float) – the smoothing factor; the greedy category will be assigned probability (1 - alpha) + alpha / num_categories.
- Returns:
a smoothed version of the one hot input labels.
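The smoothing rule described above can be sketched in a few lines of NumPy. This is a hypothetical reimplementation of the documented formula, not the brainpy source: the helper name `smooth_labels_sketch` is an assumption, and the real `brainpy.losses.smooth_labels` may differ in dtype handling and input validation.

```python
import numpy as np

def smooth_labels_sketch(labels, alpha):
    """Minimal sketch of label smoothing (assumed formula from the docstring).

    Each one-hot entry of 1 becomes (1 - alpha) + alpha / K and each 0
    becomes alpha / K, where K is the number of categories (last axis).
    """
    num_categories = labels.shape[-1]
    # (1 - alpha) * labels keeps most mass on the greedy category;
    # alpha / K spreads the remaining mass uniformly over all categories.
    return (1.0 - alpha) * labels + alpha / num_categories

one_hot = np.array([[0.0, 1.0, 0.0, 0.0]])
smoothed = smooth_labels_sketch(one_hot, alpha=0.1)
# greedy category: (1 - 0.1) + 0.1 / 4 = 0.925; all others: 0.1 / 4 = 0.025
```

Note that each row still sums to 1, so the smoothed labels remain a valid probability distribution that can be fed directly to a cross-entropy loss.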