Softmin

class brainpy.dnn.Softmin(dim=None)

Applies the Softmin function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.

Softmin is defined as:

\[\text{Softmin}(x_{i}) = \frac{\exp(-x_i)}{\sum_j \exp(-x_j)}\]
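As a sanity check, the formula above can be sketched in plain NumPy (this `softmin` is a hypothetical stand-in, not BrainPy's implementation). Subtracting the per-slice maximum of \(-x\) before exponentiating leaves the result unchanged but keeps the computation numerically stable:

```python
import numpy as np

def softmin(x, axis=-1):
    # Hypothetical NumPy sketch of Softmin(x_i) = exp(-x_i) / sum_j exp(-x_j).
    z = -x
    # Shift by the per-slice maximum of -x; exp(z - max) avoids overflow
    # and cancels out of the ratio, so the result is identical.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

x = np.array([[1.0, 2.0, 3.0]])
out = softmin(x)
print(out)            # smaller inputs receive larger weights
print(out.sum(-1))    # each slice sums to 1
```

Note that Softmin(x) is simply Softmax(-x): smaller inputs get larger probabilities.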
Shape:
  • Input: \((*)\), where * means any number of additional dimensions

  • Output: \((*)\), same shape as the input

Parameters:

dim (int) – A dimension along which Softmin will be computed (so every slice along dim will sum to 1).

Returns:

a Tensor of the same dimension and shape as the input, with values in the range [0, 1]
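The role of the dim argument can be illustrated with a plain-NumPy sketch (this `softmin` is a hypothetical helper, not BrainPy's implementation): whichever axis is chosen, every slice along that axis sums to 1.

```python
import numpy as np

def softmin(x, dim):
    # Hypothetical sketch: shift so the exponent is always <= 0 for stability.
    e = np.exp(-(x - x.min(axis=dim, keepdims=True)))
    return e / e.sum(axis=dim, keepdims=True)

x = np.arange(6, dtype=float).reshape(2, 3)
rows = softmin(x, dim=1)   # each row (slice along dim=1) sums to 1
cols = softmin(x, dim=0)   # each column (slice along dim=0) sums to 1
print(rows.sum(axis=1))
print(cols.sum(axis=0))
```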

Examples:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.Softmin(dim=1)
>>> input = bm.random.randn(2, 3)
>>> output = m(input)
update(input)

The function specifying the updating rule: applies Softmin to input and returns the result.

Return type:

TypeVar(ArrayType, Array, Variable, TrainVar, ndarray)