LogSoftmax

class brainpy.dnn.LogSoftmax(dim=None)

Applies the \(\log(\text{Softmax}(x))\) function to an n-dimensional input Tensor. The LogSoftmax formulation can be simplified as:

\[\text{LogSoftmax}(x_{i}) = \log\left(\frac{\exp(x_i) }{ \sum_j \exp(x_j)} \right)\]
Shape:
  • Input: \((*)\), where \(*\) means any number of additional dimensions

  • Output: \((*)\), same shape as the input

Parameters:

dim (int) – A dimension along which LogSoftmax will be computed.

Returns:

a Tensor of the same dimension and shape as the input with values in the range [-inf, 0)

Examples:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.LogSoftmax(dim=1)
>>> input = bm.random.randn(2, 3)
>>> output = m(input)
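
A quick sanity check, sketched here as an illustration rather than taken from the BrainPy docs: the layer's output should match the formula above evaluated directly with brainpy.math's NumPy-style ops (bm.exp, bm.sum, bm.log, and bm.allclose are assumed to follow their NumPy counterparts).

>>> # compute log-softmax by hand along dim=1: x - log(sum_j exp(x_j))
>>> expected = input - bm.log(bm.sum(bm.exp(input), axis=1, keepdims=True))
>>> bool(bm.allclose(output, expected))
True
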
update(input)

The update rule of the layer: applies the LogSoftmax transformation to input along dim and returns the result.

Return type:

ArrayType (a TypeVar over Array, Variable, TrainVar, jax.Array, and ndarray)
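
A minimal usage sketch (an illustration added here, assuming the standard BrainPy convention that calling the layer, m(x), dispatches to update(x)):

>>> m = bp.dnn.LogSoftmax(dim=1)
>>> x = bm.random.randn(4, 5)
>>> y = m.update(x)  # equivalent to m(x)
>>> y.shape
(4, 5)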