Softplus

class brainpy.dnn.Softplus(beta=1, threshold=20.0)

Applies the Softplus function \(\text{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))\) element-wise.

Softplus is a smooth approximation to the ReLU function and can be used to constrain the output of a model to always be positive.

For numerical stability the implementation reverts to the linear function when \(\text{input} \times \beta > \text{threshold}\).
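For illustration only, a minimal sketch of this thresholded formulation written with brainpy.math's NumPy-style operations (assuming bm.where, bm.exp and bm.log1p; this is not the library's internal implementation):

>>> import brainpy.math as bm
>>> def softplus_sketch(x, beta=1.0, threshold=20.0):
...     scaled = beta * x
...     # clamp the argument of exp so the unselected branch cannot overflow
...     safe = bm.where(scaled > threshold, 0.0, scaled)
...     # revert to the linear function x wherever beta * x exceeds the threshold
...     return bm.where(scaled > threshold, x, bm.log1p(bm.exp(safe)) / beta)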

Parameters:
  • beta (float) – the \(\beta\) value for the Softplus formulation. Default: 1

  • threshold (float) – values above this revert to a linear function. Default: 20

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions.

  • Output: \((*)\), same shape as the input.

Examples:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.Softplus()
>>> input = bm.random.randn(2)
>>> output = m(input)
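The same module with non-default parameters (values chosen here only for illustration); the output keeps the input's shape:

>>> m = bp.dnn.Softplus(beta=2.0, threshold=10.0)
>>> x = bm.random.randn(3, 4)
>>> y = m(x)
>>> y.shape
(3, 4)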
update(x)

The update function of the module, i.e. it applies the Softplus transformation to the input x.

Return type:

ArrayType (Array, Variable, TrainVar, or ndarray)
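A minimal usage sketch of the documented update method (assuming, as is common for BrainPy layers, that calling the module directly dispatches to update):

>>> m = bp.dnn.Softplus()
>>> x = bm.random.randn(2)
>>> out = m.update(x)  # same result as m(x)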