softplus


brainpy.math.softplus(x, beta=1.0, threshold=20.0)

Softplus activation function.

Computes the element-wise function

\[\text{Softplus}(x) = \frac{1}{\beta} \log\bigl(1 + \exp(\beta x)\bigr)\]

Softplus is a smooth approximation to the ReLU function and can be used to constrain the output of a model to always be positive.

For numerical stability the implementation reverts to the linear function when \(\beta x > \text{threshold}\).
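
To illustrate this behavior, the sketch below reimplements the documented formula and cutoff in plain NumPy. It is an assumption-labeled illustration, not BrainPy's actual source; the function name `softplus_sketch` is hypothetical.

```python
import numpy as np

def softplus_sketch(x, beta=1.0, threshold=20.0):
    """Minimal sketch of softplus with the linear cutoff (not BrainPy's code)."""
    x = np.asarray(x, dtype=float)
    scaled = beta * x
    # For large beta * x, log(1 + exp(beta * x)) / beta ≈ x, so the stable
    # branch returns x directly; clamping the argument of exp avoids overflow.
    return np.where(
        scaled > threshold,
        x,
        np.log1p(np.exp(np.minimum(scaled, threshold))) / beta,
    )
```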

Parameters:
  • x – The input array.

  • beta – The \(\beta\) value for the Softplus formulation. Default: 1.0.

  • threshold – Values above this revert to a linear function. Default: 20.0.
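
A minimal usage sketch, assuming a standard BrainPy install where `brainpy.math` is imported as `bm`:

```python
import brainpy.math as bm

x = bm.linspace(-5.0, 5.0, 11)
y = bm.softplus(x)                   # default beta=1.0, threshold=20.0
y_sharp = bm.softplus(x, beta=4.0)   # larger beta gives a closer approximation to ReLU
print(y)
```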