LeakyReLU

class brainpy.dnn.LeakyReLU(negative_slope=0.01, inplace=False)

Applies the element-wise function:

\[\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} * \min(0, x)\]

or

\[\text{LeakyReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{otherwise} \end{cases}\]
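
For example, with \(\text{negative\_slope} = 0.1\), \(\text{LeakyReLU}(2.0) = 2.0\) while \(\text{LeakyReLU}(-3.0) = -0.3\).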
Parameters:
  • negative_slope (float) – Slope of the function for negative input values. Default: 1e-2

  • inplace (bool) – If True, performs the operation in-place. Default: False

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions

  • Output: \((*)\), same shape as the input

Examples:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.LeakyReLU(0.1)
>>> input = bm.random.randn(2)
>>> output = m(input)
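
The module applies the formula above element-wise. As a minimal cross-check (a sketch assuming brainpy.math exposes NumPy-style where and allclose, which this page does not document), the output can be reproduced directly from the definition:

>>> # recompute LeakyReLU with slope 0.1 element-wise and compare
>>> manual = bm.where(input >= 0, input, 0.1 * input)
>>> assert bm.allclose(output, manual)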

update(input)

The function that specifies the updating rule.

Return type:

TypeVar(ArrayType, Array, Variable, TrainVar, Array, ndarray)
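
Calling the module, as in the example above, dispatches to update, so invoking update directly should give the same result (a sketch assuming brainpy's usual convention that a module call forwards to update):

>>> x = bm.random.randn(3)
>>> assert bm.allclose(m(x), m.update(x))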