leaky_relu

brainpy.math.leaky_relu(x, negative_slope=0.01)

Leaky rectified linear unit activation function.

Computes the element-wise function:

\[\mathrm{leaky\_relu}(x) = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases}\]

where \(\alpha\) = negative_slope.
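
For reference, a minimal sketch of this piecewise definition (written with jax.numpy purely for illustration; it is not the library's internal implementation):

  import jax.numpy as jnp

  def leaky_relu_ref(x, negative_slope=0.01):
      # jnp.where keeps x where x >= 0 and scales it by negative_slope elsewhere
      return jnp.where(x >= 0, x, negative_slope * x)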

Parameters:
  • x (ArrayType) – The input array.

  • negative_slope (float) – The scalar specifying the negative slope (default: 0.01).
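
Returns:
  An array of the same shape as x, with the activation applied element-wise.

Example usage, as a minimal sketch (assumes brainpy is installed; bm.asarray and the exact printed output are assumptions about the array API):

  import brainpy.math as bm

  x = bm.asarray([-2.0, -0.5, 0.0, 1.0, 3.0])

  y = bm.leaky_relu(x)                      # default: negative entries scaled by 0.01
  z = bm.leaky_relu(x, negative_slope=0.2)  # steeper negative slope of 0.2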