LeakyReLU
- class brainpy.dnn.LeakyReLU(negative_slope=0.01, inplace=False)
Applies the element-wise function:
\[\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} \times \min(0, x)\]

or

\[\begin{split}\text{LeakyReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{ otherwise } \end{cases}\end{split}\]

- Parameters:
  - negative_slope (float) – controls the slope applied to negative input values. Default: 0.01
  - inplace (bool) – if True, performs the operation in-place. Default: False
- Shape:
Input: \((*)\), where \(*\) means any number of dimensions
Output: \((*)\), same shape as the input
Examples:
>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.LeakyReLU(0.1)
>>> input = bm.random.randn(2)
>>> output = m(input)
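To see the effect of negative_slope concretely, the following sketch evaluates the layer on a fixed input; it assumes brainpy.math.asarray is available (brainpy.math mirrors the NumPy array API), and the expected values follow directly from the formula above:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.LeakyReLU(0.1)          # negative inputs are scaled by 0.1
>>> x = bm.asarray([-2., -1., 0., 1., 2.])
>>> m(x)                               # per the formula: [-0.2, -0.1, 0., 1., 2.]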