RReLU
- class brainpy.dnn.RReLU(lower=0.125, upper=0.3333333333333333, inplace=False)
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:
\[\begin{split}\text{RReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ ax & \text{otherwise} \end{cases}\end{split}\]
where \(a\) is randomly sampled from the uniform distribution \(\mathcal{U}(\text{lower}, \text{upper})\).
- Parameters:
  - lower – lower bound of the uniform distribution. Default: 0.125
  - upper – upper bound of the uniform distribution. Default: 1/3
  - inplace – can optionally do the operation in-place. Default: False
- Shape:
Input: \((*)\), where \(*\) means any number of dimensions.
Output: \((*)\), same shape as the input.
Examples:
>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.RReLU(0.1, 0.3)
>>> input = bm.random.randn(2)
>>> output = m(input)
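For illustration only, the element-wise rule above can be reproduced with plain NumPy; this is a minimal sketch of the formula (assuming one slope is drawn per element), not the layer's internal implementation:

>>> import numpy as np
>>> x = np.array([-1.0, 0.5, -0.2])
>>> a = np.random.uniform(0.125, 1/3, size=x.shape)  # slope a ~ U(lower, upper), one per element
>>> out = np.where(x >= 0, x, a * x)                  # keep x where non-negative, scale by a otherwise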