rrelu#
- brainpy.math.rrelu(x, lower=0.125, upper=0.3333333333333333)#
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:
\[\begin{split}\text{RReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ ax & \text{otherwise} \end{cases}\end{split}\]
where \(a\) is randomly sampled from the uniform distribution \(\mathcal{U}(\text{lower}, \text{upper})\).
- Parameters:
x – the input array.
lower – lower bound of the uniform distribution. Default: \(\frac{1}{8}\)
upper – upper bound of the uniform distribution. Default: \(\frac{1}{3}\)
- Shape:
Input: \((*)\), where \(*\) means any number of dimensions.
Output: \((*)\), same shape as the input.
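As a minimal sketch of the formula above (this is not BrainPy's implementation; the per-element sampling of \(a\) and the NumPy RNG here are purely illustrative):

```python
import numpy as np

def rrelu_sketch(x, lower=1 / 8, upper=1 / 3, rng=None):
    """Elementwise RReLU: x for x >= 0, a * x otherwise, a ~ U(lower, upper)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x)
    # Sample one slope coefficient a per element from U(lower, upper).
    a = rng.uniform(lower, upper, size=x.shape)
    # Identity for non-negative entries, random negative slope otherwise.
    return np.where(x >= 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(rrelu_sketch(x))  # negative entries scaled by a factor in [1/8, 1/3)
```

In BrainPy itself, the call is simply `brainpy.math.rrelu(x)`, with `lower` and `upper` available as optional arguments per the signature above.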