ReLU

class brainpy.dnn.ReLU(inplace=False)[source]

Applies the rectified linear unit function element-wise:

\(\text{ReLU}(x) = (x)^+ = \max(0, x)\)

Parameters:

inplace (bool) – whether to do the operation in-place. Default: False (an in-place sketch follows the first example below).

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions.

  • Output: \((*)\), same shape as the input.

Examples:

  >>> import brainpy as bp
  >>> import brainpy.math as bm
  >>> m = bp.dnn.ReLU()
  >>> input = bm.random.randn(2)
  >>> output = m(input)
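
A minimal sketch of the inplace option, assuming (as with the PyTorch API this layer mirrors) that inplace=True rectifies the input array directly instead of allocating a new one:

  >>> import brainpy as bp
  >>> import brainpy.math as bm
  >>> m_inplace = bp.dnn.ReLU(inplace=True)
  >>> x = bm.random.randn(4)
  >>> y = m_inplace(x)  # x itself is clipped at zero; y refers to the same data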


An implementation of CReLU (Concatenated ReLU, https://arxiv.org/abs/1603.05201):

  >>> import brainpy as bp
  >>> import brainpy.math as bm
  >>> m = bp.dnn.ReLU()
  >>> input = bm.random.randn(2).unsqueeze(0)
  >>> output = bm.cat((m(input), m(-input)))
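
CReLU concatenates the ReLU of the input with the ReLU of its negation, so the output carries twice as many features as the input: for the (1, 2) input above, bm.cat stacks along the leading axis and the result has shape (2, 2).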

update(input)[source]

The update function: applies the rectified linear unit to input element-wise and returns the result.

Return type:

ArrayType – a TypeVar covering Array, Variable, TrainVar, jax.Array, and numpy.ndarray
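
A minimal usage sketch of update, assuming (as is the convention for brainpy modules) that calling the layer dispatches to update, so both forms are interchangeable:

  >>> import brainpy as bp
  >>> import brainpy.math as bm
  >>> m = bp.dnn.ReLU()
  >>> x = bm.random.randn(3)
  >>> y = m.update(x)  # same result as m(x): max(0, x) element-wise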