SELU

class brainpy.dnn.SELU(inplace=False)

Applies the SELU activation element-wise:

\[\text{SELU}(x) = \text{scale} * (\max(0,x) + \min(0, \alpha * (\exp(x) - 1)))\]

with \(\alpha = 1.6732632423543772848170429916717\) and \(\text{scale} = 1.0507009873554804934193349852946\).
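A minimal sketch (not part of the library's own examples) that checks the formula above against the module, using the stated \(\alpha\) and \(\text{scale}\) constants; the NumPy-style helpers asarray, maximum, minimum, exp and allclose are assumed to be available in brainpy.math:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> alpha = 1.6732632423543772848170429916717
>>> scale = 1.0507009873554804934193349852946
>>> x = bm.asarray([-2.0, -0.5, 0.0, 0.5, 2.0])
>>> manual = scale * (bm.maximum(0., x) + bm.minimum(0., alpha * (bm.exp(x) - 1.)))
>>> module = bp.dnn.SELU()(x)
>>> bool(bm.allclose(manual, module))  # expected: True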

More details can be found in the paper Self-Normalizing Neural Networks.

Parameters:

inplace (bool, optional) – if set to True, performs the operation in-place. Default: False

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions.

  • Output: \((*)\), same shape as the input.
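
A short sketch illustrating the shape contract above: the activation is element-wise, so the output shape always matches the input shape, whatever the number of dimensions:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.SELU()
>>> x = bm.random.randn(4, 3, 2)   # arbitrary 3-D input
>>> m(x).shape                     # same shape as the input
(4, 3, 2)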

Examples:

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.SELU()
>>> input = bm.random.randn(2)
>>> output = m(input)

update(input)

The function that specifies the updating rule of this layer.

Return type:

ArrayType (Array, Variable, TrainVar, or ndarray)
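
A minimal usage sketch, assuming update(input) applies the same element-wise rule as calling the module directly (the concrete return type follows the array type of the input, per the signature above):

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.SELU()
>>> x = bm.random.randn(2)
>>> y = m.update(x)    # equivalent to m(x)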