SiLU
- class brainpy.dnn.SiLU(inplace=False)[source]
Applies the Sigmoid Linear Unit (SiLU) function, element-wise. The SiLU function is also known as the swish function.
\[\text{silu}(x) = x \ast \sigma(x), \text{ where } \sigma(x) \text{ is the logistic sigmoid.}\]
Note
See Gaussian Error Linear Units (GELUs) where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function where the SiLU was experimented with later.
- Parameters:
inplace (bool) – can optionally do the operation in-place. Default: False
- Shape:
Input: \((*)\), where \(*\) means any number of dimensions.
Output: \((*)\), same shape as the input.
Examples:
>>> import brainpy as bp
>>> import brainpy.math as bm
>>> m = bp.dnn.SiLU()
>>> input = bm.random.randn(2)
>>> output = m(input)
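Because \(\text{silu}(x) = x \ast \sigma(x)\), the module's output can be cross-checked against a hand-built sigmoid. The sketch below is a minimal illustration, assuming brainpy.math exposes NumPy-like exp and allclose helpers; it is not part of the official API reference.

>>> # cross-check: SiLU(x) should match x * sigmoid(x) computed by hand
>>> x = bm.random.randn(4)
>>> manual = x * (1.0 / (1.0 + bm.exp(-x)))  # silu(x) = x * sigmoid(x)
>>> output = bp.dnn.SiLU()(x)
>>> bm.allclose(manual, output)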