General Mathematical Operators#

BrainPy Array#

Array

Multi-dimensional array in BrainPy.

ShardedArray

The sharded array, which stores data across multiple devices.

Array Interoperability to JAX#

as_device_array

Convert the input to a jax.numpy.DeviceArray.

as_jax

Convert the input to a jax.numpy.DeviceArray.

Array Interoperability to NumPy#

as_ndarray

Convert the input to a numpy.ndarray.

as_numpy

Convert the input to a numpy.ndarray.

Array Interoperability to BrainPy#

as_variable

Convert the input to a brainpy.math.Variable.

asarray

Convert the input to a brainpy.math.Array.

Activation Functions#

celu

Continuously-differentiable exponential linear unit activation.

elu

Exponential linear unit activation function.
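As an illustrative sketch of the ELU formula only (not BrainPy's implementation; the `alpha` parameter name is an assumption here), the function can be written in plain NumPy:

```python
import numpy as np

def elu(x, alpha=1.0):
    # x for positive inputs, alpha * (exp(x) - 1) for non-positive inputs.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

out = elu(np.array([-1.0, 0.0, 2.0]))
```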

gelu

Gaussian error linear unit activation function.

glu

Gated linear unit activation function.

prelu

Parametric rectified linear unit (PReLU) activation function, applied element-wise.

silu

SiLU activation function.

selu

Scaled exponential linear unit activation.

relu

Rectified linear unit activation function.

relu6

Rectified Linear Unit 6 activation function.
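A minimal sketch of the ReLU6 formula, written in plain NumPy rather than BrainPy's own code: the input is clamped to the range [0, 6].

```python
import numpy as np

def relu6(x):
    # min(max(x, 0), 6): like ReLU, but capped at 6.
    return np.minimum(np.maximum(x, 0.0), 6.0)

out = relu6(np.array([-2.0, 3.0, 10.0]))
```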

rrelu

Applies the randomized leaky rectified linear unit (RReLU) function element-wise.

hard_silu

Hard SiLU activation function.

leaky_relu

Leaky rectified linear unit activation function.

hard_tanh

Hard \(\mathrm{tanh}\) activation function.

hard_sigmoid

Hard Sigmoid activation function.

tanh_shrink

Applies the element-wise function \(x - \mathrm{tanh}(x)\).

hard_swish

Hard SiLU (hard swish) activation function.

hard_shrink

Applies the Hard Shrinkage (Hardshrink) function element-wise.

soft_sign

Soft-sign activation function.

soft_shrink

Applies the soft shrinkage function element-wise.

softmax

Softmax function.
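As a hedged sketch of the standard softmax definition (not BrainPy's implementation; the `axis` parameter name is assumed), written in plain NumPy with the usual max-subtraction trick for numerical stability:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtracting the max does not change the result but avoids overflow in exp.
    z = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=axis, keepdims=True)

probs = softmax(np.array([1.0, 2.0, 3.0]))
```

The outputs are positive, sum to 1 along the chosen axis, and preserve the ordering of the inputs.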

softmin

Applies the Softmin function to an n-dimensional input tensor, rescaling it so that the elements of the n-dimensional output lie in the range [0, 1] and sum to 1.

softplus

Softplus activation function.
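A minimal sketch of the softplus formula \(\log(1 + e^x)\), written in plain NumPy (not BrainPy's code); `logaddexp` computes it without overflowing for large inputs:

```python
import numpy as np

def softplus(x):
    # log(1 + exp(x)), evaluated stably as logaddexp(0, x).
    return np.logaddexp(0.0, x)

out = softplus(np.array([0.0, 100.0]))
```

For large x, softplus(x) approaches x itself, which is why the stable form matters.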

swish

SiLU activation function.

mish

Applies the Mish function, element-wise.

log_sigmoid

Log-sigmoid activation function.

log_softmax

Log-Softmax function.

one_hot

One-hot encodes the given indices.
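One-hot encoding can be sketched in plain NumPy as below (an illustration of the idea only; the `num_classes` parameter name is an assumption, not necessarily BrainPy's signature):

```python
import numpy as np

def one_hot(indices, num_classes):
    # Row i of the identity matrix is the one-hot vector for class i.
    return np.eye(num_classes)[np.asarray(indices)]

out = one_hot([0, 2], num_classes=3)
```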

normalize

Normalizes an array by subtracting mean and dividing by sqrt(var).
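The summary above can be sketched as follows in plain NumPy (illustrative only; the `axis` and `eps` parameter names are assumptions, with a small epsilon added for numerical safety):

```python
import numpy as np

def normalize(x, axis=-1, eps=1e-5):
    # Subtract the mean and divide by sqrt(variance + epsilon).
    mean = np.mean(x, axis=axis, keepdims=True)
    var = np.var(x, axis=axis, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

out = normalize(np.array([1.0, 2.0, 3.0]))
```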

sigmoid

Sigmoid activation function.

identity

Identity function that returns the input unchanged.

tanh

Same as the jax.numpy.tanh function, but compatible with brainpy Array/Variable.