Artificial Layers

Convolutional Layers

GeneralConv(in_channels, out_channels, ...)

Applies a convolution to the inputs.

Conv1D(in_channels, out_channels, ...)

Applies a one-dimensional convolution to the inputs.

Conv2D(in_channels, out_channels, ...)

Applies a two-dimensional convolution to the inputs.

Conv3D(in_channels, out_channels, ...)

Applies a three-dimensional convolution to the inputs.
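
A minimal sketch of constructing one of these layers. It assumes the classes are importable from brainpy.layers and that the elided "..." arguments include a kernel_size keyword; both are assumptions, so adjust to your installation:

    import brainpy as bp
    import brainpy.math as bm

    # 2-D convolution: 3 input channels -> 16 output channels.
    # kernel_size is an assumed name for one of the elided "..." arguments.
    conv = bp.layers.Conv2D(in_channels=3, out_channels=16, kernel_size=(3, 3))

    x = bm.ones((8, 28, 28, 3))  # assumed NHWC batch: 8 images, 28x28, 3 channels
    y = conv(x)                  # some versions use conv.update(x) instead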

Dropout Layers

Dropout(prob[, seed, mode, name])

A layer that stochastically ignores a subset of inputs each training step.
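
A sketch of its use, with the same brainpy.layers import-path assumption as the other examples on this page. Note that this listing does not say whether prob is the keep- or drop-probability, so check the class docstring:

    import brainpy as bp
    import brainpy.math as bm

    drop = bp.layers.Dropout(prob=0.5, seed=42)

    x = bm.ones((32, 100))
    y = drop(x)  # masking applies during training; behavior under `mode` is version-dependent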

Dense Connection Layers

Dense(num_in, num_out[, W_initializer, ...])

A linear transformation applied over the last dimension of the input.
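
For example (assuming the brainpy.layers import path used throughout these sketches):

    import brainpy as bp
    import brainpy.math as bm

    dense = bp.layers.Dense(num_in=128, num_out=10)

    x = bm.ones((32, 128))  # the transform acts on the last dimension
    y = dense(x)            # expected shape: (32, 10)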

NVAR Layers

NVAR(num_in, delay[, order, stride, ...])

Nonlinear vector auto-regression (NVAR) node.
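
A sketch of constructing the node. The feature it computes (delayed inputs plus their polynomial combinations up to `order`) follows the standard NVAR formulation; the internal delay buffering described in the comments is an assumption:

    import brainpy as bp
    import brainpy.math as bm

    # 3 input channels, a delay window of 2 steps, quadratic (order-2) features.
    nvar = bp.layers.NVAR(num_in=3, delay=2, order=2)

    x = bm.ones((1, 3))  # one time step; the node buffers past inputs internally (assumed)
    feat = nvar(x)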

Reservoir Layers

Reservoir(input_shape, num_out[, ...])

Reservoir node, a pool of leaky-integrator neurons with random recurrent connections [1].
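
A sketch of the usual echo-state workflow: drive the reservoir with a sequence, collect its states, and fit a linear readout on them. Import path and call convention are assumptions, as above:

    import brainpy as bp
    import brainpy.math as bm

    # Note: unlike most layers here, the first argument is input_shape, not num_in.
    res = bp.layers.Reservoir(input_shape=3, num_out=200)

    states = [res(bm.ones((1, 3))) for _ in range(100)]  # one state per time step
    # A readout (e.g. the Dense layer above) is then trained on `states`.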

Normalization Layers

BatchNorm(axis[, epsilon, use_bias, ...])

Batch Normalization node.

BatchNorm1d([axis])

1-D batch normalization.

BatchNorm2d([axis])

2-D batch normalization.

BatchNorm3d([axis])

3-D batch normalization.

GroupNorm([num_groups, group_size, epsilon, ...])

Group normalization layer.

LayerNorm([epsilon, use_bias, use_scale, ...])

Layer normalization (https://arxiv.org/abs/1607.06450).

InstanceNorm([epsilon, use_bias, use_scale, ...])

Instance normalization layer.
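
A sketch contrasting two of these layers, with the same import-path caveat; the constructors rely on the bracketed defaults shown above:

    import brainpy as bp
    import brainpy.math as bm

    bn = bp.layers.BatchNorm2d()  # normalizes over batch (and spatial) axes
    ln = bp.layers.LayerNorm()    # normalizes each sample over its feature axes

    x = bm.ones((8, 28, 28, 16))
    y = bn(x)  # batch statistics; typically differs between training and inference
    z = ln(x)  # per-sample statistics; no train/inference distinction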

Pooling Layers

Pool(init_v, reduce_fn, window_shape, ...[, ...])

Generic pooling layer that reduces the input over a moving window using a user-supplied reduction function and initial value.

MaxPool(window_shape[, strides, padding])

Pools the input by taking the maximum over a window.

AvgPool(window_shape[, strides, padding])

Pools the input by taking the average over a window.

MinPool(window_shape[, strides, padding])

Pools the input by taking the minimum over a window.
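
For example, a 2x2 max pool with stride 2, which halves each spatial dimension; the 'VALID' padding string is an assumption borrowed from the JAX convention:

    import brainpy as bp
    import brainpy.math as bm

    pool = bp.layers.MaxPool(window_shape=(2, 2), strides=(2, 2), padding='VALID')

    x = bm.ones((8, 28, 28, 16))  # assumed NHWC layout
    y = pool(x)                   # expected shape: (8, 14, 14, 16)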

Artificial Recurrent Layers

VanillaRNN(num_in, num_out[, ...])

Basic fully-connected RNN core.

GRU(num_in, num_out[, Wi_initializer, ...])

Gated recurrent unit (GRU) RNN core.

LSTM(num_in, num_out[, Wi_initializer, ...])

Long short-term memory (LSTM) RNN core.
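
These are single-step cores, so a sequence is processed by calling the cell once per time step. The sketch below assumes the core carries its hidden state internally; some APIs pass state explicitly, so check the class docstring:

    import brainpy as bp
    import brainpy.math as bm

    gru = bp.layers.GRU(num_in=64, num_out=128)

    xs = bm.ones((100, 16, 64))    # (time, batch, features), an assumed layout
    hs = [gru(x_t) for x_t in xs]  # one hidden state of shape (16, 128) per step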