Nodes: artificial neural network

Artificial neural network (ANN) nodes

GeneralConv(out_channels, kernel_size[, ...])

Applies a convolution to the inputs.

Conv1D(out_channels, kernel_size, **kwargs)

One-dimensional convolution.

Conv2D(out_channels, kernel_size, **kwargs)

Two-dimensional convolution.

Conv3D(out_channels, kernel_size, **kwargs)

Three-dimensional convolution.
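As a rough illustration of what the convolution nodes above compute (a sketch of the operation itself, not the library's implementation), a valid-mode 1-D convolution slides a kernel over the input and takes a dot product at each position. Note that most neural-network libraries actually compute cross-correlation, as here:

```python
import numpy as np

def conv1d(x, kernel):
    """Valid-mode 1-D convolution (cross-correlation, as in most NN libraries)."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
w = np.array([1.0, 0.0, -1.0])
print(conv1d(x, w))  # → [-2. -2. -2.]
```

Conv2D and Conv3D apply the same idea with 2-D and 3-D windows, and `out_channels` independent kernels produce that many output feature maps.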

Dropout(prob[, seed])

A layer that stochastically ignores a subset of inputs each training step.
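A minimal sketch of inverted dropout, assuming `prob` is the drop probability (some libraries instead interpret it as the keep probability, so check the library's docstring):

```python
import numpy as np

def dropout(x, prob, rng, training=True):
    """Zero each element with probability `prob` during training; scale the
    survivors by 1/(1 - prob) so the expected activation is unchanged."""
    if not training:
        return x
    keep = rng.random(x.shape) >= prob
    return np.where(keep, x / (1.0 - prob), 0.0)
```

At evaluation time the layer is the identity, which is why the survivors are rescaled during training rather than rescaling at inference.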

VanillaRNN(num_unit[, state_initializer, ...])

Basic fully-connected RNN core.
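One step of such a core can be sketched in plain NumPy (an illustration of the standard vanilla-RNN update, not the node's actual code; the weight and bias names are hypothetical):

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla RNN step: h' = tanh(x @ Wx + h @ Wh + b)."""
    return np.tanh(x @ Wx + h @ Wh + b)
```

The same hidden state `h` is fed back at every time step, which is what makes the core recurrent.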

GRU(num_unit[, wi_initializer, ...])

Gated Recurrent Unit.
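A GRU augments the vanilla update with an update gate `z` and a reset gate `r`. A sketch of one step, using one common parameterization (weight shapes and names here are illustrative, not the node's signature):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h, Wz, Wr, Wh, bz, br, bh):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    xh = np.concatenate([x, h])
    z = sigmoid(xh @ Wz + bz)                                   # how much to update
    r = sigmoid(xh @ Wr + br)                                   # how much old state to expose
    h_tilde = np.tanh(np.concatenate([x, r * h]) @ Wh + bh)     # candidate state
    return (1.0 - z) * h + z * h_tilde
```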

LSTM(num_unit[, wi_initializer, ...])

Long short-term memory (LSTM) RNN core.
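The LSTM adds a separate cell state `c` alongside the hidden state `h`, controlled by input, forget, and output gates. A sketch of one step, assuming the common convention of stacking all four gate pre-activations into one matrix (gate ordering varies between libraries):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W maps [x; h] to the stacked (input, forget, cell, output)
    pre-activations; h and c are the hidden and cell states."""
    z = np.concatenate([x, h]) @ W + b
    i, f, g, o = np.split(z, 4)
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # gated cell update
    h_new = sigmoid(o) * np.tanh(c_new)               # gated output
    return h_new, c_new
```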

Pool(init_v, reduce_fn, window_shape, ...)

General pooling node that reduces each window with a user-supplied reduce_fn.

MaxPool(window_shape[, strides, padding])

Pools the input by taking the maximum over a window.

AvgPool(window_shape[, strides, padding])

Pools the input by taking the average over a window.

MinPool(window_shape[, strides, padding])

Pools the input by taking the minimum over a window.
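The three pooling variants above differ only in the reduction applied to each window. A NumPy sketch of 2-D pooling with a pluggable reduce function (an illustration of the operation, not the nodes' implementation):

```python
import numpy as np

def pool2d(x, window, reduce_fn, stride=None):
    """Apply `reduce_fn` over strided windows of a 2-D input.
    With stride=None the windows are non-overlapping."""
    wh, ww = window
    sh, sw = stride or window
    out_h = (x.shape[0] - wh) // sh + 1
    out_w = (x.shape[1] - ww) // sw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = reduce_fn(x[i*sh:i*sh+wh, j*sw:j*sw+ww])
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
print(pool2d(x, (2, 2), np.max))   # MaxPool  → [[ 5.  7.] [13. 15.]]
print(pool2d(x, (2, 2), np.mean))  # AvgPool  → [[ 2.5  4.5] [10.5 12.5]]
```

Passing `np.min` as the reduce function gives MinPool in the same way.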

BatchNorm(axis[, epsilon, use_bias, ...])

Batch Normalization node.

BatchNorm1d([axis])

1-D batch normalization.

BatchNorm2d([axis])

2-D batch normalization.

BatchNorm3d([axis])

3-D batch normalization.
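Batch normalization standardizes activations using statistics computed across the batch, then applies a learned scale and shift. A sketch of the training-time computation (here `axis` is the axis reduced over; libraries differ on whether the `axis` argument names the reduced axes or the feature axis, so this is an assumption):

```python
import numpy as np

def batch_norm(x, gamma, beta, axis=0, eps=1e-5):
    """Normalize over the batch axis, then apply learned scale and shift."""
    mean = x.mean(axis=axis, keepdims=True)
    var = x.var(axis=axis, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta
```

The 1-D/2-D/3-D variants differ only in which spatial axes are folded into the batch statistics; at inference time, running averages of the mean and variance are used instead.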

GroupNorm([num_groups, group_size, epsilon, ...])

Group normalization layer.

LayerNorm([epsilon, use_bias, use_scale, ...])

Layer normalization (https://arxiv.org/abs/1607.06450).

InstanceNorm([epsilon, use_bias, use_scale, ...])

Instance normalization layer.
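Layer, instance, and group normalization all use the same normalize-scale-shift recipe as batch normalization but compute statistics per sample instead of across the batch: layer norm reduces over all features of one sample, instance norm over the spatial axes of each channel, and group norm over groups of channels. A sketch of layer normalization (the operation from the paper linked above, without the optional learned scale and bias):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize each sample over its feature (last) axis."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```

Because no batch statistics are involved, these layers behave identically at training and inference time and work with batch size 1.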