brainpy.dyn.layers.Reservoir

class brainpy.dyn.layers.Reservoir(input_shape, num_out, leaky_rate=0.3, activation='tanh', activation_type='internal', Win_initializer=Normal(scale=0.1, seed=1060522), Wrec_initializer=Normal(scale=0.1, seed=8804248), b_initializer=ZeroInit, in_connectivity=0.1, rec_connectivity=0.1, comp_type='dense', spectral_radius=None, noise_in=0.0, noise_rec=0.0, noise_type='normal', seed=None, mode=BatchingMode, name=None)

Reservoir node: a pool of leaky-integrator neurons with random recurrent connections [1].

Parameters
  • input_shape (int, tuple of int) – The input shape.

  • num_out (int) – The number of reservoir nodes.

  • Win_initializer (Initializer) – The initialization method for the feedforward connections.

  • Wrec_initializer (Initializer) – The initialization method for the recurrent connections.

  • b_initializer (optional, Array, Initializer) – The initialization method for the bias.

  • leaky_rate (float) – The leaky rate \(\alpha\) of the leaky integration. Must be a float between 0 and 1.

  • activation (str, callable, optional) – Reservoir activation function.

    • If a str, it should be the name of a brainpy.math.activations function.

    • If a callable, it should be an element-wise operator on tensors.

  • activation_type (str) – How the leaky integration interacts with the activation function (a minimal code sketch follows this parameter list).

    • If “internal” (default), then leaky integration happens on states already transformed by the activation function:

      \[r[n+1] = (1 - \alpha) \cdot r[n] + \alpha \cdot f(W_{ff} \cdot u[n] + W_{fb} \cdot b[n] + W_{rec} \cdot r[n])\]

    • If “external”, then leaky integration happens on the internal state of each neuron, stored in an internal_state parameter (\(x\) in the equations below). A neuron’s internal state is the value of its state before the activation function \(f\) is applied:

      \[\begin{split}x[n+1] &= (1 - \alpha) \cdot x[n] + \alpha \cdot (W_{ff} \cdot u[n] + W_{rec} \cdot r[n] + W_{fb} \cdot b[n]) \\ r[n+1] &= f(x[n+1])\end{split}\]

  • in_connectivity (float, optional) – Connectivity of input neurons, i.e. the ratio of input neurons connected to reservoir neurons. Must be in [0, 1], by default 0.1.

  • rec_connectivity (float, optional) – Connectivity of the recurrent weight matrix, i.e. the ratio of reservoir neurons connected to other reservoir neurons (including themselves). Must be in [0, 1], by default 0.1.

  • comp_type (str) – The connectivity type; can be “dense” or “sparse”.

  • spectral_radius (float, optional) – Spectral radius of the recurrent weight matrix, by default None.

  • noise_rec (float, optional) – Gain of the noise applied to the reservoir internal states, by default 0.0.

  • noise_in (float, optional) – Gain of the noise applied to the feedforward signals, by default 0.0.

  • noise_type (optional, str, callable) – Distribution of the noise. Must be a random variable generator distribution (see brainpy.math.random.RandomState), by default “normal”.

  • seed (optional, int) – The seed for random sampling in this node.
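
The difference between the two activation_type variants can be summarized by the following minimal NumPy sketch. It only illustrates the update equations above and is not the library implementation; the array names (u, r, x, W_ff, W_rec, b) are placeholders, and the feedback term \(W_{fb} \cdot b[n]\) is folded into a plain bias b for brevity.

import numpy as np

def reservoir_step(u, r, x, W_ff, W_rec, b,
                   alpha=0.3, f=np.tanh, activation_type='internal'):
    """One illustrative update step; returns (output state, internal state)."""
    drive = W_ff @ u + W_rec @ r + b            # pre-activation input drive
    if activation_type == 'internal':
        # Leaky integration of the *activated* drive.
        r_new = (1 - alpha) * r + alpha * f(drive)
        return r_new, x                         # internal state is not used here
    else:                                       # 'external'
        # Leaky integration of the pre-activation (internal) state,
        # with the activation applied afterwards.
        x_new = (1 - alpha) * x + alpha * drive
        return f(x_new), x_new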

References

[1] Lukoševičius, Mantas. “A practical guide to applying echo state networks.” Neural Networks: Tricks of the Trade. Springer, Berlin, Heidelberg, 2012. 659-686.
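
A minimal usage sketch is given below. It assumes the common import aliases import brainpy as bp and import brainpy.math as bm; passing an empty dict as the shared-argument sha of update() is an illustrative assumption (the reservoir update does not need time-step information here).

import brainpy as bp
import brainpy.math as bm

# Build a reservoir that maps 3 input features onto 100 leaky units.
reservoir = bp.dyn.layers.Reservoir(
    input_shape=3,
    num_out=100,
    leaky_rate=0.3,
    rec_connectivity=0.1,
    spectral_radius=0.9,
)

# Prepare the recurrent state for a batch of 8 sequences.
reservoir.reset_state(batch_size=8)

# Drive the reservoir with one time step of input.
x = bm.random.rand(8, 3)          # (batch, num_features)
r = reservoir.update(dict(), x)   # (8, 100) reservoir states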

__init__(input_shape, num_out, leaky_rate=0.3, activation='tanh', activation_type='internal', Win_initializer=Normal(scale=0.1, seed=1060522), Wrec_initializer=Normal(scale=0.1, seed=8804248), b_initializer=ZeroInit, in_connectivity=0.1, rec_connectivity=0.1, comp_type='dense', spectral_radius=None, noise_in=0.0, noise_rec=0.0, noise_type='normal', seed=None, mode=BatchingMode, name=None)

Methods

__init__(input_shape, num_out[, leaky_rate, ...])

clear_input()

get_delay_data(identifier, delay_step, *indices)

Get delay data according to the provided delay steps.

load_states(filename[, verbose])

Load the model states.

nodes([method, level, include_self])

Collect all children nodes.

offline_fit(target, fit_record)

offline_init()

online_fit(target, fit_record)

online_init()

register_delay(identifier, delay_step, ...)

Register delay variable.

register_implicit_nodes(*nodes, **named_nodes)

register_implicit_vars(*variables, ...)

reset([batch_size])

Reset function which resets all variables in the model.

reset_local_delays([nodes])

Reset local delay variables.

reset_state([batch_size])

Reset function which resets the states in the model.

save_states(filename[, variables])

Save the model states.

train_vars([method, level, include_self])

The shortcut for retrieving all trainable variables.

unique_name([name, type_])

Get the unique name for this object.

update(sha, x)

Update the reservoir with the input x and return the new state.

update_local_delays([nodes])

Update local delay variables.

vars([method, level, include_self])

Collect all variables in this node and the children nodes.
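
The variable-collection and state-saving helpers listed above can be used as in the short sketch below; the file name and the '.npz' format are illustrative assumptions.

import brainpy as bp

reservoir = bp.dyn.layers.Reservoir(input_shape=3, num_out=100)

# Collect every variable registered on this node (and its children),
# and the subset of trainable variables.
all_vars = reservoir.vars()
trainable = reservoir.train_vars()
print(list(all_vars.keys()))

# Persist the model states and restore them later.
reservoir.save_states('reservoir_states.npz')   # assumed file name/format
reservoir.load_states('reservoir_states.npz')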

Attributes

global_delay_data

mode

Mode of the model, which is used to control the model's different behaviors.

name

Name of the model.