brainpy.neurons.ALIFBellec2020
- class brainpy.neurons.ALIFBellec2020(size, keep_size=False, V_rest=-70.0, V_th=-60.0, R=1.0, beta=1.6, tau=20.0, tau_a=2000.0, tau_ref=None, noise=None, V_initializer=OneInit(value=-70.0), a_initializer=OneInit(value=-50.0), spike_fun=<brainpy._src.math.surrogate._utils.VJPCustom object>, method='exp_auto', name=None, mode=None, eprop=False)
Leaky Integrate-and-Fire model with spike-frequency adaptation (SFA) [1].
This model is similar to the GLIF2 model in the Technical White Paper on generalized LIF (GLIF) models from the Allen Institute [2].
Formally, this model is given by:
\[\begin{split}\tau \dot{V} = -(V - V_{\mathrm{rest}}) + R I \\ \tau_a \dot{a} = -a\end{split}\]

Once a spike is induced by \(V(t) > V_{\mathrm{th}} + \beta a\), then

\[\begin{split}V \gets V - V_{\mathrm{th}} \\ a \gets a + 1\end{split}\]

References
- [1] Bellec, Guillaume, et al. "A solution to the learning dilemma for recurrent networks of spiking neurons." Nature Communications 11.1 (2020): 1-15.
- [2] Allen Institute: Cell Types Database. © 2018 Allen Institute for Brain Science. Allen Cell Types Database, cell feature search. Available from: celltypes.brain-map.org/data (2018).
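The two differential equations and the spike/reset rule above can be sketched with plain Euler integration. This is a minimal illustration of the documented dynamics, not BrainPy's actual implementation; the parameter values are the constructor defaults shown in the signature, while `dt` and the constant input current `I_ext` are assumptions chosen for the example.

```python
# Constructor defaults from the class signature above.
V_rest, V_th = -70.0, -60.0
R, beta = 1.0, 1.6
tau, tau_a = 20.0, 2000.0
dt = 0.1  # Euler step in ms (an assumption, not a class parameter)

V, a = V_rest, 0.0
I_ext = 20.0  # constant drive; steady state V_rest + R*I_ext = -50 > V_th

t = 0.0
while True:
    # Euler integration of: tau dV/dt = -(V - V_rest) + R*I,  tau_a da/dt = -a
    V += dt * (-(V - V_rest) + R * I_ext) / tau
    a += dt * (-a) / tau_a
    t += dt
    if V > V_th + beta * a:  # spike when V crosses the adaptive threshold
        V_pre = V
        V -= V_th            # soft reset: V <- V - V_th
        a += 1.0             # adaptation increment: a <- a + 1
        break
```

With these defaults the membrane relaxes from rest toward -50 mV and first crosses the threshold after roughly tau * ln(2) ≈ 13.9 ms, at which point the adaptation variable jumps by one and raises the effective threshold by beta for subsequent spikes.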
- __init__(size, keep_size=False, V_rest=-70.0, V_th=-60.0, R=1.0, beta=1.6, tau=20.0, tau_a=2000.0, tau_ref=None, noise=None, V_initializer=OneInit(value=-70.0), a_initializer=OneInit(value=-50.0), spike_fun=<brainpy._src.math.surrogate._utils.VJPCustom object>, method='exp_auto', name=None, mode=None, eprop=False)
Methods
__init__(size[, keep_size, V_rest, V_th, R, ...])
clear_input()
    Function to clear inputs in the neuron group.
cpu()
    Move all variables onto the CPU device.
cuda()
    Move all variables onto the GPU device.
dV(V, t, I_ext)
da(a, t)
get_batch_shape([batch_size])
get_delay_data(identifier, delay_step, *indices)
    Get delay data according to the provided delay steps.
load_state_dict(state_dict[, warn])
    Copy parameters and buffers from state_dict into this module and its descendants.
load_states(filename[, verbose])
    Load the model states.
nodes([method, level, include_self])
    Collect all children nodes.
offline_fit(target, fit_record)
offline_init()
online_fit(target, fit_record)
online_init()
register_delay(identifier, delay_step, ...)
    Register a delay variable.
register_implicit_nodes(*nodes[, node_cls])
register_implicit_vars(*variables, ...)
reset([batch_size])
    Reset function which resets all variables in the model.
reset_local_delays([nodes])
    Reset local delay variables.
reset_state([batch_size])
    Reset function which resets the states in the model.
save_states(filename[, variables])
    Save the model states.
state_dict()
    Returns a dictionary containing the whole state of the module.
to(device)
    Move all variables onto the given device.
tpu()
    Move all variables onto the TPU device.
train_vars([method, level, include_self])
    The shortcut for retrieving all trainable variables.
tree_flatten()
    Flattens the object as a PyTree.
tree_unflatten(aux, dynamic_values)
    New in version 2.3.1.
unique_name([name, type_])
    Get the unique name for this object.
update(tdi[, x])
    The function to specify the updating rule.
update_local_delays([nodes])
    Update local delay variables.
vars([method, level, include_self, ...])
    Collect all variables in this node and the children nodes.
Attributes
derivative
global_delay_data
    Global delay data, which stores the delay variables and corresponding delay targets.
mode
    Mode of the model, which is useful to control the multiple behaviors of the model.
name
    Name of the model.
varshape
    The shape of variables in the neuron group.