# brainpy.dyn.synapses.AlphaCUBA#

class brainpy.dyn.synapses.AlphaCUBA(pre, post, conn, conn_type='dense', g_max=1.0, delay_step=None, tau_decay=10.0, method='exp_auto', name=None)[source]#

Current-based alpha synapse model.

Model Descriptions

The analytical expression of alpha synapse is given by:

$g_{syn}(t)= g_{max} \frac{t-t_{s}}{\tau} \exp \left(-\frac{t-t_{s}}{\tau}\right).$
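As a quick check on the kernel's shape, setting the derivative of this closed form to zero shows that the conductance peaks exactly one time constant after the spike:

$\frac{d g_{syn}}{dt} = \frac{g_{max}}{\tau} \exp\left(-\frac{t-t_{s}}{\tau}\right)\left(1-\frac{t-t_{s}}{\tau}\right) = 0 \;\Rightarrow\; t-t_{s}=\tau, \quad g_{syn}^{\mathrm{peak}} = g_{max}\, e^{-1}.$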

This closed-form expression is awkward to implement in an event-driven simulation, so it is usually rewritten as an equivalent system of differential equations:

\begin{split}\begin{aligned} g_{\mathrm{syn}}(t) &= g_{\mathrm{max}} g \\ \frac{d g}{d t} &= -\frac{g}{\tau}+h \\ \frac{d h}{d t} &= -\frac{h}{\tau}+\delta\left(t_{0}-t\right) \end{aligned}\end{split}

Because this is a current-based (CUBA) model, the current delivered to the post-synaptic neuron equals the synaptic conductance value itself, independent of the post-synaptic membrane potential:

$I_{syn}(t) = g_{\mathrm{syn}}(t).$
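To see that the differential form above reproduces the closed-form alpha kernel, here is a minimal NumPy sketch (independent of BrainPy; the parameter values are illustrative) that Euler-integrates the two ODEs after a single spike at $t_0 = 0$ and checks the peak against the analytical expression:

```python
import numpy as np

tau = 10.0   # decay time constant [ms]
dt = 0.01    # integration step [ms]
steps = int(50.0 / dt)

# The spike's delta impulse increments h; scaling by 1/tau makes
# g(t) match the normalized kernel (t/tau) * exp(-t/tau).
g, h = 0.0, 1.0 / tau

trace = np.empty(steps)
for i in range(steps):
    trace[i] = g
    g += dt * (-g / tau + h)   # dg/dt = -g/tau + h
    h += dt * (-h / tau)       # dh/dt = -h/tau

t_peak = trace.argmax() * dt
g_peak = trace.max()
# The analytical alpha kernel peaks at t = tau with value exp(-1)
print(t_peak, g_peak)   # close to 10.0 and 0.3679
```

Note that BrainPy's own integration is handled by the `method` argument (e.g. `'exp_auto'`); plain Euler is used here only to keep the sketch self-contained.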

Model Examples

>>> import brainpy as bp
>>> import matplotlib.pyplot as plt
>>>
>>> neu1 = bp.dyn.LIF(1)
>>> neu2 = bp.dyn.LIF(1)
>>> syn1 = bp.dyn.AlphaCUBA(neu1, neu2, bp.connect.All2All())
>>> net = bp.dyn.Network(pre=neu1, syn=syn1, post=neu2)
>>>
>>> runner = bp.dyn.DSRunner(net, inputs=[('pre.input', 25.)], monitors=['pre.V', 'post.V', 'syn.g', 'syn.h'])
>>> runner.run(150.)
>>>
>>> fig, gs = bp.visualize.get_figure(2, 1, 3, 8)
>>> fig.add_subplot(gs[0, 0])
>>> plt.plot(runner.mon.ts, runner.mon['pre.V'], label='pre-V')
>>> plt.plot(runner.mon.ts, runner.mon['post.V'], label='post-V')
>>> plt.legend()
>>>
>>> fig.add_subplot(gs[1, 0])
>>> plt.plot(runner.mon.ts, runner.mon['syn.g'], label='g')
>>> plt.plot(runner.mon.ts, runner.mon['syn.h'], label='h')
>>> plt.legend()
>>> plt.show()
Parameters
• pre (NeuGroup) – The pre-synaptic neuron group.

• post (NeuGroup) – The post-synaptic neuron group.

• conn (optional, ndarray, JaxArray, dict of (str, ndarray), TwoEndConnector) – The synaptic connections.

• conn_type (str) – The connection type used for model speed optimization. It can be sparse or dense. The default is dense.

• delay_step (int, ndarray, JaxArray, Initializer, Callable) – The delay length. It should be the value of $\mathrm{delay\_time} / \mathrm{dt}$.

• tau_decay (float, JaxArray, ndarray) – The time constant of the synaptic decay phase. [ms]

• g_max (float, ndarray, JaxArray, Initializer, Callable) – The synaptic strength (the maximum conductance). Default is 1.

• name (str) – The name of this synaptic projection.

• method (str) – The numerical integration methods.
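As a small illustration of the delay_step convention above (the values here are hypothetical), a synaptic delay expressed in milliseconds is converted to a step count by dividing by the simulation time step:

```python
dt = 0.1          # simulation time step [ms]
delay_time = 2.0  # desired synaptic delay [ms]

# delay_step = delay_time / dt, rounded to an integer step count
delay_step = int(round(delay_time / dt))
print(delay_step)  # 20
```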

References

[1] Sterratt, David, Bruce Graham, Andrew Gillies, and David Willshaw. “The Synapse.” Principles of Computational Modelling in Neuroscience. Cambridge: Cambridge UP, 2011. 172-95. Print.

__init__(pre, post, conn, conn_type='dense', g_max=1.0, delay_step=None, tau_decay=10.0, method='exp_auto', name=None)[source]#

Methods

| Method | Description |
| --- | --- |
| `__init__(pre, post, conn[, conn_type, ...])` | |
| `check_post_attrs(*attrs)` | Check whether the post group satisfies the requirement. |
| `check_pre_attrs(*attrs)` | Check whether the pre group satisfies the requirement. |
| `dg(g, t, h)` | |
| `dh(h, t)` | |
| `get_delay_data(name, delay_step, *indices)` | Get delay data according to the provided delay steps. |
| `ints([method])` | Collect all integrators in this node and the children nodes. |
| `load_states(filename[, verbose])` | Load the model states. |
| `nodes([method, level, include_self])` | Collect all children nodes. |
| `output(g_post)` | |
| `register_delay(name, delay_step, delay_target)` | Register delay variable. |
| `register_implicit_nodes(nodes)` | |
| `register_implicit_vars(variables)` | |
| `reset()` | Reset function which resets all variables in the model. |
| `save_states(filename[, variables])` | Save the model states. |
| `train_vars([method, level, include_self])` | The shortcut for retrieving all trainable variables. |
| `unique_name([name, type_])` | Get the unique name for this object. |
| `update(t, dt)` | The function to specify the updating rule. |
| `vars([method, level, include_self])` | Collect all variables in this node and the children nodes. |

Attributes

• `global_delay_targets`

• `global_delay_vars`

• `name`

• `steps`