brainpy.dyn.synapses.Alpha#

class brainpy.dyn.synapses.Alpha(pre, post, conn, output=CUBA, stp=None, comp_method='dense', g_max=1.0, delay_step=None, tau_decay=10.0, method='exp_auto', name=None, mode=NormalMode, stop_spike_gradient=False)[source]#

Alpha synapse model.

Model Descriptions

The analytical expression of the alpha synapse is given by [1]:

\[g_{syn}(t)= g_{max} \frac{t-t_{s}}{\tau} \exp \left(-\frac{t-t_{s}}{\tau}\right).\]

However, this closed-form expression is inconvenient to implement directly. Because the alpha function is the impulse response of two identical first-order low-pass filters in cascade, it can be rewritten as an equivalent pair of differential equations:

\[\begin{split}\begin{aligned} &g_{\mathrm{syn}}(t)= g_{\mathrm{max}} g \\ &\frac{d g}{d t}=-\frac{g}{\tau}+h \\ &\frac{d h}{d t}=-\frac{h}{\tau}+\delta\left(t_{0}-t\right) \end{aligned}\end{split}\]
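To see that the two forms agree, the following sketch (plain NumPy with forward-Euler integration, independent of BrainPy) integrates the g/h system for a single spike at \(t_s = 0\) and compares it with the closed-form expression. Scaling the spike impulse by \(1/\tau\) is an assumption made here so that \(g\) matches the analytical curve with \(g_{max} = 1\); the library's internal scaling may differ.

>>> import numpy as np
>>> tau, dt = 10.0, 0.01                  # decay time constant and step [ms]
>>> ts = np.arange(0.0, 100.0, dt)
>>> g, h = 0.0, 1.0 / tau                 # delta input: one spike at t_s = 0, scaled by 1/tau
>>> g_trace = np.empty_like(ts)
>>> for i in range(ts.size):
...     g_trace[i] = g
...     # dg/dt = -g/tau + h,  dh/dt = -h/tau
...     g, h = g + dt * (-g / tau + h), h + dt * (-h / tau)
>>> analytical = (ts / tau) * np.exp(-ts / tau)
>>> print(np.abs(g_trace - analytical).max())   # small O(dt) discrepancy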

Model Examples

>>> import brainpy as bp
>>> from brainpy.dyn import neurons, synapses, synouts
>>> import matplotlib.pyplot as plt
>>>
>>> neu1 = neurons.LIF(1)
>>> neu2 = neurons.LIF(1)
>>> syn1 = synapses.Alpha(neu1, neu2, bp.connect.All2All(), output=synouts.CUBA())
>>> net = bp.dyn.Network(pre=neu1, syn=syn1, post=neu2)
>>>
>>> runner = bp.dyn.DSRunner(net, inputs=[('pre.input', 25.)], monitors=['pre.V', 'post.V', 'syn.g', 'syn.h'])
>>> runner.run(150.)
>>>
>>> fig, gs = bp.visualize.get_figure(2, 1, 3, 8)
>>> fig.add_subplot(gs[0, 0])
>>> plt.plot(runner.mon.ts, runner.mon['pre.V'], label='pre-V')
>>> plt.plot(runner.mon.ts, runner.mon['post.V'], label='post-V')
>>> plt.legend()
>>> fig.add_subplot(gs[1, 0])
>>> plt.plot(runner.mon.ts, runner.mon['syn.g'], label='g')
>>> plt.plot(runner.mon.ts, runner.mon['syn.h'], label='h')
>>> plt.legend()
>>> plt.show()

(Figure: membrane potentials of the pre- and post-synaptic LIF neurons, and the synaptic variables g and h, produced by the example above.)
Parameters
  • pre (NeuGroup) – The pre-synaptic neuron group.

  • post (NeuGroup) – The post-synaptic neuron group.

  • conn (optional, ndarray, JaxArray, dict of (str, ndarray), TwoEndConnector) – The synaptic connections.

  • comp_method (str) – The connection type used for model speed optimization. It can be 'sparse' or 'dense'. The default is 'dense'.

  • delay_step (int, ndarray, JaxArray, Initializer, Callable) – The delay length, expressed in time steps, i.e. the value of \(\mathrm{delay\_time / dt}\) (see the usage sketch after this parameter list).

  • tau_decay (float, JaxArray, ndarray) – The time constant of the synaptic decay phase. [ms]

  • g_max (float, ndarray, JaxArray, Initializer, Callable) – The synaptic strength (the maximum conductance). Default is 1.

  • name (str) – The name of this synaptic projection.

  • method (str) – The numerical integration methods.
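A usage sketch combining these parameters (the connection probability, delay time, and numeric values below are arbitrary illustrative choices, not defaults):

>>> import brainpy as bp
>>> from brainpy.dyn import neurons, synapses, synouts
>>>
>>> dt = bp.math.get_dt()                 # simulation time step [ms]
>>> pre = neurons.LIF(50)
>>> post = neurons.LIF(20)
>>> syn = synapses.Alpha(pre, post, bp.connect.FixedProb(0.2),
...                      output=synouts.CUBA(),
...                      comp_method='sparse',        # or 'dense' (the default)
...                      g_max=0.5,                   # synaptic strength
...                      tau_decay=8.0,               # decay time constant [ms]
...                      delay_step=int(10.0 / dt))   # 10 ms delay expressed in steps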

References

[1] Sterratt, David, Bruce Graham, Andrew Gillies, and David Willshaw. “The Synapse.” Principles of Computational Modelling in Neuroscience. Cambridge: Cambridge UP, 2011. 172-95. Print.

__init__(pre, post, conn, output=CUBA, stp=None, comp_method='dense', g_max=1.0, delay_step=None, tau_decay=10.0, method='exp_auto', name=None, mode=NormalMode, stop_spike_gradient=False)[source]#

Methods

__init__(pre, post, conn[, output, stp, ...])

check_post_attrs(*attrs)

Check whether the post-synaptic group satisfies the requirements.

check_pre_attrs(*attrs)

Check whether the pre-synaptic group satisfies the requirements.

clear_input()

dg(g, t, h)

dh(h, t)

get_delay_data(identifier, delay_step, *indices)

Get delay data according to the provided delay steps.

init_weights(weight, comp_method[, sparse_data])

Return type: Union[float, Array], where Array may be a JaxArray, Variable, TrainVar, or ndarray.

load_states(filename[, verbose])

Load the model states.

nodes([method, level, include_self])

Collect all child nodes.

offline_fit(target, fit_record)

offline_init()

online_fit(target, fit_record)

online_init()

register_delay(identifier, delay_step, ...)

Register delay variable.

register_implicit_nodes(*nodes, **named_nodes)

register_implicit_vars(*variables, ...)

reset([batch_size])

Reset all the variables in the model.

reset_local_delays([nodes])

Reset local delay variables.

reset_state([batch_size])

Reset the states in the model.

save_states(filename[, variables])

Save the model states.

syn2post_with_all2all(syn_value, syn_weight)

syn2post_with_dense(syn_value, syn_weight, ...)

syn2post_with_one2one(syn_value, syn_weight)

train_vars([method, level, include_self])

The shortcut for retrieving all trainable variables.

unique_name([name, type_])

Get the unique name for this object.

update(tdi[, pre_spike])

The function to specify the updating rule.

update_local_delays([nodes])

Update local delay variables.

vars([method, level, include_self])

Collect all variables in this node and its child nodes (see the sketch below).
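A short sketch of the introspection helpers listed above (vars, nodes, train_vars), reusing the syn1 and net objects from the Model Examples:

>>> vars_dict = syn1.vars()          # all variables of the synapse and its children
>>> nodes_dict = net.nodes()         # all child nodes reachable from the network
>>> train_dict = syn1.train_vars()   # only the trainable variables
>>> sorted(nodes_dict.keys())        # e.g. names of the LIF groups and the synapse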

Attributes

global_delay_data

mode

Mode of the model, which controls the model's different behaviors.

name

Name of the model.