brainpy.synapses.NMDA#

class brainpy.synapses.NMDA(pre, post, conn, output=MgBlock, stp=None, comp_method='dense', g_max=0.15, delay_step=None, tau_decay=100.0, a=0.5, tau_rise=2.0, method='exp_auto', name=None, mode=None, stop_spike_gradient=False)[source]#

NMDA synapse model.

Model Descriptions

The NMDA receptor is a glutamate receptor and ion channel found in neurons. The NMDA receptor is one of three types of ionotropic glutamate receptors, the other two being AMPA and kainate receptors.

The NMDA receptor mediated conductance depends on the postsynaptic voltage. The voltage dependence is due to the blocking of the pore of the NMDA receptor from the outside by a positively charged magnesium ion. The channel is nearly completely blocked at resting potential, but the magnesium block is relieved if the cell is depolarized. The fraction of channels \(g_{\infty}\) that are not blocked by magnesium can be fitted to

\[g_{\infty}(V,[{Mg}^{2+}]_{o}) = (1+{e}^{-\alpha V} \frac{[{Mg}^{2+}]_{o}} {\beta})^{-1}\]

Here \([{Mg}^{2+}]_{o}\) is the extracellular magnesium concentration, usually 1 mM. The channel therefore acts as a “coincidence detector”: only when glutamate is bound (following presynaptic release) and the postsynaptic membrane is depolarized (relieving the magnesium block) does the channel open and allow positively charged ions (cations) to flow through the cell membrane [2].
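For intuition, the unblocked fraction can be evaluated directly. The sketch below assumes the commonly used constants \(\alpha = 0.062\ \mathrm{mV}^{-1}\), \(\beta = 3.57\ \mathrm{mM}\), and \([{Mg}^{2+}]_{o} = 1.2\ \mathrm{mM}\); these are illustrative choices, not necessarily the defaults of the MgBlock output used by this class.

>>> import numpy as np
>>>
>>> def g_inf(V, cc_Mg=1.2, alpha=0.062, beta=3.57):
...     # Fraction of NMDA channels not blocked by magnesium at voltage V (mV).
...     return 1. / (1. + np.exp(-alpha * V) * cc_Mg / beta)
>>>
>>> g_inf(-65.)  # near rest: almost fully blocked (~0.05)
>>> g_inf(0.)    # depolarized: block largely relieved (~0.75)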

If we make the approximation that the magnesium block changes instantaneously with voltage and is independent of the gating of the channel, the net NMDA receptor-mediated synaptic current is given by

\[I_{syn} = g_\mathrm{NMDA}(t) (V(t)-E) \cdot g_{\infty}\]

where \(V(t)\) is the postsynaptic membrane potential and \(E\) is the reversal potential.

Simultaneously, the kinetics of synaptic state \(g\) is given by

\[\begin{split}& g_\mathrm{NMDA} (t) = g_{max} g \\ & \frac{d g}{dt} = -\frac{g} {\tau_{decay}}+a x(1-g) \\ & \frac{d x}{dt} = -\frac{x}{\tau_{rise}}+ \sum_{k} \delta(t-t_{j}^{k})\end{split}\]

where the decay time constant of NMDA currents is usually taken to be \(\tau_{decay} = 100\) ms, \(a = 0.5\ \mathrm{ms}^{-1}\), and \(\tau_{rise} = 2\) ms.
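The following standalone Euler sketch shows how a single presynaptic spike drives \(x\) and then \(g\). It is only an illustration of the equations above, not the class's internal integrator (which uses the chosen method); the spike time and dt are arbitrary.

>>> import numpy as np
>>>
>>> dt, tau_decay, tau_rise, a = 0.1, 100., 2., 0.5   # ms, ms, ms, 1/ms
>>> g, x = 0., 0.
>>> g_trace = []
>>> for step in range(int(150. / dt)):
...     spike = 1. if step == int(10. / dt) else 0.    # one presynaptic spike at t = 10 ms
...     x += dt * (-x / tau_rise) + spike              # the delta input increments x
...     g += dt * (-g / tau_decay + a * x * (1. - g))
...     g_trace.append(g)
>>> # The synaptic current then follows I_syn = g_max * g * (V - E) * g_inf(V).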

The NMDA receptor has been thought to be very important for controlling synaptic plasticity and mediating learning and memory functions [3].

Model Examples

>>> import brainpy as bp
>>> from brainpy import synapses, neurons
>>> import matplotlib.pyplot as plt
>>>
>>> neu1 = neurons.HH(1)
>>> neu2 = neurons.HH(1)
>>> syn1 = synapses.NMDA(neu1, neu2, bp.connect.All2All(), E=0.)
>>> net = bp.Network(pre=neu1, syn=syn1, post=neu2)
>>>
>>> runner = bp.DSRunner(net, inputs=[('pre.input', 5.)], monitors=['pre.V', 'post.V', 'syn.g', 'syn.x'])
>>> runner.run(150.)
>>>
>>> fig, gs = bp.visualize.get_figure(2, 1, 3, 8)
>>> fig.add_subplot(gs[0, 0])
>>> plt.plot(runner.mon.ts, runner.mon['pre.V'], label='pre-V')
>>> plt.plot(runner.mon.ts, runner.mon['post.V'], label='post-V')
>>> plt.legend()
>>>
>>> fig.add_subplot(gs[1, 0])
>>> plt.plot(runner.mon.ts, runner.mon['syn.g'], label='g')
>>> plt.plot(runner.mon.ts, runner.mon['syn.x'], label='x')
>>> plt.legend()
>>> plt.show()


Parameters
  • pre (NeuGroup) – The pre-synaptic neuron group.

  • post (NeuGroup) – The post-synaptic neuron group.

  • conn (optional, ArrayType, dict of (str, ndarray), TwoEndConnector) – The synaptic connections.

  • comp_method (str) – The connection type used for model speed optimization. It can be 'sparse' or 'dense'. The default is 'dense'.

  • delay_step (int, ArrayType, Initializer, Callable) – The delay length. It should be the value of \(\mathrm{delay\_time / dt}\) (see the sketch after this parameter list).

  • g_max (float, ArrayType, Initializer, Callable) – The synaptic strength (the maximum conductance). Default is 0.15.

  • tau_decay (float, ArrayType) – The time constant of the synaptic decay phase. Default is 100 [ms].

  • tau_rise (float, ArrayType) – The time constant of the synaptic rise phase. Default is 2 [ms].

  • a (float, ArrayType) – The rate at which the rise variable \(x\) opens the gating variable \(g\) (see the \(g\) dynamics above). Default is 0.5 [ms^-1].

  • name (str) – The name of this synaptic projection.

  • method (str) – The numerical integration methods.
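As an illustration of how these parameters fit together, the sketch below builds an NMDA projection with sparse connectivity and a conduction delay; the 3.0 ms delay, the FixedProb(0.2) connector, and the population sizes are arbitrary choices, not defaults of this class.

>>> import brainpy as bp
>>> import brainpy.math as bm
>>> from brainpy import neurons, synapses
>>>
>>> pre, post = neurons.HH(10), neurons.HH(10)
>>> delay_step = int(3.0 / bm.get_dt())   # a 3.0 ms delay expressed in time steps
>>> syn = synapses.NMDA(pre, post, bp.connect.FixedProb(0.2),
...                     comp_method='sparse', delay_step=delay_step,
...                     g_max=0.15, tau_decay=100., tau_rise=2.)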

References

[1] Brunel N, Wang XJ. Effects of neuromodulation in a cortical network model of object working memory dominated by recurrent inhibition. Journal of Computational Neuroscience, 2001, 11(1): 63-85.

[2] Furukawa H, Singh SK, Mancusso R, Gouaux E. Subunit arrangement and function in NMDA receptors. Nature, 2005, 438(7065): 185-192.

[3] Li F, Tsien JZ. Memory and the NMDA receptors. The New England Journal of Medicine, 2009, 361(3): 302.

[4] https://en.wikipedia.org/wiki/NMDA_receptor

__init__(pre, post, conn, output=MgBlock, stp=None, comp_method='dense', g_max=0.15, delay_step=None, tau_decay=100.0, a=0.5, tau_rise=2.0, method='exp_auto', name=None, mode=None, stop_spike_gradient=False)[source]#

Methods

  • __init__(pre, post, conn[, output, stp, ...])

  • check_post_attrs(*attrs) – Check whether the post group satisfies the requirement.

  • check_pre_attrs(*attrs) – Check whether the pre group satisfies the requirement.

  • clear_input()

  • cpu() – Move all variables into the CPU device.

  • cuda() – Move all variables into the GPU device.

  • dg(g, t, x)

  • dx(x, t)

  • get_delay_data(identifier, delay_step, *indices) – Get delay data according to the provided delay steps.

  • load_state_dict(state_dict[, warn]) – Copy parameters and buffers from state_dict into this module and its descendants.

  • load_states(filename[, verbose]) – Load the model states.

  • nodes([method, level, include_self]) – Collect all children nodes.

  • offline_fit(target, fit_record)

  • offline_init()

  • online_fit(target, fit_record)

  • online_init()

  • register_delay(identifier, delay_step, ...) – Register a delay variable.

  • register_implicit_nodes(*nodes[, node_cls])

  • register_implicit_vars(*variables, ...)

  • reset([batch_size]) – Reset function which resets all variables in the model.

  • reset_local_delays([nodes]) – Reset local delay variables.

  • reset_state([batch_size]) – Reset function which resets the states in the model.

  • save_states(filename[, variables]) – Save the model states.

  • state_dict() – Returns a dictionary containing the whole state of the module.

  • to(device) – Moves all variables into the given device.

  • tpu() – Move all variables into the TPU device.

  • train_vars([method, level, include_self]) – The shortcut for retrieving all trainable variables.

  • tree_flatten() – Flattens the object as a PyTree.

  • tree_unflatten(aux, dynamic_values) – New in version 2.3.1.

  • unique_name([name, type_]) – Get the unique name for this object.

  • update(tdi[, pre_spike]) – The function to specify the updating rule.

  • update_local_delays([nodes]) – Update local delay variables.

  • vars([method, level, include_self, ...]) – Collect all variables in this node and the children nodes.

Attributes

  • global_delay_data – Global delay data, which stores the delay variables and corresponding delay targets.

  • mode – Mode of the model, which is used to control the multiple behaviors of the model.

  • name – Name of the model.