BioNMDA#

class brainpy.dyn.BioNMDA(size, keep_size=False, sharding=None, method='exp_auto', name=None, mode=None, alpha1=2.0, beta1=0.01, alpha2=1.0, beta2=0.5, T=1.0, T_dur=0.5)[source]#

Biological NMDA synapse model.

Model Descriptions

The NMDA receptor is a glutamate receptor and ion channel found in neurons. The NMDA receptor is one of three types of ionotropic glutamate receptors, the other two being AMPA and kainate receptors.

The NMDA receptor mediated conductance depends on the postsynaptic voltage. The voltage dependence is due to the blocking of the pore of the NMDA receptor from the outside by a positively charged magnesium ion. The channel is nearly completely blocked at resting potential, but the magnesium block is relieved if the cell is depolarized. The fraction of channels \(g_{\infty}\) that are not blocked by magnesium can be fitted to

\[g_{\infty}(V,[{Mg}^{2+}]_{o}) = (1+{e}^{-a V} \frac{[{Mg}^{2+}]_{o}} {b})^{-1}\]

Here \([{Mg}^{2+}]_{o}\) is the extracellular magnesium concentration, usually 1 mM. Because opening requires both presynaptic glutamate release and postsynaptic depolarization, the channel acts as a “coincidence detector”: only when both conditions are met does it open and allow positively charged ions (cations) to flow through the cell membrane [2].
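
As a rough numerical sketch (plain NumPy; the constants a = 0.062 per mV and b = 3.57 mM are illustrative values from the commonly used Jahr–Stevens fit, not necessarily the ones used internally by BrainPy), the fraction of unblocked channels can be evaluated as:

import numpy as np

def mg_unblock(V, Mg=1.0, a=0.062, b=3.57):
    # Fraction of NMDA channels not blocked by magnesium:
    # g_inf(V, [Mg]_o) = 1 / (1 + exp(-a*V) * [Mg]_o / b)
    return 1.0 / (1.0 + np.exp(-a * V) * Mg / b)

print(mg_unblock(-65.0))  # near rest: channels are mostly blocked
print(mg_unblock(0.0))    # depolarized: the block is largely relieved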

If we make the approximation that the magnesium block changes instantaneously with voltage and is independent of the gating of the channel, the net NMDA receptor-mediated synaptic current is given by

\[I_{syn} = g_\mathrm{NMDA}(t) (V(t)-E) \cdot g_{\infty}\]

where \(V(t)\) is the postsynaptic membrane potential and \(E\) is the reversal potential.
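
Continuing the sketch above (again with illustrative magnesium-block constants; E = 0 mV is a typical NMDA reversal potential), the instantaneous synaptic current for a given conductance value can be computed as:

import numpy as np

def nmda_current(g_nmda, V, E=0.0, Mg=1.0, a=0.062, b=3.57):
    # g_inf: fraction of channels not blocked by magnesium (illustrative constants)
    g_inf = 1.0 / (1.0 + np.exp(-a * V) * Mg / b)
    # I_syn = g_NMDA(t) * (V(t) - E) * g_inf
    return g_nmda * (V - E) * g_inf

print(nmda_current(g_nmda=0.5, V=-55.0))  # negative value: inward (depolarizing) current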

Simultaneously, the synaptic gating variable \(g\) follows second-order kinetics [1]:

\[\begin{split}& \frac{d g}{dt} = \alpha_1 x (1 - g) - \beta_1 g \\ & \frac{d x}{dt} = \alpha_2 [T] (1 - x) - \beta_2 x\end{split}\]

where \(\alpha_1, \beta_1\) are the conversion rates of the gating variable \(g\), \(\alpha_2, \beta_2\) are the conversion rates of the intermediate variable \(x\), and \([T]\) is the neurotransmitter concentration.
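
For intuition, the following minimal sketch integrates these two equations with a plain Euler step, using the default parameter values from the signature above (the class itself integrates them with the solver chosen by the method argument, 'exp_auto' by default):

def euler_step(g, x, T, dt=0.1, alpha1=2.0, beta1=0.01, alpha2=1.0, beta2=0.5):
    # dg/dt = alpha1 * x * (1 - g) - beta1 * g
    # dx/dt = alpha2 * [T] * (1 - x) - beta2 * x
    dg = alpha1 * x * (1.0 - g) - beta1 * g
    dx = alpha2 * T * (1.0 - x) - beta2 * x
    return g + dt * dg, x + dt * dx

g, x = 0.0, 0.0
for i in range(200):                      # 20 ms with dt = 0.1 ms
    T = 1.0 if i * 0.1 < 0.5 else 0.0     # transmitter lasts T_dur = 0.5 ms after a spike at t = 0
    g, x = euler_step(g, x, T)
print(g, x)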

The NMDA receptor has been thought to be very important for controlling synaptic plasticity and mediating learning and memory functions [3].

This module can be used with the interface brainpy.dyn.ProjAlignPreMg2, as shown in the following example:

import numpy as np
import brainpy as bp
import brainpy.math as bm

import matplotlib.pyplot as plt


class BioNMDA(bp.Projection):
    def __init__(self, pre, post, delay, prob, g_max, E=0.):
        super().__init__()
        self.proj = bp.dyn.ProjAlignPreMg2(
            pre=pre,
            delay=delay,
            syn=bp.dyn.BioNMDA.desc(pre.num, alpha1=2, beta1=0.01, alpha2=0.2, beta2=0.5, T=1, T_dur=1),
            comm=bp.dnn.CSRLinear(bp.conn.FixedProb(prob, pre=pre.num, post=post.num), g_max),
            out=bp.dyn.COBA(E=E),
            post=post,
        )

class SimpleNet(bp.DynSysGroup):
    def __init__(self, E=0.):
        super().__init__()

        self.pre = bp.dyn.SpikeTimeGroup(1, indices=(0, 0, 0, 0), times=(10., 30., 50., 70.))
        self.post = bp.dyn.LifRef(1, V_rest=-60., V_th=-50., V_reset=-60., tau=20., tau_ref=5.,
                                  V_initializer=bp.init.Constant(-60.))
        self.syn = BioNMDA(self.pre, self.post, delay=None, prob=1., g_max=1., E=E)

    def update(self):
        self.pre()
        self.syn()
        self.post()

        # monitor the following variables
        conductance = self.syn.proj.refs['syn'].g
        current = self.post.sum_inputs(self.post.V)
        return conductance, current, self.post.V


indices = np.arange(1000)  # 100 ms, dt = 0.1 ms
conductances, currents, potentials = bm.for_loop(SimpleNet(E=0.).step_run, indices, progress_bar=True)
ts = indices * bm.get_dt()

fig, gs = bp.visualize.get_figure(1, 3, 3.5, 4)
fig.add_subplot(gs[0, 0])
plt.plot(ts, conductances)
plt.title('Syn conductance')
fig.add_subplot(gs[0, 1])
plt.plot(ts, currents)
plt.title('Syn current')
fig.add_subplot(gs[0, 2])
plt.plot(ts, potentials)
plt.title('Post V')
plt.show()
Parameters:

    size (int, sequence of int) – The size of the synaptic variables (typically the number of pre-synaptic neurons, as in pre.num in the example above).
    keep_size (bool) – Whether to keep the multi-dimensional size. Default False.
    sharding – The sharding strategy of the variables.
    method (str) – The numerical integration method. Default 'exp_auto'.
    name (str) – The name of the object.
    mode (bm.Mode) – The computing mode.
    alpha1 (float, ArrayType) – The conversion rate of g from inactive to active. Default 2.0.
    beta1 (float, ArrayType) – The conversion rate of g from active to inactive. Default 0.01.
    alpha2 (float, ArrayType) – The conversion rate of x from inactive to active. Default 1.0.
    beta2 (float, ArrayType) – The conversion rate of x from active to inactive. Default 0.5.
    T (float, ArrayType) – The neurotransmitter concentration when triggered by a pre-synaptic spike. Default 1.0.
    T_dur (float, ArrayType) – The duration of the neurotransmitter pulse after a pre-synaptic spike. Default 0.5.
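
For reference, the model-specific parameters above can also be passed when instantiating the synapse directly; the values below simply restate the defaults from the signature:

import brainpy as bp

syn = bp.dyn.BioNMDA(10, alpha1=2.0, beta1=0.01, alpha2=1.0, beta2=0.5, T=1.0, T_dur=0.5)
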
supported_modes: Optional[Sequence[bm.Mode]] = (<class 'brainpy._src.math.modes.NonBatchingMode'>, <class 'brainpy._src.math.modes.BatchingMode'>)#

Supported computing modes.

update(pre_spike)[source]#

The function to specify the updating rule, given the pre-synaptic spike events pre_spike at the current time step.