ExpIF#

class brainpy.dyn.ExpIF(size, sharding=None, keep_size=False, mode=None, name=None, spk_fun=InvSquareGrad(alpha=100.0), spk_dtype=None, spk_reset='soft', detach_spk=False, method='exp_auto', init_var=True, scaling=None, V_rest=-65.0, V_reset=-68.0, V_th=-55.0, V_T=-59.9, delta_T=3.48, R=1.0, tau=10.0, V_initializer=ZeroInit)[source]#

Exponential integrate-and-fire neuron model.

Model Descriptions

In the exponential integrate-and-fire model [1], the differential equation for the membrane potential is given by

\[\begin{split}\tau\frac{d V}{d t}= - (V-V_{rest}) + \Delta_T e^{\frac{V-V_T}{\Delta_T}} + RI(t), \\ \text{after}\; V(t) > V_{th},\; V(t) \rightarrow V_{reset} \;\text{for the next}\; \tau_{ref}\; \text{ms}\end{split}\]

This equation has an exponential nonlinearity with “sharpness” parameter \(\Delta_{T}\) and “threshold” \(V_T\) (the rheobase threshold, often written \(\vartheta_{rh}\) in the literature).

The moment when the membrane potential reaches the numerical threshold \(V_{th}\) defines the firing time \(t^{(f)}\). After firing, the membrane potential is reset to \(V_{reset}\) and integration restarts at time \(t^{(f)}+\tau_{\rm ref}\), where \(\tau_{\rm ref}\) is an absolute refractory time. If the numerical threshold is chosen sufficiently high, \(V_{th}\gg V_T+\Delta_T\), its exact value does not play any role. The reason is that the upswing of the action potential for \(V\gg V_T+\Delta_{T}\) is so rapid that it diverges to infinity in a very short time. The threshold \(V_{th}\) is introduced mainly for numerical convenience. For a formal mathematical analysis of the model, the threshold can be pushed to infinity.
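The dynamics and reset rule above can be sketched with a plain explicit-Euler loop. This is an illustration using the constructor defaults, not the library's integrator (BrainPy integrates with the scheme selected by the `method` argument):

```python
import math

# Parameter values taken from the constructor defaults above.
V_rest, V_reset, V_th = -65.0, -68.0, -55.0
V_T, delta_T, R, tau = -59.9, 3.48, 1.0, 10.0

def simulate(I, duration=300.0, dt=0.1):
    """Euler-integrate the ExpIF ODE under a constant current I; return spike times (ms)."""
    V = V_rest
    spikes = []
    for i in range(int(duration / dt)):
        dV = (-(V - V_rest) + delta_T * math.exp((V - V_T) / delta_T) + R * I) / tau
        V += dt * dV
        if V > V_th:              # numerical threshold crossed: record spike, reset
            spikes.append(i * dt)
            V = V_reset
    return spikes

# A sufficiently strong constant current drives repetitive firing,
# while zero input leaves the membrane at a stable subthreshold fixed point.
print(len(simulate(I=12.0)))
print(len(simulate(I=0.0)))
```

For simplicity this sketch omits the absolute refractory period; after each reset, integration resumes immediately on the next step.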

The model was first introduced by Nicolas Fourcaud-Trocmé, David Hansel, Carl van Vreeswijk and Nicolas Brunel [1]. The exponential nonlinearity was later confirmed experimentally by Badel et al. [3]. It is one of the prominent examples of a precise theoretical prediction in computational neuroscience that was later confirmed by experiment.

Two important remarks:

  • (i) The right-hand side of the above equation contains a nonlinearity that can be directly extracted from experimental data [3]. In this sense the exponential nonlinearity is not an arbitrary choice but directly supported by experimental evidence.

  • (ii) Even though it is a nonlinear model, it is simple enough to calculate the firing rate for constant input, and the linear response to fluctuations, even in the presence of input noise [4].
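Remark (i) can be probed numerically. The nonlinearity on the right-hand side is \(F(V) = -(V-V_{rest}) + \Delta_T e^{(V-V_T)/\Delta_T}\); the small pure-Python check below (an illustration, not library code, using the constructor defaults) shows that far below \(V_T\) the linear leak term dominates, while above \(V_T\) the exponential term takes over and drives the upswing:

```python
import math

# Constructor defaults from the signature above.
V_rest, V_T, delta_T = -65.0, -59.9, 3.48

def exp_fraction(V):
    """Share of the nonlinearity's magnitude contributed by the exponential term."""
    e = delta_T * math.exp((V - V_T) / delta_T)
    return e / (abs(V - V_rest) + e)

print(exp_fraction(-75.0))  # well below V_T: exponential term is negligible
print(exp_fraction(-50.0))  # above V_T: exponential term dominates
```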

References

[1] Fourcaud-Trocmé, N., Hansel, D., van Vreeswijk, C., & Brunel, N. (2003). How spike generation mechanisms determine the neuronal response to fluctuating inputs. Journal of Neuroscience, 23(37), 11628–11640.

[3] Badel, L., Lefort, S., Brette, R., Petersen, C. C., Gerstner, W., & Richardson, M. J. (2008). Dynamic I-V curves are reliable predictors of naturalistic pyramidal-neuron voltage traces. Journal of Neurophysiology, 99(2), 656–666.

[4] Richardson, M. J. (2007). Firing-rate response of linear and nonlinear integrate-and-fire neurons to modulated current-based and conductance-based synaptic drive. Physical Review E, 76(2), 021919.

Examples

There is a simple usage example:

import brainpy as bp

# an ExpIF neuron group with a single neuron (default parameters)
neu = bp.dyn.ExpIF(1)

# section input: 0 for 100 ms, 5 for 300 ms, then 0 for 100 ms
inputs = bp.inputs.section_input([0., 5., 0.], [100., 300., 100.])

# run the simulation while monitoring the membrane potential
runner = bp.DSRunner(neu, monitors=['V'])
runner.run(inputs=inputs)

# plot the recorded membrane potential over time
bp.visualize.line_plot(runner.mon['ts'], runner.mon['V'], show=True)

Model Parameters

Parameter   Init Value   Unit   Explanation
V_rest      -65          mV     Resting potential.
V_reset     -68          mV     Reset potential after spike.
V_th        -55          mV     Threshold potential of spike.
V_T         -59.9        mV     Threshold potential of generating action potential.
delta_T     3.48         mV     Spike slope factor.
R           1                   Membrane resistance.
tau         10           ms     Membrane time constant. Computed as R * C.
tau_ref     1.7          ms     Refractory period length.

Model Variables

Variable       Initial Value   Explanation
V              0               Membrane potential.
input          0               External and synaptic input current.
spike          False           Flag to mark whether the neuron is spiking.
refractory     False           Flag to mark whether the neuron is in the refractory period.
t_last_spike   -1e7            Last spike time stamp.

Parameters:
  • size (TypeVar(Shape, int, Tuple[int, ...])) – int, or sequence of int. The neuronal population size.

  • sharding (Optional[Sequence[str]]) – The sharding strategy.

  • keep_size (bool) – bool. Keep the neuron group size.

  • mode (Optional[Mode]) – Mode. The computing mode.

  • name (Optional[str]) – str. The group name.

  • spk_fun (Callable) – callable. The spike activation function.

  • detach_spk (bool) – bool.

  • method (str) – str. The numerical integration method.

  • spk_dtype – The spike data type.

  • spk_reset (str) – The way to reset the membrane potential when the neuron generates spikes. This parameter only works when the computing mode is TrainingMode. It can be either soft or hard. Default is soft.
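The difference between the two reset modes can be illustrated with plain arithmetic. This is a sketch under the common surrogate-gradient convention that a soft reset subtracts the threshold while a hard reset clamps to V_reset; the library's exact internal formula is an assumption here:

```python
# Constructor defaults from the signature above.
V_reset, V_th = -68.0, -55.0

def reset(V, spk_reset='soft'):
    """Illustrative reset rule (assumed convention, not the library's internal code)."""
    if spk_reset == 'hard':
        return V_reset       # clamp to the reset potential
    return V - V_th          # keep only the residual overshoot above threshold

# a membrane potential just above threshold
print(reset(-54.0, 'hard'))  # hard reset: back to V_reset
print(reset(-54.0, 'soft'))  # soft reset: only the overshoot remains
```

The soft variant is differentiable in V, which is why it is the default in TrainingMode.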

update(x=None)[source]#

The function to specify the updating rule.