# brainpy.dyn.neurons.ExpIF#

class brainpy.dyn.neurons.ExpIF(size, V_rest=-65.0, V_reset=-68.0, V_th=-30.0, V_T=-59.9, delta_T=3.48, R=1.0, tau=10.0, tau_ref=1.7, V_initializer=ZeroInit, keep_size=False, method='exp_auto', name=None)[source]#

Exponential integrate-and-fire neuron model.

Model Descriptions

In the exponential integrate-and-fire model [1], the differential equation for the membrane potential is given by

$$\begin{split}\tau\frac{dV}{dt} &= -(V - V_{rest}) + \Delta_T e^{\frac{V - V_T}{\Delta_T}} + RI(t), \\ \text{after } V(t) &> V_{th}, \quad V(t) = V_{reset} \text{ for the next } \tau_{ref} \text{ ms}\end{split}$$

This equation has an exponential nonlinearity with "sharpness" parameter $\Delta_T$ and "threshold" $V_T$ (the rheobase threshold, often written $\vartheta_{rh}$).

The moment when the membrane potential reaches the numerical threshold $V_{th}$ defines the firing time $t^{(f)}$. After firing, the membrane potential is reset to $V_{reset}$ and integration restarts at time $t^{(f)}+\tau_{\rm ref}$, where $\tau_{\rm ref}$ is an absolute refractory time. If the numerical threshold is chosen sufficiently high, $V_{th}\gg V_T+\Delta_T$, its exact value does not play any role: the upswing of the action potential for $V\gg V_T+\Delta_T$ is so rapid that it diverges to infinity in a very short time. The threshold $V_{th}$ is introduced mainly for numerical convenience. For a formal mathematical analysis of the model, the threshold can be pushed to infinity.
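The dynamics described above can be sketched numerically. The following is a minimal forward-Euler sketch using the documented default parameters; `simulate`, its step size, and the clamp-during-refractory choice are illustrative assumptions, not BrainPy's internal integration scheme.

```python
import numpy as np

# Documented default parameters of the ExpIF model.
V_rest, V_reset, V_th = -65.0, -68.0, -30.0
V_T, delta_T, R, tau, tau_ref = -59.9, 3.48, 1.0, 10.0, 1.7

def simulate(I_ext, duration=300.0, dt=0.1):
    """Forward-Euler integration of the ExpIF equation; returns spike times (ms)."""
    V = V_rest
    t_last_spike = -1e7        # mirrors the model variable t_last_spike
    spike_times = []
    for i in range(int(duration / dt)):
        t = i * dt
        if t - t_last_spike < tau_ref:
            continue           # hold V fixed during the absolute refractory period
        dV = (-(V - V_rest) + delta_T * np.exp((V - V_T) / delta_T) + R * I_ext) / tau
        V += dt * dV
        if V >= V_th:          # numerical threshold reached -> spike and reset
            spike_times.append(t)
            V = V_reset
            t_last_spike = t
    return spike_times

print(len(simulate(10.0)))     # a constant suprathreshold current fires repeatedly
```

Note how the exact value of `V_th` barely matters: once $V$ passes $V_T + \Delta_T$, the exponential term dominates and the potential diverges within a step or two.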

The model was first introduced by Nicolas Fourcaud-Trocmé, David Hansel, Carl van Vreeswijk and Nicolas Brunel [1]. The exponential nonlinearity was later confirmed experimentally by Badel et al. [3]. It is one of the prominent examples of a precise theoretical prediction in computational neuroscience that was later confirmed by experimental neuroscience.

Two important remarks:

• (i) The right-hand side of the above equation contains a nonlinearity that can be directly extracted from experimental data [3]. In this sense the exponential nonlinearity is not an arbitrary choice but is directly supported by experimental evidence.

• (ii) Even though it is a nonlinear model, it is simple enough to allow calculation of the firing rate for constant input, and of the linear response to fluctuations, even in the presence of input noise [4].
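Remark (ii) concerns analytical results; the firing rate for constant input can also be estimated numerically. A self-contained sketch (plain NumPy forward Euler with the documented defaults; `firing_rate` and its step size are illustrative choices):

```python
import numpy as np

# Documented default parameters of the ExpIF model.
V_rest, V_reset, V_th = -65.0, -68.0, -30.0
V_T, delta_T, R, tau, tau_ref = -59.9, 3.48, 1.0, 10.0, 1.7

def firing_rate(I_ext, duration=1000.0, dt=0.05):
    """Spikes per second for a constant input current I_ext (duration in ms)."""
    V, t_last_spike, n_spikes = V_rest, -1e7, 0
    for i in range(int(duration / dt)):
        t = i * dt
        if t - t_last_spike < tau_ref:
            continue           # absolute refractory period
        dV = (-(V - V_rest) + delta_T * np.exp((V - V_T) / delta_T) + R * I_ext) / tau
        V += dt * dV
        if V >= V_th:
            n_spikes += 1
            V, t_last_spike = V_reset, t
    return n_spikes * 1000.0 / duration

for I in (5.0, 10.0, 20.0):
    print(I, firing_rate(I))   # rate grows monotonically with the input current
```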

Model Examples

>>> import brainpy as bp
>>> group = bp.dyn.ExpIF(1)
>>> runner = bp.dyn.DSRunner(group, monitors=['V'], inputs=('input', 10.))
>>> runner.run(300.)
>>> bp.visualize.line_plot(runner.mon.ts, runner.mon.V, ylabel='V', show=True)


Model Parameters

| Parameter | Init Value | Unit | Explanation |
| --- | --- | --- | --- |
| V_rest | -65 | mV | Resting potential. |
| V_reset | -68 | mV | Reset potential after spike. |
| V_th | -30 | mV | Threshold potential of spike. |
| V_T | -59.9 | mV | Threshold potential of generating action potential. |
| delta_T | 3.48 | mV | Spike slope factor. |
| R | 1 | | Membrane resistance. |
| tau | 10 | ms | Membrane time constant. Compute by R * C. |
| tau_ref | 1.7 | ms | Refractory period length. |
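The spike slope factor `delta_T` controls how sharply the exponential term switches on around $V_T$. A quick standalone illustration (the `V_T` value is the documented default; the smaller `delta_T` value is varied purely for comparison):

```python
import numpy as np

V_T = -59.9                                  # documented default
voltages = np.array([-62.0, -60.0, -58.0, -56.0])

for dT in (0.5, 3.48):
    # The exponential term of the ExpIF right-hand side at several voltages.
    term = dT * np.exp((voltages - V_T) / dT)
    print(f"delta_T={dT}:", np.round(term, 3))
```

With a small `delta_T` the term is negligible below $V_T$ and explodes just above it (a sharp, LIF-like threshold); with the default 3.48 mV the onset of the spike upswing is more gradual.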

Model Variables

| Variables name | Initial Value | Explanation |
| --- | --- | --- |
| V | 0 | Membrane potential. |
| input | 0 | External and synaptic input current. |
| spike | False | Flag to mark whether the neuron is spiking. |
| refractory | False | Flag to mark whether the neuron is in refractory period. |
| t_last_spike | -1e7 | Last spike time stamp. |

References

[1] Fourcaud-Trocmé, Nicolas, et al. "How spike generation mechanisms determine the neuronal response to fluctuating inputs." Journal of Neuroscience 23.37 (2003): 11628-11640.

[2] Gerstner, W., Kistler, W. M., Naud, R., & Paninski, L. (2014). Neuronal dynamics: From single neurons to networks and models of cognition. Cambridge University Press.

[3] Badel, Laurent, Sandrine Lefort, Romain Brette, Carl CH Petersen, Wulfram Gerstner, and Magnus JE Richardson. "Dynamic I-V curves are reliable predictors of naturalistic pyramidal-neuron voltage traces." Journal of Neurophysiology 99, no. 2 (2008): 656-666.

[4] Richardson, Magnus JE. "Firing-rate response of linear and nonlinear integrate-and-fire neurons to modulated current-based and conductance-based synaptic drive." Physical Review E 76, no. 2 (2007): 021919.

[5] https://en.wikipedia.org/wiki/Exponential_integrate-and-fire

__init__(size, V_rest=-65.0, V_reset=-68.0, V_th=-30.0, V_T=-59.9, delta_T=3.48, R=1.0, tau=10.0, tau_ref=1.7, V_initializer=ZeroInit, keep_size=False, method='exp_auto', name=None)[source]#

Methods

| Method | Description |
| --- | --- |
| `__init__(size[, V_rest, V_reset, V_th, V_T, ...])` | Initialize the neuron group. |
| `derivative(V, t, I_ext)` | Derivative of the membrane potential. |
| `get_delay_data(name, delay_step, *indices)` | Get delay data according to the provided delay steps. |
| `ints([method])` | Collect all integrators in this node and the children nodes. |
| `load_states(filename[, verbose])` | Load the model states. |
| `nodes([method, level, include_self])` | Collect all children nodes. |
| `register_delay(name, delay_step, delay_target)` | Register delay variable. |
| `register_implicit_nodes(nodes)` | |
| `register_implicit_vars(variables)` | |
| `reset()` | Reset function which resets all variables in the model. |
| `save_states(filename[, variables])` | Save the model states. |
| `train_vars([method, level, include_self])` | The shortcut for retrieving all trainable variables. |
| `unique_name([name, type_])` | Get the unique name for this object. |
| `update(t, dt)` | The function to specify the updating rule. |
| `vars([method, level, include_self])` | Collect all variables in this node and the children nodes. |

Attributes

• `global_delay_targets`

• `global_delay_vars`

• `name`

• `steps`