Dynamical System Specification#

@Tianqiu Zhang @Chaoming Wang

BrainPy enables modular programming and easy model debugging. To build a complex brain dynamics model, you just need to compose its building blocks. In this section, we will talk about what building blocks are provided and how to use them.

import brainpy as bp
import brainpy.math as bm

bm.set_platform('cpu')

Models in brainpy.dyn#

brainpy.dyn provides many convenient neuron, synapse, and other models. The following figure gives a glimpse of the provided models.

The arrows in the figure represent the inheritance relations between different models.

New models are added continuously; please refer to the API documentation page for the up-to-date list.
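
For example, these inheritance relations can be checked directly with Python's built-in issubclass. This is only a quick illustrative check, using models that appear later in this tutorial:

# all built-in neuron models derive from brainpy.dyn.NeuGroup
print(issubclass(bp.dyn.HH, bp.dyn.NeuGroup))   # True
print(issubclass(bp.dyn.LIF, bp.dyn.NeuGroup))  # True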

Initializing a neuron model#

All neuron models implemented in brainpy are subclasses of brainpy.dyn.NeuGroup. To initialize a neuron model, you only need to provide the geometric size of the neuron population.

hh = bp.dyn.HH(size=1)  # only 1 neuron

hh = bp.dyn.HH(size=10)  # 10 neurons in a group

hh = bp.dyn.HH(size=(10, 10))  # a grid of (10, 10) neurons in a group

hh = bp.dyn.HH(size=(5, 4, 2))  # a column of (5, 4, 2) neurons in a group
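
The size argument determines both the geometry and the total number of neurons in the group, which can be checked afterwards (a quick illustrative check; .size and .num are the group attributes assumed to store this information):

print(hh.size)  # (5, 4, 2)
print(hh.num)   # 40, the total number of neurons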

Generally speaking, there are two types of arguments that can be set by users:

  • parameters: the model parameters, such as gNa, the maximum conductance of the sodium channel in the brainpy.dyn.HH model.

  • variables: the model variables, such as V, the membrane potential of a neuron model (see the example after this list).
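
Both kinds of arguments can be passed when a model is constructed. Below is a minimal sketch; the concrete values are arbitrary, and bp.init.OneInit is introduced later in this section:

# set the parameter gNa and the initial membrane potential at construction time
hh = bp.dyn.HH(5, gNa=100., V_initializer=bp.init.OneInit(-70.))
print(hh.gNa)  # 100.0
print(hh.V)    # Variable([-70., -70., -70., -70., -70.])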

By default, model parameters are homogeneous, i.e., they are simply scalar values.

hh = bp.dyn.HH(5)  # there are five neurons in this group

hh.gNa
120.0

However, neuron models support heterogeneous parameters when performing computations in a neuron group. Heterogeneous parameters can be initialized in several ways.

1. Tensor

Users can directly provide a tensor as the parameter.

hh = bp.dyn.HH(5, gNa=bm.random.uniform(110, 130, size=5))

hh.gNa
JaxArray([114.53795, 127.13995, 119.036  , 110.91665, 117.91266], dtype=float32)

2. Initializer

BrainPy provides rich support for initialization. One can provide an Initializer as the parameter value to instruct the model to initialize heterogeneous parameters.

hh = bp.dyn.HH(5, ENa=bp.init.OneInit(50.))

hh.ENa
JaxArray([50., 50., 50., 50., 50.], dtype=float32)

3. Callable function

You can also directly provide a callable function that receives a shape argument.

hh = bp.dyn.HH(5, ENa=lambda shape: bm.random.uniform(40, 60, shape))

hh.ENa
JaxArray([52.201824, 52.322166, 44.033783, 47.943596, 54.985268], dtype=float32)

Now, let's see how heterogeneous parameters influence the model simulation.

# we create 3 neurons in a group. Each neuron has a unique "gNa"

model = bp.dyn.HH(3, gNa=bp.init.Uniform(min_val=100, max_val=140))
runner = bp.dyn.DSRunner(model, monitors=['V'], inputs=['input', 5.])
runner.run(100.)

bp.visualize.line_plot(runner.mon.ts, runner.mon.V, plot_ids=[0, 1, 2], show=True)
[Figure: membrane potential traces of the three neurons, each with a different gNa]

Similarly, the initial values of variables can also be set in the three ways above: Tensor, Initializer, and Callable function. For example,

hh = bp.dyn.HH(
   3,
   V_initializer=bp.init.Uniform(-80., -60.),  # Initializer
   m_initializer=lambda shape: bm.random.random(shape),  # function
   h_initializer=bm.random.random(3),  # Tensor
)
print('V: ', hh.V)
print('m: ', hh.m)
print('h: ', hh.h)
V:  Variable([-77.707954, -73.94804 , -69.09014 ], dtype=float32)
m:  Variable([0.4219371, 0.5383264, 0.8984035], dtype=float32)
h:  Variable([0.61493886, 0.81473637, 0.3291837 ], dtype=float32)

Initializing a synapse model#

Initializing a synapse model requires its pre-synaptic group (pre), post-synaptic group (post), and the connection method between them (conn). Below is an example of creating an Exponential synapse model:

neu = bp.dyn.LIF(10)

# here we create a synaptic projection within a population
syn = bp.dyn.ExpCUBA(pre=neu, post=neu, conn=bp.conn.All2All())

BrainPy's built-in synapse models support heterogeneous synaptic weights and delay steps through Tensor, Initializer, and Callable function. For example,

syn = bp.dyn.ExpCUBA(neu, neu, bp.conn.FixedProb(prob=0.1),
                     g_max=bp.init.Uniform(min_val=0.1, max_val=1.),
                     delay_step=lambda shape: bm.random.randint(10, 30, shape))
syn.g_max
JaxArray([0.9790364 , 0.18719104, 0.84017825, 0.31185275, 0.38157037,
          0.80953383, 0.61926776, 0.73845625, 0.9679548 , 0.385096  ,
          0.91454816], dtype=float32)
syn.delay_step
JaxArray([18, 19, 15, 21, 17, 24, 10, 27, 12, 20], dtype=int32)

However, the built-in synapse models in BrainPy only support homogeneous synaptic parameters, such as the time constant \(\tau\). Users can customize their own synapse models when they need heterogeneous synaptic parameters, as sketched below.
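
For instance, a synapse with one decay time constant per post-synaptic neuron can be written as a small custom class. The code below is only a minimal, illustrative sketch: the class name HeteroTauExpCUBA is hypothetical, and the base class bp.dyn.TwoEndConn, the dense conn_mat connectivity, the omission of delays, and the update signature are assumptions that may need adjusting for your BrainPy version.

class HeteroTauExpCUBA(bp.dyn.TwoEndConn):
  """Current-based exponential synapse with a per-neuron time constant (sketch)."""

  def __init__(self, pre, post, conn, g_max=1., tau=8., **kwargs):
    super(HeteroTauExpCUBA, self).__init__(pre=pre, post=post, conn=conn, **kwargs)
    # dense connection matrix of shape (pre.num, post.num), cast to float
    self.conn_mat = bm.asarray(self.conn.require('conn_mat'), dtype=float)
    self.g_max = g_max
    # heterogeneous time constants: a scalar or an array with one value per post neuron
    self.tau = tau * bm.ones(post.num)
    # synaptic conductance, one value per post-synaptic neuron
    self.g = bm.Variable(bm.zeros(post.num))
    # exponential decay using the element-wise tau
    self.integral = bp.odeint(lambda g, t: -g / self.tau)

  def update(self, tdi):  # NOTE: the update signature differs across BrainPy versions
    self.g.value = self.integral(self.g, tdi.t, dt=tdi.dt)
    # add the contribution of pre-synaptic spikes (synaptic delays omitted for brevity)
    pre_spike = bm.where(self.pre.spike, 1., 0.)
    self.g += self.g_max * (pre_spike @ self.conn_mat)
    self.post.input += self.g

syn = HeteroTauExpCUBA(neu, neu, bp.conn.FixedProb(prob=0.1),
                       tau=bm.random.uniform(5., 10., size=neu.num))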

Similarly, the synaptic variables can be initialized heterogeneously by using Tensor, Initializer, and Callable functions.
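
For example, the conductance variable g of a synapse can be filled with heterogeneous values by calling an Initializer directly and assigning the result in place (a minimal sketch; it reuses the in-place assignment pattern shown in the next subsection):

syn = bp.dyn.ExpCUBA(neu, neu, bp.conn.FixedProb(prob=0.1))
# overwrite the initial conductance with values drawn from an Initializer
syn.g[:] = bp.init.Uniform(min_val=0., max_val=1.)(syn.g.shape)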

Changing model parameters during simulation#

In BrainPy, all dynamically changing variables (whether they are changed inside or outside a JIT-compiled function) should be marked as brainpy.math.Variable. BrainPy's built-in models also support modifying model parameters during simulation.

For example, suppose you want to keep gNa fixed during the first 100 ms of simulation and then decrease its value in the following runs. In this case, you can provide gNa as an instance of brainpy.math.Variable when initializing the model.

hh = bp.dyn.HH(5, gNa=bm.Variable(bm.asarray([120.])))

runner = bp.dyn.DSRunner(hh, monitors=['V'], inputs=['input', 5.])
# the first running
runner.run(100.)
bp.visualize.line_plot(runner.mon.ts, runner.mon.V, show=True)
[Figure: membrane potential trace during the first 100-ms run]
# change the gNa first
hh.gNa[:] = 100.

# the second running
runner.run(100.)
bp.visualize.line_plot(runner.mon.ts, runner.mon.V, show=True)
[Figure: membrane potential trace during the second run, after gNa is decreased]

Examples of using built-in models#

Here we show how to simulate a famous neuron model: the Morris-Lecar model, a two-dimensional “reduced” excitation model applicable to systems having two non-inactivating voltage-sensitive conductances.

group = bp.dyn.MorrisLecar(1)

Then users can utilize various tools provided by BrainPy to easily simulate the Morris-Lecar neuron model. We will not dive into details here, so please read the corresponding tutorials if you want to learn more.

runner = bp.dyn.DSRunner(group, monitors=['V', 'W'], inputs=('input', 100.))
runner.run(1000)

fig, gs = bp.visualize.get_figure(2, 1, 3, 8)
fig.add_subplot(gs[0, 0])
bp.visualize.line_plot(runner.mon.ts, runner.mon.W, ylabel='W')
fig.add_subplot(gs[1, 0])
bp.visualize.line_plot(runner.mon.ts, runner.mon.V, ylabel='V', show=True)
[Figure: time courses of W (top) and V (bottom) of the Morris-Lecar neuron]

Next, we give an intuitive example of building a network composed of different neuron and synapse models. Users can simply initialize these models as below and pass them into brainpy.dyn.Network.

neu1 = bp.dyn.HH(1)
neu2 = bp.dyn.HH(1)
syn1 = bp.dyn.AMPA(neu1, neu2, bp.connect.All2All())
net = bp.dyn.Network(pre=neu1, syn=syn1, post=neu2)

By selecting a proper runner, users can simulate the network efficiently and plot the simulation results.

runner = bp.dyn.DSRunner(net, inputs=[('pre.input', 5.)], monitors=['pre.V', 'post.V', 'syn.g'])
runner.run(150.)

import matplotlib.pyplot as plt

fig, gs = bp.visualize.get_figure(2, 1, 3, 8)
fig.add_subplot(gs[0, 0])
plt.plot(runner.mon.ts, runner.mon['pre.V'], label='pre-V')
plt.plot(runner.mon.ts, runner.mon['post.V'], label='post-V')
plt.legend()

fig.add_subplot(gs[1, 0])
plt.plot(runner.mon.ts, runner.mon['syn.g'], label='g')
plt.legend()
plt.show()
[Figure: pre- and post-synaptic membrane potentials (top) and synaptic conductance g (bottom)]