Spiking Network Model Builder
Purpose
This skill encodes expert methodological knowledge for constructing biologically realistic spiking neural network simulations. A competent programmer without computational neuroscience training will get this wrong because:
- Neuron model choice determines what phenomena can emerge. A leaky integrate-and-fire (LIF) neuron cannot produce bursting, adaptation, or rebound spikes. If your phenomenon depends on these, you need an AdEx or Izhikevich model, not just a "more complex" model (Izhikevich, 2004).
- E/I balance is not optional. Cortical networks maintain a tight excitation/inhibition balance. Networks without proper E/I ratio produce either silence or epileptiform runaway activity, neither of which is biologically realistic (Brunel, 2000).
- Synaptic time constants encode biology. AMPA (fast, ~5 ms), NMDA (slow, ~100 ms), and GABA_A (~10 ms) receptors have fundamentally different dynamics. Using a single generic synapse model erases critical temporal structure (Dayan & Abbott, 2001).
- Time step selection affects correctness. Too-large integration steps cause LIF neurons to miss spikes and HH neurons to become numerically unstable. The correct step depends on the neuron model, not general ODE intuition (Rotter & Diesmann, 1999).
- Weight scaling must respect network size. Naive weight choices produce firing rates that change with network size. Balanced networks require 1/sqrt(N) scaling (Brunel, 2000).
When to Use This Skill
- Constructing a spiking neural network simulation for a research question
- Choosing a neuron model appropriate for the phenomenon of interest
- Setting biologically constrained connectivity parameters
- Implementing synaptic plasticity (STDP, homeostatic, etc.)
- Validating model outputs against known cortical statistics
- Selecting simulation software and numerical parameters
Do NOT use this skill for:
- Rate-based neural network models (use standard ML/deep learning frameworks)
- Detailed compartmental modeling of single neurons (use NEURON with morphological data)
- Analyzing experimental neural data (see neural-population-analysis-guide)
Research Planning Protocol
Before executing the domain-specific steps below, you MUST:
- State the research question -- What specific question is this analysis/paradigm addressing?
- Justify the method choice -- Why is this approach appropriate? What alternatives were considered?
- Declare expected outcomes -- What results would support vs. refute the hypothesis?
- Note assumptions and limitations -- What does this method assume? Where could it mislead?
- Present the plan to the user and WAIT for confirmation before proceeding.
For detailed methodology guidance, see the research-literacy skill.
⚠️ Verification Notice
This skill was generated by AI from academic literature. All parameters, thresholds, and citations require independent verification before use in research. If you find errors, please open an issue.
Step 1: Select a Neuron Model
Neuron Model Decision Tree
What firing properties does your model need?
|
+-- "Just spikes, basic rate coding, large networks"
| --> Leaky Integrate-and-Fire (LIF)
| Simplest; fastest simulation; no adaptation or bursting
|
+-- "Spike initiation sharpness matters"
| --> Exponential IF (EIF)
| Adds realistic spike onset; still single-variable
|
+-- "Spike-frequency adaptation or bursting"
| --> Adaptive Exponential IF (AdEx)
| Two variables; can produce regular spiking, bursting,
| intrinsic oscillations, adaptation
|
+-- "Diverse firing patterns with minimal complexity"
| --> Izhikevich model
| Four parameters; 20+ firing patterns; fast to simulate
|
+-- "Biophysically detailed ion channel dynamics"
--> Hodgkin-Huxley (HH)
Four variables; channel-level accuracy; slow to simulate
Use only when ion channel pharmacology is relevant
Neuron Model Parameters
Leaky Integrate-and-Fire (LIF)
| Parameter | Symbol | Value | Source |
|---|---|---|---|
| Resting potential | V_rest | -65 mV | Dayan & Abbott, 2001 |
| Threshold | V_thresh | -50 mV | Dayan & Abbott, 2001 |
| Reset potential | V_reset | -65 mV | Dayan & Abbott, 2001 |
| Membrane time constant | tau_m | 20 ms | Dayan & Abbott, 2001 |
| Membrane resistance | R_m | 100 MOhm (typical cortical) | Dayan & Abbott, 2001 |
| Refractory period | t_ref | 2 ms (absolute) | Dayan & Abbott, 2001 |
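As a concrete reference, the table's parameters plug into a minimal forward-Euler LIF simulator like the sketch below. The constant current drive, function name, and dt = 0.1 ms are illustrative choices for this sketch; production runs should use a simulator with exact integration for LIF (Rotter & Diesmann, 1999).

```python
# LIF parameters from the table above (Dayan & Abbott, 2001)
V_REST, V_THRESH, V_RESET = -65.0, -50.0, -65.0  # mV
TAU_M, R_M, T_REF = 20.0, 100.0, 2.0             # ms, MOhm, ms

def simulate_lif(i_ext_nA, duration_ms, dt=0.1):
    """Forward-Euler LIF driven by a constant current; returns spike times (ms).

    dV/dt = (-(V - V_rest) + R_m * I_ext) / tau_m
    Units are consistent: R_m [MOhm] * I [nA] gives mV.
    """
    v = V_REST
    refractory_until = -1.0
    spikes = []
    for step in range(int(duration_ms / dt)):
        t = step * dt
        if t < refractory_until:
            v = V_RESET  # clamp during the absolute refractory period
            continue
        v += dt * (-(v - V_REST) + R_M * i_ext_nA) / TAU_M
        if v >= V_THRESH:
            spikes.append(t)
            v = V_RESET
            refractory_until = t + T_REF
    return spikes
```

Note the rheobase check built into the parameters: 0.1 nA gives a steady-state depolarization of only 10 mV (to -55 mV), below threshold, so the neuron stays silent; 0.2 nA drives it 20 mV above rest and produces regular firing.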
Exponential Integrate-and-Fire (EIF)
| Parameter | Symbol | Value | Source |
|---|---|---|---|
| All LIF parameters | -- | Same as above | Dayan & Abbott, 2001 |
| Sharpness of spike initiation | Delta_T | 2 mV | Fourcaud-Trocme et al., 2003 |
| Spike detection threshold | V_peak | 0 mV or 20 mV | Fourcaud-Trocme et al., 2003 |
Adaptive Exponential IF (AdEx)
| Parameter | Symbol | Value | Source |
|---|---|---|---|
| Subthreshold adaptation | a | 4 nS | Brette & Gerstner, 2005 |
| Spike-triggered adaptation | b | 0.08 nA (80 pA) | Brette & Gerstner, 2005 |
| Adaptation time constant | tau_w | 100--300 ms | Brette & Gerstner, 2005 |
| Spike initiation sharpness | Delta_T | 2 mV | Brette & Gerstner, 2005 |
| All EIF parameters | -- | Same as EIF above | Brette & Gerstner, 2005 |
AdEx firing patterns by parameter regime (Brette & Gerstner, 2005; Naud et al., 2008):
| Pattern | a (nS) | b (nA) | tau_w (ms) | Typical neuron type |
|---|---|---|---|---|
| Regular spiking | 4 | 0.08 | 150 | Cortical pyramidal |
| Bursting | 4 | 0.5 | 100 | Intrinsically bursting |
| Fast spiking | 0 | 0 | -- | PV+ interneuron (no adaptation) |
| Adapting | 4 | 0.08 | 300 | Slow-adapting pyramidal |
Izhikevich Model
The model uses two variables (v, u) with four parameters (a, b, c, d) (Izhikevich, 2003):
| Pattern | a | b | c (mV) | d | Source |
|---|---|---|---|---|---|
| Regular spiking | 0.02 | 0.2 | -65 | 8 | Izhikevich, 2003 |
| Intrinsically bursting | 0.02 | 0.2 | -55 | 4 | Izhikevich, 2003 |
| Chattering | 0.02 | 0.2 | -50 | 2 | Izhikevich, 2003 |
| Fast spiking | 0.1 | 0.2 | -65 | 2 | Izhikevich, 2003 |
| Low-threshold spiking | 0.02 | 0.25 | -65 | 2 | Izhikevich, 2003 |
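The model is short enough to sketch directly. This follows the two half-step update for v recommended in Izhikevich (2003); the drive current `i_ext` and the function name are illustrative.

```python
def izhikevich(a, b, c, d, i_ext, duration_ms, dt=0.5):
    """Izhikevich (2003) point neuron; returns spike times (ms).

    dv/dt = 0.04 v^2 + 5 v + 140 - u + I
    du/dt = a (b v - u);  on spike (v >= 30 mV): v <- c, u <- u + d
    """
    v, u = -65.0, b * -65.0
    spikes = []
    for step in range(int(duration_ms / dt)):
        # two half-steps of dt/2 for v, for numerical stability (Izhikevich, 2003)
        for _ in range(2):
            v += (dt / 2) * (0.04 * v * v + 5 * v + 140 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

# Regular-spiking pyramidal cell vs. fast-spiking interneuron (table rows above)
rs = izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, i_ext=10.0, duration_ms=1000.0)
fs = izhikevich(a=0.1, b=0.2, c=-65.0, d=2.0, i_ext=10.0, duration_ms=1000.0)
```

With the same drive, the fast-spiking parameters produce many more spikes than the adapting regular-spiking parameters, which is the qualitative behavior the table encodes.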
Hodgkin-Huxley (HH)
Use only when biophysical detail is required. See references/hh-parameters.md for the full parameter set. Key values (Hodgkin & Huxley, 1952):
- g_Na = 120 mS/cm^2, E_Na = 50 mV
- g_K = 36 mS/cm^2, E_K = -77 mV
- g_L = 0.3 mS/cm^2, E_L = -54.4 mV
- C_m = 1 uF/cm^2
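A minimal sketch of the HH equations with these conductances, using the modern convention in which V_rest ≈ -65 mV. The forward-Euler integrator, the hard-coded initial gating values, and the function names are simplifications for illustration; see references/hh-parameters.md for the full parameter set.

```python
import math

def hh_step(v, m, h, n, i_ext, dt):
    """One forward-Euler step of the Hodgkin-Huxley (1952) equations."""
    alpha_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    beta_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
    alpha_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
    beta_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    alpha_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    beta_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
    i_na = 120.0 * m**3 * h * (v - 50.0)   # g_Na = 120 mS/cm^2, E_Na = 50 mV
    i_k = 36.0 * n**4 * (v + 77.0)         # g_K = 36 mS/cm^2, E_K = -77 mV
    i_l = 0.3 * (v + 54.4)                 # g_L = 0.3 mS/cm^2, E_L = -54.4 mV
    v += dt * (i_ext - i_na - i_k - i_l)   # C_m = 1 uF/cm^2
    m += dt * (alpha_m * (1.0 - m) - beta_m * m)
    h += dt * (alpha_h * (1.0 - h) - beta_h * h)
    n += dt * (alpha_n * (1.0 - n) - beta_n * n)
    return v, m, h, n

def simulate_hh(i_ext, duration_ms, dt=0.01):
    v, m, h, n = -65.0, 0.05, 0.6, 0.32    # approximate steady state at rest
    trace = []
    for _ in range(int(duration_ms / dt)):
        v, m, h, n = hh_step(v, m, h, n, i_ext, dt)
        trace.append(v)
    return trace
```

With zero input the membrane stays near -65 mV; a 10 uA/cm^2 step drives repetitive spiking with peaks well above 0 mV. Note the dt = 0.01 ms here, consistent with the time-step guidance in Step 5.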
Step 2: Configure Synapses
Synaptic Time Constants
| Receptor | tau_rise | tau_decay | Net tau_syn | Source |
|---|---|---|---|---|
| AMPA | ~0.5 ms | ~5 ms | 5 ms (single exponential) | Dayan & Abbott, 2001 |
| NMDA | ~2 ms | ~100 ms | 100 ms (single exponential) | Dayan & Abbott, 2001 |
| GABA_A | ~0.5 ms | ~10 ms | 10 ms (single exponential) | Dayan & Abbott, 2001 |
| GABA_B | ~50 ms | ~200 ms | 200 ms (single exponential) | Dayan & Abbott, 2001 |
Conductance-Based vs. Current-Based Synapses
| Type | Equation | When to Use | Source |
|---|---|---|---|
| Current-based | I_syn = w * g(t) | Large networks; faster simulation; when voltage-dependent effects are unimportant | Brunel, 2000 |
| Conductance-based | I_syn = g(t) * (V - E_rev) | When synaptic interactions depend on membrane potential (e.g., NMDA voltage dependence, shunting inhibition) | Dayan & Abbott, 2001 |
Domain judgment: Current-based synapses are appropriate for most network-level studies. Switch to conductance-based when the research question involves voltage-dependent effects (NMDA Mg2+ block, shunting inhibition) or when accurate I-V relationships matter (Brunel, 2000; Dayan & Abbott, 2001).
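A sketch of single-exponential synapses under the conventions above. The dictionary bookkeeping and function names are illustrative, not any simulator's API; the sign in the conductance-based form follows the table (I_syn = g(t) * (V - E_rev), i.e., the outward-current convention, which enters dV/dt with a minus sign).

```python
import math

TAU_SYN = {"AMPA": 5.0, "NMDA": 100.0, "GABA_A": 10.0, "GABA_B": 200.0}  # ms

def decay_conductances(g, dt):
    """Exponentially decay each receptor conductance: dg/dt = -g / tau_syn."""
    return {rec: val * math.exp(-dt / TAU_SYN[rec]) for rec, val in g.items()}

def on_presynaptic_spike(g, receptor, w):
    """Instantaneous jump by the weight w on a presynaptic spike
    (single-exponential kernel, rise time neglected)."""
    g = dict(g)
    g[receptor] += w
    return g

def syn_current_conductance_based(g_val, v, e_rev):
    """I_syn = g(t) * (V - E_rev): the driving force makes the effect
    of the same conductance depend on the membrane potential."""
    return g_val * (v - e_rev)
```

For a current-based synapse the last function reduces to `w * g(t)` with no voltage dependence, which is why it is cheaper and why it cannot express NMDA-style or shunting effects.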
Short-Term Plasticity: Tsodyks-Markram Model
The Tsodyks-Markram (TM) model captures short-term facilitation and depression (Tsodyks & Markram, 1997):
| Parameter | Facilitating synapse | Depressing synapse | Source |
|---|---|---|---|
| U (initial release prob.) | 0.1 | 0.5 | Tsodyks & Markram, 1997 |
| tau_rec (recovery time) | 800 ms | 800 ms | Tsodyks & Markram, 1997 |
| tau_fac (facilitation time) | 1000 ms | 0 ms (no facilitation) | Tsodyks & Markram, 1997 |
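Because the TM equations are linear between spikes, they can be applied event-by-event at spike times. This sketch uses one common formulation (u relaxing to 0 between spikes and jumping by U(1 - u) at each spike); update conventions vary slightly across papers and simulators, so check against your simulator's implementation.

```python
import math

def tm_release(spike_times, U, tau_rec, tau_fac):
    """Event-driven Tsodyks-Markram: fraction of resources released per spike.

    Between spikes: du/dt = -u / tau_fac,  dx/dt = (1 - x) / tau_rec
    At each spike:  u <- u + U*(1 - u);  release = u*x;  x <- x - u*x
    """
    u, x, t = 0.0, 1.0, 0.0
    releases = []
    for t_spk in spike_times:
        delta = t_spk - t
        x = 1.0 - (1.0 - x) * math.exp(-delta / tau_rec)   # resources recover
        u = 0.0 if tau_fac == 0 else u * math.exp(-delta / tau_fac)
        u = u + U * (1.0 - u)          # facilitation jump
        releases.append(u * x)         # effective synaptic efficacy this spike
        x -= u * x                     # depression: resources consumed
        t = t_spk
    return releases
```

With the table's parameters, a 20 Hz train produces successively smaller releases for the depressing synapse (U = 0.5, tau_fac = 0) and successively larger ones for the facilitating synapse (U = 0.1, tau_fac = 1000 ms).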
Step 3: Configure Network Connectivity
Excitatory/Inhibitory Balance
| Parameter | Value | Source |
|---|---|---|
| Excitatory fraction | 80% of neurons | Braitenberg & Schutz, 1998 |
| Inhibitory fraction | 20% of neurons | Braitenberg & Schutz, 1998 |
| E-to-E connection probability | 10--20% (random) | Brunel, 2000 |
| E-to-I connection probability | 10--20% | Brunel, 2000 |
| I-to-E connection probability | 10--20% | Brunel, 2000 |
| I-to-I connection probability | 10--20% | Brunel, 2000 |
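A sketch of random (Erdos-Renyi) connectivity with the 80/20 E/I split; the function name, defaults, and edge-list representation are illustrative.

```python
import random

def build_ei_network(n, p=0.1, exc_frac=0.8, seed=42):
    """Random connectivity with an 80/20 E/I split and no self-connections.

    Returns (is_exc, edges): is_exc[i] marks neuron i excitatory;
    edges is a list of (pre, post) pairs, each present with probability p.
    """
    rng = random.Random(seed)
    n_e = int(exc_frac * n)
    is_exc = [i < n_e for i in range(n)]   # first n_e neurons are excitatory
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and rng.random() < p]
    return is_exc, edges
```

For n = 500 and p = 0.1 this yields roughly 0.1 * 500 * 499 ≈ 25,000 connections, 400 excitatory neurons, and 100 inhibitory neurons.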
Weight Scaling for Balanced Networks
For a balanced network to produce biologically realistic asynchronous irregular (AI) firing (Brunel, 2000):
- Excitatory weight: J_E = J_0 / sqrt(N_E * p), where N_E is excitatory population size and p is connection probability
- Inhibitory weight: J_I = -g * J_E, where g > 1 (typically g = 4--8 for the AI regime)
- External drive: Poisson input to maintain target firing rates
Domain judgment: The ratio g = J_I/J_E (relative inhibitory strength) determines the network regime. g < 4 produces synchronous regular firing; g = 4--8 produces the biologically realistic asynchronous irregular (AI) state; g >> 8 produces very low firing rates or silence (Brunel, 2000).
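The scaling rules above, in code; `j0 = 1.0` and `g = 5.0` are illustrative values inside the AI range, not prescribed constants.

```python
import math

def balanced_weights(n_e, p, j0=1.0, g=5.0):
    """Weight scaling for the balanced asynchronous-irregular regime (Brunel, 2000).

    j_e shrinks as 1/sqrt(N_E * p) so that input fluctuations, not the mean,
    drive firing; inhibition dominates via g > 1.
    """
    j_e = j0 / math.sqrt(n_e * p)
    j_i = -g * j_e
    return j_e, j_i
```

Quadrupling N_E halves both weights, which is exactly the rescaling needed to keep firing rates stable when changing network size (see Pitfall 5 below).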
Network Size
| Scale | Neurons | Typical Use | Source |
|---|---|---|---|
| Minimal | 100--500 | Quick tests; parameter exploration | Expert consensus |
| Cortical column | 1,000--10,000 | Standard for cortical circuit models | Brunel, 2000 |
| Large-scale | 10,000--100,000 | Multi-area models; detailed column | Potjans & Diesmann, 2014 |
Step 4: Implement Plasticity Rules
Spike-Timing-Dependent Plasticity (STDP)
Standard pair-based STDP parameters (Bi & Poo, 1998; Song et al., 2000):
| Parameter | Symbol | Value | Source |
|---|---|---|---|
| Potentiation time constant | tau_+ | 20 ms | Bi & Poo, 1998 |
| Depression time constant | tau_- | 20 ms | Bi & Poo, 1998 |
| Potentiation amplitude | A_+ | 0.01 (relative) | Song et al., 2000 |
| Depression amplitude | A_- | -0.012 (\|A_-\| > A_+) | Song et al., 2000 |
| Maximum weight | w_max | Set to prevent runaway | Song et al., 2000 |
Domain judgment: The asymmetry |A_-| > A_+ is critical. Without it, STDP drives all weights to their maximum value (runaway potentiation). The slight depression bias ensures stable weight distributions (Song et al., 2000). Additional stabilization mechanisms (weight dependence, homeostatic plasticity) are often needed in practice.
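The pair-based window as a function of spike-time difference, with hard bounds at w_max. Constants come from the table; treating delta_t = 0 as depression is a convention choice of this sketch.

```python
import math

TAU_PLUS, TAU_MINUS = 20.0, 20.0    # ms
A_PLUS, A_MINUS = 0.01, -0.012      # |A_-| > A_+ for stability (Song et al., 2000)

def stdp_dw(delta_t):
    """Weight change for a single pre/post spike pair.

    delta_t = t_post - t_pre: positive (pre before post) potentiates,
    non-positive (post before or with pre) depresses.
    """
    if delta_t > 0:
        return A_PLUS * math.exp(-delta_t / TAU_PLUS)
    return A_MINUS * math.exp(delta_t / TAU_MINUS)

def apply_stdp(w, delta_t, w_max=1.0):
    """Apply the pair update with hard bounds so w stays in [0, w_max]."""
    return min(w_max, max(0.0, w + stdp_dw(delta_t)))
```

At equal lags, depression outweighs potentiation (0.012 vs. 0.01 at delta_t = +-10 ms, both scaled by e^-0.5), which is the stabilizing asymmetry described above.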
Rate-Based Plasticity: BCM Rule
The Bienenstock-Cooper-Munro (BCM) rule provides a stable, rate-based plasticity rule (Bienenstock et al., 1982):
- Learning rule: dw/dt = eta * y * (y - theta_m) * x
- Sliding threshold: theta_m = E[y^2], ensuring stability
- Where y = postsynaptic rate, x = presynaptic rate, eta = learning rate
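A sketch of one Euler step of the rule. The sliding threshold is implemented here as a low-pass filter of y^2, one common way to approximate theta_m = E[y^2]; `eta`, `tau_theta`, and the step size are illustrative.

```python
def bcm_step(w, x, y, theta_m, eta=1e-3, tau_theta=1000.0, dt=1.0):
    """One Euler step of the BCM rule (Bienenstock et al., 1982).

    dw/dt      = eta * y * (y - theta_m) * x
    dtheta/dt  = (y^2 - theta_m) / tau_theta   (relaxes theta_m toward E[y^2])
    """
    w += dt * eta * y * (y - theta_m) * x
    theta_m += dt * (y * y - theta_m) / tau_theta
    return w, theta_m
```

When the postsynaptic rate y exceeds the threshold the weight grows, which raises theta_m and eventually halts the growth; below threshold the weight shrinks. That negative feedback is what makes BCM stable without explicit weight bounds.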
Homeostatic Plasticity
For long simulations with STDP, add homeostatic mechanisms to prevent runaway dynamics:
- Synaptic scaling: Multiplicatively scale all incoming weights to maintain target firing rate (Turrigiano et al., 1998)
- Intrinsic plasticity: Adjust neuronal excitability (threshold or adaptation) to maintain target rate (Desai et al., 1999)
- Target firing rate: 1--5 Hz for excitatory cortical neurons (expert consensus based on in vivo recordings)
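One simple way to implement multiplicative synaptic scaling; the specific update factor and learning rate `eta` here are illustrative choices for this sketch, not parameters from Turrigiano et al. (1998).

```python
def synaptic_scaling(weights, measured_rate, target_rate, eta=0.01):
    """Multiplicatively scale all incoming weights toward a target rate.

    Scaling every weight by the same factor preserves their relative
    differences, and hence any structure learned by STDP.
    """
    factor = 1.0 + eta * (target_rate - measured_rate) / target_rate
    return [w * factor for w in weights]
```

A neuron firing above target has all its incoming weights shrunk slightly; one firing below target has them grown, with the weight ratios left untouched.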
Step 5: Set Simulation Parameters
Integration Time Step
| Neuron Model | Recommended dt | Maximum dt | Rationale | Source |
|---|---|---|---|---|
| LIF | 0.1 ms | 0.5 ms | Exact integration possible; larger steps miss coincident spikes | Rotter & Diesmann, 1999 |
| EIF / AdEx | 0.1 ms | 0.1 ms | Exponential term requires small steps near threshold | Brette & Gerstner, 2005 |
| Izhikevich | 0.1 ms | 0.5 ms (with Euler) | Use 0.5 ms with two half-steps per Izhikevich (2003) | Izhikevich, 2003 |
| Hodgkin-Huxley | 0.01--0.05 ms | 0.05 ms | Gating variable dynamics require fine resolution | Rotter & Diesmann, 1999 |
Simulation Duration
| Phenomenon | Minimum Duration | Rationale | Source |
|---|---|---|---|
| Network stabilization (transient) | 500 ms discard | Allow initial transient to decay | Expert consensus |
| Asynchronous irregular state | 1--5 s after transient | Sufficient for firing rate and CV statistics | Brunel, 2000 |
| STDP weight development | 10--100 s | Weights evolve slowly | Song et al., 2000 |
| Oscillation analysis | 2--10 s | Need multiple cycles for spectral analysis | Expert consensus |
Step 6: Validate the Model
Essential Validation Metrics
| Metric | Target Value | What It Indicates | Source |
|---|---|---|---|
| Mean firing rate (excitatory) | 1--10 Hz | Realistic cortical activity | Brunel, 2000 |
| Mean firing rate (inhibitory) | 5--30 Hz | Fast-spiking interneurons fire faster | Brunel, 2000 |
| CV of ISI | ~1.0 (0.8--1.2) | Irregular firing (Poisson-like) | Brunel, 2000; Softky & Koch, 1993 |
| Fano factor (spike count) | ~1.0 | Poisson-like variability | Softky & Koch, 1993 |
| Population synchrony (chi) | < 0.2 for AI state | Asynchronous activity | Brunel, 2000 |
| Pairwise correlation | 0.01--0.1 | Weak correlations as in cortex | Cohen & Kohn, 2011 |
Domain judgment: A network with mean firing rate in range but CV << 1 (regular firing) is NOT in a biologically realistic regime. Cortical neurons fire irregularly (CV ~ 1) even when the network is in a stationary state. If your CV is much less than 1, inhibition is likely too weak or connectivity too structured (Brunel, 2000).
Simulator Selection
| Simulator | Language | Best For | Limitations | Source |
|---|---|---|---|---|
| NEST | Python/C++ | Large-scale LIF/IF networks; exact integration | Less flexible for custom models | Gewaltig & Diesmann, 2007 |
| Brian2 | Python | Rapid prototyping; custom equations; education | Slower than NEST for very large networks | Stimberg et al., 2019 |
| NEURON | Python/HOC | Compartmental models; biophysical detail | Overkill for point-neuron networks | Hines & Carnevale, 1997 |
| GeNN | C++/Python | GPU-accelerated; very large networks | Requires NVIDIA GPU; steeper learning curve | Yavuz et al., 2016 |
Recommendation: Start with Brian2 for prototyping and model development. Use NEST for production runs of large-scale networks. Use NEURON only when compartmental morphology is needed. Use GeNN when GPU acceleration is required for network size (Stimberg et al., 2019).
Common Pitfalls
1. No E/I Balance
Networks without proper E/I ratio (80/20) and weight scaling produce unrealistic dynamics: runaway excitation, epileptiform synchrony, or silence. Always verify the network operates in the AI regime (Brunel, 2000).
2. Ignoring the Initial Transient
The first 200--500 ms of simulation reflect initial conditions, not the network's steady state. Always discard this transient period before computing statistics (expert consensus).
3. Wrong Time Step for the Neuron Model
Using dt = 1 ms for HH models causes numerical instability. Using dt = 0.01 ms for LIF networks wastes computation. Match dt to the model (Rotter & Diesmann, 1999).
4. STDP Without Stabilization
Pair-based STDP alone drives weights to bimodal (all 0 or all w_max) distributions. Add weight dependence, homeostatic scaling, or use triplet STDP rules for stable learning (Song et al., 2000; Turrigiano et al., 1998).
5. Network Size-Dependent Behavior
Changing network size N without rescaling weights (1/sqrt(N)) changes firing rates and dynamics. Always verify that results are robust to network size or explicitly rescale (Brunel, 2000).
6. Using Conductance-Based Synapses When Current-Based Suffice
Conductance-based synapses are slower to simulate and add complexity. Unless voltage-dependent effects (NMDA, shunting inhibition) are central to the question, current-based synapses are appropriate and much faster (Brunel, 2000).
Minimum Reporting Checklist
Based on Nordlie et al. (2009) model description standards and Brunel (2000):
- Neuron model type and all parameters (with units)
- Synapse model type (current vs. conductance) and time constants per receptor type
- Network size (N_E, N_I) and connection probability
- Weight values and scaling rule (how weights relate to N)
- External input description (Poisson rate, current injection)
- Plasticity rule and parameters (if applicable)
- Integration method and time step
- Simulation duration (including discarded transient)
- Validation metrics: mean firing rates, CV of ISI, synchrony measure
- Simulator name and version
- Random seed or number of independent realizations
Key References
- Bi, G.-Q., & Poo, M.-M. (1998). Synaptic modifications in cultured hippocampal neurons: Dependence on spike timing, synaptic strength, and postsynaptic cell type. Journal of Neuroscience, 18(24), 10464--10472.
- Bienenstock, E. L., Cooper, L. N., & Munro, P. W. (1982). Theory for the development of neuron selectivity: Orientation specificity and binocular interaction in visual cortex. Journal of Neuroscience, 2(1), 32--48.
- Braitenberg, V., & Schutz, A. (1998). Cortex: Statistics and Geometry of Neuronal Connectivity (2nd ed.). Springer.
- Brette, R., & Gerstner, W. (2005). Adaptive exponential integrate-and-fire model as an effective description of neuronal activity. Journal of Neurophysiology, 94(5), 3637--3642.
- Brunel, N. (2000). Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. Journal of Computational Neuroscience, 8(3), 183--208.
- Dayan, P., & Abbott, L. F. (2001). Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. MIT Press.
- Fourcaud-Trocme, N., Hansel, D., van Vreeswijk, C., & Brunel, N. (2003). How spike generation mechanisms determine the neuronal response to fluctuating inputs. Journal of Neuroscience, 23(37), 11628--11640.
- Gerstner, W., Kistler, W. M., Naud, R., & Paninski, L. (2014). Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press.
- Hodgkin, A. L., & Huxley, A. F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. Journal of Physiology, 117(4), 500--544.
- Izhikevich, E. M. (2003). Simple model of spiking neurons. IEEE Transactions on Neural Networks, 14(6), 1569--1572.
- Izhikevich, E. M. (2004). Which model to use for cortical spiking neurons? IEEE Transactions on Neural Networks, 15(5), 1063--1070.
- Nordlie, E., Gewaltig, M.-O., & Plesser, H. E. (2009). Towards reproducible descriptions of neuronal network models. PLoS Computational Biology, 5(8), e1000456.
- Rotter, S., & Diesmann, M. (1999). Exact digital simulation of time-invariant linear systems with applications to neuronal modeling. Biological Cybernetics, 81(5--6), 381--402.
- Song, S., Miller, K. D., & Abbott, L. F. (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neuroscience, 3(9), 919--926.
- Tsodyks, M. V., & Markram, H. (1997). The neural code between neocortical pyramidal neurons depends on neurotransmitter release probability. Proceedings of the National Academy of Sciences, 94(2), 719--723.
See references/hh-parameters.md for full Hodgkin-Huxley parameter tables.
See references/network-regimes.md for Brunel network regime diagrams and extended parameter sweeps.