1
Klinshov VV, Kirillov SY. Shot noise in next-generation neural mass models for finite-size networks. Phys Rev E 2022; 106:L062302. [PMID: 36671128] [DOI: 10.1103/physreve.106.l062302] [Received: 09/26/2022] [Accepted: 12/12/2022]
Abstract
Neural mass models are a general name for various models describing the collective dynamics of large neural populations in terms of averaged macroscopic variables. Recently, the so-called next-generation neural mass models have attracted considerable attention due to their ability to account for the degree of synchrony. Being exact in the limit of an infinitely large number of neurons, these models provide only an approximate description of finite-size networks. In the present Letter we study finite-size effects in the collective behavior of neural networks and show that these effects can be captured by appropriately modified neural mass models. Namely, we show that the finite size of the network leads to the emergence of so-called shot noise, which appears as a stochastic term in the neural mass model. The power spectrum of this shot noise contains pronounced peaks, so its impact on the collective dynamics may be crucial due to resonance effects.
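The mechanism this abstract describes, a deterministic mean field plus a stochastic term scaling as 1/√N, can be sketched numerically. The code below adds an ad-hoc finite-size noise term to the standard next-generation (Montbrió-Pazó-Roxin) neural mass equations; the parameter values and the exact form of the noise are illustrative assumptions, not the authors' derivation.

```python
import numpy as np

# Euler-Maruyama sketch of a next-generation (Montbrio-Pazo-Roxin) neural
# mass model with an ad-hoc stochastic term whose variance scales as 1/N,
# mimicking the finite-size "shot noise" discussed in the abstract. The
# parameter values and the exact noise form are illustrative assumptions,
# not the paper's derivation.
rng = np.random.default_rng(0)
delta, eta_bar, J = 1.0, -5.0, 15.0    # heterogeneity, mean drive, coupling
N = 10_000                             # nominal network size
dt, steps = 1e-3, 20_000

r, v = 0.1, -2.0                       # population rate and mean potential
rates = np.empty(steps)
for i in range(steps):
    noise = np.sqrt(r / (N * dt)) * rng.standard_normal()  # ~ 1/sqrt(N)
    dr = delta / np.pi + 2.0 * r * v
    dv = v * v + eta_bar + J * r - (np.pi * r) ** 2
    r = max(r + dt * (dr + noise), 0.0)  # rate stays non-negative
    v += dt * dv
    rates[i] = r

mean_rate = rates[steps // 2:].mean()    # average after the transient
```

Shrinking N makes the fluctuations around the deterministic fixed point visibly larger, which is the qualitative finite-size effect the paper formalizes.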
Affiliation(s)
- Vladimir V Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, Nizhny Novgorod 603950, Russia and Faculty of Informatics, Mathematics, and Computer Science, National Research University Higher School of Economics, 25/12 Bol'shaya Pecherskaya Street, Nizhny Novgorod 603155, Russia
- Sergey Yu Kirillov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, Nizhny Novgorod 603950, Russia
2
Hagen E, Magnusson SH, Ness TV, Halnes G, Babu PN, Linssen C, Morrison A, Einevoll GT. Brain signal predictions from multi-scale networks using a linearized framework. PLoS Comput Biol 2022; 18:e1010353. [PMID: 35960767] [PMCID: PMC9401172] [DOI: 10.1371/journal.pcbi.1010353] [Received: 03/07/2022] [Revised: 08/24/2022] [Accepted: 07/02/2022]
Abstract
Simulations of neural activity at different levels of detail are ubiquitous in modern neuroscience, aiding the interpretation of experimental data and the identification of underlying neural mechanisms at the level of cells and circuits. Extracellular measurements of brain signals, which reflect transmembrane currents throughout the neural tissue, remain commonplace. The lower frequencies (≲300 Hz) of measured signals generally stem from synaptic activity driven by recurrent interactions among neural populations, and computational models should therefore also incorporate accurate predictions of such signals. Due to limited computational resources, large-scale neuronal network models (≳10⁶ neurons or so) often require a reduced level of biophysical detail and account mainly for times of action potentials ('spikes') or spike rates. Corresponding extracellular signal predictions have thus poorly accounted for the biophysical origin of these signals. Here we propose a computational framework for predicting spatiotemporal filter kernels for such extracellular signals stemming from synaptic activity, accounting for the biophysics of neurons, populations, and recurrent connections. Signals are obtained by convolving population spike rates with appropriate kernels for each connection pathway and summing the contributions. Our main result is that kernels derived via linearized synapse and membrane dynamics, distributions of cells, conduction delays, and a volume conductor model accurately capture the spatiotemporal dynamics of ground-truth extracellular signals from conductance-based multicompartment neuron networks. One particular observation is that changes in the effective membrane time constants caused by persistent synaptic activation must be accounted for.
The work also constitutes a major advance in the computational efficiency of accurate, biophysics-based signal predictions from large-scale spike- and rate-based neuron network models, drastically reducing signal prediction times compared to biophysically detailed network models. This work also provides insight into how experimentally recorded low-frequency extracellular signals of neuronal activity may depend approximately linearly on spiking activity. A new software tool, LFPykernels, serves as a reference implementation of the framework. Understanding the brain’s function and activity in healthy and pathological states, across spatial scales and over times spanning entire lives, is one of humanity’s great undertakings. In experimental and clinical work probing the brain’s activity, a variety of electric and magnetic measurement techniques are routinely applied. However, interpreting the extracellularly measured signals remains arduous due to multiple factors, mainly the large number of neurons contributing to the signals and the complex interactions occurring in recurrently connected neuronal circuits. To understand how neurons give rise to such signals, mechanistic modeling combined with forward models derived from volume conductor theory has proven successful, but this approach currently does not scale to the systems level (encompassing millions of neurons or more), where simplified or abstract neuron representations are typically used. Motivated by experimental findings implying approximately linear relationships between times of neuronal action potentials and extracellular population signals, we provide a biophysics-based method for computing causal filters relating spikes and extracellular signals that can be applied with spike times or rates of large-scale neuronal network models to predict population signals without relying on ad hoc approximations.
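The convolve-and-sum idea in this abstract can be illustrated in a few lines. The biexponential kernels and Poisson rates below are hypothetical stand-ins for the biophysically derived kernels that the paper (and its LFPykernels tool) would provide.

```python
import numpy as np

# The convolve-and-sum scheme: per-pathway population spike rates convolved
# with causal kernels, contributions summed. The biexponential kernels and
# Poisson rates below are hypothetical stand-ins for kernels derived from
# linearized biophysics (as the paper's LFPykernels tool would provide).
dt = 1e-4                              # 0.1 ms bins
t = np.arange(0, 0.05, dt)             # 50 ms kernel support

def biexp_kernel(amp, tau_rise, tau_decay):
    """Causal difference-of-exponentials kernel."""
    return amp * (np.exp(-t / tau_decay) - np.exp(-t / tau_rise))

# Two hypothetical pathways, e.g. excitatory (positive) and inhibitory
# (negative) contributions to the recorded signal.
kernels = [biexp_kernel(1.0, 0.5e-3, 5e-3),
           biexp_kernel(-0.6, 1.0e-3, 10e-3)]

rng = np.random.default_rng(1)
T = 2000
rates = [rng.poisson(5.0, T).astype(float),    # pathway spike counts/bin
         rng.poisson(3.0, T).astype(float)]

# Causal convolution, truncated to the recording length, summed over pathways.
signal = sum(np.convolve(rate, k)[:T] for rate, k in zip(rates, kernels))
```

The point of the paper is that such kernels need not be guessed: they can be derived from linearized multicompartment biophysics, after which signal prediction reduces to these cheap convolutions.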
Affiliation(s)
- Espen Hagen
- Department of Data Science, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- * E-mail: (EH); (GTE)
- Steinn H. Magnusson
- Department of Physics, Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Torbjørn V. Ness
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Geir Halnes
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Pooja N. Babu
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Charl Linssen
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Institute of Neuroscience and Medicine (INM-6); Computational and Systems Neuroscience & Institute for Advanced Simulation (IAS-6); Theoretical Neuroscience & JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre and JARA, Jülich, Germany
- Abigail Morrison
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Institute of Neuroscience and Medicine (INM-6); Computational and Systems Neuroscience & Institute for Advanced Simulation (IAS-6); Theoretical Neuroscience & JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre and JARA, Jülich, Germany
- Software Engineering, Department of Computer Science 3, RWTH Aachen University, Aachen, Germany
- Gaute T. Einevoll
- Department of Physics, Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- * E-mail: (EH); (GTE)
3
Dumont G, Pérez-Cervera A, Gutkin B. A framework for macroscopic phase-resetting curves for generalised spiking neural networks. PLoS Comput Biol 2022; 18:e1010363. [PMID: 35913991] [PMCID: PMC9371324] [DOI: 10.1371/journal.pcbi.1010363] [Received: 09/09/2021] [Revised: 08/11/2022] [Accepted: 07/06/2022]
Abstract
Brain rhythms emerge from synchronization among interconnected spiking neurons. Key properties of such rhythms can be gleaned from the phase-resetting curve (PRC). Inferring the PRC and developing a systematic phase-reduction theory for large-scale brain rhythms remains an outstanding challenge. Here we present a theoretical framework and methodology to compute the PRC of generic spiking networks with emergent collective oscillations. We adopt a renewal approach in which neurons are described by the time since their last action potential, a description that can reproduce the dynamical features of many cell types. For a sufficiently large number of neurons, the network dynamics are well captured by a continuity equation known as the refractory density equation. We develop an adjoint method for this equation, giving a semi-analytical expression for the infinitesimal PRC. We confirm the validity of our framework for specific examples of neural networks. Our theoretical framework links key biological properties at the individual-neuron scale to macroscopic oscillatory network properties. Beyond spiking networks, the approach is applicable to a broad class of systems that can be described by renewal processes. The formation of oscillatory neuronal assemblies at the network level has been hypothesized to be fundamental to many cognitive and motor functions. One prominent tool for understanding how oscillatory activity responds to stimuli, and hence the neural code for which it is a substrate, is a nonlinear measure called the phase-resetting curve (PRC). At the network scale, the PRC measures how a given synaptic input perturbs the timing of the next upcoming volley of spikes: either advancing or delaying it.
As a further application, one can use PRCs to make unambiguous predictions about whether communicating networks of neurons will phase-lock, as is often observed across cortical areas, and what the stable phase configuration would be: synchronous, asynchronous, or with asymmetric phase shifts. The latter configuration also implies a preferential flow of information from the leading network to the follower, thereby giving causal signatures of directed functional connectivity. Because of the key position of the PRC in studying synchrony, information flow, and entrainment to external forcing, it is crucial to move toward a theory that allows the PRCs of network-wide oscillations to be computed not only for a restricted class of models, as has been done in the past, but for generalized network descriptions that can flexibly reflect single-cell properties. In this manuscript, we tackle this issue by showing how the PRC for network oscillations can be computed using the adjoint system of partial differential equations that defines the dynamics of the neural activity density.
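For readers who want the flavor of a macroscopic PRC without the adjoint machinery, the brute-force "direct method" below perturbs a limit-cycle oscillator at several phases and reads off the asymptotic timing shift. A van der Pol oscillator stands in for the network's collective oscillation; everything here is an illustrative assumption, not the paper's semi-analytical construction.

```python
import numpy as np

# Brute-force "direct method" for a phase-resetting curve (PRC): perturb a
# limit-cycle oscillator at several phases and record the asymptotic shift
# of its cycle timing. A van der Pol oscillator is used purely as a stand-in
# for a network's collective oscillation (an illustrative assumption; the
# paper instead derives the infinitesimal PRC via an adjoint equation).
def vdp_step(x, y, dt, mu=1.0):
    """One forward-Euler step of the van der Pol oscillator."""
    return x + dt * y, y + dt * (mu * (1.0 - x * x) * y - x)

def upward_crossings(dt, n_steps, kick=0.0, kick_step=-1):
    """Times of upward zero crossings of x; optionally kick x once."""
    x, y = 2.0, 0.0
    times = []
    for i in range(n_steps):
        if i == kick_step:
            x += kick
        xn, yn = vdp_step(x, y, dt)
        if x < 0.0 <= xn:
            times.append(i * dt)
        x, y = xn, yn
    return times

dt, n = 1e-3, 80_000
base = upward_crossings(dt, n)
period = np.mean(np.diff(base[-5:]))          # converged period estimate

# Sample the PRC: a small kick applied at 8 phases within one late cycle;
# the shift is read off at a reference crossing well after the perturbation.
t0, ref = base[-8], base[-2]
prc = []
for ph in np.linspace(0.0, 1.0, 8, endpoint=False):
    ks = int((t0 + ph * period) / dt)
    pert = upward_crossings(dt, n, kick=0.05, kick_step=ks)
    shifted = min(pert, key=lambda t: abs(t - ref))   # matched crossing
    prc.append((shifted - ref) / period)              # normalized shift
prc = np.array(prc)
```

The adjoint method the paper develops computes this same object semi-analytically, without the many repeated simulations the direct method requires.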
Affiliation(s)
- Grégory Dumont
- Group for Neural Theory, LNC INSERM U960, DEC, Ecole Normale Supérieure - PSL University, Paris, France
- * E-mail:
- Alberto Pérez-Cervera
- Center for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow, Russia
- Instituto de Matemática Interdisciplinar, Universidad Complutense de Madrid, Madrid, Spain
- Boris Gutkin
- Group for Neural Theory, LNC INSERM U960, DEC, Ecole Normale Supérieure - PSL University, Paris, France
- Center for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow, Russia
4
Boutselis GI, Evans EN, Pereira MA, Theodorou EA. Leveraging Stochasticity for Open Loop and Model Predictive Control of Spatio-Temporal Systems. Entropy 2021; 23:e23080941. [PMID: 34441081] [PMCID: PMC8391146] [DOI: 10.3390/e23080941] [Received: 06/03/2021] [Revised: 06/30/2021] [Accepted: 07/14/2021]
Abstract
Stochastic spatio-temporal processes are prevalent across domains, ranging from the modeling of plasmas and fluid turbulence to the wave function of quantum systems. This letter studies a measure-theoretic description of such systems by describing them as evolutionary processes on Hilbert spaces and, in doing so, derives a framework for spatio-temporal manipulation from fundamental thermodynamic principles. This approach yields a variational optimization framework for controlling stochastic fields. The resulting scheme is applicable to a wide class of spatio-temporal processes and can be used for optimizing parameterized control policies. Our simulated experiments explore the application of two forms of this approach to four stochastic spatio-temporal processes, with results that suggest new perspectives and directions for studying stochastic control problems for spatio-temporal systems.
Affiliation(s)
- George I. Boutselis
- Department of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30313, USA; (G.I.B.); (E.A.T.)
- Ethan N. Evans
- Department of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30313, USA; (G.I.B.); (E.A.T.)
- Correspondence:
- Marcus A. Pereira
- Institute of Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30313, USA
- Evangelos A. Theodorou
- Department of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA 30313, USA; (G.I.B.); (E.A.T.)
- Institute of Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30313, USA
5
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. [PMID: 32942450] [DOI: 10.1103/physreve.102.022407] [Received: 03/20/2020] [Accepted: 06/29/2020]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
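The advertised relation between eigenvalues and the Laplace transform of the interspike-interval density can be made concrete for the paper's simplest example, a Poisson neuron with absolute refractoriness. Assuming the relation takes the commonly stated renewal-theory form F(λ) = 1, with F the Laplace transform of the ISI density, the nontrivial eigenvalues follow from the Lambert W function; the parameter values below are illustrative.

```python
import numpy as np
from scipy.special import lambertw

# Concrete check for the simplest renewal model mentioned in the abstract: a
# Poisson neuron with absolute refractory period d and hazard rate r. Its
# ISI density f(t) = r*exp(-r*(t - d)) for t >= d has Laplace transform
# F(s) = r*exp(-s*d) / (r + s). Assuming the eigenvalue condition takes the
# commonly stated renewal-theory form F(lam) = 1, rewriting it as
# d*(lam + r)*exp(d*(lam + r)) = d*r*exp(d*r) exposes a Lambert-W solution.
# Parameter values are illustrative.
r, d = 10.0, 0.05                        # hazard rate and refractory period

def isi_laplace(s):
    return r * np.exp(-s * d) / (r + s)

# lam = 0 is always a root: F(0) = 1 just says the ISI density is normalized.
assert np.isclose(isi_laplace(0.0), 1.0)

# First nontrivial eigenvalue (one of a complex-conjugate pair) from the
# k = 1 branch of Lambert W.
lam = lambertw(r * d * np.exp(r * d), k=1) / d - r
residual = abs(isi_laplace(lam) - 1.0)   # should vanish at an eigenvalue
```

The real part of such an eigenvalue sets the relaxation timescale of the population rate, which is exactly the information retained by the first-eigenmode truncation discussed in the abstract.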
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
6
Kulkarni A, Ranft J, Hakim V. Synchronization, Stochasticity, and Phase Waves in Neuronal Networks With Spatially-Structured Connectivity. Front Comput Neurosci 2020; 14:569644. [PMID: 33192427] [PMCID: PMC7604323] [DOI: 10.3389/fncom.2020.569644] [Received: 06/04/2020] [Accepted: 08/18/2020]
Abstract
Oscillations in the beta/low-gamma range (10–45 Hz) are recorded in diverse neural structures. They have successfully been modeled as sparsely synchronized oscillations arising from reciprocal interactions between randomly connected excitatory (E) pyramidal cells and local interneurons (I). The synchronization of spatially distant oscillatory spiking E–I modules has been well studied in the rate-model framework but less so for modules of spiking neurons. Here, we first show that previously proposed modifications of rate models provide a quantitative description of spiking E–I modules of exponential integrate-and-fire (EIF) neurons. This allows us to analyze the dynamical regimes of sparsely synchronized oscillatory E–I modules connected by long-range excitatory interactions, for two modules as well as for a chain of such modules. For modules with a large number of neurons (> 10⁵), we obtain results similar to those previously obtained with the classic deterministic Wilson-Cowan rate model, with the added bonus that the results quantitatively describe simulations of spiking EIF neurons. However, for modules with a moderate (~10⁴) number of neurons, stochastic variations in the spike emission of neurons are important and need to be taken into account. On the one hand, they modify the oscillations in a way that tends to promote synchronization between different modules. On the other hand, independent fluctuations in different modules tend to disrupt synchronization. The correlations between distant oscillatory modules can be described by stochastic equations for the oscillator phases that have been intensely studied in other contexts. At shorter distances, we develop a description that also takes amplitude modes into account and that quantitatively accounts for our simulation data. Stochastic dephasing of neighboring modules produces transient phase gradients and the transient appearance of phase waves.
We propose that these stochastically induced phase waves provide an explanatory framework for observations of traveling waves in the cortex during beta oscillations.
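The reduced phase description mentioned in the abstract can be sketched with two noisy, attractively coupled phase oscillators: coupling promotes locking while independent noise (standing in for finite-size fluctuations) disrupts it, and a phase-locking index quantifies the balance. All parameter values are illustrative, not the paper's.

```python
import numpy as np

# Two oscillatory modules reduced to phases th1, th2: attractive coupling K
# promotes locking, independent noise of strength sigma (standing in for
# finite-size fluctuations) disrupts it. All values are illustrative.
rng = np.random.default_rng(2)
omega = 2 * np.pi * 30.0               # ~30 Hz collective oscillation
K, sigma = 40.0, 5.0
dt, steps = 1e-4, 200_000              # 20 s of simulated time

th1, th2 = 0.0, 2.0
psi = np.empty(steps)
for i in range(steps):
    dW1, dW2 = rng.standard_normal(2) * np.sqrt(dt)
    c = K * np.sin(th2 - th1)
    th1 += dt * (omega + c) + sigma * dW1
    th2 += dt * (omega - c) + sigma * dW2
    psi[i] = (th1 - th2 + np.pi) % (2 * np.pi) - np.pi   # wrapped difference

# Phase-locking index in [0, 1]: ~1 means tightly locked modules.
sync = np.abs(np.mean(np.exp(1j * psi[steps // 2:])))
```

In a spatial chain of such oscillators, the same stochastic dephasing between neighbors produces the transient phase gradients, and hence phase waves, that the abstract describes.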
Affiliation(s)
- Anirudh Kulkarni
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
- IBENS, Ecole Normale Supérieure, PSL University, CNRS, INSERM, Paris, France
- Jonas Ranft
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
- IBENS, Ecole Normale Supérieure, PSL University, CNRS, INSERM, Paris, France
- Vincent Hakim
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
7
René A, Longtin A, Macke JH. Inference of a Mesoscopic Population Model from Population Spike Trains. Neural Comput 2020; 32:1448-1498. [DOI: 10.1162/neco_a_01292]
Abstract
Understanding how rich dynamics emerge in neural populations requires models exhibiting a wide range of behaviors while remaining interpretable in terms of connectivity and single-neuron dynamics. However, it has been challenging to fit such mechanistic spiking networks at the single-neuron scale to empirical population data. To close this gap, we propose to fit such data at a mesoscale, using a mechanistic but low-dimensional and, hence, statistically tractable model. The mesoscopic representation is obtained by approximating a population of neurons as multiple homogeneous pools of neurons and modeling the dynamics of the aggregate population activity within each pool. We derive the likelihood of both single-neuron and connectivity parameters given this activity, which can then be used to optimize parameters by gradient ascent on the log likelihood or perform Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. We illustrate this approach using a model of generalized integrate-and-fire neurons for which mesoscopic dynamics have been previously derived and show that both single-neuron and connectivity parameters can be recovered from simulated data. In particular, our inference method extracts posterior correlations between model parameters, which define parameter subsets able to reproduce the data. We compute the Bayesian posterior for combinations of parameters using MCMC sampling and investigate how the approximations inherent in a mesoscopic population model affect the accuracy of the inferred single-neuron parameters.
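The inference logic described above, writing down the likelihood of observed population activity given mechanistic parameters and then optimizing it, can be shown with a deliberately simplified stand-in. The Poisson observation model and single gain parameter below are assumptions for illustration; the paper's likelihood is built on mesoscopic generalized integrate-and-fire pool dynamics.

```python
import numpy as np

# Toy version of likelihood-based inference for a mesoscopic model: the
# population activity is a count process whose intensity depends on a single
# gain parameter w, and w is recovered by maximizing the log-likelihood of
# observed counts. The Poisson observation model and sigmoid rate function
# are simplifying assumptions; the paper builds its likelihood on mesoscopic
# generalized integrate-and-fire pool dynamics.
rng = np.random.default_rng(3)
N, dt, T = 200, 1e-3, 5000
drive = np.sin(np.linspace(0, 10 * np.pi, T)) + 1.5    # shared input signal
w_true = 0.8

def intensity(w):
    """Expected spike counts per bin for gain w."""
    return N * dt * 10.0 / (1.0 + np.exp(-w * drive))

counts = rng.poisson(intensity(w_true))                # "observed" activity

def log_lik(w):
    lam = intensity(w)
    return np.sum(counts * np.log(lam) - lam)          # Poisson log-likelihood

# Grid-based maximum likelihood (gradient ascent or MCMC would also work).
w_grid = np.linspace(0.1, 2.0, 191)
w_hat = w_grid[np.argmax([log_lik(w) for w in w_grid])]
```

Replacing the grid search with an MCMC sampler over `log_lik` would also yield the posterior correlations between parameters that the paper emphasizes.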
Affiliation(s)
- Alexandre René
- Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany; and Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich 52425, Germany
- André Longtin
- Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada, and Brain and Mind Research Institute, University of Ottawa, Ottawa K1H 8M5, Canada
- Jakob H. Macke
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany, and Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich 80333, Germany
8
Schwalger T, Chizhov AV. Mind the last spike - firing rate models for mesoscopic populations of spiking neurons. Curr Opin Neurobiol 2019; 58:155-166. [PMID: 31590003] [DOI: 10.1016/j.conb.2019.08.003] [Received: 04/01/2019] [Accepted: 08/25/2019]
Abstract
Heuristic firing-rate models are the dominant modeling framework for understanding cortical computations. Despite their success, these models fall short of capturing spike-synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved with the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. These new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
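The age-structured picture behind the refractory density method can be sketched with a minimal discretization: transport in spike age, thinning by a hazard, and re-injection of the fired mass at age zero. The toy hazard below (an absolute refractory period followed by a constant rate) is an illustrative choice, not a fitted GIF model.

```python
import numpy as np

# Minimal discretization of the refractory-density (age-structured) picture:
# q(tau) is the density of neurons at age tau since their last spike; it is
# transported to larger ages, thinned by a hazard rho(tau), and the fired
# mass re-enters at age zero as the population rate A(t). The hazard here
# (absolute refractory period, then a constant rate) is a toy assumption.
dt, n_bins = 1e-3, 500
tau = np.arange(n_bins) * dt
rho = np.where(tau < 0.005, 0.0, 20.0)   # hazard: 5 ms refractory, then 20/s

q = np.zeros(n_bins)
q[0] = 1.0 / dt                          # fully synchronized start
A = []
for _ in range(3000):
    fired = np.sum(rho * q) * dt         # population rate A(t)
    q = np.roll(q * (1.0 - rho * dt), 1) # thin by hazard, transport in age
    q[0] = fired                         # fired mass re-enters at tau = 0
    A.append(fired)

A = np.array(A)
mass = np.sum(q) * dt                    # probability mass, should stay ~1
# A(t) rings after the synchronized start, then settles near the renewal
# steady state 1 / (refractory period + 1/rate) ~ 18.2 spikes/s.
```

The synchronized initial condition produces exactly the damped ringing in A(t) that heuristic rate models miss, which is the abstract's motivation for building rate models on top of this density description.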
Affiliation(s)
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany; Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Anton V Chizhov
- Ioffe Institute, 194021 Saint-Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, 194223 Saint-Petersburg, Russia
9
Conductance-Based Refractory Density Approach for a Population of Bursting Neurons. Bull Math Biol 2019; 81:4124-4143. [PMID: 31313084] [DOI: 10.1007/s11538-019-00643-8] [Received: 09/05/2018] [Accepted: 07/09/2019]
Abstract
The conductance-based refractory density (CBRD) approach is a parsimonious mathematical-computational framework for modelling interacting populations of regular-spiking neurons; it has not, however, yet been extended to populations of bursting neurons. The canonical CBRD method describes the firing activity of a statistical ensemble of uncoupled Hodgkin-Huxley-like neurons (differentiated by noise) and has demonstrated its validity against experimental data. The present manuscript generalises the CBRD to a population of bursting neurons; in this pilot computational study, however, we consider the simplest setting, in which each individual neuron is governed by piecewise-linear bursting dynamics. The resulting population model makes use of slow-fast analysis, which leads to a novel methodology combining the CBRD with the theory of multiple-timescale dynamics. The main prospect is that this opens novel avenues for mathematical exploration, as well as the derivation of more sophisticated population activity from Hodgkin-Huxley-like bursting neurons, which will make it possible to capture synchronised bursting activity in hyper-excitable brain states (e.g. at the onset of epilepsy).
10
Qiu SW, Chow CC. Finite-size effects for spiking neural networks with spatially dependent coupling. Phys Rev E 2018; 98:062414. [PMID: 32478211] [PMCID: PMC7258138] [DOI: 10.1103/physreve.98.062414]
Abstract
We study finite-size fluctuations in a network of deterministic spiking neurons with nonuniform synaptic coupling. We generalize a previously developed theory of finite-size effects for globally coupled neurons with a uniform coupling function. In the uniform-coupling case, mean-field theory is well defined by averaging over the network as the number of neurons goes to infinity. However, for nonuniform coupling it is no longer possible to average over the entire network if we are interested in fluctuations at a particular location within it. We show that if the coupling function approaches a continuous function in the infinite-system-size limit, then an average over a local neighborhood can be defined such that mean-field theory is well defined for a spatially dependent field. We then use a path-integral formalism to derive a perturbation expansion in the inverse system size around the mean-field limit for the covariance of the input to a neuron (synaptic drive) and for firing-rate fluctuations due to deterministic finite-size effects.
Affiliation(s)
- Si-Wei Qiu
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institutes of Health (NIH), Bethesda, Maryland 20892, USA
- Carson C Chow
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institutes of Health (NIH), Bethesda, Maryland 20892, USA