1. Shao Y, Zhang J, Tao L. Dimensional reduction of emergent spatiotemporal cortical dynamics via a maximum entropy moment closure. PLoS Comput Biol 2020; 16:e1007265. PMID: 32516336; PMCID: PMC7304648; DOI: 10.1371/journal.pcbi.1007265.
Abstract
Modern electrophysiological recordings and optical imaging techniques have revealed a diverse spectrum of spatiotemporal neural activities underlying fundamental cognitive processing. Oscillations, traveling waves and other complex population dynamical patterns are often concomitant with sensory processing, information transfer, decision making and memory consolidation. While neural population models such as neural mass, population density and kinetic theoretical models have been used to capture a wide range of the experimentally observed dynamics, a full account of how the multi-scale dynamics emerges from the detailed biophysical properties of individual neurons and the network architecture remains elusive. Here we apply a recently developed coarse-graining framework for reduced-dimensional descriptions of neuronal networks to model visual cortical dynamics. We show, without introducing any new parameters, how a sequence of models culminating in an augmented system of spatially coupled ODEs can effectively model a wide range of the observed cortical dynamics, from orientation dynamics under visual stimulation to traveling waves induced by visual illusory stimuli. In addition to an efficient simulation method, this framework also offers an analytic approach to studying large-scale network dynamics. As such, the dimensional reduction naturally leads to mesoscopic variables that capture the interplay between neuronal population stochasticity and network architecture that we believe underlies many emergent cortical phenomena.
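The abstract's endpoint is an augmented system of spatially coupled ODEs for population activity. As a purely generic illustration of that class of model (not the paper's moment-closure equations; all parameters and the kernel shape are illustrative assumptions), a ring of firing-rate units with local excitation and broader inhibition can be integrated as coupled ODEs, and a slight kernel asymmetry can let spatial activity patterns drift, loosely analogous to traveling waves:

```python
import numpy as np

def simulate_ring(n=128, t_steps=2000, dt=0.05, seed=0):
    """Toy ring of firing-rate units: dr/dt = -r + [tanh(W r + I)]_+ .
    A locally excitatory, broader-inhibitory kernel with a small shift
    (all values illustrative) can support spatial patterning."""
    rng = np.random.default_rng(seed)
    x = np.arange(n)
    # distance on the ring between units
    d = np.minimum((x[None, :] - x[:, None]) % n, (x[:, None] - x[None, :]) % n)
    # Mexican-hat coupling: short-range excitation, longer-range inhibition
    W = 1.2 * np.exp(-(d / 4.0) ** 2) - 0.6 * np.exp(-(d / 12.0) ** 2)
    W = np.roll(W, 1, axis=1)   # slight asymmetry so a bump can drift
    W /= 4.0                    # overall gain (illustrative)
    r = 0.1 * rng.random(n)     # small random initial rates
    for _ in range(t_steps):
        r += dt * (-r + np.maximum(np.tanh(W @ r + 0.1), 0.0))
    return r

rates = simulate_ring()
```

The rectified sigmoid keeps rates nonnegative and bounded, so forward Euler with a modest step is stable here.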
Affiliation(s)
- Yuxiu Shao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, China
- Jiwei Zhang
- School of Mathematics and Statistics, and Hubei Key Laboratory of Computational Science, Wuhan University, China
- Louis Tao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, China
- Center for Quantitative Biology, Peking University, Beijing, China
2. An efficient population density method for modeling neural networks with synaptic dynamics manifesting finite relaxation time and short-term plasticity. eNeuro 2019; 5:eN-MNT-0002-18. PMID: 30662939; PMCID: PMC6336402; DOI: 10.1523/eneuro.0002-18.2018.
Abstract
When more realistic synaptic dynamics are incorporated, the computational efficiency of population density methods (PDMs) declines sharply because the dimension of the master equations grows. To avoid this decline, we develop an efficient PDM, termed the colored-synapse PDM (csPDM), in which the dimension of the master equations does not depend on the number of synapse-associated state variables in the underlying network model. Our goal is to allow the PDM to incorporate realistic synaptic dynamics that possess not only a finite relaxation time but also short-term plasticity (STP). The csPDM equations are derived from a diffusion approximation of the synaptic dynamics and from probability density function methods for Langevin equations with colored noise. Numerical examples, given by simulations of the population dynamics of uncoupled exponential integrate-and-fire (EIF) neurons, show good agreement between csPDM and Monte Carlo simulations (MCSs). Compared to the original full-dimensional PDM (fdPDM), the csPDM is far more computationally efficient because of the lower dimension of its master equations. In addition, it allows the network dynamics to exhibit the short-term plastic characteristics inherited from plastic synapses. The csPDM is potentially applicable to any spiking neuron model because it makes no assumptions about the neuronal dynamics, and, more importantly, this is the first PDM reported to successfully encompass short-term facilitation and depression.
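A minimal sketch of the kind of Monte Carlo reference the csPDM is validated against: uncoupled EIF neurons driven by a colored-noise (Ornstein-Uhlenbeck) input current with a finite relaxation time. This is a generic benchmark, not the paper's implementation; all parameter values are illustrative assumptions.

```python
import numpy as np

def eif_mc_rate(n=400, t=1.0, dt=1e-4, tau_m=0.02, tau_s=0.005,
                v_rest=-65.0, v_t=-59.0, delta_t=3.0, v_th=-30.0,
                v_reset=-68.0, mu=6.0, sigma=3.0, seed=1):
    """Monte Carlo population of uncoupled EIF neurons driven by an OU
    ('colored') input with relaxation time tau_s. Returns the mean
    population firing rate in Hz (all parameters illustrative)."""
    rng = np.random.default_rng(seed)
    v = np.full(n, v_rest)       # membrane potentials (mV)
    s = np.full(n, mu)           # OU input, stationary mean mu, std sigma
    spikes = 0
    for _ in range(int(t / dt)):
        # OU input: ds = -(s - mu)/tau_s dt + sigma*sqrt(2/tau_s) dW
        s += dt * (mu - s) / tau_s \
             + sigma * np.sqrt(2.0 * dt / tau_s) * rng.standard_normal(n)
        # EIF membrane: tau_m dv/dt = -(v - v_rest) + delta_t*exp((v-v_t)/delta_t) + s
        v += dt / tau_m * (-(v - v_rest)
                           + delta_t * np.exp((v - v_t) / delta_t) + s)
        fired = v >= v_th
        spikes += int(fired.sum())
        v[fired] = v_reset       # threshold-reset ends the spike upswing
    return spikes / (n * t)

rate = eif_mc_rate()
```

The csPDM would replace this whole ensemble with a one-dimensional master equation over v; the Monte Carlo above is what such a reduction is checked against.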
3. Large deviations for randomly connected neural networks: I. Spatially extended systems. Adv Appl Probab 2018. DOI: 10.1017/apr.2018.42.
Abstract
In a series of two papers, we investigate the large deviations and asymptotic behavior of stochastic models of brain neural networks with random interaction coefficients. In this first paper, we take into account the spatial structure of the brain and consider first the presence of interaction delays that depend on the distance between cells and then the Gaussian random interaction amplitude with a mean and variance that depend on the position of the neurons and scale as the inverse of the network size. We show that the empirical measure satisfies a large deviations principle with a good rate function reaching its minimum at a unique spatially extended probability measure. This result implies an averaged convergence of the empirical measure and a propagation of chaos. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a nonlocal Gaussian process with a mean and covariance that depend on the statistics of the solution over the whole neural field.
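For readers less familiar with the terminology, a large deviations principle (LDP) for the empirical measure takes, schematically, the following form; the symbols here are generic and not the paper's notation:

```latex
% Empirical measure of an N-neuron network:
%   \hat{\mu}_N = \frac{1}{N}\sum_{i=1}^{N} \delta_{x_i}
% Schematic LDP: for suitable sets A of measures,
\mathbb{P}\bigl(\hat{\mu}_N \in A\bigr)
  \asymp \exp\Bigl(-N \inf_{\mu \in A} I(\mu)\Bigr),
% where the good rate function I is lower semicontinuous with compact
% level sets. If I vanishes at a unique measure \mu^*, then
% \hat{\mu}_N \to \mu^*, which yields the averaged convergence and
% propagation of chaos described in the abstract.
```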
4. Barranca VJ, Kovačič G, Zhou D, Cai D. Network dynamics for optimal compressive-sensing input-signal recovery. Phys Rev E Stat Nonlin Soft Matter Phys 2014; 90:042908. PMID: 25375568; DOI: 10.1103/physreve.90.042908.
Abstract
By using compressive sensing (CS) theory, a broad class of static signals can be reconstructed through a sequence of very few measurements in the framework of a linear system. For networks with nonlinear and time-evolving dynamics, is it similarly possible to recover an unknown input signal from only a small number of network output measurements? We address this question for pulse-coupled networks and investigate the network dynamics necessary for successful input signal recovery. Determining the specific network characteristics that correspond to a minimal input reconstruction error, we are able to achieve high-quality signal reconstructions with few measurements of network output. Using various measures to characterize dynamical properties of network output, we determine that networks with highly variable and aperiodic output can successfully encode network input information with high fidelity and achieve the most accurate CS input reconstructions. For time-varying inputs, we also find that high-quality reconstructions are achievable by measuring network output over a relatively short time window. Even when network inputs change with time, the same optimal choice of network characteristics and corresponding dynamics apply as in the case of static inputs.
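For context on the CS framework the abstract builds on: a sparse signal can be recovered from far fewer linear measurements than its length. The sketch below uses orthogonal matching pursuit, a standard greedy CS solver; it is a generic illustration, not the pulse-coupled-network recovery scheme of the paper, and the problem sizes are arbitrary.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        # greedily pick the column most correlated with the residual
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # least-squares fit on the selected columns
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5                          # signal length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = omp(A, A @ x, k)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

With m = 60 Gaussian measurements of a 5-sparse length-200 signal, the greedy recovery is essentially exact; the paper's question is which *network dynamics* make the effective measurement operator this favorable.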
Affiliation(s)
- Victor J Barranca
- Courant Institute of Mathematical Sciences & Center for Neural Science, New York University, New York, New York 10012, USA and NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Gregor Kovačič
- Mathematical Sciences Department, Rensselaer Polytechnic Institute, Troy, New York 12180, USA
- Douglas Zhou
- Department of Mathematics, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- David Cai
- Courant Institute of Mathematical Sciences & Center for Neural Science, New York University, New York, New York 10012, USA and NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates and Department of Mathematics, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
5. Cáceres MJ, Perthame B. Beyond blow-up in excitatory integrate and fire neuronal networks: refractory period and spontaneous activity. J Theor Biol 2014; 350:81-89. DOI: 10.1016/j.jtbi.2014.02.005.
6. Ly C. A principled dimension-reduction method for the population density approach to modeling networks of neurons with synaptic dynamics. Neural Comput 2013; 25:2682-2708. PMID: 23777517; DOI: 10.1162/neco_a_00489.
Abstract
The population density approach to neural network modeling has been used in a variety of contexts. The idea is to group many similar noisy neurons into populations and to track, for each population, a probability density function giving the proportion of neurons in a particular state, rather than simulating individual neurons (i.e., Monte Carlo). It is commonly used both for analytic insight and as a time-saving computational tool. The main shortcoming of this method is that when realistic attributes are incorporated into the underlying neuron model, the dimension of the probability density function increases, leading to intractable equations or, at best, computationally intensive simulations. Thus, developing principled dimension-reduction methods is essential for the robustness of these powerful methods and would be of great value to the larger theoretical neuroscience community. To illustrate the method, we consider a single uncoupled population of leaky integrate-and-fire neurons receiving only external excitatory synaptic input. We present a dimension-reduction method that reduces a two-dimensional partial integro-differential equation to a computationally efficient one-dimensional system and gives qualitatively accurate results in both the steady-state and nonequilibrium regimes. The method, termed the modified mean-field method, is based entirely on the governing equations, relies on no auxiliary variables or parameters, and requires no fine-tuning. Its principles are potentially applicable to more realistic (i.e., higher-dimensional) neural networks.
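The object the population density approach tracks can be estimated directly by Monte Carlo: the histogram of membrane potentials across many uncoupled LIF neurons with Poisson excitatory input approximates the population density. A minimal sketch, with all parameter values illustrative rather than taken from the paper:

```python
import numpy as np

def lif_voltage_density(n=2000, t=0.5, dt=1e-4, tau=0.02,
                        rate=800.0, jump=0.5, v_reset=0.0, v_th=10.0,
                        bins=40, seed=2):
    """Monte Carlo estimate of the population voltage density for
    uncoupled leaky integrate-and-fire neurons driven by excitatory
    Poisson input (voltages in mV above rest; values illustrative)."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n)
    for _ in range(int(t / dt)):
        kicks = rng.poisson(rate * dt, n)   # excitatory arrivals per step
        v += -v * dt / tau + jump * kicks   # leak plus delta-impulse EPSPs
        v[v >= v_th] = v_reset              # threshold-reset
    # normalized histogram ~ the density a PDM evolves directly
    hist, edges = np.histogram(v, bins=bins, range=(v_reset, v_th),
                               density=True)
    return hist, edges

density, edges = lif_voltage_density()
```

A population density method evolves this density with a PDE instead of sampling it; the paper's contribution is keeping that PDE one-dimensional when synaptic state variables would otherwise raise its dimension.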
Affiliation(s)
- Cheng Ly
- Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, Richmond, VA 23284-3083, USA.
7. Perthame B, Salort D. On a voltage-conductance kinetic system for integrate & fire neural networks. Kinet Relat Models 2013; 6:841. DOI: 10.3934/krm.2013.6.841.
8. Rangan AV, Young LS. Dynamics of spiking neurons: between homogeneity and synchrony. J Comput Neurosci 2012; 34:433-460. PMID: 23096934; DOI: 10.1007/s10827-012-0429-1.
Abstract
Randomly connected networks of neurons driven by Poisson inputs are often assumed to produce "homogeneous" dynamics, characterized by largely independent firing and approximable by diffusion processes. At the same time, it is well known that such networks can fire synchronously. Between these two much-studied scenarios lies a vastly complex dynamical landscape that is relatively unexplored. In this paper, we discuss a phenomenon that commonly manifests in these intermediate regimes: brief spurts of spiking activity which we call multiple firing events (MFEs). These events depend neither on structured network architecture nor on structured input; they are an emergent property of the system. We came upon them in an earlier modeling paper, in which we discovered, through a careful benchmarking process, that MFEs are the single most important dynamical mechanism behind many of the V1 phenomena we were able to replicate. In this paper we explain in a simpler setting how MFEs come about, as well as their potential dynamic consequences. Although the mechanism underlying MFEs cannot easily be captured by current population dynamics models, this phenomenon should not be ignored during analysis; there is a growing body of evidence that such collaborative activity may be a key to unlocking the functional properties of many neuronal networks.
Affiliation(s)
- Aaditya V Rangan
- Courant Institute of Mathematical Sciences, New York University, New York, USA
9. Bressloff PC. Metastable states and quasicycles in a stochastic Wilson-Cowan model of neuronal population dynamics. Phys Rev E Stat Nonlin Soft Matter Phys 2010; 82:051903. PMID: 21230496; DOI: 10.1103/physreve.82.051903.
Abstract
We analyze a stochastic model of neuronal population dynamics with intrinsic noise. In the thermodynamic limit N → ∞, where N determines the size of each population, the dynamics is described by deterministic Wilson-Cowan equations. For finite N, on the other hand, the dynamics is described by a master equation that determines the probability of spiking activity within each population. We first consider a single excitatory population that exhibits bistability in the deterministic limit. The steady-state probability distribution of the stochastic network has maxima at points corresponding to the stable fixed points of the deterministic network; the relative weighting of the two maxima depends on the system size. For large but finite N, we calculate the exponentially small rate of noise-induced transitions between the resulting metastable states using a Wentzel-Kramers-Brillouin (WKB) approximation and matched asymptotic expansions. We then consider a two-population excitatory-inhibitory network that supports limit-cycle oscillations. Using a diffusion approximation, we reduce the dynamics to a neural Langevin equation and show how the intrinsic noise amplifies subthreshold oscillations (quasicycles).
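The neural Langevin reduction the abstract describes can be sketched as Euler-Maruyama integration of two-population Wilson-Cowan rate equations with additive noise scaling as 1/√N. This is a generic sketch under that scaling assumption; the coupling weights, inputs and noise form here are illustrative, not the paper's derived values.

```python
import numpy as np

def wilson_cowan_langevin(n_sys=400.0, t=50.0, dt=0.01, seed=3):
    """Euler-Maruyama integration of a two-population Wilson-Cowan
    model with intrinsic noise of strength ~1/sqrt(N); parameters
    are illustrative."""
    rng = np.random.default_rng(seed)
    f = lambda x: 1.0 / (1.0 + np.exp(-x))   # sigmoidal rate function
    e, i = 0.1, 0.1                          # excitatory / inhibitory activity
    w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0
    h_e, h_i = -4.0, -6.0                    # background inputs
    traj = np.empty((int(t / dt), 2))
    for step in range(traj.shape[0]):
        drift_e = -e + f(w_ee * e - w_ei * i + h_e)
        drift_i = -i + f(w_ie * e - w_ii * i + h_i)
        # intrinsic (finite-size) noise scales as 1/sqrt(N)
        e += dt * drift_e + np.sqrt(dt / n_sys) * rng.standard_normal()
        i += dt * drift_i + np.sqrt(dt / n_sys) * rng.standard_normal()
        # keep activities in the physical range [0, 1]
        e, i = np.clip(e, 0.0, 1.0), np.clip(i, 0.0, 1.0)
        traj[step] = e, i
    return traj

traj = wilson_cowan_langevin()
```

Near a deterministic focus, the 1/√N noise term continually re-excites the decaying oscillatory mode, which is the quasicycle amplification mechanism the paper analyzes.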
Affiliation(s)
- Paul C Bressloff
- Mathematical Institute, University of Oxford, 24-29 St. Giles', Oxford OX1 3LB, United Kingdom
10. Newhall KA, Kovačič G, Kramer PR, Cai D. Cascade-induced synchrony in stochastically driven neuronal networks. Phys Rev E Stat Nonlin Soft Matter Phys 2010; 82:041903. PMID: 21230309; DOI: 10.1103/physreve.82.041903.
Abstract
Perfect spike-to-spike synchrony is studied in all-to-all coupled networks of identical excitatory, current-based, integrate-and-fire neurons with delta-impulse coupling currents and Poisson spike-train external drive. This synchrony is induced by repeated cascading "total firing events," during which all neurons fire at once. In this regime, the network exhibits nearly periodic dynamics, switching between an effectively uncoupled state and a cascade-coupled total firing state. The probability of cascading total firing events occurring in the network is computed through a combinatorial analysis conditioned upon the random time when the first neuron fires and using the probability distribution of the subthreshold membrane potentials for the remaining neurons in the network. The probability distribution of the former is found from a first-passage-time problem described by a Fokker-Planck equation, which is solved analytically via an eigenfunction expansion. The latter is found using a central limit argument via a calculation of the cumulants of a single neuronal voltage. The influence of additional physiological effects that hinder or eliminate cascade-induced synchrony is also investigated. Conditions for the validity of the approximations made in the analytical derivations are discussed and verified via direct numerical simulations.
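The cascade mechanism can be sketched directly: in an all-to-all network with delta-impulse coupling, each spike instantaneously kicks every other neuron, and kicks from newly recruited neurons can push still more over threshold within the same instant. A minimal simulation of this type (a generic sketch; coupling strength, drive and network size are illustrative, not the paper's values):

```python
import numpy as np

def simulate_cascades(n=100, t=2.0, dt=1e-4, tau=0.02, nu=1000.0,
                      f_ext=0.045, s=0.015, v_th=1.0, seed=4):
    """All-to-all network of identical current-based IF neurons with
    delta-impulse coupling (strength s) and Poisson external drive.
    Returns the fraction of neurons firing at each time step; large
    fractions signal cascading firing events (values illustrative)."""
    rng = np.random.default_rng(seed)
    v = rng.random(n) * 0.5
    frac = np.empty(int(t / dt))
    for step in range(frac.size):
        # leak plus Poisson external kicks of size f_ext
        v += -v * dt / tau + f_ext * rng.poisson(nu * dt, n)
        fired = v >= v_th
        # resolve the instantaneous cascade within this time step
        while fired.any():
            idx = np.flatnonzero(fired)
            v[idx] = -np.inf                 # mark as fired this step
            v[v > -np.inf] += s * idx.size   # deliver coupling kicks
            fired = v >= v_th                # newly recruited neurons
        frac[step] = np.isneginf(v).mean()
        v[np.isneginf(v)] = 0.0              # reset fired neurons
    return frac

frac = simulate_cascades()
```

With s·n above threshold, a single spike can in principle recruit the whole network, which is the "total firing event" regime the paper's combinatorial analysis quantifies.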
Affiliation(s)
- Katherine A Newhall
- Mathematical Sciences Department, Rensselaer Polytechnic Institute, 110 8th Street, Troy, New York 12180, USA