1. Ponce-Alvarez A, Deco G. The Hopf whole-brain model and its linear approximation. Sci Rep 2024; 14:2615. PMID: 38297071; PMCID: PMC10831083; DOI: 10.1038/s41598-024-53105-0.
Abstract
Whole-brain models have proven useful for understanding the emergence of collective activity among neural populations or brain regions. These models combine connectivity matrices, or connectomes, with local node dynamics, noise, and, possibly, transmission delays. Multiple choices for the local dynamics have been proposed. Among them, nonlinear oscillators corresponding to a supercritical Hopf bifurcation have been used to link brain connectivity to collective phase and amplitude dynamics in different brain states. Here, we studied the linear fluctuations of this model to estimate its stationary statistics, i.e., the instantaneous and lagged covariances and the power spectral densities. This linear approximation, which holds in the case of heterogeneous parameters and time delays, allows analytical estimation of the statistics and can be used for fast parameter explorations to study changes in brain state, changes in brain activity due to alterations in structural connectivity, and modulations of parameters due to non-equilibrium dynamics.
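For orientation, the Hopf whole-brain model referred to here couples Stuart-Landau oscillators (the normal form of a supercritical Hopf bifurcation) through a connectome. Below is a minimal simulation sketch with an illustrative random connectome and assumed parameter values, not those of the paper:

```python
import numpy as np

def simulate_hopf(C, a, omega, G=0.5, beta=0.02, dt=1e-2, T=100.0, seed=0):
    """Euler-Maruyama integration of coupled Hopf (Stuart-Landau) nodes:
    dz_j = [(a_j + i*omega_j) z_j - |z_j|^2 z_j
            + G * sum_k C_jk (z_k - z_j)] dt + beta dW_j."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    z = 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    steps = int(T / dt)
    traj = np.empty((steps, n), dtype=complex)
    for t in range(steps):
        coupling = G * (C @ z - C.sum(axis=1) * z)   # diffusive coupling
        drift = (a + 1j * omega) * z - np.abs(z) ** 2 * z + coupling
        dW = np.sqrt(dt) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
        z = z + drift * dt + beta * dW
        traj[t] = z
    return traj

# Toy symmetric connectome (illustrative only, not an empirical connectome)
rng = np.random.default_rng(1)
W = rng.random((5, 5))
C = (W + W.T) / 2
np.fill_diagonal(C, 0.0)
a = np.full(5, -0.05)      # slightly subcritical: noise-driven fluctuations
omega = 2 * np.pi * rng.uniform(0.04, 0.07, 5)   # intrinsic node frequencies
traj = simulate_hopf(C, a, omega)
```

In this subcritical regime the fluctuations stay close to the origin, which is what makes the covariances and power spectra analytically tractable in the linear approximation studied by the paper.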
Affiliation(s)
- Adrián Ponce-Alvarez
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, 08028 Barcelona, Spain
- Gustavo Deco
- Center for Brain and Cognition, Computational Neuroscience Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08005 Barcelona, Spain
- Institució Catalana de la Recerca i Estudis Avançats (ICREA), 08010 Barcelona, Spain
2. Nandi MK, de Candia A, Sarracino A, Herrmann HJ, de Arcangelis L. Fluctuation-dissipation relations in the imbalanced Wilson-Cowan model. Phys Rev E 2023; 107:064307. PMID: 37464662; DOI: 10.1103/physreve.107.064307.
Abstract
The relation between spontaneous and stimulated brain activity is a fundamental question in neuroscience that has received wide attention in experimental studies. Recently, it has been suggested that the evoked response to external stimuli can be predicted from temporal correlations of spontaneous activity. Previous theoretical results, confirmed by comparison with magnetoencephalography data from human brains, were obtained for the Wilson-Cowan model in the condition of balanced excitation and inhibition, a signature of a healthy brain. Here we extend previous studies to imbalanced conditions by examining a region of parameter space around the balanced fixed point. Analytical results are compared to numerical simulations of Wilson-Cowan networks. We show that in imbalanced conditions the functional forms of the time-correlation and response functions can display several behaviors, including an oscillating regime caused by the emergence of complex eigenvalues. The analytical predictions fully agree with numerical simulations, validating the role of cross-correlations in the response function. Furthermore, we identify the leading role of inhibitory neurons in controlling the overall activity of the system, tuning the level of excitability and imbalance.
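For reference, the deterministic Wilson-Cowan rate equations are dE/dt = -E + S(w_EE E - w_EI I + h_E) and dI/dt = -I + S(w_IE E - w_II I + h_I), with a sigmoidal gain S. The sketch below integrates them to a fixed point and computes the Jacobian there; the eigenvalues of this Jacobian control the shape of the correlation and response functions, and a complex pair produces the oscillating regime described above. All weights and inputs are illustrative, not the paper's values:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative (weak-coupling) parameters, not those used in the paper
w_ee, w_ei, w_ie, w_ii = 1.5, 1.0, 1.0, 0.5
h_e, h_i = 0.0, 0.0

def rhs(E, I):
    """Deterministic Wilson-Cowan rate equations."""
    dE = -E + sigmoid(w_ee * E - w_ei * I + h_e)
    dI = -I + sigmoid(w_ie * E - w_ii * I + h_i)
    return dE, dI

# Forward-Euler integration to the fixed point
E, I, dt = 0.2, 0.2, 0.05
for _ in range(10000):
    dE, dI = rhs(E, I)
    E, I = E + dt * dE, I + dt * dI

# Jacobian at the fixed point; its eigenvalues set the decay of the
# correlation/response functions (a complex pair -> damped oscillations)
sE = sigmoid(w_ee * E - w_ei * I + h_e)
sI = sigmoid(w_ie * E - w_ii * I + h_i)
dsE, dsI = sE * (1 - sE), sI * (1 - sI)   # sigmoid'(x) = s(x)(1 - s(x))
J = np.array([[-1 + w_ee * dsE, -w_ei * dsE],
              [w_ie * dsI, -1 - w_ii * dsI]])
eigvals = np.linalg.eigvals(J)
```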
Affiliation(s)
- Manoj Kumar Nandi
- Department of Engineering, University of Campania "Luigi Vanvitelli", 81031 Aversa (Caserta), Italy
- Antonio de Candia
- Department of Physics "E. Pancini", University of Naples Federico II, 80126 Naples, Italy
- INFN, Section of Naples, Gruppo collegato di Salerno, 84084 Fisciano, Italy
- Alessandro Sarracino
- Department of Engineering, University of Campania "Luigi Vanvitelli", 81031 Aversa (Caserta), Italy
- Institute for Complex Systems-CNR, Piazzale Aldo Moro 2, 00185 Rome, Italy
- Hans J Herrmann
- PMMH, ESPCI, 7 quai St. Bernard, Paris 75005, France
- Department of Physics, Federal University of Ceará, Fortaleza, Ceará 60451-970, Brazil
- Lucilla de Arcangelis
- Department of Mathematics & Physics, University of Campania "Luigi Vanvitelli", Viale Lincoln 5, 81100 Caserta, Italy
3. Labay-Mora A, Zambrini R, Giorgi GL. Quantum Associative Memory with a Single Driven-Dissipative Nonlinear Oscillator. Phys Rev Lett 2023; 130:190602. PMID: 37243658; DOI: 10.1103/physrevlett.130.190602.
Abstract
Algorithms for associative memory typically rely on a network of many connected units. The prototypical example is the Hopfield model, whose generalizations to the quantum realm are mainly based on open quantum Ising models. We propose a realization of associative memory with a single driven-dissipative quantum oscillator, exploiting its infinite degrees of freedom in phase space. The model can improve the storage capacity of discrete neuron-based systems in a large regime, and we prove successful state discrimination between n coherent states, which represent the stored patterns of the system. These patterns can be tuned continuously by modifying the driving strength, constituting a modified learning rule. We show that the associative-memory capability is inherently related to the existence of a spectral separation in the Liouvillian superoperator, which results in a long timescale separation in the dynamics, corresponding to a metastable phase.
Affiliation(s)
- Adrià Labay-Mora
- Institute for Cross Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca, Spain
- Roberta Zambrini
- Institute for Cross Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca, Spain
- Gian Luca Giorgi
- Institute for Cross Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca, Spain
4. Alvankar Golpayegan H, de Candia A. Bistability and criticality in the stochastic Wilson-Cowan model. Phys Rev E 2023; 107:034404. PMID: 37073019; DOI: 10.1103/physreve.107.034404.
Abstract
We study a stochastic version of the Wilson-Cowan model of neural dynamics, where the response function of neurons grows faster than linearly above the threshold. The model shows a region of parameters where two attractive fixed points of the dynamics exist simultaneously. One fixed point is characterized by lower activity and scale-free critical behavior, while the second fixed point corresponds to a higher (supercritical) persistent activity, with small fluctuations around a mean value. When the number of neurons is not too large, the system can switch between these two different states with a probability depending on the parameters of the network. Along with alternation of states, the model displays a bimodal distribution of the avalanches of activity, with a power-law behavior corresponding to the critical state, and a bump of very large avalanches due to the high-activity supercritical state. The bistability is due to the presence of a first-order (discontinuous) transition in the phase diagram, and the observed critical behavior is connected with the line where the low-activity state becomes unstable (spinodal line).
Affiliation(s)
- Hanieh Alvankar Golpayegan
- Dipartimento di Neuroscienze, Scienze Riproduttive e Odontostomatologiche, Università di Napoli Federico II, Via S. Pansini 5, 80131 Napoli, Italy
- Antonio de Candia
- Dipartimento di Fisica "E. Pancini", Università di Napoli Federico II, Complesso Universitario di Monte Sant'Angelo, via Cintia, 80126 Napoli, Italy
- INFN, Sezione di Napoli, Gruppo collegato di Salerno, 84084 Fisciano (SA), Italy
5. Moosavi SA, Truccolo W. Criticality in probabilistic models of spreading dynamics in brain networks: Epileptic seizures. PLoS Comput Biol 2023; 19:e1010852. PMID: 36749796; PMCID: PMC9904505; DOI: 10.1371/journal.pcbi.1010852.
Abstract
The spread of seizures across brain networks is the main impairing factor in people with epilepsy, often leading to loss of consciousness. Despite advances in recording and modeling brain activity, uncovering the nature of seizure-spreading dynamics remains an important challenge for understanding and treating pharmacologically resistant epilepsy. To address this challenge, we introduce a new probabilistic model that captures the spreading dynamics in patient-specific complex networks. Network connectivity and interaction time delays between brain areas were estimated from white-matter tractography. The model's computational tractability allows it to play an important complementary role to more detailed models of seizure dynamics. We illustrate model fitting and predictive performance in the context of patient-specific Epileptor networks. We derive the phase diagram of spread size (order parameter) as a function of brain excitability and global connectivity strength for different patient-specific networks. Phase diagrams allow the prediction of whether a seizure will spread, depending on excitability and connectivity strength. In addition, model simulations predict the temporal order of seizure spread across network nodes. Furthermore, we show that the order parameter can exhibit both discontinuous and continuous (critical) phase transitions as neural excitability and connectivity strength are varied. The existence of a critical point, where response functions and fluctuations in spread size show power-law divergence with respect to control parameters, is supported by mean-field approximations and finite-size scaling analyses. Notably, the critical point separates two distinct regimes of spreading dynamics, characterized by unimodal and bimodal spread-size distributions. Our study sheds new light on the nature of phase transitions and fluctuations in seizure-spreading dynamics. We expect it to play an important role in the development of closed-loop stimulation approaches for preventing seizure spread in pharmacologically resistant epilepsy. Our findings may also be of interest for related models of spreading dynamics in epidemiology, biology, finance, and statistical physics.
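The order-parameter picture (spread size versus excitability and coupling) can be illustrated with a generic probabilistic spreading process on a random graph. The following is a toy sketch, not the paper's patient-specific, delay-aware model; the activation probability p stands in for excitability/connectivity strength:

```python
import numpy as np

def spread_size(adj, p, seed_node=0, rng=None):
    """One realization of probabilistic spreading on a graph: each newly
    active node tries once to activate every inactive neighbor,
    succeeding independently with probability p (an SIR-like process)."""
    if rng is None:
        rng = np.random.default_rng()
    active = np.zeros(adj.shape[0], dtype=bool)
    active[seed_node] = True
    frontier = [seed_node]
    while frontier:
        new = []
        for u in frontier:
            for v in np.flatnonzero(adj[u]):
                if not active[v] and rng.random() < p:
                    active[v] = True
                    new.append(v)
        frontier = new
    return int(active.sum())

# Toy Erdos-Renyi network (illustrative, not a tractography-based connectome)
rng = np.random.default_rng(0)
n = 200
adj = (rng.random((n, n)) < 0.03).astype(int)
adj = np.triu(adj, 1)
adj = adj + adj.T                 # symmetric, no self-loops
adj[0, 1] = adj[1, 0] = 1         # make sure the seed node is connected
means = {}
for p in (0.05, 0.6):             # subcritical vs supercritical spreading
    sizes = [spread_size(adj, p, rng=rng) for _ in range(50)]
    means[p] = float(np.mean(sizes))
```

Sweeping p in such a toy reproduces the qualitative transition from localized to network-wide spread that the paper maps out in patient-specific phase diagrams.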
Affiliation(s)
- S Amin Moosavi
- Department of Neuroscience, Brown University, Providence, Rhode Island, United States of America
- Wilson Truccolo
- Department of Neuroscience, Brown University, Providence, Rhode Island, United States of America
- Carney Institute for Brain Science, Brown University, Providence, Rhode Island, United States of America
6. Pietras B, Schmutz V, Schwalger T. Mesoscopic description of hippocampal replay and metastability in spiking neural networks with short-term plasticity. PLoS Comput Biol 2022; 18:e1010809. PMID: 36548392; PMCID: PMC9822116; DOI: 10.1371/journal.pcbi.1010809.
Abstract
Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of such functional activity patterns is hippocampal replay: propagating bursts of place-cell activity that are critical for memory consolidation. The sudden and repeated occurrence of these burst states during ongoing neural activity suggests metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model that accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to stochastic spiking and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie to obtain a "chemical Langevin equation", which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and up-down-state dynamics) by means of phase-plane analysis. An extension of the Langevin equation for small network sizes is also presented. The stochastic neural mass model constitutes the basic component of our mesoscopic model for replay. We show that the mesoscopic model faithfully captures the statistical structure of individual replayed trajectories in microscopic simulations and in previously reported experimental data. Moreover, compared to the deterministic Romani-Tsodyks model of place-cell dynamics, it exhibits a higher level of variability in the order, direction, and timing of replayed trajectories, which seems biologically more plausible and could be functionally desirable. This variability is the product of a new dynamical regime in which metastability emerges from a complex interplay between finite-size fluctuations and local fatigue.
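The "chemical Langevin" idea, deterministic rate drift plus finite-size noise whose variance scales like 1/N, can be illustrated with a one-population caricature. The transfer function, weights, and noise form below are assumptions for illustration, not the mesoscopic equations derived in the paper:

```python
import numpy as np

def stochastic_neural_mass(N=200, w=2.0, h=-1.0, tau=0.01,
                           dt=1e-4, T=1.0, seed=0):
    """Schematic 'chemical Langevin'-style neural mass: the population
    activity a(t) relaxes toward f(w*a + h), with finite-size noise whose
    variance scales like 1/N (hypothetical form, for illustration only)."""
    rng = np.random.default_rng(seed)
    f = lambda x: 1.0 / (1.0 + np.exp(-x))    # population transfer function
    a = 0.5
    steps = int(T / dt)
    traj = np.empty(steps)
    for i in range(steps):
        rate = f(w * a + h)
        drift = (-a + rate) / tau
        diffusion = np.sqrt((a + rate) / (N * tau))   # 1/sqrt(N) fluctuations
        a += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        a = max(a, 0.0)                        # activity stays nonnegative
        traj[i] = a
    return traj

traj = stochastic_neural_mass()
```

As N grows the fluctuations shrink and the trajectory collapses onto the deterministic rate dynamics, which is the sense in which such equations interpolate between microscopic spiking noise and a neural mass model.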
Affiliation(s)
- Bastian Pietras
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Valentin Schmutz
- Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Tilo Schwalger
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
7. Roberts PD, Conour J. Mechanistic modeling as an explanatory tool for clinical treatment of chronic catatonia. Front Pharmacol 2022; 13:1025417. PMID: 36438845; PMCID: PMC9682077; DOI: 10.3389/fphar.2022.1025417.
Abstract
Mathematical modeling of neural systems is an effective means to integrate complex information about the brain into a numerical tool that can help explain observations. However, the use of neural models to inform clinical decisions has been limited. In this study, we use a simple model of brain circuitry, the Wilson-Cowan model, to predict changes in a clinical measure for catatonia, the Bush-Francis Catatonia Rating Scale, for use in clinical treatment of schizophrenia. This computational tool can then be used to better understand mechanisms of action for pharmaceutical treatments and to fine-tune dosage in individual cases. We present the conditions of clinical care for a residential patient cohort and describe methods for synthesizing data to demonstrate the functioning of the model. We then show that the model can be used to explain effect sizes of treatments and to estimate outcomes for combinations of medications. We conclude with a demonstration of how this model could be personalized for individual patients to inform ongoing treatment protocols.
Affiliation(s)
- Patrick D. Roberts
- Amazon Web Services, Portland, OR, United States
- James Conour
- Cascadia Behavioral Healthcare, Portland, OR, United States
8. Metastable spiking networks in the replica-mean-field limit. PLoS Comput Biol 2022; 18:e1010215. PMID: 35714155; PMCID: PMC9246178; DOI: 10.1371/journal.pcbi.1010215.
Abstract
Characterizing metastable neural dynamics in finite-size spiking networks remains a daunting challenge. We propose to address this challenge in the recently introduced replica-mean-field (RMF) limit. In this limit, networks are made of infinitely many replicas of the finite network of interest, but with randomized interactions across replicas. Such randomization renders certain excitatory networks fully tractable at the cost of neglecting activity correlations, but with explicit dependence on the finite size of the neural constituents. However, metastable dynamics typically unfold in networks with mixed inhibition and excitation. Here, we extend the RMF computational framework to point-process-based neural network models with exponential stochastic intensities, allowing for mixed excitation and inhibition. Within this setting, we show that metastable finite-size networks admit multistable RMF limits, which are fully characterized by stationary firing rates. Technically, these stationary rates are determined as the solutions of a set of delayed differential equations under certain regularity conditions that any physical solutions shall satisfy. We solve this original problem by combining the resolvent formalism and singular-perturbation theory. Importantly, we find that these rates specify probabilistic pseudo-equilibria which accurately capture the neural variability observed in the original finite-size network. We also discuss the emergence of metastability as a stochastic bifurcation, which can be interpreted as a static phase transition in the RMF limits. In turn, we expect to leverage the static picture of RMF limits to infer purely dynamical features of metastable finite-size networks, such as the transition rates between pseudo-equilibria.
9. Brinkman BAW, Yan H, Maffei A, Park IM, Fontanini A, Wang J, La Camera G. Metastable dynamics of neural circuits and networks. Appl Phys Rev 2022; 9:011313. PMID: 35284030; PMCID: PMC8900181; DOI: 10.1063/5.0062603.
Abstract
Cortical neurons emit seemingly erratic trains of action potentials, or "spikes," and neural network dynamics emerge from the coordinated spiking activity within neural circuits. These rich dynamics manifest themselves in a variety of patterns, which emerge spontaneously or in response to incoming activity produced by sensory inputs. In this Review, we focus on neural dynamics that are best understood as a sequence of repeated activations of a number of discrete hidden states. These transiently occupied states are termed "metastable" and have been linked to important sensory and cognitive functions. In the rodent gustatory cortex, for instance, metastable dynamics have been associated with stimulus coding, with states of expectation, and with decision making. In frontal, parietal, and motor areas of macaques, metastable activity has been related to behavioral performance, choice behavior, task difficulty, and attention. In this article, we review the experimental evidence for neural metastable dynamics together with theoretical approaches to the study of metastable activity in neural circuits. These approaches include (i) a theoretical framework based on non-equilibrium statistical physics for network dynamics; (ii) statistical approaches to extract information about metastable states from a variety of neural signals; and (iii) recent neural network approaches, informed by experimental results, to model the emergence of metastable dynamics. By discussing these topics, we aim to provide a cohesive view of how transitions between different states of activity may provide the neural underpinnings for essential functions such as perception, memory, expectation, or decision making, and, more generally, how the study of metastable neural activity may advance our understanding of neural circuit function in health and disease.
Affiliation(s)
- H. Yan
- State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun, Jilin 130022, People's Republic of China
- J. Wang
- G. La Camera
10. Pérez-Cervera A, Lindner B, Thomas PJ. Isostables for Stochastic Oscillators. Phys Rev Lett 2021; 127:254101. PMID: 35029447; DOI: 10.1103/physrevlett.127.254101.
Abstract
Thomas and Lindner [P. J. Thomas and B. Lindner, Phys. Rev. Lett. 113, 254101 (2014); DOI: 10.1103/PhysRevLett.113.254101] defined an asymptotic phase for stochastic oscillators as the angle in the complex plane made by the eigenfunction of the backward Kolmogorov (or stochastic Koopman) operator whose complex eigenvalue has the least negative real part. We complete the phase-amplitude description of noisy oscillators by defining the stochastic isostable coordinate as the eigenfunction associated with the least negative nontrivial real eigenvalue. Our results suggest a framework for stochastic limit-cycle dynamics that encompasses noise-induced oscillations.
Affiliation(s)
- Alberto Pérez-Cervera
- National Research University Higher School of Economics, 109208 Moscow, Russia
- Instituto de Matemática Interdisciplinar, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Institute of Physics, Humboldt University at Berlin, Newtonstraße 15, D-12489 Berlin, Germany
- Peter J Thomas
- Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University, Cleveland, Ohio 44106, USA
11. Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. PMID: 32942450; DOI: 10.1103/physreve.102.022407.
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
12. Morrison CL, Greenwood PE, Ward LM. Plastic systemic inhibition controls amplitude while allowing phase pattern in a stochastic neural field model. Phys Rev E 2021; 103:032311. PMID: 33862754; DOI: 10.1103/physreve.103.032311.
Abstract
We investigate oscillatory phase-pattern formation and amplitude control in a linearized stochastic neural field model by simulating Mexican-hat-coupled stochastic processes. We find, for several choices of parameters, that spatial pattern formation in the temporal phases of the coupled processes occurs if and only if their amplitudes are allowed to grow unrealistically large. Stimulated by recent work on homeostatic inhibitory plasticity, we introduce static and plastic (adaptive) systemic inhibitory mechanisms to keep the amplitudes stochastically bounded. We find that systems with static inhibition exhibit bounded amplitudes but no sustained phase patterns. With plastic systemic inhibition, on the other hand, the resulting systems exhibit both bounded amplitudes and sustained phase patterns. These results demonstrate that plastic inhibitory mechanisms in neural field models can dynamically control amplitudes while allowing patterns of phase synchronization to develop. Similar mechanisms of plastic systemic inhibition could play a role in regulating oscillatory functioning in the brain.
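A minimal sketch of a Mexican-hat-coupled linearized stochastic field of the kind simulated here, with an assumed difference-of-Gaussians kernel and illustrative parameters (no inhibitory control loop is included):

```python
import numpy as np

def mexican_hat_matrix(n, sigma_e=2.0, sigma_i=4.0, a_e=1.0, a_i=0.8):
    """Difference-of-Gaussians ('Mexican hat') coupling on a ring of n sites:
    short-range excitation minus broader inhibition."""
    d = np.minimum(np.arange(n), n - np.arange(n))      # circular distances
    k = (a_e * np.exp(-d**2 / (2 * sigma_e**2))
         - a_i * np.exp(-d**2 / (2 * sigma_i**2)))
    return np.array([np.roll(k, i) for i in range(n)])

def simulate_field(n=64, g=0.1, noise=0.1, dt=0.01, steps=5000, seed=0):
    """Linearized stochastic field: du = (-u + g * K u) dt + noise dW."""
    rng = np.random.default_rng(seed)
    K = mexican_hat_matrix(n)
    u = np.zeros(n)
    for _ in range(steps):
        u += (-u + g * (K @ u)) * dt + noise * np.sqrt(dt) * rng.standard_normal(n)
    return u

u = simulate_field()
```

With the weak gain g chosen here the amplitudes stay bounded on their own; the paper's point is that gains strong enough to support phase patterns require an inhibitory mechanism to play that stabilizing role.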
Affiliation(s)
- Conor L Morrison
- Department of Statistics, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z4
- Priscilla E Greenwood
- Department of Mathematics, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z2
- Lawrence M Ward
- Department of Psychology and Djavad Mowafaghian Centre for Brain Health, 2136 West Mall, University of British Columbia, Vancouver, British Columbia, Canada V6T 1Z4
13. Schwalger T, Chizhov AV. Mind the last spike - firing rate models for mesoscopic populations of spiking neurons. Curr Opin Neurobiol 2019; 58:155-166. PMID: 31590003; DOI: 10.1016/j.conb.2019.08.003.
Abstract
The dominant modeling framework for understanding cortical computations is that of heuristic firing-rate models. Despite their success, these models fall short of capturing spike-synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved with the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. These new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
Affiliation(s)
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Anton V Chizhov
- Ioffe Institute, 194021 Saint-Petersburg, Russia
- Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, 194223 Saint-Petersburg, Russia
14. Comparison of Deterministic and Stochastic Regime in a Model for Cdc42 Oscillations in Fission Yeast. Bull Math Biol 2019; 81:1268-1302. PMID: 30756233; DOI: 10.1007/s11538-019-00573-5.
Abstract
Oscillations occur in a wide variety of essential cellular processes, such as cell cycle progression, circadian clocks and calcium signaling in response to stimuli. It remains unclear how intrinsic stochasticity can influence these oscillatory systems. Here, we focus on oscillations of Cdc42 GTPase in fission yeast. We extend our previous deterministic model by Xu and Jilkine to construct a stochastic model, focusing on the fast diffusion case. We use SSA (Gillespie's algorithm) to numerically explore the low copy number regime in this model, and use analytical techniques to study the long-time behavior of the stochastic model and compare it to the equilibria of its deterministic counterpart. Numerical solutions suggest noisy limit cycles exist in the parameter regime in which the deterministic system converges to a stable limit cycle, and quasi-cycles exist in the parameter regime where the deterministic model has a damped oscillation. Near an infinite period bifurcation point, the deterministic model has a sustained oscillation, while stochastic trajectories start with an oscillatory mode and tend to approach deterministic steady states. In the low copy number regime, metastable transitions from oscillatory to steady behavior occur in the stochastic model. Our work contributes to the understanding of how stochastic chemical kinetics can affect a finite-dimensional dynamical system, and destabilize a deterministic steady state leading to oscillations.
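Gillespie's SSA, used here to explore the low-copy-number regime, draws an exponential waiting time from the total propensity and then selects a reaction in proportion to its rate. A hedged minimal example for a birth-death process (not the paper's Cdc42 reaction network):

```python
import numpy as np

def gillespie_birth_death(k_birth=10.0, k_death=0.1, n0=0, t_max=100.0, seed=0):
    """Gillespie SSA for a birth-death process:
    0 -> X at rate k_birth, X -> 0 at rate k_death * n.
    The stationary copy-number distribution is Poisson with mean
    k_birth / k_death."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_max:
        rates = np.array([k_birth, k_death * n])
        total = rates.sum()
        t += rng.exponential(1.0 / total)       # waiting time to next event
        if rng.random() * total < rates[0]:     # pick reaction by propensity
            n += 1
        else:
            n -= 1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
```

The same two steps (exponential waiting time, propensity-weighted reaction choice) carry over unchanged to multi-species networks such as the Cdc42 model; only the propensity vector and the state updates grow.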
15
Li Y, Liu X. Noise induced escape in one-population and two-population stochastic neural networks with internal states. Chaos 2019; 29:023137. [PMID: 30823742 DOI: 10.1063/1.5055051] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/06/2018] [Accepted: 01/24/2019] [Indexed: 06/09/2023]
Abstract
In the present paper, escapes from the basins of fixed points induced by intrinsic noise are investigated in both one- and two-population stochastic hybrid neural networks. In the weak-noise limit, the quasipotentials are computed by applying the WKB approximation to the original hybrid system and by using the quasi-steady-state (QSS) diffusion approximation. The two results are consistent within the neighborhood of a fixed point but diverge noticeably elsewhere, and the reason for this discrepancy is explored and explained. Furthermore, the relationship between the fluctuational and relaxational paths is analyzed, from which specific results for the hybrid system are obtained. For the two-population model, the phenomenon of saddle-point avoidance is investigated using both theoretical and numerical methods. Finally, the topological structure of the Lagrangian manifold is analyzed; its particular features, and behavior analogous to that of the stochastic differential equation, are found to depend on the accuracy of the QSS approximation in the vicinity of the saddle point.
Affiliation(s)
- Yang Li
- State Key Laboratory of Mechanics and Control of Mechanical Structures, College of Aerospace Engineering, Nanjing University of Aeronautics and Astronautics, 29 Yudao Street, Nanjing 210016, China
- Xianbin Liu
- State Key Laboratory of Mechanics and Control of Mechanical Structures, College of Aerospace Engineering, Nanjing University of Aeronautics and Astronautics, 29 Yudao Street, Nanjing 210016, China
16
MacLaurin J, Salhi J, Toumi S. Mean field dynamics of a Wilson–Cowan neuronal network with nonlinear coupling term. STOCH DYNAM 2018. [DOI: 10.1142/s0219493718500466] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
In this paper we prove the propagation of chaos property for an ensemble of interacting neurons subject to independent Brownian noise. The propagation of chaos property means that in the large network size limit, the neurons behave as if they are probabilistically independent. The model for the internal dynamics of the neurons is taken to be that of Wilson and Cowan, and we consider there to be multiple different populations. The synaptic connections are modeled with a nonlinear “electrical” model. The nonlinearity of the synaptic connections means that our model lies outside the scope of classical propagation of chaos results. We obtain the propagation of chaos result by taking advantage of the fact that the mean-field equations are Gaussian, which allows us to use Borell’s Inequality to prove that its tails decay exponentially.
Affiliation(s)
- Jamil Salhi
- NeuroMathcomp, Inria Sophia Antipolis, Lamsin-Enit, Tunis El Manar University, Tunisia
- Salwa Toumi
- Lamsin-Enit, Tunis El Manar University, Insat, Carthage University, Tunisia
17
Brackston RD, Wynn A, Stumpf MPH. Construction of quasipotentials for stochastic dynamical systems: An optimization approach. Phys Rev E 2018; 98:022136. [PMID: 30253467 DOI: 10.1103/physreve.98.022136] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2018] [Indexed: 06/08/2023]
Abstract
The construction of effective and informative landscapes for stochastic dynamical systems has proven a long-standing and complex problem. In many situations, the dynamics may be described by a Langevin equation, and constructing a landscape comes down to obtaining the quasipotential, a scalar function that quantifies the likelihood of reaching each point in the state space. In this work we provide a novel method for constructing such landscapes by extending a tool from control theory: the sum-of-squares method for generating Lyapunov functions. Applicable to any system described by polynomials, this method provides an analytical polynomial expression for the potential landscape, in which the coefficients of the polynomial are obtained via a convex optimization problem. The resulting landscapes are based on a decomposition of the deterministic dynamics of the original system into the gradient of the potential and a remaining "curl" component. By satisfying the condition that the inner product of the gradient of the potential and the remaining dynamics is everywhere negative, our derived landscapes provide both upper and lower bounds on the true quasipotential; these bounds become tight if the decomposition is orthogonal. The method is demonstrated to correctly compute the quasipotential for high-dimensional linear systems and also for a number of nonlinear examples.
Affiliation(s)
- R D Brackston
- Department of Life Sciences, Imperial College London, London SW7 2AZ, United Kingdom
- A Wynn
- Department of Aeronautics, Imperial College London, London SW7 2AZ, United Kingdom
- M P H Stumpf
- Department of Life Sciences, Imperial College London, London SW7 2AZ, United Kingdom
- School of BioScience and School of Mathematics and Statistics, University of Melbourne, Melbourne, Australia
18
Feudel U, Pisarchik AN, Showalter K. Multistability and tipping: From mathematics and physics to climate and brain-Minireview and preface to the focus issue. Chaos 2018; 28:033501. [PMID: 29604626 DOI: 10.1063/1.5027718] [Citation(s) in RCA: 36] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Multistability refers to the coexistence of different stable states in nonlinear dynamical systems. This phenomenon has been observed in laboratory experiments and in nature. In this introduction, we briefly introduce the classes of dynamical systems in which this phenomenon has been found and discuss the extension to new system classes. Furthermore, we introduce the concept of critical transitions and discuss approaches to distinguish them according to their characteristics. Finally, we present some specific applications in physics, neuroscience, biology, ecology, and climate science.
Affiliation(s)
- Ulrike Feudel
- Theoretical Physics/Complex Systems, ICBM, University of Oldenburg, 26129 Oldenburg, Germany
- Alexander N Pisarchik
- Center for Biomedical Technology, Technical University of Madrid, Campus Montegancedo, 28223 Pozuelo de Alarcon, Madrid, Spain
- Kenneth Showalter
- C. Eugene Bennett Department of Chemistry, West Virginia University, Morgantown, West Virginia 26506-6045, USA
19
Landau-Ginzburg theory of cortex dynamics: Scale-free avalanches emerge at the edge of synchronization. Proc Natl Acad Sci U S A 2018; 115:E1356-E1365. [PMID: 29378970 DOI: 10.1073/pnas.1712989115] [Citation(s) in RCA: 76] [Impact Index Per Article: 12.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Understanding the origin, nature, and functional significance of complex patterns of neural activity, as recorded by diverse electrophysiological and neuroimaging techniques, is a central challenge in neuroscience. Such patterns include collective oscillations emerging out of neural synchronization as well as highly heterogeneous outbursts of activity interspersed by periods of quiescence, called "neuronal avalanches." Much debate has been generated about the possible scale invariance or criticality of such avalanches and its relevance for brain function. Aimed at shedding light onto this, here we analyze the large-scale collective properties of the cortex by using a mesoscopic approach following the principle of parsimony of Landau-Ginzburg. Our model is similar to that of Wilson-Cowan for neural dynamics but crucially, includes stochasticity and space; synaptic plasticity and inhibition are considered as possible regulatory mechanisms. Detailed analyses uncover a phase diagram including down-state, synchronous, asynchronous, and up-state phases and reveal that empirical findings for neuronal avalanches are consistently reproduced by tuning our model to the edge of synchronization. This reveals that the putative criticality of cortical dynamics does not correspond to a quiescent-to-active phase transition as usually assumed in theoretical approaches but to a synchronization phase transition, at which incipient oscillations and scale-free avalanches coexist. Furthermore, our model also accounts for up and down states as they occur (e.g., during deep sleep). This approach constitutes a framework to rationalize the possible collective phases and phase transitions of cortical networks in simple terms, thus helping to shed light on basic aspects of brain functioning from a very broad perspective.
20
Franović I, Maslennikov OV, Bačić I, Nekorkin VI. Mean-field dynamics of a population of stochastic map neurons. Phys Rev E 2018; 96:012226. [PMID: 29347187 DOI: 10.1103/physreve.96.012226] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2017] [Indexed: 11/07/2022]
Abstract
We analyze the emergent regimes and the stimulus-response relationship of a population of noisy map neurons by means of a mean-field model, derived within the framework of cumulant approach complemented by the Gaussian closure hypothesis. It is demonstrated that the mean-field model can qualitatively account for stability and bifurcations of the exact system, capturing all the generic forms of collective behavior, including macroscopic excitability, subthreshold oscillations, periodic or chaotic spiking, and chaotic bursting dynamics. Apart from qualitative analogies, we find a substantial quantitative agreement between the exact and the approximate system, as reflected in matching of the parameter domains admitting the different dynamical regimes, as well as the characteristic properties of the associated time series. The effective model is further shown to reproduce with sufficient accuracy the phase response curves of the exact system and the assembly's response to external stimulation of finite amplitude and duration.
Affiliation(s)
- Igor Franović
- Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
- Oleg V Maslennikov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
- Iva Bačić
- Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
- Vladimir I Nekorkin
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
21
Fanelli D, Ginelli F, Livi R, Zagli N, Zankoc C. Noise-driven neuromorphic tuned amplifier. Phys Rev E 2017; 96:062313. [PMID: 29347454 DOI: 10.1103/physreve.96.062313] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2017] [Indexed: 06/07/2023]
Abstract
We study a simple stochastic model of neuronal excitatory and inhibitory interactions. The model is defined on a directed lattice, and internode couplings are modulated by a nonlinear function that mimics the process of synaptic activation. We prove that such a system behaves as a fully tunable amplifier: the endogenous component of noise, stemming from finite-size effects, seeds a coherent (exponential) amplification across the chain, generating giant oscillations with tunable frequencies, a process that the brain could exploit to enhance, and eventually encode, different signals. From a wider perspective, the characterized amplification process could provide a reliable pacemaking mechanism for biological systems. The device extracts energy from the finite-size bath and operates as an out-of-equilibrium thermal machine under stationary conditions.
Affiliation(s)
- Duccio Fanelli
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
- Francesco Ginelli
- SUPA, Institute for Complex Systems and Mathematical Biology, Kings College, University of Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Roberto Livi
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
- Niccolò Zagli
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
- Clement Zankoc
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, via G. Sansone 1, 50019 Sesto Fiorentino, Italy
22
Lang E, Stannat W. Finite-Size Effects on Traveling Wave Solutions to Neural Field Equations. J Math Neurosci 2017; 7:5. [PMID: 28685484 PMCID: PMC5500661 DOI: 10.1186/s13408-017-0048-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 06/14/2016] [Accepted: 06/02/2017] [Indexed: 06/07/2023]
Abstract
Neural field equations are used to describe the spatio-temporal evolution of the activity in a network of synaptically coupled populations of neurons in the continuum limit. Their heuristic derivation involves two approximation steps. Under the assumption that each population in the network is large, the activity is described in terms of a population average. The discrete network is then approximated by a continuum. In this article we make the two approximation steps explicit. Extending a model by Bressloff and Newby, we describe the evolution of the activity in a discrete network of finite populations by a Markov chain. In order to determine finite-size effects-deviations from the mean-field limit due to the finite size of the populations in the network-we analyze the fluctuations of this Markov chain and set up an approximating system of diffusion processes. We show that a well-posed stochastic neural field equation with a noise term accounting for finite-size effects on traveling wave solutions is obtained as the strong continuum limit.
Affiliation(s)
- Eva Lang
- Institut für Mathematik, Technische Universität Berlin, Berlin, 10623 Germany
- Bernstein Center for Computational Neuroscience, Berlin, 10115 Germany
- Wilhelm Stannat
- Institut für Mathematik, Technische Universität Berlin, Berlin, 10623 Germany
- Bernstein Center for Computational Neuroscience, Berlin, 10115 Germany
23
Huang C, Doiron B. Once upon a (slow) time in the land of recurrent neuronal networks…. Curr Opin Neurobiol 2017; 46:31-38. [DOI: 10.1016/j.conb.2017.07.003] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2017] [Revised: 06/21/2017] [Accepted: 07/06/2017] [Indexed: 12/22/2022]
24
Zankoc C, Fanelli D, Ginelli F, Livi R. Intertangled stochastic motifs in networks of excitatory-inhibitory units. Phys Rev E 2017; 96:022308. [PMID: 28950520 DOI: 10.1103/physreve.96.022308] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2017] [Indexed: 06/07/2023]
Abstract
A stochastic model of excitatory and inhibitory interactions which bears universality traits is introduced and studied. The endogenous component of noise, stemming from finite size corrections, drives robust internode correlations that persist at large distances. Antiphase synchrony at small frequencies is resolved on adjacent nodes and found to promote the spontaneous generation of long-ranged stochastic patterns that invade the network as a whole. These patterns are lacking under the idealized deterministic scenario, and could provide hints on how living systems implement and handle a large gallery of delicate computational tasks.
Affiliation(s)
- Clement Zankoc
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, Via G. Sansone 1, I-50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, Via G. Sansone 1, I-50019 Sesto Fiorentino, Italy
- Duccio Fanelli
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, Via G. Sansone 1, I-50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, Via G. Sansone 1, I-50019 Sesto Fiorentino, Italy
- Francesco Ginelli
- SUPA, Institute for Complex Systems and Mathematical Biology, Kings College, University of Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Roberto Livi
- Dipartimento di Fisica e Astronomia and CSDC, Università degli Studi di Firenze, Via G. Sansone 1, I-50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, Via G. Sansone 1, I-50019 Sesto Fiorentino, Italy
25
On memories, neural ensembles and mental flexibility. Neuroimage 2017; 157:297-313. [PMID: 28602817 DOI: 10.1016/j.neuroimage.2017.05.068] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/05/2017] [Revised: 05/30/2017] [Accepted: 05/31/2017] [Indexed: 12/18/2022] Open
Abstract
Memories are assumed to be represented by groups of co-activated neurons, called neural ensembles. Describing ensembles is a challenge: the complexity of the underlying micro-circuitry is immense. Current approaches proceed in a piecemeal fashion, focusing on single neurons and employing local measures like pairwise correlations. We introduce an alternative approach that identifies ensembles and describes the effective connectivity between them in a holistic fashion. It also links the oscillatory frequencies observed in ensembles with the spatial scales at which activity is expressed. Using unsupervised learning, biophysical modeling and graph theory, we analyze multi-electrode LFPs from frontal cortex during a spatial delayed response task. We find distinct ensembles for different cues and more parsimonious connectivity for cues on the horizontal axis, which may explain the oblique effect in psychophysics. Our approach paves the way for biophysical models with learned parameters that can guide future Brain Computer Interface development.
26
Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 68] [Impact Index Per Article: 9.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations.

Understanding the brain requires mathematical models on different spatial scales. On the "microscopic" level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
27
Gollo LL, Roberts JA, Cocchi L. Mapping how local perturbations influence systems-level brain dynamics. Neuroimage 2017; 160:97-112. [PMID: 28126550 DOI: 10.1016/j.neuroimage.2017.01.057] [Citation(s) in RCA: 78] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/01/2016] [Revised: 12/12/2016] [Accepted: 01/23/2017] [Indexed: 11/15/2022] Open
Abstract
The human brain exhibits a distinct spatiotemporal organization that supports brain function and can be manipulated via local brain stimulation. Such perturbations to local cortical dynamics are globally integrated by distinct neural systems. However, it remains unclear how local changes in neural activity affect large-scale system dynamics. Here, we briefly review empirical and computational studies addressing how localized perturbations affect brain activity. We then systematically analyze a model of large-scale brain dynamics, assessing how localized changes in brain activity at the different sites affect whole-brain dynamics. We find that local stimulation induces changes in brain activity that can be summarized by relatively smooth tuning curves, which relate a region's effectiveness as a stimulation site to its position within the cortical hierarchy. Our results also support the notion that brain hubs, operating in a slower regime, are more resilient to focal perturbations and critically contribute to maintain stability in global brain dynamics. In contrast, perturbations of peripheral regions, characterized by faster activity, have greater impact on functional connectivity. As a parallel with this region-level result, we also find that peripheral systems such as the visual and sensorimotor networks were more affected by local perturbations than high-level systems such as the cingulo-opercular network. Our findings highlight the importance of a periphery-to-core hierarchy to determine the effect of local stimulation on the brain network. This study also provides novel resources to orient empirical work aiming at manipulating functional connectivity using non-invasive brain stimulation.
Affiliation(s)
- James A Roberts
- QIMR Berghofer Medical Research Institute, Brisbane, Australia; Centre of Excellence for Integrative Brain Function, QIMR Berghofer Medical Research Institute, Brisbane, Australia
- Luca Cocchi
- QIMR Berghofer Medical Research Institute, Brisbane, Australia
28
DeVille L, Galiardi M. Finite-size effects and switching times for Moran process with mutation. J Math Biol 2016; 74:1197-1222. [PMID: 27628531 DOI: 10.1007/s00285-016-1056-7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2015] [Revised: 08/30/2016] [Indexed: 12/01/2022]
Abstract
We consider the Moran process with two populations competing under an iterated Prisoner's Dilemma in the presence of mutation, and concentrate on the case where there are multiple evolutionarily stable strategies. We perform a complete bifurcation analysis of the deterministic system which arises in the infinite population size. We also study the Master equation and obtain asymptotics for the invariant distribution and metastable switching times for the stochastic process in the case of large but finite population. We also show that the stochastic system has asymmetries in the form of a skew for parameter values where the deterministic limit is symmetric.
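A minimal sketch of the kind of process analyzed here, assuming a generic Moran update with mutation; the neutral fitnesses and mutation rate below are illustrative placeholders, not the iterated Prisoner's Dilemma payoffs of the paper:

```python
import numpy as np

def moran_step(i, n, fitness_a, fitness_b, mu, rng):
    """One Moran update for i type-A individuals in a population of n.
    A parent is chosen proportionally to fitness; its offspring (mutating
    with probability mu) replaces a uniformly chosen individual."""
    fa, fb = fitness_a * i, fitness_b * (n - i)
    birth_a = rng.random() < fa / (fa + fb)
    if rng.random() < mu:          # mutation flips the offspring's type
        birth_a = not birth_a
    death_a = rng.random() < i / n # the replaced individual is type A
    return i + int(birth_a) - int(death_a)

rng = np.random.default_rng(3)
n, i = 100, 50
traj = [i]
for _ in range(20000):             # neutral case: drift plus mutation
    i = moran_step(i, n, 1.0, 1.0, 0.01, rng)
    traj.append(i)
traj = np.array(traj)
```

With mutation present the boundaries i = 0 and i = n are no longer absorbing, which is what makes a stationary distribution and metastable switching times well defined.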
Affiliation(s)
- Lee DeVille
- Department of Mathematics, University of Illinois, 1409 W Green St, Urbana, IL, 61801, USA
- Meghan Galiardi
- Department of Mathematics, University of Illinois, 1409 W Green St, Urbana, IL, 61801, USA
29
Jadi MP, Behrens MM, Sejnowski TJ. Abnormal Gamma Oscillations in N-Methyl-D-Aspartate Receptor Hypofunction Models of Schizophrenia. Biol Psychiatry 2016; 79:716-726. [PMID: 26281716 PMCID: PMC4720598 DOI: 10.1016/j.biopsych.2015.07.005] [Citation(s) in RCA: 84] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/10/2014] [Revised: 06/03/2015] [Accepted: 07/07/2015] [Indexed: 12/21/2022]
Abstract
N-methyl-D-aspartate receptor (NMDAR) hypofunction in parvalbumin-expressing (PV+) inhibitory neurons (INs) may contribute to symptoms in patients with schizophrenia (SZ). This hypothesis was inspired by studies in humans involving NMDAR antagonists that trigger SZ symptoms. Animal models of SZ using neuropharmacology and genetic knockouts have successfully replicated some of the key observations in human subjects involving alteration of gamma band oscillations (GBO) observed in electroencephalography and magnetoencephalography signals. However, it remains to be seen if NMDAR hypofunction in PV+ neurons is fundamental to the phenotype observed in these models. In this review, we discuss some of the key computational models of GBO and their predictions in the context of NMDAR hypofunction in INs. While PV+ INs have been the main focus of SZ studies in animal models, we also discuss the implications of NMDAR hypofunction in other types of INs using computational models for GBO modulation in the visual cortex.
Affiliation(s)
- Monika P Jadi
- Howard Hughes Medical Institute, The Salk Institute for Biological Studies, La Jolla, California; Division of Biological Sciences, University of California at San Diego, La Jolla, California
- M Margarita Behrens
- Howard Hughes Medical Institute, The Salk Institute for Biological Studies, La Jolla, California
- Terrence J Sejnowski
- Howard Hughes Medical Institute, The Salk Institute for Biological Studies, La Jolla, California; Division of Biological Sciences, University of California at San Diego, La Jolla, California
30
McCleney ZT, Kilpatrick ZP. Entrainment in up and down states of neural populations: non-smooth and stochastic models. J Math Biol 2016; 73:1131-1160. [DOI: 10.1007/s00285-016-0984-6] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2015] [Revised: 12/21/2015] [Indexed: 02/02/2023]
31
Klinshov V, Franović I. Mean-field dynamics of a random neural network with noise. Phys Rev E 2015; 92:062813. [PMID: 26764750 DOI: 10.1103/physreve.92.062813] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/08/2015] [Indexed: 06/05/2023]
Abstract
We consider a network of randomly coupled rate-based neurons influenced by external and internal noise. We derive a second-order stochastic mean-field model for the network dynamics and use it to analyze the stability and bifurcations in the thermodynamic limit, as well as to study the fluctuations due to finite-size effects. It is demonstrated that the two types of noise have substantially different impacts on the network dynamics. While both sources of noise give rise to stochastic fluctuations in the case of the finite-size network, only the external noise affects the stationary activity levels of the network in the thermodynamic limit. We compare the theoretical predictions with direct simulation results and show that they agree for large enough network sizes and for parameter domains sufficiently far from bifurcations.
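The setting can be sketched with a simple simulation, assuming an all-to-all rather than random coupling topology and illustrative parameters (this is not the authors' second-order mean-field model): for subcritical coupling the deterministic fixed point of the population mean is zero, and the finite network merely fluctuates around it.

```python
import numpy as np

def simulate_rate_network(n, j0, noise_ext, t_steps, dt, rng):
    """Euler-Maruyama for n rate neurons with mean-field coupling j0 and
    additive external noise: dx_i = (-x_i + tanh(j0 * <x>)) dt + noise dW_i."""
    x = 0.1 * rng.standard_normal(n)
    means = np.empty(t_steps)
    for k in range(t_steps):
        drift = -x + np.tanh(j0 * x.mean())
        x = x + dt * drift + noise_ext * np.sqrt(dt) * rng.standard_normal(n)
        means[k] = x.mean()      # track the population-averaged activity
    return means

rng = np.random.default_rng(4)
m = simulate_rate_network(n=2000, j0=0.5, noise_ext=0.2, t_steps=2000, dt=0.05, rng=rng)
# for j0 < 1 the mean-field fixed point is 0; finite-size fluctuations of the
# mean scale roughly like 1/sqrt(n) and vanish in the thermodynamic limit
```

Increasing `j0` past 1 (a pitchfork bifurcation in this toy model) or shrinking `n` makes the fluctuations of `m` large, which is the regime where a second-order (fluctuation-tracking) mean-field description becomes necessary.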
Affiliation(s)
- Vladimir Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
- Igor Franović
- Scientific Computing Laboratory, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia
32
A stochastic model of input effectiveness during irregular gamma rhythms. J Comput Neurosci 2015; 40:85-101. [PMID: 26610791 DOI: 10.1007/s10827-015-0583-3] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/23/2015] [Revised: 10/15/2015] [Accepted: 10/26/2015] [Indexed: 10/22/2022]
Abstract
Gamma-band synchronization has been linked to attention and communication between brain regions, yet the underlying dynamical mechanisms are still unclear. How does the timing and amplitude of inputs to cells that generate an endogenously noisy gamma rhythm affect the network activity and rhythm? How does such "communication through coherence" (CTC) survive in the face of rhythm and input variability? We present a stochastic modelling approach to this question that yields a very fast computation of the effectiveness of inputs to cells involved in gamma rhythms. Our work is partly motivated by recent optogenetic experiments (Cardin et al., Nature 459:663-667, 2009) that tested the gamma phase-dependence of network responses by first stabilizing the rhythm with periodic light pulses to the interneurons (I). Our computationally efficient model E-I network of stochastic two-state neurons exhibits finite-size fluctuations. Using the Hilbert transform and Kuramoto index, we study how the stochastic phase of its gamma rhythm is entrained by external pulses. We then compute how this rhythmic inhibition controls the effectiveness of external input onto pyramidal (E) cells, and how variability shapes the window of firing opportunity. For transferring the time variations of an external input to the E cells, we find a tradeoff between the phase selectivity and depth of rate modulation. We also show that the CTC is sensitive to the jitter in the arrival times of spikes to the E cells, and to the degree of I-cell entrainment. We further find that CTC can occur even if the underlying deterministic system does not oscillate; quasicycle-type rhythms induced by the finite-size noise retain the basic CTC properties. Finally, a resonance analysis confirms the relative importance of the I cell pacing for rhythm generation. Analysis of whole network behaviour, including computations of synchrony, phase and shifts in excitatory-inhibitory balance, can be further sped up by orders of magnitude using two coupled stochastic differential equations, one for each population. Our work thus yields a fast tool to numerically and analytically investigate CTC in a noisy context. It shows that CTC can be quite vulnerable to rhythm and input variability, which both decrease phase preference.
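The phase analysis the abstract describes can be sketched in a few lines of numpy (an illustrative synthetic signal stands in for the E-I population activity; the 40 Hz frequency, noise level, and reference pacer are assumptions, not the paper's model): extract the instantaneous phase with a Hilbert transform and score entrainment with a Kuramoto-style index.

```python
import numpy as np

rng = np.random.default_rng(0)

def analytic_signal(x):
    """Analytic signal via FFT (a numpy-only Hilbert transform)."""
    n = x.size
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(np.fft.fft(x) * h)

fs = 1000.0                                 # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
jitter = np.cumsum(rng.normal(0.0, 0.05, t.size))   # slowly wandering phase
x = np.sin(2 * np.pi * 40.0 * t + jitter)   # noisy ~40 Hz rhythm

phi = np.angle(analytic_signal(x))          # instantaneous stochastic phase
phi_ref = 2 * np.pi * 40.0 * t              # phase of a perfect 40 Hz pacer
# Kuramoto-style entrainment index: R -> 1 for perfect phase locking.
R = np.abs(np.mean(np.exp(1j * (phi - phi_ref))))
print(round(float(R), 2))
```

More phase jitter pushes R toward 0, which is the quantitative sense in which rhythm variability erodes phase preference.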
Collapse
|
33
|
Huang Y, Rüdiger S, Shuai J. Accurate Langevin approaches to simulate Markovian channel dynamics. Phys Biol 2015; 12:061001. [PMID: 26403205 DOI: 10.1088/1478-3975/12/6/061001] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
The stochastic dynamics of ion channels are significant for physiological processes on neuronal cell membranes. Microscopic simulations of ion-channel gating with Markov chains can be considered an accurate standard. However, such Markovian simulations are computationally demanding for membrane areas of physiologically relevant sizes, which makes the noise-approximating or Langevin equation methods advantageous in many cases. In this review, we discuss the Langevin-like approaches, including the channel-based and simplified subunit-based stochastic differential equations proposed by Fox and Lu, and the effective Langevin approaches in which colored noise is added to deterministic differential equations. In the framework of Fox and Lu's classical models, several variants of numerical algorithms, which have been recently developed to improve accuracy as well as efficiency, are also discussed. Through the comparison of different simulation algorithms of ion-channel noise with the standard Markovian simulation, we aim to reveal the extent to which the existing Langevin-like methods approximate the results of Markovian methods. Open questions for future studies are also discussed.
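A Fox-Lu-style subunit-based Langevin equation of the kind reviewed above can be sketched for a single Hodgkin-Huxley-type K+ gating variable (a minimal illustration: rate constants are frozen at one voltage and parameter values are assumptions, not taken from the review). The noise variance scales inversely with the channel number N, vanishing in the deterministic limit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Subunit-based Langevin: dn = (a*(1-n) - b*n) dt + sqrt((a*(1-n)+b*n)/N) dW.
a, b = 0.1, 0.125             # opening/closing rates (1/ms), held fixed
N = 1000                      # number of channels (sets the noise strength)
dt, steps = 0.01, 20000       # time step (ms) and number of steps

n = a / (a + b)               # start at the deterministic fixed point
trace = np.empty(steps)
for i in range(steps):
    drift = a * (1.0 - n) - b * n
    var = (a * (1.0 - n) + b * n) / N          # channel-noise intensity
    n += drift * dt + np.sqrt(max(var, 0.0) * dt) * rng.normal()
    n = min(max(n, 0.0), 1.0)                  # keep the gate fraction in [0, 1]
    trace[i] = n

# Stationary mean should hover near the deterministic value a/(a+b) = 4/9.
print(round(float(trace.mean()), 2))
```

Rerunning with larger N shrinks the fluctuations around 4/9, which is the regime where Langevin and Markovian simulations agree best.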
Collapse
Affiliation(s)
- Yandong Huang
- Department of Physics, Xiamen University, Xiamen 361005, People's Republic of China
| | | | | |
Collapse
|
34
|
Faugeras O, Inglis J. Stochastic neural field equations: a rigorous footing. J Math Biol 2015; 71:259-300. [PMID: 25069787 PMCID: PMC4496531 DOI: 10.1007/s00285-014-0807-6] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/22/2013] [Revised: 04/28/2014] [Indexed: 11/05/2022]
Abstract
We here consider a stochastic version of the classical neural field equation that is currently actively studied in the mathematical neuroscience community. Our goal is to present a well-known rigorous probabilistic framework in which to study these equations in a way that is accessible to practitioners currently working in the area, and thus to bridge some of the cultural/scientific gaps between probability theory and mathematical biology. In this way, the paper is intended to act as a reference that collects together relevant rigorous results about notions of solutions and well-posedness, which, although they may be straightforward to experts in SPDEs, are largely unknown in the neuroscientific community and difficult to find in a very large body of literature. Moreover, in the course of our study we provide some new specific conditions on the parameters appearing in the equation (in particular on the neural field kernel) that guarantee the existence of a solution.
Collapse
Affiliation(s)
- O. Faugeras
- NeuroMathComp, INRIA, Sophia Antipolis, France
| | - J. Inglis
- ToSCA/NeuroMathComp, INRIA, Sophia Antipolis, France
| |
Collapse
|
35
|
Brooks HA, Bressloff PC. Quasicycles in the stochastic hybrid Morris-Lecar neural model. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:012704. [PMID: 26274200 DOI: 10.1103/physreve.92.012704] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/02/2015] [Indexed: 06/04/2023]
Abstract
Intrinsic noise arising from the stochastic opening and closing of voltage-gated ion channels has been shown experimentally and mathematically to have important effects on a neuron's function. Study of classical neuron models with stochastic ion channels is becoming increasingly important, especially in understanding a cell's ability to produce subthreshold oscillations and to respond to weak periodic stimuli. While it is known that stochastic models can produce oscillations (quasicycles) in parameter regimes where the corresponding deterministic model has only a stable fixed point, little analytical work has been done to explore these connections within the context of channel noise. Using a stochastic hybrid Morris-Lecar (ML) model, we combine a system-size expansion in K(+) and a quasi-steady-state (QSS) approximation in persistent Na(+) in order to derive an effective Langevin equation that preserves the low-dimensional (planar) structure of the underlying deterministic ML model. (The QSS analysis exploits the fact that persistent Na(+) channels are fast.) By calculating the corresponding power spectrum, we determine analytically how noise significantly extends the parameter regime in which subthreshold oscillations occur.
Collapse
Affiliation(s)
- Heather A Brooks
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| | - Paul C Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, Utah 84112, USA
| |
Collapse
|
36
|
Bressloff PC. Path-integral methods for analyzing the effects of fluctuations in stochastic hybrid neural networks. JOURNAL OF MATHEMATICAL NEUROSCIENCE 2015; 5:4. [PMID: 25852979 PMCID: PMC4385107 DOI: 10.1186/s13408-014-0016-z] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 09/03/2014] [Accepted: 12/11/2014] [Indexed: 06/04/2023]
Abstract
We consider applications of path-integral methods to the analysis of a stochastic hybrid model representing a network of synaptically coupled spiking neuronal populations. The state of each local population is described in terms of two stochastic variables, a continuous synaptic variable and a discrete activity variable. The synaptic variables evolve according to piecewise-deterministic dynamics describing, at the population level, synapses driven by spiking activity. The dynamical equations for the synaptic currents are only valid between jumps in spiking activity, and the latter are described by a jump Markov process whose transition rates depend on the synaptic variables. We assume a separation of time scales between fast spiking dynamics and slower synaptic dynamics with time constant τ. This naturally introduces a small positive parameter ϵ, which can be used to develop various asymptotic expansions of the corresponding path-integral representation of the stochastic dynamics. First, we derive a variational principle for maximum-likelihood paths of escape from a metastable state (large deviations in the small noise limit ϵ → 0). We then show how the path integral provides an efficient method for obtaining a diffusion approximation of the hybrid system for small ϵ. The resulting Langevin equation can be used to analyze the effects of fluctuations within the basin of attraction of a metastable state, that is, ignoring the effects of large deviations. We illustrate this by using the Langevin approximation to analyze the effects of intrinsic noise on pattern formation in a spatially structured hybrid network. In particular, we show how noise enlarges the parameter regime over which patterns occur, in an analogous fashion to PDEs. Finally, we carry out a loop expansion of the path integral, and use this to derive corrections to voltage-based mean-field equations, analogous to the modified activity-based equations generated from a neural master equation.
Collapse
Affiliation(s)
- Paul C. Bressloff
- Department of Mathematics, University of Utah, 155 South 1400 East, Salt Lake City, UT 84112 USA
| |
Collapse
|
37
|
Litwin-Kumar A, Doiron B. Formation and maintenance of neuronal assemblies through synaptic plasticity. Nat Commun 2014; 5:5319. [PMID: 25395015 DOI: 10.1038/ncomms6319] [Citation(s) in RCA: 147] [Impact Index Per Article: 14.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/12/2014] [Accepted: 09/18/2014] [Indexed: 01/12/2023] Open
Abstract
The architecture of cortex is flexible, permitting neuronal networks to store recent sensory experiences as specific synaptic connectivity patterns. However, it is unclear how these patterns are maintained in the face of the high spike time variability associated with cortex. Here we demonstrate, using a large-scale cortical network model, that realistic synaptic plasticity rules coupled with homeostatic mechanisms lead to the formation of neuronal assemblies that reflect previously experienced stimuli. Further, reverberation of past evoked states in spontaneous spiking activity stabilizes, rather than erases, this learned architecture. Spontaneous and evoked spiking activity contains a signature of learned assembly structures, leading to testable predictions about the effect of recent sensory experience on spike train statistics. Our work outlines requirements for synaptic plasticity rules capable of modifying spontaneous dynamics and shows that this modification is beneficial for stability of learned network architectures.
Collapse
Affiliation(s)
- Ashok Litwin-Kumar
- 1] Program for Neural Computation, Carnegie Mellon University and University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA [2] Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA [3] Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania 15213, USA
| | - Brent Doiron
- 1] Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA [2] Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania 15213, USA
| |
Collapse
|
38
|
Finite size effect induces stochastic gamma oscillation in inhibitory network with conduction delay. BMC Neurosci 2014. [PMCID: PMC4124993 DOI: 10.1186/1471-2202-15-s1-p115] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open
|
39
|
Dumont G, Northoff G, Longtin A. Linear noise approximation for oscillations in a stochastic inhibitory network with delay. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 90:012702. [PMID: 25122330 DOI: 10.1103/physreve.90.012702] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/12/2013] [Indexed: 06/03/2023]
Abstract
Understanding neural variability is currently one of the biggest challenges in neuroscience. Using theory and computational modeling, we study the behavior of a globally coupled inhibitory neural network, in which each neuron follows a purely stochastic two-state spiking process. We investigate the role of both this intrinsic randomness and the conduction delay on the emergence of fast (e.g., gamma) oscillations. Toward that end, we extend the recently proposed linear noise approximation (LNA) technique to this non-Markovian "delay" case. The analysis first leads to a nonlinear delay-differential equation (DDE) with multiplicative noise for the mean activity. The LNA then yields two coupled DDEs, one of which is driven by additive Gaussian white noise. These equations on their own provide an excellent approximation to the full network dynamics, which take much longer to integrate. They further allow us to compute a theoretical expression for the power spectrum of the population activity. Our analytical result is in good agreement with the power spectrum obtained via numerical simulations of the full network dynamics, for the large range of parameters where both the intrinsic stochasticity and the conduction delay are necessary for the occurrence of oscillations. The intrinsic noise arises from the probabilistic description of each neuron, yet it is expressed at the system activity level, and it can only be controlled by the system size. In fact, its effect on the fluctuations in system activity disappears in the infinite network size limit, but the characteristics of the oscillatory activity depend on all model parameters including the system size. Using the Hilbert transform, we further show that the intrinsic noise causes sporadic strong fluctuations in the phase of the gamma rhythm.
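The flavor of such an LNA result can be illustrated with a generic linearized delayed Langevin equation, dx/dt = -a x(t - τ) + √D ξ(t), rather than the paper's specific DDEs (a, τ, and D below are illustrative). Below the deterministic instability (aτ < π/2) the fixed point is stable, yet the analytical power spectrum S(ω) = D / |iω + a e^{-iωτ}|² still peaks at a nonzero frequency: a noise-sustained quasicycle.

```python
import numpy as np

# Illustrative parameters: a*tau = 1.2 < pi/2, so the deterministic DDE
# x' = -a*x(t - tau) is stable and any oscillation is noise-induced.
a, tau, D = 1.0, 1.2, 1.0

w = np.linspace(0.01, 10.0, 5000)                    # angular frequency grid
# Analytical LNA-style spectrum of the linear delayed Langevin equation.
S = D / np.abs(1j * w + a * np.exp(-1j * w * tau)) ** 2

w_peak = w[np.argmax(S)]                             # quasicycle frequency
print(round(float(w_peak), 2))
```

Pushing aτ toward π/2 sharpens and raises the peak, mirroring how proximity to the delay-induced instability strengthens the stochastic rhythm.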
Collapse
Affiliation(s)
- Grégory Dumont
- Physics Department, Ottawa University, Ontario, Canada and Mind, Brain Imaging and Neuroethics, Royal Ottawa Healthcare, Center for Neural Dynamics, Ottawa University, Ontario, Canada
| | - Georg Northoff
- Mind, Brain Imaging and Neuroethics, Royal Ottawa Healthcare, Institute of Mental Health Research, Ottawa, Canada and Center for Neural Dynamics, Ottawa University, Ontario, Canada
| | - André Longtin
- Physics Department, Ottawa University, Ontario, Canada and Center for Neural Dynamics, Ottawa University, Ontario, Canada
| |
Collapse
|
40
|
Jadi MP, Sejnowski TJ. Regulating Cortical Oscillations in an Inhibition-Stabilized Network. PROCEEDINGS OF THE IEEE. INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS 2014; 102:10.1109/JPROC.2014.2313113. [PMID: 24966414 PMCID: PMC4067313 DOI: 10.1109/jproc.2014.2313113] [Citation(s) in RCA: 31] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Understanding the anatomical and functional architecture of the brain is essential for designing neurally inspired intelligent systems. Theoretical and empirical studies suggest a role for narrowband oscillations in shaping the functional architecture of the brain through their role in coding and communication of information. Such oscillations are ubiquitous signals in the electrical activity recorded from the brain. In the cortex, oscillations detected in the gamma range (30-80 Hz) are modulated by behavioral states and sensory features in complex ways. How is this regulation achieved? Although several underlying principles for the genesis of these oscillations have been proposed, a unifying account for their regulation has remained elusive. In a network of excitatory and inhibitory neurons operating in an inhibition-stabilized regime, we show that strongly superlinear responses of inhibitory neurons facilitate bidirectional regulation of oscillation frequency and power. In such a network, the balance of drives to the excitatory and inhibitory populations determines how the power and frequency of oscillations are modulated. The model accounts for the puzzling increase in their frequency with the salience of visual stimuli, and a decrease with their size. Oscillations in our model grow stronger as the mean firing level is reduced, accounting for the size dependence of visually evoked gamma rhythms, and suggesting a role for oscillations in improving the signal-to-noise ratio (SNR) of signals in the brain. Empirically testing such predictions is still challenging, and implementing the proposed coding and communication strategies in neuromorphic systems could assist in our understanding of the biological system.
Collapse
Affiliation(s)
- Monika P Jadi
- Computational Neurobiology Laboratory, Howard Hughes Medical Institute, Salk Institute for Biological Studies, La Jolla, CA 92037 USA
| | - Terrence J Sejnowski
- Computational Neurobiology Laboratory, Howard Hughes Medical Institute, Salk Institute for Biological Studies, La Jolla, CA 92037 USA
| |
Collapse
|
41
|
Cortical oscillations arise from contextual interactions that regulate sparse coding. Proc Natl Acad Sci U S A 2014; 111:6780-5. [PMID: 24742427 DOI: 10.1073/pnas.1405300111] [Citation(s) in RCA: 53] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Precise spike times carry information and are important for synaptic plasticity. Synchronizing oscillations such as gamma bursts could coordinate spike times, thus regulating information transmission in the cortex. Oscillations are driven by inhibitory neurons and are modulated by sensory stimuli and behavioral states. How their power and frequency are regulated is an open question. Using a model cortical circuit, we propose a regulatory mechanism that depends on the activity balance of monosynaptic and disynaptic pathways to inhibitory neurons: Monosynaptic input causes more powerful oscillations whereas disynaptic input increases the frequency of oscillations. The balance of stimulation to the two pathways modulates the overall distribution of spikes, with stronger disynaptic stimulation (e.g., preferred stimuli inside visual receptive fields) producing high firing rates and weak oscillations; in contrast, stronger monosynaptic stimulation (e.g., suppressive contextual stimulation from outside visual receptive fields) generates low firing rates and strong oscillatory regulation of spike timing, as observed in alert cortex processing complex natural stimuli. By accounting for otherwise paradoxical experimental findings, our results demonstrate how the frequency and power of oscillations, and hence spike times, can be modulated by both sensory input and behavioral context, with powerful oscillations signifying a cortical state under inhibitory control in which spikes are sparse and spike timing is precise.
Collapse
|
42
|
Taillefumier T, Magnasco M. A transition to sharp timing in stochastic leaky integrate-and-fire neurons driven by frozen noisy input. Neural Comput 2014; 26:819-59. [PMID: 24555453 DOI: 10.1162/neco_a_00577] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
The firing activity of intracellularly stimulated neurons in cortical slices has been demonstrated to be profoundly affected by the temporal structure of the injected current (Mainen & Sejnowski, 1995). This suggests that the timing features of the neural response may be controlled as much by its own biophysical characteristics as by how a neuron is wired within a circuit. Modeling studies have shown that the interplay between internal noise and the fluctuations of the driving input controls the reliability and the precision of neuronal spiking (Cecchi et al., 2000; Tiesinga, 2002; Fellous, Rudolph, Destexhe, & Sejnowski, 2003). In order to investigate this interplay, we focus on the stochastic leaky integrate-and-fire neuron and identify the Hölder exponent H of the integrated input as the key mathematical property dictating the regime of firing of a single-unit neuron. We have recently provided numerical evidence (Taillefumier & Magnasco, 2013) for the existence of a phase transition when H becomes less than the statistical Hölder exponent associated with internal Gaussian white noise (H=1/2). Here we describe the theoretical and numerical framework devised for the study of a neuron that is periodically driven by frozen noisy inputs with exponent H>0. In doing so, we account for the existence of a transition between two regimes of firing when H=1/2, and we show that spiking times have a continuous density when the Hölder exponent satisfies H>1/2. The transition at H=1/2 formally separates rate codes, for which the neural firing probability varies smoothly, from temporal codes, for which the neuron fires at sharply defined times regardless of the intensity of internal noise.
Collapse
Affiliation(s)
- Thibaud Taillefumier
- Laboratory of Mathematical Physics, Rockefeller University, New York, NY 10065, U.S.A., and Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544, U.S.A.
| | | |
Collapse
|
43
|
Short-term synaptic plasticity in the deterministic Tsodyks-Markram model leads to unpredictable network dynamics. Proc Natl Acad Sci U S A 2013; 110:16610-5. [PMID: 24062464 DOI: 10.1073/pnas.1316071110] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Short-term synaptic plasticity strongly affects the neural dynamics of cortical networks. The Tsodyks and Markram (TM) model for short-term synaptic plasticity accurately accounts for a wide range of physiological responses at different types of cortical synapses. Here, we report a route to chaotic behavior via a Shilnikov homoclinic bifurcation that dynamically organizes some of the responses in the TM model. In particular, the presence of such a homoclinic bifurcation strongly affects the shape of the trajectories in the phase space and induces highly irregular transient dynamics; indeed, in the vicinity of the Shilnikov homoclinic bifurcation, the number of population spikes and their precise timing are unpredictable and highly sensitive to the initial conditions. Such an irregular deterministic dynamics has its counterpart in stochastic/network versions of the TM model: The existence of the Shilnikov homoclinic bifurcation generates complex and irregular spiking patterns and--acting as a sort of springboard--facilitates transitions between the down-state and unstable periodic orbits. The interplay between the (deterministic) homoclinic bifurcation and stochastic effects may give rise to some of the complex dynamics observed in neural systems.
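The deterministic TM synapse the abstract builds on has a simple event-driven form, sketched below (a minimal illustration; the parameter values and the spike-time update ordering are common textbook choices, not taken from this paper): resources x recover between spikes with time constant tau_rec, utilization u relaxes to U with tau_fac, and each presynaptic spike releases the fraction u*x.

```python
import numpy as np

def tm_responses(spike_times, U=0.2, tau_rec=0.5, tau_fac=1.0):
    """Per-spike released fractions u*x of the deterministic TM synapse."""
    x, u, t_last = 1.0, U, 0.0
    out = []
    for t in spike_times:
        dt = t - t_last
        x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # resource recovery
        u = U + (u - U) * np.exp(-dt / tau_fac)      # facilitation decay
        u = u + U * (1.0 - u)                        # spike-triggered increase
        out.append(u * x)                            # released fraction
        x = x * (1.0 - u)                            # resource depletion
        t_last = t
    return np.array(out)

# A regular 20 Hz train: facilitation raises u while depression depletes x,
# and their competition shapes the per-spike response sequence.
r = tm_responses(np.arange(0.05, 0.55, 0.05))
print(np.round(r, 3))
```

It is the interaction of these smooth between-spike flows with the spike-triggered jumps that, in the full network model, opens the door to the Shilnikov-type dynamics the paper analyzes.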
Collapse
|
44
|
Newby J, Chapman J. Metastable behavior in Markov processes with internal states. J Math Biol 2013; 69:941-76. [PMID: 23995843 DOI: 10.1007/s00285-013-0723-1] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2012] [Revised: 08/13/2013] [Indexed: 12/21/2022]
Abstract
A perturbation framework is developed to analyze metastable behavior in stochastic processes with random internal and external states. The process is assumed to be under weak noise conditions, and the case where the deterministic limit is bistable is considered. A general analytical approximation is derived for the stationary probability density and the mean switching time between metastable states, which includes the pre-exponential factor. The results are illustrated with a model of gene expression that displays bistable switching. In this model, the external state represents the number of protein molecules produced by a hypothetical gene. Once produced, a protein is eventually degraded. The internal state represents the activated or unactivated state of the gene; in the activated state the gene produces protein more rapidly than in the unactivated state. The gene is activated by a dimer of the protein it produces so that the activation rate depends on the current protein level. This is a well-studied model, and several model reductions and diffusion approximation methods are available to analyze its behavior. However, it is unclear if these methods accurately approximate long-time metastable behavior (i.e., mean switching time between metastable states of the bistable system). Diffusion approximations are generally known to fail in this regard.
Collapse
Affiliation(s)
- Jay Newby
- Mathematical Institute, University of Oxford, 24-29 St Giles', Oxford, OX1 3LB, UK
| | | |
Collapse
|
45
|
Bressloff PC, Wilkerson J. Traveling pulses in a stochastic neural field model of direction selectivity. Front Comput Neurosci 2012. [PMID: 23181018 PMCID: PMC3501266 DOI: 10.3389/fncom.2012.00090] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/02/2022] Open
Abstract
We analyze the effects of extrinsic noise on traveling pulses in a neural field model of direction selectivity. The model consists of a one-dimensional scalar neural field with an asymmetric weight distribution consisting of an offset Mexican hat function. We first show how, in the absence of any noise, the system supports spontaneously propagating traveling pulses that can lock to externally moving stimuli. Using a separation of time-scales and perturbation methods previously developed for stochastic reaction-diffusion equations, we then show how extrinsic noise in the activity variables leads to a diffusive-like displacement (wandering) of the wave from its uniformly translating position at long time-scales, and fluctuations in the wave profile around its instantaneous position at short time-scales. In the case of freely propagating pulses, the wandering is characterized by pure Brownian motion, whereas in the case of stimulus-locked pulses, it is given by an Ornstein–Uhlenbeck process. This establishes that stimulus-locked pulses are more robust to noise.
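The two wandering regimes can be illustrated by simulating the pulse position directly (an illustrative one-variable reduction, not the neural field model; k and D are assumed parameters): for a free pulse the position is Brownian and its trial-to-trial variance grows linearly as 2Dt, while for a stimulus-locked pulse the position follows an Ornstein-Uhlenbeck process whose variance saturates at D/k.

```python
import numpy as np

rng = np.random.default_rng(2)

k, D = 1.0, 0.1                       # restoring rate (locking) and diffusivity
dt, steps, trials = 0.01, 2000, 500   # Euler-Maruyama setup

free = np.zeros(trials)               # freely propagating: pure Brownian motion
locked = np.zeros(trials)             # stimulus-locked: Ornstein-Uhlenbeck
var_free = np.empty(steps)
var_locked = np.empty(steps)
for i in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=(2, trials))
    free += np.sqrt(2.0 * D) * dW[0]
    locked += -k * locked * dt + np.sqrt(2.0 * D) * dW[1]
    var_free[i] = free.var()          # grows ~ 2*D*t
    var_locked[i] = locked.var()      # saturates ~ D/k

print(round(float(var_free[-1]), 1), round(float(var_locked[-1]), 2))
```

The saturating variance is the quantitative content of the claim that stimulus-locked pulses are more robust to noise.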
Collapse
Affiliation(s)
- Paul C Bressloff
- Department of Mathematics, University of Utah Salt Lake City, UT, USA
| | | |
Collapse
|
46
|
Wallace E, Petzold L, Gillespie D, Sanft K. Linear noise approximation is valid over limited times for any chemical system that is sufficiently large. IET Syst Biol 2012; 6:102-15. [DOI: 10.1049/iet-syb.2011.0038] [Citation(s) in RCA: 62] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
|
47
|
Wang Y, Goodfellow M, Taylor PN, Baier G. Phase space approach for modeling of epileptic dynamics. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2012; 85:061918. [PMID: 23005138 DOI: 10.1103/physreve.85.061918] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/14/2012] [Revised: 05/13/2012] [Indexed: 06/01/2023]
Abstract
Epileptic electroencephalography recordings can be described in terms of four prototypic wave forms: fast sinusoidal oscillations, large slow waves, fast spiking, and spike waves. On the macroscopic level, these wave forms have been modeled by different mechanistic models which share canonical features. Here we derive a minimal model of excitatory and inhibitory processes with features common to all previous models. We can infer that at least three interacting processes are required to support the prototypic epileptic dynamics. Based on a separation of time scales we analyze the model in terms of interacting manifolds in phase space. This allows qualitative reverse engineering of all epileptic wave forms and transitions between them. We propose this method as a complement to traditional approaches to modeling epileptiform rhythms.
Collapse
Affiliation(s)
- Yujiang Wang
- Doctoral Training Centre Integrative Systems Biology, Manchester Interdisciplinary Biocentre, 131 Princess Street, Manchester M1 7DN, United Kingdom.
| | | | | | | |
Collapse
|
49
|
Leibold C. Renewal theory of coupled neuronal pools: stable states and slow trajectories. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2011; 84:031935. [PMID: 22060431 DOI: 10.1103/physreve.84.031935] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/15/2011] [Revised: 07/24/2011] [Indexed: 05/31/2023]
Abstract
A theory is provided to analyze the dynamics of delay-coupled pools of spiking neurons based on stability analysis of stationary firing. Transitions between stable and unstable regimes can be predicted by bifurcation analysis of the underlying integral dynamics. Close to the bifurcation point the network exhibits slowly changing activities and allows for slow collective phenomena like continuous attractors.
Collapse
Affiliation(s)
- Christian Leibold
- Department Biologie II, LMU Munich, Großhadernerstrasse 2, D-82152 Planegg, Germany
| |
Collapse
|
50
|
Cortical attractor network dynamics with diluted connectivity. Brain Res 2011; 1434:212-25. [PMID: 21875702 DOI: 10.1016/j.brainres.2011.08.002] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2011] [Revised: 07/29/2011] [Accepted: 08/02/2011] [Indexed: 11/23/2022]
Abstract
The connectivity of the cerebral cortex is diluted, with the probability of excitatory connections between even nearby pyramidal cells rarely more than 0.1, and in the hippocampus 0.04. To investigate the extent to which this diluted connectivity affects the dynamics of attractor networks in the cerebral cortex, we simulated an integrate-and-fire attractor network taking decisions between competing inputs with diluted connectivity of 0.25 or 0.1, and with the same number of synaptic connections per neuron for the recurrent collateral synapses within an attractor population as for full connectivity. The results indicated that there was less spiking-related noise with the diluted connectivity in that the stability of the network when in the spontaneous state of firing increased, and the accuracy of the correct decisions increased. The decision times were a little slower with diluted than with complete connectivity. Given that the capacity of the network is set by the number of recurrent collateral synaptic connections per neuron, on which there is a biological limit, the findings indicate that the stability of cortical networks, and the accuracy of their correct decisions or memory recall operations, can be increased by utilizing diluted connectivity and correspondingly increasing the number of neurons in the network, with little impact on the speed of processing of the cortex. Thus diluted connectivity can decrease cortical spiking-related noise. In addition, we show that the Fano factor for the trial-to-trial variability of the neuronal firing decreases from the spontaneous firing state value when the attractor network makes a decision. This article is part of a Special Issue entitled "Neural Coding".
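The Fano factor comparison at the end of the abstract can be illustrated with a toy count model (not the integrate-and-fire network; the rate values and the gamma rate distribution are assumptions): trial-to-trial rate fluctuations in the spontaneous state inflate F = Var(count)/Mean(count) above 1, whereas a rate pinned by an attractor yields near-Poisson counts with F close to 1.

```python
import numpy as np

rng = np.random.default_rng(3)

trials, T = 2000, 1.0   # number of trials and counting window (s)

# Spontaneous state: the rate itself varies across trials (doubly stochastic
# Poisson), so count variance exceeds the mean and F > 1.
rates = rng.gamma(shape=4.0, scale=5.0, size=trials)   # mean rate 20 Hz
spont_counts = rng.poisson(rates * T)

# Decision state: the attractor pins the rate on every trial, leaving
# plain Poisson counts with F ~ 1.
decided_counts = rng.poisson(20.0 * T, size=trials)

F_spont = spont_counts.var() / spont_counts.mean()
F_dec = decided_counts.var() / decided_counts.mean()
print(round(float(F_spont), 1), round(float(F_dec), 1))
```

The drop from F_spont to F_dec mirrors the reported decrease in trial-to-trial variability when the network commits to a decision.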
Collapse
|