1. Politi A, Torcini A. A robust balancing mechanism for spiking neural networks. Chaos 2024; 34:041102. PMID: 38639569. DOI: 10.1063/5.0199298.
Abstract
Dynamical balance of excitation and inhibition is usually invoked to explain the irregular, low firing activity observed in the cortex. We propose a robust nonlinear balancing mechanism for a random network of spiking neurons, which also works in the absence of strong external currents. Biologically, the mechanism exploits the plasticity of excitatory-excitatory synapses induced by short-term depression. Mathematically, the nonlinear response of the synaptic activity is the key ingredient responsible for the emergence of a stable balanced regime. Our claim is supported by a simple self-consistent analysis accompanied by extensive simulations performed for increasing network sizes. The observed regime is essentially fluctuation-driven and characterized by highly irregular spiking dynamics of all neurons.
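As a rough illustration of the mechanism summarized above, the sketch below (a minimal toy, not the authors' code) simulates a sparse LIF network in which only the E-to-E synapses are weakened by Tsodyks-Markram-style short-term depression. All parameter values are illustrative assumptions, and a modest O(1) constant drive is included to keep the toy network active; the paper's point is that no strong O(sqrt(K)) external current is required.

```python
# Minimal sketch (not the authors' implementation): sparse LIF network with
# short-term depression restricted to E->E synapses.  Parameters illustrative.
import numpy as np

rng = np.random.default_rng(0)
NE, NI = 400, 100                      # excitatory / inhibitory population sizes
N, K = NE + NI, 40                     # K = average in-degree of the random graph
J = (rng.random((N, N)) < K / N).astype(float)
np.fill_diagonal(J, 0.0)
gE, gI = 1.0 / np.sqrt(K), -4.0 / np.sqrt(K)   # synaptic weights scaled as 1/sqrt(K)

dt, steps = 0.1, 20000                 # 0.1 ms resolution, 2 s of activity
tau_m, v_th, v_re = 20.0, 1.0, 0.0     # membrane constant (ms), threshold, reset
tau_d, U = 500.0, 0.2                  # depression recovery time (ms), release fraction
I_ext = 1.1                            # modest O(1) drive, not the O(sqrt(K)) current
                                       # of classical balanced networks

v = rng.random(N) * v_th
x = np.ones(NE)                        # available synaptic resources of E neurons
spikes = np.zeros(N)

for _ in range(steps):
    fired = v >= v_th
    v[fired] = v_re
    spikes += fired
    sE, sI = fired[:NE].astype(float), fired[NE:].astype(float)
    # E->E transmission is gated by the depression variable x (the nonlinear
    # synaptic response highlighted in the abstract); E->I and I->* are static.
    inp_E = gE * J[:NE, :NE] @ (x * sE) + gI * J[:NE, NE:] @ sI
    inp_I = gE * J[NE:, :NE] @ sE + gI * J[NE:, NE:] @ sI
    v += dt / tau_m * (I_ext - v) + np.concatenate([inp_E, inp_I])
    x += dt * (1.0 - x) / tau_d - U * x * sE   # resources drop on E spikes, then recover

rate = 1000.0 * spikes / (steps * dt)
print("mean E rate: %.1f Hz, mean I rate: %.1f Hz" % (rate[:NE].mean(), rate[NE:].mean()))
```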
Affiliation(s)
- Antonio Politi
- Institute for Complex Systems and Mathematical Biology and Department of Physics, Aberdeen AB24 3UE, United Kingdom
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- Alessandro Torcini
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
- Laboratoire de Physique Théorique et Modélisation, CY Cergy Paris Université, CNRS UMR 8089, 95302 Cergy-Pontoise cedex, France
- INFN Sezione di Firenze, Via Sansone 1 50019 Sesto Fiorentino, Italy
2. Evaluating the extent to which homeostatic plasticity learns to compute prediction errors in unstructured neuronal networks. J Comput Neurosci 2022; 50:357-373. PMID: 35657570. DOI: 10.1007/s10827-022-00820-0.
Abstract
The brain is believed to operate in part by making predictions about sensory stimuli and encoding deviations from these predictions in the activity of "prediction error neurons." This principle defines the widely influential theory of predictive coding. The precise circuitry and plasticity mechanisms through which animals learn to compute and update their predictions are unknown. Homeostatic inhibitory synaptic plasticity is a promising mechanism for training neuronal networks to perform predictive coding. Homeostatic plasticity causes neurons to maintain a steady, baseline firing rate in response to inputs that closely match the inputs on which a network was trained, but firing rates can deviate from this baseline in response to stimuli that are mismatched from training. We systematically combine computer simulations and mathematical analysis to test the extent to which randomly connected, unstructured networks compute prediction errors after training with homeostatic inhibitory synaptic plasticity. We find that homeostatic plasticity alone is sufficient for computing prediction errors for trivial stimuli that are constant in time, but not for more realistic time-varying stimuli. We use a mean-field theory of plastic networks to explain our findings and characterize the assumptions under which they apply.
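A toy, rate-based sketch of the idea (a single threshold-linear unit with one plastic inhibitory weight; the rule below is an assumption in the spirit of homeostatic inhibitory plasticity, not the paper's spiking-network setup): after training, the trained input evokes the target rate, while a mismatched input produces a deviation that reads out as a prediction error.

```python
# Toy homeostatic-inhibitory-plasticity sketch; all values are illustrative.

eta, rho0 = 0.01, 5.0                  # learning rate, homeostatic target rate (Hz)
w_exc, w_inh = 2.0, 0.5                # fixed excitatory weight, plastic inhibitory weight

def rate(x_exc, x_inh):
    """Threshold-linear response of the readout neuron."""
    return max(0.0, w_exc * x_exc - w_inh * x_inh)

x_exc_train, x_inh_train = 10.0, 10.0  # the 'predictable' training input
for _ in range(5000):                  # homeostatic training phase
    r = rate(x_exc_train, x_inh_train)
    # inhibition grows while the rate exceeds the target, shrinks below it
    w_inh = max(0.0, w_inh + eta * x_inh_train * (r - rho0))

print("trained input    ->", rate(x_exc_train, x_inh_train))  # ~rho0: prediction matched
print("mismatched input ->", rate(15.0, 10.0))                # deviation ~ prediction error
```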
3. di Volo M, Segneri M, Goldobin DS, Politi A, Torcini A. Coherent oscillations in balanced neural networks driven by endogenous fluctuations. Chaos 2022; 32:023120. PMID: 35232059. DOI: 10.1063/5.0075751.
Abstract
We present a detailed analysis of the dynamical regimes observed in a balanced network of identical quadratic integrate-and-fire neurons with sparse connectivity for homogeneous and heterogeneous in-degree distributions. Depending on the parameter values, either an asynchronous regime or periodic oscillations spontaneously emerge. Numerical simulations are compared with a mean-field model based on a self-consistent Fokker-Planck equation (FPE). The FPE reproduces the asynchronous dynamics in the homogeneous case quite well, whether a Poissonian or a renewal distribution is assumed for the incoming spike trains. An exact self-consistent solution for the mean firing rate, obtained in the limit of infinite in-degree, allows us to identify balanced regimes that can be either mean- or fluctuation-driven. A low-dimensional reduction of the FPE in terms of circular cumulants is also considered. Two cumulants suffice to reproduce the transition scenario observed in the network. The emergence of periodic collective oscillations is well captured by the mean-field models in both the homogeneous and heterogeneous setups upon tuning either the connectivity or the input DC current. In the heterogeneous situation, we also analyze the role of structural heterogeneity.
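For orientation, here is a bare-bones network-side sketch of the kind of model analyzed above: a sparse, purely inhibitory network of quadratic integrate-and-fire neurons with instantaneous synapses and balanced scaling, integrated with a simple Euler step. All parameters are illustrative, and the Fokker-Planck / circular-cumulant machinery itself is not reproduced here.

```python
# Sparse inhibitory QIF network with instantaneous synapses and balanced scaling
# (weights ~ 1/sqrt(K), drive ~ sqrt(K)); a toy sketch with illustrative values.
import numpy as np

rng = np.random.default_rng(4)
N, K = 1000, 100                        # neurons, average in-degree (K << N)
J0, I0 = 1.0, 0.25                      # coupling and drive in balanced units
g, I = J0 / np.sqrt(K), I0 * np.sqrt(K)
C = (rng.random((N, N)) < K / N).astype(float)   # sparse random connectivity
np.fill_diagonal(C, 0.0)

dt, steps = 1e-3, 10000                 # membrane time units
v_peak, v_reset = 50.0, -50.0           # finite stand-ins for +/- infinity
v = rng.uniform(-5.0, 5.0, N)
pop_rate = []

for _ in range(steps):
    fired = v >= v_peak
    v[fired] = v_reset
    # QIF dynamics dv/dt = v^2 + I, with instantaneous inhibitory pulses
    v += dt * (v**2 + I) - g * (C @ fired.astype(float))
    pop_rate.append(fired.mean() / dt)

# In the large-K balanced limit the mean rate approaches I0/J0; at finite K
# (and with fluctuations) the simulated value deviates from that estimate.
print("network rate: %.3f   large-K balance estimate: %.3f" % (np.mean(pop_rate), I0 / J0))
```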
Affiliation(s)
- Matteo di Volo
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- Marco Segneri
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- Denis S Goldobin
- Institute of Continuous Media Mechanics, Ural Branch of RAS, Acad. Korolev street 1, 614013 Perm, Russia
- Antonio Politi
- Institute for Pure and Applied Mathematics and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
4. Bi H, di Volo M, Torcini A. Asynchronous and Coherent Dynamics in Balanced Excitatory-Inhibitory Spiking Networks. Front Syst Neurosci 2021; 15:752261. PMID: 34955768. PMCID: PMC8702645. DOI: 10.3389/fnsys.2021.752261.
Abstract
Dynamic excitatory-inhibitory (E-I) balance is a paradigmatic mechanism invoked to explain the irregular, low firing activity observed in the cortex. However, we show that E-I balance can also be at the origin of other regimes observable in the brain. The analysis is performed by combining extensive simulations of sparse E-I networks composed of N spiking neurons with analytical investigations of low-dimensional neural mass models. The bifurcation diagrams derived for the neural mass model allow us to classify the possible asynchronous and coherent behaviors emerging in balanced E-I networks with structural heterogeneity for any finite in-degree K. Analytic mean-field (MF) results show that both supra- and sub-threshold balanced asynchronous regimes are observable in our system in the limit N >> K >> 1. Due to the heterogeneity, the asynchronous states are characterized at the microscopic level by the splitting of the neurons into three groups: silent, fluctuation-driven, and mean-driven. These features are consistent with experimental observations reported for heterogeneous neural circuits. The coherent rhythms observed in our system range from periodic and quasi-periodic collective oscillations (COs) to coherent chaos. These rhythms are characterized by regular or irregular temporal fluctuations combined with spatial coherence, somewhat similar to the coherent fluctuations observed in the cortex over multiple spatial scales. The COs can emerge through two different mechanisms. The first is analogous to the pyramidal-interneuron gamma (PING) mechanism usually invoked for the emergence of γ-oscillations. The second is intimately related to the presence of current fluctuations, which sustain COs characterized by essentially simultaneous bursting of the two populations. We observe period-doubling cascades involving the PING-like COs that finally lead to the appearance of coherent chaos. Fluctuation-driven COs are usually observable in our system as quasi-periodic collective motions characterized by two incommensurate frequencies. However, for sufficiently strong current fluctuations these collective rhythms can lock, representing a novel mechanism of frequency locking in neural populations promoted by intrinsic fluctuations. COs are observable for any finite in-degree K; however, their existence in the limit N >> K >> 1 remains uncertain.
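One standard way to separate the asynchronous regimes from the collective oscillations classified above is a population-coherence measure; the snippet below computes a Golomb-style coherence χ on surrogate traces (a generic diagnostic with made-up data, not code or results from the paper).

```python
# Golomb-style coherence chi: ~0 for asynchronous dynamics, order 1 for
# collective oscillations.  Demonstrated on surrogate "membrane" traces.
import numpy as np

rng = np.random.default_rng(5)

def chi(V):
    """V has shape (neurons, time): sqrt of the ratio between the variance of
    the population-averaged trace and the mean single-trace variance."""
    return np.sqrt(np.var(V.mean(axis=0)) / np.mean(np.var(V, axis=1)))

t = np.arange(0.0, 1000.0, 0.5)                        # ms
noise = rng.normal(size=(200, t.size))
asynchronous = noise                                   # independent fluctuations only
coherent = 2.0 * np.sin(2 * np.pi * 0.04 * t) + noise  # shared ~40 Hz rhythm + noise

print("chi asynchronous: %.2f" % chi(asynchronous))    # close to 0 (~ 1/sqrt(N))
print("chi coherent:     %.2f" % chi(coherent))        # order 1
```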
Affiliation(s)
- Hongjie Bi
- CY Cergy Paris Université, Laboratoire de Physique Théorique et Modélisation, CNRS, UMR 8089, Cergy-Pontoise, France
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology, Okinawa, Japan
- Matteo di Volo
- CY Cergy Paris Université, Laboratoire de Physique Théorique et Modélisation, CNRS, UMR 8089, Cergy-Pontoise, France
- Alessandro Torcini
- CY Cergy Paris Université, Laboratoire de Physique Théorique et Modélisation, CNRS, UMR 8089, Cergy-Pontoise, France
- CNR-Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, Sesto Fiorentino, Italy
5. Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. PMID: 33979336. PMCID: PMC8143429. DOI: 10.1371/journal.pcbi.1008958.
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.

Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity. We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
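For concreteness, the snippet below implements the classic trace-based, pair-wise STDP update for a single synapse driven by independent Poisson pre- and postsynaptic spikes. It illustrates the type of rule covered by such a theory; the amplitudes, time constants and rates are illustrative, and this is not the paper's balanced-network implementation.

```python
# Classic pair-based STDP via exponential traces for one synapse with Poisson
# pre/post activity.  All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
A_plus, A_minus = 0.010, 0.012        # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0      # STDP window time constants (ms)
r_pre, r_post = 10.0, 10.0            # presynaptic / postsynaptic rates (Hz)
dt, steps = 1.0, 60000                # 1 ms resolution, 60 s
w, x_pre, x_post = 0.5, 0.0, 0.0      # weight and pre/post spike traces

for _ in range(steps):
    pre = rng.random() < r_pre * dt / 1000.0
    post = rng.random() < r_post * dt / 1000.0
    x_pre += -dt / tau_plus * x_pre + pre      # decaying trace of presynaptic spikes
    x_post += -dt / tau_minus * x_post + post  # decaying trace of postsynaptic spikes
    # potentiate on post spikes (by the pre trace), depress on pre spikes (by the post trace)
    w = float(np.clip(w + A_plus * x_pre * post - A_minus * x_post * pre, 0.0, 1.0))

print("final weight:", round(w, 3))
```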
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
- Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
6. Baker C, Zhu V, Rosenbaum R. Nonlinear stimulus representations in neural circuits with approximate excitatory-inhibitory balance. PLoS Comput Biol 2020; 16:e1008192. PMID: 32946433. PMCID: PMC7526938. DOI: 10.1371/journal.pcbi.1008192.
Abstract
Balanced excitation and inhibition is widely observed in cortex. How does this balance shape neural computations and stimulus representations? This question is often studied using computational models of neuronal networks in a dynamically balanced state. But balanced network models predict a linear relationship between stimuli and population responses. So how do cortical circuits implement nonlinear representations and computations? We show that every balanced network architecture admits stimuli that break the balanced state, and that these breaks in balance push the network into a "semi-balanced state" characterized by excess inhibition to some neurons, but an absence of excess excitation. The semi-balanced state produces nonlinear stimulus representations and nonlinear computations, is unavoidable in networks driven by multiple stimuli, is consistent with cortical recordings, and has a direct mathematical relationship to artificial neural networks.
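A small numerical illustration of the core point, with an invented two-population gain matrix (not the paper's parameters): the classical balanced-state prediction r = -W^{-1}X is admissible for one stimulus but returns a negative "rate" for another, signalling that no balanced state exists there and that the network must settle into the semi-balanced regime described above.

```python
# Classical balanced-network prediction r = -W^{-1} X for two stimuli.
# W couples one excitatory and one inhibitory population; values illustrative.
import numpy as np

W = np.array([[1.0, -2.0],
              [1.0, -1.5]])            # recurrent gains (E column positive, I column negative)
stimuli = [np.array([1.0, 0.5]),       # admits a balanced state
           np.array([1.0, 1.0])]       # breaks it (relatively strong drive to inhibition)

for X in stimuli:
    r = -np.linalg.solve(W, X)         # balanced-state rates in the strong-coupling limit
    print("X =", X, " ->  r =", r)
# The second stimulus yields a negative entry: balance cannot hold, and the
# network instead lands in a "semi-balanced" state (excess inhibition to some
# neurons, no excess excitation), which makes the population response a
# nonlinear, rectified function of the stimulus.
```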
Affiliation(s)
- Cody Baker
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Vicky Zhu
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, IN, USA
7. Ebsch C, Rosenbaum R. Spatially extended balanced networks without translationally invariant connectivity. J Math Neurosci 2020; 10:8. PMID: 32405723. PMCID: PMC7221049. DOI: 10.1186/s13408-020-00085-w.
Abstract
Networks of neurons in the cerebral cortex exhibit a balance between excitation (positive input current) and inhibition (negative input current). Balanced network theory provides a parsimonious mathematical model of this excitatory-inhibitory balance using randomly connected networks of model neurons in which balance is realized as a stable fixed point of network dynamics in the limit of large network size. Balanced network theory reproduces many salient features of cortical network dynamics such as asynchronous-irregular spiking activity. Early studies of balanced networks did not account for the spatial topology of cortical networks. Later works introduced spatial connectivity structure, but were restricted to networks with translationally invariant connectivity structure in which connection probability depends on distance alone and boundaries are assumed to be periodic. Spatial connectivity structure in cortical networks does not always satisfy these assumptions. We use the mathematical theory of integral equations to extend the mean-field theory of balanced networks to account for a more general dependence of connection probability on the spatial locations of pre- and postsynaptic neurons. We compare our mathematical derivations to simulations of large networks of recurrently connected spiking neuron models.
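The sketch below (illustrative, not taken from the paper) builds the kind of connectivity the extended theory is meant to handle: a connection probability that depends on the absolute positions of the pre- and postsynaptic neurons rather than on distance alone, with non-periodic boundaries. It then computes the spatial in-degree profile, the quantity that enters the integral-equation mean field.

```python
# Sample a spatial network whose connection probability depends on both the
# pre- and postsynaptic positions (not translationally invariant, no periodic
# boundary), then inspect the resulting spatial in-degree profile.
import numpy as np

rng = np.random.default_rng(9)
N = 2000
pos = rng.random(N)                                  # neuron positions on [0, 1]
sigma = 0.05 + 0.15 * pos                            # connection footprint widens with position
# p(x_post, y_pre): Gaussian in distance, but its width depends on location
P = 0.3 * np.exp(-(pos[:, None] - pos[None, :]) ** 2 / (2 * sigma[:, None] ** 2))
np.fill_diagonal(P, 0.0)
A = rng.random((N, N)) < P                           # one sampled adjacency matrix

in_deg = A.sum(axis=1)
bins = np.linspace(0.0, 1.0, 11)
which = np.digitize(pos, bins) - 1
profile = [in_deg[which == b].mean() for b in range(10)]
print("mean in-degree by spatial bin:", np.round(profile, 1))
# The mean-field balance condition then involves integrals of this location-
# dependent kernel rather than a convolution over distance.
```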
Affiliation(s)
- Christopher Ebsch
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, USA
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, USA.
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, USA.
8. Inference of synaptic connectivity and external variability in neural microcircuits. J Comput Neurosci 2020; 48:123-147. PMID: 32080777. DOI: 10.1007/s10827-020-00739-4.
Abstract
A major goal in neuroscience is to estimate neural connectivity from large scale extracellular recordings of neural activity in vivo. This is challenging in part because any such activity is modulated by the unmeasured external synaptic input to the network, known as the common input problem. Many different measures of functional connectivity have been proposed in the literature, but their direct relationship to synaptic connectivity is often assumed or ignored. For in vivo data, measurements of this relationship would require a knowledge of ground truth connectivity, which is nearly always unavailable. Instead, many studies use in silico simulations as benchmarks for investigation, but such approaches necessarily rely upon a variety of simplifying assumptions about the simulated network and can depend on numerous simulation parameters. We combine neuronal network simulations, mathematical analysis, and calcium imaging data to address the question of when and how functional connectivity, synaptic connectivity, and latent external input variability can be untangled. We show numerically and analytically that, even though the precision matrix of recorded spiking activity does not uniquely determine synaptic connectivity, it is in practice often closely related to synaptic connectivity. This relation becomes more pronounced when the spatial structure of neuronal variability is jointly considered.
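A toy linear-Gaussian surrogate of the main message (an assumed stand-in, not the paper's spiking simulations or calcium-imaging analysis): the precision matrix of the recorded activity is not equal to the coupling matrix, but its off-diagonal entries track the ground-truth connectivity, and an unmeasured common input perturbs that relationship.

```python
# Linear-Gaussian surrogate: activity x = (I - A)^(-1) (private noise + common drive).
# Compare the off-diagonal of the negated precision matrix with the true coupling A.
import numpy as np

rng = np.random.default_rng(8)
n, T = 10, 50000
A = np.zeros((n, n))
upper = np.triu(rng.random((n, n)) < 0.3, 1)
A[upper] = 0.15 * rng.choice([-1.0, 1.0], upper.sum())
A = A + A.T                                        # symmetric ground-truth coupling
L = np.linalg.inv(np.eye(n) - A)                   # linear-response filter of the network

iu = np.triu_indices(n, 1)
for c in (0.0, 2.0):                               # without / with unmeasured common input
    drive = rng.normal(size=(T, n)) + c * rng.normal(size=(T, 1))
    X = drive @ L.T
    precision = np.linalg.inv(np.cov(X.T))
    corr = np.corrcoef(-precision[iu], A[iu])[0, 1]
    print("common-input strength %.1f -> corr(-precision, A) = %.2f" % (c, corr))
```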
9. Vegué M, Roxin A. Firing rate distributions in spiking networks with heterogeneous connectivity. Phys Rev E 2019; 100:022208. PMID: 31574753. DOI: 10.1103/physreve.100.022208.
Abstract
Mean-field theory for networks of spiking neurons based on the so-called diffusion approximation has been used to calculate certain measures of neuronal activity which can be compared with experimental data. This includes the distribution of firing rates across the network. However, the theory in its current form applies only to networks in which there is relatively little heterogeneity in the number of incoming and outgoing connections per neuron. Here we extend this theory to include networks with arbitrary degree distributions. Furthermore, the theory takes into account correlations in the in-degree and out-degree of neurons, which would arise, e.g., in the case of networks with hublike neurons. Finally, we show that networks with broad and positively correlated degrees can generate a large-amplitude sustained response to transient stimuli which does not occur in more homogeneous networks.
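A toy heterogeneous mean-field calculation in the same spirit (the transfer function, coupling and in-degree distribution below are assumptions; the paper works with the diffusion approximation for spiking neurons): each neuron's rate is set by its in-degree through a common transfer function, the network-mean rate is found self-consistently, and a broad in-degree distribution translates directly into a broad firing-rate distribution.

```python
# Toy heterogeneous mean field: rate_i = phi(I0 + J * K_i * r_mean), with the
# network-mean rate solved self-consistently by a damped fixed-point iteration.
import numpy as np

rng = np.random.default_rng(6)
N, Kbar = 5000, 100
K = rng.lognormal(mean=np.log(Kbar), sigma=0.4, size=N)  # broad in-degree distribution
I0, J = 3.0, -0.02                                       # drive, net (inhibition-dominated) coupling
phi = lambda mu: 40.0 / (1.0 + np.exp(-mu))              # bounded transfer function (Hz)

r_mean = 5.0
for _ in range(400):                                     # damped self-consistency loop
    r_mean += 0.05 * (phi(I0 + J * K * r_mean).mean() - r_mean)

rates = phi(I0 + J * K * r_mean)
print("self-consistent mean rate: %.2f Hz" % r_mean)
print("rate CV across neurons:    %.2f" % (rates.std() / rates.mean()))
```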
Affiliation(s)
- Marina Vegué
- Centre de Recerca Matemàtica, Bellaterra, Spain
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Barcelona, Spain
- Instituto de Neurociencias, Consejo Superior de Investigaciones Científicas y Universidad Miguel Hernández, Sant Joan d'Alacant, Alicante, Spain
- Alex Roxin
- Centre de Recerca Matemàtica, Bellaterra, Spain
- Barcelona Graduate School of Mathematics, Barcelona, Spain
10. Baker C, Ebsch C, Lampl I, Rosenbaum R. Correlated states in balanced neuronal networks. Phys Rev E 2019; 99:052414. PMID: 31212573. DOI: 10.1103/physreve.99.052414.
Abstract
Understanding the magnitude and structure of interneuronal correlations and their relationship to synaptic connectivity structure is an important and difficult problem in computational neuroscience. Early studies showed that neuronal network models with excitatory-inhibitory balance naturally create very weak spike train correlations, defining the "asynchronous state." Later work showed that, under some connectivity structures, balanced networks can produce larger correlations between some neuron pairs, even when the average correlation is very small. All of these previous studies assume that the local network receives feedforward synaptic input from a population of uncorrelated spike trains. We show that when the spike trains providing feedforward input are correlated, the downstream recurrent network produces much larger correlations. We provide an in-depth analysis of the resulting "correlated state" in balanced networks and show that, unlike the asynchronous state, it produces a tight excitatory-inhibitory balance consistent with in vivo cortical recordings.
Affiliation(s)
- Cody Baker
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Christopher Ebsch
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Ilan Lampl
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, 7610001, Israel
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA
11. Gu QLL, Li S, Dai WP, Zhou D, Cai D. Balanced Active Core in Heterogeneous Neuronal Networks. Front Comput Neurosci 2019; 12:109. PMID: 30745868. PMCID: PMC6360995. DOI: 10.3389/fncom.2018.00109.
Abstract
It is hypothesized that cortical neuronal circuits operate in a global balanced state, i.e., the majority of neurons fire irregularly by receiving balanced inputs of excitation and inhibition. Meanwhile, it has been observed in experiments that sensory information is often sparsely encoded by only a small set of firing neurons, while neurons in the rest of the network are silent. The phenomenon of sparse coding challenges the hypothesis of a global balanced state in the brain. To reconcile this, here we address the issue of whether a balanced state can exist in a small number of firing neurons by taking into account the heterogeneity of network structure, such as scale-free and small-world networks. We propose necessary conditions and show that, under these conditions, for sparsely but strongly connected heterogeneous networks with various types of single-neuron dynamics, despite the fact that the whole network receives external inputs, there is a small active subnetwork (active core) inherently embedded within it. The neurons in this active core have relatively high firing rates, while the neurons in the rest of the network are quiescent. Surprisingly, although the whole network is heterogeneous and unbalanced, the active core possesses a balanced state and its connectivity structure is close to that of a homogeneous Erdős-Rényi network. The dynamics of the active core can be well predicted using the Fokker-Planck equation. Our results suggest that the balanced state may be maintained by a small group of spiking neurons embedded in a large heterogeneous network in the brain. The existence of the small active core reconciles the balanced state and sparse coding, and also provides a potential dynamical scenario underlying sparse coding in neuronal networks.
Affiliation(s)
- Qing-Long L Gu
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- Songting Li
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- Wei P Dai
- Department of Physics and Astronomy, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- David Cai
- School of Mathematical Sciences, MOE-LSC, and Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
- Courant Institute of Mathematical Sciences and Center for Neural Science, New York University, New York, NY, United States
- NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
12. di Volo M, Torcini A. Transition from Asynchronous to Oscillatory Dynamics in Balanced Spiking Networks with Instantaneous Synapses. Phys Rev Lett 2018; 121:128301. PMID: 30296134. DOI: 10.1103/physrevlett.121.128301.
Abstract
We report a transition from asynchronous to oscillatory behavior in balanced inhibitory networks for class I and II neurons with instantaneous synapses. Collective oscillations emerge for sufficiently connected networks. Their origin is understood in terms of a recently developed mean-field model, whose stable solution is a focus. Microscopic irregular firings, due to balance, trigger sustained oscillations by exciting the relaxation dynamics towards the macroscopic focus. In balanced excitatory-inhibitory networks, the same mechanism induces quasiperiodic collective oscillations.
Affiliation(s)
- Matteo di Volo
- Unité de Neuroscience, Information et Complexité (UNIC), CNRS FRE 3693, 1 avenue de la Terrasse, 91198 Gif sur Yvette, France
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, CNRS, UMR 8089, 95302 Cergy-Pontoise cedex, France
- Max Planck Institut für Physik komplexer Systeme, Nöthnitzer Str. 38, 01187 Dresden, Germany
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, 50019 Sesto Fiorentino, Italy
13. Imbalanced amplification: A mechanism of amplification and suppression from local imbalance of excitation and inhibition in cortical circuits. PLoS Comput Biol 2018; 14:e1006048. PMID: 29543827. PMCID: PMC5871018. DOI: 10.1371/journal.pcbi.1006048.
Abstract
Understanding the relationship between external stimuli and the spiking activity of cortical populations is a central problem in neuroscience. Dense recurrent connectivity in local cortical circuits can lead to counterintuitive response properties, raising the question of whether there are simple arithmetical rules for relating circuits’ connectivity structure to their response properties. One such arithmetic is provided by the mean-field theory of balanced networks, which is derived in a limit where excitatory and inhibitory synaptic currents precisely balance on average. However, balanced network theory is not applicable to some biologically relevant connectivity structures. We show that cortical circuits with such structure are susceptible to an amplification mechanism arising when excitatory-inhibitory balance is broken at the level of local subpopulations, but maintained at a global level. This amplification, which can be quantified by a linear correction to the classical mean-field theory of balanced networks, explains several response properties observed in cortical recordings and provides fundamental insights into the relationship between connectivity structure and neural responses in cortical circuits.

Understanding how the brain represents and processes stimuli requires a quantitative understanding of how signals propagate through networks of neurons. Developing such an understanding is made difficult by the dense interconnectivity of neurons, especially in the cerebral cortex. One approach to quantifying neural processing in the cortex is derived from observations that excitatory (positive) and inhibitory (negative) interactions between neurons tend to balance each other in many brain areas. This balance is achieved under a class of computational models called “balanced networks.” However, previous approaches to the mathematical analysis of balanced network models are not applicable under some biologically relevant connectivity structures. We show that, under these structures, balance between excitation and inhibition is necessarily broken and the resulting imbalance causes some stimulus features to be amplified. This “imbalanced amplification” of stimuli can explain several observations from recordings in mouse somatosensory and visual cortical circuits and provides fundamental insights into the relationship between connectivity structure and neural responses in cortical circuits.
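For reference, the classical mean-field theory of balanced networks that the correction is built on reduces, in the large in-degree limit, to a linear constraint on the population rates (a standard balanced-network result, written here in generic notation; the paper's contribution is the linear correction that applies when this constraint cannot be satisfied locally):

```latex
% Balanced-network mean field in the large in-degree limit K -> infinity.
% Total recurrent and feedforward inputs are both O(sqrt(K)), so finite rates
% require the O(sqrt(K)) contribution to cancel:
\sqrt{K}\,\bigl(W\mathbf{r} + \mathbf{X}\bigr) \;\text{finite as } K \to \infty
\;\Longrightarrow\; W\mathbf{r} + \mathbf{X} = 0
\;\Longrightarrow\; \mathbf{r} = -W^{-1}\mathbf{X}.
% This is admissible only if -W^{-1}\mathbf{X} \ge 0 elementwise; connectivity
% structures that violate it are the ones where local balance breaks and the
% imbalanced-amplification correction becomes relevant.
```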
14. Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. PMID: 28863386. DOI: 10.1016/j.conb.2017.07.011.
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
15. Pyle R, Rosenbaum R. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks. Phys Rev Lett 2017; 118:018103. PMID: 28106418. DOI: 10.1103/physrevlett.118.018103.
Abstract
Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.
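The training step referred to above ("reservoir computing framework") is sketched here with a generic rate-based reservoir and a ridge-regression readout; this is an assumed stand-in for the paper's spatially structured spiking reservoirs, with all sizes and signals illustrative.

```python
# Generic reservoir-computing sketch: drive a random recurrent reservoir with an
# input signal and train a linear readout by ridge regression.
import numpy as np

rng = np.random.default_rng(7)
N, T = 300, 3000
W_rec = 1.2 * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights near the stability edge
W_in = rng.normal(0.0, 1.0, N)

t = 0.01 * np.arange(T)
u = np.sin(2 * np.pi * 0.5 * t)                            # input signal
target = np.sin(2 * np.pi * 0.5 * t + 1.0) ** 3            # nonlinear, phase-shifted target

x = np.zeros(N)
states = np.empty((T, N))
for k in range(T):                                         # collect reservoir states
    x = np.tanh(W_rec @ x + W_in * u[k])
    states[k] = x

lam = 1e-3                                                 # ridge regularization
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ target)
pred = states @ W_out
print("training NRMSE: %.3f" % (np.sqrt(np.mean((pred - target) ** 2)) / target.std()))
```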
Affiliation(s)
- Ryan Pyle
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA
16. The Impact of Structural Heterogeneity on Excitation-Inhibition Balance in Cortical Networks. Neuron 2016; 92:1106-1121. PMID: 27866797. PMCID: PMC5158120. DOI: 10.1016/j.neuron.2016.10.027.
Abstract
Models of cortical dynamics often assume a homogeneous connectivity structure. However, we show that heterogeneous input connectivity can prevent the dynamic balance between excitation and inhibition, a hallmark of cortical dynamics, and yield unrealistically sparse and temporally regular firing. Anatomically based estimates of the connectivity of layer 4 (L4) rat barrel cortex and numerical simulations of this circuit indicate that the local network possesses substantial heterogeneity in input connectivity, sufficient to disrupt excitation-inhibition balance. We show that homeostatic plasticity in inhibitory synapses can align the functional connectivity to compensate for structural heterogeneity. Alternatively, spike-frequency adaptation can give rise to a novel state in which local firing rates adjust dynamically so that adaptation currents and synaptic inputs are balanced. This theory is supported by simulations of L4 barrel cortex during spontaneous and stimulus-evoked conditions. Our study shows how synaptic and cellular mechanisms yield fluctuation-driven dynamics despite structural heterogeneity in cortical circuits.
Highlights
- Structural heterogeneity threatens the dynamic balance of excitation and inhibition
- Reconstruction of cortical networks reveals significant structural heterogeneity
- Spike-frequency adaptation can act locally to facilitate global balance
- Inhibitory homeostatic plasticity can compensate for structural imbalance
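A back-of-the-envelope version of the adaptation mechanism described in the abstract (threshold-linear rate units with strong spike-frequency adaptation; gains, thresholds and inputs are illustrative assumptions): steady-state rates adjust so that the adaptation current absorbs most of each neuron's mean input, leaving every unit close to threshold, i.e. fluctuation-driven, despite strongly heterogeneous inputs.

```python
# Steady state of r = g * max(I - b*r - theta, 0): with strong adaptation
# (g*b >> 1) the net drive I - b*r is compressed toward the threshold theta.
import numpy as np

g, b, theta = 20.0, 2.0, 1.0                     # gain, adaptation strength, threshold
inputs = np.array([1.5, 3.0, 6.0, 12.0])         # heterogeneous mean inputs (8x spread)
rates = g * np.maximum(inputs - theta, 0.0) / (1.0 + g * b)
net_drive = inputs - b * rates                   # what each neuron effectively sees

print("rates:    ", np.round(rates, 2))
print("net drive:", np.round(net_drive, 2))      # all close to theta: fluctuation-driven regime
```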