1. Pazó D. Discontinuous transition to chaos in a canonical random neural network. Phys Rev E 2024; 110:014201. PMID: 39161016. DOI: 10.1103/physreve.110.014201.
Abstract
We study a paradigmatic random recurrent neural network introduced by Sompolinsky, Crisanti, and Sommers (SCS). In the infinite-size limit, this system exhibits a direct transition from a homogeneous rest state to chaotic behavior, with the Lyapunov exponent gradually increasing from zero. We generalize the SCS model by considering odd, saturating, nonlinear transfer functions beyond the usual choice ϕ(x) = tanh x. A discontinuous transition to chaos occurs whenever the slope of ϕ at 0 is a local minimum, i.e., for ϕ'''(0) > 0. Chaos appears out of the blue, through an attractor-repeller fold. Accordingly, the Lyapunov exponent stays away from zero at the birth of chaos.
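For readers who want to probe this numerically, the sketch below (a rough illustration, not the paper's exact protocol; the network size, gain g, and the specific transfer function are arbitrary choices) simulates the SCS rate equations with Gaussian couplings of variance g²/N and estimates the largest Lyapunov exponent by co-evolving a tangent vector. The function ϕ(x) = tanh(x + x³) is one odd, saturating choice with ϕ'''(0) > 0.

```python
# Sketch (not the paper's protocol): dx_i/dt = -x_i + sum_j J_ij * phi(x_j),
# J_ij ~ N(0, g^2/N); largest Lyapunov exponent via a co-evolved tangent vector.
import numpy as np

def largest_lyapunov(phi, dphi, g=2.0, N=500, T=100.0, dt=0.05, seed=0):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, N)          # network state
    v = rng.normal(0.0, 1.0, N)          # tangent vector
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ phi(x))
        v = v + dt * (-v + J @ (dphi(x) * v))
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
    return log_growth / T

# phi(x) = tanh(x) is the usual choice; phi(x) = tanh(x + x^3) is one odd,
# saturating function with phi'''(0) > 0, the condition highlighted above.
tanh, dtanh = np.tanh, lambda x: 1.0 - np.tanh(x) ** 2
cub = lambda x: np.tanh(x + x ** 3)
dcub = lambda x: (1.0 + 3.0 * x ** 2) * (1.0 - np.tanh(x + x ** 3) ** 2)

print("lambda_max, tanh        :", largest_lyapunov(tanh, dtanh))
print("lambda_max, tanh(x+x^3) :", largest_lyapunov(cub, dcub))
```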
2. Sanzeni A, Palmigiano A, Nguyen TH, Luo J, Nassi JJ, Reynolds JH, Histed MH, Miller KD, Brunel N. Mechanisms underlying reshuffling of visual responses by optogenetic stimulation in mice and monkeys. Neuron 2023; 111:4102-4115.e9. PMID: 37865082. PMCID: PMC10841937. DOI: 10.1016/j.neuron.2023.09.018.
Abstract
The ability to optogenetically perturb neural circuits opens an unprecedented window into mechanisms governing circuit function. We analyzed and theoretically modeled neuronal responses to visual and optogenetic inputs in mouse and monkey V1. In both species, optogenetic stimulation of excitatory neurons strongly modulated the activity of single neurons yet had weak or no effects on the distribution of firing rates across the population. Thus, the optogenetic inputs reshuffled firing rates across the network. Key statistics of mouse and monkey responses lay on a continuum, with mice and monkeys occupying the low- and high-rate regions, respectively. We show that neuronal reshuffling emerges generically in randomly connected excitatory/inhibitory networks, provided the coupling strength (a combination of recurrent coupling and external input) is strong enough that powerful inhibitory feedback cancels the mean optogenetic input. A more realistic model, distinguishing tuned visual from untuned optogenetic input in a structured network, reduces the coupling strength needed to explain reshuffling.
Affiliation(s)
- Alessandro Sanzeni
- Department of Computing Sciences, Bocconi University, 20100 Milan, Italy; Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA
- Agostina Palmigiano
- Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA
- Tuan H Nguyen
- Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Physics, Columbia University, New York, NY 10027, USA
- Junxiang Luo
- Systems Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
- Jonathan J Nassi
- Systems Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
- John H Reynolds
- Systems Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
- Mark H Histed
- National Institute of Mental Health Intramural Program, NIH, Bethesda, MD 20814, USA
- Kenneth D Miller
- Center for Theoretical Neuroscience and Mortimer B Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA; Department of Neuroscience, Swartz Program in Theoretical Neuroscience, Kavli Institute for Brain Science, College of Physicians and Surgeons and Mortimer B. Zuckerman Mind Brain Behavior Institute, Columbia University, New York City, NY 10027, USA.
- Nicolas Brunel
- Department of Neurobiology, Duke University, Durham, NC 27710, USA; Department of Physics, Duke University, Durham, NC 27710, USA.

3. Cimeša L, Ciric L, Ostojic S. Geometry of population activity in spiking networks with low-rank structure. PLoS Comput Biol 2023; 19:e1011315. PMID: 37549194. PMCID: PMC10461857. DOI: 10.1371/journal.pcbi.1011315.
Abstract
Recurrent network models are instrumental in investigating how behaviorally relevant computations emerge from collective neural dynamics. A recently developed class of models based on low-rank connectivity provides an analytically tractable framework for understanding how connectivity structure determines the geometry of low-dimensional dynamics and the ensuing computations. Such models, however, lack some fundamental biological constraints, and in particular represent individual neurons in terms of abstract units that communicate through continuous firing rates rather than discrete action potentials. Here we examine how far the theoretical insights obtained from low-rank rate networks transfer to more biologically plausible networks of spiking neurons. Adding a low-rank structure on top of random excitatory-inhibitory connectivity, we systematically compare the geometry of activity in networks of integrate-and-fire neurons to rate networks with statistically equivalent low-rank connectivity. We show that the mean-field predictions of rate networks allow us to identify low-dimensional dynamics at constant population-average activity in spiking networks, as well as novel nonlinear regimes of activity such as out-of-phase oscillations and slow manifolds. We finally exploit these results to directly build spiking networks that perform nonlinear computations.
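As a loose illustration of the connectivity construction described here (all scalings and sparsity values are arbitrary, and a rate network stands in for the integrate-and-fire networks actually studied), one can add a rank-one term to a random excitatory-inhibitory matrix and track the corresponding low-dimensional latent variable:

```python
# Sketch: random sparse E-I connectivity plus a rank-one term outer(m, n) / N.
import numpy as np

rng = np.random.default_rng(0)
N, p = 800, 0.1
N_E = int(0.8 * N)                                     # 80% excitatory

conn = rng.random((N, N)) < p
J = np.zeros((N, N))
J[:, :N_E] = conn[:, :N_E] * (1.0 / (p * N_E))          # excitatory columns
J[:, N_E:] = -conn[:, N_E:] * (4.5 / (p * (N - N_E)))   # inhibitory columns

m = rng.normal(0.0, 1.0, N)                             # output direction
n = rng.normal(0.0, 1.0, N)                             # input-selection direction
J_lr = J + np.outer(m, n) / N                           # rank-one structure added

# Rate surrogate dx/dt = -x + J_lr @ tanh(x); kappa = n . tanh(x) / N is the
# latent variable that the rank-one term feeds back along the direction m.
x, dt = rng.normal(0.0, 0.5, N), 0.05
for step in range(2001):
    r = np.tanh(x)
    x = x + dt * (-x + J_lr @ r)
    if step % 500 == 0:
        print(f"step {step:4d}  kappa = {n @ r / N:+.4f}")
```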
Affiliation(s)
- Ljubica Cimeša
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Lazar Ciric
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France

4. Lameu EL, Rasiah NP, Baimoukhametova DV, Loewen SP, Bains JS, Nicola W. Particle-swarm based modelling reveals two distinct classes of CRH PVN neurons. J Physiol 2023; 601:3151-3171. PMID: 36223200. DOI: 10.1113/jp283133.
Abstract
Electrophysiological recordings can provide detailed information about the dynamical features of single neurons and shed light on their responses to stimuli. Unfortunately, rapidly modelling electrophysiological data to infer network-level behaviours remains challenging. Here, we investigate how modelled single-neuron dynamics lead to network-level responses in the paraventricular nucleus of the hypothalamus (PVN), a critical nucleus for the mammalian stress response. Corticotropin-releasing hormone neurons of the PVN (CRHPVN) were recorded using whole-cell current clamp. These neurons, which initiate the endocrine response to stress, were rapidly and automatically fit to a modified adaptive exponential integrate-and-fire (AdEx) model with particle swarm optimization (PSO). All CRHPVN neurons were accurately fit by the AdEx model with PSO. Multiple sets of parameters were found that reliably reproduced the current-clamp traces of any single neuron. Despite the multiple solutions, dynamical features of the models, such as the rheobase, fixed points, and bifurcations, were stable across fits. We found that CRHPVN neurons can be divided into two subtypes according to their bifurcation at the onset of firing: CRHPVN-integrators and CRHPVN-resonators. The existence of CRHPVN-resonators was then directly confirmed in a follow-up patch-clamp hyperpolarization protocol, which readily induced post-inhibitory rebound spiking in 33% of patched neurons. We constructed networks of CRHPVN model neurons to investigate their network-level responses and found that CRHPVN-resonators maintain baseline firing in networks even when all inputs are inhibitory. The dynamics of a small subset of CRHPVN neurons may therefore be critical to maintaining a baseline firing tone in the PVN. KEY POINTS: Corticotropin-releasing hormone neurons in the paraventricular nucleus of the hypothalamus (CRHPVN) act as the final neural controllers of the stress response. We developed a computational modelling platform that uses particle swarm optimization to rapidly and accurately fit biophysical neuron models to patched CRHPVN neurons. A model was fitted to each patched neuron without dynamic clamping or other procedures requiring sophisticated inputs and fitting algorithms; any neuron undergoing standard current-clamp step protocols for a few minutes can be fitted by this procedure. The dynamical analysis of the modelled neurons shows that CRHPVN neurons come in two specific 'flavours': CRHPVN-resonators and CRHPVN-integrators. We directly confirmed the existence of these two classes of CRHPVN neurons in subsequent experiments. Network simulations show that CRHPVN-resonators are critical to retaining the baseline firing rate of the entire network of CRHPVN neurons, as these cells can fire rebound spikes and bursts in the presence of strong inhibitory synaptic input.
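The general shape of the fitting pipeline can be sketched in a few lines (all constants and parameter ranges below are illustrative, not the authors' values): simulate an AdEx neuron under a current step and fit a few of its parameters with a basic PSO loop.

```python
# Sketch of AdEx + PSO fitting; parameter ranges and constants are placeholders.
import numpy as np

def adex_trace(params, I=200e-12, T=0.3, dt=1e-4):
    """Voltage trace of an adaptive exponential integrate-and-fire neuron."""
    C, gL, a, b = params                               # parameters being fitted
    EL, VT, DT, tau_w, Vr, Vcut = -70e-3, -50e-3, 2e-3, 0.1, -58e-3, -30e-3
    V, w, trace = EL, 0.0, []
    for _ in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
        V, w = V + dt * dV, w + dt * ((a * (V - EL) - w) / tau_w)
        if V > Vcut:                                    # spike: reset, adapt
            V, w = Vr, w + b
        trace.append(V)
    return np.array(trace)

def pso_fit(target, n_particles=20, n_iter=50, seed=0):
    """Fit (C, gL, a, b) by minimizing mean squared error to a target trace."""
    rng = np.random.default_rng(seed)
    lo = np.array([50e-12, 5e-9, 0.0, 0.0])
    hi = np.array([300e-12, 30e-9, 10e-9, 100e-12])
    cost = lambda p: np.mean((adex_trace(p) - target) ** 2)
    pos = rng.uniform(lo, hi, (n_particles, 4))
    vel = np.zeros_like(pos)
    pbest, pbest_c = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_c)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 4))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        c = np.array([cost(p) for p in pos])
        improved = c < pbest_c
        pbest[improved], pbest_c[improved] = pos[improved], c[improved]
        gbest = pbest[np.argmin(pbest_c)].copy()
    return gbest

target = adex_trace([200e-12, 10e-9, 2e-9, 60e-12])    # stand-in for a recording
print(pso_fit(target))
```

As in the paper's message, several parameter sets may fit a given trace comparably well; the interesting question is whether the fitted models agree on dynamical features such as rheobase and bifurcation type.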
Affiliation(s)
- Ewandson L Lameu
- Cell Biology and Anatomy Department, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Neilen P Rasiah
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Dinara V Baimoukhametova
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Spencer P Loewen
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Jaideep S Bains
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Wilten Nicola
- Cell Biology and Anatomy Department, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada

5. Hutt A, Rich S, Valiante TA, Lefebvre J. Intrinsic neural diversity quenches the dynamic volatility of neural networks. Proc Natl Acad Sci U S A 2023; 120:e2218841120. PMID: 37399421. PMCID: PMC10334753. DOI: 10.1073/pnas.2218841120.
Abstract
Heterogeneity is the norm in biology. The brain is no different: neuronal cell types are myriad, reflected through their cellular morphology, type, excitability, connectivity motifs, and ion channel distributions. While this biophysical diversity enriches neural systems' dynamical repertoire, it remains challenging to reconcile with the robustness and persistence of brain function over time (resilience). To better understand the relationship between excitability heterogeneity (variability in excitability within a population of neurons) and resilience, we analyzed, both analytically and numerically, a nonlinear sparse neural network with balanced excitatory and inhibitory connections evolving over long time scales. Homogeneous networks demonstrated increases in excitability and strong firing rate correlations (signs of instability) in response to a slowly varying modulatory fluctuation. Excitability heterogeneity tuned network stability in a context-dependent way by restraining responses to modulatory challenges and limiting firing rate correlations, while enriching dynamics during states of low modulatory drive. Excitability heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in population size, connection probability, and the strength and variability of synaptic weights, by quenching the volatility of the dynamics (i.e., their susceptibility to critical transitions). Together, these results highlight the fundamental role played by cell-to-cell heterogeneity in the robustness of brain function in the face of change.
Affiliation(s)
- Axel Hutt
- Université de Strasbourg, CNRS, Inria, ICube, MLMS, MIMESIS, Strasbourg F-67000, France
- Scott Rich
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Taufik A. Valiante
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Electrical and Computer Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON M5S 3G9, Canada
- Institute of Medical Sciences, University of Toronto, Toronto, ON M5S 1A8, Canada
- Division of Neurosurgery, Department of Surgery, University of Toronto, Toronto, ON M5G 2C4, Canada
- Center for Advancing Neurotechnological Innovation to Application, University of Toronto, Toronto, ON M5G 2A2, Canada
- Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, ON M5S 3G8, Canada
- Jérémie Lefebvre
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, ON K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, ON M5S 2E4, Canada

6. Shi YL, Zeraati R, Levina A, Engel TA. Spatial and temporal correlations in neural networks with structured connectivity. Phys Rev Res 2023; 5:013005. PMID: 38938692. PMCID: PMC11210526. DOI: 10.1103/physrevresearch.5.013005.
Abstract
Correlated fluctuations in the activity of neural populations reflect the network's dynamics and connectivity. The temporal and spatial dimensions of neural correlations are interdependent. However, prior theoretical work mainly analyzed correlations in either spatial or temporal domains, oblivious to their interplay. We show that the network dynamics and connectivity jointly define the spatiotemporal profile of neural correlations. We derive analytical expressions for pairwise correlations in networks of binary units with spatially arranged connectivity in one and two dimensions. We find that spatial interactions among units generate multiple timescales in auto- and cross-correlations. Each timescale is associated with fluctuations at a particular spatial frequency, making a hierarchical contribution to the correlations. External inputs can modulate the correlation timescales when spatial interactions are nonlinear, and the modulation effect depends on the operating regime of network dynamics. These theoretical results open new ways to relate connectivity and dynamics in cortical networks via measurements of spatiotemporal neural correlations.
Affiliation(s)
- Yan-Liang Shi
- Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, USA
- Roxana Zeraati
- International Max Planck Research School for the Mechanisms of Mental Function and Dysfunction, University of Tübingen, Tübingen, Germany
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Anna Levina
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Department of Computer Science, University of Tübingen, Tübingen, Germany
- Bernstein Center for Computational Neuroscience Tübingen, Tübingen, Germany
- Tatiana A Engel
- Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, USA

7. Input correlations impede suppression of chaos and learning in balanced firing-rate networks. PLoS Comput Biol 2022; 18:e1010590. DOI: 10.1371/journal.pcbi.1010590.
Abstract
Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
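The flavor of the common-versus-independent comparison can be seen in a toy experiment (a sketch only: a single-population rate network stands in for the balanced two-population networks treated by the paper's mean-field theory, and all parameters are arbitrary): drive two copies of the same network, started from slightly different states, with a sinusoid whose phase is either shared by all neurons or drawn independently per neuron, and measure how far the copies drift apart; small distances indicate suppressed chaos.

```python
# Toy comparison of common vs. independent sinusoidal drive in a random
# rate network; a stand-in for the balanced networks analyzed in the paper.
import numpy as np

def divergence(common_input, g=2.0, N=500, amp=2.0, freq=0.5,
               T=100.0, dt=0.05, seed=1):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), (N, N))
    phases = np.zeros(N) if common_input else rng.uniform(0, 2 * np.pi, N)
    x1 = rng.normal(0, 1, N)
    x2 = x1 + 1e-6 * rng.normal(0, 1, N)       # slightly perturbed copy
    dist = []
    for k in range(int(T / dt)):
        u = amp * np.sin(2 * np.pi * freq * k * dt + phases)
        x1 = x1 + dt * (-x1 + J @ np.tanh(x1) + u)
        x2 = x2 + dt * (-x2 + J @ np.tanh(x2) + u)
        dist.append(np.linalg.norm(x1 - x2))
    return np.mean(dist[len(dist) // 2:])       # average over the second half

print("common input     :", divergence(common_input=True))
print("independent input:", divergence(common_input=False))
```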
8. Peng X, Lin W. Complex Dynamics of Noise-Perturbed Excitatory-Inhibitory Neural Networks With Intra-Correlative and Inter-Independent Connections. Front Physiol 2022; 13:915511. PMID: 35812336. PMCID: PMC9263264. DOI: 10.3389/fphys.2022.915511.
Abstract
Real neural systems usually contain two types of neurons, i.e., excitatory neurons and inhibitory ones. Analytical and numerical characterization of the dynamics induced by different types of interactions between these two neuron types is helpful for understanding the physiological functions of the brain. Here, we articulate a model of noise-perturbed random neural networks containing both excitatory and inhibitory (E&I) populations. In particular, we take into account neurons whose connections are correlated within each population but independent across populations, which differs from most existing E&I models that consider only independently connected neurons. By employing the typical mean-field theory, we obtain an equivalent two-dimensional system driven by a stationary Gaussian process. Investigating the stationary autocorrelation functions of the obtained system, we analytically find the parameter conditions under which synchronized behavior between the two populations emerges. Taking the maximal Lyapunov exponent as an index, we also find different critical values of the coupling strength coefficients for chaotic excitatory neurons and for chaotic inhibitory ones. Interestingly, we reveal that noise is able to suppress the chaotic dynamics of random neural networks with neurons in two populations, while an appropriate amount of correlation in the intra-population coupling strengths can enhance the occurrence of chaos. Finally, we also detect a previously reported phenomenon in which a parameter region corresponds to neither linearly stable nor chaotic dynamics; however, the size of this region depends crucially on the populations' parameters.
Affiliation(s)
- Xiaoxiao Peng
- Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
- Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- Wei Lin
- Shanghai Center for Mathematical Sciences, School of Mathematical Sciences, and LMNS, Fudan University, Shanghai, China
- Research Institute of Intelligent Complex Systems and Center for Computational Systems Biology, Fudan University, Shanghai, China
- State Key Laboratory of Medical Neurobiology, MOE Frontiers Center for Brain Science, and Institutes of Brain Science, Fudan University, Shanghai, China

9. Khajeh R, Fumarola F, Abbott LF. Sparse balance: Excitatory-inhibitory networks with small bias currents and broadly distributed synaptic weights. PLoS Comput Biol 2022; 18:e1008836. PMID: 35139071. PMCID: PMC8827417. DOI: 10.1371/journal.pcbi.1008836.
Abstract
Cortical circuits generate excitatory currents that must be cancelled by strong inhibition to assure stability. The resulting excitatory-inhibitory (E-I) balance can generate spontaneous irregular activity but, in standard balanced E-I models, this requires that an extremely strong feedforward bias current be included along with the recurrent excitation and inhibition. The absence of experimental evidence for such large bias currents inspired us to examine an alternative regime that exhibits asynchronous activity without requiring unrealistically large feedforward input. In these networks, irregular spontaneous activity is supported by a continually changing sparse set of neurons. To support this activity, synaptic strengths must be drawn from high-variance distributions. Unlike standard balanced networks, these sparse balance networks exhibit robust nonlinear responses to uniform inputs and non-Gaussian input statistics. Interestingly, the speed, not the size, of synaptic fluctuations dictates the degree of sparsity in the model. In addition to simulations, we provide a mean-field analysis to illustrate the properties of these networks.
Affiliation(s)
- Ramin Khajeh
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York City, New York, United States of America
- Francesco Fumarola
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York City, New York, United States of America
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama, Japan
- LF Abbott
- Mortimer B. Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York City, New York, United States of America

10. The Mean Field Approach for Populations of Spiking Neurons. Adv Exp Med Biol 2022; 1359:125-157. DOI: 10.1007/978-3-030-89439-9_6.
Abstract
Mean-field theory is a device for analyzing the collective behavior of a dynamical system comprising many interacting particles. The theory allows one to reduce the behavior of the system to the properties of a handful of parameters. In neural circuits, these parameters are typically the firing rates of distinct, homogeneous subgroups of neurons. Knowledge of the firing rates under conditions of interest can reveal essential information on both the dynamics of neural circuits and the way they can subserve brain function. The goal of this chapter is to provide an elementary introduction to the mean-field approach for populations of spiking neurons. We introduce the general idea in networks of binary neurons, starting from the most basic results and then generalizing to more relevant situations. This allows us to derive the mean-field equations in a simplified setting. We then derive the mean-field equations for populations of integrate-and-fire neurons. An effort is made to derive the main equations of the theory using only elementary methods from calculus and probability theory. The chapter ends with a discussion of the assumptions of the theory and some of the consequences of violating those assumptions. This discussion includes an introduction to balanced and metastable networks and a brief catalogue of successful applications of the mean-field approach to the study of neural circuits.
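As a concrete, deliberately minimal illustration of the kind of self-consistency relation meant here (the notation and single-population setup below are ours, not the chapter's), consider binary neurons that each receive C inputs of weight J from units active with probability ν and fire when their summed input exceeds a threshold θ:

```latex
% Treating the summed input as Gaussian with mean C J \nu and variance
% C J^2 \nu (1 - \nu), the population activity must reproduce itself:
\begin{equation}
  \nu \;=\; H\!\left(\frac{\theta - C J \nu}{\sqrt{C J^{2}\,\nu\,(1-\nu)}}\right),
  \qquad
  H(z) \;=\; \int_{z}^{\infty} \frac{e^{-t^{2}/2}}{\sqrt{2\pi}}\,\mathrm{d}t .
\end{equation}
% This closed equation is solved numerically for \nu; multi-population
% versions couple one such equation per homogeneous subgroup of neurons.
```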
11. Endo D, Kobayashi R, Bartolo R, Averbeck BB, Sugase-Miyamoto Y, Hayashi K, Kawano K, Richmond BJ, Shinomoto S. A convolutional neural network for estimating synaptic connectivity from spike trains. Sci Rep 2021; 11:12087. PMID: 34103546. PMCID: PMC8187444. DOI: 10.1038/s41598-021-91244-w.
Abstract
The recent increase in reliable, simultaneous high channel count extracellular recordings is exciting for physiologists and theoreticians because it offers the possibility of reconstructing the underlying neuronal circuits. We recently presented a method of inferring this circuit connectivity from neuronal spike trains by applying the generalized linear model to cross-correlograms. Although the algorithm can do a good job of circuit reconstruction, the parameters need to be carefully tuned for each individual dataset. Here we present another method using a Convolutional Neural Network for Estimating synaptic Connectivity from spike trains. After adaptation to huge amounts of simulated data, this method robustly captures the specific feature of monosynaptic impact in a noisy cross-correlogram. There are no user-adjustable parameters. With this new method, we have constructed diagrams of neuronal circuits recorded in several cortical areas of monkeys.
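The architecture below is only schematic (layer sizes, kernel widths, and the 100-bin correlogram window are placeholders, not those of the published model): a small 1D convolutional network that maps a cross-correlogram to three classes (excitatory, inhibitory, or no monosynaptic connection), which is the general shape of the approach described above.

```python
# Schematic 1D CNN over cross-correlograms (illustrative architecture only).
import torch
import torch.nn as nn

class CorrelogramCNN(nn.Module):
    def __init__(self, n_bins=100, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_bins // 4), 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):            # x: (batch, 1, n_bins)
        return self.classifier(self.features(x))

# Training would use large amounts of simulated spike trains with known
# connectivity; here a random batch just checks the tensor shapes.
model = CorrelogramCNN()
dummy = torch.randn(8, 1, 100)
print(model(dummy).shape)            # torch.Size([8, 3])
```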
Affiliation(s)
- Daisuke Endo
- Graduate School of Informatics, Kyoto University, Kyoto, 606-8501, Japan
- Ryota Kobayashi
- Mathematics and Informatics Center, The University of Tokyo, Tokyo, 113-8656, Japan
- Department of Complexity Science and Engineering, The University of Tokyo, Chiba, 277-8561, Japan
- JST, PRESTO, Saitama, 332-0012, Japan
- Ramon Bartolo
- Laboratory of Neuropsychology, NIMH/NIH/DHHS, Bethesda, MD, 20814, USA
- Bruno B Averbeck
- Laboratory of Neuropsychology, NIMH/NIH/DHHS, Bethesda, MD, 20814, USA
- Yasuko Sugase-Miyamoto
- Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology, Tsukuba, 305-8568, Japan
- Kazuko Hayashi
- Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology, Tsukuba, 305-8568, Japan
- Japan Society for the Promotion of Science, Tokyo, 102-0083, Japan
- Kenji Kawano
- Human Informatics and Interaction Research Institute, National Institute of Advanced Industrial Science and Technology, Tsukuba, 305-8568, Japan
- Barry J Richmond
- Laboratory of Neuropsychology, NIMH/NIH/DHHS, Bethesda, MD, 20814, USA
- Shigeru Shinomoto
- Graduate School of Informatics, Kyoto University, Kyoto, 606-8501, Japan
- Brain Information Communication Research Laboratory Group, ATR Institute International, Kyoto, 619-0288, Japan

12. Likhoshvai VA, Khlebodarova TM. Evolution and extinction can occur rapidly: a modeling approach. PeerJ 2021; 9:e11130. PMID: 33954033. PMCID: PMC8051336. DOI: 10.7717/peerj.11130.
Abstract
The Earth's fossil record of the last 500 million years is characterized by discontinuous evolution as well as recurring global extinctions of some species and their replacement by new types, the causes of which are still debated. We developed a model of the evolutionary self-development of a large ecosystem. This model of biota evolution is based on universal laws of living-system function: reproduction, the dependence of reproduction efficiency and mortality on biota density, mutational variability in the process of reproduction, and selection of the best-adapted individuals. We show that global extinctions, phases of rapid growth, and biodiversity stasis can reflect the emergence of bistability in a self-organizing system, which is the Earth's biota. Bistability was found to be characteristic only of ecosystems with predominant sexual reproduction. The reason for the transition from one state to another is the selection of the best-adapted individuals. That is, we explain the characteristics of the Earth's fossil record over the last 500 million years by the internal laws of the Earth's ecosystem functioning, which appeared at a certain stage of evolution as a result of the emergence of life forms with increased adaptive diversification associated with sexual dimorphism.
Affiliation(s)
- Vitaly A Likhoshvai
- Department of Systems Biology, Institute of Cytology and Genetics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk, Russian Federation
- Tamara M Khlebodarova
- Department of Systems Biology, Institute of Cytology and Genetics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk, Russian Federation; Kurchatov Genomics Center, Institute of Cytology and Genetics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk, Russian Federation

13. Ingrosso A. Optimal learning with excitatory and inhibitory synapses. PLoS Comput Biol 2020; 16:e1008536. PMID: 33370266. PMCID: PMC7793294. DOI: 10.1371/journal.pcbi.1008536.
Abstract
Characterizing the relation between weight structure and input/output statistics is fundamental for understanding the computational capabilities of neural circuits. In this work, I study the problem of storing associations between analog signals in the presence of correlations, using methods from statistical mechanics. I characterize the typical learning performance in terms of the power spectrum of random input and output processes. I show that optimal synaptic weight configurations reach a capacity of 0.5 for any fraction of excitatory to inhibitory weights and have a peculiar synaptic distribution with a finite fraction of silent synapses. I further provide a link between typical learning performance and principal components analysis in single cases. These results may shed light on the synaptic profile of brain circuits, such as cerebellar structures, that are thought to engage in processing time-dependent signals and performing on-line prediction.
Affiliation(s)
- Alessandro Ingrosso
- Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America

14. Zhang Y, Young LS. DNN-assisted statistical analysis of a model of local cortical circuits. Sci Rep 2020; 10:20139. PMID: 33208805. PMCID: PMC7674455. DOI: 10.1038/s41598-020-76770-3.
Abstract
In neuroscience, computational modeling is an effective way to gain insight into cortical mechanisms, yet the construction and analysis of large-scale network models—not to mention the extraction of underlying principles—are themselves challenging tasks, due to the absence of suitable analytical tools and the prohibitive costs of systematic numerical exploration of high-dimensional parameter spaces. In this paper, we propose a data-driven approach assisted by deep neural networks (DNN). The idea is to first discover certain input-output relations, and then to leverage this information and the superior computation speeds of the well-trained DNN to guide parameter searches and to deduce theoretical understanding. To illustrate this novel approach, we used as a test case a medium-size network of integrate-and-fire neurons intended to model local cortical circuits. With the help of an accurate yet extremely efficient DNN surrogate, we revealed the statistics of model responses, providing a detailed picture of model behavior. The information obtained is both general and of a fundamental nature, with direct application to neuroscience. Our results suggest that the methodology proposed can be scaled up to larger and more complex biological networks when used in conjunction with other techniques of biological modeling.
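The core idea can be sketched very loosely as follows (the "simulator", its four parameters, and all network sizes below are stand-ins): train an inexpensive neural-network surrogate on input-output pairs generated by the costly network simulation, then use the surrogate to scan parameter space densely.

```python
# Toy DNN-surrogate workflow: fit a cheap model to an expensive simulator,
# then use it for dense parameter scans (everything here is illustrative).
import torch
import torch.nn as nn

def expensive_simulation(params):
    # stand-in for the integrate-and-fire network; returns a summary statistic
    return (params ** 2).sum(dim=1, keepdim=True)

surrogate = nn.Sequential(nn.Linear(4, 64), nn.ReLU(),
                          nn.Linear(64, 64), nn.ReLU(),
                          nn.Linear(64, 1))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

train_params = torch.rand(2000, 4)                  # sampled parameter sets
train_stats = expensive_simulation(train_params)    # "slow" ground truth

for epoch in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(surrogate(train_params), train_stats)
    loss.backward()
    opt.step()

# The trained surrogate is then evaluated on a grid far too large to simulate
# directly, to locate parameter regions worth examining with the full model.
grid = torch.rand(100000, 4)
with torch.no_grad():
    predictions = surrogate(grid)
print(predictions.mean())
```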
Affiliation(s)
- Yaoyu Zhang
- School of Mathematical Sciences, Institute of Natural Sciences, MOE-LSC and Qing Yuan Research Institute, Shanghai Jiao Tong University, Shanghai, 200240, China
- Lai-Sang Young
- School of Mathematics and School of Natural Sciences, Institute for Advanced Study, Princeton, NJ, 08540, USA; Courant Institute of Mathematical Sciences, New York University, New York, NY, 10012, USA

15. Bachmann C, Tetzlaff T, Duarte R, Morrison A. Firing rate homeostasis counteracts changes in stability of recurrent neural networks caused by synapse loss in Alzheimer's disease. PLoS Comput Biol 2020; 16:e1007790. PMID: 32841234. PMCID: PMC7505475. DOI: 10.1371/journal.pcbi.1007790.
Abstract
The impairment of cognitive function in Alzheimer's disease is clearly correlated to synapse loss. However, the mechanisms underlying this correlation are only poorly understood. Here, we investigate how the loss of excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons alters their dynamical characteristics. Beyond the effects on the activity statistics, we find that the loss of excitatory synapses on excitatory neurons reduces the network's sensitivity to small perturbations. This decrease in sensitivity can be considered as an indication of a reduction of computational capacity. A full recovery of the network's dynamical characteristics and sensitivity can be achieved by firing rate homeostasis, here implemented by an up-scaling of the remaining excitatory-excitatory synapses. Mean-field analysis reveals that the stability of the linearised network dynamics is, in good approximation, uniquely determined by the firing rate, and thereby explains why firing rate homeostasis preserves not only the firing rate but also the network's sensitivity to small perturbations.
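A bare-bones version of the compensation mechanism examined here (numbers are arbitrary, and the paper works with spiking networks rather than a weight matrix in isolation) is to delete a random fraction of excitatory-to-excitatory synapses and then up-scale the surviving E-to-E weights so that each neuron's summed excitatory input weight is restored:

```python
# Synapse loss followed by homeostatic up-scaling of the remaining E->E weights.
import numpy as np

rng = np.random.default_rng(0)
N_E = 800
W_EE = (rng.random((N_E, N_E)) < 0.1) * 0.5          # sparse E->E weight matrix

loss_fraction = 0.4
total_before = W_EE.sum(axis=1)                       # per-neuron summed input

mask = rng.random(W_EE.shape) > loss_fraction         # synapses that survive
W_lesioned = W_EE * mask

# Up-scale each row so its summed weight is restored (where any synapses remain).
total_after = W_lesioned.sum(axis=1)
scale = np.divide(total_before, total_after,
                  out=np.ones_like(total_before), where=total_after > 0)
W_homeostasis = W_lesioned * scale[:, None]

print("mean summed E weight per neuron")
print("  intact     :", total_before.mean())
print("  after loss :", W_lesioned.sum(axis=1).mean())
print("  rescaled   :", W_homeostasis.sum(axis=1).mean())
```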
Affiliation(s)
- Claudia Bachmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Renato Duarte
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Abigail Morrison
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany

16. Kuśmierz Ł, Ogawa S, Toyoizumi T. Edge of Chaos and Avalanches in Neural Networks with Heavy-Tailed Synaptic Weight Distribution. Phys Rev Lett 2020; 125:028101. PMID: 32701351. DOI: 10.1103/physrevlett.125.028101.
Abstract
We propose an analytically tractable neural connectivity model with power-law distributed synaptic strengths. When threshold neurons with a biologically plausible number of incoming connections are considered, our model features a continuous transition to chaos and can reproduce biologically relevant low activity levels and scale-free avalanches, i.e., bursts of activity with power-law distributions of sizes and lifetimes. In contrast, the Gaussian counterpart exhibits a discontinuous transition to chaos and thus cannot be poised near the edge of chaos. We validate our predictions in simulations of networks of binary as well as leaky integrate-and-fire neurons. Our results suggest that heavy-tailed synaptic distributions may form a weakly informative sparse-connectivity prior that can be useful in biological and artificial adaptive systems.
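To get a feel for the contrast drawn here, the toy below (the threshold, in-degree, and the choice of a Cauchy distribution as the power-law-tailed example are ours) builds binary threshold networks with Gaussian versus heavy-tailed weights and counts how much activity a single kicked unit triggers from the quiescent state; with the Gaussian scaling, individual weights essentially never exceed the threshold, whereas heavy tails occasionally produce weights large enough to propagate activity.

```python
# Avalanches from single-site kicks in binary threshold networks with
# Gaussian vs. heavy-tailed (Cauchy) weights; a rough sketch only.
import numpy as np

def avalanche_sizes(heavy_tailed, N=1000, K=100, theta=1.0, n_trials=200, seed=0):
    rng = np.random.default_rng(seed)
    W = np.zeros((N, N))
    for i in range(N):
        idx = rng.choice(N, K, replace=False)       # K incoming connections
        if heavy_tailed:
            W[i, idx] = rng.standard_cauchy(K) / K
        else:
            W[i, idx] = rng.normal(0.0, 1.0 / np.sqrt(K), K)
    sizes = []
    for _ in range(n_trials):
        s = np.zeros(N)
        s[rng.integers(N)] = 1.0                    # kick a single unit
        total = 0
        for _ in range(50):                          # iterate the binary dynamics
            s = (W @ s > theta).astype(float)
            total += int(s.sum())
            if s.sum() == 0:
                break
        sizes.append(total)
    return np.array(sizes)

print("Gaussian weights,     mean avalanche size:", avalanche_sizes(False).mean())
print("heavy-tailed weights, mean avalanche size:", avalanche_sizes(True).mean())
```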
Affiliation(s)
- Łukasz Kuśmierz
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
- Shun Ogawa
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
- Taro Toyoizumi
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
- Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8656, Japan

17. Khlebodarova TM, Likhoshvai VA. Causes of global extinctions in the history of life: facts and hypotheses. Vavilovskii Zhurnal Genet Selektsii 2020; 24:407-419. PMID: 33659824. PMCID: PMC7716527. DOI: 10.18699/vj20.633.
Abstract
Paleontologists define global extinctions on Earth as a loss of about three-quarters of plant and animal species over a relatively short period of time. At least five global extinctions are documented in the Phanerozoic fossil record (a period of roughly 500 million years): ~65, 200, 260, 380, and 440 million years ago. In addition, there is evidence of global extinctions in earlier periods of life on Earth, during the Late Cambrian (~500 million years ago) and Ediacaran (more than 540 million years ago) periods. There is still no common opinion on the causes of their occurrence. The current study is a systematized review of the data on recorded extinctions of complex life forms on Earth, from the moment of their emergence during the Ediacaran period to the modern period. The review discusses possible causes of mass extinctions in light of the influence of abiogenic factors, planetary or astronomical, and the consequences of their action. We evaluate the pros and cons of the hypothesis that there is periodicity in the extinctions of the Phanerozoic marine biota. We also discuss strong evidence that additional mechanisms associated with various internal biotic factors are responsible for the emergence of extinctions in the evolution of complex life forms. Developing the idea of internal causes of periodicity and discontinuity in evolution, we propose our own original hypothesis, according to which the phenomenon of bistability underlies the complex dynamics of biota development, manifested in the form of global extinctions. Bistability arises only in ecosystems with predominant sexual reproduction. Our hypothesis suggests that even in the absence of global abiotic catastrophes, extinctions of biota would occur anyway. However, it does not exclude the possibility that in different periods of the Earth's history the biota was subjected to powerful external influences that had a significant impact on its further development, which is reflected in the Earth's fossil record.
Affiliation(s)
- T M Khlebodarova
- Institute of Cytology and Genetics of Siberian Branch of the Russian Academy of Sciences, Novosibirsk, Russia
- V A Likhoshvai
- Institute of Cytology and Genetics of Siberian Branch of the Russian Academy of Sciences, Novosibirsk, Russia

18. Stapmanns J, Kühn T, Dahmen D, Luu T, Honerkamp C, Helias M. Self-consistent formulations for stochastic nonlinear neuronal dynamics. Phys Rev E 2020; 101:042124. PMID: 32422832. DOI: 10.1103/physreve.101.042124.
Abstract
Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular classical bifurcation theory cannot be applied. We formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez, and Wschebor approximation from the derivative expansion to the vertex expansion. The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics.
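For orientation, the simplest object of the kind discussed here can be written down explicitly (a sketch following one common sign convention; the paper's own conventions may differ): for a one-dimensional stochastic neuronal variable obeying a Langevin equation, the MSRDJ construction introduces a response field and a moment-generating functional from whose logarithm the effective action follows by Legendre transform.

```latex
% Langevin dynamics of a single neuronal variable x(t):
%   \dot{x} = f(x) + \xi(t),  with  \langle\xi(t)\,\xi(t')\rangle = D\,\delta(t-t').
% MSRDJ moment-generating functional (one common convention; a sketch only):
\begin{equation}
  Z[j] \;=\; \int \mathcal{D}x\,\mathcal{D}\tilde{x}\;
  \exp\!\left( \int \mathrm{d}t\,\Big[
      \tilde{x}\,\big(\partial_t x - f(x)\big)
      \;-\; \tfrac{D}{2}\,\tilde{x}^{2}
      \;+\; j\,x \Big] \right),
\end{equation}
% with \tilde{x} the response field; the effective action \Gamma is obtained
% as the Legendre transform of \ln Z[j].
```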
Affiliation(s)
- Jonas Stapmanns
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- Tobias Kühn
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Thomas Luu
- Institut für Kernphysik (IKP-3), Institute for Advanced Simulation (IAS-4) and Jülich Center for Hadron Physics, Jülich Research Centre, Jülich, Germany
- Carsten Honerkamp
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany; JARA-FIT, Jülich Aachen Research Alliance-Fundamentals of Future Information Technology, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany

19. Ponzi A, Barton SJ, Bunner KD, Rangel-Barajas C, Zhang ES, Miller BR, Rebec GV, Kozloski J. Striatal network modeling in Huntington's Disease. PLoS Comput Biol 2020; 16:e1007648. PMID: 32302302. PMCID: PMC7197869. DOI: 10.1371/journal.pcbi.1007648.
Abstract
Medium spiny neurons (MSNs) comprise over 90% of cells in the striatum. In vivo MSNs display coherent burst firing cell assembly activity patterns, even though isolated MSNs do not burst fire intrinsically. This activity is important for the learning and execution of action sequences and is characteristically dysregulated in Huntington's Disease (HD). However, how dysregulation is caused by the various neural pathologies affecting MSNs in HD is unknown. Previous modeling work using simple cell models has shown that cell assembly activity patterns can emerge as a result of MSN inhibitory network interactions. Here, by directly estimating MSN network model parameters from single unit spiking data, we show that a network composed of much more physiologically detailed MSNs provides an excellent quantitative fit to wild type (WT) mouse spiking data, but only when network parameters are appropriate for the striatum. We find the WT MSN network is situated in a regime close to a transition from stable to strongly fluctuating network dynamics. This regime facilitates the generation of low-dimensional slowly varying coherent activity patterns and confers high sensitivity to variations in cortical driving. By re-estimating the model on HD spiking data we discover network parameter modifications are consistent across three very different types of HD mutant mouse models (YAC128, Q175, R6/2). In striking agreement with the known pathophysiology we find feedforward excitatory drive is reduced in HD compared to WT mice, while recurrent inhibition also shows phenotype dependency. We show that these modifications shift the HD MSN network to a sub-optimal regime where higher dimensional incoherent rapidly fluctuating activity predominates. Our results provide insight into a diverse range of experimental findings in HD, including cognitive and motor symptoms, and may suggest new avenues for treatment.
Affiliation(s)
- Adam Ponzi
- IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
- Scott J. Barton
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Kendra D. Bunner
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Claudia Rangel-Barajas
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Emily S. Zhang
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Benjamin R. Miller
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- George V. Rebec
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- James Kozloski
- IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America

20. Mahrach A, Chen G, Li N, van Vreeswijk C, Hansel D. Mechanisms underlying the response of mouse cortical networks to optogenetic manipulation. eLife 2020; 9:e49967. PMID: 31951197. PMCID: PMC7012611. DOI: 10.7554/elife.49967.
Abstract
GABAergic interneurons can be subdivided into three subclasses: parvalbumin-positive (PV), somatostatin-positive (SOM), and serotonin-positive neurons. Together with principal cells (PCs), they form complex networks. We examine PC and PV responses in mouse anterior lateral motor cortex (ALM) and barrel cortex (S1) upon PV photostimulation in vivo. In ALM layer 5 and in S1, the PV response is paradoxical: photoexcitation reduces their activity. This is not the case in ALM layer 2/3. We combine analytical calculations and numerical simulations to investigate how these results constrain the architecture. Two-population models cannot explain the results. Four-population networks with V1-like architecture account for the data in ALM layer 2/3 and layer 5. Our data in S1 can be explained if SOM neurons receive inputs only from PCs and PV neurons. In both four-population models, the paradoxical effect implies not too strong recurrent excitation. It is not evidence for stabilization by inhibition.
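The "paradoxical" response invoked here has a textbook two-population illustration (parameters below are arbitrary, and the model is a deliberately minimal stand-in for the four-population networks analyzed in the paper): when recurrent excitation is strong enough that the excitatory subnetwork would be unstable on its own, extra drive to the inhibitory population lowers its steady-state rate.

```python
# Minimal rectified-linear two-population rate model showing the sign flip of
# the inhibitory response to extra inhibitory drive (illustrative parameters).
import numpy as np

def steady_state(W_EE, extra_I_drive, W_EI=2.5, W_IE=2.0, W_II=0.5,
                 I_E=2.0, I_I=0.3, dt=0.01, T=200.0):
    rE, rI = 0.1, 0.1
    relu = lambda u: max(u, 0.0)
    for _ in range(int(T / dt)):
        rE += dt * (-rE + relu(W_EE * rE - W_EI * rI + I_E))
        rI += dt * (-rI + relu(W_IE * rE - W_II * rI + I_I + extra_I_drive))
    return rE, rI

for W_EE in (0.5, 2.0):                  # weak vs. strong recurrent excitation
    rI_base = steady_state(W_EE, 0.0)[1]
    rI_stim = steady_state(W_EE, 0.3)[1]
    print(f"W_EE = {W_EE}: rI baseline {rI_base:.3f} -> with extra drive {rI_stim:.3f}")
```

In this two-population toy the sign flip requires strong recurrent excitation; the abstract's point is precisely that this inference changes in four-population models, so the sketch is only a baseline for intuition.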
Affiliation(s)
- Alexandre Mahrach
- CNRS-UMR 8002, Integrative Neuroscience and Cognition Center, Paris, France
- Guang Chen
- Department of Neuroscience, Baylor College of Medicine, Houston, United States
- Nuo Li
- Department of Neuroscience, Baylor College of Medicine, Houston, United States
- David Hansel
- CNRS-UMR 8002, Integrative Neuroscience and Cognition Center, Paris, France

21. Training dynamically balanced excitatory-inhibitory networks. PLoS One 2019; 14:e0220547. PMID: 31393909. PMCID: PMC6687153. DOI: 10.1371/journal.pone.0220547.
Abstract
The construction of biologically plausible models of neural circuits is crucial for understanding the computational properties of the nervous system. Constructing functional networks composed of separate excitatory and inhibitory neurons obeying Dale’s law presents a number of challenges. We show how a target-based approach, when combined with a fast online constrained optimization technique, is capable of building functional models of rate and spiking recurrent neural networks in which excitation and inhibition are balanced. Balanced networks can be trained to produce complicated temporal patterns and to solve input-output tasks while retaining biologically desirable features such as Dale’s law and response variability.
22. Hennequin G, Ahmadian Y, Rubin DB, Lengyel M, Miller KD. The Dynamical Regime of Sensory Cortex: Stable Dynamics around a Single Stimulus-Tuned Attractor Account for Patterns of Noise Variability. Neuron 2019; 98:846-860.e5. PMID: 29772203. PMCID: PMC5971207. DOI: 10.1016/j.neuron.2018.04.017.
Abstract
Correlated variability in cortical activity is ubiquitously quenched following stimulus onset, in a stimulus-dependent manner. These modulations have been attributed to circuit dynamics involving either multiple stable states ("attractors") or chaotic activity. Here we show that a qualitatively different dynamical regime, involving fluctuations about a single, stimulus-driven attractor in a loosely balanced excitatory-inhibitory network (the stochastic "stabilized supralinear network"), best explains these modulations. Given the supralinear input/output functions of cortical neurons, increased stimulus drive strengthens effective network connectivity. This shifts the balance from interactions that amplify variability to suppressive inhibitory feedback, quenching correlated variability around more strongly driven steady states. Comparing to previously published and original data analyses, we show that this mechanism, unlike previous proposals, uniquely accounts for the spatial patterns and fast temporal dynamics of variability suppression. Specifying the cortical operating regime is key to understanding the computations underlying perception. Highlights:
- A simple network model explains stimulus-tuning of cortical variability suppression.
- Inhibition stabilizes recurrently interacting neurons with supralinear I/O functions.
- Stimuli strengthen inhibitory stabilization around a stable state, quenching variability.
- Single-trial V1 data are compatible with this model and rule out competing proposals.
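A minimal stochastic rate-model sketch of the ingredients named here (two populations, supralinear power-law input-output functions, noisy drive) is given below; the weights, gain, and noise handling are illustrative rather than fitted, so it is meant only for qualitative exploration of how rate fluctuations depend on stimulus drive, not as a reproduction of the paper's results.

```python
# Toy 2-population supralinear (power-law) rate model with noisy input drive.
import numpy as np

def rate_fluctuations(stim, k=0.04, n=2.0, tau=0.01, sigma=0.2,
                      T=50.0, dt=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W = np.array([[1.25, -0.65],     # weights onto E (from E, from I)
                  [1.20, -0.50]])    # weights onto I (from E, from I)
    r = np.zeros(2)
    samples = []
    for step in range(int(T / dt)):
        # noise redrawn each step (correlation time = dt); purely illustrative
        drive = W @ r + stim + sigma * rng.normal(size=2)
        r = r + (dt / tau) * (-r + k * np.clip(drive, 0.0, None) ** n)
        if step * dt > 5.0:                               # discard transient
            samples.append(r[0])
    return np.std(samples)

for stim in (1.0, 5.0, 15.0):
    print(f"stimulus drive {stim:5.1f}: std of excitatory rate = {rate_fluctuations(stim):.3f}")
```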
Affiliation(s)
- Guillaume Hennequin
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK
- Yashar Ahmadian
- Center for Theoretical Neuroscience, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Department of Neuroscience, Swartz Program in Theoretical Neuroscience, Kavli Institute for Brain Science, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Centre de Neurophysique, Physiologie, et Pathologie, CNRS, 75270 Paris Cedex 06, France; Institute of Neuroscience, Department of Biology and Mathematics, University of Oregon, Eugene, OR 97403, USA
- Daniel B Rubin
- Center for Theoretical Neuroscience, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Department of Neurology, Massachusetts General Hospital and Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
- Máté Lengyel
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK; Department of Cognitive Science, Central European University, 1051 Budapest, Hungary
- Kenneth D Miller
- Center for Theoretical Neuroscience, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Department of Neuroscience, Swartz Program in Theoretical Neuroscience, Kavli Institute for Brain Science, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA

23. La Camera G, Fontanini A, Mazzucato L. Cortical computations via metastable activity. Curr Opin Neurobiol 2019; 58:37-45. PMID: 31326722. DOI: 10.1016/j.conb.2019.06.007.
Abstract
Metastable brain dynamics are characterized by abrupt, jump-like modulations so that the neural activity in single trials appears to unfold as a sequence of discrete, quasi-stationary 'states'. Evidence that cortical neural activity unfolds as a sequence of metastable states is accumulating at a fast pace. Metastable activity occurs both in response to an external stimulus and during ongoing, self-generated activity. These spontaneous metastable states are increasingly found to subserve internal representations that are not locked to external triggers, including states of deliberation, attention and expectation. Moreover, decoding stimuli or decisions via metastable states can be carried out trial-by-trial. Focusing on metastability will allow us to shift our perspective on neural coding from traditional concepts based on trial-averaging to models based on dynamic ensemble representations. Recent theoretical work has started to characterize the mechanistic origin and potential roles of metastable representations. In this article, we review recent findings on metastable activity, how it may arise in biologically realistic models, and its potential role for representing internal states as well as relevant task variables.
Collapse
Affiliation(s)
- Giancarlo La Camera
- Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY 11794, United States; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY 11794, United States.
| | - Alfredo Fontanini
- Department of Neurobiology and Behavior, State University of New York at Stony Brook, Stony Brook, NY 11794, United States; Graduate Program in Neuroscience, State University of New York at Stony Brook, Stony Brook, NY 11794, United States
| | - Luca Mazzucato
- Departments of Biology and Mathematics and Institute of Neuroscience, University of Oregon, Eugene, OR 97403, United States
| |
Collapse
|
24
|
Nicola W, Clopath C. A diversity of interneurons and Hebbian plasticity facilitate rapid compressible learning in the hippocampus. Nat Neurosci 2019; 22:1168-1181. [PMID: 31235906 DOI: 10.1038/s41593-019-0415-2] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/19/2018] [Accepted: 04/23/2019] [Indexed: 11/09/2022]
Abstract
The hippocampus is able to rapidly learn incoming information, even if that information is only observed once. Furthermore, this information can be replayed in a compressed format in either forward or reverse modes during sharp wave-ripples (SPW-Rs). We leveraged state-of-the-art techniques in training recurrent spiking networks to demonstrate how primarily interneuron networks can achieve the following: (1) generate internal theta sequences to bind externally elicited spikes in the presence of inhibition from the medial septum; (2) compress learned spike sequences in the form of a SPW-R when septal inhibition is removed; (3) generate and refine high-frequency assemblies during SPW-R-mediated compression; and (4) regulate the inter-SPW interval timing between SPW-Rs in ripple clusters. From the fast timescale of neurons to the slow timescale of behaviors, interneuron networks serve as the scaffolding for one-shot learning by replaying, reversing, refining, and regulating spike sequences.
Collapse
Affiliation(s)
- Wilten Nicola
- Department of Bioengineering, Imperial College London, London, UK
| | - Claudia Clopath
- Department of Bioengineering, Imperial College London, London, UK.
| |
Collapse
|
25
|
Mastrogiuseppe F, Ostojic S. A Geometrical Analysis of Global Stability in Trained Feedback Networks. Neural Comput 2019; 31:1139-1182. [DOI: 10.1162/neco_a_01187] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Recurrent neural networks have been extensively studied in the context of neuroscience and machine learning due to their ability to implement complex computations. While substantial progress in designing effective learning algorithms has been achieved, a full understanding of trained recurrent networks is still lacking. Specifically, the mechanisms that allow computations to emerge from the underlying recurrent dynamics are largely unknown. Here we focus on a simple yet underexplored computational setup: a feedback architecture trained to associate a stationary output to a stationary input. As a starting point, we derive an approximate analytical description of global dynamics in trained networks, which assumes uncorrelated connectivity weights in the feedback and in the random bulk. The resulting mean-field theory suggests that the task admits several classes of solutions, which imply different stability properties. Different classes are characterized in terms of the geometrical arrangement of the readout with respect to the input vectors, defined in the high-dimensional space spanned by the network population. We find that such an approximate theoretical approach can be used to understand how standard training techniques implement the input-output task in finite-size feedback networks. In particular, our simplified description captures the local and the global stability properties of the target solution, and thus predicts training performance.
Collapse
Affiliation(s)
- Francesca Mastrogiuseppe
- Laboratoire de Neurosciences Cognitives et Computationelles, INSERM U960, and Laboratoire de Physique Statistique, CNRS UMR 8550, Ecole Normale Supérieure–PSL Research University, Paris 75005, France
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives et Computationelles, INSERM U960, Ecole Normale Supérieure–PSL Research University, Paris 75005, France
| |
Collapse
|
26
|
Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol 2019; 15:e1006893. [PMID: 30897092 PMCID: PMC6445477 DOI: 10.1371/journal.pcbi.1006893] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2018] [Revised: 04/02/2019] [Accepted: 02/19/2019] [Indexed: 11/19/2022] Open
Abstract
Neural activity in awake behaving animals exhibits a vast range of timescales that can be several fold larger than the membrane time constant of individual neurons. Two types of mechanisms have been proposed to explain this conundrum. One possibility is that large timescales are generated by a network mechanism based on positive feedback, but this hypothesis requires fine-tuning of the strength or structure of the synaptic connections. A second possibility is that large timescales in the neural dynamics are inherited from large timescales of underlying biophysical processes, two prominent candidates being intrinsic adaptive ionic currents and synaptic transmission. How the timescales of adaptation or synaptic transmission influence the timescale of the network dynamics has however not been fully explored. To address this question, here we analyze large networks of randomly connected excitatory and inhibitory units with additional degrees of freedom that correspond to adaptation or synaptic filtering. We determine the fixed points of the systems, their stability to perturbations and the corresponding dynamical timescales. Furthermore, we apply dynamical mean field theory to study the temporal statistics of the activity in the fluctuating regime, and examine how the adaptation and synaptic timescales transfer from individual units to the whole population. Our overarching finding is that synaptic filtering and adaptation in single neurons have very different effects at the network level. Unexpectedly, the macroscopic network dynamics do not inherit the large timescale present in adaptive currents. In contrast, the timescales of network activity increase proportionally to the time constant of the synaptic filter. Altogether, our study demonstrates that the timescales of different biophysical processes have different effects on the network level, so that the slow processes within individual neurons do not necessarily induce slow activity in large recurrent neural networks.
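As a toy illustration of the comparison made above, here is a sketch of a random rate network equipped with either a slow adaptation variable or a slow synaptic filter, measuring a crude autocorrelation decay time of one unit. Network size, coupling, and timescales are arbitrary illustrative choices, not the parameters analyzed in the paper.

```python
import numpy as np

# Sketch: random rate network with either a slow adaptation current or a
# slow synaptic filter, to ask which auxiliary timescale shows up in the
# network activity. All parameters are illustrative.
rng = np.random.default_rng(1)
N, g = 200, 2.0
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
tau_x, tau_slow = 1.0, 10.0            # unit timescale vs slow timescale
dt, steps, burn = 0.05, 40000, 5000

def run(mode):
    x = 0.1 * rng.standard_normal(N)
    a = np.zeros(N)                    # adaptation variable
    s = np.zeros(N)                    # filtered synaptic input
    trace = np.empty(steps)
    for t in range(steps):
        r = np.tanh(x)
        if mode == "adaptation":
            inp = J @ r - a
            a += dt * (r - a) / tau_slow
        else:                          # synaptic filtering
            s += dt * (J @ r - s) / tau_slow
            inp = s
        x += dt * (-x + inp) / tau_x
        trace[t] = x[0]
    return trace[burn:]

def ac_time(sig, max_lag=4000):
    """First lag at which the autocorrelation drops below 1/e."""
    sig = sig - sig.mean()
    var = sig @ sig / len(sig)
    for lag in range(1, max_lag):
        if sig[:-lag] @ sig[lag:] / ((len(sig) - lag) * var) < np.exp(-1.0):
            return lag * dt
    return np.inf

for mode in ("adaptation", "synaptic filtering"):
    print(mode, "-> autocorrelation time ≈", round(ac_time(run(mode)), 2))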
Collapse
Affiliation(s)
- Manuel Beiran
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
| | - Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives Computationnelles, Département d’Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
| |
Collapse
|
27
|
Martí D, Brunel N, Ostojic S. Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks. Phys Rev E 2018; 97:062314. [PMID: 30011528 DOI: 10.1103/physreve.97.062314] [Citation(s) in RCA: 30] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2017] [Indexed: 01/11/2023]
Abstract
Networks of randomly connected neurons are among the most popular models in theoretical neuroscience. The connectivity between neurons in the cortex is however not fully random, the simplest and most prominent deviation from randomness found in experimental data being the overrepresentation of bidirectional connections among pyramidal cells. Using numerical and analytical methods, we investigate the effects of partially symmetric connectivity on the dynamics in networks of rate units. We consider the two dynamical regimes exhibited by random neural networks: the weak-coupling regime, where the firing activity decays to a single fixed point unless the network is stimulated, and the strong-coupling or chaotic regime, characterized by internally generated fluctuating firing rates. In the weak-coupling regime, we compute analytically, for an arbitrary degree of symmetry, the autocorrelation of network activity in the presence of external noise. In the chaotic regime, we perform simulations to determine the timescale of the intrinsic fluctuations. In both cases, symmetry increases the characteristic asymptotic decay time of the autocorrelation function and therefore slows down the dynamics in the network.
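A small numerical sketch of the setup described above: draw a random coupling matrix with a tunable degree of symmetry, simulate the rate dynamics, and estimate how slowly the single-unit autocorrelation decays. The symmetry parameterization and all values below are illustrative assumptions, not the quantities computed analytically in the paper.

```python
import numpy as np

# Sketch: rate networks with partially symmetric random connectivity.
# eta interpolates between fully asymmetric (0) and fully symmetric (1)
# coupling while keeping the variance of the entries fixed.
rng = np.random.default_rng(2)
N, g = 300, 1.8
dt, steps, burn = 0.05, 20000, 4000

def coupling(eta):
    A = rng.standard_normal((N, N))
    J = g * (A + eta * A.T) / np.sqrt((1.0 + eta**2) * N)
    np.fill_diagonal(J, 0.0)
    return J

def decay_time(J, max_lag=4000):
    x = 0.1 * rng.standard_normal(N)
    trace = np.empty(steps)
    for t in range(steps):
        x += dt * (-x + J @ np.tanh(x))
        trace[t] = x[0]
    sig = trace[burn:] - trace[burn:].mean()
    var = sig @ sig / len(sig)
    for lag in range(1, max_lag):
        if sig[:-lag] @ sig[lag:] / ((len(sig) - lag) * var) < np.exp(-1.0):
            return lag * dt
    return np.inf

for eta in (0.0, 0.4, 0.8):
    print(f"symmetry eta={eta:.1f}  autocorrelation decay time ≈ {decay_time(coupling(eta))}")
```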
Collapse
Affiliation(s)
- Daniel Martí
- Laboratoire de Neurosciences Cognitives, Inserm UMR No. 960, Ecole Normale Supérieure, PSL Research University, 75230 Paris, France
| | - Nicolas Brunel
- Department of Statistics and Department of Neurobiology, University of Chicago, Chicago, Illinois 60637, USA; Department of Neurobiology and Department of Physics, Duke University, Durham, North Carolina 27710, USA
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, Inserm UMR No. 960, Ecole Normale Supérieure, PSL Research University, 75230 Paris, France
| |
Collapse
|
28
|
Ullner E, Politi A, Torcini A. Ubiquity of collective irregular dynamics in balanced networks of spiking neurons. Chaos 2018; 28:081106. [PMID: 30180628 DOI: 10.1063/1.5049902] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/26/2018] [Accepted: 08/09/2018] [Indexed: 06/08/2023]
Abstract
We revisit the dynamics of a prototypical model of balanced activity in networks of spiking neurons. A detailed investigation of the thermodynamic limit for fixed density of connections (massive coupling) shows that, when inhibition prevails, the asymptotic regime is not asynchronous but rather characterized by a self-sustained irregular, macroscopic (collective) dynamics. So long as the connectivity is massive, this regime is found in many different setups: leaky as well as quadratic integrate-and-fire neurons; large and small coupling strength; and weak and strong external currents.
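For concreteness, the following is a compact sketch of a sparsely connected, inhibition-dominated network of leaky integrate-and-fire neurons with delta-pulse coupling, reporting single-neuron irregularity (CV of interspike intervals) and the fluctuations of the population rate. The parameters are generic Brunel-style choices, not the specific setups analyzed in the paper.

```python
import numpy as np

# Sketch: sparse, inhibition-dominated network of leaky integrate-and-fire
# neurons with delta-pulse coupling and constant external drive.
# Units: ms and mV; all values are illustrative.
rng = np.random.default_rng(3)
N, NE = 1000, 800                       # total and excitatory neurons
p, J, g_rel = 0.1, 0.2, 5.0             # connection prob., EPSP size, rel. inhibition
tau, v_th, v_reset = 20.0, 20.0, 10.0
mu_ext = 24.0                           # suprathreshold constant drive
dt, T = 0.1, 2000.0
steps = int(T / dt)

W = (rng.random((N, N)) < p) * J
W[:, NE:] *= -g_rel                     # columns of inhibitory neurons
np.fill_diagonal(W, 0.0)

v = rng.uniform(v_reset, v_th, N)
spike_times = [[] for _ in range(N)]
pop_rate = np.zeros(steps)

for t in range(steps):
    spiked = v >= v_th
    if spiked.any():
        v[spiked] = v_reset
        v += W[:, spiked].sum(axis=1)   # delta-pulse synaptic kicks
        for i in np.flatnonzero(spiked):
            spike_times[i].append(t * dt)
    v += dt * (mu_ext - v) / tau        # leaky integration
    pop_rate[t] = spiked.sum() / (N * dt * 1e-3)   # instantaneous rate (Hz)

cvs = [np.diff(st).std() / np.diff(st).mean()
       for st in spike_times if len(st) > 2]
print("mean CV of interspike intervals:", round(float(np.mean(cvs)), 2))
print("std of population rate (Hz):   ", round(float(pop_rate.std()), 2))
```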
Collapse
Affiliation(s)
- Ekkehard Ullner
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
| | - Antonio Politi
- Institute for Complex Systems and Mathematical Biology and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
| | - Alessandro Torcini
- Max Planck Institut für Physik komplexer Systeme, Nöthnitzer Str. 38, 01187 Dresden, Germany
| |
Collapse
|
29
|
Mastrogiuseppe F, Ostojic S. Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks. Neuron 2018; 99:609-623.e29. [PMID: 30057201 DOI: 10.1016/j.neuron.2018.07.003] [Citation(s) in RCA: 146] [Impact Index Per Article: 24.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/04/2017] [Revised: 04/27/2018] [Accepted: 07/02/2018] [Indexed: 11/18/2022]
Abstract
Large-scale neural recordings have established that the transformation of sensory stimuli into motor outputs relies on low-dimensional dynamics at the population level, while individual neurons exhibit complex selectivity. Understanding how low-dimensional computations on mixed, distributed representations emerge from the structure of the recurrent connectivity and inputs to cortical networks is a major challenge. Here, we study a class of recurrent network models in which the connectivity is a sum of a random part and a minimal, low-dimensional structure. We show that, in such networks, the dynamics are low dimensional and can be directly inferred from connectivity using a geometrical approach. We exploit this understanding to determine minimal connectivity required to implement specific computations and find that the dynamical range and computational capacity quickly increase with the dimensionality of the connectivity structure. This framework produces testable experimental predictions for the relationship between connectivity, low-dimensional dynamics, and computational features of recorded neurons.
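A minimal sketch of the connectivity structure described above: a random matrix plus a rank-one term built from two vectors m and n. After relaxation one can check how strongly the population state aligns with the structured direction m. The construction and parameters are illustrative and far simpler than the models analyzed in the paper.

```python
import numpy as np

# Sketch: recurrent rate network whose connectivity is a random matrix
# plus a rank-one structure m n^T / N. Parameters are illustrative.
rng = np.random.default_rng(4)
N, g = 500, 0.8
chi = rng.standard_normal((N, N)) / np.sqrt(N)   # random bulk
m = rng.standard_normal(N)
n = rng.standard_normal(N)
J = g * chi + np.outer(m, n) / N                 # random + rank-one part

dt, steps = 0.05, 8000
x = 0.1 * rng.standard_normal(N)
I_ext = 0.5 * rng.standard_normal(N)             # fixed random input pattern

for _ in range(steps):
    x += dt * (-x + J @ np.tanh(x) + I_ext)

phi = np.tanh(x)
kappa = n @ phi / N                              # overlap driving the rank-one term
cos_m = m @ x / (np.linalg.norm(m) * np.linalg.norm(x) + 1e-12)
print("overlap kappa (n . phi / N):", round(float(kappa), 3))
print("cosine of x with m:         ", round(float(cos_m), 3))
```

In the mean-field picture of such networks, the structured component of the activity grows with this overlap, which is the kind of geometrical quantity the paper's analysis is built on.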
Collapse
Affiliation(s)
- Francesca Mastrogiuseppe
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, 75005 Paris, France; Laboratoire de Physique Statistique, CNRS UMR 8550, École Normale Supérieure - PSL Research University, 75005 Paris, France
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, 75005 Paris, France.
| |
Collapse
|
30
|
Gu QLL, Tian ZQK, Kovačič G, Zhou D, Cai D. The Dynamics of Balanced Spiking Neuronal Networks Under Poisson Drive Is Not Chaotic. Front Comput Neurosci 2018; 12:47. [PMID: 30013471 PMCID: PMC6036256 DOI: 10.3389/fncom.2018.00047] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/17/2017] [Accepted: 05/30/2018] [Indexed: 11/30/2022] Open
Abstract
Some previous studies have shown that chaotic dynamics in the balanced state, i.e., one with balanced excitatory and inhibitory inputs into cortical neurons, is the underlying mechanism for the irregularity of neural activity. In this work, we focus on networks of current-based integrate-and-fire neurons with delta-pulse coupling. While we show that the balanced state robustly persists in this system within a broad range of parameters, we mathematically prove that the largest Lyapunov exponent of this type of neuronal network is negative. Therefore, irregular firing activity can exist in the system without chaotic dynamics; that is, the irregularity of balanced neuronal networks need not arise from chaos.
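The result above is analytical, but the quantity in question is easy to probe numerically. Below is a generic sketch of the standard two-trajectory estimate of the largest Lyapunov exponent, applied to a random tanh rate network purely for concreteness (not to the pulse-coupled integrate-and-fire networks treated in the paper); parameters are arbitrary.

```python
import numpy as np

# Sketch: estimate the largest Lyapunov exponent by evolving a reference
# trajectory and a slightly perturbed copy, renormalizing their separation
# at fixed intervals. A random tanh rate network is used for concreteness.
rng = np.random.default_rng(5)
N, g = 200, 1.5
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
dt, renorm_every, n_renorm = 0.05, 20, 2000
d0 = 1e-8

x = 0.5 * rng.standard_normal(N)
for _ in range(4000):                       # discard transient
    x += dt * (-x + J @ np.tanh(x))

y = x + d0 * rng.standard_normal(N) / np.sqrt(N)
log_growth = 0.0
for _ in range(n_renorm):
    for _ in range(renorm_every):
        x += dt * (-x + J @ np.tanh(x))
        y += dt * (-y + J @ np.tanh(y))
    d = np.linalg.norm(y - x)
    log_growth += np.log(d / d0)
    y = x + (y - x) * (d0 / d)              # rescale separation back to d0

print("largest Lyapunov exponent ≈",
      round(log_growth / (n_renorm * renorm_every * dt), 4))
```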
Collapse
Affiliation(s)
- Qing-Long L Gu
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
| | - Zhong-Qi K Tian
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
| | - Gregor Kovačič
- Mathematical Sciences Department, Rensselaer Polytechnic Institute, Troy, NY, United States
| | - Douglas Zhou
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China
| | - David Cai
- School of Mathematical Sciences, MOE-LSC, Institute of Natural Sciences, Shanghai Jiao Tong University, Shanghai, China; Courant Institute of Mathematical Sciences, Center for Neural Science, New York University, New York, NY, United States; NYUAD Institute, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
| |
Collapse
|
31
|
Pereira U, Brunel N. Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data. Neuron 2018; 99:227-238.e4. [PMID: 29909997 PMCID: PMC6091895 DOI: 10.1016/j.neuron.2018.05.038] [Citation(s) in RCA: 33] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2017] [Revised: 04/08/2018] [Accepted: 05/23/2018] [Indexed: 01/12/2023]
Abstract
The attractor neural network scenario is a popular scenario for memory storage in the association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both learning rules and distribution of stored patterns are inferred from distributions of visual responses for novel and familiar images in the inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. Inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting that learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time and another in which firing rates fluctuate chaotically.
Collapse
Affiliation(s)
- Ulises Pereira
- Department of Statistics, The University of Chicago, Chicago, IL 60637, USA
| | - Nicolas Brunel
- Department of Statistics, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA; Department of Physics, Duke University, Durham, NC 27708, USA.
| |
Collapse
|
32
|
Synchronization transition in neuronal networks composed of chaotic or non-chaotic oscillators. Sci Rep 2018; 8:8370. [PMID: 29849108 PMCID: PMC5976724 DOI: 10.1038/s41598-018-26730-9] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2017] [Accepted: 05/11/2018] [Indexed: 12/20/2022] Open
Abstract
Chaotic dynamics has been observed in the activity of neurons and neural networks, in experimental data and numerical simulations. Theoretical studies have proposed an underlying role of chaos in neural systems. Nevertheless, whether chaotic neural oscillators make a significant contribution to network behaviour and whether the dynamical richness of neural networks is sensitive to the dynamics of isolated neurons still remain open questions. We investigated synchronization transitions in heterogeneous neural networks of neurons connected by electrical coupling in a small-world topology. The nodes in our model are oscillatory neurons that – when isolated – can exhibit either chaotic or non-chaotic behaviour, depending on conductance parameters. We found that the heterogeneity of firing rates and firing patterns makes a greater contribution than chaos to the steepness of the synchronization transition curve. We also show that chaotic dynamics of the isolated neurons do not always make a visible difference in the transition to full synchrony. Moreover, macroscopic chaos is observed regardless of the dynamical nature of the neurons. However, performing a Functional Connectivity Dynamics analysis, we show that chaotic nodes can promote what is known as multi-stable behaviour, where the network dynamically switches between a number of different semi-synchronized, metastable states.
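As a much-simplified stand-in for the conductance-based, electrically coupled neurons studied above, the sketch below uses heterogeneous phase oscillators on a small-world graph and tracks the Kuramoto order parameter as the coupling increases. The graph construction, frequencies, and coupling values are all illustrative assumptions.

```python
import numpy as np

# Sketch: synchronization transition on a small-world graph, with
# heterogeneous phase oscillators as a simplified stand-in for
# conductance-based neurons. All parameters are illustrative.
rng = np.random.default_rng(6)
N, k, p_rewire = 100, 6, 0.1

# Watts-Strogatz-style adjacency: ring lattice plus random rewiring.
A = np.zeros((N, N), dtype=bool)
for i in range(N):
    for j in range(1, k // 2 + 1):
        A[i, (i + j) % N] = A[(i + j) % N, i] = True
for i, j in np.argwhere(np.triu(A)):
    if rng.random() < p_rewire:
        new = int(rng.integers(N))
        if new != i and not A[i, new]:
            A[i, j] = A[j, i] = False
            A[i, new] = A[new, i] = True

omega = rng.normal(1.0, 0.2, N)          # heterogeneous natural frequencies
dt, steps, burn = 0.02, 5000, 2500

def mean_order_parameter(K):
    theta = rng.uniform(0.0, 2.0 * np.pi, N)
    R = []
    for t in range(steps):
        phase_diff = np.sin(theta[None, :] - theta[:, None])   # sin(theta_j - theta_i)
        theta += dt * (omega + (K / k) * (A * phase_diff).sum(axis=1))
        if t >= burn:
            R.append(np.abs(np.exp(1j * theta).mean()))
    return float(np.mean(R))

for K in (0.0, 0.5, 1.0, 2.0, 4.0):
    print(f"coupling K={K:3.1f}  <R> = {mean_order_parameter(K):.2f}")
```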
Collapse
|
33
|
Kim CM, Chow CC. Learning recurrent dynamics in spiking networks. eLife 2018; 7:37124. [PMID: 30234488 PMCID: PMC6195349 DOI: 10.7554/elife.37124] [Citation(s) in RCA: 27] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/29/2018] [Accepted: 09/14/2018] [Indexed: 01/27/2023] Open
Abstract
Spiking activity of neurons engaged in learning and performing a task shows complex spatiotemporal dynamics. While the output of recurrent network models can learn to perform various tasks, the possible range of recurrent dynamics that emerge after learning remains unknown. Here we show that modifying the recurrent connectivity with a recursive least squares algorithm provides sufficient flexibility for synaptic and spiking rate dynamics of spiking networks to produce a wide range of spatiotemporal activity. We apply the training method to learn arbitrary firing patterns, stabilize irregular spiking activity in a network of excitatory and inhibitory neurons respecting Dale's law, and reproduce the heterogeneous spiking rate patterns of cortical neurons engaged in motor planning and movement. We identify sufficient conditions for successful learning, characterize two types of learning errors, and assess the network capacity. Our findings show that synaptically coupled recurrent spiking networks possess a vast computational capability that can support the diverse activity patterns in the brain.
Collapse
Affiliation(s)
- Christopher M Kim
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, United States
| | - Carson C Chow
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, United States
| |
Collapse
|
34
|
Nicola W, Clopath C. Supervised learning in spiking neural networks with FORCE training. Nat Commun 2017; 8:2208. [PMID: 29263361 PMCID: PMC5738356 DOI: 10.1038/s41467-017-01827-3] [Citation(s) in RCA: 83] [Impact Index Per Article: 11.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2016] [Accepted: 10/19/2017] [Indexed: 12/31/2022] Open
Abstract
Populations of neurons display an extraordinary diversity in the behaviors they affect and display. Machine learning techniques have recently emerged that allow us to create networks of model neurons that display behaviors of similar complexity. Here we demonstrate the direct applicability of one such technique, the FORCE method, to spiking neural networks. We train these networks to mimic dynamical systems, classify inputs, and store discrete sequences that correspond to the notes of a song. Finally, we use FORCE training to create two biologically motivated model circuits. One is inspired by the zebra finch and successfully reproduces songbird singing. The second network is motivated by the hippocampus and is trained to store and replay a movie scene. FORCE trained networks reproduce behaviors comparable in complexity to their inspired circuits and yield information not easily obtainable with other techniques, such as behavioral responses to pharmacological manipulations and spike timing statistics.
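For readers unfamiliar with FORCE, here is a bare-bones rate-network version of the idea (the paper's contribution is extending it to spiking networks): a chaotic reservoir with feedback, whose readout weights are adapted online by recursive least squares to match a target signal. Network size, gain, target, and the RLS schedule are arbitrary illustrative choices.

```python
import numpy as np

# Sketch: FORCE training of a linear readout on a chaotic rate network,
# using recursive least squares (RLS) and feedback of the readout into
# the network. Rate units are used here for brevity; the cited paper
# develops the spiking-network version. Parameters are illustrative.
rng = np.random.default_rng(7)
N, g, dt = 500, 1.5, 0.1
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
w_fb = rng.uniform(-1.0, 1.0, N)          # feedback weights
w_out = np.zeros(N)                       # readout weights (trained)
P = np.eye(N)                             # running inverse correlation estimate

T_train, T_test = 8000, 2000
target = lambda t: np.sin(2.0 * np.pi * t * dt / 10.0)

x = 0.5 * rng.standard_normal(N)
r = np.tanh(x)
z = 0.0
errors = []
for t in range(T_train + T_test):
    x += dt * (-x + J @ r + w_fb * z)
    r = np.tanh(x)
    z = w_out @ r
    if t < T_train and t % 2 == 0:        # RLS update every other step
        err = z - target(t)
        Pr = P @ r
        P -= np.outer(Pr, Pr) / (1.0 + r @ Pr)
        w_out -= err * (P @ r)
        errors.append(abs(err))

print("mean |error| over first 200 updates:", round(float(np.mean(errors[:200])), 3))
print("mean |error| over last 200 updates: ", round(float(np.mean(errors[-200:])), 3))
```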
Collapse
Affiliation(s)
- Wilten Nicola
- Department of Bioengineering, Imperial College London, Royal School of Mines, London, SW7 2AZ, UK
| | - Claudia Clopath
- Department of Bioengineering, Imperial College London, Royal School of Mines, London, SW7 2AZ, UK.
| |
Collapse
|
35
|
Abstract
Implicit expectations induced by predictable stimulus sequences affect neuronal responses to upcoming stimuli at both the single-cell and neural-population levels. Temporally regular sensory streams also phase-entrain ongoing low-frequency brain oscillations, but how and why this happens is unknown. Here we investigate how random recurrent neural networks without plasticity respond to stimulus streams containing oddballs. We found that the neuronal correlates of sensory stream adaptation emerge if networks generate chaotic oscillations that can be phase-entrained by stimulus streams. The resultant activity patterns are close to critical and support history-dependent responses on long timescales. Because critical network entrainment is a slow process, stimulus responses adapt gradually over multiple repetitions. Repeated stimuli generate suppressed responses, but oddball responses are large and distinct. Oscillatory mismatch responses persist in population activity for long periods after stimulus offset, while individual-cell mismatch responses are strongly phasic. These effects are weakened in temporally irregular sensory streams. Thus, we show that network phase entrainment provides a biologically plausible mechanism for neural oddball detection. Our results do not depend on specific network characteristics, are consistent with experimental studies, and may be relevant for multiple pathologies demonstrating altered mismatch processing, such as schizophrenia and depression.
Collapse
Affiliation(s)
- Adam Ponzi
- IBM T.J. Watson Research Center, Yorktown Heights, NY, USA.
- Okinawa Institute of Science and Technology Graduate University (OIST), Okinawa, Japan.
| |
Collapse
|
36
|
Jedlicka P. Revisiting the Quantum Brain Hypothesis: Toward Quantum (Neuro)biology? Front Mol Neurosci 2017; 10:366. [PMID: 29163041 PMCID: PMC5681944 DOI: 10.3389/fnmol.2017.00366] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2017] [Accepted: 10/24/2017] [Indexed: 12/14/2022] Open
Abstract
The nervous system is a non-linear dynamical complex system with many feedback loops. The conventional wisdom is that quantum fluctuations in the brain are self-averaging and thus functionally negligible. However, this intuition might be misleading in the case of non-linear complex systems. Because of an extreme sensitivity to initial conditions, in complex systems the microscopic fluctuations may be amplified and thereby affect the system's behavior. In this way quantum dynamics might influence neuronal computations. Accumulating evidence in non-neuronal systems indicates that biological evolution is able to exploit quantum stochasticity. The recent rise of quantum biology as an emerging field at the border between quantum physics and the life sciences suggests that quantum events could also play a non-trivial role in neuronal cells. Direct experimental evidence for this is still missing, but future research should address the possibility that quantum events contribute to the extremely high complexity, variability, and computational power of neuronal dynamics.
Collapse
|
37
|
Balanced excitation and inhibition are required for high-capacity, noise-robust neuronal selectivity. Proc Natl Acad Sci U S A 2017; 114:E9366-E9375. [PMID: 29042519 DOI: 10.1073/pnas.1705841114] [Citation(s) in RCA: 49] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Neurons and networks in the cerebral cortex must operate reliably despite multiple sources of noise. To evaluate the impact of both input and output noise, we determine the robustness of single-neuron stimulus selective responses, as well as the robustness of attractor states of networks of neurons performing memory tasks. We find that robustness to output noise requires synaptic connections to be in a balanced regime in which excitation and inhibition are strong and largely cancel each other. We evaluate the conditions required for this regime to exist and determine the properties of networks operating within it. A plausible synaptic plasticity rule for learning that balances weight configurations is presented. Our theory predicts an optimal ratio of the number of excitatory and inhibitory synapses for maximizing the encoding capacity of balanced networks for given statistics of afferent activations. Previous work has shown that balanced networks amplify spatiotemporal variability and account for observed asynchronous irregular states. Here we present a distinct type of balanced network that amplifies small changes in the impinging signals and emerges automatically from learning to perform neuronal and network functions robustly.
Collapse
|
38
|
How linear response shaped models of neural circuits and the quest for alternatives. Curr Opin Neurobiol 2017; 46:234-240. [DOI: 10.1016/j.conb.2017.09.001] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2017] [Accepted: 09/07/2017] [Indexed: 11/23/2022]
|
39
|
Huang C, Doiron B. Once upon a (slow) time in the land of recurrent neuronal networks…. Curr Opin Neurobiol 2017; 46:31-38. [DOI: 10.1016/j.conb.2017.07.003] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2017] [Revised: 06/21/2017] [Accepted: 07/06/2017] [Indexed: 12/22/2022]
|
40
|
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386 DOI: 10.1016/j.conb.2017.07.011] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2017] [Revised: 07/21/2017] [Accepted: 07/27/2017] [Indexed: 10/19/2022]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Collapse
Affiliation(s)
| | - Yu Hu
- Center for Brain Science, Harvard University, United States
| | - Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
| | - Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
| | - Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
| | - Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
| | - Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States.
| |
Collapse
|
41
|
Hennequin G, Agnes EJ, Vogels TP. Inhibitory Plasticity: Balance, Control, and Codependence. Annu Rev Neurosci 2017; 40:557-579. [DOI: 10.1146/annurev-neuro-072116-031005] [Citation(s) in RCA: 140] [Impact Index Per Article: 20.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Affiliation(s)
- Guillaume Hennequin
- Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 3EJ, United Kingdom
| | - Everton J. Agnes
- Centre for Neural Circuits and Behaviour, Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3SR, United Kingdom
| | - Tim P. Vogels
- Centre for Neural Circuits and Behaviour, Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3SR, United Kingdom
| |
Collapse
|
42
|
Abstract
Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets, we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks’ spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities, including those of different cell types, combine with connectivity to shape population activity and function. Neuronal networks, like many biological systems, exhibit variable activity. This activity is shaped by both the underlying biology of the component neurons and the structure of their interactions. How can we combine knowledge of these two things (models of individual neurons and of their interactions) to predict the statistics of single- and multi-neuron activity? Current approaches rely on linearizing neural activity around a stationary state. In the face of neural nonlinearities, however, these linear methods can fail to predict spiking statistics and even fail to correctly predict whether activity is stable or pathological. Here, we show how to calculate any spike train cumulant in a broad class of models, while systematically accounting for nonlinear effects. We then study a fundamental effect of nonlinear input-rate transfer, the coupling between different orders of spiking statistics, and how this depends on single-neuron and network properties.
Collapse
Affiliation(s)
- Gabriel Koch Ocker
- Allen Institute for Brain Science, Seattle, Washington, United States of America
| | - Krešimir Josić
- Department of Mathematics and Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
- Department of BioSciences, Rice University, Houston, Texas, United States of America
| | - Eric Shea-Brown
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
- Department of Physiology and Biophysics, and UW Institute of Neuroengineering, University of Washington, Seattle, Washington, United States of America
| | - Michael A. Buice
- Allen Institute for Brain Science, Seattle, Washington, United States of America
- Department of Applied Mathematics, University of Washington, Seattle, Washington, United States of America
| |
Collapse
|
43
|
A canonical neural mechanism for behavioral variability. Nat Commun 2017; 8:15415. [PMID: 28530225 PMCID: PMC5458148 DOI: 10.1038/ncomms15415] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2016] [Accepted: 03/22/2017] [Indexed: 02/01/2023] Open
Abstract
The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5-6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these 'universal' statistics.
Collapse
|
44
|
Duarte R, Seeholzer A, Zilles K, Morrison A. Synaptic patterning and the timescales of cortical dynamics. Curr Opin Neurobiol 2017; 43:156-165. [PMID: 28407562 DOI: 10.1016/j.conb.2017.02.007] [Citation(s) in RCA: 27] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2016] [Revised: 11/22/2016] [Accepted: 02/08/2017] [Indexed: 11/19/2022]
Abstract
Neocortical circuits, as large heterogeneous recurrent networks, can potentially operate and process signals at multiple timescales, but appear to be differentially tuned to operate within certain temporal receptive windows. The modular and hierarchical organization of this selectivity mirrors anatomical and physiological relations throughout the cortex and is likely determined by the regional electrochemical composition. Being consistently patterned and actively regulated, the expression of molecules involved in synaptic transmission constitutes the most significant source of laminar and regional variability. Due to their complex kinetics and adaptability, synapses form a natural primary candidate underlying this regional temporal selectivity. The ability of cortical networks to reflect the temporal structure of the sensory environment can thus be regulated by evolutionary and experience-dependent processes.
Collapse
Affiliation(s)
- Renato Duarte
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Bernstein Center Freiburg, Albert-Ludwig University of Freiburg, Germany; Faculty of Biology, Albert-Ludwig University of Freiburg, Freiburg im Breisgau, Germany; Institute of Adaptive and Neural Computation, School of Informatics, University of Edinburgh, UK.
| | - Alexander Seeholzer
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Karl Zilles
- Institute of Neuroscience and Medicine (INM-1), Jülich Research Centre, Jülich, Germany; JARA-BRAIN, Aachen, Germany
| | - Abigail Morrison
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Bernstein Center Freiburg, Albert-Ludwig University of Freiburg, Germany; Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany
| |
Collapse
|
45
|
Mastrogiuseppe F, Ostojic S. Intrinsically-generated fluctuating activity in excitatory-inhibitory networks. PLoS Comput Biol 2017; 13:e1005498. [PMID: 28437436 PMCID: PMC5421821 DOI: 10.1371/journal.pcbi.1005498] [Citation(s) in RCA: 43] [Impact Index Per Article: 6.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/13/2016] [Revised: 05/08/2017] [Accepted: 04/04/2017] [Indexed: 12/05/2022] Open
Abstract
Recurrent networks of non-linear units display a variety of dynamical regimes depending on the structure of their synaptic connectivity. A particularly remarkable phenomenon is the appearance of strongly fluctuating, chaotic activity in networks of deterministic but randomly connected rate units. How such intrinsically generated fluctuations appear in more realistic networks of spiking neurons has been a long-standing question. To ease the comparison between rate and spiking networks, recent works investigated the dynamical regimes of randomly-connected rate networks with segregated excitatory and inhibitory populations, and firing rates constrained to be positive. These works derived general dynamical mean field (DMF) equations describing the fluctuating dynamics, but solved these equations only in the case of purely inhibitory networks. Using a simplified excitatory-inhibitory architecture in which DMF equations are more easily tractable, here we show that the presence of excitation qualitatively modifies the fluctuating activity compared to purely inhibitory networks. In the presence of excitation, intrinsically generated fluctuations induce a strong increase in mean firing rates, a phenomenon that is much weaker in purely inhibitory networks. Excitation moreover induces two different fluctuating regimes: for moderate overall coupling, recurrent inhibition is sufficient to stabilize fluctuations; for strong coupling, firing rates are stabilized solely by the upper bound imposed on activity, even if inhibition is stronger than excitation. These results extend to more general network architectures, and to rate networks receiving noisy inputs mimicking spiking activity. Finally, we show that signatures of the second dynamical regime appear in networks of integrate-and-fire neurons.
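The following sketch sets up the kind of network discussed above in the simplest possible way: segregated excitatory and inhibitory populations, inhibition-dominated mean coupling, and a positive, saturating transfer function; it just reports the mean rate and the across-unit spread at a weak and a strong overall coupling. The architecture and numbers are illustrative assumptions, not the specific model solved in the paper.

```python
import numpy as np

# Sketch: random rate network with segregated excitatory and inhibitory
# populations and a positive, bounded transfer function. Reports mean
# rate and across-unit spread at two coupling strengths. Illustrative only.
rng = np.random.default_rng(8)
N, NE = 400, 320
phi = lambda x: np.clip(x, 0.0, 2.0)      # positive, saturating transfer function
dt, steps, burn = 0.05, 20000, 5000
ext = 1.0                                 # constant external input

def run(coupling):
    J = np.abs(rng.standard_normal((N, N))) * coupling / np.sqrt(N)
    J[:, NE:] *= -4.0 * NE / (N - NE)     # inhibition dominates on average
    x = 0.5 + 0.1 * rng.standard_normal(N)
    mean_rates = []
    for t in range(steps):
        x += dt * (-x + J @ phi(x) + ext)
        if t >= burn:
            mean_rates.append(phi(x).mean())
    return float(np.mean(mean_rates)), float(phi(x).std())

for c in (0.3, 3.0):
    m, spread = run(c)
    print(f"coupling={c:3.1f}  mean rate={m:.3f}  across-unit spread={spread:.3f}")
```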
Collapse
Affiliation(s)
- Francesca Mastrogiuseppe
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
- Laboratoire de Physique Statistique, CNRS UMR 8550, École Normale Supérieure - PSL Research University, Paris, France
| | - Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
| |
Collapse
|
46
|
Bimbard C, Ledoux E, Ostojic S. Instability to a heterogeneous oscillatory state in randomly connected recurrent networks with delayed interactions. Phys Rev E 2016; 94:062207. [PMID: 28085410 DOI: 10.1103/physreve.94.062207] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/26/2016] [Indexed: 06/06/2023]
Abstract
Oscillatory dynamics are ubiquitous in biological networks. Possible sources of oscillations are well understood in low-dimensional systems but have not been fully explored in high-dimensional networks. Here we study large networks consisting of randomly coupled rate units. We identify a type of bifurcation in which a continuous part of the eigenvalue spectrum of the linear stability matrix crosses the instability line at nonzero frequency. This bifurcation occurs when the interactions are delayed and partially antisymmetric and leads to a heterogeneous oscillatory state in which oscillations are apparent in the activity of individual units but not on the population-average level.
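A quick numerical sketch of the ingredients highlighted above: randomly coupled rate units with a partially antisymmetric coupling matrix and a fixed interaction delay, integrated with a circular delay buffer; the dominant frequency of a single unit's spectrum gives a crude read-out of whether an oscillatory component emerges. Coupling statistics, delay, and gain are illustrative assumptions.

```python
import numpy as np

# Sketch: random rate network with delayed, partially antisymmetric
# interactions (eta < 0 makes J_ij and J_ji anticorrelated). A circular
# buffer implements the delay. All parameter values are illustrative.
rng = np.random.default_rng(9)
N, g, eta = 300, 2.0, -0.6
A = rng.standard_normal((N, N))
J = g * (A + eta * A.T) / np.sqrt((1.0 + eta**2) * N)

delay, dt = 1.0, 0.05                     # delay in units of the unit time constant
d_steps = int(delay / dt)
steps, burn = 40000, 10000

x = 0.1 * rng.standard_normal(N)
buf = np.tile(np.tanh(x), (d_steps, 1))   # delayed rates, circular buffer
trace = np.empty(steps)
for t in range(steps):
    r_delayed = buf[t % d_steps]          # rates from d_steps steps ago
    x += dt * (-x + J @ r_delayed)
    buf[t % d_steps] = np.tanh(x)         # overwrite slot with current rates
    trace[t] = x[0]

sig = trace[burn:] - trace[burn:].mean()
power = np.abs(np.fft.rfft(sig)) ** 2
freqs = np.fft.rfftfreq(len(sig), d=dt)
print("dominant frequency of unit 0 ≈", round(float(freqs[1:][np.argmax(power[1:])]), 3))
```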
Collapse
Affiliation(s)
- Célian Bimbard
- Laboratoire des Systèmes Perceptifs, Équipe Audition, CNRS UMR 8248, École Normale Supérieure, Paris, France
| | - Erwan Ledoux
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure-PSL Research University, Paris, France
| | - Srdjan Ostojic
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure-PSL Research University, Paris, France
| |
Collapse
|
47
|
Engelken R, Farkhooi F, Hansel D, van Vreeswijk C, Wolf F. A reanalysis of "Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons". F1000Res 2016; 5:2043. [PMID: 27746905 PMCID: PMC5040152 DOI: 10.12688/f1000research.9144.1] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 08/08/2016] [Indexed: 11/20/2022] Open
Abstract
Neuronal activity in the central nervous system varies strongly in time and across neuronal populations. It is a longstanding proposal that such fluctuations generically arise from chaotic network dynamics. Various theoretical studies predict that the rich dynamics of rate models operating in the chaotic regime can subserve circuit computation and learning. Neurons in the brain, however, communicate via spikes and it is a theoretical challenge to obtain similar rate fluctuations in networks of spiking neuron models. A recent study investigated spiking balanced networks of leaky integrate-and-fire (LIF) neurons and compared their dynamics to a matched rate network with identical topology, where single unit input-output functions were chosen from isolated LIF neurons receiving Gaussian white noise input. A mathematical analogy between the chaotic instability in networks of rate units and the spiking network dynamics was proposed. Here we revisit the behavior of the spiking LIF networks and these matched rate networks. We find expected hallmarks of a chaotic instability in the rate network: For supercritical coupling strength near the transition point, the autocorrelation time diverges. For subcritical coupling strengths, we observe critical slowing down in response to small external perturbations. In the spiking network, we found in contrast that the timescale of the autocorrelations is insensitive to the coupling strength and that rate deviations resulting from small input perturbations rapidly decay. The decay speed even accelerates for increasing coupling strength. In conclusion, our reanalysis demonstrates fundamental differences between the behavior of pulse-coupled spiking LIF networks and rate networks with matched topology and input-output function. In particular, there is no indication of a corresponding chaotic instability in the spiking network.
Collapse
Affiliation(s)
- Rainer Engelken
- Max Planck Institute for Dynamics and Self-Organization (MPI-DS), Bernstein Center for Computational Neuroscience Göttingen, Faculty of Physics, University of Göttingen, Göttingen, Germany; Collaborative Research Center 889, University of Göttingen, Göttingen, 37099, Germany
| | - Farzad Farkhooi
- Institut für Mathematik, Technische Universität Berlin and Bernstein Center for Computational Neuroscience, Berlin, Germany
| | - David Hansel
- Cerebral Dynamics, Learning and Memory Research Group, Center for Neurophysics, Physiology and Pathology, CNRS UMR8119, Université Paris Descartes, Paris, France
| | - Carl van Vreeswijk
- Cerebral Dynamics, Learning and Memory Research Group, Center for Neurophysics, Physiology and Pathology, CNRS UMR8119, Université Paris Descartes, Paris, France
| | - Fred Wolf
- Max Planck Institute for Dynamics and Self-Organization (MPI-DS), Bernstein Center for Computational Neuroscience Göttingen, Faculty of Physics, University of Göttingen, Göttingen, Germany; Collaborative Research Center 889, University of Göttingen, Göttingen, 37099, Germany
| |
Collapse
|
48
|
Doiron B, Litwin-Kumar A, Rosenbaum R, Ocker GK, Josić K. The mechanics of state-dependent neural correlations. Nat Neurosci 2016; 19:383-93. [PMID: 26906505 DOI: 10.1038/nn.4242] [Citation(s) in RCA: 173] [Impact Index Per Article: 21.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2015] [Accepted: 01/12/2016] [Indexed: 12/12/2022]
Abstract
Simultaneous recordings from large neural populations are becoming increasingly common. An important feature of population activity is the trial-to-trial correlated fluctuation of spike train outputs from recorded neuron pairs. Similar to the firing rate of single neurons, correlated activity can be modulated by a number of factors, from changes in arousal and attentional state to learning and task engagement. However, the physiological mechanisms that underlie these changes are not fully understood. We review recent theoretical results that identify three separate mechanisms that modulate spike train correlations: changes in input correlations, internal fluctuations and the transfer function of single neurons. We first examine these mechanisms in feedforward pathways and then show how the same approach can explain the modulation of correlations in recurrent networks. Such mechanistic constraints on the modulation of population activity will be important in statistical analyses of high-dimensional neural data.
Collapse
Affiliation(s)
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA
| | - Ashok Litwin-Kumar
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Center for Theoretical Neuroscience, Columbia University, New York, New York, USA
| | - Robert Rosenbaum
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, USA; Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, USA
| | - Gabriel K Ocker
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Allen Institute for Brain Science, Seattle, Washington, USA
| | - Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, USA; Department of Biology and Biochemistry, University of Houston, Houston, Texas, USA
| |
Collapse
|
49
|
Thalmeier D, Uhlmann M, Kappen HJ, Memmesheimer RM. Learning Universal Computations with Spikes. PLoS Comput Biol 2016; 12:e1004895. [PMID: 27309381 PMCID: PMC4911146 DOI: 10.1371/journal.pcbi.1004895] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/02/2015] [Accepted: 04/01/2016] [Indexed: 11/19/2022] Open
Abstract
Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows such networks to learn even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.
Collapse
Affiliation(s)
- Dominik Thalmeier
- Donders Institute, Department of Biophysics, Radboud University, Nijmegen, Netherlands
| | - Marvin Uhlmann
- Max Planck Institute for Psycholinguistics, Department for Neurobiology of Language, Nijmegen, Netherlands
- Donders Institute, Department for Neuroinformatics, Radboud University, Nijmegen, Netherlands
| | - Hilbert J. Kappen
- Donders Institute, Department of Biophysics, Radboud University, Nijmegen, Netherlands
| | - Raoul-Martin Memmesheimer
- Donders Institute, Department for Neuroinformatics, Radboud University, Nijmegen, Netherlands
- Center for Theoretical Neuroscience, Columbia University, New York, New York, United States of America
| |
Collapse
|
50
|
Tan AYY. Spatial diversity of spontaneous activity in the cortex. Front Neural Circuits 2015; 9:48. [PMID: 26441547 PMCID: PMC4585302 DOI: 10.3389/fncir.2015.00048] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/18/2015] [Accepted: 08/24/2015] [Indexed: 12/05/2022] Open
Abstract
The neocortex is a layered sheet across which a basic organization is thought to widely apply. The variety of spontaneous activity patterns is similar throughout the cortex, consistent with the notion of a basic cortical organization. However, the basic organization is only an outline which needs adjustments and additions to account for the structural and functional diversity across cortical layers and areas. Such diversity suggests that spontaneous activity is spatially diverse in any particular behavioral state. Accordingly, this review summarizes the laminar and areal diversity in cortical activity during fixation and slow oscillations, and the effects of attention, anesthesia and plasticity on the cortical distribution of spontaneous activity. Among questions that remain open, characterizing the spatial diversity in spontaneous membrane potential may help elucidate how differences in circuitry among cortical regions support their varied functions. More work is also needed to understand whether cortical spontaneous activity not only reflects cortical circuitry, but also contributes to determining the outcome of plasticity, so that it is itself a factor shaping the functional diversity of the cortex.
Collapse
Affiliation(s)
- Andrew Y Y Tan
- Center for Perceptual Systems and Department of Neuroscience, The University of Texas at Austin, Austin, TX, USA
| |
Collapse
|