1. Rudelt L, González Marx D, Spitzner FP, Cramer B, Zierenberg J, Priesemann V. Signatures of hierarchical temporal processing in the mouse visual system. PLoS Comput Biol 2024; 20:e1012355. PMID: 39173067; PMCID: PMC11373856; DOI: 10.1371/journal.pcbi.1012355.
Abstract
A core challenge for the brain is to process information across various timescales. This could be achieved by a hierarchical organization of temporal processing through intrinsic mechanisms (e.g., recurrent coupling or adaptation), but recent evidence from spike recordings of the rodent visual system seems to conflict with this hypothesis. Here, we used an optimized information-theoretic and classical autocorrelation analysis to show that information- and correlation timescales of spiking activity increase along the anatomical hierarchy of the mouse visual system under visual stimulation, while information-theoretic predictability decreases. Moreover, intrinsic timescales for spontaneous activity displayed a similar hierarchy, whereas the hierarchy of predictability was stimulus-dependent. We could reproduce these observations in a basic recurrent network model with correlated sensory input. Our findings suggest that the rodent visual system employs intrinsic mechanisms to achieve longer integration for higher cortical areas, while simultaneously reducing predictability for an efficient neural code.
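The timescale analyses referenced in this abstract can be illustrated with a minimal sketch (this is not the authors' optimized information-theoretic estimator; the function and parameter names here are our own): fit an exponential decay to the autocorrelation of a simulated process whose true timescale is known.

```python
import numpy as np

def intrinsic_timescale(x, max_lag=40):
    """Estimate the intrinsic (correlation) timescale of a signal by
    fitting an exponential decay to its autocorrelation function."""
    x = x - x.mean()
    var = x.var()
    lags = np.arange(1, max_lag + 1)
    acf = np.array([np.mean(x[:-k] * x[k:]) / var for k in lags])
    valid = acf > 0                      # log-fit only on positive coefficients
    slope = np.polyfit(lags[valid], np.log(acf[valid]), 1)[0]
    return -1.0 / slope                  # acf ~ exp(-lag / tau)

# An AR(1) process x_t = m * x_{t-1} + noise has true timescale tau = -1/ln(m)
rng = np.random.default_rng(0)
m, n = 0.9, 200_000
eta = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = m * x[t - 1] + eta[t]
tau_true = -1.0 / np.log(m)
tau_hat = intrinsic_timescale(x)
print(tau_true, tau_hat)
```

In this picture, longer integration in higher cortical areas would show up as slower autocorrelation decay, i.e. a larger fitted tau.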
Affiliation(s)
- Lucas Rudelt
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, Göttingen, Germany
- Daniel González Marx
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, Göttingen, Germany
- F Paul Spitzner
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, Göttingen, Germany
- Benjamin Cramer
- Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany
- Johannes Zierenberg
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, Göttingen, Germany
- Viola Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, Göttingen, Germany
- Bernstein Center for Computational Neuroscience (BCCN), Göttingen, Germany
2. Dubinin I, Effenberger F. Fading memory as inductive bias in residual recurrent networks. Neural Netw 2024; 173:106179. PMID: 38387205; DOI: 10.1016/j.neunet.2024.106179.
Abstract
Residual connections have been proposed as an architecture-based inductive bias that mitigates the problem of exploding and vanishing gradients and increases task performance in both feed-forward and recurrent networks (RNNs) trained with the backpropagation algorithm. Yet, little is known about how residual connections in RNNs influence their dynamics and fading memory properties. Here, we introduce weakly coupled residual recurrent networks (WCRNNs) in which residual connections result in well-defined Lyapunov exponents and allow for studying properties of fading memory. We investigate how the residual connections of WCRNNs influence their performance, network dynamics, and memory properties on a set of benchmark tasks. We show that several distinct forms of residual connections yield effective inductive biases that result in increased network expressivity. In particular, these are residual connections that (i) result in network dynamics at the proximity of the edge of chaos, (ii) allow networks to capitalize on characteristic spectral properties of the data, and (iii) result in heterogeneous memory properties. In addition, we demonstrate how our results can be extended to non-linear residuals and introduce a weakly coupled residual initialization scheme that can be used for Elman RNNs.
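The link between weak residual coupling and slow fading memory can be sketched with a toy update in the spirit of the abstract (the update rule h ← h + α·(tanh(Wh) − h) and the name α for the coupling strength are our simplification, not the paper's WCRNN definition): the weaker the coupling, the longer a small perturbation of the hidden state persists.

```python
import numpy as np

def perturbation_halflife(alpha, n=64, steps=2000, seed=1):
    """Steps until the distance between a trajectory and a slightly
    perturbed copy halves, under h <- h + alpha * (tanh(W h) - h)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n)) / np.sqrt(n)   # random recurrent weights
    h = 0.1 * rng.standard_normal(n)
    h_p = h + 1e-6 * rng.standard_normal(n)        # perturbed copy
    d0 = np.linalg.norm(h - h_p)
    for t in range(1, steps + 1):
        h = h + alpha * (np.tanh(W @ h) - h)
        h_p = h_p + alpha * (np.tanh(W @ h_p) - h_p)
        if np.linalg.norm(h - h_p) <= d0 / 2:
            return t
    return steps

hl_strong = perturbation_halflife(0.5)    # strong coupling: fast fading
hl_weak = perturbation_halflife(0.05)     # weak coupling: slow fading
print(hl_strong, hl_weak)
```

Small α keeps the map close to the identity, so perturbations, and hence memory of past inputs, fade more slowly.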
Affiliation(s)
- Igor Dubinin
- Ernst Strüngmann Institute, Deutschordenstraße 46, Frankfurt am Main, 60528, Germany; Frankfurt Institute for Advanced Studies, Ruth-Moufang-Straße 1, Frankfurt am Main, 60438, Germany.
- Felix Effenberger
- Ernst Strüngmann Institute, Deutschordenstraße 46, Frankfurt am Main, 60528, Germany.
3. Chen R, Singh M, Braver TS, Ching S. Dynamical models reveal anatomically reliable attractor landscapes embedded in resting state brain networks. bioRxiv [Preprint] 2024:2024.01.15.575745. PMID: 38293124; PMCID: PMC10827065; DOI: 10.1101/2024.01.15.575745.
Abstract
Analyses of functional connectivity (FC) in resting-state brain networks (RSNs) have generated many insights into cognition. However, the mechanistic underpinnings of FC and RSNs are still not well-understood. It remains debated whether resting state activity is best characterized as noise-driven fluctuations around a single stable state, or instead, as a nonlinear dynamical system with nontrivial attractors embedded in the RSNs. Here, we provide evidence for the latter, by constructing whole-brain dynamical systems models from individual resting-state fMRI (rfMRI) recordings, using the Mesoscale Individualized NeuroDynamic (MINDy) platform. The MINDy models consist of hundreds of neural masses representing brain parcels, connected by fully trainable, individualized weights. We found that our models manifested a diverse taxonomy of nontrivial attractor landscapes including multiple equilibria and limit cycles. However, when projected into anatomical space, these attractors mapped onto a limited set of canonical RSNs, including the default mode network (DMN) and frontoparietal control network (FPN), which were reliable at the individual level. Further, by creating convex combinations of models, bifurcations were induced that recapitulated the full spectrum of dynamics found via fitting. These findings suggest that the resting brain traverses a diverse set of dynamics, which generates several distinct but anatomically overlapping attractor landscapes. Treating rfMRI as a unimodal stationary process (i.e., conventional FC) may miss critical attractor properties and structure within the resting brain. Instead, these may be better captured through neural dynamical modeling and analytic approaches. The results provide new insights into the generative mechanisms and intrinsic spatiotemporal organization of brain networks.
Affiliation(s)
- Ruiqi Chen
- Division of Biology and Biomedical Sciences, Washington University in St. Louis, St. Louis, MO 63108
- Matthew Singh
- Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO 63108
- Todd S. Braver
- Department of Psychological & Brain Sciences, Washington University in St. Louis, St. Louis, MO 63108
- ShiNung Ching
- Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO 63108
4. Grosu GF, Hopp AV, Moca VV, Bârzan H, Ciuparu A, Ercsey-Ravasz M, Winkel M, Linde H, Mureșan RC. The fractal brain: scale-invariance in structure and dynamics. Cereb Cortex 2023; 33:4574-4605. PMID: 36156074; PMCID: PMC10110456; DOI: 10.1093/cercor/bhac363.
Abstract
The past 40 years have witnessed extensive research on fractal structure and scale-free dynamics in the brain. Although considerable progress has been made, a comprehensive picture has yet to emerge, and needs further linking to a mechanistic account of brain function. Here, we review these concepts, connecting observations across different levels of organization, from both a structural and functional perspective. We argue that, paradoxically, the level of cortical circuits is the least understood from a structural point of view and perhaps the best studied from a dynamical one. We further link observations about scale-freeness and fractality with evidence that the environment provides constraints that may explain the usefulness of fractal structure and scale-free dynamics in the brain. Moreover, we discuss evidence that behavior exhibits scale-free properties, likely emerging from similarly organized brain dynamics, enabling an organism to thrive in an environment that shares the same organizational principles. Finally, we review the sparse evidence for and try to speculate on the functional consequences of fractality and scale-freeness for brain computation. These properties may endow the brain with computational capabilities that transcend current models of neural computation and could hold the key to unraveling how the brain constructs percepts and generates behavior.
Affiliation(s)
- George F Grosu
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
- Faculty of Electronics, Telecommunications and Information Technology, Technical University of Cluj-Napoca, Str. Memorandumului 28, 400114 Cluj-Napoca, Romania
- Vasile V Moca
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
- Harald Bârzan
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
- Faculty of Electronics, Telecommunications and Information Technology, Technical University of Cluj-Napoca, Str. Memorandumului 28, 400114 Cluj-Napoca, Romania
- Andrei Ciuparu
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
- Faculty of Electronics, Telecommunications and Information Technology, Technical University of Cluj-Napoca, Str. Memorandumului 28, 400114 Cluj-Napoca, Romania
- Maria Ercsey-Ravasz
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
- Faculty of Physics, Babes-Bolyai University, Str. Mihail Kogalniceanu 1, 400084 Cluj-Napoca, Romania
- Mathias Winkel
- Merck KGaA, Frankfurter Straße 250, 64293 Darmstadt, Germany
- Helmut Linde
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
- Merck KGaA, Frankfurter Straße 250, 64293 Darmstadt, Germany
- Raul C Mureșan
- Department of Experimental and Theoretical Neuroscience, Transylvanian Institute of Neuroscience, Str. Ploiesti 33, 400157 Cluj-Napoca, Romania
5. Jones SA, Barfield JH, Norman VK, Shew WL. Scale-free behavioral dynamics directly linked with scale-free cortical dynamics. eLife 2023; 12:e79950. PMID: 36705565; PMCID: PMC9931391; DOI: 10.7554/eLife.79950.
Abstract
Naturally occurring body movements and collective neural activity both exhibit complex dynamics, often with scale-free, fractal spatiotemporal structure. Scale-free dynamics of both brain and behavior are important because each is associated with functional benefits to the organism. Despite their similarities, scale-free brain activity and scale-free behavior have been studied separately, without a unified explanation. Here, we show that scale-free dynamics of mouse behavior and neurons in the visual cortex are strongly related. Surprisingly, the scale-free neural activity is limited to specific subsets of neurons, and these scale-free subsets exhibit stochastic winner-take-all competition with other neural subsets. This observation is inconsistent with prevailing theories of scale-free dynamics in neural systems, which stem from the criticality hypothesis. We develop a computational model which incorporates known cell-type-specific circuit structure, explaining our findings with a new type of critical dynamics. Our results establish neural underpinnings of scale-free behavior and clear behavioral relevance of scale-free neural activity.
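A common operational definition used in this literature (a generic sketch, not necessarily the authors' exact pipeline) extracts "avalanches" from binned population activity as maximal runs of non-empty time bins; scale-free dynamics then show up as a heavy-tailed distribution of the resulting sizes.

```python
def avalanche_sizes(counts):
    """Split a binned spike-count series into avalanches: maximal runs
    of non-empty bins separated by empty bins; size = spikes per run."""
    sizes, current = [], 0
    for c in counts:
        if c > 0:
            current += c           # extend the ongoing avalanche
        elif current > 0:
            sizes.append(current)  # an empty bin ends the avalanche
            current = 0
    if current > 0:
        sizes.append(current)      # close a run that reaches the end
    return sizes

print(avalanche_sizes([0, 2, 1, 0, 0, 3, 0, 1, 1, 1, 0]))  # → [3, 3, 3]
```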
Affiliation(s)
- Sabrina A Jones
- Department of Physics, University of Arkansas at Fayetteville, Fayetteville, United States
- Jacob H Barfield
- Department of Physics, University of Arkansas at Fayetteville, Fayetteville, United States
- V Kindler Norman
- Department of Physics, University of Arkansas at Fayetteville, Fayetteville, United States
- Woodrow L Shew
- Department of Physics, University of Arkansas at Fayetteville, Fayetteville, United States
6. Davenport F, Gallacher J, Kourtzi Z, Koychev I, Matthews PM, Oxtoby NP, Parkes LM, Priesemann V, Rowe JB, Smye SW, Zetterberg H. Neurodegenerative disease of the brain: a survey of interdisciplinary approaches. J R Soc Interface 2023; 20:20220406. PMID: 36651180; PMCID: PMC9846433; DOI: 10.1098/rsif.2022.0406.
Abstract
Neurodegenerative diseases of the brain pose a major and increasing global health challenge, with only limited progress made in developing effective therapies over the last decade. Interdisciplinary research is improving understanding of these diseases and this article reviews such approaches, with particular emphasis on tools and techniques drawn from physics, chemistry, artificial intelligence and psychology.
Affiliation(s)
- John Gallacher
- Director of Dementias Platform, Department of Psychiatry, University of Oxford, Oxford, UK
- Zoe Kourtzi
- Professor of Cognitive Computational Neuroscience, Department of Psychology, University of Cambridge, UK
- Ivan Koychev
- Senior Clinical Researcher, Department of Psychiatry, University of Oxford, Oxford, UK
- Consultant Neuropsychiatrist, Oxford University Hospitals NHS Foundation Trust, Oxford, UK
- Paul M. Matthews
- Department of Brain Sciences and UK Dementia Research Institute Centre, Imperial College London, London, UK
- Neil P. Oxtoby
- UCL Centre for Medical Image Computing and Department of Computer Science, University College London, Gower Street, London, UK
- Laura M. Parkes
- School of Health Sciences, Faculty of Biology, Medicine and Health, The University of Manchester, Oxford Road, Manchester, M13 9PL, UK
- Geoffrey Jefferson Brain Research Centre, Manchester Academic Health Science Centre, Manchester, UK
- Viola Priesemann
- Max Planck Group Leader and Fellow of the Schiemann Kolleg, Max Planck Institute for Dynamics and Self-Organization and Bernstein Center for Computational Neuroscience, Göttingen, Germany
- James B. Rowe
- Department of Clinical Neurosciences, MRC Cognition and Brain Sciences Unit and Cambridge University Hospitals NHS Trust, University of Cambridge, Cambridge, UK
- Henrik Zetterberg
- Department of Neurodegenerative Disease, UCL Institute of Neurology, Queen Square, London, UK
- Department of Psychiatry and Neurochemistry, Institute of Neuroscience and Physiology, the Sahlgrenska Academy at the University of Gothenburg, Mölndal, Sweden
- Clinical Neurochemistry Laboratory, Sahlgrenska University Hospital, Mölndal, Sweden
- UK Dementia Research Institute at UCL, London, UK
- Hong Kong Center for Neurodegenerative Diseases, Clear Water Bay, Hong Kong, People's Republic of China
7. Neto JP, Spitzner FP, Priesemann V. Sampling effects and measurement overlap can bias the inference of neuronal avalanches. PLoS Comput Biol 2022; 18:e1010678. PMID: 36445932; PMCID: PMC9733887; DOI: 10.1371/journal.pcbi.1010678.
Abstract
To date, it is still impossible to sample the entire mammalian brain with single-neuron precision. This forces one to either use spikes (focusing on few neurons) or to use coarse-sampled activity (averaging over many neurons, e.g. LFP). Naturally, the sampling technique impacts inference about collective properties. Here, we emulate both sampling techniques on a simple spiking model to quantify how they alter observed correlations and signatures of criticality. We describe a general effect: when the inter-electrode distance is small, electrodes sample overlapping regions in space, which increases the correlation between the signals. For coarse-sampled activity, this can produce power-law distributions even for non-critical systems. In contrast, spike recordings do not suffer this particular bias and underlying dynamics can be identified. This may resolve why coarse measures and spikes have produced contradicting results in the past.
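The overlap effect is easy to reproduce in a toy setting (our own minimal construction, not the paper's spiking model): two "electrodes" that average partially overlapping sets of statistically independent sources become correlated purely through the shared sources.

```python
import numpy as np

rng = np.random.default_rng(0)
sources = rng.standard_normal((100, 10_000))  # 100 independent "neurons"

def electrode_pair_corr(overlap):
    """Correlation between two electrodes that each average 50 sources
    while sharing `overlap` of them (toy measurement overlap)."""
    a = sources[:50].mean(axis=0)
    b = sources[50 - overlap:100 - overlap].mean(axis=0)
    return np.corrcoef(a, b)[0, 1]

# Closer electrodes (more shared sources) -> higher signal correlation,
# even though every underlying source is independent.
print(electrode_pair_corr(0), electrode_pair_corr(25))
```

With 25 of 50 sources shared, the expected correlation is about 0.5 despite there being no correlation at all between the sources themselves.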
Affiliation(s)
- Joao Pinheiro Neto
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- F. Paul Spitzner
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
- Georg-August University Göttingen, Göttingen, Germany
8. O'Byrne J, Jerbi K. How critical is brain criticality? Trends Neurosci 2022; 45:820-837. PMID: 36096888; DOI: 10.1016/j.tins.2022.08.007.
Abstract
Criticality is the singular state of complex systems poised at the brink of a phase transition between order and randomness. Such systems display remarkable information-processing capabilities, evoking the compelling hypothesis that the brain may itself be critical. This foundational idea is now drawing renewed interest thanks to high-density data and converging cross-disciplinary knowledge. Together, these lines of inquiry have shed light on the intimate link between criticality, computation, and cognition. Here, we review these emerging trends in criticality neuroscience, highlighting new data pertaining to the edge of chaos and near-criticality, and making a case for the distance to criticality as a useful metric for probing cognitive states and mental illness. This unfolding progress in the field contributes to establishing criticality theory as a powerful mechanistic framework for studying emergent function and its efficiency in both biological and artificial neural networks.
Affiliation(s)
- Jordan O'Byrne
- Cognitive and Computational Neuroscience Lab, Psychology Department, University of Montreal, Montreal, Quebec, Canada
- Karim Jerbi
- Cognitive and Computational Neuroscience Lab, Psychology Department, University of Montreal, Montreal, Quebec, Canada; MILA (Quebec Artificial Intelligence Institute), Montreal, Quebec, Canada; UNIQUE Center (Quebec Neuro-AI Research Center), Montreal, Quebec, Canada.
9. Suryadi, Cheng RK, Birkett E, Jesuthasan S, Chew LY. Dynamics and potential significance of spontaneous activity in the habenula. eNeuro 2022; 9:ENEURO.0287-21.2022. PMID: 35981869; PMCID: PMC9450562; DOI: 10.1523/ENEURO.0287-21.2022.
Abstract
The habenula is an evolutionarily conserved structure of the vertebrate brain that is essential for behavioural flexibility and mood control. It is spontaneously active and is able to access diverse states when the animal is exposed to sensory stimuli. Here we investigate the dynamics of habenula spontaneous activity, to gain insight into how sensitivity is optimized. Two-photon calcium imaging was performed in resting zebrafish larvae at single-cell resolution. An analysis of avalanches of inferred spikes suggests that the habenula is subcritical. Activity had low covariance and a small mean, arguing against dynamic criticality. A multiple regression estimator of autocorrelation time suggests that the habenula is neither fully asynchronous nor perfectly critical, but is reverberating. This pattern of dynamics may enable integration of information and high flexibility in the tuning of network properties, thus providing a potential mechanism for optimal responses to a changing environment.
Significance Statement: Spontaneous activity in neurons shapes the response to stimuli. One structure with a high level of spontaneous neuronal activity is the habenula, a regulator of broadly acting neuromodulators involved in mood and learning. How does this activity influence habenula function? We show here that the habenula of a resting animal is near criticality, in a state termed reverberation. This pattern of dynamics is consistent with high sensitivity and flexibility, and may enable the habenula to respond optimally to a wide range of stimuli.
Affiliation(s)
- Suryadi
- School of Physical & Mathematical Sciences, Nanyang Technological University, Singapore 637371
- Ruey-Kuang Cheng
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore 636921
- Elliot Birkett
- Institute of Molecular and Cell Biology, Singapore 138673
- School of Biosciences, University of Sheffield, Sheffield, United Kingdom
- Suresh Jesuthasan
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore 636921
- Institute of Molecular and Cell Biology, Singapore 138673
- Lock Yue Chew
- School of Physical & Mathematical Sciences, Nanyang Technological University, Singapore 637371
- Complexity Institute, Nanyang Technological University, Singapore 637335
10. Yu C. Toward a Unified Analysis of the Brain Criticality Hypothesis: Reviewing Several Available Tools. Front Neural Circuits 2022; 16:911245. PMID: 35669452; PMCID: PMC9164306; DOI: 10.3389/fncir.2022.911245.
Abstract
The study of the brain criticality hypothesis has been going on for about 20 years; various models and methods have been developed for probing this field, together with a large body of controversial experimental findings. However, no standardized protocol of analysis has been established so far. Therefore, hoping to contribute to the standardization of such analysis, we review in this paper several available tools used for estimating the criticality of the brain.
11. Knipper M, Mazurek B, van Dijk P, Schulze H. Too Blind to See the Elephant? Why Neuroscientists Ought to Be Interested in Tinnitus. J Assoc Res Otolaryngol 2021; 22:609-621. PMID: 34686939; PMCID: PMC8599745; DOI: 10.1007/s10162-021-00815-1.
Abstract
A curative therapy for tinnitus currently does not exist. One may actually exist but cannot currently be causally linked to tinnitus, owing to the lack of consistent concepts about the neural correlate of tinnitus. Depending on predictions, these concepts would require either a suppression or enhancement of brain activity, or an increase in inhibition or disinhibition. Although procedures with a potential to silence tinnitus may exist, the lack of a rationale for their curative success hampers the optimization of therapeutic protocols. We discuss here six candidate contributors to tinnitus that have been suggested by a variety of scientific experts in the field and that were addressed in a virtual panel discussion at the ARO round table in February 2021. In this discussion, several potential tinnitus contributors were considered: (i) inhibitory circuits, (ii) attention, (iii) stress, (iv) unidentified sub-entities, (v) maladaptive information transmission, and (vi) minor cochlear deafferentation. Finally, (vii) some potential therapeutic approaches were discussed. The results of this discussion are reflected here in view of potential blind spots that may still remain and that have been ignored in most of the tinnitus literature. We strongly suggest considering the high impact of connecting the controversial findings, to unravel the whole complexity of the tinnitus phenomenon; this is an essential prerequisite for establishing suitable therapeutic approaches.
Affiliation(s)
- Marlies Knipper
- Molecular Physiology of Hearing, Tübingen Hearing Research Centre (THRC), Department of Otolaryngology, Head & Neck Surgery, University of Tübingen, Elfriede-Aulhorn-Straße 5, 72076, Tübingen, Germany.
- Birgit Mazurek
- Tinnitus Center Charité, Universitätsmedizin Berlin, Berlin, Germany
- Pim van Dijk
- Department of Otorhinolaryngology/Head and Neck Surgery, University of Groningen, University Medical Center Groningen, Groningen, The Netherlands
- Graduate School of Medical Sciences (Research School of Behavioural and Cognitive Neurosciences), University of Groningen, Groningen, The Netherlands
- Holger Schulze
- Experimental Otolaryngology, Friedrich-Alexander Universität Erlangen-Nürnberg, Waldstrasse 1, 91054, Erlangen, Germany
12. Rudelt L, González Marx D, Wibral M, Priesemann V. Embedding optimization reveals long-lasting history dependence in neural spiking activity. PLoS Comput Biol 2021; 17:e1008927. PMID: 34061837; PMCID: PMC8205186; DOI: 10.1371/journal.pcbi.1008927.
Abstract
Information processing can leave distinct footprints on the statistics of neural spiking. For example, efficient coding minimizes the statistical dependencies on the spiking history, while temporal integration of information may require the maintenance of information over different timescales. To investigate these footprints, we developed a novel approach to quantify history dependence within the spiking of a single neuron, using the mutual information between the entire past and current spiking. This measure captures how much past information is necessary to predict current spiking. In contrast, classical time-lagged measures of temporal dependence like the autocorrelation capture how long potentially redundant past information can still be read out. Strikingly, we find for model neurons that our method disentangles the strength and timescale of history dependence, whereas the two are mixed in classical approaches. When applying the method to experimental data, which are necessarily of limited size, a reliable estimation of mutual information is only possible for a coarse temporal binning of past spiking, a so-called past embedding. To still account for the vastly different spiking statistics and potentially long history dependence of living neurons, we developed an embedding-optimization approach that varies not only the number and size of past bins, but also their exponential stretching. For extracellular spike recordings, we found that the strength and timescale of history dependence can indeed vary independently across experimental preparations: hippocampus showed strong and long history dependence, in visual cortex it was weak and short, and in vitro it was strong but short. This work enables an information-theoretic characterization of history dependence in recorded spike trains, which captures a footprint of information processing that is beyond time-lagged measures of temporal dependence. To facilitate the application of the method, we provide practical guidelines and a toolbox.
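The core quantity can be sketched with a plug-in estimate of the mutual information between a short past embedding and the current time bin (a bare-bones illustration without the authors' embedding optimization or bias correction; the toy neuron and all names are ours):

```python
import numpy as np
from collections import Counter

def history_dependence(spikes, depth):
    """Plug-in mutual information (bits) between the past `depth` bins
    and the current bin of a binary spike train (a naive past embedding)."""
    past = [tuple(spikes[t - depth:t]) for t in range(depth, len(spikes))]
    cur = spikes[depth:]
    n = len(cur)
    joint, pp, pc = Counter(zip(past, cur)), Counter(past), Counter(cur)
    mi = 0.0
    for (h, s), c in joint.items():
        mi += (c / n) * np.log2(c * n / (pp[h] * pc[s]))
    return mi

# Toy neuron with a strict refractory period: it never fires twice in a row,
# so the previous bin carries information about the current one.
rng = np.random.default_rng(0)
spikes = []
for _ in range(50_000):
    p = 0.0 if (spikes and spikes[-1] == 1) else 0.4
    spikes.append(int(rng.random() < p))
mi1 = history_dependence(spikes, 1)
print(mi1)
```

For this Markov toy neuron a one-bin past already captures the history dependence; the paper's contribution is making such estimates reliable for long, irregular histories in real recordings.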
Affiliation(s)
- Lucas Rudelt
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Michael Wibral
- Campus Institute for Dynamics of Biological Networks, University of Göttingen, Göttingen, Germany
- Viola Priesemann
- Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Göttingen, Germany
13. Spitzner FP, Dehning J, Wilting J, Hagemann A, Neto JP, Zierenberg J, Priesemann V. MR. Estimator, a toolbox to determine intrinsic timescales from subsampled spiking activity. PLoS One 2021; 16:e0249447. PMID: 33914774; PMCID: PMC8084202; DOI: 10.1371/journal.pone.0249447.
Abstract
Here we present our Python toolbox "MR. Estimator" to reliably estimate the intrinsic timescale from electrophysiological recordings of heavily subsampled systems. Originally intended for the analysis of time series of neuronal spiking activity, our toolbox is applicable to a wide range of systems where subsampling (the difficulty of observing the whole system in full detail) limits our capability to record. Applications range from epidemic spreading to any system that can be represented by an autoregressive process. In the context of neuroscience, the intrinsic timescale can be thought of as the duration over which any perturbation reverberates within the network; it has been used as a key observable to investigate a functional hierarchy across the primate cortex and serves as a measure of working memory. It is also a proxy for the distance to criticality and quantifies a system's dynamic working point.
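The multistep-regression idea behind the toolbox can be illustrated without the package itself (a conceptual sketch; additive observation noise stands in for subsampling, and all names are ours): the one-step regression slope of the observed signal is strongly biased, but the geometric decay of the slopes across lags still recovers the underlying autoregressive parameter m, and with it the intrinsic timescale tau = -1/ln(m).

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 0.95, 100_000
eta = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):                   # AR(1) dynamics with parameter m
    x[t] = m * x[t - 1] + eta[t]
obs = x + 5.0 * rng.standard_normal(n)  # heavy observation noise ("subsampling")

def slope(a, b):
    return np.cov(a, b)[0, 1] / np.var(a)

m_naive = slope(obs[:-1], obs[1:])      # one-step regression: biased low
lags = np.arange(1, 21)
r = np.array([slope(obs[:-k], obs[k:]) for k in lags])
m_mr = np.exp(np.polyfit(lags, np.log(r), 1)[0])  # r_k ∝ m^k -> recover m
tau = -1.0 / np.log(m_mr)
print(m_naive, m_mr, tau)
```

The bias only rescales the lag-dependent slopes by a constant factor, which is why fitting their exponential decay across lags, rather than using a single lag, recovers m.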
Affiliation(s)
- F. P. Spitzner
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- J. Dehning
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- J. Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- A. Hagemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- J. P. Neto
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- J. Zierenberg
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- V. Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience (BCCN) Göttingen, Göttingen, Germany

14
Gross T. Not One, but Many Critical States: A Dynamical Systems Perspective. Front Neural Circuits 2021; 15:614268. [PMID: 33737868 PMCID: PMC7960911 DOI: 10.3389/fncir.2021.614268] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2020] [Accepted: 02/05/2021] [Indexed: 02/01/2023] Open
Abstract
The past decade has seen growing support for the critical brain hypothesis, i.e., the possibility that the brain could operate at or very near a critical state between two different dynamical regimes. Such critical states are well studied in different disciplines, so there is potential for a continued transfer of knowledge. Here, I revisit the foundations of bifurcation theory, the mathematical theory of transitions. While the mathematics is well known, its transfer to neural dynamics leads to new insights and hypotheses.
Affiliation(s)
- Thilo Gross
- Helmholtz Institute for Functional Marine Biodiversity (HIFMB), Oldenburg, Germany
- Institute for Chemistry and Biology of the Marine Environment (ICBM), Carl-von-Ossietzky Universität Oldenburg, Oldenburg, Germany
- Helmholtz Center for Marine and Polar Research, Alfred-Wegener-Institute, Bremerhaven, Germany

15
Hagemann A, Wilting J, Samimizad B, Mormann F, Priesemann V. Assessing criticality in pre-seizure single-neuron activity of human epileptic cortex. PLoS Comput Biol 2021; 17:e1008773. [PMID: 33684101 PMCID: PMC7971851 DOI: 10.1371/journal.pcbi.1008773] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2020] [Revised: 03/18/2021] [Accepted: 02/04/2021] [Indexed: 11/18/2022] Open
Abstract
Epileptic seizures are characterized by abnormal and excessive neural activity, where cortical network dynamics seem to become unstable. However, most of the time, during seizure-free periods, the cortex of epilepsy patients shows perfectly stable dynamics. This raises the question of how recurring instability can arise in light of this stable default state. In this work, we examine two potential scenarios of seizure generation: (i) epileptic cortical areas might generally operate closer to instability, which would make epilepsy patients generally more susceptible to seizures, or (ii) epileptic cortical areas might drift systematically towards instability before seizure onset. We analyzed single-unit spike recordings from both the epileptogenic (focal) and the nonfocal cortical hemispheres of 20 epilepsy patients. We quantified the distance to instability in the framework of criticality, using a novel estimator that enables unbiased inference from a small set of recorded neurons. Surprisingly, we found no evidence for either scenario: neither did focal areas generally operate closer to instability, nor were seizures preceded by a drift towards instability. In fact, our results from both pre-seizure and seizure-free intervals suggest that despite epilepsy, human cortex operates in the stable, slightly subcritical regime, just like the cortex of other healthy mammals.
Affiliation(s)
- Annika Hagemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Jens Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bita Samimizad
- Department of Epileptology, University of Bonn Medical Centre, Bonn, Germany
- Florian Mormann
- Department of Epileptology, University of Bonn Medical Centre, Bonn, Germany
- Viola Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein Center for Computational Neuroscience (BCCN) Göttingen, Germany

16
Heiney K, Huse Ramstad O, Fiskum V, Christiansen N, Sandvig A, Nichele S, Sandvig I. Criticality, Connectivity, and Neural Disorder: A Multifaceted Approach to Neural Computation. Front Comput Neurosci 2021; 15:611183. [PMID: 33643017 PMCID: PMC7902700 DOI: 10.3389/fncom.2021.611183] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2020] [Accepted: 01/18/2021] [Indexed: 01/03/2023] Open
Abstract
It has been hypothesized that the brain optimizes its capacity for computation by self-organizing to a critical point. The dynamical state of criticality is achieved by striking a balance such that activity can effectively spread through the network without overwhelming it and is commonly identified in neuronal networks by observing the behavior of cascades of network activity termed "neuronal avalanches." The dynamic activity that occurs in neuronal networks is closely intertwined with how the elements of the network are connected and how they influence each other's functional activity. In this review, we highlight how studying criticality with a broad perspective that integrates concepts from physics, experimental and theoretical neuroscience, and computer science can provide a greater understanding of the mechanisms that drive networks to criticality and how their disruption may manifest in different disorders. First, integrating graph theory into experimental studies on criticality, as is becoming more common in theoretical and modeling studies, would provide insight into the kinds of network structures that support criticality in networks of biological neurons. Furthermore, plasticity mechanisms play a crucial role in shaping these neural structures, both in terms of homeostatic maintenance and learning. Both network structures and plasticity have been studied fairly extensively in theoretical models, but much work remains to bridge the gap between theoretical and experimental findings. Finally, information theoretical approaches can tie in more concrete evidence of a network's computational capabilities. Approaching neural dynamics with all these facets in mind has the potential to provide a greater understanding of what goes wrong in neural disorders. Criticality analysis therefore holds potential to identify disruptions to healthy dynamics, granted that robust methods and approaches are considered.
Affiliation(s)
- Kristine Heiney
- Department of Computer Science, Oslo Metropolitan University, Oslo, Norway
- Department of Computer Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Ola Huse Ramstad
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Vegard Fiskum
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Nicholas Christiansen
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Axel Sandvig
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Department of Clinical Neuroscience, Umeå University Hospital, Umeå, Sweden
- Department of Neurology, St. Olav's Hospital, Trondheim, Norway
- Stefano Nichele
- Department of Computer Science, Oslo Metropolitan University, Oslo, Norway
- Department of Holistic Systems, Simula Metropolitan, Oslo, Norway
- Ioanna Sandvig
- Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway

17
Li J, Shew WL. Tuning network dynamics from criticality to an asynchronous state. PLoS Comput Biol 2020; 16:e1008268. [PMID: 32986705 PMCID: PMC7544040 DOI: 10.1371/journal.pcbi.1008268] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2020] [Revised: 10/08/2020] [Accepted: 08/17/2020] [Indexed: 01/03/2023] Open
Abstract
According to many experimental observations, neurons in cerebral cortex tend to operate in an asynchronous regime, firing independently of each other. In contrast, many other experimental observations reveal cortical population firing dynamics that are relatively coordinated and occasionally synchronous. These discrepant observations have naturally led to competing hypotheses. A commonly hypothesized explanation of asynchronous firing is that excitatory and inhibitory synaptic inputs are precisely correlated, nearly canceling each other, sometimes referred to as 'balanced' excitation and inhibition. On the other hand, the 'criticality' hypothesis posits an explanation of the more coordinated state that also requires a certain balance of excitatory and inhibitory interactions. Both hypotheses claim the same qualitative mechanism: properly balanced excitation and inhibition. Thus, a natural question arises: how are asynchronous population dynamics and critical dynamics related, and how do they differ? Here we propose an answer to this question based on investigation of a simple, network-level computational model. We show that the strength of inhibitory synapses relative to excitatory synapses can be tuned from weak to strong to generate a family of models that spans a continuum from critical dynamics to asynchronous dynamics. Our results demonstrate that the coordinated dynamics of criticality and asynchronous dynamics can be generated by the same neural system if excitatory and inhibitory synapses are tuned appropriately.
Affiliation(s)
- Jingwen Li
- Department of Physics, University of Arkansas, Fayetteville, Arkansas, United States of America
- Woodrow L. Shew
- Department of Physics, University of Arkansas, Fayetteville, Arkansas, United States of America

18
Wilting J, Priesemann V. Between Perfectly Critical and Fully Irregular: A Reverberating Model Captures and Predicts Cortical Spike Propagation. Cereb Cortex 2020; 29:2759-2770. [PMID: 31008508 PMCID: PMC6519697 DOI: 10.1093/cercor/bhz049] [Citation(s) in RCA: 35] [Impact Index Per Article: 8.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2018] [Revised: 01/20/2019] [Indexed: 12/11/2022] Open
Abstract
Knowledge about the collective dynamics of cortical spiking is very informative about the underlying coding principles. However, even the most basic properties are not known with certainty, because their assessment is hampered by spatial subsampling, i.e., the limitation that only a tiny fraction of all neurons can be recorded simultaneously with millisecond precision. Building on a novel, subsampling-invariant estimator, we fit and carefully validate a minimal model for cortical spike propagation. The model interpolates between two prominent states: asynchronous and critical. We find neither of them in cortical spike recordings across various species, but instead identify a narrow "reverberating" regime. This approach enables us to predict yet unknown properties from very short recordings and for every circuit individually, including responses to minimal perturbations, intrinsic network timescales, and the strength of external input compared to recurrent activation, thereby informing about the underlying coding principles for each circuit, area, state, and task.
Affiliation(s)
- J Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, Göttingen, Germany
- V Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience, Göttingen, Germany

19
Time-dependent branching processes: a model of oscillating neuronal avalanches. Sci Rep 2020; 10:13678. [PMID: 32792658 PMCID: PMC7426838 DOI: 10.1038/s41598-020-69705-5] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2020] [Accepted: 07/15/2020] [Indexed: 11/08/2022] Open
Abstract
Recently, neuronal avalanches have been observed to display oscillations, a phenomenon regarded as the co-existence of a scale-free behaviour (the avalanches close to criticality) and scale-dependent dynamics (the oscillations). Ordinary continuous-time branching processes with constant extinction and branching rates are commonly used as models of neuronal activity, yet they lack any such time-dependence. In the present work, we extend a basic branching process by allowing the extinction rate to oscillate in time as a new model to describe cortical dynamics. By means of a perturbative field theory, we derive relevant observables in closed form. We support our findings by quantitative comparison to numerics and qualitative comparison to available experimental results.
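The model extension described in this abstract can be sketched as a simulation, complementing the paper's analytical field-theoretic treatment: a continuous-time branching process in which each particle branches at a constant rate but dies at a rate that oscillates in time. The sinusoidal rate form and all parameter values below are illustrative assumptions, not the paper's; events are simulated by thinning against the maximal total rate.

```python
import math
import random

def simulate(t_max, lam=1.0, mu0=1.05, eps=0.3, omega=2 * math.pi,
             n0=10, seed=1):
    """Continuous-time branching process with oscillating extinction
    rate mu(t) = mu0 * (1 + eps * sin(omega * t)): draw candidate event
    times at the maximal per-particle rate, then accept branching or
    extinction with the correct time-dependent odds (thinning)."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    trace = [(t, n)]
    mu_max = mu0 * (1 + abs(eps))
    while t < t_max and n > 0:
        t += rng.expovariate(n * (lam + mu_max))
        mu_t = mu0 * (1 + eps * math.sin(omega * t))
        u = rng.random() * (lam + mu_max)
        if u < lam:
            n += 1          # branching: one particle splits in two
        elif u < lam + mu_t:
            n -= 1          # extinction at the time-dependent rate
        # otherwise the candidate event is thinned (nothing happens)
        trace.append((t, n))
    return trace
```

With mu0 slightly above lam the process is subcritical on average, so avalanches started from a small seed population die out while their size fluctuates with the oscillation phase.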
20
Cramer B, Stöckel D, Kreft M, Wibral M, Schemmel J, Meier K, Priesemann V. Control of criticality and computation in spiking neuromorphic networks with plasticity. Nat Commun 2020; 11:2853. [PMID: 32503982 PMCID: PMC7275091 DOI: 10.1038/s41467-020-16548-3] [Citation(s) in RCA: 36] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/21/2019] [Accepted: 04/23/2020] [Indexed: 11/08/2022] Open
Abstract
The critical state is assumed to be optimal for any computation in recurrent neural networks, because criticality maximizes a number of abstract computational properties. We challenge this assumption by evaluating the performance of a spiking recurrent neural network on a set of tasks of varying complexity at and away from critical network dynamics. To that end, we developed a plastic spiking network on a neuromorphic chip. We show that the distance to criticality can be easily adapted by changing the input strength, and then demonstrate a clear relation between criticality, task performance, and information-theoretic fingerprint. Whereas the information-theoretic measures all show that network capacity is maximal at criticality, only the complex tasks profit from it, while simple tasks suffer. We thereby challenge the general assumption that criticality is beneficial for any task, and instead provide an understanding of how the collective network state should be tuned to task requirements.
Affiliation(s)
- Benjamin Cramer
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120 Heidelberg, Germany
- David Stöckel
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120 Heidelberg, Germany
- Markus Kreft
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120 Heidelberg, Germany
- Michael Wibral
- Campus Institute for Dynamics of Biological Networks, Georg-August University, Hermann-Rein-Straße 3, 37075 Göttingen, Germany
- Johannes Schemmel
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120 Heidelberg, Germany
- Karlheinz Meier
- Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, 69120 Heidelberg, Germany
- Viola Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, 37077 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, Georg-August University, Am Faßberg 17, 37077 Göttingen, Germany
- Department of Physics, Georg-August University, Friedrich-Hund-Platz 1, 37077 Göttingen, Germany

21
Wilting J, Priesemann V. 25 years of criticality in neuroscience - established results, open controversies, novel concepts. Curr Opin Neurobiol 2019; 58:105-111. [PMID: 31546053 DOI: 10.1016/j.conb.2019.08.002] [Citation(s) in RCA: 66] [Impact Index Per Article: 13.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2019] [Accepted: 08/25/2019] [Indexed: 12/19/2022]
Abstract
Twenty-five years ago, Dunkelmann and Radons (1994) showed that neural networks can self-organize to a critical state. In models, the critical state offers a number of computational advantages. Thus, this hypothesis, and in particular the experimental work by Beggs and Plenz (2003), has triggered an avalanche of research, with thousands of studies referring to it. Nonetheless, experimental results are still contradictory. How is it possible that a hypothesis has attracted active research for decades but nonetheless remains controversial? We discuss the experimental and conceptual controversy, and then present a parsimonious solution that (i) unifies the contradictory experimental results, (ii) avoids disadvantages of a critical state, and (iii) enables rapid, adaptive tuning of network properties to task requirements.
Affiliation(s)
- J Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- V Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience, Göttingen, Germany

22
Nolte M, Reimann MW, King JG, Markram H, Muller EB. Cortical reliability amid noise and chaos. Nat Commun 2019; 10:3792. [PMID: 31439838 PMCID: PMC6706377 DOI: 10.1038/s41467-019-11633-8] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2018] [Accepted: 07/23/2019] [Indexed: 02/01/2023] Open
Abstract
Typical responses of cortical neurons to identical sensory stimuli appear highly variable. It has thus been proposed that the cortex primarily uses a rate code. However, other studies have argued for spike-time coding under certain conditions. The potential role of spike-time coding is directly limited by the internally generated variability of cortical circuits, which remains largely unexplored. Here, we quantify this internally generated variability using a biophysical model of rat neocortical microcircuitry with biologically realistic noise sources. We find that stochastic neurotransmitter release is a critical component of internally generated variability, causing rapidly diverging, chaotic recurrent network dynamics. Surprisingly, the same nonlinear recurrent network dynamics can transiently overcome the chaos in response to weak feed-forward thalamocortical inputs, and support reliable spike times with millisecond precision. Our model shows that the noisy and chaotic network dynamics of recurrent cortical microcircuitry are compatible with stimulus-evoked, millisecond spike-time reliability, resolving a long-standing debate.
Affiliation(s)
- Max Nolte
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Michael W Reimann
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- James G King
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Henry Markram
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland
- Laboratory of Neural Microcircuitry, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
- Eilif B Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, 1202 Geneva, Switzerland

23
Krauss P, Schuster M, Dietrich V, Schilling A, Schulze H, Metzner C. Weight statistics controls dynamics in recurrent neural networks. PLoS One 2019; 14:e0214541. [PMID: 30964879 PMCID: PMC6456246 DOI: 10.1371/journal.pone.0214541] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/03/2018] [Accepted: 03/14/2019] [Indexed: 11/19/2022] Open
Abstract
Recurrent neural networks are complex non-linear systems, capable of ongoing activity in the absence of driving inputs. The dynamical properties of these systems, in particular their long-time attractor states, are determined on the microscopic level by the connection strengths wij between the individual neurons. However, little is known about the extent to which network dynamics is tunable on a more coarse-grained level by the statistical features of the weight matrix. In this work, we investigate the dynamics of recurrent networks of Boltzmann neurons. In particular, we study the impact of three statistical parameters: density (the fraction of non-zero connections), balance (the ratio of excitatory to inhibitory connections), and symmetry (the fraction of neuron pairs with wij = wji). By computing a 'phase diagram' of network dynamics, we find that balance is the essential control parameter: its gradual increase from negative to positive values drives the system from oscillatory behavior into a chaotic regime, and eventually into stationary fixed points. Only directly at the border of the chaotic regime do the neural networks display rich but regular dynamics, thus enabling actual information processing. These results suggest that the brain, too, is fine-tuned to the 'edge of chaos' by ensuring a proper balance between excitatory and inhibitory neural connections.
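The setup described in this abstract can be sketched in a few lines. This is an illustrative simplification, not the authors' code: balance is implemented here as the fraction of non-zero weights that are excitatory (the paper parameterizes it from negative to positive values), and the dynamics are a parallel stochastic update of binary units.

```python
import numpy as np

def random_weights(n, density, balance, rng):
    """Random weight matrix: `density` is the fraction of non-zero
    connections; `balance` is taken here as the fraction of non-zero
    weights that are excitatory (positive) -- a simplifying assumption."""
    mask = rng.random((n, n)) < density
    signs = np.where(rng.random((n, n)) < balance, 1.0, -1.0)
    w = np.where(mask, signs * rng.random((n, n)), 0.0)
    np.fill_diagonal(w, 0.0)  # no self-coupling
    return w

def run_boltzmann(w, steps=300, beta=2.0, seed=0):
    """Parallel stochastic update of binary units: each unit turns on
    with probability sigmoid(beta * input). Returns mean activity per step."""
    rng = np.random.default_rng(seed)
    n = w.shape[0]
    s = rng.integers(0, 2, n).astype(float)
    rates = []
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-beta * (w @ s)))
        s = (rng.random(n) < p).astype(float)
        rates.append(s.mean())
    return np.array(rates)
```

Sweeping `balance` (and `density`, `symmetry`) over a grid and classifying the resulting activity traces is the kind of procedure that would produce the 'phase diagram' the abstract describes; here a predominantly excitatory network saturates at high activity while a predominantly inhibitory one settles at low activity.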
Affiliation(s)
- Patrick Krauss
- Cognitive Computational Neuroscience Group at the Chair of English Philology and Linguistics, Department of English and American Studies, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Experimental Otolaryngology, Neuroscience Group, University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Marc Schuster
- Experimental Otolaryngology, Neuroscience Group, University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Verena Dietrich
- Experimental Otolaryngology, Neuroscience Group, University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Achim Schilling
- Cognitive Computational Neuroscience Group at the Chair of English Philology and Linguistics, Department of English and American Studies, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Experimental Otolaryngology, Neuroscience Group, University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Holger Schulze
- Experimental Otolaryngology, Neuroscience Group, University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Claus Metzner
- Experimental Otolaryngology, Neuroscience Group, University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany
- Biophysics Group, Department of Physics, Friedrich-Alexander University Erlangen-Nürnberg (FAU), Erlangen, Germany

24
Abstract
The dynamics of complex systems generally include high-dimensional, nonstationary, and nonlinear behavior, all of which pose fundamental challenges to quantitative understanding. To address these difficulties, we detail an approach based on local linear models within windows determined adaptively from data. While the dynamics within each window are simple, consisting of exponential decay, growth, and oscillations, the collection of local parameters across all windows provides a principled characterization of the full time series. To explore the resulting model space, we develop a likelihood-based hierarchical clustering, and we examine the eigenvalues of the linear dynamics. We demonstrate our analysis with the Lorenz system undergoing stable spiral dynamics and in the standard chaotic regime. Applied to the posture dynamics of the nematode Caenorhabditis elegans, our approach identifies fine-grained behavioral states and model dynamics which fluctuate about an instability boundary, and we detail a bifurcation in a transition from forward to backward crawling. We analyze whole-brain imaging in C. elegans and show that global brain dynamics is damped away from the instability boundary by a decrease in oxygen concentration. We provide additional evidence for such near-critical dynamics from the analysis of electrocorticography in monkeys and the imaging of a neural population from mouse visual cortex at single-cell resolution.
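The windowed local-linear idea can be sketched compactly. This is a minimal illustration rather than the authors' method: it uses fixed-size windows instead of the adaptively determined ones described in the abstract, fitting x[t+1] ≈ A x[t] + b by least squares in each window and collecting the eigenvalues of each local A, whose magnitudes below or above 1 indicate decay or growth and whose complex pairs indicate oscillation.

```python
import numpy as np

def local_linear_eigs(x, window=100, step=50):
    """Fit x[t+1] ~= A x[t] + b by least squares inside each sliding
    window and return the eigenvalues of every local A.  x has shape
    (time, dimensions)."""
    x = np.asarray(x, dtype=float)
    d = x.shape[1]
    eigs = []
    for start in range(0, len(x) - window, step):
        seg = x[start:start + window]
        X = np.hstack([seg[:-1], np.ones((window - 1, 1))])  # affine term b
        Y = seg[1:]
        coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
        A = coef[:d].T  # local linear dynamics matrix
        eigs.append(np.linalg.eigvals(A))
    return np.array(eigs)
```

On data generated by a damped noisy rotation, the local eigenvalue magnitudes cluster near the true damping factor, which is the sense in which the eigenvalue collection characterizes distance to the instability boundary.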
Affiliation(s)
- Antonio C Costa
- Department of Physics and Astronomy, Vrije Universiteit Amsterdam, 1081HV Amsterdam, The Netherlands
- Tosif Ahamed
- Biological Physics Theory Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa 904-0495, Japan
- Greg J Stephens
- Department of Physics and Astronomy, Vrije Universiteit Amsterdam, 1081HV Amsterdam, The Netherlands
- Biological Physics Theory Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa 904-0495, Japan