1. Terada Y, Toyoizumi T. Chaotic neural dynamics facilitate probabilistic computations through sampling. Proc Natl Acad Sci U S A 2024;121:e2312992121. PMID: 38648479; PMCID: PMC11067032; DOI: 10.1073/pnas.2312992121.
Abstract
Cortical neurons exhibit highly variable responses over trials and time. Theoretical work posits that this variability may arise from chaotic dynamics in recurrently connected networks. Here, we demonstrate that chaotic neural dynamics, formed through synaptic learning, allow networks to perform sensory cue integration in a sampling-based implementation. We show that the emergent chaotic dynamics provide neural substrates for generating samples not only of a static variable but also of a dynamical trajectory, and that generic recurrent networks acquire these abilities through trial and error with a biologically plausible learning rule. Furthermore, the networks generalize their experience of stimulus-evoked sampling to inference when part or all of the sensory information is missing, which suggests a computational role for spontaneous activity as a representation of priors, as well as a tractable biological computation of marginal distributions. These findings suggest that chaotic neural dynamics may serve brain function as a Bayesian generative model.
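As a hedged illustration of the kind of dynamics the paper builds on, the following minimal Python sketch simulates a classic random rate network in the chaotic regime and treats its fluctuating readout as a stream of samples. The network size, gain, and readout are illustrative assumptions, not the paper's learned networks:

import numpy as np

# Minimal sketch (assumed parameters): a random rate network in the chaotic
# regime (gain g > 1) whose fluctuating readout is treated as a sample stream.
rng = np.random.default_rng(0)
N, g, dt, T, burn = 500, 1.5, 0.05, 20000, 4000
J = rng.normal(0.0, g / np.sqrt(N), (N, N))   # random couplings, variance g^2/N
w = rng.normal(0.0, 1.0 / np.sqrt(N), N)      # fixed random readout
x = rng.normal(0.0, 1.0, N)

samples = []
for t in range(T):
    x += dt * (-x + J @ np.tanh(x))           # standard rate dynamics
    if t >= burn:
        samples.append(w @ np.tanh(x))        # chaotic fluctuations of the readout

samples = np.array(samples)
print(f"readout mean {samples.mean():+.3f}, std {samples.std():.3f}")
# The empirical histogram of `samples` plays the role of the sampled
# distribution; in the paper, learning shapes J so that it matches a target.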
Affiliation(s)
- Yu Terada: Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama 351-0198, Japan; Department of Neurobiology, University of California, San Diego, La Jolla, CA 92093; The Institute for Physics of Intelligence, The University of Tokyo, Tokyo 113-0033, Japan
- Taro Toyoizumi: Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Saitama 351-0198, Japan; Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8656, Japan
2. Metzner C, Yamakou ME, Voelkl D, Schilling A, Krauss P. Quantifying and Maximizing the Information Flux in Recurrent Neural Networks. Neural Comput 2024;36:351-384. PMID: 38363658; DOI: 10.1162/neco_a_01651.
Abstract
Free-running recurrent neural networks (RNNs), especially probabilistic models, generate an ongoing information flux that can be quantified with the mutual information I[x(t), x(t+1)] between subsequent system state vectors x(t). Although previous studies have shown that I depends on the statistics of the network's connection weights, it is unclear how to maximize I systematically and how to quantify the flux in large systems where computing the mutual information becomes intractable. Here, we address these questions using Boltzmann machines as model systems. We find that in networks with moderately strong connections, the mutual information I is approximately a monotonic transformation of the root-mean-square averaged Pearson correlations between neuron pairs, a quantity that can be efficiently computed even in large systems. Furthermore, evolutionary maximization of I[x(t), x(t+1)] reveals a general design principle for the weight matrices, enabling the systematic construction of systems with a high spontaneous information flux. Finally, we simultaneously maximize information flux and the mean period length of cyclic attractors in the state space of these dynamical networks. Our results are potentially useful for the construction of RNNs that serve as short-time memories or pattern generators.
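The proxy relationship can be probed on a toy system. The sketch below uses a Glauber-style stochastic binary network standing in for a Boltzmann machine (assumed parameters); for N = 4 the mutual information I[x(t), x(t+1)] can be computed exactly from state counts and compared with the RMS of time-lagged pairwise Pearson correlations:

import numpy as np

# Assumed toy system: a small stochastic binary network. We tabulate the joint
# distribution of consecutive states exactly and compare the resulting mutual
# information with the RMS lagged pairwise correlation.
rng = np.random.default_rng(1)
N, T = 4, 200000
W = rng.normal(0.0, 0.5, (N, N))

x = rng.integers(0, 2, N)
states = np.empty((T, N), dtype=int)
for t in range(T):
    p = 1.0 / (1.0 + np.exp(-2.0 * (W @ (2 * x - 1))))   # spin-style local fields
    x = (rng.random(N) < p).astype(int)
    states[t] = x

idx = states @ (1 << np.arange(N))                # encode each state as an integer
joint = np.zeros((2**N, 2**N))
np.add.at(joint, (idx[:-1], idx[1:]), 1.0)
joint /= joint.sum()
px, py = joint.sum(1), joint.sum(0)
nz = joint > 0
I = np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz]))

A = states[:-1] - states[:-1].mean(0)
B = states[1:] - states[1:].mean(0)
C = (A.T @ B) / (len(A) * np.outer(A.std(0), B.std(0)) + 1e-12)
print(f"I = {I:.3f} bits, RMS lagged correlation = {np.sqrt((C**2).mean()):.3f}")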
Affiliation(s)
- Claus Metzner: Neuroscience Lab, University Hospital Erlangen, 91054 Erlangen, Germany; Biophysics Lab, Friedrich-Alexander University Erlangen-Nuremberg, 91054 Erlangen, Germany
- Marius E Yamakou: Department of Data Science, Friedrich-Alexander University Erlangen-Nuremberg, 91054 Erlangen, Germany
- Dennis Voelkl: Neuroscience Lab, University Hospital Erlangen, 91054 Erlangen, Germany
- Achim Schilling: Neuroscience Lab, University Hospital Erlangen, 91054 Erlangen, Germany; Cognitive Computational Neuroscience Group, Friedrich-Alexander University Erlangen-Nuremberg, 91054 Erlangen, Germany
- Patrick Krauss: Neuroscience Lab, University Hospital Erlangen, 91054 Erlangen, Germany; Cognitive Computational Neuroscience Group and Pattern Recognition Lab, Friedrich-Alexander University Erlangen-Nuremberg, 91054 Erlangen, Germany
3. Clark DG, Abbott LF, Litwin-Kumar A. Dimension of Activity in Random Neural Networks. Phys Rev Lett 2023;131:118401. PMID: 37774280; DOI: 10.1103/physrevlett.131.118401.
Abstract
Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many connected units. Understanding how biological and machine-learning networks function and learn requires knowledge of the structure of this coordinated activity, information contained, for example, in cross covariances between units. Self-consistent dynamical mean field theory (DMFT) has elucidated several features of random neural networks, in particular that they can generate chaotic activity; however, a calculation of cross covariances using this approach has not been provided. Here, we calculate cross covariances self-consistently via a two-site cavity DMFT. We use this theory to probe spatiotemporal features of activity coordination in a classic random-network model with independent and identically distributed (i.i.d.) couplings, showing an extensive but fractionally low effective dimension of activity and a long population-level timescale. Our formulas apply to a wide range of single-unit dynamics and generalize to non-i.i.d. couplings. As an example of the latter, we analyze the case of partially symmetric couplings.
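The effective dimension of activity can be estimated from simulation with the participation ratio of the activity covariance, D = (tr C)^2 / tr(C^2). A minimal sketch for the classic i.i.d. model (all parameters are illustrative):

import numpy as np

# Sketch (assumed parameters): simulate the i.i.d. random rate model and
# estimate the effective dimension of its chaotic activity with the
# participation ratio of the covariance matrix.
rng = np.random.default_rng(2)
N, g, dt, T, burn = 500, 2.0, 0.05, 10000, 2000
J = rng.normal(0.0, g / np.sqrt(N), (N, N))
x = rng.normal(0.0, 1.0, N)
traj = np.empty((T - burn, N))
for t in range(T):
    x += dt * (-x + J @ np.tanh(x))
    if t >= burn:
        traj[t - burn] = x

C = np.cov(traj.T)                                # unit-by-unit covariance
D = np.trace(C) ** 2 / np.trace(C @ C)            # participation ratio
print(f"participation-ratio dimension: {D:.1f} of N = {N} ({100 * D / N:.1f}%)")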
Affiliation(s)
- David G Clark: Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
- L F Abbott: Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
- Ashok Litwin-Kumar: Zuckerman Institute, Department of Neuroscience, Columbia University, New York, New York 10027, USA
4. Haruna T, Shirakawa T. Noise-induced bistability in a simple mutual inhibition system. Phys Rev E 2023;108:024108. PMID: 37723728; DOI: 10.1103/physreve.108.024108.
Abstract
We study noise-induced bistability in a simple bivariate mutual inhibition system whose response to external signals fluctuates slowly. We give a general condition under which the marginal stationary probability density of one of the two variables undergoes a transition from a unimodal to a bimodal shape. We show that the transition occurs even when the stationary probability density of the response to external signals is monotone. The mechanism of the transition is investigated through a calculation of the mean first passage time. We also discuss the genericity of the transition mechanism.
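A qualitative toy of the effect, a generic noisy toggle with a slow Ornstein-Uhlenbeck drive rather than the paper's exact equations:

import numpy as np

# Generic noisy toggle (assumed form and parameters): two mutually inhibiting
# variables driven by a slow Ornstein-Uhlenbeck signal s(t) whose stationary
# density is unimodal. The marginal of x can nevertheless be bimodal.
rng = np.random.default_rng(3)
dt, T = 0.02, 500000
tau_s, s_mean, s_sd = 200.0, 2.0, 0.6
x, y, s = 1.0, 1.0, s_mean
xs = np.empty(T)
for t in range(T):
    x += dt * (s / (1 + y**4) - x) + 0.05 * np.sqrt(dt) * rng.normal()
    y += dt * (s / (1 + x**4) - y) + 0.05 * np.sqrt(dt) * rng.normal()
    s += dt * (s_mean - s) / tau_s + s_sd * np.sqrt(2 * dt / tau_s) * rng.normal()
    xs[t] = x

hist, _ = np.histogram(xs[T // 10:], bins=60, density=True)
smooth = np.convolve(hist, np.ones(7) / 7, mode="same")   # crude smoothing
peaks = [i for i in range(1, 59)
         if smooth[i] > smooth[i - 1] and smooth[i] > smooth[i + 1]
         and smooth[i] > 0.1 * smooth.max()]
print(f"marginal density of x: {len(peaks)} pronounced local maxima (2 = bimodal)")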
Affiliation(s)
- Taichi Haruna: Department of Information and Sciences, School of Arts and Sciences, Tokyo Woman's Christian University, 2-6-1 Zempukuji, Suginami-ku, Tokyo 167-8585, Japan
- Tomohiro Shirakawa: Department of Information and Management Systems Engineering, Graduate School of Engineering, Nagaoka University of Technology, 1603-1 Kamitomioka, Nagaoka, Niigata 940-2188, Japan
5. Haruna J, Toshio R, Nakano N. Path integral approach to universal dynamics of reservoir computers. Phys Rev E 2023;107:034306. PMID: 37073052; DOI: 10.1103/physreve.107.034306.
Abstract
In this work, we characterize the reservoir computer (RC) by its network structure, especially the probability distribution of its random coupling constants. First, based on the path integral method, we clarify the universal behavior of random network dynamics in the thermodynamic limit, which depends only on the asymptotic behavior of the second cumulant generating functions of the network coupling constants. This result enables us to classify random networks into several universality classes, according to the distribution function of the coupling constants chosen for the networks. Interestingly, this classification turns out to have a close relationship with the distribution of eigenvalues of the random coupling matrix. We also comment on the relation between our theory and some practical choices of random connectivity in the RC. Subsequently, we investigate the relationship between the RC's computational power and the network parameters for several universality classes. We perform numerical simulations to evaluate the phase diagrams of the steady reservoir states, common-signal-induced synchronization, and the computational power in chaotic time-series inference tasks. As a result, we clarify the close relationship between these quantities, in particular a remarkable computational performance near the phase transitions, which is realized even near a nonchaotic transition boundary. These results may provide a new perspective on design principles for the RC.
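The link between universality classes and the eigenvalue spectrum can be seen numerically: couplings with matched second cumulants (Gaussian versus sparse binary below) produce essentially the same spectral radius, while a heavy-tailed distribution leaves the class. A sketch with assumed sizes and gains:

import numpy as np

# Assumed sizes and gains. Gaussian and sparse binary couplings share the
# second cumulant g^2/N and land on the same circular-law spectral radius;
# Cauchy couplings (no finite variance) fall outside this universality class.
rng = np.random.default_rng(4)
N, g, p = 1000, 1.2, 0.1

J_gauss = rng.normal(0.0, g / np.sqrt(N), (N, N))
mask = rng.random((N, N)) < p
J_sparse = mask * rng.choice([-1.0, 1.0], (N, N)) * g / np.sqrt(p * N)
J_cauchy = rng.standard_cauchy((N, N)) * g / np.sqrt(N)

for name, J in [("gaussian", J_gauss), ("sparse", J_sparse), ("cauchy", J_cauchy)]:
    radius = np.abs(np.linalg.eigvals(J)).max()
    print(f"{name:8s} spectral radius ~ {radius:6.2f}  (circular-law value {g})")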
Affiliation(s)
- Junichi Haruna: Department of Physics, Kyoto University, Kyoto 606-8502, Japan
- Riki Toshio: Department of Physics, Kyoto University, Kyoto 606-8502, Japan
- Naoto Nakano: Graduate School of Advanced Mathematical Sciences, Meiji University, Tokyo 164-8525, Japan
6. Mosheiff N, Ermentrout B, Huang C. Chaotic dynamics in spatially distributed neuronal networks generate population-wide shared variability. PLoS Comput Biol 2023;19:e1010843. PMID: 36626362; PMCID: PMC9870129; DOI: 10.1371/journal.pcbi.1010843.
Abstract
Neural activity in the cortex is highly variable in response to repeated stimuli. Population recordings across the cortex demonstrate that the variability of neuronal responses is shared among large groups of neurons and concentrates in a low dimensional space. However, the source of the population-wide shared variability is unknown. In this work, we analyzed the dynamical regimes of spatially distributed networks of excitatory and inhibitory neurons. We found chaotic spatiotemporal dynamics in networks with similar excitatory and inhibitory projection widths, an anatomical feature of the cortex. The chaotic solutions contain broadband frequency power in rate variability and have distance-dependent and low-dimensional correlations, in agreement with experimental findings. In addition, rate chaos can be induced by globally correlated noisy inputs. These results suggest that spatiotemporal chaos in cortical networks can explain the shared variability observed in neuronal population responses.
Affiliation(s)
- Noga Mosheiff: Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA
- Bard Ermentrout: Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
- Chengcheng Huang: Department of Neuroscience and Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA
7. Input correlations impede suppression of chaos and learning in balanced firing-rate networks. PLoS Comput Biol 2022;18:e1010590. DOI: 10.1371/journal.pcbi.1010590.
Abstract
Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input into each neuron than through independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
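A replica probe of this effect, as a hedged sketch: drive two copies of the same network with identical input but different initial conditions, and check whether the trajectories converge (chaos suppressed) or stay separated. The network below is a generic rate model with assumed parameters rather than the paper's balanced architecture, so the size of the common-versus-independent gap may differ from the balanced case:

import numpy as np

# Replica probe (generic rate network, assumed parameters): identical input,
# different initial conditions. A shrinking replica distance means the input
# has suppressed chaos; an O(1) distance means chaos persists.
rng = np.random.default_rng(5)
N, g, dt, T = 500, 1.8, 0.05, 8000
J = rng.normal(0.0, g / np.sqrt(N), (N, N))
amp, freq = 2.0, 1.0
phases = rng.uniform(0, 2 * np.pi, N)             # used for independent input only

def replica_distance(common):
    x1 = rng.normal(0, 1, N)
    x2 = x1 + 1e-3 * rng.normal(0, 1, N)
    for t in range(T):
        drive = amp * np.sin(freq * t * dt + (0.0 if common else phases))
        x1 = x1 + dt * (-x1 + J @ np.tanh(x1) + drive)
        x2 = x2 + dt * (-x2 + J @ np.tanh(x2) + drive)
    return np.linalg.norm(x1 - x2) / np.sqrt(N)

print(f"final replica distance, common input:      {replica_distance(True):.2e}")
print(f"final replica distance, independent input: {replica_distance(False):.2e}")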
8. Tian Y, Tan Z, Hou H, Li G, Cheng A, Qiu Y, Weng K, Chen C, Sun P. Theoretical foundations of studying criticality in the brain. Netw Neurosci 2022;6:1148-1185. PMID: 38800464; PMCID: PMC11117095; DOI: 10.1162/netn_a_00269.
Abstract
Criticality is hypothesized as a physical mechanism underlying efficient transitions between cortical states and the remarkable information-processing capacities of the brain. While considerable evidence generally supports this hypothesis, nonnegligible controversies persist regarding the ubiquity of criticality in neural dynamics and its role in information processing. Validity issues frequently arise when identifying potential brain criticality from empirical data. Moreover, the functional benefits implied by brain criticality are frequently misconceived or unduly generalized. These problems stem from the nontriviality and immaturity of the physical theories that analytically derive brain criticality and of the statistical techniques that estimate brain criticality from empirical data. To help solve these problems, we present a systematic review and reformulate the foundations of studying brain criticality, that is, ordinary criticality (OC), quasi-criticality (qC), self-organized criticality (SOC), and self-organized quasi-criticality (SOqC), using the terminology of neuroscience. We offer accessible explanations of the physical theories and statistical techniques of brain criticality, providing step-by-step derivations to characterize neural dynamics as a physical system with avalanches. We summarize error-prone details and existing limitations in brain criticality analysis and suggest possible solutions. Moreover, we present a forward-looking perspective on how optimizing the foundations of studying brain criticality can deepen our understanding of various neuroscience questions.
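One concrete piece of the machinery reviewed, avalanche statistics at criticality, can be illustrated with a branching process; at branching ratio 1 the avalanche-size survival function decays roughly as a power law, the signature sought in neural data. Purely illustrative:

import numpy as np

# Illustrative only: avalanche sizes of a Poisson branching process. At the
# critical branching ratio sigma = 1 the survival function P(size >= s) decays
# roughly like s^(-1/2) (size exponent 3/2); subcritical sigma gives an
# exponential cutoff instead.
rng = np.random.default_rng(6)

def avalanche_size(sigma, max_size=10**6):
    active, size = 1, 1
    while active > 0 and size < max_size:
        active = rng.poisson(sigma * active)      # offspring of this generation
        size += active
    return size

sizes = np.array([avalanche_size(1.0) for _ in range(20000)])
for s in [1, 10, 100, 1000]:
    print(f"P(size >= {s:4d}) = {(sizes >= s).mean():.4f}")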
Affiliation(s)
- Yang Tian: Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China; Laboratory of Advanced Computing and Storage, Central Research Institute, 2012 Laboratories, Huawei Technologies Co. Ltd., Beijing, China
- Zeren Tan: Institute for Interdisciplinary Information Science, Tsinghua University, Beijing, China
- Hedong Hou: UFR de Mathématiques, Université de Paris, Paris, France
- Guoqi Li: Institute of Automation, Chinese Academy of Science, Beijing, China; University of Chinese Academy of Science, Beijing, China
- Aohua Cheng: Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Yike Qiu: Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Kangyu Weng: Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Chun Chen: Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
- Pei Sun: Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
9. Metzner C, Krauss P. Dynamics and Information Import in Recurrent Neural Networks. Front Comput Neurosci 2022;16:876315. PMID: 35573264; PMCID: PMC9091337; DOI: 10.3389/fncom.2022.876315.
Abstract
Recurrent neural networks (RNNs) are complex dynamical systems, capable of ongoing activity without any driving input. The long-term behavior of free-running RNNs, described by periodic, chaotic and fixed point attractors, is controlled by the statistics of the neural connection weights, such as the density d of non-zero connections, or the balance b between excitatory and inhibitory connections. However, for information processing purposes, RNNs need to receive external input signals, and it is not clear which of the dynamical regimes is optimal for this information import. We use both the average correlations C and the mutual information I between the momentary input vector and the next system state vector as quantitative measures of information import and analyze their dependence on the balance and density of the network. Remarkably, both resulting phase diagrams C(b, d) and I(b, d) are highly consistent, pointing to a link between the dynamical systems and the information-processing approach to complex systems. Information import is maximal not at the "edge of chaos," which is optimally suited for computation, but surprisingly in the low-density chaotic regime and at the border between the chaotic and fixed point regime. Moreover, we find a completely new type of resonance phenomenon, which we call "Import Resonance" (IR), where the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its external input. IR complements previously found Recurrence Resonance (RR), where correlation and mutual information of successive system states peak for a certain amplitude of noise added to the system. Both IR and RR can be exploited to optimize information processing in artificial neural networks and might also play a crucial role in biological neural systems.
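A minimal probe of the import measurement, as a sketch only: sweep the input coupling strength and record the correlation between the momentary input vector and the next state. The deterministic tanh network and parameters below are assumptions (the paper studies probabilistic RNNs), so whether a resonance peak appears here depends on the regime:

import numpy as np

# Sketch only (assumed deterministic tanh network): sweep the input coupling
# w_in and record how the next state correlates with the momentary input.
rng = np.random.default_rng(7)
N, T = 200, 4000
W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))

def input_state_correlation(w_in):
    x = rng.normal(0, 1, N)
    cs = []
    for t in range(T):
        u = rng.normal(0, 1, N)                   # fresh random input vector
        x_next = np.tanh(W @ x + w_in * u)
        if t > 200:
            cs.append(np.corrcoef(u, x_next)[0, 1])
        x = x_next
    return np.mean(cs)

for w_in in [0.01, 0.1, 0.5, 1.0, 2.0, 5.0]:
    print(f"w_in = {w_in:4.2f}: corr(input, next state) = {input_state_correlation(w_in):.3f}")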
Affiliation(s)
- Claus Metzner: Neuroscience Lab, University Hospital Erlangen, Erlangen, Germany
- Patrick Krauss: Neuroscience Lab, University Hospital Erlangen, Erlangen, Germany; Cognitive Computational Neuroscience Group and Pattern Recognition Lab, Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, Germany
10. Krishnamurthy K, Can T, Schwab DJ. Theory of Gating in Recurrent Neural Networks. Phys Rev X 2022;12:011011. PMID: 36545030; PMCID: PMC9762509; DOI: 10.1103/physrevx.12.011011.
Abstract
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating, i.e., multiplicative interaction, is ubiquitous in real neurons and is also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: (i) timescales and (ii) dimensionality. The gate controlling timescales leads to a novel marginally stable state, in which the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, in which inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike in additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, thus providing a map for principled parameter initialization choices by ML practitioners.
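A minimal sketch of an update (timescale) gate, the first of the two controls described; the gating weights and gains are assumed for the demo, not the paper's model:

import numpy as np

# Minimal gated-RNN sketch (assumed weights and gains): an update gate z
# multiplies the velocity, so closing the gate (z -> 0) lengthens the
# effective timescale, one of the two gating axes analyzed in the paper.
rng = np.random.default_rng(8)
N, dt, T = 300, 0.05, 4000
J = rng.normal(0, 1.5 / np.sqrt(N), (N, N))       # additive recurrent couplings
Jz = rng.normal(0, 1.0 / np.sqrt(N), (N, N))      # couplings into the update gate

def mean_speed(gate_bias):
    x = rng.normal(0, 1, N)
    speeds = []
    for t in range(T):
        z = 1.0 / (1.0 + np.exp(-(Jz @ np.tanh(x) + gate_bias)))
        dx = z * (-x + J @ np.tanh(x))            # gate multiplies the velocity
        x = x + dt * dx
        speeds.append(np.linalg.norm(dx) / np.sqrt(N))
    return np.mean(speeds[-1000:])

for b in [2.0, 0.0, -2.0, -4.0]:
    print(f"gate bias {b:+.1f}: mean |dx/dt| = {mean_speed(b):.3f}")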
Affiliation(s)
- Kamesh Krishnamurthy: Joseph Henry Laboratories of Physics and PNI, Princeton University, Princeton, New Jersey 08544, USA
- Tankut Can: Institute for Advanced Study, Princeton, New Jersey 08540, USA
- David J. Schwab: Initiative for Theoretical Sciences, Graduate Center, CUNY, New York, New York 10016, USA
11. The Mean Field Approach for Populations of Spiking Neurons. Adv Exp Med Biol 2022;1359:125-157. DOI: 10.1007/978-3-030-89439-9_6.
Abstract
Mean field theory is a device to analyze the collective behavior of a dynamical system comprising many interacting particles. The theory allows one to reduce the behavior of the system to the properties of a handful of parameters. In neural circuits, these parameters are typically the firing rates of distinct, homogeneous subgroups of neurons. Knowledge of the firing rates under conditions of interest can reveal essential information about both the dynamics of neural circuits and the way they can subserve brain function. The goal of this chapter is to provide an elementary introduction to the mean field approach for populations of spiking neurons. We introduce the general idea in networks of binary neurons, starting from the most basic results and then generalizing to more relevant situations. This allows us to derive the mean field equations in a simplified setting. We then derive the mean field equations for populations of integrate-and-fire neurons. An effort is made to derive the main equations of the theory using only elementary methods from calculus and probability theory. The chapter ends with a discussion of the assumptions of the theory and some of the consequences of violating those assumptions. This discussion includes an introduction to balanced and metastable networks and a brief catalogue of successful applications of the mean field approach to the study of neural circuits.
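The binary-network starting point can be condensed into a few lines: for 0/1 neurons with Gaussian couplings of mean mu/N and variance J^2/N, a naive mean-field argument gives the self-consistency condition m = Phi((mu*m - theta)/(J*sqrt(m))), with Phi the standard normal CDF. The sketch below iterates this equation and checks it against a direct simulation; parameters are illustrative, and the naive argument ignores correlations that the chapter treats carefully:

import numpy as np
from math import erf, sqrt

# Naive mean-field for 0/1 threshold neurons with Gaussian random couplings,
# checked against a synchronous simulation (assumed parameters).
rng = np.random.default_rng(9)
N, mu, J, theta = 2000, 2.0, 1.0, 0.5
Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))

m = 0.5
for _ in range(200):                              # fixed-point iteration
    m = Phi((mu * m - theta) / (J * np.sqrt(max(m, 1e-12))))
print(f"mean-field rate: {m:.3f}")

W = rng.normal(mu / N, J / np.sqrt(N), (N, N))
s = (rng.random(N) < 0.5).astype(float)
rates = []
for t in range(300):
    s = (W @ s > theta).astype(float)             # synchronous threshold update
    if t > 100:
        rates.append(s.mean())
print(f"simulated rate:  {np.mean(rates):.3f}")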
12. Kuśmierz Ł, Ogawa S, Toyoizumi T. Edge of Chaos and Avalanches in Neural Networks with Heavy-Tailed Synaptic Weight Distribution. Phys Rev Lett 2020;125:028101. PMID: 32701351; DOI: 10.1103/physrevlett.125.028101.
Abstract
We propose an analytically tractable neural connectivity model with power-law distributed synaptic strengths. When threshold neurons with a biologically plausible number of incoming connections are considered, our model features a continuous transition to chaos and can reproduce biologically relevant low activity levels and scale-free avalanches, i.e., bursts of activity with power-law distributions of sizes and lifetimes. In contrast, the Gaussian counterpart exhibits a discontinuous transition to chaos and thus cannot be poised near the edge of chaos. We validate our predictions in simulations of networks of binary as well as leaky integrate-and-fire neurons. Our results suggest that heavy-tailed synaptic distributions may form a weakly informative sparse-connectivity prior that can be useful in biological and artificial adaptive systems.
Affiliation(s)
- Łukasz Kuśmierz: Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
- Shun Ogawa: Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
- Taro Toyoizumi: Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan; Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8656, Japan
13. Haruna T, Nakajima K. Optimal short-term memory before the edge of chaos in driven random recurrent networks. Phys Rev E 2019;100:062312. PMID: 31962477; DOI: 10.1103/physreve.100.062312.
Abstract
The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated with mean-field theory. The combination of a small input strength and mean-field assumptions makes it possible to derive an approximate expression for the conditional probability density of the state of a neuron given a past input signal. From this conditional probability density, we can analytically calculate short-term memory measures, such as memory capacity, mutual information, and Fisher information, and determine the relationships among these measures, which, to the best of our knowledge, have not been clarified to date. We show that the network contribution of these short-term memory measures peaks before the edge of chaos, where the dynamics of input-driven networks are stable but the corresponding systems without input signals are unstable.
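The memory-capacity measure analyzed in the paper can be computed for an echo state network as follows; this is a standard MC estimator with assumed network sizes and input scaling, where the spectral radius rho stands in for the distance to the edge of chaos:

import numpy as np

# Standard memory-capacity estimator (assumed parameters): train ridge
# readouts to reconstruct u(t-k) and sum the R^2 over delays k.
rng = np.random.default_rng(10)
N, T, washout, kmax = 200, 5000, 500, 40

def memory_capacity(rho):
    W = rng.normal(0, 1, (N, N))
    W *= rho / np.abs(np.linalg.eigvals(W)).max()  # set the spectral radius
    w_in = rng.uniform(-0.1, 0.1, N)               # small input strength
    u = rng.uniform(-1, 1, T)
    X = np.zeros((T, N)); x = np.zeros(N)
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])
        X[t] = x
    Y = X[washout:]
    D = np.stack([u[washout - k:T - k] for k in range(1, kmax + 1)], axis=1)
    coef = np.linalg.solve(Y.T @ Y + 1e-8 * np.eye(N), Y.T @ D)   # ridge readouts
    P = Y @ coef
    return sum(np.corrcoef(P[:, k], D[:, k])[0, 1] ** 2 for k in range(kmax))

for rho in [0.5, 0.8, 0.95, 1.05]:
    print(f"rho = {rho:.2f}: memory capacity ~ {memory_capacity(rho):.1f}")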
Affiliation(s)
- Taichi Haruna: Department of Information and Sciences, School of Arts and Sciences, Tokyo Woman's Christian University, 2-6-1 Zempukuji, Suginami-ku, Tokyo 167-8585, Japan
- Kohei Nakajima: Graduate School of Information Science and Technology, University of Tokyo, Bunkyo-ku, Tokyo 113-8656, Japan
14. Hennequin G, Ahmadian Y, Rubin DB, Lengyel M, Miller KD. The Dynamical Regime of Sensory Cortex: Stable Dynamics around a Single Stimulus-Tuned Attractor Account for Patterns of Noise Variability. Neuron 2018;98:846-860.e5. PMID: 29772203; PMCID: PMC5971207; DOI: 10.1016/j.neuron.2018.04.017.
Abstract
Correlated variability in cortical activity is ubiquitously quenched following stimulus onset, in a stimulus-dependent manner. These modulations have been attributed to circuit dynamics involving either multiple stable states ("attractors") or chaotic activity. Here we show that a qualitatively different dynamical regime, involving fluctuations about a single, stimulus-driven attractor in a loosely balanced excitatory-inhibitory network (the stochastic "stabilized supralinear network"), best explains these modulations. Given the supralinear input/output functions of cortical neurons, increased stimulus drive strengthens effective network connectivity. This shifts the balance from interactions that amplify variability to suppressive inhibitory feedback, quenching correlated variability around more strongly driven steady states. Comparing to previously published and original data analyses, we show that this mechanism, unlike previous proposals, uniquely accounts for the spatial patterns and fast temporal dynamics of variability suppression. Specifying the cortical operating regime is key to understanding the computations underlying perception.
Highlights:
- A simple network model explains stimulus-tuning of cortical variability suppression
- Inhibition stabilizes recurrently interacting neurons with supralinear I/O functions
- Stimuli strengthen inhibitory stabilization around a stable state, quenching variability
- Single-trial V1 data are compatible with this model and rule out competing proposals
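A two-population reduction of the stochastic SSN makes the mechanism easy to see. In the sketch below (illustrative parameters, not the paper's fitted values), the fluctuation amplitude of the E population varies non-monotonically with the drive h: amplified at moderate drive, then progressively quenched as inhibitory stabilization strengthens:

import numpy as np

# Two-population stochastic SSN sketch with supralinear f(v) = k*max(v,0)^n
# (assumed parameters). Fluctuations of the E population rise at moderate
# drive and are quenched as the drive, and with it inhibitory stabilization,
# grows further.
rng = np.random.default_rng(11)
k, n = 0.04, 2.0
W = np.array([[1.25, -0.65],
              [1.2, -0.5]])                       # [E<-E, E<-I; I<-E, I<-I]
tau = np.array([0.02, 0.01])                      # seconds
dt, T, sigma = 0.0002, 100000, 0.5

def e_fluctuation(h):
    v = np.ones(2)
    vs = np.empty(T)
    for t in range(T):
        r = k * np.clip(v, 0.0, None) ** n        # supralinear rates
        v = v + dt / tau * (-v + W @ r + h) \
              + sigma * np.sqrt(2 * dt / tau) * rng.normal(size=2)
        vs[t] = v[0]
    return vs[T // 5:].std()

for h in [1.0, 15.0, 40.0]:
    print(f"drive h = {h:4.1f}: std of E-population input = {e_fluctuation(h):.2f}")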
Affiliation(s)
- Guillaume Hennequin: Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK
- Yashar Ahmadian: Center for Theoretical Neuroscience, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Department of Neuroscience, Swartz Program in Theoretical Neuroscience, Kavli Institute for Brain Science, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Centre de Neurophysique, Physiologie, et Pathologie, CNRS, 75270 Paris Cedex 06, France; Institute of Neuroscience, Department of Biology and Mathematics, University of Oregon, Eugene, OR 97403, USA
- Daniel B Rubin: Center for Theoretical Neuroscience, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Department of Neurology, Massachusetts General Hospital and Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
- Máté Lengyel: Computational and Biological Learning Lab, Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK; Department of Cognitive Science, Central European University, 1051 Budapest, Hungary
- Kenneth D Miller: Center for Theoretical Neuroscience, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA; Department of Neuroscience, Swartz Program in Theoretical Neuroscience, Kavli Institute for Brain Science, College of Physicians and Surgeons, Columbia University, New York, NY 10032, USA
15. Puelma Touzel M, Wolf F. Statistical mechanics of spike events underlying phase space partitioning and sequence codes in large-scale models of neural circuits. Phys Rev E 2019;99:052402. PMID: 31212548; DOI: 10.1103/physreve.99.052402.
Abstract
Cortical circuits operate in an inhibition-dominated regime of spiking activity. Recently, it was found that spiking circuit models in this regime can, despite disordered connectivity and asynchronous irregular activity, exhibit a locally stable dynamics that may be used for neural computation. The lack of existing mathematical tools has precluded analytical insight into this phase. Here we present analytical methods tailored to the granularity of spike-based interactions for analyzing attractor geometry in high-dimensional spiking dynamics. We apply them to reveal the properties of the complex geometry of trajectories of population spiking activity in a canonical model of locally stable spiking dynamics. We find that attractor basin boundaries are the preimages of spike-time collision events involving connected neurons. These spike-based instabilities control the divergence rate of neighboring basins and have no equivalent in rate-based models. They are located according to the disordered connectivity at a random subset of edges in a hypercube representation of the phase space. Iterating backward these edges using the stable dynamics induces a partition refinement on this space that converges to the attractor basins. We formulate a statistical theory of the locations of such events relative to attracting trajectories via a tractable representation of local trajectory ensembles. Averaging over the disorder, we derive the basin diameter distribution, whose characteristic scale emerges from the relative strengths of the stabilizing inhibitory coupling and destabilizing spike interactions. Our study provides an approach to analytically dissect how connectivity, coupling strength, and single-neuron dynamics shape the phase space geometry in the locally stable regime of spiking neural circuit dynamics.
Affiliation(s)
- Maximilian Puelma Touzel: Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany; Mila, Université de Montréal, Montréal, Quebec, Canada H2S 3H1
- Fred Wolf: Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany; Faculty of Physics, Georg August University, 37077 Göttingen, Germany; Bernstein Center for Computational Neuroscience, 37077 Göttingen, Germany; Kavli Institute for Theoretical Physics, University of California, Santa Barbara, Santa Barbara, California 93106-4111, USA
16. Mastrogiuseppe F, Ostojic S. A Geometrical Analysis of Global Stability in Trained Feedback Networks. Neural Comput 2019;31:1139-1182. DOI: 10.1162/neco_a_01187.
Abstract
Recurrent neural networks have been extensively studied in the context of neuroscience and machine learning due to their ability to implement complex computations. While substantial progress in designing effective learning algorithms has been achieved, a full understanding of trained recurrent networks is still lacking. Specifically, the mechanisms that allow computations to emerge from the underlying recurrent dynamics are largely unknown. Here we focus on a simple yet underexplored computational setup: a feedback architecture trained to associate a stationary output to a stationary input. As a starting point, we derive an approximate analytical description of global dynamics in trained networks, which assumes uncorrelated connectivity weights in the feedback and in the random bulk. The resulting mean-field theory suggests that the task admits several classes of solutions, which imply different stability properties. Different classes are characterized in terms of the geometrical arrangement of the readout with respect to the input vectors, defined in the high-dimensional space spanned by the network population. We find that such an approximate theoretical approach can be used to understand how standard training techniques implement the input-output task in finite-size feedback networks. In particular, our simplified description captures the local and the global stability properties of the target solution, and thus predicts training performance.
Affiliation(s)
- Francesca Mastrogiuseppe: Laboratoire de Neurosciences Cognitives et Computationelles, INSERM U960, and Laboratoire de Physique Statistique, CNRS UMR 8550, Ecole Normale Supérieure-PSL Research University, Paris 75005, France
- Srdjan Ostojic: Laboratoire de Neurosciences Cognitives et Computationelles, INSERM U960, Ecole Normale Supérieure-PSL Research University, Paris 75005, France
17. Anand K, Khedair J, Kühn R. Structural model for fluctuations in financial markets. Phys Rev E 2018;97:052312. PMID: 29906875; DOI: 10.1103/physreve.97.052312.
Abstract
In this paper we provide a comprehensive analysis of a structural model for the dynamics of prices of assets traded in a market which takes the form of an interacting generalization of the geometric Brownian motion model. It is formally equivalent to a model describing the stochastic dynamics of a system of analog neurons, which is expected to exhibit glassy properties and thus many metastable states in a large portion of its parameter space. We perform a generating functional analysis, introducing a slow driving of the dynamics to mimic the effect of slowly varying macroeconomic conditions. Distributions of asset returns over various time separations are evaluated analytically and are found to be fat-tailed in a manner broadly in line with empirical observations. Our model also allows us to identify collective, interaction-mediated properties of pricing distributions and it predicts pricing distributions which are significantly broader than their noninteracting counterparts, if interactions between prices in the model contain a ferromagnetic bias. Using simulations, we are able to substantiate one of the main hypotheses underlying the original modeling, viz., that the phenomenon of volatility clustering can be rationalized in terms of an interplay between the dynamics within metastable states and the dynamics of occasional transitions between them.
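A toy rendition of the model class (assumed couplings and scales, not the paper's generating-functional setup): log-price increments acquire an interacting drift term, and the pooled return distribution is compared with the noninteracting baseline via its excess kurtosis:

import numpy as np

# Toy interacting-GBM sketch (assumed parameters): pooled single-step returns
# with an interacting drift versus the noninteracting Gaussian baseline.
rng = np.random.default_rng(12)
N, g, dt, T = 200, 1.4, 0.05, 20000
J = rng.normal(0, g / np.sqrt(N), (N, N))

def excess_kurtosis(interacting):
    x = rng.normal(0, 0.1, N)                     # log-price deviations
    rets = []
    for t in range(T):
        drift = J @ np.tanh(x) if interacting else 0.0
        dx = dt * (-x + drift) + 0.2 * np.sqrt(dt) * rng.normal(size=N)
        x = x + dx
        if t > 1000 and t % 5 == 0:
            rets.append(dx)
    r = np.concatenate(rets)
    return ((r - r.mean()) ** 4).mean() / r.var() ** 2 - 3.0

print(f"excess kurtosis, noninteracting: {excess_kurtosis(False):+.2f}")
print(f"excess kurtosis, interacting:    {excess_kurtosis(True):+.2f}")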
Affiliation(s)
- Kartik Anand: Deutsche Bundesbank, Wilhelm-Epstein-Strasse 14, 60431 Frankfurt am Main, Germany
- Jonathan Khedair: Department of Mathematics, King's College London, Strand, London WC2R 2LS, United Kingdom
- Reimer Kühn: Department of Mathematics, King's College London, Strand, London WC2R 2LS, United Kingdom
18.
Abstract
Implicit expectations induced by predictable stimulus sequences affect neuronal responses to upcoming stimuli at both the single-cell and neural-population levels. Temporally regular sensory streams also phase-entrain ongoing low-frequency brain oscillations, but how and why this happens is unknown. Here we investigate how random recurrent neural networks without plasticity respond to stimulus streams containing oddballs. We find that the neuronal correlates of sensory stream adaptation emerge if networks generate chaotic oscillations that can be phase-entrained by stimulus streams. The resultant activity patterns are close to critical and support history-dependent responses on long timescales. Because critical network entrainment is a slow process, stimulus responses adapt gradually over multiple repetitions. Repeated stimuli generate suppressed responses, but oddball responses are large and distinct. Oscillatory mismatch responses persist in population activity for long periods after stimulus offset, while individual-cell mismatch responses are strongly phasic. These effects are weakened in temporally irregular sensory streams. Thus, we show that network phase entrainment provides a biologically plausible mechanism for neural oddball detection. Our results do not depend on specific network characteristics, are consistent with experimental studies, and may be relevant for multiple pathologies that exhibit altered mismatch processing, such as schizophrenia and depression.
Affiliation(s)
- Adam Ponzi: IBM T.J. Watson Research Center, Yorktown Heights, NY, USA; Okinawa Institute of Science and Technology Graduate University (OIST), Okinawa, Japan
19. Inubushi M, Yoshimura K. Reservoir Computing Beyond Memory-Nonlinearity Trade-off. Sci Rep 2017;7:10199. PMID: 28860513; PMCID: PMC5579006; DOI: 10.1038/s41598-017-10257-6.
Abstract
Reservoir computing is a brain-inspired machine learning framework that employs a signal-driven dynamical system, in particular harnessing common-signal-induced synchronization, a widely observed nonlinear phenomenon. A basic understanding of the working principle of reservoir computing can be expected to shed light on how information is stored and processed in nonlinear dynamical systems, potentially leading to progress in a broad range of nonlinear sciences. As a first step toward this goal, from the viewpoint of nonlinear physics and information theory, we study the memory-nonlinearity trade-off uncovered by Dambre et al. (2012). Focusing on a variational equation, we clarify a dynamical mechanism behind the trade-off, which illustrates why nonlinear dynamics degrades the memory stored in a dynamical system in general. Moreover, based on the trade-off, we propose a mixture reservoir endowed with both linear and nonlinear dynamics and show that it improves the performance of information processing. Interestingly, for some tasks, significant improvements are observed by adding a few linear dynamics to the nonlinear dynamical system. Using the echo state network model, the effect of the mixture reservoir is numerically verified for a simple function approximation task and for more complex tasks.
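A sketch of the mixture idea on an echo state network: keep a fraction of the units linear and train a ridge readout on a task that needs both memory and nonlinearity. The task y(t) = u(t-5)^2, the sizes, and the scalings are assumed toy choices, not the paper's benchmarks:

import numpy as np

# Mixture-reservoir sketch (assumed task and parameters): a fraction of
# echo-state units stays linear, the rest are tanh; a ridge readout is scored
# by normalized root-mean-square error.
rng = np.random.default_rng(13)
N, T, washout, delay, ridge = 300, 8000, 500, 5, 1e-6

W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()     # echo-state scaling
w_in = rng.uniform(-0.5, 0.5, N)
u = rng.uniform(-1, 1, T)

def nrmse(frac_linear):
    lin = np.arange(N) < int(frac_linear * N)     # which units stay linear
    X = np.zeros((T, N)); x = np.zeros(N)
    for t in range(T):
        pre = W @ x + w_in * u[t]
        x = np.where(lin, pre, np.tanh(pre))
        X[t] = x
    y = u ** 2
    Yw = np.hstack([X[washout:], np.ones((T - washout, 1))])   # bias column
    tw = y[washout - delay:T - delay]
    coef = np.linalg.solve(Yw.T @ Yw + ridge * np.eye(N + 1), Yw.T @ tw)
    err = Yw @ coef - tw
    return np.sqrt(np.mean(err ** 2) / tw.var())

for f in [0.0, 0.1, 0.3]:
    print(f"linear fraction {f:.1f}: NRMSE = {nrmse(f):.3f}")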
Affiliation(s)
- Masanobu Inubushi: NTT Communication Science Laboratories, NTT Corporation, 3-1 Morinosato Wakamiya, Atsugi-shi, Kanagawa 243-0198, Japan
- Kazuyuki Yoshimura: Department of Information and Electronics, Graduate School of Engineering, Tottori University, 4-101 Koyama-Minami, Tottori 680-8552, Japan
20. Mastrogiuseppe F, Ostojic S. Intrinsically-generated fluctuating activity in excitatory-inhibitory networks. PLoS Comput Biol 2017;13:e1005498. PMID: 28437436; PMCID: PMC5421821; DOI: 10.1371/journal.pcbi.1005498.
Abstract
Recurrent networks of nonlinear units display a variety of dynamical regimes depending on the structure of their synaptic connectivity. A particularly remarkable phenomenon is the appearance of strongly fluctuating, chaotic activity in networks of deterministic but randomly connected rate units. How this type of intrinsically generated fluctuation appears in more realistic networks of spiking neurons has been a long-standing question. To ease the comparison between rate and spiking networks, recent works investigated the dynamical regimes of randomly connected rate networks with segregated excitatory and inhibitory populations and firing rates constrained to be positive. These works derived general dynamical mean field (DMF) equations describing the fluctuating dynamics but solved these equations only in the case of purely inhibitory networks. Using a simplified excitatory-inhibitory architecture in which the DMF equations are more easily tractable, here we show that the presence of excitation qualitatively modifies the fluctuating activity compared to purely inhibitory networks. In the presence of excitation, intrinsically generated fluctuations induce a strong increase in mean firing rates, a phenomenon that is much weaker in purely inhibitory networks. Excitation moreover induces two different fluctuating regimes: for moderate overall coupling, recurrent inhibition is sufficient to stabilize fluctuations; for strong coupling, firing rates are stabilized solely by the upper bound imposed on activity, even if inhibition is stronger than excitation. These results extend to more general network architectures and to rate networks receiving noisy inputs mimicking spiking activity. Finally, we show that signatures of the second dynamical regime appear in networks of integrate-and-fire neurons.
Affiliation(s)
- Francesca Mastrogiuseppe: Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France; Laboratoire de Physique Statistique, CNRS UMR 8550, École Normale Supérieure - PSL Research University, Paris, France
- Srdjan Ostojic: Laboratoire de Neurosciences Cognitives, INSERM U960, École Normale Supérieure - PSL Research University, Paris, France
21. Lajoie G, Lin KK, Thivierge JP, Shea-Brown E. Encoding in Balanced Networks: Revisiting Spike Patterns and Chaos in Stimulus-Driven Systems. PLoS Comput Biol 2016;12:e1005258. PMID: 27973557; PMCID: PMC5156368; DOI: 10.1371/journal.pcbi.1005258.
Abstract
Highly connected recurrent neural networks often produce chaotic dynamics, meaning their precise activity is sensitive to small perturbations. What are the consequences of chaos for how such networks encode streams of temporal stimuli? On the one hand, chaos is a strong source of randomness, suggesting that small changes in stimuli will be obscured by intrinsically generated variability. On the other hand, recent work shows that the type of chaos that occurs in spiking networks can have a surprisingly low-dimensional structure, suggesting that there may be room for fine stimulus features to be precisely resolved. Here we show that strongly chaotic networks produce patterned spikes that reliably encode time-dependent stimuli: using a decoder sensitive to spike times on timescales of tens of milliseconds, one can easily distinguish responses to very similar inputs. Moreover, recurrence serves to distribute signals throughout chaotic networks so that small groups of cells can encode substantial information about signals arriving elsewhere. A conclusion is that the presence of strong chaos in recurrent networks need not exclude precise encoding of temporal stimuli via spike patterns.

Author summary: Recurrently connected populations of excitatory and inhibitory neurons found in cortex are known to produce rich and irregular spiking activity, with complex trial-to-trial variability in response to input stimuli. Many theoretical studies found this firing regime to be associated with chaos, where tiny perturbations explode to impact subsequent neural activity. As a result, the precise spiking patterns produced by such networks would be expected to be too fragile to carry any valuable information about stimuli, since inevitable sources of noise such as synaptic failure or ion channel fluctuations would be amplified by chaotic dynamics on repeated trials. In this article we revisit the implications of chaos in input-driven networks and directly measure its impact on evoked population spike patterns. We find that chaotic network dynamics can, in fact, produce highly patterned spiking activity which can be used by a simple decoder to perform input-classification tasks. This can be explained by the presence of low-dimensional, input-specific chaotic attractors, leading to a form of trial-to-trial variability that is intermittent, rather than uniformly random. We propose that chaos is a manageable by-product of recurrent connectivity, which serves to efficiently distribute information about stimuli throughout a network.
Affiliation(s)
- Guillaume Lajoie: University of Washington Institute for Neuroengineering, University of Washington, Seattle, Washington, USA; Department of Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- Kevin K. Lin: School of Mathematics, University of Arizona, Tucson, Arizona, USA
- Jean-Philippe Thivierge
- Eric Shea-Brown: University of Washington Institute for Neuroengineering, University of Washington, Seattle, Washington, USA; Department of Applied Mathematics, University of Washington, Seattle, Washington, USA; Department of Physiology and Biophysics, University of Washington, Seattle, Washington, USA
22. Vincent-Lamarre P, Lajoie G, Thivierge JP. Driving reservoir models with oscillations: a solution to the extreme structural sensitivity of chaotic networks. J Comput Neurosci 2016;41:305-322. DOI: 10.1007/s10827-016-0619-3.
23. A local Echo State Property through the largest Lyapunov exponent. Neural Netw 2016;76:39-45. DOI: 10.1016/j.neunet.2015.12.013.
24. Rajan K, Harvey CD, Tank DW. Recurrent Network Models of Sequence Generation and Memory. Neuron 2016;90:128-142. PMID: 26971945; DOI: 10.1016/j.neuron.2016.02.009.
Abstract
Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here we demonstrate that, starting from random connectivity and modifying a small fraction of connections, a largely disordered recurrent network can produce sequences and implement working memory efficiently. We use this process, called Partial In-Network Training (PINning), to model and match cellular resolution imaging data from the posterior parietal cortex during a virtual memory-guided two-alternative forced-choice task. Analysis of the connectivity reveals that sequences propagate by the cooperation between recurrent synaptic interactions and external inputs, rather than through feedforward or asymmetric connections. Together our results suggest that neural sequences may emerge through learning from largely unstructured network architectures.
Affiliation(s)
- Kanaka Rajan: Joseph Henry Laboratories of Physics and Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544, USA
- Christopher D Harvey
- David W Tank: Department of Molecular Biology and Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08544, USA
25. Aljadeff J, Renfrew D, Vegué M, Sharpee TO. Low-dimensional dynamics of structured random networks. Phys Rev E 2016;93:022302. PMID: 26986347; DOI: 10.1103/physreve.93.022302.
Abstract
Using a generalized random recurrent neural network model, and by extending our recently developed mean-field approach [J. Aljadeff, M. Stern, and T. Sharpee, Phys. Rev. Lett. 114, 088101 (2015)], we study the relationship between the network connectivity structure and its low-dimensional dynamics. Each connection in the network is a random number with mean 0 and variance that depends on pre- and postsynaptic neurons through a sufficiently smooth function g of their identities. We find that these networks undergo a phase transition from a silent to a chaotic state at a critical point we derive as a function of g. Above the critical point, although unit activation levels are chaotic, their autocorrelation functions are restricted to a low-dimensional subspace. This provides a direct link between the network's structure and some of its functional characteristics. We discuss example applications of the general results to neuroscience where we derive the support of the spectrum of connectivity matrices with heterogeneous and possibly correlated degree distributions, and to ecology where we study the stability of the cascade model for food web structure.
Affiliation(s)
- Johnatan Aljadeff: Department of Neurobiology, University of Chicago, Chicago, Illinois, USA; Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, California, USA
- David Renfrew: Department of Mathematics, University of California Los Angeles, Los Angeles, California, USA
- Marina Vegué: Centre de Recerca Matemàtica, Campus de Bellaterra, Barcelona, Spain; Departament de Matemàtiques, Universitat Politècnica de Catalunya, Barcelona, Spain
- Tatyana O Sharpee: Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, California, USA
26. Wainrib G, Galtier M. Regular graphs maximize the variability of random neural networks. Phys Rev E 2015;92:032802. PMID: 26465523; DOI: 10.1103/physreve.92.032802.
Abstract
In this work we study the dynamics of systems composed of numerous interacting elements interconnected through a random weighted directed graph, such as models of random neural networks. We develop an original theoretical approach based on a combination of a classical mean-field theory originally developed in the context of dynamical spin-glass models, and the heterogeneous mean-field theory developed to study epidemic propagation on graphs. Our main result is that, surprisingly, increasing the variance of the in-degree distribution does not result in a more variable dynamical behavior, but on the contrary that the most variable behaviors are obtained in the regular graph setting. We further study how the dynamical complexity of the attractors is influenced by the statistical properties of the in-degree distribution.
Affiliation(s)
- Gilles Wainrib: Ecole Normale Supérieure, Département d'Informatique, équipe DATA, Paris, France
- Mathieu Galtier: European Institute for Theoretical Neuroscience, Paris, France
27. Aljadeff J, Stern M, Sharpee T. Transition to chaos in random networks with cell-type-specific connectivity. Phys Rev Lett 2015;114:088101. PMID: 25768781; PMCID: PMC4527561; DOI: 10.1103/physrevlett.114.088101.
Abstract
In neural circuits, statistical connectivity rules strongly depend on cell-type identity. We study dynamics of neural networks with cell-type-specific connectivity by extending the dynamic mean-field method and find that these networks exhibit a phase transition between silent and chaotic activity. By analyzing the locus of this transition, we derive a new result in random matrix theory: the spectral radius of a random connectivity matrix with block-structured variances. We apply our results to show how a small group of hyperexcitable neurons within the network can significantly increase the network's computational capacity by bringing it into the chaotic regime.
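The random-matrix result can be checked numerically. If a connection from type d to type c has variance g[c,d]^2/N, the claimed spectral radius is sqrt(Lambda), with Lambda the leading eigenvalue of M[c,d] = alpha[d]*g[c,d]^2, where alpha[d] is the fraction of type-d neurons; this reduces to the known column-wise result when g depends only on the presynaptic type. Block sizes and gains below are arbitrary test values:

import numpy as np

# Numerical check (arbitrary test blocks): empirical spectral radius of a
# block-variance random matrix versus sqrt of the leading eigenvalue of
# M[c,d] = alpha[d] * g[c,d]^2.
rng = np.random.default_rng(14)
N = 1500
alpha = np.array([0.6, 0.3, 0.1])                 # cell-type fractions
g = np.array([[0.8, 1.6, 0.4],
              [1.2, 0.5, 1.0],
              [0.3, 2.0, 0.7]])                   # block standard-deviation gains

types = np.repeat(np.arange(3), (alpha * N).astype(int))
J = rng.normal(0.0, 1.0, (len(types), len(types)))
J *= g[np.ix_(types, types)] / np.sqrt(N)         # variance g[c,d]^2 / N per block

radius = np.abs(np.linalg.eigvals(J)).max()
M = g**2 * alpha[None, :]
theory = np.sqrt(np.linalg.eigvals(M).real.max())  # Perron root of M
print(f"empirical spectral radius {radius:.3f} vs sqrt(Lambda) = {theory:.3f}")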
Collapse
Affiliation(s)
- Johnatan Aljadeff
- Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, California 92037, USA and Center for Theoretical Biological Physics and Department of Physics, University of California, San Diego 92093, USA
| | - Merav Stern
- Department of Neuroscience, Columbia University, New York, New York 10032, USA and The Edmond and Lily Safra Center for Brain Sciences, Hebrew University, Jerusalem 9190401, Israel
| | - Tatyana Sharpee
- Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, California 92037, USA and Center for Theoretical Biological Physics and Department of Physics, University of California, San Diego 92093, USA
| |
Collapse
|
28
|
Lajoie G, Thivierge JP, Shea-Brown E. Structured chaos shapes spike-response noise entropy in balanced neural networks. Front Comput Neurosci 2014; 8:123. [PMID: 25324772 PMCID: PMC4183092 DOI: 10.3389/fncom.2014.00123] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2014] [Accepted: 09/11/2014] [Indexed: 11/13/2022] Open
Abstract
Large networks of sparsely coupled, excitatory and inhibitory cells occur throughout the brain. For many models of these networks, a striking feature is that their dynamics are chaotic and thus sensitive to small perturbations. How does this chaos manifest in the neural code? Specifically, how variable are the spike patterns that such a network produces in response to an input signal? To answer this, we derive a bound on a general measure of variability, the spike-train entropy. This leads to important insights into the variability of multi-cell spike-pattern distributions in large recurrent networks of spiking neurons responding to fluctuating inputs. The analysis is based on results from random dynamical systems theory and is complemented by detailed numerical simulations. We find that the spike-pattern entropy is an order of magnitude lower than what would be extrapolated from single cells. This holds despite the fact that network coupling becomes vanishingly sparse as network size grows, a phenomenon that depends on "extensive chaos," as previously discovered for balanced networks without stimulus drive. Moreover, we show how spike-pattern entropy is controlled by temporal features of the inputs. Our findings provide insight into how neural networks may encode stimuli in the presence of inherently chaotic dynamics.
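For orientation, entropy bounds of this kind typically run through the Kolmogorov-Sinai entropy of the underlying random dynamical system together with Ruelle's inequality; the chain below is a plausible formalization of the abstract's logic, not necessarily the paper's exact statement:

\[
h_{\text{spike}} \;\le\; h_{\mathrm{KS}} \;\le\; \sum_{i:\,\lambda_i>0} \lambda_i,
\qquad\text{so that}\qquad
\frac{h_{\text{spike}}}{N} \;\ll\; h_{\text{single cell}},
\]

where the \(\lambda_i\) are the Lyapunov exponents of the stimulus-driven network. Extensive chaos makes the sum grow linearly in \(N\), but with a per-neuron coefficient far below the single-cell extrapolation, which is the order-of-magnitude gap reported above.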
Collapse
Affiliation(s)
- Guillaume Lajoie
- Nonlinear Dynamics Department, Max Planck Institute for Dynamics and Self-Organization, Goettingen, Germany; Bernstein Center for Computational Neuroscience, Max Planck Institute for Dynamics and Self-Organization, Goettingen, Germany; Applied Mathematics Department, University of Washington, Seattle, WA, USA
| | - Jean-Philippe Thivierge
- School of Psychology and Center for Neural Dynamics, University of Ottawa, Ottawa, ON, Canada
| | - Eric Shea-Brown
- Applied Mathematics Department, University of Washington, Seattle, WA, USA; Physiology and Biophysics Department, University of Washington, Seattle, WA, USA
| |
Collapse
|
29
|
Lajoie G, Lin KK, Shea-Brown E. Chaos and reliability in balanced spiking networks with temporal drive. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2013; 87:052901. [PMID: 23767592 PMCID: PMC4124755 DOI: 10.1103/physreve.87.052901] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/13/2012] [Revised: 12/21/2012] [Indexed: 06/02/2023]
Abstract
Biological information processing is often carried out by complex networks of interconnected dynamical units. A basic question about such networks is that of reliability: If the same signal is presented many times with the network in different initial states, will the system entrain to the signal in a repeatable way? Reliability is of particular interest in neuroscience, where large, complex networks of excitatory and inhibitory cells are ubiquitous. These networks are known to autonomously produce strongly chaotic dynamics, an obvious threat to reliability. Here, we show that such chaos persists in the presence of weak and strong stimuli, but that even in the presence of chaos, intermittent periods of highly reliable spiking often coexist with unreliable activity. We elucidate the local dynamical mechanisms involved in this intermittent reliability and investigate the relationship between this phenomenon and certain time-dependent attractors arising from the dynamics. One conclusion is that chaotic dynamics need not be an obstacle to precise spike responses, a fact with implications for signal coding in large networks.
Collapse
Affiliation(s)
- Guillaume Lajoie
- Department of Applied Mathematics, University of Washington, Seattle, Washington 98195, USA
| | | | | |
Collapse
|
30
|
Orio P, Soudry D. Simple, fast and accurate implementation of the diffusion approximation algorithm for stochastic ion channels with multiple states. PLoS One 2012; 7:e36670. [PMID: 22629320 PMCID: PMC3358312 DOI: 10.1371/journal.pone.0036670] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2011] [Accepted: 04/11/2012] [Indexed: 11/18/2022] Open
Abstract
BACKGROUND: The phenomena that emerge from the interaction of the stochastic opening and closing of ion channels (channel noise) with nonlinear neural dynamics are essential to our understanding of the operation of the nervous system. The effects that channel noise can have on neural dynamics are generally studied using numerical simulations of stochastic models. Algorithms based on discrete Markov chains (MC) seem to be the most reliable and trustworthy, but even optimized algorithms come with a non-negligible computational cost. Diffusion approximation (DA) methods use stochastic differential equations (SDE) to approximate the behavior of a number of MCs, considerably speeding up simulation times. However, model comparisons have suggested that DA methods do not reproduce MC results in terms of channel-noise statistics and effects on excitability. Recently, it was shown that the difference arose because the MCs were modeled with coupled gating particles, while the DA was modeled using uncoupled gating particles. Implementations of DA with coupled particles, in the context of a specific kinetic scheme, yielded results similar to MC. However, it remained unclear how to generalize these implementations to different kinetic schemes, or whether they were faster than MC algorithms. Additionally, a steady-state approximation was used for the stochastic terms, which, as we show here, can introduce significant inaccuracies. MAIN CONTRIBUTIONS: We derived the SDE explicitly for any given ion-channel kinetic scheme. The resulting generic equations were surprisingly simple and interpretable, allowing an easy, transparent, and efficient DA implementation that avoids unnecessary approximations. The algorithm was tested in a voltage-clamp simulation and in two different current-clamp simulations, yielding the same results as MC modeling. The DA method was also considerably more efficient than MC methods, except when short time steps or low channel numbers were used.
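As a concrete instance of the generic SDE, consider the simplest kinetic scheme: a two-state channel with opening rate alpha and closing rate beta. The drift is the master-equation mean, and the diffusion term sums both transition fluxes, scaled by the channel count. The sketch below is a reduction to that special case with illustrative rates, not the paper's general algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
Nch, dt, steps = 1000, 1e-5, 50000
alpha, beta = 200.0, 100.0        # opening / closing rates (1/s), illustrative

# Two-state channel, open fraction x. Drift = mean kinetics; noise variance
# sums the rates of both transition fluxes, divided by Nch (system-size expansion).
x = alpha / (alpha + beta)
xs = np.empty(steps)
for t in range(steps):
    drift = alpha * (1 - x) - beta * x
    diff = np.sqrt(max(alpha * (1 - x) + beta * x, 0.0) / Nch)
    x += drift * dt + diff * np.sqrt(dt) * rng.normal()
    x = min(max(x, 0.0), 1.0)     # clip numerical excursions outside [0, 1]
    xs[t] = x

# Stationary variance of the matching Markov chain: p(1-p)/Nch, p = alpha/(alpha+beta).
p = alpha / (alpha + beta)
print("DA variance:", xs[steps // 2:].var(), " MC theory:", p * (1 - p) / Nch)
```

For this scheme the DA reproduces the Markov chain's stationary variance exactly in the large-Nch limit, which is the kind of agreement the paper reports for general schemes.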
Collapse
Affiliation(s)
- Patricio Orio
- Centro Interdisciplinario de Neurociencia de Valparaíso, Facultad de Ciencias, Universidad de Valparaíso, Valparaíso, Chile.
| | | |
Collapse
|
31
|
Soudry D, Meir R. Conductance-based neuron models and the slow dynamics of excitability. Front Comput Neurosci 2012; 6:4. [PMID: 22355288 PMCID: PMC3280430 DOI: 10.3389/fncom.2012.00004] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2011] [Accepted: 01/11/2012] [Indexed: 12/03/2022] Open
Abstract
In recent experiments, synaptically isolated neurons from rat cortical culture were stimulated with periodic extracellular fixed-amplitude current pulses for extended durations of days. The neuron's response depended on its own history, as well as on the history of the input, and was classified into several modes. Interestingly, in one of the modes the neuron behaved intermittently, exhibiting irregular firing patterns that changed in a complex and variable manner over the entire range of experimental timescales, from seconds to days. With the aim of developing a minimal biophysical explanation for these results, we propose a general scheme that, given a few assumptions (mainly a timescale separation in kinetics), closely describes the response of deterministic conductance-based neuron models under pulse stimulation using a discrete-time piecewise linear mapping, which is amenable to detailed mathematical analysis. Using this method we reproduce the basic modes exhibited by the neuron experimentally, as well as the mean response in each mode. Specifically, we derive precise closed-form input-output expressions for the transient timescale and firing rates, which are expressed in terms of experimentally measurable variables and conform to the experimental results. However, the mathematical analysis shows that the resulting firing patterns in these deterministic models are always regular and repeatable (i.e., no chaos), in contrast to the irregular and variable behavior displayed by the neuron in certain regimes. This fact, and the sensitive near-threshold dynamics of the model, indicate that intrinsic ion-channel noise has a significant impact on the neuronal response and may help reproduce the experimentally observed variability, as we also demonstrate numerically. In a companion paper, we extend our analysis to stochastic conductance-based models and show how these can be used to reproduce the details of the observed irregular and variable neuronal response.
Collapse
Affiliation(s)
- Daniel Soudry
- Department of Electrical Engineering, The Laboratory for Network Biology Research, Technion, Haifa, Israel
| | | |
Collapse
|
32
|
Toyoizumi T, Abbott LF. Beyond the edge of chaos: amplification and temporal integration by recurrent networks in the chaotic regime. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2011; 84:051908. [PMID: 22181445 PMCID: PMC5558624 DOI: 10.1103/physreve.84.051908] [Citation(s) in RCA: 57] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/24/2011] [Revised: 09/29/2011] [Indexed: 05/12/2023]
Abstract
Randomly connected networks of neurons exhibit a transition from fixed-point to chaotic activity as the variance of their synaptic connection strengths is increased. In this study, we analytically evaluate how well a small external input can be reconstructed from a sparse linear readout of network activity. At the transition point, known as the edge of chaos, networks display a number of desirable features, including large gains and integration times. Away from this edge, in the nonchaotic regime that has been the focus of most models and studies, gains and integration times fall off dramatically, which implies that parameters must be fine-tuned with considerable precision if high performance is required. Here we show that, near the edge, decoding performance is characterized by a critical exponent that takes a different value on the two sides. As a result, when the network units have an odd saturating nonlinear response function, the falloff in gains and integration times is much slower on the chaotic side of the transition. This means that, under appropriate conditions, good performance can be achieved with less fine tuning beyond the edge, within the chaotic regime.
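The slow falloff of timescales near the edge can be seen in a small simulation. The sketch below uses illustrative parameters and is not the paper's analytical calculation: it measures how long a small perturbation takes to decay or grow by a factor of e on either side of the critical gain g = 1, and the timescale should peak near the edge.

```python
import numpy as np

rng = np.random.default_rng(4)
N, dt, T = 300, 0.05, 6000
J = rng.normal(0, 1 / np.sqrt(N), (N, N))

def response_time(g, eps=1e-6):
    """Time for a small perturbation to decay (or grow) by a factor of e."""
    x = rng.normal(0, 1, N)
    for _ in range(2000):                      # settle onto the attractor
        x += dt * (-x + g * J @ np.tanh(x))
    y = x + eps * rng.normal(0, 1, N) / np.sqrt(N)
    d0 = np.linalg.norm(y - x)
    for t in range(T):
        x += dt * (-x + g * J @ np.tanh(x))
        y += dt * (-y + g * J @ np.tanh(y))
        r = np.linalg.norm(y - x) / d0
        if r < np.exp(-1) or r > np.exp(1):
            return (t + 1) * dt
    return T * dt                              # timescale exceeds the run

for g in [0.7, 0.9, 0.98, 1.02, 1.1, 1.3]:
    print(f"g = {g:4.2f}  timescale ~ {response_time(g):7.1f}")
```

The asymmetry of the falloff on the two sides of g = 1 is the critical-exponent effect described in the abstract.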
Collapse
Affiliation(s)
- T Toyoizumi
- Department of Neuroscience, Columbia University, New York, New York 10032, USA.
| | | |
Collapse
|
33
|
Rajan K, Abbott LF, Sompolinsky H. Stimulus-dependent suppression of chaos in recurrent neural networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2010; 82:011903. [PMID: 20866644 PMCID: PMC10683875 DOI: 10.1103/physreve.82.011903] [Citation(s) in RCA: 139] [Impact Index Per Article: 9.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/31/2009] [Revised: 04/22/2010] [Indexed: 05/29/2023]
Abstract
Neuronal activity arises from an interaction between ongoing firing generated spontaneously by neural circuits and responses driven by external stimuli. Using mean-field analysis, we ask how a neural network that intrinsically generates chaotic patterns of activity can remain sensitive to extrinsic input. We find that inputs not only drive network responses, but they also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated. The critical input intensity at the phase transition is a nonmonotonic function of stimulus frequency, revealing a "resonant" frequency at which the input is most effective at suppressing chaos even though the power spectrum of the spontaneous activity peaks at zero and falls exponentially. A prediction of our analysis is that the variance of neural responses should be most strongly suppressed at frequencies matching the range over which many sensory systems operate.
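A standard numerical diagnostic for input-induced suppression of chaos is to drive two replicas of the same network, started from different initial states, with an identical periodic input: if the input suppresses chaos, the replicas converge. The sketch below follows that logic with illustrative parameters; it is not the paper's mean-field calculation.

```python
import numpy as np

rng = np.random.default_rng(5)
N, dt, T, g = 300, 0.05, 8000, 1.5
J = g * rng.normal(0, 1 / np.sqrt(N), (N, N))   # chaotic without input (g > 1)
phase = rng.uniform(0, 2 * np.pi, N)            # random input phase per unit

def replica_distance(amp, freq):
    """Distance between two identically driven replicas after a long run."""
    x = rng.normal(0, 1, N)
    y = rng.normal(0, 1, N)
    for t in range(T):
        I = amp * np.cos(2 * np.pi * freq * t * dt + phase)
        x += dt * (-x + J @ np.tanh(x) + I)
        y += dt * (-y + J @ np.tanh(y) + I)
    return np.linalg.norm(x - y) / np.sqrt(N)

for amp in [0.0, 0.5, 1.0, 2.0]:
    print(f"amplitude {amp:3.1f}  replica distance {replica_distance(amp, 0.1):.4f}")
```

Sweeping the frequency at fixed amplitude is how one would look numerically for the "resonant" frequency at which chaos is suppressed most easily.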
Collapse
Affiliation(s)
- Kanaka Rajan
- Lewis-Sigler Institute for Integrative Genomics, Icahn 262, Princeton University, Princeton, New Jersey 08544, USA.
| | | | | |
Collapse
|
34
|
Sussillo D, Abbott LF. Generating coherent patterns of activity from chaotic neural networks. Neuron 2009; 63:544-57. [PMID: 19709635 PMCID: PMC2756108 DOI: 10.1016/j.neuron.2009.07.018] [Citation(s) in RCA: 481] [Impact Index Per Article: 32.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/28/2009] [Revised: 06/11/2009] [Accepted: 07/17/2009] [Indexed: 11/21/2022]
Abstract
Neural circuits display complex activity patterns both spontaneously and when responding to a stimulus or generating a motor output. How are these two forms of activity related? We develop a procedure called FORCE learning for modifying synaptic strengths either external to or within a model neural network to change chaotic spontaneous activity into a wide variety of desired activity patterns. FORCE learning works even though the networks we train are spontaneously chaotic and we leave feedback loops intact and unclamped during learning. Using this approach, we construct networks that produce a wide variety of complex output patterns, input-output transformations that require memory, multiple outputs that can be switched by control inputs, and motor patterns matching human motion capture data. Our results reproduce data on premovement activity in motor and premotor cortex, and suggest that synaptic plasticity may be a more rapid and powerful modulator of network activity than generally appreciated.
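The core of FORCE learning is a recursive-least-squares (RLS) update of a linear readout that is fed back into the chaotic network, applied frequently enough that the output error stays small throughout training. Below is a minimal sketch with an illustrative target function and hyperparameters, not the paper's full experimental setup.

```python
import numpy as np

rng = np.random.default_rng(6)
N, dt, g, alpha = 300, 0.1, 1.5, 1.0
J = g * rng.normal(0, 1 / np.sqrt(N), (N, N))   # chaotic reservoir (g > 1)
w_fb = rng.uniform(-1, 1, N)                     # fixed feedback weights
w = np.zeros(N)                                  # readout, trained by RLS
P = np.eye(N) / alpha                            # running inverse-correlation matrix
f = lambda t: np.sin(0.02 * t) + 0.5 * np.sin(0.04 * t)   # target (illustrative)

x = 0.5 * rng.normal(0, 1, N)
r, z, test_err = np.tanh(x), 0.0, []
for t in range(30000):
    x += dt * (-x + J @ r + w_fb * z)
    r = np.tanh(x)
    z = w @ r
    if t < 20000 and t % 2 == 0:                 # RLS update during training only
        Pr = P @ r
        k = Pr / (1.0 + r @ Pr)
        P -= np.outer(k, Pr)
        w -= (z - f(t)) * k                      # push output toward the target
    if t >= 20000:                               # free-running test phase
        test_err.append((z - f(t)) ** 2)
print("test RMS error:", np.sqrt(np.mean(test_err)))
```

The rapid, stable convergence comes from updating while the feedback loop remains active, so errors never accumulate along the trajectory.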
Collapse
Affiliation(s)
- David Sussillo
- Department of Neuroscience, Department of Physiology and Cellular Biophysics, Columbia University College of Physicians and Surgeons, New York, NY 10032-2695, USA.
| | | |
Collapse
|
35
|
Faugeras O, Touboul J, Cessac B. A constructive mean-field analysis of multi-population neural networks with random synaptic weights and stochastic inputs. Front Comput Neurosci 2009; 3:1. [PMID: 19255631 PMCID: PMC2649202 DOI: 10.3389/neuro.10.001.2009] [Citation(s) in RCA: 55] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/04/2008] [Accepted: 01/26/2009] [Indexed: 11/13/2022] Open
Abstract
We deal with the problem of bridging the gap between two scales in neuronal modeling. At the first (microscopic) scale, neurons are considered individually and their behavior described by stochastic differential equations that govern the time variations of their membrane potentials. They are coupled by synaptic connections acting on their resulting activity, a nonlinear function of their membrane potential. At the second (mesoscopic) scale, interacting populations of neurons are described individually by similar equations. The equations describing the dynamical and the stationary mean-field behaviors are considered as functional equations on a set of stochastic processes. Using this new point of view allows us to prove that these equations are well-posed on any finite time interval and to provide a constructive method for effectively computing their unique solution. This method is proved to converge to the unique solution, and we characterize its complexity and convergence rate. We also provide partial results for the stationary problem on infinite time intervals. These results shed new light on neural mass models such as that of Jansen and Rit (1995): their dynamics appears as a coarse approximation of the much richer dynamics that emerges from our analysis. Our numerical experiments confirm that the framework we propose and the numerical methods we derive from it provide a new and powerful tool for the exploration of neural behaviors at different scales.
Collapse
|
36
|
Cortes JM, Torres JJ, Marro J. Control of neural chaos by synaptic noise. Biosystems 2007; 87:186-90. [PMID: 17084962 DOI: 10.1016/j.biosystems.2006.09.013] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2005] [Revised: 07/08/2006] [Accepted: 07/15/2006] [Indexed: 11/24/2022]
Abstract
We study neural automata, or neurobiologically inspired cellular automata, which exhibit chaotic itinerancy among the different stored patterns or memories. This is a consequence of activity-dependent synaptic fluctuations, which continuously destabilize the attractor and induce irregular hopping to other possible attractors. The nature of these irregularities depends on the dynamical details, namely, on the intensity of the synaptic noise and on the number of network sites that are synchronously updated at each time step. Varying these factors, different regimes occur, ranging from regular to chaotic dynamics. As a result, and in the absence of external agents, the chaotic behavior may turn regular after tuning the noise intensity. It is argued that a similar mechanism might underlie the self-control of chaos in natural systems.
Collapse
Affiliation(s)
- J M Cortes
- Institute Carlos I for Theoretical and Computational Physics, and Departamento de Electromagnetismo y Fisica de la Materia, University of Granada, E-18071 Granada, Spain.
| | | | | |
Collapse
|
37
|
Bornholdt S, Röhl T. Self-organized critical neural networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2003; 67:066118. [PMID: 16241315 DOI: 10.1103/physreve.67.066118] [Citation(s) in RCA: 38] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/02/2002] [Revised: 04/11/2003] [Indexed: 05/04/2023]
Abstract
A mechanism for self-organization of the degree of connectivity in model neural networks is studied. Network connectivity is regulated locally on the basis of an order parameter of the global dynamics, which is estimated from an observable at the single-synapse level. This principle is studied in a two-dimensional neural network with randomly wired asymmetric weights. In this class of networks, network connectivity is closely related to a phase transition between ordered and disordered dynamics. A slow topology change is imposed on the network through a local rewiring rule motivated by activity-dependent synaptic development: neighbor neurons whose activity is correlated on average develop a new connection, while uncorrelated neighbors tend to disconnect. As a result, robust self-organization of the network towards the order-disorder transition occurs. Convergence is independent of initial conditions, robust against thermal noise, and does not require fine-tuning of parameters.
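A stripped-down version of the rewiring principle (a toy, not the paper's two-dimensional lattice model): run the deterministic threshold dynamics for a while, then let a randomly chosen pair grow a connection if their activity was correlated and lose one if it was not. The threshold and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
N, epochs, run_len = 64, 2000, 60

# Asymmetric +-1 threshold units, initially sparse random wiring.
J = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.05)
np.fill_diagonal(J, 0)

for _ in range(epochs):
    # Short run of the parallel deterministic dynamics from a random state.
    s = rng.choice([-1, 1], N)
    trace = np.empty((run_len, N))
    for t in range(run_len):
        s = np.where(J @ s >= 0, 1, -1)
        trace[t] = s
    # Local rule: correlated pairs connect, uncorrelated pairs disconnect.
    i, j = rng.choice(N, 2, replace=False)
    corr = np.mean(trace[:, i] * trace[:, j])
    if abs(corr) > 0.8:
        J[i, j] = rng.normal()     # frozen/ordered activity: add a link
    else:
        J[i, j] = 0.0              # chaotic/uncorrelated activity: cut a link
print("self-organized mean in-degree:", (J != 0).sum() / N)
```

Frozen dynamics produce correlations that add links and push the network toward disorder; chaotic dynamics remove them, so connectivity settles near the order-disorder transition without parameter tuning.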
Collapse
|
38
|
Stacey WC, Durand DM. Noise and coupling affect signal detection and bursting in a simulated physiological neural network. J Neurophysiol 2002; 88:2598-611. [PMID: 12424297 DOI: 10.1152/jn.00223.2002] [Citation(s) in RCA: 40] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Signal detection in the CNS relies on a complex interaction between the numerous synaptic inputs to the detecting cells. Two effects, stochastic resonance (SR) and coherence resonance (CR), have been shown to affect signal detection in arrays of basic neuronal models. Here, an array of simulated hippocampal CA1 neurons was used to test the hypothesis that physiological noise and electrical coupling can interact to modulate signal detection in the CA1 region of the hippocampus. The array was tested using varying levels of coupling and noise with different input signals. Detection of a subthreshold signal in the network improved as the number of detecting cells increased and as coupling was increased, as predicted by previous SR studies; however, the response depended greatly on the noise characteristics present and at times deviated from SR predictions. Careful evaluation of noise characteristics may be necessary before drawing conclusions about the role of SR in complex systems such as physiological neurons. The coupled array fired synchronous, periodic bursts when presented with noise alone. The synchrony of this firing changed as a function of noise and coupling, as predicted by CR. The firing was very similar to certain models of epileptiform activity, leading to a discussion of CR as a possible simple model of epilepsy. A single neuron was unable to recruit its neighbors to a periodic signal unless the signal was very close to the synchronous bursting frequency. These findings, when viewed in comparison with physiological parameters in the hippocampus, suggest that both SR and CR can have significant effects on signal processing in vivo.
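Stochastic resonance itself can be demonstrated with an even simpler detector array than the CA1 model: threshold units driven by a common subthreshold sinusoid plus independent noise. In the sketch below (illustrative amplitudes; not the paper's conductance-based model), the pooled output's power at the signal frequency should peak at an intermediate noise level, which is the SR signature.

```python
import numpy as np

rng = np.random.default_rng(10)
dt, T, f_sig = 1e-3, 100000, 1.0
t = np.arange(T) * dt
signal = 0.8 * np.sin(2 * np.pi * f_sig * t)   # subthreshold: threshold is 1.0

def power_at_signal(noise_std, n_cells=50):
    """Relative power of the pooled threshold-crossing output at f_sig."""
    out = np.zeros(T)
    for _ in range(n_cells):
        x = signal + noise_std * rng.normal(size=T)
        out += (x > 1.0).astype(float)         # simple threshold detector
    spec = np.abs(np.fft.rfft(out - out.mean())) ** 2
    k = int(round(f_sig * T * dt))             # FFT bin of the signal frequency
    return spec[k] / spec.sum()

for s in [0.05, 0.1, 0.2, 0.4, 0.8, 1.6]:
    print(f"noise std {s:4.2f}  relative power at f_sig: {power_at_signal(s):.4f}")
```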
Collapse
Affiliation(s)
- William C Stacey
- Department of Biomedical Engineering, Case Western Reserve University, Ohio 44106, USA
| | | |
Collapse
|
39
|
Favorov OV, Hester JT, Lao R, Tommerdahl M. Spurious dynamics in somatosensory cortex. Behav Brain Res 2002; 135:75-82. [PMID: 12356437 DOI: 10.1016/s0166-4328(02)00158-4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Cortical networks are dynamical systems whose task is to process information. However, in addition to 'intended' dynamical behaviors, the sheer complexity of a cortical network's structure, regardless of its precise details, should generate additional 'unintended' dynamical behaviors. Dynamics observed in cortical network models and in the somatosensory cortex suggest that such spurious dynamical behaviors are likely to be pervasive but relatively simple, contributing to, rather than dominating, a network's response to stimuli. Spurious dynamics may be responsible for a variety of experimentally observed intriguing features of cortical dynamics. Because of their distributed origins and emergent nature, such dynamical features, while clearly identifiable, will resist attempts at identifying specific mechanisms to explain them. We describe some of the spurious dynamical phenomena associated with the somatosensory cortical response to brushing stimulation, to illustrate how spurious dynamics can affect neurons' functional properties, cortical stimulus representation and, ultimately, perception.
Collapse
Affiliation(s)
- O V Favorov
- School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, FL 32816, USA.
| | | | | | | |
Collapse
|
40
|
Cristescu CP, Stan C, Alexandroaei D. Control of chaos by random noise in a system of two coupled perturbed van der Pol oscillators modeling an electrical discharge plasma. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2002; 66:016602. [PMID: 12241495 DOI: 10.1103/physreve.66.016602] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/23/2002] [Indexed: 05/23/2023]
Abstract
The control of chaos in nonlinear systems by different methods remains a topic of high interest, particularly when, as in this work, it is achieved by random noise. We present the change from chaotic to periodic dynamics induced by random noise in a system of two coupled perturbed van der Pol oscillators, and compare it with the experimentally observed behavior of the double-discharge plasma that the system models. Methods specific to nonlinear analysis, such as phase portraits, Lyapunov exponents, and Fourier spectra, are used to demonstrate the changeover from chaotic to regular dynamics induced by random noise. A phase diagram determines the range of noise parameters corresponding to the lowest orders of an observed bifurcation sequence of 3 × 2^n type, and particulars of the transitions are presented.
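A generic version of such a system can be integrated directly. The sketch below is an illustrative noisy, harmonically driven, diffusively coupled van der Pol pair, not the authors' exact equations or parameter values: it integrates the pair with Euler-Maruyama and inspects the output spectrum; sweeping the noise intensity D is how one would look for the chaos-to-periodicity changeover.

```python
import numpy as np

rng = np.random.default_rng(8)
dt, steps = 1e-3, 200000
mu, w0, A, wd, c, D = 1.0, 1.0, 5.0, 2.466, 0.2, 0.05   # illustrative parameters

# State (x1, v1, x2, v2): driven van der Pol pair, diffusively coupled,
# with additive white noise of intensity D on the velocities.
x = np.array([0.1, 0.0, -0.1, 0.0])
out = np.empty(steps)
for t in range(steps):
    x1, v1, x2, v2 = x
    drive = A * np.cos(wd * t * dt)
    a1 = mu * (1 - x1**2) * v1 - w0**2 * x1 + drive + c * (x2 - x1)
    a2 = mu * (1 - x2**2) * v2 - w0**2 * x2 + drive + c * (x1 - x2)
    noise = np.sqrt(2 * D * dt) * rng.normal(size=2)
    x += dt * np.array([v1, a1, v2, a2])
    x[1] += noise[0]; x[3] += noise[1]
    out[t] = x[0]

# Spectral check: noise-induced regularization replaces broadband (chaotic)
# content with sharp lines in the power spectrum.
seg = out[steps // 2:] - out[steps // 2:].mean()
spec = np.abs(np.fft.rfft(seg)) ** 2
print("dominant frequency (Hz):", np.fft.rfftfreq(seg.size, dt)[spec.argmax()])
```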
Collapse
Affiliation(s)
- C P Cristescu
- Polytechnic University of Bucharest, Department of Physics, 313 Splaiul Independentei, 77206 Bucharest, Romania
| | | | | |
Collapse
|
41
|
Wieland C. Controlling chaos in higher dimensional maps with constant feedback: an analytical approach. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2002; 66:016205. [PMID: 12241459 DOI: 10.1103/physreve.66.016205] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/20/2001] [Indexed: 05/23/2023]
Abstract
We introduce two methods to control chaos in higher-dimensional discrete maps with constant feedback. It is shown analytically, for a general class of function vectors, that chaotic attractors can be converted into fixed-point attractors. Additionally, a method to choose an appropriate constant feedback is presented. The application of these methods does not require a priori knowledge of the system equations, since time-series information can be used. Desired periodic orbits can be accessed by varying the constant feedback. As an example, the methods are applied to the Hénon map.
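For the Hénon example, constant feedback amounts to adding a constant F to the map at each iteration. The sketch below scans a few values of F and reports a crude period estimate of the resulting attractor; which values convert chaos into a fixed point depends on the map parameters, so the scan is illustrative rather than a reproduction of the paper's analytical choice.

```python
import numpy as np

def henon_with_feedback(F, a=1.4, b=0.3, n_transient=5000, n_keep=64):
    """Iterate the Henon map with a constant feedback F added each step."""
    x, y = 0.1, 0.1
    for _ in range(n_transient):
        x, y = 1 - a * x * x + y + F, b * x
        if abs(x) > 1e6:
            return None                       # trajectory escaped to infinity
    orbit = []
    for _ in range(n_keep):
        x, y = 1 - a * x * x + y + F, b * x
        orbit.append(round(x, 6))
    return len(set(orbit))                    # crude period estimate

for F in [0.0, -0.05, -0.1, -0.2, -0.3]:
    p = henon_with_feedback(F)
    print(f"F = {F:5.2f}  distinct states in orbit: {p}")
```

A count of 1 signals a stabilized fixed point, small counts signal periodic orbits, and a count equal to n_keep indicates the orbit is still chaotic (or of long period).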
Collapse
|
42
|
Caroppo D, Mannarelli M, Nardulli G, Stramaglia S. Chaos in neural networks with a nonmonotonic transfer function. PHYSICAL REVIEW. E, STATISTICAL PHYSICS, PLASMAS, FLUIDS, AND RELATED INTERDISCIPLINARY TOPICS 1999; 60:2186-92. [PMID: 11970013 DOI: 10.1103/physreve.60.2186] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/23/1999] [Indexed: 11/07/2022]
Abstract
The time evolution of diluted neural networks with a nonmonotonic transfer function is analytically described by flow equations for macroscopic variables. The macroscopic dynamics shows a rich variety of behaviors: fixed points, periodicity, and chaos. We examine in detail the structure of the strange attractor, and in particular we study the main features of the stable and unstable manifolds, the hyperbolicity of the attractor, and the existence of homoclinic intersections. We also discuss the problem of the robustness of the chaos, and we prove that in the present model chaotic behavior is fragile (chaotic regions are densely intercalated with periodicity windows), in accordance with a recently discussed conjecture. Finally, we analyze the microscopic behavior, and in particular we examine the occurrence of damage spreading by studying the time evolution of two almost identical initial configurations. We show that for any choice of the parameters the two initial states remain microscopically distinct.
Collapse
Affiliation(s)
- D Caroppo
- Dipartimento Interateneo di Fisica and Istituto Nazionale di Fisica Nucleare, Sezione di Bari, via Amendola 173, 70126 Bari, Italy
| | | | | | | |
Collapse
|
43
|
Mirus KA, Sprott JC. Controlling chaos in low- and high-dimensional systems with periodic parametric perturbations. PHYSICAL REVIEW. E, STATISTICAL PHYSICS, PLASMAS, FLUIDS, AND RELATED INTERDISCIPLINARY TOPICS 1999; 59:5313-24. [PMID: 11969491 DOI: 10.1103/physreve.59.5313] [Citation(s) in RCA: 20] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/29/1998] [Indexed: 04/18/2023]
Abstract
The effect of applying a periodic perturbation to an accessible parameter of various chaotic systems is examined. Numerical results indicate that perturbation frequencies near the natural frequencies of the unstable periodic orbits of the chaotic systems can result in limit cycles for relatively small perturbations. Such perturbations can also control or significantly reduce the dimension of high-dimensional systems. Initial application to the control of fluctuations in a prototypical magnetic fusion plasma device will be reviewed.
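The recipe translates directly into a numerical experiment on any chaotic flow. Below is a minimal sketch on the Lorenz system (a chosen testbed; the paper treats general systems and a plasma device): the parameter r is modulated periodically and the largest Lyapunov exponent is tracked, which should drop toward zero or below when the perturbation frequency and amplitude hit a controlling window. Parameter values are illustrative.

```python
import numpy as np

def lorenz_lyapunov(eps, w, sigma=10.0, r0=28.0, b=8 / 3, dt=1e-3, T=200000):
    """Largest Lyapunov exponent of Lorenz with r(t) = r0 * (1 + eps*sin(w t))."""
    x = np.array([1.0, 1.0, 1.0])
    d = np.array([1e-8, 0.0, 0.0])            # tangent vector, renormalized below
    lyap = 0.0
    for t in range(T):
        r = r0 * (1 + eps * np.sin(w * t * dt))
        X, Y, Z = x
        f = np.array([sigma * (Y - X), r * X - Y - X * Z, X * Y - b * Z])
        Jac = np.array([[-sigma, sigma, 0.0],
                        [r - Z, -1.0, -X],
                        [Y, X, -b]])
        x += dt * f
        d += dt * (Jac @ d)
        n = np.linalg.norm(d)
        lyap += np.log(n / 1e-8)
        d *= 1e-8 / n
    return lyap / (T * dt)

for eps in [0.0, 0.05, 0.1, 0.2]:
    print(f"eps = {eps:4.2f}  lambda_max ~ {lorenz_lyapunov(eps, w=8.0):.3f}")
```

The drive frequency w = 8.0 is chosen near the dominant frequency of the unperturbed Lorenz oscillations, in the spirit of the abstract's observation that frequencies near those of the unstable periodic orbits are the effective ones.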
Collapse
Affiliation(s)
- K A Mirus
- Department of Physics, University of Wisconsin, Madison, Wisconsin 53706, USA
| | | |
Collapse
|
44
|
Bastolla U, Parisi G. Relaxation, closing probabilities and transition from oscillatory to chaotic attractors in asymmetric neural networks. J Phys A Math Gen 1998; 31. [DOI: 10.1088/0305-4470/31/20/003] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
|
45
|
Samuelides M, Doyon B, Cessac B, Quoy M, Dauce E. Self-organization and dynamics reduction in recurrent networks: stimulus presentation and learning. Neural Netw 1998; 11:521-533. [PMID: 12662827 DOI: 10.1016/s0893-6080(97)00131-7] [Citation(s) in RCA: 36] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
Abstract
Freeman's investigations of the olfactory bulb of the rabbit showed that its signal dynamics is chaotic, and that recognition of a learned stimulus is linked to a dimension reduction of the dynamics' attractor. In this paper we address the question of whether this behavior is specific to this particular architecture, or whether it is a general property. We study the dynamics of a non-convergent recurrent model, the random recurrent neural network. In that model a mean-field theory can be used to analyze the autonomous dynamics. We extend this approach with various observations on significant changes in the dynamical regime when static random stimuli are presented. Then we propose a Hebb-like learning rule, viewed as a self-organizing dynamical process that induces specific reactivity to one random stimulus. We numerically show the dynamics reduction during the learning and recognition processes and analyze it in terms of the dynamical repartition of local neural activity.
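The learning experiment can be caricatured in a few lines: expose a chaotic random network to one static random stimulus while applying a Hebb-like outer-product update, then compare the size of the stimulus-evoked attractor before and after. Everything below (rates, durations, the specific Hebbian increment) is an illustrative assumption, and whether the reduction shows up depends on those choices.

```python
import numpy as np

rng = np.random.default_rng(9)
N, g, dt, lr = 300, 2.0, 0.1, 2e-3
J = g * rng.normal(0, 1 / np.sqrt(N), (N, N))   # chaotic regime (g > 1)
stim = rng.normal(0, 1, N)                       # one static random stimulus

def attractor_size(J, I):
    """Mean temporal std of activity: large for chaos, small after reduction."""
    x = rng.normal(0, 1, N)
    xs = []
    for t in range(4000):
        x += dt * (-x + J @ np.tanh(x) + I)
        if t > 2000:
            xs.append(np.tanh(x).copy())
    return np.array(xs).std(axis=0).mean()

print("before learning:", attractor_size(J, stim))
# Hebb-like learning while the stimulus is shown: strengthen co-active pairs.
x = rng.normal(0, 1, N)
for t in range(4000):
    x += dt * (-x + J @ np.tanh(x) + stim)
    r = np.tanh(x)
    J += lr / N * np.outer(r, r)                 # Hebbian increment (sketch)
print("after learning: ", attractor_size(J, stim))
```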
Collapse
Affiliation(s)
- Manuel Samuelides
- ONERA-CERT/DTIM, 2 avenue Edouard Belin, BP 4025, 31055, Toulouse, France
| | | | | | | | | |
Collapse
|
46
|
Kivshar YS, Rödelsperger F, Benner H. Suppression of chaos by nonresonant parametric perturbations. PHYSICAL REVIEW. E, STATISTICAL PHYSICS, PLASMAS, FLUIDS, AND RELATED INTERDISCIPLINARY TOPICS 1994; 49:319-324. [PMID: 9961220 DOI: 10.1103/physreve.49.319] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/22/2023]
|