1. Torres JJ, Marro J. Physics Clues on the Mind Substrate and Attributes. Front Comput Neurosci 2022; 16:836532. PMID: 35465268; PMCID: PMC9026167; DOI: 10.3389/fncom.2022.836532.
Abstract
The last decade has witnessed remarkable progress in our understanding of the brain, based mainly on the scrutiny and modeling of how activity is transmitted among neurons across dynamic synapses. A main conclusion, thus far, is that essential features of the mind rely on collective phenomena that emerge from the cooperative interaction of many neurons which, mediated by other cells, form a complex network whose details constantly adapt to its activity and surroundings. In parallel, theoretical and computational studies have been developed to understand many natural and artificial complex systems; these have faithfully explained their striking emergent features and made precise the role of the interaction dynamics and other conditions behind the different collective phenomena such systems display. Focusing on promising ideas that arise when comparing these neurobiological and physical studies, the present perspective article briefly reviews such scenarios, looking for clues about how high-level cognitive processes such as consciousness, intelligence, and identity can emerge. We thus show that basic concepts of physics, such as dynamical phases and non-equilibrium phase transitions, become quite relevant to brain activity, which is shaped by factors at the subcellular, cellular, and network levels. We also show how these transitions depend on details of the mechanism by which stimuli are processed against a noisy background and, most importantly, that they can be detected in familiar electroencephalogram (EEG) recordings. We therefore associate the existence of such phases, which reveal a brain operating at (non-equilibrium) criticality, with the emergence of the most interesting phenomena during memory tasks.
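The kind of criticality analysis mentioned here is often illustrated by extracting avalanche-like events from a recording and inspecting their size statistics. The sketch below is a generic, hypothetical example of that idea on a surrogate signal; it is not the authors' pipeline, and the surrogate signal and the one-standard-deviation threshold are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate "EEG-like" signal: smoothed noise standing in for a real recording.
signal = np.convolve(rng.standard_normal(100_000), np.ones(20) / 20, mode="same")

# An "avalanche" here is a maximal run of consecutive samples exceeding a
# threshold (assumed: mean + 1 standard deviation).
above = signal > signal.mean() + signal.std()
edges = np.diff(above.astype(int))
starts = np.flatnonzero(edges == 1) + 1
ends = np.flatnonzero(edges == -1) + 1
if above[0]:
    starts = np.r_[0, starts]
if above[-1]:
    ends = np.r_[ends, above.size]
sizes = ends - starts  # avalanche durations in samples

# Near criticality one would look for heavy-tailed (approximately power-law)
# size statistics; this surrogate, by construction, has a fast-decaying tail.
print(f"{sizes.size} avalanches; mean size {sizes.mean():.1f}, max size {sizes.max()}")
```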
2. Millán AP, Torres JJ, Marro J. How Memory Conforms to Brain Development. Front Comput Neurosci 2019; 13:22. PMID: 31057385; PMCID: PMC6477510; DOI: 10.3389/fncom.2019.00022.
Abstract
Nature exhibits countless examples of adaptive networks, whose topology constantly evolves, coupled to the activity generated by their function. The brain is an illustrative example of a system in which a dynamic complex network develops through the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed model of the developing brain to study how the mechanisms responsible for the evolution of brain structure affect, and are affected by, memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on the local synaptic currents at the respective neurons, in addition to global mechanisms depending on the mean connectivity. In this way a feedback loop between "form" and "function" spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity, or memories. In particular, we report that, as a consequence of this feedback loop, oscillations of the system's activity among the memorized patterns can occur, depending on parameters, reminiscent of dynamical processes in the mind. Such oscillations originate in the destabilization of memory attractors by the pruning dynamics, which induces a kind of structural disorder, or noise, in the system on a long-term scale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This intriguing new oscillatory behavior is associated only with long-term synaptic mechanisms during the network evolution dynamics; it does not depend on short-term synaptic processes, assumed in other studies, which are not present in our model.
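The form-function feedback loop described in this abstract can be caricatured with a Hopfield-style attractor network whose synapses are gradually pruned while patterns are retrieved. The sketch below is a minimal illustration of that idea, not the authors' model; the Hebbian storage rule, the 1% pruning rate, and the smallest-local-current pruning criterion are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5                        # neurons, stored patterns

# Hebbian storage of P random binary (+/-1) patterns.
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)
alive = ~np.eye(N, dtype=bool)       # synapse existence mask (no self-connections)

def retrieve(W, cue, steps=30):
    """Simple retrieval: repeated sign updates starting from a noisy cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s + 1e-12)
    return s

for epoch in range(20):
    cue = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])  # 10% flipped bits
    s = retrieve(W * alive, cue)
    overlap = (s @ patterns.T) / N   # overlap with each stored pattern

    # Illustrative pruning rule (assumption): synapses carrying the smallest
    # absolute local current |W_ij * s_j| are removed first.
    current = np.abs(W * s[None, :])
    current[~alive] = np.inf         # ignore already-pruned synapses
    k = int(0.01 * alive.sum())      # prune 1% of surviving synapses per epoch
    idx = np.unravel_index(np.argsort(current, axis=None)[:k], W.shape)
    alive[idx] = False

    print(f"epoch {epoch:2d}  surviving synapses {alive.sum():6d}  "
          f"overlaps {np.round(overlap, 2)}")
```

As pruning proceeds, the retrieved state eventually stops matching the cued pattern, mimicking the destabilization of memory attractors discussed in the abstract.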
Affiliation(s)
- Joaquín J. Torres
- Institute “Carlos I” for Theoretical and Computational Physics, University of Granada, Granada, Spain
3. Li W, Ovchinnikov IV, Chen H, Wang Z, Lee A, Lee H, Cepeda C, Schwartz RN, Meier K, Wang KL. A Basic Phase Diagram of Neuronal Dynamics. Neural Comput 2018; 30:2418-2438. PMID: 29894659; DOI: 10.1162/neco_a_01103.
Abstract
The extreme complexity of the brain has long attracted the attention of neuroscientists and other researchers. More recently, neuromorphic hardware has matured into a powerful new tool for studying neuronal dynamics. Here, we study neuronal dynamics using different settings on a neuromorphic chip built with flexible parameters of neuron models. The distinctive feature of our setting is the introduction of a weak-noise environment into a network of leaky integrate-and-fire (LIF) neurons. We observed three different types of collective neuronal activity, or phases, separated by sharp boundaries, or phase transitions. From this, we construct a rudimentary phase diagram of neuronal dynamics and demonstrate that a noise-induced chaotic phase (N-phase), which is dominated by neuronal avalanche activity (intermittent, aperiodic neuron firing), emerges in the presence of noise, and that its width grows with the noise intensity. The dynamics can be manipulated in this N-phase. Our results, and the comparison with clinical data, are consistent with the literature and with our previous work showing that the healthy brain must reside in the N-phase. We argue that the brain phase diagram, with further refinement, may be used for the diagnosis and treatment of mental disease, and we also suggest that the dynamics may be manipulated to serve as a means of new forms of information processing (e.g., for optimization). Neuromorphic chips similar to the one we used, but with a variety of neuron models, may further enhance the understanding of human brain function and accelerate the development of neuroscience research.
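A software caricature of this setting, rather than the neuromorphic chip itself, is a noise-driven network of LIF neurons whose population activity is compared across noise levels. In the hedged sketch below, all parameters (time constant, weight scale, noise intensities) are assumptions chosen only to make the qualitative effect of noise visible.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, dt = 100, 2000, 1.0             # neurons, timesteps, ms per step
tau, v_th, v_reset = 20.0, 1.0, 0.0   # membrane time constant, threshold, reset
W = rng.normal(0.0, 0.12 / np.sqrt(N), size=(N, N))  # random recurrent weights
np.fill_diagonal(W, 0.0)

def simulate(noise_sigma):
    """Leaky integrate-and-fire network driven only by white noise."""
    v = np.zeros(N)
    fired = np.zeros(N, dtype=bool)
    spikes_per_step = np.zeros(T)
    for t in range(T):
        drive = W @ fired + noise_sigma * rng.standard_normal(N)
        v += dt * (-v / tau) + drive   # leaky integration of recurrent + noise input
        fired = v >= v_th
        spikes_per_step[t] = fired.sum()
        v[fired] = v_reset             # reset neurons that fired
    return spikes_per_step

for sigma in (0.0, 0.05, 0.15):
    spikes = simulate(sigma)
    print(f"noise={sigma:.2f}  mean population spikes/step={spikes.mean():.2f}  "
          f"fraction of silent steps={(spikes == 0).mean():.2f}")
```

With zero noise the network stays silent; increasing the noise intensity produces intermittent population firing, loosely echoing the noise-induced regime discussed in the abstract.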
Affiliation(s)
- Wenyuan Li
- Department of Electrical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
- Igor V Ovchinnikov
- Department of Electrical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
- Honglin Chen
- Department of Mathematics, UCLA, Los Angeles, CA 90095, U.S.A.
- Zhe Wang
- Department of Mechanical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
- Albert Lee
- Department of Electrical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
- Houchul Lee
- Department of Electrical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
- Carlos Cepeda
- David Geffen School of Medicine, UCLA, Los Angeles, CA 90095, U.S.A.
- Robert N Schwartz
- Department of Electrical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
- Karlheinz Meier
- Kirchhoff Institute for Physics, Heidelberg University, 69120 Heidelberg, Germany
- Kang L Wang
- Department of Electrical Engineering, UCLA, Los Angeles, CA 90095, U.S.A.
5. Livi L, Bianchi FM, Alippi C. Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization. IEEE Trans Neural Netw Learn Syst 2018; 29:706-717. PMID: 28092580; DOI: 10.1109/tnnls.2016.2644268.
Abstract
It is widely accepted that the computational capability of recurrent neural networks (RNNs) is maximized on the so-called "edge of criticality." Once the network operates in this configuration, it performs efficiently on a specific application both in terms of (1) low prediction error and (2) high short-term memory capacity. Since the behavior of recurrent networks is strongly influenced by the particular input signal driving the dynamics, a universal, application-independent method for determining the edge of criticality is still missing. In this paper, we address this issue by proposing a theoretically motivated, unsupervised method based on Fisher information for determining the edge of criticality in RNNs. It is proved that Fisher information is maximized for (finite-size) systems operating in such critical regions. However, Fisher information is notoriously difficult to compute and requires the analytic form of the probability density function governing the system behavior. This paper takes advantage of a recently developed nonparametric estimator of the Fisher information matrix and provides a method to determine the critical region of echo state networks (ESNs), a particular class of recurrent networks. The considered control parameters, which indirectly affect the ESN performance, are explored to identify those configurations lying on the edge of criticality and, as such, maximizing Fisher information and computational performance. Experimental results on benchmarks and real-world data demonstrate the effectiveness of the proposed method.
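A common, simpler stand-in for the Fisher-information criterion used in this paper is to sweep a control parameter of an echo state network and track its short-term memory capacity, which also tends to peak near the edge of criticality. The sketch below performs that swapped-in analysis; it does not implement the paper's nonparametric Fisher-information estimator, and the reservoir size, input scaling, and lag range are assumed values.

```python
import numpy as np

rng = np.random.default_rng(3)
N, T, washout, max_lag = 200, 3000, 200, 30

def memory_capacity(rho):
    """Short-term memory capacity of an ESN whose reservoir is rescaled to
    spectral radius rho; capacity typically peaks near the edge of stability."""
    W = rng.normal(size=(N, N))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius
    w_in = rng.uniform(-0.5, 0.5, size=N)
    u = rng.uniform(-1, 1, size=T)                    # random scalar input

    # Collect reservoir states driven by the input signal.
    x = np.zeros(N)
    X = np.zeros((T, N))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])
        X[t] = x

    # Memory capacity = sum over lags k of R^2 between u(t-k) and the best
    # linear readout of u(t-k) from the reservoir state x(t).
    cap = 0.0
    for k in range(1, max_lag + 1):
        Xk, yk = X[washout:], np.roll(u, k)[washout:]
        w_out, *_ = np.linalg.lstsq(Xk, yk, rcond=None)
        cap += np.corrcoef(Xk @ w_out, yk)[0, 1] ** 2
    return cap

for rho in (0.5, 0.8, 0.95, 1.05, 1.3):
    print(f"spectral radius {rho:.2f}  memory capacity {memory_capacity(rho):.2f}")
```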
6. Sase T, Katori Y, Komuro M, Aihara K. Bifurcation Analysis on Phase-Amplitude Cross-Frequency Coupling in Neural Networks with Dynamic Synapses. Front Comput Neurosci 2017; 11:18. PMID: 28424606; PMCID: PMC5371682; DOI: 10.3389/fncom.2017.00018.
Abstract
We investigate a discrete-time network model composed of excitatory and inhibitory neurons and dynamic synapses, with the aim of revealing the dynamical properties behind oscillatory phenomena possibly related to brain function. We use a stochastic neural network model to derive the corresponding macroscopic mean-field dynamics, and subsequently analyze the dynamical properties of the network. In addition to slow and fast oscillations arising from the excitatory and inhibitory networks, respectively, we show that the interaction between these two networks generates phase-amplitude cross-frequency coupling (CFC), in which multiple frequency components coexist and the amplitude of the fast oscillation is modulated by the phase of the slow oscillation. Furthermore, we clarify the detailed properties of these oscillatory phenomena by applying bifurcation analysis to the mean-field model, and accordingly show that intermittent and continuous CFC can be characterized by an aperiodic orbit on a closed curve and one on a torus, respectively. These two CFC modes switch depending on the coupling strength from the excitatory to the inhibitory network, via a saddle-node cycle bifurcation of a one-dimensional torus in the map (MT1SNC), and may be associated with the function of multi-item representation. We believe that the present model has potential for studying possible functional roles of phase-amplitude CFC in the cerebral cortex.
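Phase-amplitude CFC of the kind analyzed here is routinely quantified empirically with a Hilbert-transform-based coupling measure. The sketch below applies a standard mean-vector-length measure to a synthetic signal with built-in coupling; it illustrates the phenomenon itself rather than the paper's mean-field or bifurcation analysis, and the frequency bands and modulation depth are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs, dur = 1000.0, 10.0                      # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(4)

# Synthetic signal with built-in CFC: a 40 Hz "fast" rhythm whose amplitude
# is modulated by the phase of a 5 Hz "slow" rhythm, plus noise.
slow = np.sin(2 * np.pi * 5 * t)
fast = (1 + 0.8 * slow) * np.sin(2 * np.pi * 40 * t)
x = slow + fast + 0.5 * rng.standard_normal(t.size)

def bandpass(sig, lo, hi):
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, sig)

phase = np.angle(hilbert(bandpass(x, 3, 7)))    # phase of the slow band
amp = np.abs(hilbert(bandpass(x, 30, 50)))      # amplitude envelope of the fast band

# Mean-vector-length coupling measure: large when the fast amplitude is
# systematically concentrated at particular phases of the slow rhythm.
mvl = np.abs(np.mean(amp * np.exp(1j * phase)))
print("phase-amplitude coupling (mean vector length):", round(mvl, 3))
```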
Affiliation(s)
- Takumi Sase
- Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
- Yuichi Katori
- Institute of Industrial Science, The University of Tokyo, Tokyo, Japan; The School of Systems Information Science, Future University Hakodate, Hokkaido, Japan
- Motomasa Komuro
- Center for Fundamental Education, Teikyo University of Science, Yamanashi, Japan
- Kazuyuki Aihara
- Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan; Institute of Industrial Science, The University of Tokyo, Tokyo, Japan
7. Brochini L, de Andrade Costa A, Abadi M, Roque AC, Stolfi J, Kinouchi O. Phase transitions and self-organized criticality in networks of stochastic spiking neurons. Sci Rep 2016; 6:35831. PMID: 27819336; PMCID: PMC5098137; DOI: 10.1038/srep35831.
Abstract
Phase transitions and critical behavior are crucial issues in both theoretical and experimental neuroscience. We report analytic and computational results on phase transitions and self-organized criticality (SOC) in networks of general stochastic neurons. The stochastic neuron has a firing probability given by a smooth monotonic function Φ(V) of the membrane potential V, rather than a sharp firing threshold. We find that such networks can operate in several dynamic regimes (phases) depending on the average synaptic weight and the shape of the firing function Φ. In particular, we encounter both continuous and discontinuous phase transitions to absorbing states. At the critical boundary of the continuous transition, neuronal avalanches occur whose size and duration distributions follow power laws, as observed in biological neural networks. We also propose and test a new mechanism to produce SOC: the use of dynamic neuronal gains - a form of short-term plasticity probably located at the axon initial segment (AIS) - instead of depressing synapses at the dendrites (as previously studied in the literature). The new self-organization mechanism produces a slightly supercritical state, which we call SOSC, in accordance with some intuitions of Alan Turing.
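The model class described here can be sketched directly: neurons fire stochastically with probability Φ(V), and per-neuron gains drop after each spike and slowly recover. In the minimal sketch below, the specific choice of Φ (a clipped rectified-linear function), the exponential weight distribution, and the gain time constants are assumptions, not the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 500, 2000
W_mean = 1.2                          # average total synaptic weight (control parameter)
W = rng.exponential(W_mean / N, size=(N, N))
np.fill_diagonal(W, 0.0)

def phi(v, gain):
    """Smooth, saturating firing probability Phi(V): an assumed choice, a
    rectified-linear function clipped to [0, 1] and scaled by a per-neuron gain."""
    return np.clip(gain * np.maximum(v, 0.0), 0.0, 1.0)

v = rng.uniform(0, 1, size=N)         # membrane potentials
gain = np.ones(N)                     # dynamic neuronal gains
tau_g, u_g = 500.0, 0.2               # gain recovery time and per-spike drop (assumed)
activity = np.zeros(T)

for t in range(T):
    fired = rng.random(N) < phi(v, gain)   # stochastic firing with probability Phi(V)
    activity[t] = fired.mean()
    v = W @ fired                          # potentials rebuilt from last step's spikes
    # Dynamic gains: slow recovery toward 1, sharp drop after a spike; this is a
    # simple stand-in for the self-organizing mechanism discussed in the paper.
    gain += (1.0 - gain) / tau_g
    gain[fired] -= u_g * gain[fired]

print("mean population activity over the last 500 steps:",
      round(activity[-500:].mean(), 3))
```

Sweeping W_mean in such a toy model moves the network between silent and active regimes, which is the kind of phase structure the abstract describes.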
Affiliation(s)
- Ludmila Brochini
- Universidade de São Paulo, Departamento de Estatística-IME, São Paulo-SP, 05508-090, Brazil
- Miguel Abadi
- Universidade de São Paulo, Departamento de Estatística-IME, São Paulo-SP, 05508-090, Brazil
- Antônio C. Roque
- Universidade de São Paulo, Departamento de Física-FFCLRP, Ribeirão Preto-SP, 14040-901, Brazil
- Jorge Stolfi
- Universidade de Campinas, Instituto de Computação, Campinas-SP, 13083-852, Brazil
- Osame Kinouchi
- Universidade de São Paulo, Departamento de Física-FFCLRP, Ribeirão Preto-SP, 14040-901, Brazil
8. Siettos C, Starke J. Multiscale modeling of brain dynamics: from single neurons and networks to mathematical tools. Wiley Interdiscip Rev Syst Biol Med 2016; 8:438-58. PMID: 27340949; DOI: 10.1002/wsbm.1348.
Abstract
The extreme complexity of the brain naturally requires mathematical modeling approaches on a large variety of scales; the spectrum ranges from single-neuron dynamics, over the behavior of groups of neurons, to neuronal network activity. Thus, the connection from the microscopic scale (single-neuron activity) to macroscopic behavior (the emergent collective dynamics), and vice versa, is key to understanding the brain in its complexity. In this work, we attempt a review of a wide range of approaches, from the modeling of single-neuron dynamics to machine learning. The models include biophysical as well as data-driven phenomenological models: Hodgkin-Huxley, FitzHugh-Nagumo, coupled oscillators (Kuramoto oscillators, Rössler oscillators, and the Hindmarsh-Rose neuron), integrate-and-fire, networks of neurons, and neural field equations. In addition to the mathematical models, important mathematical methods for multiscale modeling and for the reconstruction of causal connectivity are sketched. These methods include linear and nonlinear tools from statistics, data analysis, and time-series analysis, up to differential equations, dynamical systems, and bifurcation theory, including Granger causal connectivity analysis, phase-synchronization connectivity analysis, principal component analysis (PCA), independent component analysis (ICA), manifold learning algorithms such as ISOMAP and diffusion maps, and equation-free techniques.
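As a concrete taste of the single-neuron models surveyed here, the sketch below integrates the FitzHugh-Nagumo equations with a simple Euler scheme; the parameter values are a common textbook choice rather than ones prescribed by the review.

```python
import numpy as np

# FitzHugh-Nagumo neuron:
#   dv/dt = v - v^3/3 - w + I_ext   (fast, excitable variable)
#   dw/dt = eps * (v + a - b*w)     (slow recovery variable)
a, b, eps, I_ext = 0.7, 0.8, 0.08, 0.5   # assumed textbook parameters
dt, steps = 0.05, 20000

v, w = -1.0, -0.5
trace = np.empty(steps)
for t in range(steps):
    dv = v - v**3 / 3 - w + I_ext
    dw = eps * (v + a - b * w)
    v, w = v + dt * dv, w + dt * dw
    trace[t] = v

# Count spikes as upward crossings of v = 1.0 to confirm sustained oscillations
# for this value of the external drive I_ext.
crossings = np.sum((trace[:-1] < 1.0) & (trace[1:] >= 1.0))
print(f"spikes in {steps * dt:.0f} time units: {crossings}")
```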
Affiliation(s)
- Constantinos Siettos
- School of Applied Mathematics and Physical Sciences, National Technical University of Athens, Athens, Greece
- Jens Starke
- School of Mathematical Sciences, Queen Mary University of London, London, UK