1
Tian Y, Tan Z, Hou H, Li G, Cheng A, Qiu Y, Weng K, Chen C, Sun P. Theoretical foundations of studying criticality in the brain. Netw Neurosci 2022; 6:1148-1185. [PMID: 38800464] [PMCID: PMC11117095] [DOI: 10.1162/netn_a_00269]
Abstract
Criticality is hypothesized as a physical mechanism underlying efficient transitions between cortical states and the brain's remarkable information-processing capacity. While considerable evidence generally supports this hypothesis, nonnegligible controversies persist regarding the ubiquity of criticality in neural dynamics and its role in information processing. Validity issues frequently arise when identifying potential brain criticality in empirical data. Moreover, the functional benefits implied by brain criticality are frequently misconceived or unduly generalized. These problems stem from the nontriviality and immaturity of the physical theories that analytically derive brain criticality and the statistical techniques that estimate it from empirical data. To help solve these problems, we present a systematic review and reformulate the foundations of studying brain criticality, namely ordinary criticality (OC), quasi-criticality (qC), self-organized criticality (SOC), and self-organized quasi-criticality (SOqC), using the terminology of neuroscience. We offer accessible explanations of the physical theories and statistical techniques of brain criticality, providing step-by-step derivations to characterize neural dynamics as a physical system with avalanches. We summarize error-prone details and existing limitations in brain criticality analysis and suggest possible solutions. Moreover, we present a forward-looking perspective on how optimizing these foundations can deepen our understanding of various neuroscience questions.
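The estimation step this review flags as error-prone, fitting a power-law exponent to empirical avalanche sizes, is usually done by maximum likelihood rather than least-squares on log-log histograms. A minimal Clauset-style sketch (generic, not the review's own code; the generator and parameters are illustrative):

```python
import math
import random

def powerlaw_mle_alpha(sizes, s_min=1):
    """Clauset-style MLE for p(s) ~ s^(-alpha), s >= s_min, using the
    continuous approximation with a -0.5 offset for discrete data."""
    tail = [s for s in sizes if s >= s_min]
    return 1.0 + len(tail) / sum(math.log(s / (s_min - 0.5)) for s in tail)

def sample_powerlaw(alpha, s_min, n, rng):
    """Approximate discrete power-law samples: inverse-transform sampling
    of the continuous law, then rounding to the nearest integer."""
    return [int(round((s_min - 0.5) * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))))
            for _ in range(n)]

rng = random.Random(0)
sizes = sample_powerlaw(1.5, 1, 20000, rng)
alpha_hat = powerlaw_mle_alpha(sizes)  # should land near the true 1.5
```

The rounding step makes the continuous MLE slightly biased at small `s_min`, one of the subtleties this review discusses.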
Affiliation(s)
- Yang Tian: Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China; Laboratory of Advanced Computing and Storage, Central Research Institute, 2012 Laboratories, Huawei Technologies Co. Ltd., Beijing, China
- Zeren Tan: Institute for Interdisciplinary Information Science, Tsinghua University, Beijing, China
- Hedong Hou: UFR de Mathématiques, Université de Paris, Paris, France
- Guoqi Li: Institute of Automation, Chinese Academy of Science, Beijing, China; University of Chinese Academy of Science, Beijing, China
- Aohua Cheng: Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Yike Qiu: Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Kangyu Weng: Tsien Excellence in Engineering Program, School of Aerospace Engineering, Tsinghua University, Beijing, China
- Chun Chen: Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
- Pei Sun: Department of Psychology & Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
2
Heiney K, Huse Ramstad O, Fiskum V, Christiansen N, Sandvig A, Nichele S, Sandvig I. Criticality, Connectivity, and Neural Disorder: A Multifaceted Approach to Neural Computation. Front Comput Neurosci 2021; 15:611183. [PMID: 33643017] [PMCID: PMC7902700] [DOI: 10.3389/fncom.2021.611183]
Abstract
It has been hypothesized that the brain optimizes its capacity for computation by self-organizing to a critical point. The dynamical state of criticality is achieved by striking a balance such that activity can effectively spread through the network without overwhelming it, and it is commonly identified in neuronal networks by observing the behavior of cascades of network activity termed "neuronal avalanches." The dynamic activity that occurs in neuronal networks is closely intertwined with how the elements of the network are connected and how they influence each other's functional activity. In this review, we highlight how studying criticality from a broad perspective that integrates concepts from physics, experimental and theoretical neuroscience, and computer science can provide a greater understanding of the mechanisms that drive networks to criticality and of how their disruption may manifest in different disorders. First, integrating graph theory into experimental studies on criticality, as is becoming more common in theoretical and modeling studies, would provide insight into the kinds of network structures that support criticality in networks of biological neurons. Furthermore, plasticity mechanisms play a crucial role in shaping these neural structures, both in terms of homeostatic maintenance and learning. Both network structure and plasticity have been studied fairly extensively in theoretical models, but much work remains to bridge the gap between theoretical and experimental findings. Finally, information-theoretic approaches can provide more concrete evidence of a network's computational capabilities. Approaching neural dynamics with all these facets in mind has the potential to provide a greater understanding of what goes wrong in neural disorders. Criticality analysis therefore holds potential to identify disruptions to healthy dynamics, provided that robust methods and approaches are used.
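A standard operational check of the critical balance described above is the branching parameter sigma, the average number of descendants per active neuron, which equals 1 at criticality. A minimal Galton-Watson-style sketch, not taken from this review (the offspring law and parameters are illustrative):

```python
import random

def simulate_cascade(sigma, start, n_steps, rng):
    """Galton-Watson-style cascade: each active unit leaves Binomial(2, sigma/2)
    descendants, so the mean offspring count equals sigma (needs sigma <= 2)."""
    counts = [start]
    for _ in range(n_steps):
        nxt = sum((rng.random() < sigma / 2) + (rng.random() < sigma / 2)
                  for _ in range(counts[-1]))
        counts.append(nxt)
        if nxt == 0:
            break
    return counts

# Pool generation-to-generation ratios from several critical cascades;
# their mean is an unbiased estimate of the branching parameter sigma.
rng = random.Random(42)
ratios = []
for _ in range(20):
    counts = simulate_cascade(1.0, 100, 60, rng)
    ratios.extend(counts[t + 1] / counts[t]
                  for t in range(len(counts) - 1) if counts[t] > 0)
sigma_hat = sum(ratios) / len(ratios)
```

Subcritical cascades (sigma < 1) die out quickly, which is why the same estimator applied to pathological dynamics can flag departures from criticality.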
Affiliation(s)
- Kristine Heiney: Department of Computer Science, Oslo Metropolitan University, Oslo, Norway; Department of Computer Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Ola Huse Ramstad: Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Vegard Fiskum: Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Nicholas Christiansen: Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Axel Sandvig: Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway; Department of Clinical Neuroscience, Umeå University Hospital, Umeå, Sweden; Department of Neurology, St. Olav's Hospital, Trondheim, Norway
- Stefano Nichele: Department of Computer Science, Oslo Metropolitan University, Oslo, Norway; Department of Holistic Systems, Simula Metropolitan, Oslo, Norway
- Ioanna Sandvig: Department of Neuromedicine and Movement Science, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
3
Jung N, Le QA, Lee KE, Lee JW. Avalanche size distribution of an integrate-and-fire neural model on complex networks. Chaos 2020; 30:063118. [PMID: 32611110] [DOI: 10.1063/5.0008767]
Abstract
We considered the neural avalanche dynamics of a modified integrate-and-fire model on complex networks: a fully connected network, a random network, a small-world network, and a scale-free network. We observed self-organized criticality of the neural model on these complex networks. The probability distributions of the avalanche size and lifetime follow a power law at the critical synaptic strength. Neuronal dynamics on a complex network are not universal: the critical exponents of the avalanche dynamics depend on the structure of the complex network, and we observed that they deviate from the mean-field values.
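The paper's model and parameters are not reproduced here, but the general idea of measuring avalanche sizes in an integrate-and-fire model on a network can be sketched generically (threshold, coupling strength, and drive below are illustrative; note that each spike dissipates 1 - k*strength units, so every cascade terminates):

```python
import random

def run_avalanches(n=200, k=4, strength=0.24, drive=0.1, n_drives=5000, seed=1):
    """Integrate-and-fire avalanches on a random directed network.
    Each external drive bumps one neuron's potential; neurons crossing
    threshold 1.0 fire, subtract 1.0, and send `strength` to each of their
    k targets. Returns the avalanche size (total spikes) per drive."""
    rng = random.Random(seed)
    targets = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    v = [rng.random() for _ in range(n)]  # random initial potentials
    sizes = []
    for _ in range(n_drives):
        v[rng.randrange(n)] += drive
        size = 0
        active = [i for i in range(n) if v[i] >= 1.0]
        while active:
            fired = set()
            for i in active:
                size += 1
                v[i] -= 1.0
                for j in targets[i]:
                    v[j] += strength
                    if v[j] >= 1.0:
                        fired.add(j)
            active = [i for i in fired if v[i] >= 1.0]
        sizes.append(size)
    return sizes

sizes = run_avalanches()
```

Pushing k*strength toward 1 (conservative coupling) moves such models toward the critical, power-law regime; the network topology can then be swapped out to probe the non-universality the abstract reports.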
Affiliation(s)
- Nam Jung: Department of Physics, Inha University, Incheon 22212, Korea
- Quang Anh Le: Department of Physics, Inha University, Incheon 22212, Korea
- Kyoung-Eun Lee: Ecology and Future Research Institute, 45 Dusilo, Geumjeong-gu, Busan 46228, Korea
- Jae Woo Lee: Department of Physics, Inha University, Incheon 22212, Korea
4
Wang R, Fan Y, Wu Y. Spontaneous electromagnetic induction promotes the formation of economical neuronal network structure via self-organization process. Sci Rep 2019; 9:9698. [PMID: 31273270] [PMCID: PMC6609776] [DOI: 10.1038/s41598-019-46104-z]
Abstract
Developed through evolution, the brain's neural system self-organizes into an economical and dynamic network structure, modulated by repetitive neuronal firing activities through synaptic plasticity. These highly variable electric activities inevitably produce a spontaneous magnetic field, which in turn significantly modulates the dynamic neuronal behaviors in the brain. However, how this spontaneous electromagnetic induction affects the self-organization process, and what role it plays in the formation of an economical neuronal network, has not been reported. Here, we investigate the effects of spontaneous electromagnetic induction on the self-organization process and the topological properties of the self-organized neuronal network. We first find that spontaneous electromagnetic induction slows down the self-organization process of the neuronal network by decreasing neuronal excitability. In addition, spontaneous electromagnetic induction can result in a more homogeneous directed-weighted network structure with weaker causal relationships and lower modularity, which supports weaker neuronal synchronization. Furthermore, we show that spontaneous electromagnetic induction can reconfigure synaptic connections to optimize the economical connectivity pattern of self-organized neuronal networks, endowing them with enhanced local and global efficiency from the perspective of graph theory. Our results reveal the critical role of spontaneous electromagnetic induction in the formation of an economical self-organized neuronal network and are also helpful for understanding the evolution of the brain's neural system.
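The local and global efficiency measures invoked above are standard graph-theoretic quantities. A minimal implementation for unweighted, undirected graphs given as {node: neighbor-list} dicts (a simplification of the paper's directed-weighted setting):

```python
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src in an unweighted graph {node: neighbor-list}."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean inverse shortest-path length over ordered node pairs."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = sum(1.0 / d for src in nodes
                for node, d in bfs_dist(adj, src).items() if node != src)
    return total / (n * (n - 1))

def local_efficiency(adj):
    """Mean global efficiency of each node's neighborhood subgraph."""
    acc = 0.0
    for i in adj:
        nb = adj[i]
        sub = {u: [v for v in adj[u] if v in nb] for u in nb}
        acc += global_efficiency(sub)
    return acc / len(adj)

# Two toy graphs: a 4-cycle and the complete graph on 4 nodes.
ring4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
complete4 = {i: [j for j in range(4) if j != i] for i in range(4)}
```

An "economical" network in this sense keeps both efficiencies high at low wiring cost; the complete graph scores 1.0 on both, while the sparse ring trades local efficiency away entirely.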
Affiliation(s)
- Rong Wang: College of Science, Xi'an University of Science and Technology, Xi'an, 710054, China
- Yongchen Fan: State Key Laboratory for Strength and Vibration of Mechanical Structures, Shaanxi Engineering Laboratory for Vibration Control of Aerospace Structures, School of Aerospace, Xi'an Jiaotong University, Xi'an, 710049, China
- Ying Wu: State Key Laboratory for Strength and Vibration of Mechanical Structures, Shaanxi Engineering Laboratory for Vibration Control of Aerospace Structures, School of Aerospace, Xi'an Jiaotong University, Xi'an, 710049, China
5
Optimal Microbiome Networks: Macroecology and Criticality. Entropy 2019; 21:e21050506. [PMID: 33267220] [PMCID: PMC7514995] [DOI: 10.3390/e21050506]
Abstract
The human microbiome is an extremely complex ecosystem considering the number of bacterial species, their interactions, and its variability over space and time. Here, we untangle the complexity of the human microbiome for Irritable Bowel Syndrome (IBS), the most prevalent functional gastrointestinal disorder in human populations. Based on a novel information-theoretic network inference model, we detected potential species interaction networks that are functionally and structurally different for healthy and unhealthy individuals. Healthy networks are characterized by a neutral symmetrical pattern of species interactions and scale-free topology, versus random unhealthy networks. We detected an inverse scaling relationship between species' total outgoing information flow, indicative of node interactivity, and relative species abundance (RSA). The top ten interacting species are also the least relatively abundant for the healthy microbiome and the most detrimental. These findings support the idea of the diminishing role of network hubs and suggest that hubs should be defined by their total outgoing information flow rather than their node degree. Macroecologically, the healthy microbiome is characterized by the highest Pareto total species diversity growth rate, the lowest species turnover, and the smallest variability of RSA across all species. This result challenges current views that posit a universal association between healthy states and the highest absolute species diversity in ecosystems. Additionally, we show that the transitory microbiome is unstable and that microbiome criticality is not necessarily at the phase transition between healthy and unhealthy states. We stress the importance of considering portfolios of interacting pairs rather than single-node dynamics when characterizing the microbiome, and of ranking these pairs in terms of their interactions (i.e., species collective behavior) that shape the transition from healthy to unhealthy states.
The macroecological characterization of the microbiome is useful for public health and disease diagnosis and etiognosis, while species-specific analyses can detect beneficial species leading to personalized design of pre- and probiotic treatments and microbiome engineering.
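The paper's inference model is more elaborate (directed information flows between abundance series), but its simplest building block, pairwise mutual information between discretized abundance series, can be sketched; `outgoing_flow` below is a crude illustrative stand-in for the "total outgoing information flow" node ranking, not the authors' method:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in mutual information estimate, in bits, between two
    discrete (e.g. binned-abundance) series of equal length."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def outgoing_flow(series):
    """Total pairwise MI of each species with all others: an undirected
    proxy for ranking species by interactivity."""
    return {i: sum(mutual_information(series[i], series[j])
                   for j in series if j != i)
            for i in series}

# Toy series: "a" and "b" co-vary perfectly; "c" is empirically independent.
xs = [0, 1] * 500
ys = [0] * 500 + [1] * 500
flow = outgoing_flow({"a": xs, "b": xs, "c": ys})
```

Ranking nodes by summed information flow rather than degree is the design choice the abstract argues for; a transfer-entropy variant would make the flows directed.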
6
Optimizing information processing in neuronal networks beyond critical states. PLoS One 2017; 12:e0184367. [PMID: 28922366] [PMCID: PMC5603180] [DOI: 10.1371/journal.pone.0184367]
Abstract
Critical dynamics have been postulated as an ideal regime for neuronal networks in the brain, considering optimal dynamic range and information processing. Herein, we focused on how information entropy encoded in spatiotemporal activity patterns may vary in critical networks. We employed branching-process-based models to investigate how entropy can be embedded in spatiotemporal patterns. We determined that the information capacity of critical networks may vary depending on the manipulation of microscopic parameters; specifically, the mean number of connections governed the number of spatiotemporal patterns in the networks. These findings are compatible with observations of real neuronal networks in specific brain circuitries, where critical behavior is necessary for the optimal dynamic range response but the uncertainty provided by the high entropy of spatiotemporal patterns is not required. With this, we were able to show that information processing can be optimized in neuronal networks beyond critical states.
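The entropy-over-patterns measurement described above can be sketched with a toy cascade generator; `cascade_pattern` and its parameters are illustrative inventions, not the paper's branching model:

```python
import math
import random
from collections import Counter

def pattern_entropy(patterns):
    """Shannon entropy (bits) of the empirical distribution over patterns."""
    n = len(patterns)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(patterns).values())

def cascade_pattern(n_units, p, rng):
    """One branching-style cascade: start from a random seed unit; a
    not-yet-active unit becomes active if any frontier unit recruits it
    (each attempt succeeds with probability p). Returns the activated
    set as a sorted tuple, i.e. one spatial pattern."""
    active = {rng.randrange(n_units)}
    frontier = set(active)
    while frontier:
        new = {j for i in frontier for j in range(n_units)
               if j not in active and rng.random() < p}
        active |= new
        frontier = new
    return tuple(sorted(active))

rng = random.Random(0)
sparse = [cascade_pattern(8, 0.02, rng) for _ in range(2000)]  # few connections
dense = [cascade_pattern(8, 0.25, rng) for _ in range(2000)]   # more connections
```

Varying the recruitment probability p changes which patterns occur and how their probability mass spreads, which is the sense in which microscopic connectivity parameters govern the pattern repertoire and its entropy.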
7
Li X, Chen Q, Xue F. Biological modelling of a computational spiking neural network with neuronal avalanches. Philos Trans A Math Phys Eng Sci 2017; 375:20160286. [PMID: 28507231] [PMCID: PMC5434077] [DOI: 10.1098/rsta.2016.0286]
Abstract
In recent years, an increasing number of studies have demonstrated that networks in the brain can self-organize into a critical state where dynamics exhibit a mixture of ordered and disordered patterns. This critical branching phenomenon is termed neuronal avalanches. It has been hypothesized that the homeostatic balance between stability and plasticity at this critical state may be optimal for performing diverse neural computational tasks. However, the critical region for high performance is narrow and sensitive in spiking neural networks (SNNs). In this paper, we investigated the role of the critical state in neural computation based on liquid-state machines, a biologically plausible computational neural network model for real-time computing. We investigated the computational performance of an SNN operating at the critical state, in particular with spike-timing-dependent plasticity for updating synaptic weights. The network shows the best computational performance when it is subjected to critical dynamic states. Moreover, the active-neuron-dominant structure refined by synaptic learning can remarkably enhance the robustness of the critical state and further improve computational accuracy. These results may have important implications for the modelling of spiking neural networks with optimal computational performance. This article is part of the themed issue 'Mathematical methods in medicine: neuroscience, cardiology and pathology'.
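The pair-based exponential STDP rule that such models build on can be written down compactly; the time constant and amplitudes below are typical textbook values, not the paper's:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for dt = t_post - t_pre (ms):
    potentiation when pre precedes post (dt > 0), depression otherwise."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0

def apply_stdp(w, pre_spikes, post_spikes, w_min=0.0, w_max=1.0):
    """Accumulate all-pairs STDP updates and clip the weight to [w_min, w_max]."""
    for tp in pre_spikes:
        for tq in post_spikes:
            w += stdp_dw(tq - tp)
    return max(w_min, min(w_max, w))
```

Setting a_minus slightly above a_plus gives net depression for uncorrelated spiking, a common stabilizing choice when STDP runs alongside critical dynamics.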
Affiliation(s)
- Xiumin Li: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, People's Republic of China; College of Automation, Chongqing University, Chongqing 400044, People's Republic of China
- Qing Chen: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, People's Republic of China; College of Automation, Chongqing University, Chongqing 400044, People's Republic of China
- Fangzheng Xue: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, People's Republic of China; College of Automation, Chongqing University, Chongqing 400044, People's Republic of China
8
Liu H, Song Y, Xue F, Li X. Effects of bursting dynamic features on the generation of multi-clustered structure of neural network with symmetric spike-timing-dependent plasticity learning rule. Chaos 2015; 25:113108. [PMID: 26627568] [DOI: 10.1063/1.4935281]
Abstract
In this paper, we investigate the generation of multi-clustered structure in self-organized neural networks with different neuronal firing patterns, i.e., bursting or spiking. An initially all-to-all-connected spiking or bursting neural network can self-organize into a clustered structure through symmetric spike-timing-dependent plasticity learning for both bursting and spiking neurons. However, the time consumption of this clustering procedure is much shorter for the burst-based self-organized neural network (BSON) than for the spike-based self-organized neural network (SSON). Our results show that the BSON network has more obvious small-world properties, i.e., a higher clustering coefficient and smaller shortest path length, than the SSON network. The larger structure entropy and activity entropy of the BSON network also demonstrate that it has higher topological complexity and dynamical diversity, which benefits information transmission in neural circuits. Hence, we conclude that burst firing can significantly enhance the efficiency of the clustering procedure, and that the emergent clustered structure renders the whole network more synchronous and therefore more sensitive to weak input. This result is further confirmed by its improved performance on stochastic resonance. Therefore, we believe that the multi-clustered neural network that self-organizes from bursting dynamics has high efficiency in information processing.
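The small-world comparison above rests on two standard metrics, the mean clustering coefficient and the characteristic path length. A minimal implementation for undirected graphs given as {node: set-of-neighbors} dicts (generic, not the paper's code):

```python
from collections import deque

def clustering_coefficient(adj):
    """Mean local clustering coefficient: the fraction of each node's
    neighbor pairs that are themselves connected."""
    total = 0.0
    for u, nb in adj.items():
        k = len(nb)
        if k < 2:
            continue
        links = sum(1 for v in nb for w in nb if v < w and w in adj[v])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def mean_path_length(adj):
    """Average shortest-path length over all connected node pairs (BFS)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for node, d in dist.items() if node != src)
        pairs += len(dist) - 1
    return total / pairs

# Toy graphs: a triangle (maximally clustered) and a 3-node path.
tri = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
path = {0: {1}, 1: {0, 2}, 2: {1}}
```

A small-world network combines a clustering coefficient close to that of a lattice with a path length close to that of a random graph, which is the profile the BSON network is reported to approach.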
Affiliation(s)
- Hui Liu: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, China
- Yongduan Song: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, China
- Fangzheng Xue: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, China
- Xiumin Li: Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044, China
9
Yu H, Guo X, Wang J, Deng B, Wei X. Effects of spike-time-dependent plasticity on the stochastic resonance of small-world neuronal networks. Chaos 2014; 24:033125. [PMID: 25273205] [DOI: 10.1063/1.4893773]
Abstract
The phenomenon of stochastic resonance in Newman-Watts small-world neuronal networks is investigated when the strength of synaptic connections between neurons is adaptively adjusted by spike-time-dependent plasticity (STDP). It is shown that stochastic resonance occurs irrespective of whether the synaptic connectivity is fixed or adaptive, and that its efficiency can be largely enhanced by STDP in the coupling process. In particular, the resonance for adaptive coupling can reach a much larger value than that for fixed coupling when the noise intensity is small or intermediate. STDP with dominant depression and a small temporal window ratio is more efficient for the transmission of a weak external signal in small-world neuronal networks. In addition, we demonstrate that the effect of stochastic resonance can be further improved by fine-tuning the average coupling strength of the adaptive network. Furthermore, the small-world topology can significantly affect the stochastic resonance of excitable neuronal networks: there exists an optimal probability of adding links at which the noise-induced transmission of a weak periodic signal peaks.
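Stochastic resonance itself, a subthreshold signal becoming detectable only at an intermediate noise level, can be demonstrated with a single threshold unit rather than a full neuronal network; the signal amplitude, threshold, and noise levels below are illustrative:

```python
import math
import random

def detection_correlation(noise_sd, amp=0.8, threshold=1.0,
                          n_steps=20000, seed=7):
    """Pearson correlation between a subthreshold sinusoidal input and the
    binary output of a simple threshold detector driven by signal + noise."""
    rng = random.Random(seed)
    xs, ys = [], []
    for t in range(n_steps):
        s = amp * math.sin(2 * math.pi * t / 100.0)
        y = 1.0 if s + rng.gauss(0.0, noise_sd) >= threshold else 0.0
        xs.append(s)
        ys.append(y)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    if vy == 0.0:
        return 0.0  # no detections at all
    return cov / math.sqrt(vx * vy)

low = detection_correlation(0.02)  # too little noise: signal never crosses
mid = detection_correlation(0.3)   # intermediate noise: resonance
high = detection_correlation(5.0)  # too much noise: output nearly random
```

The non-monotonic dependence on noise_sd is the resonance curve; in the paper, STDP and the rewiring probability shift where and how sharply this curve peaks.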
Affiliation(s)
- Haitao Yu: School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072, People's Republic of China
- Xinmeng Guo: School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072, People's Republic of China
- Jiang Wang: School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072, People's Republic of China
- Bin Deng: School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072, People's Republic of China
- Xile Wei: School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072, People's Republic of China