1
Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biol Cybern 2024; 118:7-19. [PMID: 38261004] [PMCID: PMC11068698] [DOI: 10.1007/s00422-023-00982-9] [Received: 11/09/2023] [Accepted: 12/11/2023] [Indexed: 01/24/2024]
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation when the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner (Phys. Rev. Lett., 2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive a fluctuation-response relation (FRR) that is still exact, although more complicated, for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and an outlook on open problems.
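The model class discussed in this abstract can be illustrated with a minimal simulation: a leaky integrate-and-fire neuron with an absolute refractory period, driven by white Gaussian noise and integrated with an Euler-Maruyama scheme. All parameter names and values below are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 1e-4, 20.0                        # time step and total simulated time (s)
tau_m, mu, D = 0.02, 1.2, 0.05            # membrane time constant (s), mean drive, noise intensity
v_th, v_reset, tau_ref = 1.0, 0.0, 0.002  # threshold, reset, absolute refractory period (s)

v, t_last = 0.0, -np.inf
spike_times = []
for i in range(int(T / dt)):
    t = i * dt
    if t - t_last < tau_ref:
        v = v_reset                       # voltage clamped during the refractory state
        continue
    # tau_m dv/dt = mu - v + sqrt(2 D) xi(t), one Euler-Maruyama step
    v += dt * (mu - v) / tau_m + np.sqrt(2.0 * D * dt) / tau_m * rng.standard_normal()
    if v >= v_th:
        spike_times.append(t)             # record spike, then reset
        t_last = t
        v = v_reset

rate = len(spike_times) / T               # stationary firing rate estimate (Hz)
```

By construction, no interspike interval can be shorter than the refractory period, which is the feature that complicates the fluctuation-response analysis discussed above.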
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
2
Papo D, Buldú JM. Does the brain behave like a (complex) network? I. Dynamics. Phys Life Rev 2024; 48:47-98. [PMID: 38145591] [DOI: 10.1016/j.plrev.2023.12.006] [Received: 12/08/2023] [Accepted: 12/10/2023] [Indexed: 12/27/2023]
Abstract
Graph theory is now becoming a standard tool in system-level neuroscience. However, endowing observed brain anatomy and dynamics with a complex network structure does not entail that the brain actually works as a network. Asking whether the brain behaves as a network means asking whether network properties count. From the viewpoint of neurophysiology and, possibly, of brain physics, the most substantial issues a network structure may be instrumental in addressing relate to the influence of network properties on brain dynamics and to whether these properties ultimately explain some aspects of brain function. Here, we address the dynamical implications of complex network structure, examining which aspects and scales of brain activity may be understood to genuinely behave as a network. To do so, we first define the meaning of 'networkness' and analyse some of its implications. We then examine ways in which brain anatomy and dynamics can be endowed with a network structure and discuss possible ways in which network structure may be shown to represent a genuine organisational principle of brain activity, rather than just a convenient description of its anatomy and dynamics.
Affiliation(s)
- D Papo
- Department of Neuroscience and Rehabilitation, Section of Physiology, University of Ferrara, Ferrara, Italy; Center for Translational Neurophysiology, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy
- J M Buldú
- Complex Systems Group & G.I.S.C., Universidad Rey Juan Carlos, Madrid, Spain
3
Yamaguchi YY, Terada Y. Reconstruction of phase dynamics from macroscopic observations based on linear and nonlinear response theories. Phys Rev E 2024; 109:024217. [PMID: 38491619] [DOI: 10.1103/physreve.109.024217] [Received: 01/06/2023] [Accepted: 01/22/2024] [Indexed: 03/18/2024]
Abstract
We propose a method to reconstruct the phase dynamics in rhythmical interacting systems from macroscopic responses to weak inputs by developing linear and nonlinear response theories, which predict the responses in a given system. By solving an inverse problem, the method infers an unknown system: the natural frequency distribution, the coupling function, and the time delay, which is inevitable in real systems. In contrast to previous methods, our method requires neither strong invasiveness nor microscopic observations. We demonstrate that the method accurately reconstructs two phase systems from observed responses. The qualitative methodological advantages demonstrated by our quantitative numerical examinations suggest its broad applicability in various fields, including brain systems, which are often observed through macroscopic signals such as electroencephalograms and functional magnetic resonance imaging.
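As a toy forward model for the setting described here, one can simulate a Kuramoto system driven by a weak sinusoidal input and observe it only through the macroscopic order parameter; the reconstruction method itself is not reproduced in this sketch, and all sizes, coupling values, and input parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(8)
N, K, dt, T = 1000, 1.5, 0.01, 6000
omega = rng.normal(0.0, 0.5, N)          # natural frequency distribution
eps, Omega = 0.05, 0.7                   # weak input amplitude and frequency

theta = rng.uniform(0.0, 2.0 * np.pi, N)
r_t = np.empty(T)
for t in range(T):
    z = np.exp(1j * theta).mean()        # complex Kuramoto order parameter
    r_t[t] = np.abs(z)
    # mean-field form of (K/N) sum_j sin(theta_j - theta_i), plus the weak drive
    coupling = K * np.abs(z) * np.sin(np.angle(z) - theta)
    theta += dt * (omega + coupling + eps * np.sin(Omega * t * dt))

r_mean = r_t[T // 2:].mean()             # macroscopic synchronization level
```

Only the scalar time series of the order parameter would be handed to a reconstruction procedure, mirroring the "macroscopic observations only" constraint emphasized in the abstract.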
Affiliation(s)
- Yu Terada
- Department of Neurobiology, University of California San Diego, La Jolla, California 92093, USA
- Institute for Physics of Intelligence, Department of Physics, Graduate School of Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033, Japan
- Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, 2-1 Hirosawa, Wako, Saitama 351-0198, Japan
4
Fan X, Zhang H, Zhang Y. IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation. Biomimetics (Basel) 2023; 8:375. [PMID: 37622980] [PMCID: PMC10452895] [DOI: 10.3390/biomimetics8040375] [Received: 07/12/2023] [Revised: 08/03/2023] [Accepted: 08/15/2023] [Indexed: 08/26/2023]
Abstract
Spiking neural networks (SNNs) are widely recognized for their biomimetic and efficient computing features; they use spikes to encode and transmit information. Despite these advantages, SNNs suffer from low accuracy and large inference latency, caused, respectively, by direct training and by conversion from trained artificial neural networks (ANNs). To address these limitations, we propose a novel training pipeline (called IDSNN) based on parameter initialization and knowledge distillation, using an ANN as both a parameter source and a teacher. IDSNN maximizes the knowledge extracted from ANNs and achieves competitive top-1 accuracy on CIFAR10 (94.22%) and CIFAR100 (75.41%) with low latency. More importantly, it converges up to 14× faster than directly trained SNNs under limited training resources, which demonstrates its practical value in applications.
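The distillation ingredient can be sketched generically: the student is trained against temperature-softened teacher outputs combined with the hard labels (Hinton-style distillation). This is an illustrative NumPy sketch, not the IDSNN implementation; the function names, temperature, and weighting below are made-up choices.

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """alpha weights the soft (teacher) term against the hard-label term."""
    p_t = softmax(teacher_logits, T)                       # softened teacher targets
    log_p_s = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(p_t * log_p_s).sum(axis=-1).mean() * T * T    # cross-entropy vs. teacher, scaled by T^2
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1 - alpha) * hard

# toy batch: 2 samples, 3 classes
s = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.3]])   # student logits
t = np.array([[3.0, 0.0, -2.0], [-0.5, 2.5, 0.0]])  # teacher logits
y = np.array([0, 1])                                # hard labels
loss = distillation_loss(s, t, y)
```

In a real pipeline this scalar would be minimized by gradient descent on the student's parameters; the parameter-initialization half of IDSNN is a separate ingredient not shown here.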
Affiliation(s)
- Xiongfei Fan
- State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China
- Hong Zhang
- State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China
- Yu Zhang
- State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China
- Key Laboratory of Collaborative Sensing and Autonomous Unmanned Systems of Zhejiang Province, Hangzhou 310027, China
5
Cimeša L, Ciric L, Ostojic S. Geometry of population activity in spiking networks with low-rank structure. PLoS Comput Biol 2023; 19:e1011315. [PMID: 37549194] [PMCID: PMC10461857] [DOI: 10.1371/journal.pcbi.1011315] [Received: 11/25/2022] [Revised: 08/28/2023] [Accepted: 06/27/2023] [Indexed: 08/09/2023]
Abstract
Recurrent network models are instrumental in investigating how behaviorally relevant computations emerge from collective neural dynamics. A recently developed class of models based on low-rank connectivity provides an analytically tractable framework for understanding how connectivity structure determines the geometry of low-dimensional dynamics and the ensuing computations. Such models, however, lack some fundamental biological constraints, and in particular represent individual neurons as abstract units that communicate through continuous firing rates rather than discrete action potentials. Here we examine how far the theoretical insights obtained from low-rank rate networks transfer to more biologically plausible networks of spiking neurons. Adding a low-rank structure on top of random excitatory-inhibitory connectivity, we systematically compare the geometry of activity in networks of integrate-and-fire neurons to rate networks with statistically equivalent low-rank connectivity. We show that the mean-field predictions of rate networks allow us to identify low-dimensional dynamics at constant population-average activity in spiking networks, as well as novel non-linear regimes of activity such as out-of-phase oscillations and slow manifolds. We finally exploit these results to directly build spiking networks that perform nonlinear computations.
Affiliation(s)
- Ljubica Cimeša
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d'Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Lazar Ciric
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d'Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives Computationnelles, Département d'Études Cognitives, École Normale Supérieure, INSERM U960, PSL University, Paris, France
6
Lameu EL, Rasiah NP, Baimoukhametova DV, Loewen SP, Bains JS, Nicola W. Particle-swarm based modelling reveals two distinct classes of CRH PVN neurons. J Physiol 2023; 601:3151-3171. [PMID: 36223200] [DOI: 10.1113/jp283133] [Received: 03/28/2022] [Accepted: 09/28/2022] [Indexed: 11/08/2022]
Abstract
Electrophysiological recordings can provide detailed information about single neurons' dynamical features and shed light on their responses to stimuli. Unfortunately, rapidly modelling electrophysiological data for inferring network-level behaviours remains challenging. Here, we investigate how modelled single-neuron dynamics lead to network-level responses in the paraventricular nucleus of the hypothalamus (PVN), a critical nucleus for the mammalian stress response. Recordings of corticotropin-releasing hormone neurons from the PVN (CRHPVN) were performed using whole-cell current clamp. These neurons, which initiate the endocrine response to stress, were rapidly and automatically fit to a modified adaptive exponential integrate-and-fire (AdEx) model with particle swarm optimization (PSO). All CRHPVN neurons were accurately fit by the AdEx model with PSO. Multiple sets of parameters were found that reliably reproduced current-clamp traces for any single neuron. Despite multiple solutions, the dynamical features of the models, such as the rheobase, fixed points, and bifurcations, were stable across fits. We found that CRHPVN neurons can be divided into two subtypes according to their bifurcation at the onset of firing: CRHPVN-integrators and CRHPVN-resonators. The existence of CRHPVN-resonators was then directly confirmed in a follow-up patch-clamp hyperpolarization protocol, which readily induced post-inhibitory rebound spiking in 33% of patched neurons. We constructed networks of CRHPVN model neurons to investigate the network-level responses of CRHPVN neurons. We found that CRHPVN-resonators maintain baseline firing in networks even when all inputs are inhibitory. The dynamics of a small subset of CRHPVN neurons may be critical to maintaining a baseline firing tone in the PVN. KEY POINTS:
- Corticotropin-releasing hormone neurons (CRHPVN) in the paraventricular nucleus of the hypothalamus act as the final neural controllers of the stress response.
- We developed a computational modelling platform that uses particle swarm optimization to rapidly and accurately fit biophysical neuron models to patched CRHPVN neurons.
- A model was fitted to each patched neuron without the use of dynamic clamping or other procedures requiring sophisticated inputs and fitting algorithms. Any neuron undergoing standard current-clamp step protocols for a few minutes can be fitted by this procedure.
- The dynamical analysis of the modelled neurons shows that CRHPVN neurons come in two specific 'flavours': CRHPVN-resonators and CRHPVN-integrators.
- We directly confirmed the existence of these two classes of CRHPVN neurons in subsequent experiments.
- Network simulations show that CRHPVN-resonators are critical to retaining the baseline firing rate of the entire network of CRHPVN neurons, as these cells can fire rebound spikes and bursts in the presence of strong inhibitory synaptic input.
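The fitting idea can be illustrated with a bare-bones particle swarm optimizer: particles explore parameter space, each tracking its personal best while being pulled toward the swarm's global best. Here it fits two parameters of a toy rectified-linear f-I curve to synthetic data; this is a generic PSO sketch, not the authors' AdEx pipeline, and all names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic "data": a toy rectified-linear f-I curve with known parameters
I = np.linspace(0.0, 2.0, 50)
true_gain, true_thresh = 3.0, 0.5
target = np.maximum(0.0, true_gain * (I - true_thresh))

def loss(params):
    gain, thresh = params
    return np.mean((np.maximum(0.0, gain * (I - thresh)) - target) ** 2)

# bare-bones PSO: positions x, velocities v, personal and global bests
n_particles, n_iter = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5                 # inertia, cognitive and social weights
x = rng.uniform([0.0, 0.0], [10.0, 2.0], size=(n_particles, 2))
v = np.zeros_like(x)
pbest = x.copy()
pbest_val = np.array([loss(p) for p in x])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = np.array([loss(p) for p in x])
    improved = vals < pbest_val
    pbest[improved] = x[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

fitted_gain, fitted_thresh = gbest
```

In the paper's setting, `loss` would instead compare simulated AdEx voltage traces against a recorded current-clamp trace; the swarm machinery is otherwise unchanged.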
Affiliation(s)
- Ewandson L Lameu
- Cell Biology and Anatomy Department, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Neilen P Rasiah
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Dinara V Baimoukhametova
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Spencer P Loewen
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Jaideep S Bains
- Department of Physiology and Pharmacology, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
- Wilten Nicola
- Cell Biology and Anatomy Department, Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, Canada
7
Levenstein D, Okun M. Logarithmically scaled, gamma distributed neuronal spiking. J Physiol 2023; 601:3055-3069. [PMID: 36086892] [PMCID: PMC10952267] [DOI: 10.1113/jp282758] [Received: 05/09/2022] [Accepted: 07/28/2022] [Indexed: 11/08/2022]
Abstract
Naturally log-scaled quantities abound in the nervous system. Distributions of these quantities have non-intuitive properties, which have implications for data analysis and the understanding of neural circuits. Here, we review the log-scaled statistics of neuronal spiking and the relevant analytical probability distributions. Recent work using log-scaling revealed that interspike intervals of forebrain neurons segregate into discrete modes reflecting spiking at different timescales and are each well-approximated by a gamma distribution. Each neuron spends most of the time in an irregular spiking 'ground state' with the longest intervals, which determines the mean firing rate of the neuron. Across the entire neuronal population, firing rates are log-scaled and well approximated by the gamma distribution, with a small number of highly active neurons and an overabundance of low rate neurons (the 'dark matter'). These results are intricately linked to a heterogeneous balanced operating regime, which confers upon neuronal circuits multiple computational advantages and has evolutionarily ancient origins.
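The gamma description of interspike intervals discussed above can be checked on synthetic data: draw ISIs from a gamma process, fit a gamma distribution by maximum likelihood, and inspect the intervals on a log scale. Shape and scale values are arbitrary illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
shape, scale = 0.8, 0.05                    # shape < 1: irregular, long-tailed spiking
isis = rng.gamma(shape, scale, size=5000)   # synthetic interspike intervals (s)

# maximum-likelihood gamma fit with the location parameter fixed at zero
fit_shape, _, fit_scale = stats.gamma.fit(isis, floc=0)

mean_rate = 1.0 / isis.mean()               # firing rate implied by the mean ISI (Hz)
log_isis = np.log10(isis)                   # log scale, as used to reveal ISI modes
```

A histogram of `log_isis` is the natural visualization for the log-scaled modes the review describes; with a single generating gamma process it shows one mode.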
Affiliation(s)
- Daniel Levenstein
- Department of Neurology and Neurosurgery, McGill University, Montreal, QC, Canada
- Mila, Montréal, QC, Canada
- Michael Okun
- Department of Psychology and Neuroscience Institute, University of Sheffield, Sheffield, UK
8
Hutt A, Rich S, Valiante TA, Lefebvre J. Intrinsic neural diversity quenches the dynamic volatility of neural networks. Proc Natl Acad Sci U S A 2023; 120:e2218841120. [PMID: 37399421] [PMCID: PMC10334753] [DOI: 10.1073/pnas.2218841120] [Received: 11/03/2022] [Accepted: 05/19/2023] [Indexed: 07/05/2023]
Abstract
Heterogeneity is the norm in biology. The brain is no different: Neuronal cell types are myriad, reflected through their cellular morphology, type, excitability, connectivity motifs, and ion channel distributions. While this biophysical diversity enriches neural systems' dynamical repertoire, it remains challenging to reconcile with the robustness and persistence of brain function over time (resilience). To better understand the relationship between excitability heterogeneity (variability in excitability within a population of neurons) and resilience, we analyzed both analytically and numerically a nonlinear sparse neural network with balanced excitatory and inhibitory connections evolving over long time scales. Homogeneous networks demonstrated increases in excitability, and strong firing rate correlations-signs of instability-in response to a slowly varying modulatory fluctuation. Excitability heterogeneity tuned network stability in a context-dependent way by restraining responses to modulatory challenges and limiting firing rate correlations, while enriching dynamics during states of low modulatory drive. Excitability heterogeneity was found to implement a homeostatic control mechanism enhancing network resilience to changes in population size, connection probability, strength and variability of synaptic weights, by quenching the volatility (i.e., its susceptibility to critical transitions) of its dynamics. Together, these results highlight the fundamental role played by cell-to-cell heterogeneity in the robustness of brain function in the face of change.
Affiliation(s)
- Axel Hutt
- Université de Strasbourg, CNRS, Inria, ICube, MLMS, MIMESIS, F-67000 Strasbourg, France
- Scott Rich
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Taufik A. Valiante
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Electrical and Computer Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON M5S 3G9, Canada
- Institute of Medical Sciences, University of Toronto, Toronto, ON M5S 1A8, Canada
- Division of Neurosurgery, Department of Surgery, University of Toronto, Toronto, ON M5G 2C4, Canada
- Center for Advancing Neurotechnological Innovation to Application, University of Toronto, Toronto, ON M5G 2A2, Canada
- Max Planck-University of Toronto Center for Neural Science and Technology, University of Toronto, Toronto, ON M5S 3G8, Canada
- Jérémie Lefebvre
- Krembil Brain Institute, Division of Clinical and Computational Neuroscience, University Health Network, Toronto, ON M5T 0S8, Canada
- Department of Biology, University of Ottawa, Ottawa, ON K1N 6N5, Canada
- Department of Mathematics, University of Toronto, Toronto, ON M5S 2E4, Canada
9
Zhang R, Nie Y, Dai W, Wang S, Geng X. Balance between pallidal neural oscillations correlated with dystonic activity and severity. Neurobiol Dis 2023:106178. [PMID: 37268239] [DOI: 10.1016/j.nbd.2023.106178] [Received: 03/23/2023] [Revised: 05/14/2023] [Accepted: 05/28/2023] [Indexed: 06/04/2023]
Abstract
BACKGROUND AND OBJECTIVE: The balance between neural oscillations provides valuable insights into the organisation of neural oscillations related to brain states, which may play important roles in dystonia. We aim to investigate the relationship of the balance in the globus pallidus internus (GPi) with dystonic severity under different muscular contraction conditions.
METHODS: Twenty-one patients with dystonia were recruited. All of them underwent bilateral GPi implantation, and local field potentials (LFPs) from the GPi were recorded with simultaneous surface electromyography. The power spectral ratio between neural oscillations was computed as the measure of neural balance. This ratio was calculated under high and low dystonic muscular contraction conditions, and its correlation with dystonic severity was assessed using clinical scores.
RESULTS: The power spectra of the pallidal LFPs peaked in the theta and alpha bands. Within-participant comparison showed that the spectral power of the theta oscillations significantly increased during high muscle contraction compared with low contraction. The power spectral ratios between the theta and alpha, theta and low beta, and theta and high gamma oscillations were significantly higher during high contraction than during low contraction. The total score and motor score were associated with the power spectral ratio between the low and high beta oscillations, which was correlated with dystonic severity during both high and low contractions. The power spectral ratios between the low beta and low gamma oscillations and between the low beta and high gamma oscillations showed a significantly positive correlation with the total score during both high and low contractions; a correlation with the motor scale score was found only during high contraction. Meanwhile, the power spectral ratio between the theta and alpha oscillations during low contraction showed a significantly negative correlation with the total score. The power spectral ratios between the alpha and high beta, alpha and low gamma, and alpha and high gamma oscillations were significantly correlated with dystonic severity only during low contraction.
CONCLUSION: The balance between neural oscillations, as quantified by the power ratio between specific frequency bands, differed between the high and low muscular contraction conditions and was correlated with dystonic severity. The balance between the low and high beta oscillations was correlated with dystonic severity during both conditions, making this parameter a possible new biomarker for closed-loop deep brain stimulation in patients with dystonia.
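The power-ratio measure used in this study can be sketched on a synthetic "LFP": estimate band power with Welch's method and form the theta/alpha ratio. The signal composition and band edges below are illustrative choices, not the study's recording parameters.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
fs = 1000.0                                  # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
lfp = (1.5 * np.sin(2 * np.pi * 6 * t)       # strong theta component (6 Hz)
       + 0.5 * np.sin(2 * np.pi * 10 * t)    # weaker alpha component (10 Hz)
       + 0.3 * rng.standard_normal(t.size))  # broadband noise

f, pxx = signal.welch(lfp, fs=fs, nperseg=2048)   # power spectral density estimate

def band_power(lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.sum(pxx[mask]) * (f[1] - f[0])      # integrate PSD over the band

theta, alpha = band_power(4, 8), band_power(8, 13)
theta_alpha_ratio = theta / alpha
```

The same `band_power` ratio computed between other band pairs (e.g., low vs. high beta) gives the balance measures correlated with severity in the abstract.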
Affiliation(s)
- Ruili Zhang
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, China; MOE Frontiers Center for Brain Science, Fudan University, Shanghai, China; Zhangjiang Fudan International Innovation Center, Shanghai, China
- Yingnan Nie
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, China; MOE Frontiers Center for Brain Science, Fudan University, Shanghai, China; Zhangjiang Fudan International Innovation Center, Shanghai, China
- Wen Dai
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, China; MOE Frontiers Center for Brain Science, Fudan University, Shanghai, China; Zhangjiang Fudan International Innovation Center, Shanghai, China
- Shouyan Wang
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, China; MOE Frontiers Center for Brain Science, Fudan University, Shanghai, China; Zhangjiang Fudan International Innovation Center, Shanghai, China; Shanghai Engineering Research Center of AI & Robotics, Fudan University, Shanghai, China; Engineering Research Center of AI & Robotics, Ministry of Education, Fudan University, Shanghai, China
- Xinyi Geng
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Fudan University, Ministry of Education, China; MOE Frontiers Center for Brain Science, Fudan University, Shanghai, China; Zhangjiang Fudan International Innovation Center, Shanghai, China
10
Ranft J, Lindner B. Theory of the asynchronous state of structured rotator networks and its application to recurrent networks of excitatory and inhibitory units. Phys Rev E 2023; 107:044306. [PMID: 37198857] [DOI: 10.1103/physreve.107.044306] [Received: 11/19/2022] [Accepted: 03/28/2023] [Indexed: 05/19/2023]
Abstract
Recurrently coupled oscillators that are sufficiently heterogeneous and/or randomly coupled can show asynchronous activity in which there are no significant correlations among the units of the network. The asynchronous state can nevertheless exhibit rich temporal correlation statistics that are generally difficult to capture theoretically. For randomly coupled rotator networks, it is possible to derive differential equations that determine the autocorrelation functions of the network noise and of the single elements in the network. So far, the theory has been restricted to statistically homogeneous networks, making it difficult to apply this framework to real-world networks, which are structured with respect to the properties of the single units and their connectivity. A particularly striking case is that of neural networks, for which one has to distinguish between excitatory and inhibitory neurons, which drive their target neurons towards or away from the firing threshold. To take such network structures into account, here we extend the theory for rotator networks to the case of multiple populations. Specifically, we derive a system of differential equations that governs the self-consistent autocorrelation functions of the network fluctuations in the respective populations. We then apply this general theory to the special but important case of recurrent networks of excitatory and inhibitory units in the balanced case and compare our theory to numerical simulations. We inspect the effect of the network structure on the noise statistics by comparing our results to the case of an equivalent homogeneous network devoid of internal structure. Our results show that structured connectivity and heterogeneity of the oscillator type can either enhance or reduce the overall strength of the generated network noise and shape its temporal correlations.
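The quantity the theory predicts, the autocorrelation of the recurrent "network noise", can also be estimated empirically from a simulation of a randomly coupled rotator network. This sketch uses a single homogeneous population with illustrative parameters, not the multi-population theory developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
N, g, dt, T = 500, 1.5, 0.01, 4000
omega = rng.normal(0.0, 1.0, N)                   # heterogeneous natural frequencies
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # random coupling matrix

theta = rng.uniform(0.0, 2.0 * np.pi, N)
net_in = np.empty(T)
for t in range(T):
    s, c = J @ np.sin(theta), J @ np.cos(theta)
    net = np.cos(theta) * s - np.sin(theta) * c   # sum_j J_ij sin(theta_j - theta_i)
    theta += dt * (omega + net)                   # Euler step of the rotator dynamics
    net_in[t] = net[0]                            # track the recurrent input to one unit

x = net_in[T // 2:] - net_in[T // 2:].mean()      # discard transient, remove mean
acf = np.correlate(x, x, mode="full")[x.size - 1:] / x.size
acf = acf / acf[0]                                # normalized autocorrelation function
```

In the self-consistent theory, this empirical `acf` is what the derived differential equations should reproduce.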
Affiliation(s)
- Jonas Ranft
- Institut de Biologie de l'ENS, Ecole Normale Supérieure, CNRS, Inserm, Université PSL, 46 rue d'Ulm, 75005 Paris, France
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
11
Borges FS, Gabrick EC, Protachevicz PR, Higa GSV, Lameu EL, Rodriguez PXR, Ferraz MSA, Szezech JD, Batista AM, Kihara AH. Intermittency properties in a temporal lobe epilepsy model. Epilepsy Behav 2023; 139:109072. [PMID: 36652897] [DOI: 10.1016/j.yebeh.2022.109072] [Received: 10/05/2022] [Revised: 12/22/2022] [Accepted: 12/26/2022] [Indexed: 01/18/2023]
Abstract
Neuronal synchronization is important for communication between brain regions and plays a key role in learning. However, changes in connectivity can lead to hyper-synchronized states related to epileptic seizures, which occur intermittently with asynchronous states. The activity-regulated cytoskeleton-associated protein (ARC) is related to synaptic alterations which can lead to epilepsy. Induction of status epilepticus in rodent models causes the appearance of intense ARC immunoreactive neurons (IAINs), which present a higher number of connections and conductance intensity than non-IAINs. This alteration might contribute to abnormal epileptic seizure activity. In this work, we investigated how IAIN connectivity influences the firing pattern and synchronization in neural networks. First, we showed the appearance of synchronized burst patterns due to the emergence of IAINs. Second, we described how increased IAIN connectivity favors the appearance of intermittent up and down activities associated with synchronous bursts and asynchronous spikes, respectively. Once the intermittent activity was properly characterized, we applied optogenetic control to the highly synchronous activities in the intermittent regime. To do this, we considered that 1% of neurons were transfected and became photosensitive. We observed that optogenetic control of synchronized burst patterns is effective when IAINs are chosen as photosensitive, but not when non-IAINs are. Therefore, our analyses suggest that IAINs play a pivotal role in both the generation and suppression of highly synchronized activities.
Affiliation(s)
- F S Borges
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, Brooklyn, NY, USA; Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Bernardo do Campo, SP, Brazil.
| | - E C Gabrick
- Graduate in Science Program - Physics, State University of Ponta Grossa, Ponta Grossa, PR, Brazil
| | - P R Protachevicz
- Institute of Physics, University of São Paulo, São Paulo, SP, Brazil
| | - G S V Higa
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Bernardo do Campo, SP, Brazil; Institute of Chemistry, University of São Paulo, São Paulo, SP, Brazil
| | - E L Lameu
- Snyder Institute for Chronic Diseases, University of Calgary, Calgary, AB, Canada
| | - P X R Rodriguez
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Bernardo do Campo, SP, Brazil; Faculty of Medicine, University of Bonn, Bonn, Germany
| | - M S A Ferraz
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Bernardo do Campo, SP, Brazil
| | - J D Szezech
- Graduate in Science Program - Physics, State University of Ponta Grossa, Ponta Grossa, PR, Brazil; Department of Mathematics and Statistics, State University of Ponta Grossa, Ponta Grossa, PR, Brazil
| | - A M Batista
- Graduate in Science Program - Physics, State University of Ponta Grossa, Ponta Grossa, PR, Brazil; Institute of Physics, University of São Paulo, São Paulo, SP, Brazil; Department of Mathematics and Statistics, State University of Ponta Grossa, Ponta Grossa, PR, Brazil
| | - A H Kihara
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, São Bernardo do Campo, SP, Brazil.
12
Shao Y, Ostojic S. Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks. PLoS Comput Biol 2023; 19:e1010855. [PMID: 36689488 PMCID: PMC9894562 DOI: 10.1371/journal.pcbi.1010855] [Received: 08/29/2022] [Revised: 02/02/2023] [Accepted: 01/06/2023] [Indexed: 01/24/2023]
Abstract
How the connectivity of cortical networks determines the neural dynamics and the resulting computations is one of the key questions in neuroscience. Previous works have pursued two complementary approaches to quantify the structure in connectivity. One approach starts from the perspective of biological experiments where only the local statistics of connectivity motifs between small groups of neurons are accessible. Another approach is based instead on the perspective of artificial neural networks where the global connectivity matrix is known, and in particular its low-rank structure can be used to determine the resulting low-dimensional dynamics. A direct relationship between these two approaches is however currently missing. Specifically, it remains to be clarified how local connectivity statistics and the global low-rank connectivity structure are inter-related and shape the low-dimensional activity. To bridge this gap, here we develop a method for mapping local connectivity statistics onto an approximate global low-rank structure. Our method rests on approximating the global connectivity matrix using dominant eigenvectors, which we compute using perturbation theory for random matrices. We demonstrate that multi-population networks defined from local connectivity statistics for which the central limit theorem holds can be approximated by low-rank connectivity with Gaussian-mixture statistics. We specifically apply this method to excitatory-inhibitory networks with reciprocal motifs, and show that it yields reliable predictions for both the low-dimensional dynamics, and statistics of population activity. Importantly, it analytically accounts for the activity heterogeneity of individual neurons in specific realizations of local connectivity. 
Altogether, our approach allows us to disentangle the effects of mean connectivity and reciprocal motifs on the global recurrent feedback, and provides an intuitive picture of how local connectivity shapes global network dynamics.
Affiliation(s)
- Yuxiu Shao
- Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Superieure—PSL Research University, Paris, France
- * E-mail: (YS); (SO)
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives et Computationnelles, INSERM U960, Ecole Normale Superieure—PSL Research University, Paris, France
- * E-mail: (YS); (SO)
13
Mosheiff N, Ermentrout B, Huang C. Chaotic dynamics in spatially distributed neuronal networks generate population-wide shared variability. PLoS Comput Biol 2023; 19:e1010843. [PMID: 36626362 PMCID: PMC9870129 DOI: 10.1371/journal.pcbi.1010843] [Received: 04/29/2022] [Revised: 01/23/2023] [Accepted: 12/26/2022] [Indexed: 01/11/2023]
Abstract
Neural activity in the cortex is highly variable in response to repeated stimuli. Population recordings across the cortex demonstrate that the variability of neuronal responses is shared among large groups of neurons and concentrates in a low dimensional space. However, the source of the population-wide shared variability is unknown. In this work, we analyzed the dynamical regimes of spatially distributed networks of excitatory and inhibitory neurons. We found chaotic spatiotemporal dynamics in networks with similar excitatory and inhibitory projection widths, an anatomical feature of the cortex. The chaotic solutions contain broadband frequency power in rate variability and have distance-dependent and low-dimensional correlations, in agreement with experimental findings. In addition, rate chaos can be induced by globally correlated noisy inputs. These results suggest that spatiotemporal chaos in cortical networks can explain the shared variability observed in neuronal population responses.
Affiliation(s)
- Noga Mosheiff
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, United States of America
- Bard Ermentrout
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Chengcheng Huang
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, United States of America
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
14
Protachevicz PR, Bonin CA, Iarosz KC, Caldas IL, Batista AM. Large coefficient of variation of inter-spike intervals induced by noise current in the resonate-and-fire model neuron. Cogn Neurodyn 2022; 16:1461-1470. [PMID: 36408063 PMCID: PMC9666614 DOI: 10.1007/s11571-022-09789-z] [Received: 06/03/2021] [Revised: 02/03/2022] [Accepted: 02/08/2022] [Indexed: 11/26/2022]
Abstract
Neuronal spike variability is a statistical property associated with the noise environment. Considering a linearised Hodgkin-Huxley model, we investigate how large spike variability can be induced in a typical stellate cell subjected to constant and noisy input currents. For low noise current, we observe only periodic firing (active) or silent activity. For intermediate noise values, in addition to purely active or inactive periods, we also identify a single transition over time from an initial spike train (active) to silent dynamics, where the spike variability is low. For high noise current, however, we find intermittent active and silent periods of varying duration. The spike intervals during active and silent states follow an exponential distribution, similar to a Poisson process. For non-maximal noise current, we observe the highest values of inter-spike-interval variability. Our results suggest sub-threshold oscillations as a possible mechanism for the appearance of high spike variability in a single neuron subject to noise currents.
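The coefficient of variation (CV) of inter-spike intervals used throughout this abstract is a standard statistic: std(ISI)/mean(ISI). A minimal sketch (not the authors' resonate-and-fire model) shows the two reference points it is measured against: exponential intervals, which mimic a Poisson process and give CV near 1, and nearly regular intervals, which give CV near 0:

```python
import math
import random

def isi_cv(intervals):
    """Coefficient of variation of inter-spike intervals: std / mean."""
    n = len(intervals)
    mean = sum(intervals) / n
    var = sum((x - mean) ** 2 for x in intervals) / n
    return math.sqrt(var) / mean

random.seed(1)
# Exponential intervals mimic a Poisson process: CV is close to 1.
poisson_like = [random.expovariate(10.0) for _ in range(20000)]
# Nearly regular intervals (tiny jitter around 0.1 s): CV is close to 0.
regular = [0.1 + 0.001 * random.gauss(0, 1) for _ in range(20000)]
print(round(isi_cv(poisson_like), 2))  # close to 1.0
print(round(isi_cv(regular), 3))       # close to 0.01
```

High noise-induced variability, as reported above, corresponds to CV values at or above the Poisson benchmark of 1.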
Affiliation(s)
- C. A. Bonin
- Department of Mathematics and Statistics, State University of Ponta Grossa, Ponta Grossa, Brazil
- K. C. Iarosz
- Engineering Department, Faculdade de Telêmaco Borba, Telêmaco Borba, Brazil
- I. L. Caldas
- Institute of Physics, University of São Paulo, São Paulo, Brazil
- A. M. Batista
- Department of Mathematics and Statistics, State University of Ponta Grossa, Ponta Grossa, Brazil
15
Paradoxical self-sustained dynamics emerge from orchestrated excitatory and inhibitory homeostatic plasticity rules. Proc Natl Acad Sci U S A 2022; 119:e2200621119. [PMID: 36251988 PMCID: PMC9618084 DOI: 10.1073/pnas.2200621119] [Indexed: 12/03/2022]
Abstract
Cortical networks have the remarkable ability to self-assemble into dynamic regimes in which excitatory positive feedback is balanced by recurrent inhibition. This inhibition-stabilized regime is increasingly viewed as the default dynamic regime of the cortex, but how it emerges in an unsupervised manner remains unknown. We prove that classic forms of homeostatic plasticity are unable to drive recurrent networks to an inhibition-stabilized regime due to the well-known paradoxical effect. We next derive a novel family of cross-homeostatic rules that lead to the unsupervised emergence of inhibition-stabilized networks. These rules shed new light on how the brain may reach its default dynamic state and provide a valuable tool to self-assemble artificial neural networks into ideal computational regimes. Self-sustained neural activity maintained through local recurrent connections is of fundamental importance to cortical function. Converging theoretical and experimental evidence indicates that cortical circuits generating self-sustained dynamics operate in an inhibition-stabilized regime. Theoretical work has established that four sets of weights (WE←E, WE←I, WI←E, and WI←I) must obey specific relationships to produce inhibition-stabilized dynamics, but it is not known how the brain can appropriately set the values of all four weight classes in an unsupervised manner to be in the inhibition-stabilized regime. We prove that standard homeostatic plasticity rules are generally unable to generate inhibition-stabilized dynamics and that their instability is caused by a signature property of inhibition-stabilized networks: the paradoxical effect. In contrast, we show that a family of “cross-homeostatic” rules overcome the paradoxical effect and robustly lead to the emergence of stable dynamics. 
This work provides a model of how—beginning from a silent network—self-sustained inhibition-stabilized dynamics can emerge from learning rules governing all four synaptic weight classes in an orchestrated manner.
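The paradoxical effect at the heart of this abstract can be demonstrated with a generic two-population rate model (a textbook sketch, not the paper's network): when recurrent excitation is strong enough that the network is inhibition-stabilized (here WE←E > 1), increasing the external drive to the inhibitory population paradoxically lowers its steady-state rate. All parameter values below are illustrative assumptions:

```python
def steady_state(g_E, g_I, w_EE=2.0, w_EI=1.5, w_IE=2.0, w_II=0.5,
                 dt=0.01, steps=5000):
    """Euler-integrate a rectified-linear E-I rate model to its fixed point:
    dE/dt = -E + [w_EE*E - w_EI*I + g_E]+ ,  dI/dt = -I + [w_IE*E - w_II*I + g_I]+ .
    The regime w_EE > 1 is inhibition-stabilized (E alone would be unstable)."""
    E, I = 0.0, 0.0
    relu = lambda x: x if x > 0 else 0.0
    for _ in range(steps):
        E += dt * (-E + relu(w_EE * E - w_EI * I + g_E))
        I += dt * (-I + relu(w_IE * E - w_II * I + g_I))
    return E, I

E1, I1 = steady_state(g_E=2.0, g_I=1.0)
E2, I2 = steady_state(g_E=2.0, g_I=1.5)   # stronger drive to inhibition
print(I1, I2)  # paradoxically, I2 < I1 in the inhibition-stabilized regime
```

Solving the linear fixed-point equations gives dI*/dg_I proportional to (1 − w_EE), which is negative exactly when w_EE > 1; this sign flip is the signature the authors show destabilizes standard homeostatic rules.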
16
Suryadi, Cheng RK, Birkett E, Jesuthasan S, Chew LY. Dynamics and potential significance of spontaneous activity in the habenula. eNeuro 2022; 9:ENEURO.0287-21.2022. [PMID: 35981869 PMCID: PMC9450562 DOI: 10.1523/eneuro.0287-21.2022] [Received: 07/01/2021] [Revised: 05/31/2022] [Accepted: 06/27/2022] [Indexed: 11/21/2022]
Abstract
The habenula is an evolutionarily conserved structure of the vertebrate brain that is essential for behavioural flexibility and mood control. It is spontaneously active and is able to access diverse states when the animal is exposed to sensory stimuli. Here we investigate the dynamics of habenula spontaneous activity, to gain insight into how sensitivity is optimized. Two-photon calcium imaging was performed in resting zebrafish larvae at single cell resolution. An analysis of avalanches of inferred spikes suggests that the habenula is subcritical. Activity had low covariance and a small mean, arguing against dynamic criticality. A multiple regression estimator of autocorrelation time suggests that the habenula is neither fully asynchronous nor perfectly critical, but is reverberating. This pattern of dynamics may enable integration of information and high flexibility in the tuning of network properties, thus providing a potential mechanism for the optimal responses to a changing environment. Significance Statement: Spontaneous activity in neurons shapes the response to stimuli. One structure with a high level of spontaneous neuronal activity is the habenula, a regulator of broadly acting neuromodulators involved in mood and learning. How does this activity influence habenula function? We show here that the habenula of a resting animal is near criticality, in a state termed reverberation. This pattern of dynamics is consistent with high sensitivity and flexibility, and may enable the habenula to respond optimally to a wide range of stimuli.
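The subcritical-versus-critical distinction invoked above can be illustrated with a toy branching process (a standard model of neural avalanches, not the habenula analysis itself): with branching ratio m < 1 avalanches stay small and the mean avalanche size is 1/(1 − m), which diverges as m approaches the critical value 1:

```python
import random

def avalanche_size(m, rng, cap=100_000):
    """Total size of one avalanche in a branching process: each active unit
    activates Binomial(2, m/2) successors, giving branching ratio m."""
    size, active = 0, 1
    while active and size < cap:
        size += active
        active = sum(1 for _ in range(2 * active) if rng.random() < m / 2)
    return size

rng = random.Random(7)
sizes = [avalanche_size(0.5, rng) for _ in range(5000)]
mean_size = sum(sizes) / len(sizes)
print(mean_size)  # subcritical theory: mean size = 1/(1 - 0.5) = 2
```

A "reverberating" regime, as estimated for the habenula, sits close to but below m = 1, where avalanches are large but still finite.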
Affiliation(s)
- Suryadi
- School of Physical & Mathematical Sciences, Nanyang Technological University, Singapore 637371
- Ruey-Kuang Cheng
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore 636921
- Elliot Birkett
- Institute of Molecular and Cell Biology, Singapore 138673
- School of Biosciences, University of Sheffield, Sheffield, United Kingdom
- Suresh Jesuthasan
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore 636921
- Institute of Molecular and Cell Biology, Singapore 138673
- Lock Yue Chew
- School of Physical & Mathematical Sciences, Nanyang Technological University, Singapore 637371
- Complexity Institute, Nanyang Technological University, Singapore 637335
17
Herbert E, Ostojic S. The impact of sparsity in low-rank recurrent neural networks. PLoS Comput Biol 2022; 18:e1010426. [PMID: 35944030 PMCID: PMC9390915 DOI: 10.1371/journal.pcbi.1010426] [Received: 04/04/2022] [Revised: 08/19/2022] [Accepted: 07/22/2022] [Indexed: 11/18/2022]
Abstract
Neural population dynamics are often highly coordinated, allowing task-related computations to be understood as neural trajectories through low-dimensional subspaces. How the network connectivity and input structure give rise to such activity can be investigated with the aid of low-rank recurrent neural networks, a recently-developed class of computational models which offer a rich theoretical framework linking the underlying connectivity structure to emergent low-dimensional dynamics. This framework has so far relied on the assumption of all-to-all connectivity, yet cortical networks are known to be highly sparse. Here we investigate the dynamics of low-rank recurrent networks in which the connections are randomly sparsified, which makes the network connectivity formally full-rank. We first analyse the impact of sparsity on the eigenvalue spectrum of low-rank connectivity matrices, and use this to examine the implications for the dynamics. We find that in the presence of sparsity, the eigenspectra in the complex plane consist of a continuous bulk and isolated outliers, a form analogous to the eigenspectra of connectivity matrices composed of a low-rank and a full-rank random component. This analogy allows us to characterise distinct dynamical regimes of the sparsified low-rank network as a function of key network parameters. Altogether, we find that the low-dimensional dynamics induced by low-rank connectivity structure are preserved even at high levels of sparsity, and can therefore support rich and robust computations even in networks sparsified to a biologically-realistic extent. In large networks of neurons, the activity displayed by the population depends on the strength of the connections between each neuron. In cortical regions engaged in cognitive tasks, this population activity is often seen to be highly coordinated and low-dimensional. 
A recent line of theoretical work explores how such coordinated activity can arise in a network of neurons in which the matrix defining the connections is constrained to be mathematically low-rank. Until now, this connectivity structure has only been explored in fully-connected networks, in which every neuron is connected to every other. However, in the brain, network connections are often highly sparse, in the sense that most neurons do not share direct connections. Here, we test the robustness of the theoretical framework of low-rank networks to the reality of sparsity present in biological networks. By mathematically analysing the impact of removing connections, we find that the low-dimensional dynamics previously found in dense low-rank networks can in fact persist even at very high levels of sparsity. This has promising implications for the proposal that complex cortical computations which appear to rely on low-dimensional dynamics may be underpinned by a network which has a fundamentally low-rank structure, albeit with only a small fraction of possible connections present.
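The key spectral claim, an isolated outlier surviving sparsification alongside a small bulk, can be checked numerically in a toy case (illustrative assumptions: unit-rank connectivity J = m nᵀ/N with m = n = 1, entries kept independently with probability p):

```python
import random

# Sparsifying a rank-one matrix J_ij = 1/N (keep-probability p) rescales the
# outlier eigenvalue to ~p, while the random bulk stays of order
# sqrt(p(1-p)/N) and therefore well separated for large N.
N, p = 200, 0.5
rng = random.Random(0)
J = [[1.0 / N if rng.random() < p else 0.0 for _ in range(N)] for _ in range(N)]

# Power iteration for the dominant eigenvalue of the nonnegative matrix J.
v = [1.0] * N
lam = 1.0
for _ in range(50):
    w = [sum(row[j] * v[j] for j in range(N)) for row in J]
    lam = max(w)          # max-norm ratio converges to the Perron eigenvalue
    v = [x / lam for x in w]
print(lam)  # close to p = 0.5
```

The surviving outlier is what preserves the low-dimensional dynamics; only the bulk, which scales like sqrt(p(1−p)/N), is contributed by the sparsification noise.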
Affiliation(s)
- Elizabeth Herbert
- Laboratoire de Neurosciences Cognitives et Computationnelles, Département d’Études Cognitives, INSERM U960, École Normale Supérieure - PSL University, Paris, France
- * E-mail: (EH); (SO)
- Srdjan Ostojic
- Laboratoire de Neurosciences Cognitives et Computationnelles, Département d’Études Cognitives, INSERM U960, École Normale Supérieure - PSL University, Paris, France
- * E-mail: (EH); (SO)
18
Ranft J, Lindner B. A self-consistent analytical theory for rotator networks under stochastic forcing: Effects of intrinsic noise and common input. Chaos 2022; 32:063131. [PMID: 35778158 DOI: 10.1063/5.0096000] [Received: 04/14/2022] [Accepted: 05/23/2022] [Indexed: 06/15/2023]
Abstract
Despite the incredible complexity of our brains' neural networks, theoretical descriptions of neural dynamics have led to profound insights into possible network states and dynamics. It remains challenging to develop theories that apply to spiking networks and thus allow one to characterize the dynamic properties of biologically more realistic networks. Here, we build on recent work by van Meegen and Lindner, who have shown that "rotator networks," while considerably simpler than real spiking networks and, therefore, more amenable to mathematical analysis, still allow one to capture dynamical properties of networks of spiking neurons. This framework can be easily extended to the case where individual units receive uncorrelated stochastic input, which can be interpreted as intrinsic noise. However, the assumptions of the theory no longer apply when the input received by the single rotators is strongly correlated among units. As we show, in this case, the network fluctuations become significantly non-Gaussian, which calls for a reworking of the theory. Using a cumulant expansion, we develop a self-consistent analytical theory that accounts for the observed non-Gaussian statistics. Our theory provides a starting point for further studies of more general network setups and information transmission properties of these networks.
Affiliation(s)
- Jonas Ranft
- Institut de Biologie de l'ENS, Ecole Normale Supérieure, CNRS, Inserm, Université PSL, 46 rue d'Ulm, 75005 Paris, France
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany and Department of Physics, Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
19
Layer M, Senk J, Essink S, van Meegen A, Bos H, Helias M. NNMT: Mean-Field Based Analysis Tools for Neuronal Network Models. Front Neuroinform 2022; 16:835657. [PMID: 35712677 PMCID: PMC9196133 DOI: 10.3389/fninf.2022.835657] [Received: 12/14/2021] [Accepted: 03/17/2022] [Indexed: 11/13/2022]
Abstract
Mean-field theory of neuronal networks has led to numerous advances in our analytical and intuitive understanding of their dynamics during the past decades. In order to make mean-field based analysis tools more accessible, we implemented an extensible, easy-to-use open-source Python toolbox that collects a variety of mean-field methods for the leaky integrate-and-fire neuron model. The Neuronal Network Mean-field Toolbox (NNMT) in its current state allows for estimating properties of large neuronal networks, such as firing rates, power spectra, and dynamical stability in mean-field and linear response approximation, without running simulations. In this article, we describe how the toolbox is implemented, show how it is used to reproduce results of previous studies, and discuss different use-cases, such as parameter space explorations, or mapping different network models. Although the initial version of the toolbox focuses on methods for leaky integrate-and-fire neurons, its structure is designed to be open and extensible. It aims to provide a platform for collecting analytical methods for neuronal network model analysis, such that the neuroscientific community can take maximal advantage of them.
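The kind of "firing rate without simulation" computation such toolboxes provide can be sketched in a few lines (this is not NNMT's actual API, just a self-contained illustration): the stationary rate of a white-noise-driven leaky integrate-and-fire neuron follows from the Siegert formula, r⁻¹ = τ_ref + τ_m √π ∫_{(V_r−μ)/σ}^{(V_θ−μ)/σ} e^{u²}(1 + erf(u)) du, evaluated here by simple quadrature:

```python
import math

def lif_rate(mu, sigma, tau_m=0.02, tau_ref=0.002, v_reset=0.0, v_thresh=0.02):
    """Stationary firing rate (Hz) of a leaky integrate-and-fire neuron with
    mean input mu and noise strength sigma (both in volts), via trapezoidal
    quadrature of the Siegert integral."""
    a = (v_reset - mu) / sigma   # lower integration bound
    b = (v_thresh - mu) / sigma  # upper integration bound
    n = 2000
    h = (b - a) / n
    f = lambda u: math.exp(u * u) * (1.0 + math.erf(u))
    integral = h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + k * h) for k in range(1, n)))
    return 1.0 / (tau_ref + tau_m * math.sqrt(math.pi) * integral)

print(round(lif_rate(0.025, 0.005), 1))  # suprathreshold drive: tens of Hz
print(round(lif_rate(0.010, 0.005), 2))  # subthreshold drive: well below 1 Hz
```

Parameter values are illustrative assumptions; a production tool additionally handles synaptic filtering and solves the self-consistency across populations.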
Affiliation(s)
- Moritz Layer
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Johanna Senk
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Simon Essink
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Alexander van Meegen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Institute of Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany
- Hannah Bos
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
20
Loss of neuronal heterogeneity in epileptogenic human tissue impairs network resilience to sudden changes in synchrony. Cell Rep 2022; 39:110863. [PMID: 35613586 DOI: 10.1016/j.celrep.2022.110863] [Received: 10/19/2021] [Revised: 03/16/2022] [Accepted: 05/03/2022] [Indexed: 12/25/2022]
Abstract
A myriad of pathological changes associated with epilepsy can be recast as decreases in cell and circuit heterogeneity. We thus propose recontextualizing epileptogenesis as a process where reduction in cellular heterogeneity, in part, renders neural circuits less resilient to seizure. By comparing patch clamp recordings from human layer 5 (L5) cortical pyramidal neurons from epileptogenic and non-epileptogenic tissue, we demonstrate significantly decreased biophysical heterogeneity in seizure-generating areas. Implemented computationally, this renders model neural circuits prone to sudden transitions into synchronous states with increased firing activity, paralleling ictogenesis. This computational work also explains the surprising finding of significantly decreased excitability in the population-activation functions of neurons from epileptogenic tissue. Finally, mathematical analyses reveal a bifurcation structure arising only with low heterogeneity and associated with seizure-like dynamics. Taken together, this work provides experimental, computational, and mathematical support for the theory that ictogenic dynamics accompany a reduction in biophysical heterogeneity.
21
Fang X, Duan S, Wang L. Memristive Izhikevich Spiking Neuron Model and Its Application in Oscillatory Associative Memory. Front Neurosci 2022; 16:885322. [PMID: 35592261 PMCID: PMC9110805 DOI: 10.3389/fnins.2022.885322] [Received: 02/27/2022] [Accepted: 04/13/2022] [Indexed: 11/30/2022]
Abstract
The Izhikevich (IZH) spiking neuron model can display the spiking and bursting behaviors of neurons. Based on the switching property and bio-plausibility of the memristor, a memristive Izhikevich (MIZH) spiking neuron model is built. First, the MIZH spiking model is introduced and used to generate 23 spiking patterns, which are compared with the 23 patterns produced by the IZH model. Second, the MIZH spiking model faithfully reproduces various neuronal behaviors, including those of excitatory cortical neurons, inhibitory cortical neurons, and other cortical neurons. Finally, the collective dynamical activities of the MIZH neuronal network are examined, and the MIZH oscillatory network is constructed. Experimental results illustrate that the constructed MIZH spiking neuron model exhibits a high firing frequency and good frequency adaptation. The model can easily simulate the various spiking and bursting patterns of distinct neurons in the brain. The MIZH neuronal network realizes both synchronous and asynchronous collective behaviors, and the MIZH oscillatory network can memorize and retrieve information patterns correctly and efficiently, with high retrieval accuracy.
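For orientation, the original (non-memristive) two-variable Izhikevich model that the MIZH variant builds on can be simulated in a few lines; the parameters below are Izhikevich's regular-spiking set, and the memristive coupling itself is not reproduced here:

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, t_max=1000.0, dt=0.5):
    """Euler simulation of the Izhikevich neuron; returns spike times (ms).
    dv/dt = 0.04 v^2 + 5 v + 140 - u + I,   du/dt = a (b v - u),
    with reset v <- c, u <- u + d whenever v reaches 30 mV."""
    v, u = -65.0, b * (-65.0)
    spikes, t = [], 0.0
    while t < t_max:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:           # spike: record and reset
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes

spikes = izhikevich()
print(len(spikes))  # tonic spiking under constant drive
```

Changing (a, b, c, d) reproduces the other firing patterns catalogued in the abstract (e.g. bursting with c = -50, d = 2), which is the flexibility the memristive variant inherits.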
22
Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. [PMID: 35590546 DOI: 10.1103/physreve.105.044411] [Received: 11/10/2021] [Accepted: 03/04/2022] [Indexed: 06/15/2023]
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
23
Abstract
Neurophysiological measurements suggest that human information processing is evinced by neuronal activity. However, the quantitative relationship between the activity of a brain region and its information processing capacity remains unclear. We introduce and validate a mathematical model of the information processing capacity of a brain region in terms of neuronal activity, input storage capacity, and the arrival rate of afferent information. We applied the model to fMRI data obtained from a flanker paradigm in young and old subjects. Our analysis showed that—for a given cognitive task and subject—higher information processing capacity leads to lower neuronal activity and faster responses. Crucially, processing capacity—as estimated from fMRI data—predicted task and age-related differences in reaction times, speaking to the model’s predictive validity. This model offers a framework for modelling of brain dynamics in terms of information processing capacity, and may be exploited for studies of predictive coding and Bayes-optimal decision-making.
24
di Volo M, Segneri M, Goldobin DS, Politi A, Torcini A. Coherent oscillations in balanced neural networks driven by endogenous fluctuations. Chaos 2022; 32:023120. [PMID: 35232059 DOI: 10.1063/5.0075751] [Received: 10/18/2021] [Accepted: 01/26/2022] [Indexed: 06/14/2023]
Abstract
We present a detailed analysis of the dynamical regimes observed in a balanced network of identical quadratic integrate-and-fire neurons with sparse connectivity for homogeneous and heterogeneous in-degree distributions. Depending on the parameter values, either an asynchronous regime or periodic oscillations spontaneously emerge. Numerical simulations are compared with a mean-field model based on a self-consistent Fokker-Planck equation (FPE). The FPE reproduces quite well the asynchronous dynamics in the homogeneous case by either assuming a Poissonian or renewal distribution for the incoming spike trains. An exact self-consistent solution for the mean firing rate obtained in the limit of infinite in-degree allows identifying balanced regimes that can be either mean- or fluctuation-driven. A low-dimensional reduction of the FPE in terms of circular cumulants is also considered. Two cumulants suffice to reproduce the transition scenario observed in the network. The emergence of periodic collective oscillations is well captured both in the homogeneous and heterogeneous setups by the mean-field models upon tuning either the connectivity or the input DC current. In the heterogeneous situation, we analyze also the role of structural heterogeneity.
Affiliation(s)
- Matteo di Volo
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- Marco Segneri
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
- Denis S Goldobin
- Institute of Continuous Media Mechanics, Ural Branch of RAS, Acad. Korolev street 1, 614013 Perm, Russia
- Antonio Politi
- Institute for Pure and Applied Mathematics and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, UMR 8089, CY Cergy Paris Université, CNRS, 95302 Cergy-Pontoise, France
25
Liang J, Zhou C. Criticality enhances the multilevel reliability of stimulus responses in cortical neural networks. PLoS Comput Biol 2022; 18:e1009848. [PMID: 35100254 PMCID: PMC8830719 DOI: 10.1371/journal.pcbi.1009848] [Received: 04/22/2021] [Revised: 02/10/2022] [Accepted: 01/18/2022] [Indexed: 11/18/2022]
Abstract
Cortical neural networks exhibit high internal variability in spontaneous dynamic activities and they can robustly and reliably respond to external stimuli with multilevel features–from microscopic irregular spiking of neurons to macroscopic oscillatory local field potential. A comprehensive study integrating these multilevel features in spontaneous and stimulus–evoked dynamics with seemingly distinct mechanisms is still lacking. Here, we study the stimulus–response dynamics of biologically plausible excitation–inhibition (E–I) balanced networks. We confirm that networks around critical synchronous transition states can maintain strong internal variability but are sensitive to external stimuli. In this dynamical region, applying a stimulus to the network can reduce the trial-to-trial variability and shift the network oscillatory frequency while preserving the dynamical criticality. These multilevel features widely observed in different experiments cannot simultaneously occur in non-critical dynamical states. Furthermore, the dynamical mechanisms underlying these multilevel features are revealed using a semi-analytical mean-field theory that derives the macroscopic network field equations from the microscopic neuronal networks, enabling the analysis by nonlinear dynamics theory and linear noise approximation. The generic dynamical principle revealed here contributes to a more integrative understanding of neural systems and brain functions and incorporates multimodal and multilevel experimental observations. The E–I balanced neural network in combination with the effective mean-field theory can serve as a mechanistic modeling framework to study the multilevel neural dynamics underlying neural information and cognitive processes. The complexity and variability of brain dynamical activity range from neuronal spiking and neural avalanches to oscillatory local field potentials of local neural circuits in both spontaneous and stimulus-evoked states. 
Such multilevel variable brain dynamics are functionally and behaviorally relevant and are principal components of the underlying circuit organization. To more comprehensively clarify their neural mechanisms, we use a bottom-up approach to study the stimulus–response dynamics of neural circuits. Our model assumes the following key biologically plausible components: excitation–inhibition (E–I) neuronal interaction and chemical synaptic coupling. We show that the circuits with E–I balance have a special dynamic sub-region, the critical region. Circuits around this region could account for the emergence of multilevel brain response patterns, both ongoing and stimulus-induced, observed in different experiments, including the reduction of trial-to-trial variability, effective modulation of gamma frequency, and preservation of criticality in the presence of a stimulus. We further analyze the corresponding nonlinear dynamical principles using a novel and highly generalizable semi-analytical mean-field theory. Our computational and theoretical studies explain the cross-level brain dynamical organization of spontaneous and evoked states in a more integrative manner.
Collapse
Affiliation(s)
- Junhao Liang
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong SAR, China
- Centre for Integrative Neuroscience, Eberhard Karls University of Tübingen, Tübingen, Germany
- Department for Sensory and Sensorimotor Systems, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
| | - Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong SAR, China
- Department of Physics, Zhejiang University, Hangzhou, China
| |
Collapse
|
26
|
Kullmann R, Knoll G, Bernardi D, Lindner B. Critical current for giant Fano factor in neural models with bistable firing dynamics and implications for signal transmission. Phys Rev E 2022; 105:014416. [PMID: 35193262 DOI: 10.1103/physreve.105.014416] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/08/2020] [Accepted: 01/05/2022] [Indexed: 06/14/2023]
Abstract
Bistability in the firing rate is a prominent feature of different types of neurons as well as of neural networks. We show that for a constant input below a critical value, such bistability can lead to giant spike-count diffusion. We study the transmission of a periodic signal and demonstrate that close to the critical bias current, the signal-to-noise ratio undergoes a sharp increase, an effect that can be traced back to the giant diffusion and the large Fano factor.
Collapse
Affiliation(s)
- Richard Kullmann
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| | - Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| | - Davide Bernardi
- Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, via Fossato di Mortara 19, 44121 Ferrara, Italy
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| |
Collapse
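The giant spike-count diffusion discussed in this entry is quantified by the Fano factor (variance over mean of spike counts). The following Python sketch is a hedged illustration of the idea only: a toy two-state Poisson process with made-up rates and dwell times, not the bistable neuron models of the paper, showing how slow rate switching inflates the Fano factor far above the Poisson value of one.

```python
import numpy as np

rng = np.random.default_rng(0)

def fano_factor(spike_times, t_max, window):
    """Fano factor = variance/mean of spike counts in non-overlapping windows."""
    edges = np.arange(0.0, t_max + window, window)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.var() / counts.mean()

t_max = 2000.0

# Reference: a homogeneous Poisson process has a Fano factor close to 1.
n = rng.poisson(20.0 * t_max)
poisson_spikes = np.sort(rng.uniform(0.0, t_max, n))

# Toy bistable firing: the rate switches between a low and a high state
# with exponentially distributed dwell times, inflating the count variance.
segments, t = [], 0.0
while t < t_max:
    for rate in (2.0, 50.0):
        dur = rng.exponential(20.0)
        k = rng.poisson(rate * dur)
        segments.append(rng.uniform(t, t + dur, k))
        t += dur
bistable_spikes = np.sort(np.concatenate(segments))

print(fano_factor(poisson_spikes, t_max, 1.0))   # near 1
print(fano_factor(bistable_spikes, t_max, 1.0))  # much larger than 1
```

The same estimator applied over growing window sizes would reveal the slow, diffusion-dominated growth of the count variance that the paper analyzes.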
|
27
|
The Mean Field Approach for Populations of Spiking Neurons. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2022; 1359:125-157. [DOI: 10.1007/978-3-030-89439-9_6] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
Mean field theory is a device to analyze the collective behavior of a dynamical system comprising many interacting particles. The theory allows one to reduce the behavior of the system to the properties of a handful of parameters. In neural circuits, these parameters are typically the firing rates of distinct, homogeneous subgroups of neurons. Knowledge of the firing rates under conditions of interest can reveal essential information on both the dynamics of neural circuits and the way they can subserve brain function. The goal of this chapter is to provide an elementary introduction to the mean field approach for populations of spiking neurons. We introduce the general idea in networks of binary neurons, starting from the most basic results and then generalizing to more relevant situations. This allows us to derive the mean field equations in a simplified setting. We then derive the mean field equations for populations of integrate-and-fire neurons. An effort is made to derive the main equations of the theory using only elementary methods from calculus and probability theory. The chapter ends with a discussion of the assumptions of the theory and some of the consequences of violating those assumptions. This discussion includes an introduction to balanced and metastable networks and a brief catalogue of successful applications of the mean field approach to the study of neural circuits.
Collapse
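At its core, the mean field reduction described in this chapter replaces the full network by a self-consistency condition for the population rate. A minimal sketch of that step, assuming a generic sigmoidal transfer function with made-up parameters (not the specific gain functions derived in the chapter):

```python
import math

def gain(mu, theta=1.0, beta=4.0):
    """Generic sigmoidal transfer function: output rate vs. mean input.
    theta and beta are illustrative placeholders."""
    return 1.0 / (1.0 + math.exp(-beta * (mu - theta)))

def mean_field_rate(w_rec, ext, n_iter=200):
    """Solve the self-consistency nu = gain(w_rec * nu + ext)
    by damped fixed-point iteration."""
    nu = 0.5
    for _ in range(n_iter):
        nu = 0.5 * nu + 0.5 * gain(w_rec * nu + ext)
    return nu

nu = mean_field_rate(w_rec=0.8, ext=0.5)
print(nu)  # self-consistent population rate, between 0 and 1
```

The damping factor keeps the iteration stable when the recurrent feedback is strong; for multiple populations the scalar `nu` becomes a vector of subgroup rates.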
|
28
|
Bi H, di Volo M, Torcini A. Asynchronous and Coherent Dynamics in Balanced Excitatory-Inhibitory Spiking Networks. Front Syst Neurosci 2021; 15:752261. [PMID: 34955768 PMCID: PMC8702645 DOI: 10.3389/fnsys.2021.752261] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2021] [Accepted: 10/27/2021] [Indexed: 01/14/2023] Open
Abstract
Dynamic excitatory-inhibitory (E-I) balance is a paradigmatic mechanism invoked to explain the irregular low firing activity observed in the cortex. However, we will show that the E-I balance can be at the origin of other regimes observable in the brain. The analysis is performed by combining extensive simulations of sparse E-I networks composed of N spiking neurons with analytical investigations of low-dimensional neural mass models. The bifurcation diagrams, derived for the neural mass model, allow us to classify the possible asynchronous and coherent behaviors emerging in balanced E-I networks with structural heterogeneity for any finite in-degree K. Analytic mean-field (MF) results show that both supra- and sub-threshold balanced asynchronous regimes are observable in our system in the limit N >> K >> 1. Due to the heterogeneity, the asynchronous states are characterized at the microscopic level by the splitting of the neurons into three groups: silent, fluctuation driven, and mean driven. These features are consistent with experimental observations reported for heterogeneous neural circuits. The coherent rhythms observed in our system can range from periodic and quasi-periodic collective oscillations (COs) to coherent chaos. These rhythms are characterized by regular or irregular temporal fluctuations combined with spatial coherence, somewhat similar to the coherent fluctuations observed in the cortex over multiple spatial scales. The COs can emerge due to two different mechanisms. The first mechanism is analogous to the pyramidal-interneuron gamma (PING) mechanism usually invoked for the emergence of γ-oscillations. The second mechanism is intimately related to the presence of current fluctuations, which sustain COs characterized by an essentially simultaneous bursting of the two populations. We observe period-doubling cascades involving the PING-like COs, finally leading to the appearance of coherent chaos.
Fluctuation-driven COs are usually observable in our system as quasi-periodic collective motions characterized by two incommensurate frequencies. However, for sufficiently strong current fluctuations, these collective rhythms can lock. This represents a novel mechanism of frequency locking in neural populations promoted by intrinsic fluctuations. COs are observable for any finite in-degree K; however, their existence in the limit N >> K >> 1 appears uncertain.
Collapse
Affiliation(s)
- Hongjie Bi
- CY Cergy Paris Université, Laboratoire de Physique Théorique et Modélisation, CNRS, UMR 8089, Cergy-Pontoise, France
- Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology, Okinawa, Japan
| | - Matteo di Volo
- CY Cergy Paris Université, Laboratoire de Physique Théorique et Modélisation, CNRS, UMR 8089, Cergy-Pontoise, France
| | - Alessandro Torcini
- CY Cergy Paris Université, Laboratoire de Physique Théorique et Modélisation, CNRS, UMR 8089, Cergy-Pontoise, France
- CNR-Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, Sesto Fiorentino, Italy
| |
Collapse
|
29
|
Huang C. Modulation of the dynamical state in cortical network models. Curr Opin Neurobiol 2021; 70:43-50. [PMID: 34403890 PMCID: PMC8688204 DOI: 10.1016/j.conb.2021.07.004] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2021] [Revised: 05/18/2021] [Accepted: 07/14/2021] [Indexed: 11/29/2022]
Abstract
Cortical neural responses can be modulated by various factors, such as stimulus inputs and the behavioral state of the animal. Understanding the circuit mechanisms underlying modulations of network dynamics is important for understanding the flexibility of circuit computations. Identifying the dynamical state of a network is an important first step toward predicting network responses to external stimuli and top-down modulatory inputs. Models in stable or unstable dynamical regimes require different analytic tools to estimate the network responses to inputs and the structure of neural variability. In this article, I review recent cortical models of state-dependent responses and their predictions about the underlying modulatory mechanisms.
Collapse
Affiliation(s)
- Chengcheng Huang
- Departments of Neuroscience and Mathematics, University of Pittsburgh, Pittsburgh, PA, USA; Center for the Neural Basis of Cognition, Pittsburgh, PA, USA.
| |
Collapse
|
30
|
Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. [PMID: 34449771 PMCID: PMC8428727 DOI: 10.1371/journal.pcbi.1009261] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2020] [Revised: 09/09/2021] [Accepted: 07/08/2021] [Indexed: 11/19/2022] Open
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel’s time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). 
We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability. The elementary processing units in the central nervous system are neurons that transmit information by short electrical pulses, so-called action potentials or spikes. The generation of the action potential is a random process that can be shaped by correlated fluctuations (colored noise) and by adaptation. A consequence of these two ubiquitous features is that the successive time intervals between spikes, the interspike intervals, are not independent but correlated. As these correlations can significantly improve information transmission and weak-signal detection, it is an important task to develop analytical approaches to these statistics for well-established computational models. Here we present a theory of interval correlations for a widely used class of integrate-and-fire models endowed with an adaptation mechanism and subject to correlated fluctuations. We demonstrate which patterns of interval correlations can be expected from the interplay of colored noise, adaptation, and intrinsic nonlinear dynamics.
Collapse
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
| |
Collapse
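The central statistic of this paper, the serial correlation coefficient (SCC) of interspike intervals, is straightforward to estimate from data. As a hedged illustration of the geometric decay the theory predicts, the sketch below uses a surrogate AR(1)-modulated interval sequence with made-up parameters, not the integrate-and-fire models analyzed in the paper; for that surrogate the SCC at lag k is close to a**k.

```python
import numpy as np

rng = np.random.default_rng(1)

def serial_correlation(intervals, k):
    """Serial correlation coefficient rho_k between intervals lag k apart."""
    x = np.asarray(intervals, dtype=float)
    return float(np.corrcoef(x[:-k], x[k:])[0, 1])

# Surrogate interval sequence with AR(1) structure (a = 0.6):
# the SCC then decays geometrically with the lag, rho_k ~ a**k.
n, a = 200_000, 0.6
eta = np.zeros(n)
noise = rng.standard_normal(n)
for i in range(1, n):
    eta[i] = a * eta[i - 1] + noise[i]
intervals = 1.0 + 0.1 * eta  # mean interval 1, weakly modulated

print(serial_correlation(intervals, 1))  # close to 0.6
print(serial_correlation(intervals, 2))  # close to 0.36
```

A sum of two such geometric sequences, as derived in the paper, would correspond to two correlation time scales (adaptation and colored noise) acting together.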
|
31
|
Bolfe M, Metz FL, Guzmán-González E, Castillo IP. Analytic solution of the two-star model with correlated degrees. Phys Rev E 2021; 104:014147. [PMID: 34412227 DOI: 10.1103/physreve.104.014147] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2021] [Accepted: 07/07/2021] [Indexed: 11/06/2022]
Abstract
Exponential random graphs are important to model the structure of real-world complex networks. Here we solve the two-star model with degree-degree correlations in the sparse regime. The model constrains the average correlation between the degrees of adjacent nodes (nearest neighbors) and between the degrees at the end-points of two-stars (next-nearest neighbors). We compute exactly the network free energy and show that this model undergoes a first-order transition to a condensed phase. For non-negative degree correlations between next-nearest neighbors, the degree distribution inside the condensed phase has a single peak at the largest degree, while for negative degree correlations between next-nearest neighbors the condensed phase is characterized by a bimodal degree distribution. We calculate the degree assortativities and show they are nonmonotonic functions of the model parameters, with a discontinuous behavior at the first-order transition. The first-order critical line terminates at a second-order critical point, whose location in the phase diagram can be accurately determined. Our results can help to develop more detailed models of complex networks with correlated degrees.
Collapse
Affiliation(s)
- Maíra Bolfe
- Physics Department, Federal University of Santa Maria, 97105-900 Santa Maria, Brazil
| | - Fernando L Metz
- Physics Institute, Federal University of Rio Grande do Sul, 91501-970 Porto Alegre, Brazil and London Mathematical Laboratory, 8 Margravine Gardens, London W6 8RH, United Kingdom
| | - Edgar Guzmán-González
- Departamento de Física, Universidad Autónoma Metropolitana-Iztapalapa, San Rafael Atlixco 186, Ciudad de México 09340, México
| | - Isaac Pérez Castillo
- Departamento de Física, Universidad Autónoma Metropolitana-Iztapalapa, San Rafael Atlixco 186, Ciudad de México 09340, México
| |
Collapse
|
32
|
Houben AM. Frequency Selectivity of Neural Circuits With Heterogeneous Discrete Transmission Delays. Neural Comput 2021; 33:2068-2086. [PMID: 34310671 DOI: 10.1162/neco_a_01404] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2020] [Accepted: 02/24/2021] [Indexed: 11/04/2022]
Abstract
Neurons are connected to other neurons by axons and dendrites that conduct signals with finite velocities, resulting in delays between the firing of a neuron and the arrival of the resultant impulse at other neurons. Since delays greatly complicate the analytical treatment and interpretation of models, they are usually neglected or taken to be uniform, leaving a gap in our comprehension of the effects of delays in neural systems. This letter shows that heterogeneous transmission delays make small groups of neurons respond selectively to inputs with differing frequency spectra. By studying a single integrate-and-fire neuron receiving correlated time-shifted inputs, it is shown how the frequency response is linked to both the strengths and delay times of the afferent connections. The results show that incorporating delays alters the functioning of neural networks and changes the effects that neural connections and synaptic strengths have.
Collapse
|
33
|
Liang J, Wang SJ, Zhou C. Less is more: Wiring-economical modular networks support self-sustained firing-economical neural avalanches for efficient processing. Natl Sci Rev 2021; 9:nwab102. [PMID: 35355506 PMCID: PMC8962757 DOI: 10.1093/nsr/nwab102] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2020] [Revised: 04/28/2021] [Accepted: 05/13/2021] [Indexed: 11/12/2022] Open
Abstract
The brain network is notably cost-efficient, yet the fundamental physical and dynamical mechanisms underlying its economical optimization in network structure and activity have not been determined. In this study, we investigate the intricate cost-efficient interplay between structure and dynamics in biologically plausible spatial modular neuronal network models. We observe that critical avalanche states arising from excitation-inhibition balance under a modular network topology with lower wiring cost can also achieve lower firing costs but with strongly enhanced response sensitivity to stimuli. We derive mean-field equations that govern the macroscopic network dynamics through a novel approximate theory. The mechanism of low firing cost and stronger response in the form of critical avalanches is explained as a proximity to a Hopf bifurcation of the modules when increasing their connection density. Our work reveals the generic mechanism underlying the cost-efficient modular organization and critical dynamics widely observed in neural systems, providing insights into brain-inspired efficient computational designs.
Collapse
|
34
|
Knoll G, Lindner B. Recurrence-mediated suprathreshold stochastic resonance. J Comput Neurosci 2021; 49:407-418. [PMID: 34003421 PMCID: PMC8556192 DOI: 10.1007/s10827-021-00788-3] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2021] [Revised: 04/21/2021] [Accepted: 04/26/2021] [Indexed: 11/29/2022]
Abstract
It has previously been shown that the encoding of time-dependent signals by feedforward networks (FFNs) of processing units exhibits suprathreshold stochastic resonance (SSR), i.e., optimal signal transmission at a finite level of independent, individual stochasticity in the single units. In this study, a recurrent spiking network is simulated to demonstrate that SSR can also be caused by network noise in place of intrinsic noise. The level of autonomously generated fluctuations in the network can be controlled by the strength of synapses, and hence the coding fraction (our measure of information transmission) exhibits a maximum as a function of the synaptic coupling strength. The presence of a coding peak at an optimal coupling strength is robust over a wide range of individual, network, and signal parameters, although the optimal strength and peak magnitude depend on the parameter being varied. We also perform control experiments with an FFN illustrating that the optimized coding fraction is due to the change in noise level and not to other effects entailed by changing the coupling strength. These results also indicate that the non-white (temporally correlated) network noise in general provides an extra boost to encoding performance compared to the FFN driven by intrinsic white-noise fluctuations.
Collapse
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
| |
Collapse
|
35
|
Akil AE, Rosenbaum R, Josić K. Balanced networks under spike-time dependent plasticity. PLoS Comput Biol 2021; 17:e1008958. [PMID: 33979336 PMCID: PMC8143429 DOI: 10.1371/journal.pcbi.1008958] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2020] [Revised: 05/24/2021] [Accepted: 04/12/2021] [Indexed: 11/28/2022] Open
Abstract
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance and, in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and structure across the network? To address these questions, we develop a theory of spike-timing-dependent plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input. Animals are able to learn complex tasks through changes in individual synapses between cells. Such changes lead to the coevolution of neural activity patterns and the structure of neural connectivity, but the consequences of these interactions are not fully understood. We consider plasticity in model neural networks which achieve an average balance between the excitatory and inhibitory synaptic inputs to different cells, and display cortical-like, irregular activity.
We extend the theory of balanced networks to account for synaptic plasticity and show which rules can maintain balance, and which will drive the network into a different state. This theory of plasticity can provide insights into the relationship between stimuli, network dynamics, and synaptic circuitry.
Collapse
Affiliation(s)
- Alan Eric Akil
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
| | - Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, United States of America
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, United States of America
| | - Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, United States of America
- Department of Biology and Biochemistry, University of Houston, Houston, Texas, United States of America
| |
Collapse
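The pair-based spike-timing-dependent plasticity underlying such models updates a weight according to the relative timing of pre- and postsynaptic spikes. A minimal sketch using a generic exponential STDP window with made-up amplitudes and time constant, not the specific rules analyzed in the paper:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for a single spike pair; dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates, post-before-pre depresses.
    Amplitudes and time constant are illustrative placeholders."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def update_weight(w, pre_spikes, post_spikes, w_max=1.0):
    """Accumulate all-pairs STDP updates and clip the weight to [0, w_max]."""
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            w += stdp_dw(t_post - t_pre)
    return min(max(w, 0.0), w_max)

# Causal pairing (pre shortly before post) strengthens the synapse.
print(update_weight(0.5, pre_spikes=[0.0, 20.0], post_spikes=[5.0, 25.0]))
```

In a balanced network, the question the paper addresses is how many such microscopic updates, driven by correlated spiking, reshape the weight distribution without destroying the excitatory-inhibitory balance.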
|
36
|
Kim CM, Chow CC. Training Spiking Neural Networks in the Strong Coupling Regime. Neural Comput 2021; 33:1199-1233. [PMID: 34496392 DOI: 10.1162/neco_a_01379] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2020] [Accepted: 11/23/2020] [Indexed: 11/04/2022]
Abstract
Recurrent neural networks trained to perform complex tasks can provide insight into the dynamic mechanisms that underlie computations performed by cortical circuits. However, due to the large number of unconstrained synaptic connections, the recurrent connectivity that emerges from network training may not be biologically plausible. Therefore, it remains unknown if and how biological neural circuits implement the dynamic mechanisms proposed by the models. To narrow this gap, we developed a training scheme that, in addition to achieving learning goals, respects the structural and dynamic properties of a standard cortical circuit model: strongly coupled excitatory-inhibitory spiking neural networks. By preserving the strong mean excitatory and inhibitory coupling of the initial networks, we found that most of the trained synapses obeyed Dale's law without additional constraints, exhibited large trial-to-trial spiking variability, and operated in an inhibition-stabilized regime. We derived analytical estimates of how training and network parameters constrained the changes in mean synaptic strength during training. Our results demonstrate that training recurrent neural networks subject to strong coupling constraints can result in a connectivity structure and dynamic regime relevant to cortical circuits.
Collapse
Affiliation(s)
- Christopher M Kim
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases/National Institutes of Health, Bethesda, MD 20814, U.S.A.
| | - Carson C Chow
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases/National Institutes of Health, Bethesda, MD 20814, U.S.A.
| |
Collapse
|
37
|
Ullner E, Politi A. Collective dynamics in the presence of finite-width pulses. CHAOS (WOODBURY, N.Y.) 2021; 31:043135. [PMID: 34251252 DOI: 10.1063/5.0046691] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/05/2021] [Accepted: 04/02/2021] [Indexed: 06/13/2023]
Abstract
The idealization of neuronal pulses as δ-spikes is a convenient approach in neuroscience but can sometimes lead to erroneous conclusions. We investigate the effect of a finite pulse width on the dynamics of balanced neuronal networks. In particular, we study two populations of identical excitatory and inhibitory neurons in a random network of phase oscillators coupled through exponential pulses with different widths. We consider three coupling functions inspired by leaky integrate-and-fire neurons with delay and type I phase-response curves. By exploring the role of the pulse widths for different coupling strengths, we find a robust collective irregular dynamics, which collapses onto a fully synchronous regime if the inhibitory pulses are sufficiently wider than the excitatory ones. The transition to synchrony is accompanied by hysteretic phenomena (i.e., the co-existence of collective irregular and synchronous dynamics). Our numerical results are supported by a detailed scaling and stability analysis of the fully synchronous solution. A conjectured first-order phase transition emerging for δ-spikes is smoothed out for finite-width pulses.
Collapse
Affiliation(s)
- Ekkehard Ullner
- Institute for Pure and Applied Mathematics and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
| | - Antonio Politi
- Institute for Pure and Applied Mathematics and Department of Physics (SUPA), Old Aberdeen, Aberdeen AB24 3UE, United Kingdom
| |
Collapse
|
38
|
Martínez-Cañada P, Ness TV, Einevoll GT, Fellin T, Panzeri S. Computation of the electroencephalogram (EEG) from network models of point neurons. PLoS Comput Biol 2021; 17:e1008893. [PMID: 33798190 PMCID: PMC8046357 DOI: 10.1371/journal.pcbi.1008893] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2020] [Revised: 04/14/2021] [Accepted: 03/18/2021] [Indexed: 12/28/2022] Open
Abstract
The electroencephalogram (EEG) is a major tool for non-invasively studying brain function and dysfunction. Comparing experimentally recorded EEGs with neural network models is important to better interpret EEGs in terms of neural mechanisms. Most current neural network models use networks of simple point neurons. They capture important properties of cortical dynamics, and are numerically or analytically tractable. However, point neurons cannot generate an EEG, as EEG generation requires spatially separated transmembrane currents. Here, we explored how to compute an accurate approximation of a rodent's EEG with quantities defined in point-neuron network models. We constructed different approximations (or proxies) of the EEG signal that can be computed from networks of leaky integrate-and-fire (LIF) point neurons, such as firing rates, membrane potentials, and combinations of synaptic currents. We then evaluated how well each proxy reconstructed a ground-truth EEG obtained when the synaptic currents of the LIF model network were fed into a three-dimensional network model of multicompartmental neurons with realistic morphologies. Proxies based on linear combinations of AMPA and GABA currents performed better than proxies based on firing rates or membrane potentials. A new class of proxies, based on an optimized linear combination of time-shifted AMPA and GABA currents, provided the most accurate estimate of the EEG over a wide range of network states. The new linear proxies explained 85-95% of the variance of the ground-truth EEG for a wide range of network configurations including different cell morphologies, distributions of presynaptic inputs, positions of the recording electrode, and spatial extensions of the network. Non-linear EEG proxies using a convolutional neural network (CNN) on synaptic currents increased proxy performance by a further 2-8%. 
Our proxies can be used to easily calculate a biologically realistic EEG signal directly from point-neuron simulations thus facilitating a quantitative comparison between computational models and experimental EEG recordings.
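The best-performing class of proxies described above, a linear combination of time-shifted AMPA and GABA currents, is straightforward to compute from a point-neuron simulation. A minimal sketch follows; the weights and shifts are illustrative placeholders, not the optimized values reported in the paper:

```python
import numpy as np

def eeg_proxy(i_ampa, i_gaba, w_ampa=1.0, w_gaba=-1.65,
              shift_ampa_ms=0, shift_gaba_ms=6, dt_ms=1.0):
    """Linear EEG proxy: weighted sum of time-shifted population synaptic
    currents. Weights and shifts are illustrative placeholders; np.roll
    wraps at the edges, which is harmless for a long recording."""
    a = np.roll(i_ampa, int(shift_ampa_ms / dt_ms))
    g = np.roll(i_gaba, int(shift_gaba_ms / dt_ms))
    return w_ampa * a + w_gaba * g

# toy population currents: 1 s at 1 ms resolution
t = np.arange(1000)
i_ampa = 1.5 + np.sin(2 * np.pi * t / 100.0)              # arbitrary units
i_gaba = 1.2 + 0.8 * np.sin(2 * np.pi * (t - 5) / 100.0)
proxy = eeg_proxy(i_ampa, i_gaba)
print(proxy.shape)  # (1000,)
```

In practice `i_ampa` and `i_gaba` would be the summed synaptic currents of the simulated LIF population, and the weights would be fitted against a ground-truth signal.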
Collapse
Affiliation(s)
- Pablo Martínez-Cañada
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Optical Approaches to Brain Function Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
| | - Torbjørn V. Ness
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
| | - Gaute T. Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
| | - Tommaso Fellin
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Optical Approaches to Brain Function Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
| | - Stefano Panzeri
- Neural Coding Laboratory, Istituto Italiano di Tecnologia, Genova, Italy
- Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
| |
Collapse
|
39
|
Bernardi D, Doron G, Brecht M, Lindner B. A network model of the barrel cortex combined with a differentiator detector reproduces features of the behavioral response to single-neuron stimulation. PLoS Comput Biol 2021; 17:e1007831. [PMID: 33556070 PMCID: PMC7895413 DOI: 10.1371/journal.pcbi.1007831] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/25/2020] [Revised: 02/19/2021] [Accepted: 01/17/2021] [Indexed: 11/23/2022] Open
Abstract
The stimulation of a single neuron in the rat somatosensory cortex can elicit a behavioral response. The probability of a behavioral response does not depend appreciably on the duration or intensity of a constant stimulation, whereas the response probability increases significantly upon injection of an irregular current. Biological mechanisms that can potentially suppress a constant input signal are present in the dynamics of both neurons and synapses and seem ideal candidates to explain these experimental findings. Here, we study a large network of integrate-and-fire neurons with several salient features of neuronal populations in the rat barrel cortex. The model includes cellular spike-frequency adaptation, experimentally constrained numbers and types of chemical synapses endowed with short-term plasticity, and gap junctions. Numerical simulations of this model indicate that cellular and synaptic adaptation mechanisms alone may not suffice to account for the experimental results if the local network activity is read out by an integrator. However, a circuit that approximates a differentiator can detect the single-cell stimulation with a reliability that barely depends on the length or intensity of the stimulus, but that increases when an irregular signal is used. This finding is in accordance with the experimental results obtained for the stimulation of a regularly-spiking excitatory cell. It is widely assumed that only a large group of neurons can encode a stimulus or control behavior. This tenet of neuroscience has been challenged by experiments in which stimulating a single cortical neuron has had a measurable effect on an animal’s behavior. Recently, theoretical studies have explored how a single-neuron stimulation could be detected in a large recurrent network. However, these studies missed essential biological mechanisms of cortical networks and are unable to explain more recent experiments in the barrel cortex. 
Here, to describe the stimulated brain area, we propose and study a network model endowed with many important biological features of the barrel cortex. Importantly, we also investigate different readout mechanisms, i.e. ways in which the stimulation effects can propagate to other brain areas. We show that a readout network which tracks rapid variations in the local network activity is in agreement with the experiments. Our model demonstrates a possible mechanism for how the stimulation of a single neuron translates into a signal at the population level, which is taken as a proxy of the animal’s response. Our results illustrate the power of spiking neural networks to properly describe the effects of a single neuron’s activity.
Collapse
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Center for Translational Neurophysiology of Speech and Communication, Fondazione Istituto Italiano di Tecnologia, Ferrara, Italy
| | - Guy Doron
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Michael Brecht
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
| |
Collapse
|
40
|
Protachevicz PR, Iarosz KC, Caldas IL, Antonopoulos CG, Batista AM, Kurths J. Influence of Autapses on Synchronization in Neural Networks With Chemical Synapses. Front Syst Neurosci 2020; 14:604563. [PMID: 33328913 PMCID: PMC7734146 DOI: 10.3389/fnsys.2020.604563] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2020] [Accepted: 11/05/2020] [Indexed: 11/29/2022] Open
Abstract
A great deal of research has been devoted to the investigation of neural dynamics in various network topologies. However, only a few studies have focused on the influence of autapses, synapses from a neuron onto itself via closed loops, on neural synchronization. Here, we build a random network with adaptive exponential integrate-and-fire neurons coupled with chemical synapses, equipped with autapses, to study the effect of the latter on synchronous behavior. We consider time delay in the conductance of the pre-synaptic neuron for excitatory and inhibitory connections. Interestingly, in neural networks consisting of both excitatory and inhibitory neurons, we uncover that synchronous behavior depends on their synapse type. Our results provide evidence on the synchronous and desynchronous activities that emerge in random neural networks with chemical, inhibitory and excitatory synapses where neurons are equipped with autapses.
Collapse
Affiliation(s)
| | - Kelly C Iarosz
- Faculdade de Telêmaco Borba, FATEB, Telêmaco Borba, Brazil
- Graduate Program in Chemical Engineering, Federal University of Technology Paraná, Ponta Grossa, Brazil
| | - Iberê L Caldas
- Institute of Physics, University of São Paulo, São Paulo, Brazil
| | - Chris G Antonopoulos
- Department of Mathematical Sciences, University of Essex, Colchester, United Kingdom
| | - Antonio M Batista
- Institute of Physics, University of São Paulo, São Paulo, Brazil
- Department of Mathematics and Statistics, State University of Ponta Grossa, Ponta Grossa, Brazil
| | - Jurgen Kurths
- Department Complexity Science, Potsdam Institute for Climate Impact Research, Potsdam, Germany
- Department of Physics, Humboldt University, Berlin, Germany
- Centre for Analysis of Complex Systems, Sechenov First Moscow State Medical University, Moscow, Russia
| |
Collapse
|
41
|
Sadeh S, Clopath C. Inhibitory stabilization and cortical computation. Nat Rev Neurosci 2020; 22:21-37. [PMID: 33177630 DOI: 10.1038/s41583-020-00390-z] [Citation(s) in RCA: 54] [Impact Index Per Article: 13.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/22/2020] [Indexed: 12/22/2022]
Abstract
Neuronal networks with strong recurrent connectivity provide the brain with a powerful means to perform complex computational tasks. However, high-gain excitatory networks are susceptible to instability, which can lead to runaway activity, as manifested in pathological regimes such as epilepsy. Inhibitory stabilization offers a dynamic, fast and flexible compensatory mechanism to balance otherwise unstable networks, thus enabling the brain to operate in its most efficient regimes. Here we review recent experimental evidence for the presence of such inhibition-stabilized dynamics in the brain and discuss their consequences for cortical computation. We show how the study of inhibition-stabilized networks in the brain has been facilitated by recent advances in the technological toolbox and perturbative techniques, as well as a concomitant development of biologically realistic computational models. By outlining future avenues, we suggest that inhibitory stabilization can offer an exemplary case of how experimental neuroscience can progress in tandem with technology and theory to advance our understanding of the brain.
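The hallmark of an inhibition-stabilized network, the paradoxical effect, can be reproduced in a two-population linear rate model: when recurrent excitation alone is unstable (w_ee > 1) but inhibition stabilizes the full network, extra drive to the inhibitory population *lowers* its steady-state rate. The parameter values below are illustrative, not taken from the review:

```python
import numpy as np

def steady_state(w_ee, w_ei, w_ie, w_ii, h_e, h_i):
    """Fixed point of the linear rate model
       dE/dt = -E + w_ee*E - w_ei*I + h_e
       dI/dt = -I + w_ie*E - w_ii*I + h_i
    solved as a 2x2 linear system."""
    m = np.array([[1 - w_ee, w_ei],
                  [-w_ie, 1 + w_ii]])
    return np.linalg.solve(m, [h_e, h_i])

# w_ee > 1: the E subnetwork alone is unstable; inhibition stabilizes it
params = dict(w_ee=2.0, w_ei=2.5, w_ie=2.0, w_ii=0.5)
e0, i0 = steady_state(**params, h_e=2.0, h_i=0.5)
e1, i1 = steady_state(**params, h_e=2.0, h_i=1.0)   # extra drive to I
print(i1 < i0)  # True: the paradoxical effect, a signature of an ISN
```

With w_ee < 1 the same extra drive would raise the inhibitory rate, which is why the paradoxical response serves as a perturbative test for inhibition-stabilized dynamics.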
Collapse
Affiliation(s)
- Sadra Sadeh
- Bioengineering Department, Imperial College London, London, UK
| | - Claudia Clopath
- Bioengineering Department, Imperial College London, London, UK.
| |
Collapse
|
42
|
Li J, Shew WL. Tuning network dynamics from criticality to an asynchronous state. PLoS Comput Biol 2020; 16:e1008268. [PMID: 32986705 PMCID: PMC7544040 DOI: 10.1371/journal.pcbi.1008268] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2020] [Revised: 10/08/2020] [Accepted: 08/17/2020] [Indexed: 01/03/2023] Open
Abstract
According to many experimental observations, neurons in cerebral cortex tend to operate in an asynchronous regime, firing independently of each other. In contrast, many other experimental observations reveal cortical population firing dynamics that are relatively coordinated and occasionally synchronous. These discrepant observations have naturally led to competing hypotheses. A commonly hypothesized explanation of asynchronous firing is that excitatory and inhibitory synaptic inputs are precisely correlated, nearly canceling each other, sometimes referred to as 'balanced' excitation and inhibition. On the other hand, the 'criticality' hypothesis posits an explanation of the more coordinated state that also requires a certain balance of excitatory and inhibitory interactions. Both hypotheses claim the same qualitative mechanism-properly balanced excitation and inhibition. Thus, a natural question arises: how are asynchronous population dynamics and critical dynamics related, and how do they differ? Here we propose an answer to this question based on investigation of a simple, network-level computational model. We show that the strength of inhibitory synapses relative to excitatory synapses can be tuned from weak to strong to generate a family of models that spans a continuum from critical dynamics to asynchronous dynamics. Our results demonstrate that the coordinated dynamics of criticality and asynchronous dynamics can be generated by the same neural system if excitatory and inhibitory synapses are tuned appropriately.
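The continuum the authors describe can be illustrated with a toy probabilistic binary network in which a single parameter g scales inhibition relative to excitation: near-critical coupling (small g) yields large, coordinated population-rate fluctuations, while strong inhibition quenches them. This is a sketch under hypothetical parameters, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(0)

def coordination(g, n=200, frac_inh=0.2, k=10, w_exc=0.14, steps=2000, h=0.01):
    """Probabilistic binary network; g scales inhibitory weights relative
    to excitatory ones. Returns a Fano-like measure of population-rate
    fluctuations (large near criticality, small when asynchronous)."""
    n_inh = int(frac_inh * n)
    sign = np.ones(n)
    sign[:n_inh] = -g                      # first n_inh neurons are inhibitory
    w = np.zeros((n, n))
    for i in range(n):                     # each neuron receives k random inputs
        pre = rng.choice(n, size=k, replace=False)
        w[i, pre] = w_exc * sign[pre]
    s = (rng.random(n) < 0.1).astype(float)
    rates = []
    for _ in range(steps):
        p = np.clip(w @ s + h, 0.0, 1.0)   # firing probability per neuron
        s = (rng.random(n) < p).astype(float)
        rates.append(s.mean())
    r = np.array(rates[200:])              # discard the transient
    return r.var() / (r.mean() + 1e-12)

weak, strong = coordination(g=0.5), coordination(g=4.0)
print(weak, strong)  # weak inhibition: coordinated; strong: asynchronous
```

Sweeping g between these two values traces out the family of models the abstract describes, with the coordination measure decreasing as inhibition strengthens.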
Collapse
Affiliation(s)
- Jingwen Li
- Department of Physics, University of Arkansas, Fayetteville, Arkansas, United States of America
| | - Woodrow L. Shew
- Department of Physics, University of Arkansas, Fayetteville, Arkansas, United States of America
| |
Collapse
|
43
|
Wilting J, Priesemann V. Between Perfectly Critical and Fully Irregular: A Reverberating Model Captures and Predicts Cortical Spike Propagation. Cereb Cortex 2020; 29:2759-2770. [PMID: 31008508 PMCID: PMC6519697 DOI: 10.1093/cercor/bhz049] [Citation(s) in RCA: 33] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2018] [Revised: 01/20/2019] [Indexed: 12/11/2022] Open
Abstract
Knowledge about the collective dynamics of cortical spiking is very informative about the underlying coding principles. However, even the most basic properties are not known with certainty, because their assessment is hampered by spatial subsampling, i.e., the limitation that only a tiny fraction of all neurons can be recorded simultaneously with millisecond precision. Building on a novel, subsampling-invariant estimator, we fit and carefully validate a minimal model for cortical spike propagation. The model interpolates between two prominent states: asynchronous and critical. We find neither of them in cortical spike recordings across various species, but instead identify a narrow "reverberating" regime. This approach enables us to predict yet unknown properties from very short recordings and for every circuit individually, including responses to minimal perturbations, intrinsic network timescales, and the strength of external input compared to recurrent activation, thereby informing about the underlying coding principles for each circuit, area, state and task.
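The subsampling problem and the multistep-regression idea behind a subsampling-invariant estimator can be demonstrated on a driven branching process: the conventional lag-1 regression is biased far below the true branching parameter when only a few units are observed, while fitting the geometric decay across many lags cancels the constant subsampling prefactor. This is an illustrative sketch of the principle, not the authors' published implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def branching_process(m=0.98, h=10.0, steps=20000):
    """Driven branching process with branching parameter m."""
    a = np.empty(steps)
    a[0] = h / (1 - m)                       # stationary mean
    for t in range(steps - 1):
        a[t + 1] = rng.poisson(m * a[t] + h)
    return a

def mr_estimate(activity, k_max=20):
    """Fit the lagged regression slopes r_k ~ b * m**k across lags k,
    so the constant subsampling prefactor b cancels out of the fit."""
    a = activity - activity.mean()
    var = (a * a).mean()
    ks = np.arange(1, k_max + 1)
    r = np.array([(a[:-k] * a[k:]).mean() / var for k in ks])
    slope, _ = np.polyfit(ks, np.log(r), 1)
    return float(np.exp(slope))

full = branching_process()                   # true m = 0.98
sub = rng.binomial(full.astype(int), 0.05)   # observe only ~5% of the units
m_naive = float(np.corrcoef(sub[:-1], sub[1:])[0, 1])  # biased low
m_mr = mr_estimate(sub)
print(m_naive, m_mr)   # naive estimate falls well below 0.98; MR recovers it
```

Values of m just below 1 correspond to the narrow "reverberating" regime the abstract identifies between asynchronous (m near 0) and critical (m = 1) dynamics.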
Collapse
Affiliation(s)
- J Wilting
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, Göttingen, Germany
| | - V Priesemann
- Max-Planck-Institute for Dynamics and Self-Organization, Am Faßberg 17, Göttingen, Germany
- Bernstein-Center for Computational Neuroscience, Göttingen, Germany
| |
Collapse
|
44
|
Kim CM, Egert U, Kumar A. Dynamics of multiple interacting excitatory and inhibitory populations with delays. Phys Rev E 2020; 102:022308. [PMID: 32942361 DOI: 10.1103/physreve.102.022308] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2018] [Accepted: 07/15/2020] [Indexed: 11/07/2022]
Abstract
A network consisting of excitatory and inhibitory (EI) neurons is a canonical model for understanding local cortical network activity. In this study, we extended the local circuit model and investigated how its dynamical landscape can be enriched when it interacts with another excitatory (E) population with long transmission delays. Through analysis of a rate model and numerical simulations of a corresponding network of spiking neurons, we studied the transition from stationary to oscillatory states by analyzing the Hopf bifurcation structure in terms of two network parameters: (1) transmission delay between the EI subnetwork and the E population and (2) inhibitory couplings that induced oscillatory activity in the EI subnetwork. We found that the critical coupling strength can be strongly modulated as a function of the transmission delay, and consequently the stationary state can be interwoven intricately with the oscillatory state. Such a dynamical landscape gave rise to an isolated stationary state surrounded by multiple oscillatory states that generated different frequency modes, and cross-frequency coupling developed naturally at the bifurcation points. We identified the network motifs with short- and long-range inhibitory connections that underlie the emergence of oscillatory states with multiple frequencies. Thus, we provided a mechanistic explanation of how the transmission delay to and from the additional E population altered the dynamical landscape. In summary, our results demonstrated the potential role of long-range connections in shaping the network activity of local cortical circuits.
Collapse
Affiliation(s)
| | - Ulrich Egert
- Bernstein Center Freiburg, 79104 Freiburg, Germany
- Biomicrotechnology, IMTEK-Department of Microsystems Engineering, University of Freiburg, 79110 Freiburg, Germany
| | - Arvind Kumar
- Bernstein Center Freiburg, 79104 Freiburg, Germany
- Department of Computational Science and Technology, School for Electrical Engineering and Computer Science, KTH Royal Institute of Technology, Lindstedtsvägen 3, 11428 Stockholm, Sweden
| |
Collapse
|
45
|
Bachmann C, Tetzlaff T, Duarte R, Morrison A. Firing rate homeostasis counteracts changes in stability of recurrent neural networks caused by synapse loss in Alzheimer's disease. PLoS Comput Biol 2020; 16:e1007790. [PMID: 32841234 PMCID: PMC7505475 DOI: 10.1371/journal.pcbi.1007790] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/26/2019] [Revised: 09/21/2020] [Accepted: 03/17/2020] [Indexed: 11/19/2022] Open
Abstract
The impairment of cognitive function in Alzheimer's disease is clearly correlated to synapse loss. However, the mechanisms underlying this correlation are only poorly understood. Here, we investigate how the loss of excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons alters their dynamical characteristics. Beyond the effects on the activity statistics, we find that the loss of excitatory synapses on excitatory neurons reduces the network's sensitivity to small perturbations. This decrease in sensitivity can be considered as an indication of a reduction of computational capacity. A full recovery of the network's dynamical characteristics and sensitivity can be achieved by firing rate homeostasis, here implemented by an up-scaling of the remaining excitatory-excitatory synapses. Mean-field analysis reveals that the stability of the linearised network dynamics is, in good approximation, uniquely determined by the firing rate, and thereby explains why firing rate homeostasis preserves not only the firing rate but also the network's sensitivity to small perturbations.
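The homeostatic compensation described above, upscaling the remaining excitatory-excitatory synapses after synapse loss so that the mean recurrent drive is restored, can be sketched in a simple rate network. This is a hypothetical stand-in for the paper's spiking model, with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_rate(w, ext=2.0, steps=500, dt=0.1):
    """Mean steady-state rate of dr/dt = -r + relu(W r + ext),
    a simple rate-network stand-in for the spiking network."""
    r = np.zeros(w.shape[0])
    for _ in range(steps):
        r += dt * (-r + np.maximum(w @ r + ext, 0.0))
    return float(r.mean())

n, p_conn, j = 100, 0.2, 0.04
inh = rng.random(n) < 0.2                      # 20% inhibitory neurons
w = (rng.random((n, n)) < p_conn) * j
w[:, inh] *= -5.0                              # stronger inhibitory weights

ee = (~inh)[:, None] & (~inh)[None, :]         # E->E entries (post x pre)
lost = (rng.random((n, n)) < 0.5) & ee         # remove half the E->E synapses
w_lesion = w.copy()
w_lesion[lost] = 0.0
w_homeo = w_lesion.copy()
w_homeo[ee] /= 0.5                             # upscale surviving E->E weights

r0, r1, r2 = mean_rate(w), mean_rate(w_lesion), mean_rate(w_homeo)
print(r0, r1, r2)   # lesion lowers the rate; homeostatic scaling restores it
```

Because the surviving E-E weights are divided by the survival fraction, the expected recurrent excitatory input, and with it the mean-field firing rate, returns to its pre-lesion value, mirroring the paper's observation that stability tracks the firing rate.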
Collapse
Affiliation(s)
- Claudia Bachmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Tom Tetzlaff
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Renato Duarte
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Abigail Morrison
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany
| |
Collapse
|
46
|
Krug K. Coding Perceptual Decisions: From Single Units to Emergent Signaling Properties in Cortical Circuits. Annu Rev Vis Sci 2020; 6:387-409. [PMID: 32600168 DOI: 10.1146/annurev-vision-030320-041223] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Spiking activity in single neurons of the primate visual cortex has been tightly linked to perceptual decisions. Any mechanism that reads out these perceptual signals to support behavior must respect the underlying neuroanatomy that shapes the functional properties of sensory neurons. Spatial distribution and timing of inputs to the next processing levels are critical, as conjoint activity of precursor neurons increases the spiking rate of downstream neurons and ultimately drives behavior. I set out how correlated activity might coalesce into a micropool of task-sensitive neurons signaling a particular percept to determine perceptual decision signals locally and for flexible interarea transmission depending on the task context. As data from more and more neurons and their complex interactions are analyzed, the space of computational mechanisms must be constrained based on what is plausible within neurobiological limits. This review outlines experiments to test the new perspectives offered by these extended methods.
Collapse
Affiliation(s)
- Kristine Krug
- Lehrstuhl für Sensorische Physiologie, Institut für Biologie, Otto-von-Guericke-Universität Magdeburg, 39120 Magdeburg, Germany
- Leibniz-Institut für Neurobiologie, 39118 Magdeburg, Germany
- Department of Physiology, Anatomy, and Genetics, Oxford University, Oxford OX1 3PT, United Kingdom
| |
Collapse
|
47
|
Chimera states in hybrid coupled neuron populations. Neural Netw 2020; 126:108-117. [DOI: 10.1016/j.neunet.2020.03.002] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/11/2019] [Revised: 02/03/2020] [Accepted: 03/02/2020] [Indexed: 01/01/2023]
|
48
|
Shao Y, Zhang J, Tao L. Dimensional reduction of emergent spatiotemporal cortical dynamics via a maximum entropy moment closure. PLoS Comput Biol 2020; 16:e1007265. [PMID: 32516336 PMCID: PMC7304648 DOI: 10.1371/journal.pcbi.1007265] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/11/2019] [Revised: 06/19/2020] [Accepted: 04/29/2020] [Indexed: 11/22/2022] Open
Abstract
Modern electrophysiological recordings and optical imaging techniques have revealed a diverse spectrum of spatiotemporal neural activities underlying fundamental cognitive processing. Oscillations, traveling waves and other complex population dynamical patterns are often concomitant with sensory processing, information transfer, decision making and memory consolidation. While neural population models such as neural mass, population density and kinetic theoretical models have been used to capture a wide range of the experimentally observed dynamics, a full account of how the multi-scale dynamics emerges from the detailed biophysical properties of individual neurons and the network architecture remains elusive. Here we apply a recently developed coarse-graining framework for reduced-dimensional descriptions of neuronal networks to model visual cortical dynamics. We show how, without introducing any new parameters, a sequence of models culminating in an augmented system of spatially coupled ODEs can effectively model a wide range of the observed cortical dynamics, ranging from visual stimulus orientation dynamics to traveling waves induced by visual illusory stimuli. In addition to an efficient simulation method, this framework also offers an analytic approach to studying large-scale network dynamics. As such, the dimensional reduction naturally leads to mesoscopic variables that capture the interplay between neuronal population stochasticity and network architecture that we believe to underlie many emergent cortical phenomena.
Collapse
Affiliation(s)
- Yuxiu Shao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, China
| | - Jiwei Zhang
- School of Mathematics and Statistics, and Hubei Key Laboratory of Computational Science, Wuhan University, China
| | - Louis Tao
- Center for Bioinformatics, National Laboratory of Protein Engineering and Plant Genetic Engineering, School of Life Sciences, Peking University, Beijing, China
- Center for Quantitative Biology, Peking University, Beijing, China
| |
Collapse
|
49
|
Abstract
Contemporary brain research seeks to understand how cognition is reducible to neural activity. Crucially, much of this effort is guided by a scientific paradigm that views neural activity as essentially driven by external stimuli. In contrast, recent perspectives argue that this paradigm is by itself inadequate and that understanding patterns of activity intrinsic to the brain is needed to explain cognition. Yet, despite this critique, the stimulus-driven paradigm still dominates-possibly because a convincing alternative has not been clear. Here, we review a series of findings suggesting such an alternative. These findings indicate that neural activity in the hippocampus occurs in one of three brain states that have radically different anatomical, physiological, representational, and behavioral correlates, together implying different functional roles in cognition. This three-state framework also indicates that neural representations in the hippocampus follow a surprising pattern of organization at the timescale of ∼1 s or longer. Lastly, beyond the hippocampus, recent breakthroughs indicate three parallel states in the cortex, suggesting shared principles and brain-wide organization of intrinsic neural activity.
Collapse
Affiliation(s)
- Kenneth Kay
- Howard Hughes Medical Institute, Kavli Institute for Fundamental Neuroscience, Department of Physiology, University of California San Francisco, San Francisco, California
| | - Loren M Frank
- Howard Hughes Medical Institute, Kavli Institute for Fundamental Neuroscience, Department of Physiology, University of California San Francisco, San Francisco, California
| |
Collapse
|
50
|
Whitwell HJ, Bacalini MG, Blyuss O, Chen S, Garagnani P, Gordleeva SY, Jalan S, Ivanchenko M, Kanakov O, Kustikova V, Mariño IP, Meyerov I, Ullner E, Franceschi C, Zaikin A. The Human Body as a Super Network: Digital Methods to Analyze the Propagation of Aging. Front Aging Neurosci 2020; 12:136. [PMID: 32523526 PMCID: PMC7261843 DOI: 10.3389/fnagi.2020.00136] [Citation(s) in RCA: 22] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2020] [Accepted: 04/22/2020] [Indexed: 12/13/2022] Open
Abstract
Biological aging is a complex process involving multiple biological processes. These can be understood theoretically through considering them as individual networks, e.g., epigenetic networks, cell-cell networks (such as astroglial networks), and population genetics. Mathematical modeling allows the combination of such networks so that they may be studied in unison, to better understand how the so-called "seven pillars of aging" combine and to generate hypotheses for treating aging as a condition at relatively early biological ages. In this review, we consider how recent progress in mathematical modeling can be utilized to investigate aging, particularly in, but not exclusive to, the context of degenerative neuronal disease. We also consider how the latest techniques for generating biomarker models for disease prediction, such as longitudinal analysis and parenclitic analysis, can be applied both as biomarker platforms for aging and as tools to better understand this inescapable condition. This review is written by a highly diverse and multi-disciplinary team of scientists from across the globe and calls for greater collaboration between diverse fields of research.
Collapse
Affiliation(s)
- Harry J Whitwell
- Department of Chemical Engineering, Imperial College London, London, United Kingdom
| | | | - Oleg Blyuss
- School of Physics, Astronomy and Mathematics, University of Hertfordshire, Hatfield, United Kingdom
- Department of Paediatrics and Paediatric Infectious Diseases, Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
| | - Shangbin Chen
- Britton Chance Centre for Biomedical Photonics, Wuhan National Laboratory for Optoelectronics-Huazhong University of Science and Technology, Wuhan, China
| | - Paolo Garagnani
- Department of Experimental, Diagnostic and Specialty Medicine (DIMES), University of Bologna, Bologna, Italy
| | - Susan Yu Gordleeva
- Laboratory of Systems Medicine of Healthy Aging, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
| | - Sarika Jalan
- Complex Systems Laboratory, Discipline of Physics, Indian Institute of Technology Indore, Indore, India
- Centre for Bio-Science and Bio-Medical Engineering, Indian Institute of Technology Indore, Indore, India
| | - Mikhail Ivanchenko
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
| | - Oleg Kanakov
- Laboratory of Systems Medicine of Healthy Aging, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
| | - Valentina Kustikova
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
| | - Ines P Mariño
- Department of Biology and Geology, Physics and Inorganic Chemistry, Universidad Rey Juan Carlos, Madrid, Spain
| | - Iosif Meyerov
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
| | - Ekkehard Ullner
- Department of Physics (SUPA), Institute for Complex Systems and Mathematical Biology, University of Aberdeen, Aberdeen, United Kingdom
| | - Claudio Franceschi
- Laboratory of Systems Medicine of Healthy Aging, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
| | - Alexey Zaikin
- Department of Paediatrics and Paediatric Infectious Diseases, Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University of Nizhny Novgorod, Nizhny Novgorod, Russia
- Department of Mathematics, Institute for Women's Health, University College London, London, United Kingdom
| |
Collapse
|