1
Koren V, Malerba SB, Schwalger T, Panzeri S. Efficient coding in biophysically realistic excitatory-inhibitory spiking networks. bioRxiv 2025:2024.04.24.590955. PMID: 38712237; PMCID: PMC11071478; DOI: 10.1101/2024.04.24.590955.
Abstract
The principle of efficient coding posits that sensory cortical networks are designed to encode maximal sensory information with minimal metabolic cost. Despite the major influence of efficient coding in neuroscience, it has remained unclear whether fundamental empirical properties of neural network activity can be explained solely based on this normative principle. Here, we derive the structural, coding, and biophysical properties of excitatory-inhibitory recurrent networks of spiking neurons that emerge directly from imposing that the network minimizes an instantaneous loss function and a time-averaged performance measure enacting efficient coding. We assumed that the network encodes a number of independent stimulus features varying with a time scale equal to the membrane time constant of excitatory and inhibitory neurons. The optimal network has biologically plausible biophysical features, including realistic integrate-and-fire spiking dynamics, spike-triggered adaptation, and a non-specific excitatory external input. The excitatory-inhibitory recurrent connectivity between neurons with similar stimulus tuning implements feature-specific competition, similar to that recently found in visual cortex. Networks with unstructured connectivity cannot reach comparable levels of coding efficiency. The optimal ratio of excitatory to inhibitory neurons and the ratio of mean inhibitory-to-inhibitory versus excitatory-to-inhibitory connectivity are comparable to those of cortical sensory networks. The efficient network solution exhibits an instantaneous balance between excitation and inhibition. The network can perform efficient coding even when external stimuli vary over multiple time scales. Together, these results suggest that key properties of biological neural networks may be accounted for by efficient coding.
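The network class analyzed here can be illustrated with a minimal simulation. The following is a rough sketch of an excitatory-inhibitory leaky integrate-and-fire network with spike-triggered adaptation and a non-specific external drive; all parameter values and the random connectivity are illustrative assumptions, not the optimal efficient-coding solution derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's optimized values)
N_E, N_I = 80, 20           # excitatory / inhibitory population sizes
N = N_E + N_I
dt, T = 0.1, 1000.0         # integration step and duration (ms)
tau_m, tau_a = 10.0, 100.0  # membrane and adaptation time constants (ms)
V_th, V_reset = 1.0, 0.0    # spike threshold and reset (dimensionless)
b = 0.1                     # spike-triggered adaptation increment
I_ext = 1.5                 # non-specific excitatory external input

# Random recurrent weights: columns of E neurons excite, columns of I inhibit
W = rng.normal(0.0, 0.01, (N, N))
W[:, :N_E] = np.abs(W[:, :N_E])
W[:, N_E:] = -np.abs(W[:, N_E:])

V = rng.uniform(0.0, 1.0, N)   # membrane potentials (randomized start)
a = np.zeros(N)                # adaptation currents
spike_counts = np.zeros(N)

for _ in range(int(T / dt)):
    spikes = V >= V_th
    spike_counts += spikes
    V[spikes] = V_reset
    a[spikes] += b                       # spike-triggered adaptation jump
    rec = W @ spikes.astype(float)       # recurrent input from current spikes
    V += dt / tau_m * (-V + I_ext - a) + rec
    a -= dt / tau_a * a                  # adaptation decays between spikes

rates_hz = spike_counts / (T / 1000.0)   # per-neuron firing rates (Hz)
```

With the external drive above threshold, every neuron fires, and the adaptation current keeps rates bounded; the structured, tuning-dependent connectivity of the optimal network is replaced here by a generic random matrix.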
Affiliation(s)
- Veronika Koren
- Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), 20251 Hamburg, Germany
- Institute of Mathematics, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Simone Blanco Malerba
- Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), 20251 Hamburg, Germany
- Tilo Schwalger
- Institute of Mathematics, Technische Universität Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Stefano Panzeri
- Institute of Neural Information Processing, Center for Molecular Neurobiology (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), 20251 Hamburg, Germany
2
Senk J, Hagen E, van Albada SJ, Diesmann M. Reconciliation of weak pairwise spike-train correlations and highly coherent local field potentials across space. Cereb Cortex 2024;34:bhae405. PMID: 39462814; PMCID: PMC11513197; DOI: 10.1093/cercor/bhae405.
Abstract
Multi-electrode arrays covering several square millimeters of neural tissue provide simultaneous access to population signals such as extracellular potentials and spiking activity of one hundred or more individual neurons. The interpretation of the recorded data calls for multiscale computational models with corresponding spatial dimensions and signal predictions. Multi-layer spiking neuron network models of local cortical circuits covering about 1 mm² have been developed, integrating experimentally obtained neuron-type-specific connectivity data and reproducing features of observed in-vivo spiking statistics. Local field potentials can be computed from the simulated spiking activity. We here extend a local network and local field potential model to an area of 4×4 mm², preserving the neuron density and introducing distance-dependent connection probabilities and conduction delays. We find that the upscaling procedure preserves the overall spiking statistics of the original model and reproduces asynchronous irregular spiking across populations and weak pairwise spike-train correlations in agreement with experimental recordings from sensory cortex. Also compatible with experimental observations, the correlation of local field potential signals is strong and decays over a distance of several hundred micrometers. Enhanced spatial coherence in the low-gamma band around 50 Hz may explain the recent report of an apparent band-pass filter effect in the spatial reach of the local field potential.
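The upscaling step that introduces distance-dependent connection probabilities can be sketched in a few lines; the Gaussian connectivity profile, sheet size, neuron count, and all parameters below are illustrative assumptions rather than the actual model specification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch: neurons on a 4x4 mm sheet, connected with a
# distance-dependent (Gaussian) probability. All parameters are assumptions.
L = 4.0      # side length of the sheet (mm)
N = 500      # number of neurons (heavily downscaled for illustration)
p0 = 0.3     # peak connection probability at zero distance
sigma = 0.3  # lateral spread of connectivity (mm)

pos = rng.uniform(0.0, L, size=(N, 2))

# Pairwise distances with periodic boundaries to avoid edge effects
diff = np.abs(pos[:, None, :] - pos[None, :, :])
diff = np.minimum(diff, L - diff)
dist = np.hypot(diff[..., 0], diff[..., 1])

p_conn = p0 * np.exp(-dist**2 / (2.0 * sigma**2))
np.fill_diagonal(p_conn, 0.0)        # no self-connections
A = rng.random((N, N)) < p_conn      # Boolean adjacency matrix

# Nearby pairs should be connected far more often than distant pairs
p_near = A[(dist > 0.0) & (dist < 0.3)].mean()
p_far = A[dist > 1.5].mean()
```

A distance-dependent conduction delay would follow the same pattern, e.g. a constant offset plus distance divided by a propagation speed.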
Affiliation(s)
- Johanna Senk
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Wilhelm-Johnen-Str., 52428 Jülich, Germany
- Sussex AI, School of Engineering and Informatics, University of Sussex, Chichester, Falmer, Brighton BN1 9QJ, United Kingdom
- Espen Hagen
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Wilhelm-Johnen-Str., 52428 Jülich, Germany
- Centre for Precision Psychiatry, Institute of Clinical Medicine, University of Oslo, and Division of Mental Health and Addiction, Oslo University Hospital, Ullevål Hospital, 0424 Oslo, Norway
- Sacha J van Albada
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Wilhelm-Johnen-Str., 52428 Jülich, Germany
- Institute of Zoology, University of Cologne, Zülpicher Str., 50674 Cologne, Germany
- Markus Diesmann
- Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Wilhelm-Johnen-Str., 52428 Jülich, Germany
- JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Wilhelm-Johnen-Str., 52428 Jülich Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Pauwelsstr., 52074 Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Otto-Blumenthal-Str., 52074 Aachen, Germany
3
Zheng T, Sugino M, Jimbo Y, Ermentrout GB, Kotani K. Analyzing top-down visual attention in the context of gamma oscillations: a layer-dependent network-of-networks approach. Front Comput Neurosci 2024;18:1439632. PMID: 39376575; PMCID: PMC11456483; DOI: 10.3389/fncom.2024.1439632.
Abstract
Top-down visual attention is a fundamental cognitive process that allows individuals to selectively attend to salient visual stimuli in the environment. Recent empirical findings have revealed that gamma oscillations participate in the modulation of visual attention. However, computational studies face challenges when analyzing the attentional process in the context of gamma oscillations due to their unstable nature and the complexity induced by the layered organization of the visual cortex. In this study, we propose a layer-dependent network-of-networks approach to analyze such attention with gamma oscillations. The model is validated by reproducing empirical findings on orientation preference and the enhancement of neuronal response due to top-down attention. We perform parameter plane analysis to classify neuronal responses into several patterns and find that the neuronal response to sensory and attention signals is modulated by the heterogeneity of the neuronal population. Furthermore, we reveal a counterintuitive scenario in which the excitatory populations in layer 2/3 and layer 5 exhibit opposite responses to the attentional input. By modifying the original model, we confirm that layer 6 plays an indispensable role in such cases. Our findings uncover layer-dependent dynamics in the cortical processing of visual attention and open up new possibilities for further research on layer-dependent properties of the cerebral cortex.
Affiliation(s)
- Tianyi Zheng
- Department of Precision Engineering, The University of Tokyo, Tokyo, Japan
- Masato Sugino
- Department of Precision Engineering, The University of Tokyo, Tokyo, Japan
- Yasuhiko Jimbo
- Department of Precision Engineering, The University of Tokyo, Tokyo, Japan
- G. Bard Ermentrout
- Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, United States
- Kiyoshi Kotani
- Department of Human and Engineered Environmental Studies, The University of Tokyo, Chiba, Japan
4
Becker MP, Idiart MAP. Mean-field method for generic conductance-based integrate-and-fire neurons with finite timescales. Phys Rev E 2024;109:024406. PMID: 38491595; DOI: 10.1103/physreve.109.024406.
Abstract
The construction of transfer functions in theoretical neuroscience plays an important role in determining the spiking rate behavior of neurons in networks. These functions can be obtained through various fitting methods, but the biological relevance of the parameters is not always clear. However, for stationary inputs, such functions can be obtained without the adjustment of free parameters by using mean-field methods. In this work, we expand current Fokker-Planck approaches to account for the concurrent influence of colored and multiplicative noise terms on generic conductance-based integrate-and-fire neurons. We reduce the resulting stochastic system through the application of the diffusion approximation to a one-dimensional Langevin equation. An effective Fokker-Planck equation is then constructed using Fox theory, and solved numerically using a newly developed double integration procedure to obtain the transfer function and the membrane potential distribution. The solution is capable of reproducing the transfer function and the stationary voltage distribution of simulated neurons across a wide range of parameters. The method can also be easily extended to account for different sources of noise with various multiplicative terms, and it can, in principle, be applied to other types of problems.
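For orientation, the classical stationary transfer function of a white-noise-driven leaky integrate-and-fire neuron (the Siegert formula) can be evaluated by direct numerical integration. This is the textbook baseline that mean-field treatments of this kind extend, not the colored/multiplicative-noise construction of the paper, and the parameter values are illustrative.

```python
import math
import numpy as np

def lif_rate(mu, sigma, tau_m=10.0, tau_ref=2.0, V_th=20.0, V_r=10.0):
    """Stationary firing rate (spikes/ms) of a leaky integrate-and-fire
    neuron driven by white noise with mean mu and amplitude sigma (mV):
    the classical Siegert formula, evaluated by trapezoidal quadrature."""
    y_r = (V_r - mu) / sigma
    y_th = (V_th - mu) / sigma
    u = np.linspace(y_r, y_th, 2001)
    integrand = np.exp(u**2) * (1.0 + np.array([math.erf(x) for x in u]))
    integral = float(np.sum((integrand[1:] + integrand[:-1]) * np.diff(u)) / 2.0)
    return 1.0 / (tau_ref + tau_m * math.sqrt(math.pi) * integral)

# The transfer function grows monotonically with the mean input current
rates = [lif_rate(mu, sigma=5.0) for mu in (15.0, 18.0, 21.0)]
```

The refractory period caps the rate at 1/tau_ref; the paper's method replaces this white-noise assumption with colored and multiplicative noise via an effective Fokker-Planck equation.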
Affiliation(s)
- Marcelo P Becker
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Marco A P Idiart
- Department of Physics, Institute of Physics, Federal University of Rio Grande do Sul, Porto Alegre, Brazil
5
Friedenberger Z, Naud R. Dendritic excitability controls overdispersion. Nat Comput Sci 2024;4:19-28. PMID: 38177495; DOI: 10.1038/s43588-023-00580-6.
Abstract
The brain is an intricate assembly of intercommunicating neurons whose input-output function is only partially understood. The role of active dendrites in shaping spiking responses, in particular, is unclear. Although existing models account for active dendrites and spiking responses, they are too complex to analyze analytically and demand long stochastic simulations. Here we combine cable and renewal theory to describe how input fluctuations shape the response of neuronal ensembles with active dendrites. We found that dendritic input readily and potently controls interspike interval dispersion. This phenomenon can be understood by considering that neurons display three fundamental operating regimes: one mean-driven regime and two fluctuation-driven regimes. We show that these results are expected to appear for a wide range of dendritic properties and verify predictions of the model in experimental data. These findings have implications for the role of interspike interval dispersion in learning and for theories of attractor states.
Affiliation(s)
- Zachary Friedenberger
- Centre for Neural Dynamics and Artificial Intelligence, University of Ottawa, Ottawa, Ontario, Canada
- Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
- Richard Naud
- Centre for Neural Dynamics and Artificial Intelligence, University of Ottawa, Ottawa, Ontario, Canada
- Department of Physics, University of Ottawa, Ottawa, Ontario, Canada
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
6
Skaar JEW, Haug N, Stasik AJ, Einevoll GT, Tøndel K. Metamodelling of a two-population spiking neural network. PLoS Comput Biol 2023;19:e1011625. PMID: 38032904; PMCID: PMC10688753; DOI: 10.1371/journal.pcbi.1011625.
Abstract
In computational neuroscience, hypotheses are often formulated as bottom-up mechanistic models of the systems in question, consisting of differential equations that can be numerically integrated forward in time. Candidate models can then be validated by comparison against experimental data. The outputs of neural network models depend on neuron parameters, connectivity parameters, and other model inputs. Successful model fitting requires sufficient exploration of the model parameter space, which can be computationally demanding. Additionally, identifying degeneracy in the parameters, i.e., different combinations of parameter values that produce similar outputs, is of interest, as they define the subset of parameter values consistent with the data. In this computational study, we apply metamodels to a two-population recurrent spiking network of point neurons, the so-called Brunel network. Metamodels are data-driven approximations to more complex models with more desirable computational properties, which can be run considerably faster than the original model. Specifically, we apply and compare two different metamodelling techniques, masked autoregressive flows (MAF) and deep Gaussian process regression (DGPR), to estimate the power spectra of two different signals: the population spiking activities and the local field potential (LFP). We find that the metamodels are able to accurately model the power spectra in the asynchronous irregular regime, and that the DGPR metamodel provides a more accurate representation of the simulator than the MAF metamodel. Using the metamodels, we estimate the posterior probability distributions over parameters given observed simulator outputs, separately for the LFP and the population spiking activities. We find that these distributions correctly identify parameter combinations that give similar model outputs, and that some parameters are significantly more constrained by observing the LFP than by observing the population spiking activities.
Affiliation(s)
- Jan-Eirik W. Skaar
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Nicolai Haug
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Alexander J. Stasik
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
- Gaute T. Einevoll
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Department of Physics, University of Oslo, Oslo, Norway
- Kristin Tøndel
- Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
7
Lorenzi RM, Geminiani A, Zerlaut Y, De Grazia M, Destexhe A, Gandini Wheeler-Kingshott CAM, Palesi F, Casellato C, D'Angelo E. A multi-layer mean-field model of the cerebellum embedding microstructure and population-specific dynamics. PLoS Comput Biol 2023;19:e1011434. PMID: 37656758; PMCID: PMC10501640; DOI: 10.1371/journal.pcbi.1011434.
Abstract
Mean-field (MF) models are a computational formalism used to summarize, in a few statistical parameters, the salient biophysical properties of an interconnected neuronal network. Their formalism normally incorporates different types of neurons and synapses along with their topological organization. MFs are crucial to efficiently implement the computational modules of large-scale models of brain function while maintaining the specificity of local cortical microcircuits. While MFs have been generated for the isocortex, they are still missing for other parts of the brain. Here we have designed and simulated a multi-layer MF of the cerebellar microcircuit (including granule cells, Golgi cells, molecular layer interneurons, and Purkinje cells) and validated it against experimental data and the corresponding spiking neural network (SNN) microcircuit model. The cerebellar MF was built using a system of equations in which properties of neuronal populations and topological parameters are embedded in inter-dependent transfer functions. The model time constant was optimised using local field potentials recorded experimentally from acute mouse cerebellar slices as a template. The MF reproduced the average dynamics of different neuronal populations in response to various input patterns and predicted the modulation of Purkinje cell firing depending on cortical plasticity, which drives learning in associative tasks, and on the level of feedforward inhibition. The cerebellar MF provides a computationally efficient tool for future investigations of the causal relationship between microscopic neuronal properties and ensemble brain activity in virtual brain models addressing both physiological and pathological conditions.
Affiliation(s)
- Alice Geminiani
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Yann Zerlaut
- Institut du Cerveau-Paris Brain Institute-ICM, Inserm, CNRS, APHP, Hôpital de la Pitié Salpêtrière, Paris, France
- Claudia A M Gandini Wheeler-Kingshott
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- NMR Research Unit, Queen Square Multiple Sclerosis Centre, Department of Neuroinflammation, UCL Queen Square Institute of Neurology, UCL, London, United Kingdom
- Brain Connectivity Center, IRCCS Mondino Foundation, Pavia, Italy
- Fulvia Palesi
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Claudia Casellato
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Egidio D'Angelo
- Department of Brain and Behavioural Sciences, University of Pavia, Pavia, Italy
- Brain Connectivity Center, IRCCS Mondino Foundation, Pavia, Italy
8
Huang CH, Lin CCK. New biophysical rate-based modeling of long-term plasticity in mean-field neuronal population models. Comput Biol Med 2023;163:107213. PMID: 37413849; DOI: 10.1016/j.compbiomed.2023.107213.
Abstract
The formation of customized neural networks as the basis of brain functions such as receptive field selectivity, learning or memory depends heavily on the long-term plasticity of synaptic connections. However, the current mean-field population models commonly used to simulate large-scale neural network dynamics lack explicit links to the underlying cellular mechanisms of long-term plasticity. In this study, we developed a new mean-field population model, the plastic density-based neural mass model (pdNMM), by incorporating a newly developed rate-based plasticity model based on the calcium control hypothesis into an existing density-based neural mass model. Derivation of the plasticity model was carried out using population density methods. Our results showed that the synaptic plasticity represented by the resulting rate-based plasticity model exhibited Bienenstock-Cooper-Munro-like learning rules. Furthermore, we demonstrated that the pdNMM accurately reproduced previous experimental observations of long-term plasticity in hippocampal slices, including characteristics of Hebbian plasticity such as longevity, associativity and input specificity, as well as the formation of receptive field selectivity in the visual cortex. In conclusion, the pdNMM is a novel approach that can confer long-term plasticity on conventional mean-field neuronal population models.
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chou-Ching K Lin
- Department of Neurology, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Innovation Center of Medical Devices and Technology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; Medical Device Innovation Center, National Cheng Kung University, Tainan, Taiwan.
9
Ekelmans P, Kraynyukovas N, Tchumatchenko T. Targeting operational regimes of interest in recurrent neural networks. PLoS Comput Biol 2023;19:e1011097. PMID: 37186668; DOI: 10.1371/journal.pcbi.1011097.
Abstract
Neural computations emerge from local recurrent neural circuits or computational units such as cortical columns that comprise hundreds to a few thousand neurons. Continuous progress in connectomics, electrophysiology, and calcium imaging requires tractable spiking network models that can consistently incorporate new information about the network structure and reproduce the recorded neural activity features. However, for spiking networks, it is challenging to predict which connectivity configurations and neural properties can generate fundamental operational states and specific experimentally reported nonlinear cortical computations. Theoretical descriptions of the computational state of cortical spiking circuits are diverse, including the balanced state, where excitatory and inhibitory inputs balance almost perfectly, and the inhibition-stabilized network (ISN) state, where the excitatory part of the circuit is unstable. It remains an open question whether these states can co-exist with experimentally reported nonlinear computations and whether they can be recovered in biologically realistic implementations of spiking networks. Here, we show how to identify spiking network connectivity patterns underlying diverse nonlinear computations such as XOR, bistability, inhibitory stabilization, supersaturation, and persistent activity. We establish a mapping between the stabilized supralinear network (SSN) and spiking activity which allows us to pinpoint the location in parameter space where these activity regimes occur. Notably, we find that biologically-sized spiking networks can have irregular asynchronous activity that does not require strong excitation-inhibition balance or large feedforward input, and we show that the dynamic firing rate trajectories in spiking networks can be precisely targeted without error-driven training algorithms.
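The stabilized supralinear network (SSN) that serves as the mapping target can be sketched as a two-population rate model with a supralinear power-law gain; the weights, gain, time constants, and inputs below are illustrative choices, not the connectivity configurations identified in the paper.

```python
import numpy as np

# Two-population SSN sketch:
#   tau_a * dr_a/dt = -r_a + k * [sum_b W_ab r_b + h_a]_+^n
# All parameter values are illustrative assumptions.
k, n = 0.04, 2.0                 # supralinear power-law gain
W = np.array([[1.0, -1.5],       # [[W_EE, W_EI],
              [1.2, -1.0]])      #  [W_IE, W_II]]
tau = np.array([20.0, 10.0])     # E and I time constants (ms)
dt = 0.1

def steady_rates(h, steps=30000):
    """Integrate the SSN rate equations to an approximate fixed point."""
    r = np.zeros(2)
    for _ in range(steps):
        z = np.maximum(W @ r + h, 0.0)        # rectified net input
        r = r + dt / tau * (-r + k * z**n)
    return r

r_weak = steady_rates(np.array([2.0, 2.0]))      # weak external drive
r_strong = steady_rates(np.array([20.0, 20.0]))  # strong external drive
```

With these inhibition-dominated weights the network settles to a stable fixed point at either drive; other regions of the weight space produce the supersaturating or bistable regimes discussed in the paper.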
Affiliation(s)
- Pierre Ekelmans
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany
- Nataliya Kraynyukovas
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, Universitätsklinikum Bonn, Bonn, Germany
- Tatjana Tchumatchenko
- Theory of Neural Dynamics group, Max Planck Institute for Brain Research, Frankfurt am Main, Germany
- Institute of Experimental Epileptology and Cognition Research, Life and Brain Center, Universitätsklinikum Bonn, Bonn, Germany
- Institute of Physiological Chemistry, Medical Center of the Johannes Gutenberg-University Mainz, Mainz, Germany
10
Vinci GV, Benzi R, Mattia M. Self-consistent stochastic dynamics for finite-size networks of spiking neurons. Phys Rev Lett 2023;130:097402. PMID: 36930929; DOI: 10.1103/physrevlett.130.097402.
Abstract
Despite the huge number of neurons composing a brain network, ongoing activity of local cell assemblies is intrinsically stochastic. Fluctuations in their instantaneous rate of spike firing ν(t) scale with the size of the assembly and persist in isolated networks, i.e., in the absence of external sources of noise. Although deterministic chaos due to the quenched disorder of the synaptic couplings underlies this seemingly stochastic dynamics, an effective theory for the network dynamics of a finite assembly of spiking neurons is lacking. Here, we fill this gap by extending the so-called population density approach to include an activity- and size-dependent stochastic source in the Fokker-Planck equation for the membrane potential density. The finite-size noise embedded in this stochastic partial differential equation is analytically characterized, leading to a self-consistent and nonperturbative description of ν(t) valid for a wide class of spiking neuron networks. Power spectra of ν(t) are found in excellent agreement with those from detailed simulations both in the linear regime and across a synchronization phase transition, when a size-dependent smearing of the critical dynamics emerges.
Affiliation(s)
- Gianni V Vinci
- Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
- PhD Program in Physics, Dept. of Physics, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Roberto Benzi
- Dept. of Physics and INFN, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Centro Ricerche "E. Fermi," 00184, Roma, Italy
- Maurizio Mattia
- Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
11
Gast R, Solla SA, Kennedy A. Macroscopic dynamics of neural networks with heterogeneous spiking thresholds. Phys Rev E 2023;107:024306. PMID: 36932598; DOI: 10.1103/physreve.107.024306.
Abstract
Mean-field theory links the physiological properties of individual neurons to the emergent dynamics of neural population activity. These models provide an essential tool for studying brain function at different scales; however, for application to large-scale neural populations, they need to account for differences between distinct neuron types. The Izhikevich single-neuron model can account for a broad range of different neuron types and spiking patterns, thus rendering it an optimal candidate for a mean-field theoretic treatment of brain dynamics in heterogeneous networks. Here we derive the mean-field equations for networks of all-to-all coupled Izhikevich neurons with heterogeneous spiking thresholds. Using methods from bifurcation theory, we examine the conditions under which the mean-field theory accurately predicts the dynamics of the Izhikevich neuron network. To this end, we focus on three important features of the Izhikevich model that are subject here to simplifying assumptions: (i) spike-frequency adaptation, (ii) the spike reset conditions, and (iii) the distribution of single-cell spike thresholds across neurons. Our results indicate that, while the mean-field model is not an exact model of the Izhikevich network dynamics, it faithfully captures its different dynamic regimes and phase transitions. We thus present a mean-field model that can represent different neuron types and spiking dynamics. The model comprises biophysical state variables and parameters, incorporates realistic spike resetting conditions, and accounts for heterogeneity in neural spiking thresholds. These features allow for a broad applicability of the model as well as for a direct comparison to experimental data.
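The single-neuron basis of this mean-field theory is easy to state. Below is a minimal simulation of one Izhikevich neuron with the standard regular-spiking parameter set under constant drive; the drive amplitude is an illustrative assumption, and the network and heterogeneous-threshold machinery of the paper are omitted.

```python
# Minimal Izhikevich neuron (regular-spiking parameter set).
# The constant input current I is an illustrative assumption.
a, b, c, d = 0.02, 0.2, -65.0, 8.0   # recovery and reset parameters
dt, T = 0.1, 1000.0                  # time step and duration (ms)
I = 10.0                             # constant input current

v, u = -65.0, b * (-65.0)            # membrane potential, recovery variable
spike_times = []
for step in range(int(T / dt)):
    v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                    # spike detected: reset v, bump u
        spike_times.append(step * dt)
        v = c
        u += d

n_spikes = len(spike_times)
isis = [t2 - t1 for t1, t2 in zip(spike_times, spike_times[1:])]
```

Swapping the four parameters (a, b, c, d) switches the same equations between bursting, chattering, and fast-spiking behavior, which is what makes this model attractive as a mean-field building block.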
Affiliation(s)
- Richard Gast
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, Illinois 60611, USA
- Sara A Solla
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, Illinois 60611, USA
- Ann Kennedy
- Department of Neuroscience, Feinberg School of Medicine, Northwestern University, Chicago, Illinois 60611, USA
12
Klinshov VV, Kirillov SY. Shot noise in next-generation neural mass models for finite-size networks. Phys Rev E 2022;106:L062302. PMID: 36671128; DOI: 10.1103/physreve.106.l062302.
Abstract
Neural mass models are a general class of models describing the collective dynamics of large neural populations in terms of averaged macroscopic variables. Recently, the so-called next-generation neural mass models have attracted a lot of attention due to their ability to account for the degree of synchrony. Being exact in the limit of an infinitely large number of neurons, these models provide only an approximate description of finite-size networks. In the present Letter, we study finite-size effects in the collective behavior of neural networks and prove that these effects can be captured by appropriately modified neural mass models. Namely, we show that the finite size of the network leads to the emergence of so-called shot noise, appearing as a stochastic term in the neural mass model. The power spectrum of this shot noise contains pronounced peaks; therefore, its impact on the collective dynamics might be crucial due to resonance effects.
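A minimal numerical illustration of the idea: the deterministic next-generation (Montbrió-Pazó-Roxin type) neural mass equations for the population rate r and mean voltage v, with an additive stochastic term whose amplitude shrinks as 1/√N standing in for finite-size fluctuations. The white-noise surrogate is a deliberate simplification; the Letter derives a structured shot-noise term with a peaked spectrum. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Next-generation neural mass equations (QIF population with Lorentzian
# heterogeneity) plus a crude 1/sqrt(N) white-noise stand-in for
# finite-size fluctuations. All parameter values are assumptions.
tau = 1.0        # membrane time constant
Delta = 1.0      # half-width of the Lorentzian excitability distribution
eta_bar = -5.0   # center of the excitability distribution
J = 15.0         # recurrent coupling strength
N = 10_000       # notional network size
dt, T = 1e-3, 20.0

r, v = 0.1, -2.0
trace = []
for _ in range(int(T / dt)):
    dr = Delta / (np.pi * tau**2) + 2.0 * r * v / tau
    dv = (v**2 + eta_bar + J * tau * r - (np.pi * tau * r) ** 2) / tau
    noise = rng.normal() * np.sqrt(dt) / np.sqrt(N)   # finite-size surrogate
    r = max(r + dt * dr + noise, 0.0)                 # rate stays nonnegative
    v = v + dt * dv
    trace.append(r)

mean_rate = float(np.mean(trace[len(trace) // 2 :]))
```

Because the deterministic equations are exact only for N → ∞, the stochastic term is what restores the finite-N rate fluctuations whose spectrum the Letter characterizes.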
Affiliation(s)
- Vladimir V Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, Nizhny Novgorod 603950, Russia and Faculty of Informatics, Mathematics, and Computer Science, National Research University Higher School of Economics, 25/12 Bol'shaya Pecherskaya Street, Nizhny Novgorod 603155, Russia
- Sergey Yu Kirillov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, Nizhny Novgorod 603950, Russia
13
Pietras B, Schmutz V, Schwalger T. Mesoscopic description of hippocampal replay and metastability in spiking neural networks with short-term plasticity. PLoS Comput Biol 2022; 18:e1010809. [PMID: 36548392 PMCID: PMC9822116 DOI: 10.1371/journal.pcbi.1010809] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2022] [Revised: 01/06/2023] [Accepted: 12/11/2022] [Indexed: 12/24/2022] Open
Abstract
Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of such functional activity patterns is hippocampal replay: propagating bursts of place-cell activity that are critical for memory consolidation. The sudden and repeated occurrences of these burst states during ongoing neural activity suggest metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model which accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to stochastic spiking and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie to obtain a "chemical Langevin equation", which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and Up-Down state dynamics) by means of phase-plane analysis. An extension of the Langevin equation for small network sizes is also presented. The stochastic neural mass model constitutes the basic component of our mesoscopic model for replay. We show that the mesoscopic model faithfully captures the statistical structure of individual replayed trajectories in microscopic simulations and in previously reported experimental data.
Moreover, compared to the deterministic Romani-Tsodyks model of place-cell dynamics, it exhibits a higher level of variability regarding order, direction and timing of replayed trajectories, which seems biologically more plausible and could be functionally desirable. This variability is the product of a new dynamical regime where metastability emerges from a complex interplay between finite-size fluctuations and local fatigue.
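The coarse-graining idea can be illustrated in miniature, independently of the paper's full replay model: the Poisson spike count of a finite population of rate neurons is replaced by a "chemical Langevin" Gaussian term of variance f/(N·dt). The transfer function and all parameters below are invented for illustration.

```python
import numpy as np

def f(h):
    """Assumed sigmoidal population transfer function (max 50 Hz)."""
    return 50.0 / (1.0 + np.exp(-h))

def population_activity(N=500, T=5.0, dt=1e-3, tau=0.05, w=0.02,
                        h0=-1.0, mode="langevin", seed=1):
    """Activity A(t) of a self-coupled population of N Poisson-rate
    neurons, simulated either microscopically (sampling Poisson spike
    counts) or with the chemical-Langevin approximation, in which the
    count noise becomes a Gaussian term of variance f(h)/(N*dt)."""
    rng = np.random.default_rng(seed)
    steps = int(round(T / dt))
    h = h0
    A = np.empty(steps)
    for i in range(steps):
        rate = f(h)
        if mode == "micro":
            a = rng.poisson(N * rate * dt) / (N * dt)  # empirical activity
        else:
            a = rate + np.sqrt(rate / (N * dt)) * rng.standard_normal()
        h += dt * (-(h - h0) + w * a) / tau  # leaky integration of input
        A[i] = a
    return A
```

The two modes should agree in their mean activity; the Langevin version is the cheap one, which is the computational point made above.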
Affiliation(s)
- Bastian Pietras
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Valentin Schmutz
- Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Tilo Schwalger
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
14
Hagen E, Magnusson SH, Ness TV, Halnes G, Babu PN, Linssen C, Morrison A, Einevoll GT. Brain signal predictions from multi-scale networks using a linearized framework. PLoS Comput Biol 2022; 18:e1010353. [PMID: 35960767 PMCID: PMC9401172 DOI: 10.1371/journal.pcbi.1010353] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2022] [Revised: 08/24/2022] [Accepted: 07/02/2022] [Indexed: 12/04/2022] Open
Abstract
Simulations of neural activity at different levels of detail are ubiquitous in modern neurosciences, aiding the interpretation of experimental data and underlying neural mechanisms at the level of cells and circuits. Extracellular measurements of brain signals reflecting transmembrane currents throughout the neural tissue remain commonplace. The lower frequencies (≲ 300 Hz) of measured signals generally stem from synaptic activity driven by recurrent interactions among neural populations, and computational models should also incorporate accurate predictions of such signals. Due to limited computational resources, large-scale neuronal network models (≳ 10⁶ neurons) often require reducing the level of biophysical detail and account mainly for times of action potentials (‘spikes’) or spike rates. Predictions of the corresponding extracellular signals have thus accounted poorly for their biophysical origin. Here we propose a computational framework for predicting spatiotemporal filter kernels for such extracellular signals stemming from synaptic activity, accounting for the biophysics of neurons, populations, and recurrent connections. Signals are obtained by convolving population spike rates with appropriate kernels for each connection pathway and summing the contributions. Our main results are that kernels derived via linearized synapse and membrane dynamics, distributions of cells, conduction delay, and volume conductor model allow for accurately capturing the spatiotemporal dynamics of ground truth extracellular signals from conductance-based multicompartment neuron networks. One particular observation is that changes in the effective membrane time constants caused by persistent synapse activation must be accounted for.
The work also constitutes a major advance in computational efficiency of accurate, biophysics-based signal predictions from large-scale spike and rate-based neuron network models, drastically reducing signal prediction times compared to biophysically detailed network models. This work also provides insight into how experimentally recorded low-frequency extracellular signals of neuronal activity may be approximately linearly dependent on spiking activity. A new software tool, LFPykernels, serves as a reference implementation of the framework. Understanding the brain’s function and activity in healthy and pathological states across spatial scales and times spanning entire lives is one of humanity’s great undertakings. In experimental and clinical work probing the brain’s activity, a variety of electric and magnetic measurement techniques are routinely applied. However, interpreting the extracellularly measured signals remains arduous due to multiple factors, mainly the large number of neurons contributing to the signals and complex interactions occurring in recurrently connected neuronal circuits. To understand how neurons give rise to such signals, mechanistic modeling combined with forward models derived using volume conductor theory has proven to be successful, but this approach currently does not scale to the systems level (encompassing millions of neurons or more), where simplified or abstract neuron representations typically are used. Motivated by experimental findings implying approximately linear relationships between times of neuronal action potentials and extracellular population signals, we provide a biophysics-based method for computing causal filters relating spikes and extracellular signals that can be applied with spike times or rates of large-scale neuronal network models for predictions of population signals without relying on ad hoc approximations.
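The central operation described above, convolving pathway-resolved population rates with causal kernels and summing the contributions, can be sketched independently of how the kernels are derived. The double-exponential kernel below is an arbitrary stand-in, not a kernel produced by the linearized framework or by LFPykernels.

```python
import numpy as np

def causal_kernel(dt, tau_rise=1e-3, tau_decay=10e-3, delay=2e-3,
                  length=0.1, gain=-1.0):
    """Assumed double-exponential kernel with a conduction delay
    (purely illustrative; real kernels come from the linearized
    biophysical model)."""
    t = np.arange(0.0, length, dt)
    k = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
    k = np.concatenate([np.zeros(int(round(delay / dt))), k])
    return gain * k / np.abs(k).max()

def predict_signal(rates, kernels, dt):
    """Predicted extracellular signal: sum over connection pathways of
    (population rate) convolved with (pathway kernel), causally."""
    n = len(rates[0])
    out = np.zeros(n)
    for nu, k in zip(rates, kernels):
        out += np.convolve(nu, k)[:n] * dt
    return out
```

A spike-rate impulse then produces a delayed, kernel-shaped deflection, and nothing before the conduction delay, which is what "causal filter" means here.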
Affiliation(s)
- Espen Hagen
- Department of Data Science, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Steinn H. Magnusson
- Department of Physics, Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Torbjørn V. Ness
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Geir Halnes
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Pooja N. Babu
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Charl Linssen
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Institute of Neuroscience and Medicine (INM-6); Computational and Systems Neuroscience & Institute for Advanced Simulation (IAS-6); Theoretical Neuroscience & JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre and JARA, Jülich, Germany
- Abigail Morrison
- Simulation & Data Lab Neuroscience, Institute for Advanced Simulation, Jülich Supercomputing Centre (JSC), Jülich Research Centre, Jülich, Germany
- Institute of Neuroscience and Medicine (INM-6); Computational and Systems Neuroscience & Institute for Advanced Simulation (IAS-6); Theoretical Neuroscience & JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre and JARA, Jülich, Germany
- Software Engineering, Department of Computer Science 3, RWTH Aachen University, Aachen, Germany
- Gaute T. Einevoll
- Department of Physics, Faculty of Mathematics and Natural Sciences, University of Oslo, Oslo, Norway
- Department of Physics, Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
15
Dumont G, Pérez-Cervera A, Gutkin B. A framework for macroscopic phase-resetting curves for generalised spiking neural networks. PLoS Comput Biol 2022; 18:e1010363. [PMID: 35913991 PMCID: PMC9371324 DOI: 10.1371/journal.pcbi.1010363] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2021] [Revised: 08/11/2022] [Accepted: 07/06/2022] [Indexed: 11/18/2022] Open
Abstract
Brain rhythms emerge from synchronization among interconnected spiking neurons. Key properties of such rhythms can be gleaned from the phase-resetting curve (PRC). Inferring the PRC and developing a systematic phase reduction theory for large-scale brain rhythms remains an outstanding challenge. Here we present a theoretical framework and methodology to compute the PRC of generic spiking networks with emergent collective oscillations. We adopt a renewal approach where neurons are described by the time since their last action potential, a description that can reproduce the dynamical features of many cell types. For a sufficiently large number of neurons, the network dynamics are well captured by a continuity equation known as the refractory density equation. We develop an adjoint method for this equation giving a semi-analytical expression of the infinitesimal PRC. We confirm the validity of our framework for specific examples of neural networks. Our theoretical framework can link key biological properties at the individual neuron scale and the macroscopic oscillatory network properties. Beyond spiking networks, the approach is applicable to a broad class of systems that can be described by renewal processes. The formation of oscillatory neuronal assemblies at the network level has been hypothesized to be fundamental to many cognitive and motor functions. One prominent tool for understanding how oscillatory activity responds to stimuli, and hence the neural code for which it is a substrate, is a nonlinear measure called the Phase-Resetting Curve (PRC). At the network scale, the PRC measures how a given synaptic input perturbs the timing of the next volley of spikes: either advancing or delaying it.
As a further application, one can use PRCs to make unambiguous predictions about whether communicating networks of neurons will phase-lock, as is often observed across cortical areas, and what the stable phase configuration would be: synchronous, asynchronous, or with asymmetric phase shifts. The latter configuration also implies a preferential flow of information from the leading network to the follower, thereby giving causal signatures of directed functional connectivity. Because of the key position of the PRC in studying synchrony, information flow, and entrainment to external forcing, it is crucial to move toward a theory that allows one to compute the PRCs of network-wide oscillations not only for a restricted class of models, as has been done in the past, but for generalized network descriptions that can flexibly reflect single-cell properties. In this manuscript, we tackle this issue by showing how the PRC for network oscillations can be computed using the adjoint systems of partial differential equations that define the dynamics of the neural activity density.
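For intuition about the quantity being computed, here is the brute-force "direct method" that the adjoint approach is designed to replace: perturb the limit cycle at a given phase and measure the asymptotic phase shift. The Stuart–Landau oscillator below is chosen only because its isochrons are radial, so its infinitesimal PRC for kicks along x is known to be −sin θ; it is not one of the paper's network models.

```python
import numpy as np

def step(z, dt, w):
    """One Euler step of the Stuart-Landau oscillator
    dz/dt = (1 + i*w)z - |z|^2 z, whose unit circle is a limit cycle."""
    return z + dt * ((1 + 1j * w) * z - (abs(z) ** 2) * z)

def prc_direct(theta, eps=1e-3, w=2 * np.pi, T=3.0, dt=1e-3):
    """Direct PRC estimate: kick the on-cycle state by eps along x at
    phase theta, integrate the kicked and unkicked trajectories, and
    return the asymptotic phase shift (divided by eps it approximates
    the infinitesimal PRC)."""
    a = np.exp(1j * theta)  # reference point on the cycle
    b = a + eps             # kicked copy
    for _ in range(int(round(T / dt))):
        a = step(a, dt, w)
        b = step(b, dt, w)
    return np.angle(b / a)
```

The direct method needs one pair of simulations per phase point; the adjoint method of the paper yields the whole curve from a single (adjoint) solve, which is why it matters at network scale.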
Affiliation(s)
- Grégory Dumont
- Group for Neural Theory, LNC INSERM U960, DEC, Ecole Normale Supérieure - PSL University, Paris France
- Alberto Pérez-Cervera
- Center for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow
- Instituto de Matemática Interdisciplinar, Universidad Complutense de Madrid, Madrid, Spain
- Boris Gutkin
- Group for Neural Theory, LNC INSERM U960, DEC, Ecole Normale Supérieure - PSL University, Paris France
- Center for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow
16
Osborne H, de Kamps M. A numerical population density technique for N-dimensional neuron models. Front Neuroinform 2022; 16:883796. [PMID: 35935536 PMCID: PMC9354936 DOI: 10.3389/fninf.2022.883796] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2022] [Accepted: 06/24/2022] [Indexed: 11/13/2022] Open
Abstract
Population density techniques can be used to simulate the behavior of a population of neurons which adhere to a common underlying neuron model. They have previously been used for analyzing models of orientation tuning and decision making tasks. They produce a fully deterministic solution to neural simulations which often involve a non-deterministic or noise component. Until now, numerical population density techniques have been limited to only one- and two-dimensional models. For the first time, we demonstrate a method to take an N-dimensional underlying neuron model and simulate the behavior of a population. The technique enables so-called graceful degradation of the dynamics, allowing a balance between accuracy and simulation speed while maintaining important behavioral features such as rate curves and bifurcations. It is an extension of the numerical population density technique implemented in the MIIND software framework that simulates networks of populations of neurons. Here, we describe the extension to N dimensions and simulate populations of leaky integrate-and-fire neurons with excitatory and inhibitory synaptic conductances, then demonstrate the effect of degrading the accuracy on the solution. We also simulate two separate populations in an E-I configuration to demonstrate the technique's ability to capture complex behaviors of interacting populations. Finally, we simulate a population of four-dimensional Hodgkin-Huxley neurons under the influence of noise. Though the MIIND software has been used only for neural modeling up to this point, the technique can be used to simulate the behavior of a population of agents adhering to any system of ordinary differential equations under the influence of shot noise. MIIND has been modified to render a visualization of any three dimensions of an N-dimensional state space of a population, which encourages fast model prototyping and debugging and could prove a useful educational tool for understanding dynamical systems.
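The one-dimensional version of the idea, evolving a probability density deterministically instead of simulating many noisy trajectories, fits in a few lines. This is a plain finite-difference Fokker–Planck sketch for a leaky (Ornstein–Uhlenbeck) drift with diffusion, not MIIND's N-dimensional geometric method, and all parameters are invented.

```python
import numpy as np

def evolve_density(x0=2.0, sigma0=0.2, T=1.0, tau=1.0, D=0.5,
                   xmax=5.0, dx=0.05, dt=1e-3):
    """Deterministic density evolution for dX = -(X/tau)dt + sqrt(2D)dW:
    d(rho)/dt = d/dx(x*rho/tau) + D d2(rho)/dx2, solved with explicit
    central differences on a fixed grid (far boundaries held at zero)."""
    x = np.arange(-xmax, xmax + dx / 2, dx)
    rho = np.exp(-(x - x0) ** 2 / (2 * sigma0 ** 2))
    rho /= rho.sum() * dx  # normalize to a probability density
    for _ in range(int(round(T / dt))):
        g = x * rho / tau
        drift = (g[2:] - g[:-2]) / (2 * dx)
        diff = D * (rho[2:] - 2 * rho[1:-1] + rho[:-2]) / dx ** 2
        rho[1:-1] += dt * (drift + diff)
    return x, rho
```

The grid solution reproduces the analytic OU moments (mean x0·e^(−T/τ), variance σ0²·e^(−2T/τ) + Dτ(1 − e^(−2T/τ))) while remaining fully deterministic, which is the selling point of density methods.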
Affiliation(s)
- Hugh Osborne
- School of Computing, University of Leeds, Leeds, United Kingdom
- Marc de Kamps
- School of Computing, University of Leeds, Leeds, United Kingdom
- Leeds Institute for Data Analytics, University of Leeds, Leeds, United Kingdom
- The Alan Turing Institute, London, United Kingdom
- *Correspondence: Marc de Kamps
17
Layer M, Senk J, Essink S, van Meegen A, Bos H, Helias M. NNMT: Mean-Field Based Analysis Tools for Neuronal Network Models. Front Neuroinform 2022; 16:835657. [PMID: 35712677 PMCID: PMC9196133 DOI: 10.3389/fninf.2022.835657] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2021] [Accepted: 03/17/2022] [Indexed: 11/13/2022] Open
Abstract
Mean-field theory of neuronal networks has led to numerous advances in our analytical and intuitive understanding of their dynamics during the past decades. In order to make mean-field based analysis tools more accessible, we implemented an extensible, easy-to-use open-source Python toolbox that collects a variety of mean-field methods for the leaky integrate-and-fire neuron model. The Neuronal Network Mean-field Toolbox (NNMT) in its current state allows for estimating properties of large neuronal networks, such as firing rates, power spectra, and dynamical stability in mean-field and linear response approximation, without running simulations. In this article, we describe how the toolbox is implemented, show how it is used to reproduce results of previous studies, and discuss different use-cases, such as parameter space explorations, or mapping different network models. Although the initial version of the toolbox focuses on methods for leaky integrate-and-fire neurons, its structure is designed to be open and extensible. It aims to provide a platform for collecting analytical methods for neuronal network model analysis, such that the neuroscientific community can take maximal advantage of them.
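The flavor of calculation such a toolbox automates can be illustrated with the classic Siegert formula for the stationary firing rate of a leaky integrate-and-fire neuron receiving Gaussian-white-noise input with mean μ and noise strength σ (diffusion approximation). This is a generic textbook formula, not NNMT's API, and the parameter values are invented.

```python
import math

def siegert_rate(mu, sigma, V_th=20.0, V_r=0.0, tau_m=0.02,
                 tau_ref=0.002, n=2000):
    """Stationary rate (Hz) of an LIF neuron in the diffusion
    approximation: nu = 1 / (tau_ref + tau_m*sqrt(pi)*I), with
    I = integral of exp(u^2)*(1 + erf(u)) du from (V_r - mu)/sigma
    to (V_th - mu)/sigma, evaluated by the trapezoidal rule."""
    lo = (V_r - mu) / sigma
    hi = (V_th - mu) / sigma
    du = (hi - lo) / (n - 1)
    vals = [math.exp(u * u) * (1.0 + math.erf(u))
            for u in (lo + i * du for i in range(n))]
    integral = du * (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1])
    return 1.0 / (tau_ref + tau_m * math.sqrt(math.pi) * integral)
```

Evaluating this kind of expression self-consistently across interconnected populations, so that each population's μ and σ depend on the others' rates, is precisely the sort of fixed-point computation a mean-field toolbox performs without any simulation.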
Affiliation(s)
- Moritz Layer
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Johanna Senk
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Simon Essink
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Alexander van Meegen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Institute of Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany
- Hannah Bos
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
18
Shi YL, Steinmetz NA, Moore T, Boahen K, Engel TA. Cortical state dynamics and selective attention define the spatial pattern of correlated variability in neocortex. Nat Commun 2022; 13:44. [PMID: 35013259 PMCID: PMC8748999 DOI: 10.1038/s41467-021-27724-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2020] [Accepted: 12/03/2021] [Indexed: 01/20/2023] Open
Abstract
Correlated activity fluctuations in the neocortex influence sensory responses and behavior. Neural correlations reflect anatomical connectivity but also change dynamically with cognitive states such as attention. Yet, the network mechanisms defining the population structure of correlations remain unknown. We measured correlations within columns in the visual cortex. We show that the magnitude of correlations, their attentional modulation, and dependence on lateral distance are explained by columnar On-Off dynamics, which are synchronous activity fluctuations reflecting cortical state. We developed a network model in which the On-Off dynamics propagate across nearby columns generating spatial correlations with the extent controlled by attentional inputs. This mechanism, unlike previous proposals, predicts spatially non-uniform changes in correlations during attention. We confirm this prediction in our columnar recordings by showing that in superficial layers the largest changes in correlations occur at intermediate lateral distances. Our results reveal how spatially structured patterns of correlated variability emerge through interactions of cortical state dynamics, anatomical connectivity, and attention.
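The basic mechanism invoked here, shared On-Off rate fluctuations inducing spike-count correlations, can be sketched with a two-state telegraph process. This is an illustration of the statistical effect only, not the authors' propagating network model; all numbers are invented.

```python
import numpy as np

def spike_count_corr(shared=True, n_bins=20000, p_switch=0.1,
                     lam_on=2.0, lam_off=0.2, seed=0):
    """Pearson correlation of per-bin spike counts of two units whose
    rates are gated by On-Off (telegraph) states: either one state
    shared by both units, or two independent states."""
    rng = np.random.default_rng(seed)

    def telegraph():
        s = np.empty(n_bins, dtype=int)
        s[0] = 1
        flips = rng.random(n_bins) < p_switch  # switch On<->Off per bin
        for t in range(1, n_bins):
            s[t] = 1 - s[t - 1] if flips[t] else s[t - 1]
        return s

    s1 = telegraph()
    s2 = s1 if shared else telegraph()
    lam = np.array([lam_off, lam_on])
    c1 = rng.poisson(lam[s1])
    c2 = rng.poisson(lam[s2])
    return np.corrcoef(c1, c2)[0, 1]
```

A shared On-Off state yields sizable count correlations while independent states yield none; in the paper it is the spatial propagation of such On-Off dynamics, modulated by attention, that shapes how correlations fall off with lateral distance.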
Affiliation(s)
- Yan-Liang Shi
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA
- Tirin Moore
- Department of Neurobiology, Stanford University, Stanford, CA, USA
- Howard Hughes Medical Institute, Stanford University, Stanford, CA, USA
- Kwabena Boahen
- Department of Bioengineering, Stanford University, Stanford, CA, USA
- Department of Electrical Engineering, Stanford University, Stanford, CA, USA
19
Ito T, Konishi K, Sano T, Wakayama H, Ogawa M. Synchronization of relaxation oscillators with adaptive thresholds and application to automated guided vehicles. Phys Rev E 2022; 105:014201. [PMID: 35193180 DOI: 10.1103/physreve.105.014201] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2021] [Accepted: 12/02/2021] [Indexed: 11/07/2022]
Abstract
The present paper proposes an adaptive control law for inducing in-phase and antiphase synchronization in a pair of relaxation oscillators. We analytically show that the phase dynamics of the oscillators coupled by the control law is equivalent to that of Kuramoto phase oscillators and then extend the results for a pair of oscillators to three or more oscillators. We also provide a systematic procedure for designing the controller parameters for oscillator networks with all-to-all and ring topologies. Our numerical simulations demonstrate that these analytical results can be used to solve a dispatching problem encountered by automated guided vehicles (AGVs) in factories. AGV congestion can be avoided and the peak value of the amount of materials or parts in buffers can be suppressed.
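The reduction claimed above, that the controlled relaxation oscillators behave like Kuramoto phase oscillators, makes the two target regimes elementary to demonstrate: positive coupling pulls a pair of phases together (in-phase), negative coupling pushes them half a cycle apart (antiphase). A minimal sketch with illustrative parameters, not the AGV controller itself:

```python
import numpy as np

def phase_difference(K, T=50.0, dt=1e-2, w=1.0, dphi0=2.0):
    """Integrate two identical Kuramoto oscillators
    dphi_i/dt = w + K*sin(phi_j - phi_i) with Euler steps and return
    the final phase difference wrapped to [0, 2*pi)."""
    p1, p2 = 0.0, dphi0
    for _ in range(int(round(T / dt))):
        d1 = w + K * np.sin(p2 - p1)
        d2 = w + K * np.sin(p1 - p2)
        p1 += dt * d1
        p2 += dt * d2
    return (p2 - p1) % (2 * np.pi)
```

In the dispatching application, antiphase (and more generally splay) locking is the useful regime: vehicles that are spread out in phase do not arrive at shared resources simultaneously, which is how congestion and buffer peaks are suppressed.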
Affiliation(s)
- Takehiro Ito
- Data Science Research Laboratories, NEC Corporation, 1753 Shimonumabe, Nakahara-ku, Kawasaki, Kanagawa 211-8666, Japan
- Keiji Konishi
- Department of Electrical and Information Systems, Osaka Prefecture University, 1-1 Gakuen-cho, Naka-ku, Sakai, Osaka 599-8531, Japan
- Toru Sano
- Department of Electrical and Information Systems, Osaka Prefecture University, 1-1 Gakuen-cho, Naka-ku, Sakai, Osaka 599-8531, Japan
- Hisaya Wakayama
- Data Science Research Laboratories, NEC Corporation, 1753 Shimonumabe, Nakahara-ku, Kawasaki, Kanagawa 211-8666, Japan
- Masatsugu Ogawa
- Data Science Research Laboratories, NEC Corporation, 1753 Shimonumabe, Nakahara-ku, Kawasaki, Kanagawa 211-8666, Japan
20
Gast R, Knösche TR, Schmidt H. Mean-field approximations of networks of spiking neurons with short-term synaptic plasticity. Phys Rev E 2021; 104:044310. [PMID: 34781468 DOI: 10.1103/physreve.104.044310] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2021] [Accepted: 09/30/2021] [Indexed: 01/17/2023]
Abstract
Low-dimensional descriptions of spiking neural network dynamics are an effective tool for bridging different scales of organization of brain structure and function. Recent advances in deriving mean-field descriptions for networks of coupled oscillators have sparked the development of a new generation of neural mass models. Of notable interest are mean-field descriptions of all-to-all coupled quadratic integrate-and-fire (QIF) neurons, which have already seen numerous extensions and applications. These extensions include different forms of short-term adaptation considered to play an important role in generating and sustaining dynamic regimes of interest in the brain. It is an open question, however, whether the incorporation of presynaptic forms of synaptic plasticity driven by single neuron activity would still permit the derivation of mean-field equations using the same method. Here we discuss this problem using an established model of short-term synaptic plasticity at the single neuron level, for which we present two different approaches for the derivation of the mean-field equations. We compare these models with a recently proposed mean-field approximation that assumes stochastic spike timings. In general, the latter fails to accurately reproduce the macroscopic activity in networks of deterministic QIF neurons with distributed parameters. We show that the mean-field models we propose provide a more accurate description of the network dynamics, although they are mathematically more involved. Using bifurcation analysis, we find that QIF networks with presynaptic short-term plasticity can exhibit regimes of periodic bursting activity as well as bistable regimes. Together, we provide novel insight into the macroscopic effects of short-term synaptic plasticity in spiking neural networks, as well as two different mean-field descriptions for future investigations of such networks.
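The single-synapse ingredient whose mean-field treatment is at issue here is short-term depression of the Tsodyks–Markram type, whose behavior under a steady presynaptic rate is simple to state: the available resources settle at 1/(1 + U·r·τ_rec), so the effective drive U·x·r saturates with rate. A sketch with invented parameters (treating the presynaptic rate as a smooth intensity, not individual deterministic spikes as in the paper):

```python
def depression_trajectory(rate, U=0.2, tau_rec=0.5, T=10.0, dt=1e-3, x0=1.0):
    """Available synaptic resources x(t) under the depression dynamics
    dx/dt = (1 - x)/tau_rec - U*x*rate. Returns the final value, whose
    analytic steady state is 1/(1 + U*rate*tau_rec)."""
    x = x0
    for _ in range(int(round(T / dt))):
        x += dt * ((1.0 - x) / tau_rec - U * x * rate)
    return x
```

The mean-field question posed in the abstract is, in effect, whether this kind of activity-driven presynaptic variable can be carried through the exact Lorentzian-ansatz reduction rather than simply attached to the rate equations after the fact.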
Affiliation(s)
- Richard Gast
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Thomas R Knösche
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Helmut Schmidt
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
21
A Biomorphic Model of Cortical Column for Content-Based Image Retrieval. ENTROPY 2021; 23:e23111458. [PMID: 34828156 PMCID: PMC8620877 DOI: 10.3390/e23111458] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/10/2021] [Revised: 10/22/2021] [Accepted: 10/28/2021] [Indexed: 11/18/2022]
Abstract
How do living systems process information? The search for an answer to this question is ongoing. We have developed an intelligent video analytics system in which the formation of detectors for content-based image retrieval, aimed at detecting objects of various types, simulates the operation of the structural and functional modules for image processing in living systems. The process of detector construction is, in fact, a model of the formation (or activation) of connections in the cortical column (the structural and functional unit of information processing in the human and animal brain). The process of content-based image retrieval, that is, the detection of various types of images in the developed system, reproduces the “triggering” of a model biomorphic column, i.e., a detector in which connections are formed during the learning process. The recognition process is the reaction of the receptive field of the column to activation by a given signal. Since the learning process of the detector can be visualized, it is possible to see how a column (a detector of specific stimuli) is formed for a face, a digit, a number, etc. The created artificial cognitive system is a biomorphic model of the recognition column of living systems.
22
Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. BIOLOGICAL CYBERNETICS 2021; 115:539-562. [PMID: 34668051 PMCID: PMC8551127 DOI: 10.1007/s00422-021-00899-1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 07/17/2021] [Accepted: 09/27/2021] [Indexed: 06/13/2023]
Abstract
Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate ("escape noise"). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: For exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike interval densities as well as the population activities obtained from simulations with colored input noise and time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced across the sub- and suprathreshold firing regime compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime.
Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise, enabling progress in the theory of neuronal population dynamics.
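The target of the mapping, an integrate-and-fire neuron with escape noise, is simple to state: the membrane potential evolves deterministically, and spikes occur stochastically with a soft-threshold (here exponential) hazard. A sketch with invented parameters, not the paper's fitted mapping:

```python
import numpy as np

def escape_noise_rate(I, T=100.0, dt=1e-3, tau=0.02, V_th=1.0, V_r=0.0,
                      rho0=10.0, delta=0.1, seed=0):
    """Mean firing rate of a leaky IF neuron with escape noise:
    deterministic dV/dt = (-V + I)/tau, plus stochastic spiking with
    hazard rho(V) = rho0 * exp((V - V_th)/delta). A spike occurs in a
    time bin with probability 1 - exp(-rho(V)*dt) and resets V; there
    is no hard threshold, so subthreshold firing is possible."""
    rng = np.random.default_rng(seed)
    V, spikes = V_r, 0
    for _ in range(int(round(T / dt))):
        V += dt * (-V + I) / tau
        p_spike = 1.0 - np.exp(-rho0 * np.exp((V - V_th) / delta) * dt)
        if rng.random() < p_spike:
            spikes += 1
            V = V_r
    return spikes / T
```

The mapping problem of the paper runs in the other direction: given an LIF neuron driven by colored *input* noise, find the hazard rho(V) (and its dependence on the membrane-potential speed) that makes a model of this escape-noise form reproduce the same first-passage-time statistics.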
Affiliation(s)
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623, Berlin, Germany.
- Bernstein Center for Computational Neuroscience Berlin, 10115, Berlin, Germany.
23
Chizhov AV, Graham LJ. A strategy for mapping biophysical to abstract neuronal network models applied to primary visual cortex. PLoS Comput Biol 2021; 17:e1009007. [PMID: 34398895 PMCID: PMC8389851 DOI: 10.1371/journal.pcbi.1009007] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/16/2021] [Revised: 08/26/2021] [Accepted: 07/27/2021] [Indexed: 11/18/2022] Open
Abstract
A fundamental challenge for the theoretical study of neuronal networks is to link complex biophysical models, based directly on experimental data, to progressively simpler mathematical models that allow the derivation of general operating principles. We present a strategy that successively maps a relatively detailed biophysical population model, comprising conductance-based Hodgkin-Huxley type neuron models with connectivity rules derived from anatomical data, to various representations with fewer parameters, finishing with a firing rate network model that permits analysis. We apply this methodology to primary visual cortex of higher mammals, focusing on the functional property of stimulus orientation selectivity of receptive fields of individual neurons. The mapping produces compact expressions for the parameters of the abstract model that clearly identify the impact of specific electrophysiological and anatomical parameters on the analytical results, in particular as manifested by specific functional signatures of visual cortex, including input-output sharpening, conductance invariance, virtual rotation and the tilt after effect. Importantly, qualitative differences between model behaviours point out consequences of various simplifications. The strategy may be applied to other neuronal systems with appropriate modifications. A hierarchy of theoretical approaches to study a neuronal network depends on a tradeoff between biological fidelity and mathematical tractability. Biophysically-detailed models consider cellular mechanisms and anatomically defined synaptic circuits, but are often too complex to reveal insights into fundamental principles. In contrast, increasingly abstract reduced models facilitate analytical insights.
To better ground the latter to the underlying biology, we describe a systematic procedure to move across the model hierarchy that allows understanding how changes in biological parameters—physiological, pathophysiological, or because of new data—impact the behaviour of the network. We apply this approach to mammalian primary visual cortex, and examine how the different models in the hierarchy reproduce functional signatures of this area, in particular the tuning of neurons to the orientation of a visual stimulus. Our work provides a navigation of the complex parameter space of neural network models faithful to biology, as well as highlighting how simplifications made for mathematical convenience can fundamentally change their behaviour.
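The abstract endpoint of such a mapping, a firing-rate network whose recurrent connectivity sharpens orientation tuning (the "input-output sharpening" signature), can be caricatured with a textbook threshold-linear ring model. All parameters below are invented for illustration, not the compact expressions derived in the paper.

```python
import math

def ring_model(n=48, j0=-1.0, j2=1.2, c=10.0, eps=0.3,
               dt=0.02, tau=0.1, steps=1500):
    """Threshold-linear firing-rate network on a ring of preferred
    orientations, with broad inhibition (j0 < 0) plus orientation-tuned
    excitation (j2 > 0). Toy parameters chosen for illustration."""
    thetas = [math.pi * i / n for i in range(n)]
    kernel = [[(j0 + j2 * math.cos(2.0 * (ti - tk))) / n for tk in thetas]
              for ti in thetas]
    # weakly tuned feedforward input centered on orientation 0
    h = [c * (1.0 - eps + eps * math.cos(2.0 * t)) for t in thetas]
    r = [0.0] * n
    for _ in range(steps):
        r_new = []
        for i in range(n):
            rec = sum(kernel[i][k] * r[k] for k in range(n))
            r_new.append(r[i] + dt / tau * (-r[i] + max(h[i] + rec, 0.0)))
        r = r_new
    return h, r

def selectivity(x):
    """Tuning-depth index (max - min)/(max + min), in [0, 1]."""
    return (max(x) - min(x)) / (max(x) + min(x))

h, r = ring_model()
```

With broad inhibition and tuned excitation, the steady-state tuning index of the output exceeds that of the feedforward input.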
Affiliation(s)
- Anton V. Chizhov
- Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia
- Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Lyle J. Graham
- Centre Giovanni Borelli - CNRS UMR9010, Université de Paris, France
24
Pulvermüller F, Tomasello R, Henningsen-Schomers MR, Wennekers T. Biological constraints on neural network models of cognitive function. Nat Rev Neurosci 2021; 22:488-502. [PMID: 34183826 PMCID: PMC7612527 DOI: 10.1038/s41583-021-00473-5] [Citation(s) in RCA: 53] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 05/17/2021] [Indexed: 02/06/2023]
Abstract
Neural network models are potential tools for improving our understanding of complex brain functions. To address this goal, these models need to be neurobiologically realistic. However, although neural networks have advanced dramatically in recent years and even achieve human-like performance on complex perceptual and cognitive tasks, their similarity to aspects of brain anatomy and physiology is imperfect. Here, we discuss different types of neural models, including localist, auto-associative, hetero-associative, deep and whole-brain networks, and identify aspects under which their biological plausibility can be improved. These aspects range from the choice of model neurons and of mechanisms of synaptic plasticity and learning to implementation of inhibition and control, along with neuroanatomical properties including areal structure and local and long-range connectivity. We highlight recent advances in developing biologically grounded cognitive theories and in mechanistically explaining, on the basis of these brain-constrained neural models, hitherto unaddressed issues regarding the nature, localization and ontogenetic and phylogenetic development of higher brain functions. In closing, we point to possible future clinical applications of brain-constrained modelling.
Affiliation(s)
- Friedemann Pulvermüller
- Brain Language Laboratory, Department of Philosophy and Humanities, WE4, Freie Universität Berlin, Berlin, Germany.
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany.
- Einstein Center for Neurosciences Berlin, Berlin, Germany.
- Cluster of Excellence 'Matters of Activity', Humboldt-Universität zu Berlin, Berlin, Germany.
- Rosario Tomasello
- Brain Language Laboratory, Department of Philosophy and Humanities, WE4, Freie Universität Berlin, Berlin, Germany
- Cluster of Excellence 'Matters of Activity', Humboldt-Universität zu Berlin, Berlin, Germany
- Malte R Henningsen-Schomers
- Brain Language Laboratory, Department of Philosophy and Humanities, WE4, Freie Universität Berlin, Berlin, Germany
- Cluster of Excellence 'Matters of Activity', Humboldt-Universität zu Berlin, Berlin, Germany
- Thomas Wennekers
- School of Engineering, Computing and Mathematics, University of Plymouth, Plymouth, UK
25
Goldobin DS, di Volo M, Torcini A. Reduction Methodology for Fluctuation Driven Population Dynamics. PHYSICAL REVIEW LETTERS 2021; 127:038301. [PMID: 34328756 DOI: 10.1103/physrevlett.127.038301] [Citation(s) in RCA: 17] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/21/2020] [Revised: 03/24/2021] [Accepted: 06/14/2021] [Indexed: 06/13/2023]
Abstract
Lorentzian distributions have been largely employed in statistical mechanics to obtain exact results for heterogeneous systems. Analytic continuation of these results is impossible even for slightly deformed Lorentzian distributions due to the divergence of all the moments (cumulants). We have solved this problem by introducing a "pseudocumulants" expansion. This allows us to develop a reduction methodology for heterogeneous spiking neural networks subject to extrinsic and endogenous fluctuations, thus obtaining a unified mean-field formulation encompassing quenched and dynamical sources of disorder.
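For context, the exact Lorentzian-based reduction that the pseudocumulant expansion generalizes is the familiar two-variable mean field for quadratic integrate-and-fire (QIF) networks. A minimal Euler sketch with illustrative parameters (our own, not the paper's):

```python
import math

def qif_mean_field(delta=1.0, eta_bar=1.0, j=3.0, r0=0.1, v0=-2.0,
                   dt=1e-4, t_max=40.0):
    """Lorentzian-ansatz mean field of a globally coupled QIF network:
        dr/dt = delta/pi + 2*r*v
        dv/dt = v**2 + eta_bar + j*r - (pi*r)**2
    delta is the half-width of the Lorentzian distribution of
    excitabilities, eta_bar its center, j the coupling strength.
    Parameter values are illustrative."""
    r, v = r0, v0
    trace = []
    for _ in range(int(t_max / dt)):
        dr = delta / math.pi + 2.0 * r * v
        dv = v * v + eta_bar + j * r - (math.pi * r) ** 2
        r, v = r + dt * dr, v + dt * dv
        trace.append(r)
    return trace

trace = qif_mean_field()
```

For these parameters the firing rate relaxes to a stable fixed point; the pseudocumulant approach extends this kind of closed reduction to non-Lorentzian heterogeneity and fluctuations.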
Affiliation(s)
- Denis S Goldobin
- Institute of Continuous Media Mechanics, Ural Branch of RAS, Acad. Korolev Street 1, 614013 Perm, Russia
- Department of Theoretical Physics, Perm State University, Bukirev Street 15, 614990 Perm, Russia
- Matteo di Volo
- Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, CNRS, UMR 8089, 95302 Cergy-Pontoise cedex, France
- Alessandro Torcini
- Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, CNRS, UMR 8089, 95302 Cergy-Pontoise cedex, France
- CNR-Consiglio Nazionale delle Ricerche-Istituto dei Sistemi Complessi, via Madonna del Piano 10, I-50019 Sesto Fiorentino, Italy
- INFN Sezione di Firenze, Via Sansone 1, I-50019 Sesto Fiorentino, Florence, Italy
26
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. [PMID: 32942450 DOI: 10.1103/physreve.102.022407] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/20/2020] [Accepted: 06/29/2020] [Indexed: 11/07/2022]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
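For the simplest renewal model mentioned above, a Poisson neuron with absolute refractory period Δ, the stationary rate is λ/(1 + λΔ). A direct simulation (our own sketch, illustrative parameters) reproduces it:

```python
import random

def refractory_poisson_rate(lam=50.0, delta=0.01, t_max=100.0,
                            dt=1e-4, seed=7):
    """Poisson neuron with absolute refractoriness: the hazard equals lam
    whenever the time since the last spike exceeds delta, and is zero
    otherwise. Returns the empirical firing rate in Hz."""
    rng = random.Random(seed)
    last_spike = -delta
    n_spikes = 0
    for step in range(int(t_max / dt)):
        t = step * dt
        if t - last_spike >= delta and rng.random() < lam * dt:
            n_spikes += 1
            last_spike = t
    return n_spikes / t_max

rate = refractory_poisson_rate()
predicted = 50.0 / (1.0 + 50.0 * 0.01)  # lam / (1 + lam*delta) ~ 33.3 Hz
```

The same model also admits the explicit interspike-interval Laplace transform from which the eigenvalues of the firing-rate dynamics are obtained in the paper.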
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
27
Huang CH, Lin CCK. A novel density-based neural mass model for simulating neuronal network dynamics with conductance-based synapses and membrane current adaptation. Neural Netw 2021; 143:183-197. [PMID: 34157643 DOI: 10.1016/j.neunet.2021.06.009] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2020] [Revised: 04/01/2021] [Accepted: 06/06/2021] [Indexed: 10/21/2022]
Abstract
Despite its success in understanding brain rhythms, the neural mass model, as a low-dimensional mean-field network model, is phenomenological in nature, so that it cannot replicate some of the rich repertoire of responses seen in real neuronal tissues. Here, using a colored-synapse population density method, we derived a novel neural mass model, termed density-based neural mass model (dNMM), as the mean-field description of network dynamics of adaptive exponential integrate-and-fire (aEIF) neurons, in which two critical neuronal features, i.e., voltage-dependent conductance-based synaptic interactions and adaptation of firing rate responses, were included. Our results showed that the dNMM was capable of correctly estimating firing rate responses of a neuronal population of aEIF neurons receiving stationary or time-varying excitatory and inhibitory inputs. Finally, it was also able to quantitatively describe the effect of spike-frequency adaptation in the generation of asynchronous irregular activity of excitatory-inhibitory cortical networks. We conclude that in terms of its biological reality and calculation efficiency, the dNMM is a suitable candidate to build significantly large-scale network models involving multiple brain areas, where the neuronal population is the smallest dynamic unit.
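The aEIF unit underlying the dNMM can be sketched directly: under constant drive, the spike-triggered adaptation variable w progressively lengthens interspike intervals (spike-frequency adaptation). Parameter values below are generic illustrative choices, not the paper's.

```python
import math

def simulate_aeif(i_ext=500.0, t_max=1000.0, dt=0.01):
    """Adaptive exponential integrate-and-fire neuron (units: pA, pF, nS,
    mV, ms). Returns spike times in ms. Illustrative parameters."""
    c_m, g_l, e_l = 200.0, 10.0, -70.0        # capacitance and leak
    v_t, delta_t = -50.0, 2.0                 # exponential spike onset
    a, b, tau_w = 2.0, 30.0, 100.0            # adaptation parameters
    v_reset, v_peak = -58.0, 0.0
    v, w = e_l, 0.0
    spikes = []
    for step in range(int(t_max / dt)):
        if v >= v_peak:                       # spike: reset and adapt
            spikes.append(step * dt)
            v, w = v_reset, w + b
            continue
        exp_arg = min((v - v_t) / delta_t, 30.0)   # guard against overflow
        dv = (-g_l * (v - e_l) + g_l * delta_t * math.exp(exp_arg)
              - w + i_ext) / c_m
        dw = (a * (v - e_l) - w) / tau_w
        v, w = v + dt * dv, w + dt * dw
    return spikes

spikes = simulate_aeif()
```

The dNMM describes the density dynamics of a whole population of such units; this single-neuron sketch only shows the adaptation mechanism the mean field must capture.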
Affiliation(s)
- Chih-Hsu Huang
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chou-Ching K Lin
- Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan.
28
Romaro C, Najman FA, Lytton WW, Roque AC, Dura-Bernal S. NetPyNE Implementation and Scaling of the Potjans-Diesmann Cortical Microcircuit Model. Neural Comput 2021; 33:1993-2032. [PMID: 34411272 PMCID: PMC8382011 DOI: 10.1162/neco_a_01400] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/20/2020] [Accepted: 02/16/2021] [Indexed: 11/04/2022]
Abstract
The Potjans-Diesmann cortical microcircuit model is a widely used model originally implemented in NEST. Here, we reimplemented the model using NetPyNE, a high-level Python interface to the NEURON simulator, and reproduced the findings of the original publication. We also implemented a method for scaling the network size that preserves first- and second-order statistics, building on existing work on network theory. Our new implementation enabled the use of more detailed neuron models with multicompartmental morphologies and multiple biophysically realistic ion channels. This opens the model to new research, including the study of dendritic processing, the influence of individual channel parameters, the relation to local field potentials, and other multiscale interactions. The scaling method we used provides flexibility to increase or decrease the network size as needed when running these CPU-intensive detailed simulations. Finally, NetPyNE facilitates modifying or extending the model using its declarative language; optimizing model parameters; running efficient, large-scale parallelized simulations; and analyzing the model through built-in methods, including local field potential calculation and information flow measures.
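The statistics-preserving idea behind such scaling can be sketched in the diffusion approximation: reducing the in-degree K by a factor f while rescaling weights as w/√f keeps the input variance fixed, and a DC current restores the lost mean. This is our simplified rendering (notation ours); the published procedure also addresses correlations.

```python
import math

def scale_network(k_in, w, nu, f):
    """Rescale in-degree K by factor f while preserving the mean and
    variance of the summed Poisson synaptic input:
        mean     mu     = K * w * nu   (plus DC term)
        variance sigma2 = K * w**2 * nu
    Weights scale as w/sqrt(f); a DC current restores the lost mean."""
    k_scaled = f * k_in
    w_scaled = w / math.sqrt(f)
    i_dc = (1.0 - math.sqrt(f)) * k_in * w * nu   # compensating DC drive
    return k_scaled, w_scaled, i_dc

def input_stats(k_in, w, nu, i_dc=0.0):
    """First two moments of the total input in the diffusion approximation."""
    mu = k_in * w * nu + i_dc
    sigma2 = k_in * w ** 2 * nu
    return mu, sigma2

k0, w0, nu = 1000, 0.1, 8.0                 # hypothetical original network
mu0, var0 = input_stats(k0, w0, nu)
k1, w1, i_dc = scale_network(k0, w0, nu, f=0.1)
mu1, var1 = input_stats(k1, w1, nu, i_dc)
```

With f = 0.1 the in-degree drops tenfold while both input moments are unchanged, which is what keeps first- and second-order activity statistics approximately invariant.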
Affiliation(s)
- Cecilia Romaro
- Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP 14049, Brazil
- Fernando Araujo Najman
- Institute of Mathematics and Statistics, University of São Paulo, São Paulo, SP 05508, Brazil
- William W Lytton
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, New York, NY 11203, U.S.A.
- Antonio C Roque
- Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP 14049, Brazil
- Salvador Dura-Bernal
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, New York, NY 11203, U.S.A., and Nathan Kline Institute for Psychiatric Research, New York, NY 10962, U.S.A.
29
Klinshov V, Kirillov S, Nekorkin V. Reduction of the collective dynamics of neural populations with realistic forms of heterogeneity. Phys Rev E 2021; 103:L040302. [PMID: 34005994 DOI: 10.1103/physreve.103.l040302] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2021] [Accepted: 03/16/2021] [Indexed: 11/07/2022]
Abstract
Reduction of collective dynamics of large heterogeneous populations to low-dimensional mean-field models is an important task of modern theoretical neuroscience. Such models can be derived from microscopic equations, for example with the help of Ott-Antonsen theory. An often used assumption of the Lorentzian distribution of the unit parameters makes the reduction especially efficient. However, the Lorentzian distribution is often implausible because all of its moments are undefined, and the collective behavior of populations with other distributions needs to be studied. In the present Letter we propose a method that allows efficient reduction for an arbitrary distribution and show how it performs for the Gaussian distribution. We show that a reduced system for several macroscopic complex variables provides an accurate description of a population of thousands of neurons. Using this reduction technique we demonstrate that the population dynamics depends significantly on the form of its parameter distribution. In particular, the dynamics of populations with Lorentzian and Gaussian distributions with the same center and width differ drastically.
Affiliation(s)
- Vladimir Klinshov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, 603950 Nizhny Novgorod, Russia
- Sergey Kirillov
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, 603950 Nizhny Novgorod, Russia
- Vladimir Nekorkin
- Institute of Applied Physics of the Russian Academy of Sciences, 46 Ul'yanov Street, 603950 Nizhny Novgorod, Russia
30
Causal cognitive architecture 1: Integration of connectionist elements into a navigation-based framework. COGN SYST RES 2021. [DOI: 10.1016/j.cogsys.2020.10.021] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
31
Liang J, Zhou T, Zhou C. Hopf Bifurcation in Mean Field Explains Critical Avalanches in Excitation-Inhibition Balanced Neuronal Networks: A Mechanism for Multiscale Variability. Front Syst Neurosci 2020; 14:580011. [PMID: 33324179 PMCID: PMC7725680 DOI: 10.3389/fnsys.2020.580011] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/04/2020] [Accepted: 11/02/2020] [Indexed: 12/14/2022] Open
Abstract
Cortical neural circuits display highly irregular spiking in individual neurons but variably sized collective firing, oscillations and critical avalanches at the population level, all of which have functional importance for information processing. Theoretically, the balance of excitatory and inhibitory inputs is thought to account for spiking irregularity, and critical avalanches may originate from an underlying phase transition. However, the theoretical reconciliation of these multilevel dynamic aspects in neural circuits remains an open question. Herein, we study an excitation-inhibition (E-I) balanced neuronal network with biologically realistic synaptic kinetics. It can maintain irregular spiking dynamics with different levels of synchrony and critical avalanches emerge near the synchronous transition point. We propose a novel semi-analytical mean-field theory to derive the field equations governing the network macroscopic dynamics. It reveals that the E-I balanced state of the network manifesting irregular individual spiking is characterized by a macroscopic stable state, which can be either a fixed point or a periodic motion and the transition is predicted by a Hopf bifurcation in the macroscopic field. Furthermore, by analyzing public data, we find the coexistence of irregular spiking and critical avalanches in the spontaneous spiking activities of mouse cortical slice in vitro, indicating the universality of the observed phenomena. Our theory unveils the mechanism that permits complex neural activities in different spatiotemporal scales to coexist and elucidates a possible origin of the criticality of neural systems. It also provides a novel tool for analyzing the macroscopic dynamics of E-I balanced networks and its relationship to the microscopic counterparts, which can be useful for large-scale modeling and computation of cortical dynamics.
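The role of a Hopf bifurcation can be illustrated on the linearization of a generic two-population E-I rate model (toy parameters, not the paper's semi-analytical field equations): the bifurcation occurs where the complex eigenvalue pair of the Jacobian crosses the imaginary axis.

```python
import cmath

def eig_real_parts(w_ee, w_ei=6.0, w_ie=6.0, w_ii=1.0, tau_e=1.0, tau_i=2.0):
    """Eigenvalue real parts of the linearized two-population rate model
        tau_e dE/dt = -E + w_ee*E - w_ei*I
        tau_i dI/dt = -I + w_ie*E - w_ii*I
    A Hopf bifurcation of the full nonlinear system corresponds to the
    complex pair crossing the imaginary axis. Toy parameters."""
    a = (w_ee - 1.0) / tau_e
    b = -w_ei / tau_e
    c = w_ie / tau_i
    d = -(1.0 + w_ii) / tau_i
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    lam1 = (tr + disc) / 2.0
    lam2 = (tr - disc) / 2.0
    return lam1.real, lam2.real
```

With the default parameters the trace vanishes at w_ee = 1 + tau_e*(1 + w_ii)/tau_i = 2, so the fixed point is stable below that coupling and gives way to oscillations above it.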
Affiliation(s)
- Junhao Liang
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Key Laboratory of Computational Mathematics, Guangdong Province, and School of Mathematics, Sun Yat-sen University, Guangzhou, China
- Tianshou Zhou
- Key Laboratory of Computational Mathematics, Guangdong Province, and School of Mathematics, Sun Yat-sen University, Guangzhou, China
- Changsong Zhou
- Department of Physics, Centre for Nonlinear Studies, Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong
- Department of Physics, Zhejiang University, Hangzhou, China
32
Genkin M, Engel TA. Moving beyond generalization to accurate interpretation of flexible models. NAT MACH INTELL 2020; 2:674-683. [DOI: 10.1038/s42256-020-00242-6] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
33
Kulkarni A, Ranft J, Hakim V. Synchronization, Stochasticity, and Phase Waves in Neuronal Networks With Spatially-Structured Connectivity. Front Comput Neurosci 2020; 14:569644. [PMID: 33192427 PMCID: PMC7604323 DOI: 10.3389/fncom.2020.569644] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/04/2020] [Accepted: 08/18/2020] [Indexed: 01/15/2023] Open
Abstract
Oscillations in the beta/low gamma range (10–45 Hz) are recorded in diverse neural structures. They have successfully been modeled as sparsely synchronized oscillations arising from reciprocal interactions between randomly connected excitatory (E) pyramidal cells and local interneurons (I). The synchronization of spatially distant oscillatory spiking E–I modules has been well-studied in the rate model framework but less so for modules of spiking neurons. Here, we first show that previously proposed modifications of rate models provide a quantitative description of spiking E–I modules of Exponential Integrate-and-Fire (EIF) neurons. This allows us to analyze the dynamical regimes of sparsely synchronized oscillatory E–I modules connected by long-range excitatory interactions, for two modules, as well as for a chain of such modules. For modules with a large number of neurons (> 105), we obtain results similar to previously obtained ones based on the classic deterministic Wilson-Cowan rate model, with the added bonus that the results quantitatively describe simulations of spiking EIF neurons. However, for modules with a moderate (~ 104) number of neurons, stochastic variations in the spike emission of neurons are important and need to be taken into account. On the one hand, they modify the oscillations in a way that tends to promote synchronization between different modules. On the other hand, independent fluctuations on different modules tend to disrupt synchronization. The correlations between distant oscillatory modules can be described by stochastic equations for the oscillator phases that have been intensely studied in other contexts. On shorter distances, we develop a description that also takes into account amplitude modes and that quantitatively accounts for our simulation data. Stochastic dephasing of neighboring modules produces transient phase gradients and the transient appearance of phase waves. 
We propose that these stochastically-induced phase waves provide an explanative framework for the observations of traveling waves in the cortex during beta oscillations.
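The stochastic phase description can be caricatured for two modules: each oscillatory E-I module reduces to a noisy phase, and long-range coupling K competes with independent noise on each module (toy parameters, our own sketch):

```python
import math
import random

def phase_locking(coupling, sigma=0.6, omega1=1.0, omega2=1.05,
                  dt=1e-3, steps=100000, seed=11):
    """Two noisy phase oscillators standing in for adjacent modules:
        dphi_i = omega_i dt + K sin(phi_j - phi_i) dt + sigma dW_i
    Returns the phase-locking index |<exp(i*(phi1 - phi2))>| in [0, 1]."""
    rng = random.Random(seed)
    p1, p2 = 0.0, 0.0
    re_sum, im_sum = 0.0, 0.0
    noise = sigma * math.sqrt(dt)
    for _ in range(steps):
        d = p2 - p1
        p1 += omega1 * dt + coupling * math.sin(d) * dt + noise * rng.gauss(0.0, 1.0)
        p2 += omega2 * dt - coupling * math.sin(d) * dt + noise * rng.gauss(0.0, 1.0)
        re_sum += math.cos(p1 - p2)
        im_sum += math.sin(p1 - p2)
    return math.hypot(re_sum, im_sum) / steps

r_locked = phase_locking(2.0)   # strong long-range coupling
r_free = phase_locking(0.0)     # uncoupled modules
```

Increasing K raises the phase-locking index, while independent noise on each module degrades it; transient losses of locking are the source of the phase gradients and waves described above.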
Affiliation(s)
- Anirudh Kulkarni
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France; IBENS, Ecole Normale Supérieure, PSL University, CNRS, INSERM, Paris, France
- Jonas Ranft
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France; IBENS, Ecole Normale Supérieure, PSL University, CNRS, INSERM, Paris, France
- Vincent Hakim
- Laboratoire de Physique de l'Ecole Normale Supérieure, CNRS, Ecole Normale Supérieure, PSL University, Sorbonne Université, Université de Paris, Paris, France
34
Generation of Sharp Wave-Ripple Events by Disinhibition. J Neurosci 2020; 40:7811-7836. [PMID: 32913107 PMCID: PMC7548694 DOI: 10.1523/jneurosci.2174-19.2020] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/09/2019] [Revised: 06/29/2020] [Accepted: 07/17/2020] [Indexed: 11/21/2022] Open
Abstract
Sharp wave-ripple complexes (SWRs) are hippocampal network phenomena involved in memory consolidation. To date, the mechanisms underlying their occurrence remain obscure. Here, we show how the interactions between pyramidal cells, parvalbumin-positive (PV+) basket cells, and an unidentified class of anti-SWR interneurons can contribute to the initiation and termination of SWRs. Using a biophysically constrained model of a network of spiking neurons and a rate-model approximation, we demonstrate that SWRs emerge as a result of the competition between two interneuron populations and the resulting disinhibition of pyramidal cells. Our models explain how the activation of pyramidal cells or PV+ cells can trigger SWRs, as shown in vitro, and suggests that PV+ cell-mediated short-term synaptic depression influences the experimentally reported dynamics of SWR events. Furthermore, we predict that the silencing of anti-SWR interneurons can trigger SWRs. These results broaden our understanding of the microcircuits supporting the generation of memory-related network dynamics. SIGNIFICANCE STATEMENT The hippocampus is a part of the mammalian brain that is crucial for episodic memories. During periods of sleep and inactive waking, the extracellular activity of the hippocampus is dominated by sharp wave-ripple events (SWRs), which have been shown to be important for memory consolidation. The mechanisms regulating the emergence of these events are still unclear. We developed a computational model to study the emergence of SWRs and to explain the roles of different cell types in regulating them. The model accounts for several previously unexplained features of SWRs and thus advances the understanding of memory-related dynamics.
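The core disinhibition mechanism can be caricatured with a threshold-linear rate model: two mutually inhibitory interneuron populations (a PV-like population P and an anti-SWR population A) form a winner-take-all switch, and the pyramidal rate E is released when A loses. Toy parameters of our own, not the biophysically constrained network.

```python
def swr_disinhibition(dt=0.01, t_max=20.0):
    """Threshold-linear rate caricature of SWR initiation by disinhibition.
    A (anti-SWR) and P (PV-like) inhibit each other; A also inhibits E.
    Transiently silencing A's drive flips the switch to the P state,
    which disinhibits E. Returns E before and after the perturbation."""
    def relu(x):
        return max(x, 0.0)

    a, p, e = 1.0, 0.0, 0.0              # start in the A-dominated state
    h_a, h_p, h_e, w = 1.0, 1.0, 0.2, 2.0
    e_before = 0.0
    for step in range(int(t_max / dt)):
        t = step * dt
        if t < 8.0:
            e_before = e                 # pyramidal rate before perturbation
        pulse = 1.0 if 8.0 <= t < 11.0 else 0.0   # silence A's drive
        da = -a + relu(h_a - pulse - w * p)
        dp = -p + relu(h_p - w * a)
        de = -e + relu(h_e - a)
        a, p, e = a + dt * da, p + dt * dp, e + dt * de
    return e_before, e

e_before, e_after = swr_disinhibition()
```

In this caricature the switch stays in the P state after the pulse ends, mirroring the prediction that silencing the anti-SWR population is sufficient to trigger an event.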
35
Staiger JF, Petersen CCH. Neuronal Circuits in Barrel Cortex for Whisker Sensory Perception. Physiol Rev 2020; 101:353-415. [PMID: 32816652 DOI: 10.1152/physrev.00019.2019] [Citation(s) in RCA: 56] [Impact Index Per Article: 11.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023] Open
Abstract
The array of whiskers on the snout provides rodents with tactile sensory information relating to the size, shape and texture of objects in their immediate environment. Rodents can use their whiskers to detect stimuli, distinguish textures, locate objects and navigate. Important aspects of whisker sensation are thought to result from neuronal computations in the whisker somatosensory cortex (wS1). Each whisker is individually represented in the somatotopic map of wS1 by an anatomical unit named a 'barrel' (hence also called barrel cortex). This allows precise investigation of sensory processing in the context of a well-defined map. Here, we first review the signaling pathways from the whiskers to wS1, and then discuss current understanding of the various types of excitatory and inhibitory neurons present within wS1. Different classes of cells can be defined according to anatomical, electrophysiological and molecular features. The synaptic connectivity of neurons within local wS1 microcircuits, as well as their long-range interactions and the impact of neuromodulators, are beginning to be understood. Recent technological progress has allowed cell-type-specific connectivity to be related to cell-type-specific activity during whisker-related behaviors. An important goal for future research is to obtain a causal and mechanistic understanding of how selected aspects of tactile sensory information are processed by specific types of neurons in the synaptically connected neuronal networks of wS1 and signaled to downstream brain areas, thus contributing to sensory-guided decision-making.
Affiliation(s)
- Jochen F Staiger
- University Medical Center Göttingen, Institute for Neuroanatomy, Göttingen, Germany
- Carl C H Petersen
- Laboratory of Sensory Processing, Faculty of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
36
René A, Longtin A, Macke JH. Inference of a Mesoscopic Population Model from Population Spike Trains. Neural Comput 2020; 32:1448-1498. [DOI: 10.1162/neco_a_01292] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Understanding how rich dynamics emerge in neural populations requires models exhibiting a wide range of behaviors while remaining interpretable in terms of connectivity and single-neuron dynamics. However, it has been challenging to fit such mechanistic spiking networks at the single-neuron scale to empirical population data. To close this gap, we propose to fit such data at a mesoscale, using a mechanistic but low-dimensional and, hence, statistically tractable model. The mesoscopic representation is obtained by approximating a population of neurons as multiple homogeneous pools of neurons and modeling the dynamics of the aggregate population activity within each pool. We derive the likelihood of both single-neuron and connectivity parameters given this activity, which can then be used to optimize parameters by gradient ascent on the log likelihood or perform Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. We illustrate this approach using a model of generalized integrate-and-fire neurons for which mesoscopic dynamics have been previously derived and show that both single-neuron and connectivity parameters can be recovered from simulated data. In particular, our inference method extracts posterior correlations between model parameters, which define parameter subsets able to reproduce the data. We compute the Bayesian posterior for combinations of parameters using MCMC sampling and investigate how the approximations inherent in a mesoscopic population model affect the accuracy of the inferred single-neuron parameters.
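The likelihood idea can be illustrated in its simplest form: if each time bin's population count is binomially distributed given the model rate, a parameter is recovered by maximizing the log likelihood over a grid. This is a toy observation model of our own, not the generalized integrate-and-fire mesoscopic model.

```python
import math
import random

def simulate_counts(n_neurons=200, lam=12.0, dt=0.005, n_bins=2000, seed=3):
    """Mesoscopic observation model: in each bin every neuron spikes
    independently with p = 1 - exp(-lam*dt); record population counts."""
    rng = random.Random(seed)
    p = 1.0 - math.exp(-lam * dt)
    return [sum(1 for _ in range(n_neurons) if rng.random() < p)
            for _ in range(n_bins)]

def log_likelihood(counts, lam, n_neurons=200, dt=0.005):
    """Binomial log-likelihood of the counts, up to a lam-independent
    constant (the binomial coefficients drop out of the argmax)."""
    p = 1.0 - math.exp(-lam * dt)
    return sum(c * math.log(p) + (n_neurons - c) * math.log(1.0 - p)
               for c in counts)

counts = simulate_counts()                       # true rate: 12 Hz
grid = [6.0 + 0.25 * i for i in range(49)]       # candidate rates 6..18 Hz
lam_hat = max(grid, key=lambda l: log_likelihood(counts, l))
```

The mesoscopic likelihood in the paper plays the same role, but with the bin probabilities generated by the mechanistic population dynamics, so that single-neuron and connectivity parameters enter the likelihood.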
Affiliation(s)
- Alexandre René
- Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany; and Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich 52425, Germany
- André Longtin
- Department of Physics, University of Ottawa, Ottawa K1N 6N5, Canada, and Brain and Mind Research Institute, University of Ottawa, Ottawa K1H 8M5, Canada
- Jakob H. Macke
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn 53175, Germany, and Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich 80333, Germany
| |
37
Spike-induced ordering: Stochastic neural spikes provide immediate adaptability to the sensorimotor system. Proc Natl Acad Sci U S A 2020;117:12486-12496. PMID: 32430332; PMCID: PMC7275765; DOI: 10.1073/pnas.1819707117.
Abstract
The functional advantages of using a stochastically spiking neural network (sSNN) instead of a nonspiking neural network (NS-NN) have remained largely unknown. We developed an architecture that enables parametric adjustment of the spikiness (i.e., impulsive dynamics and stochasticity) of the sSNN output and observed that stochastic spikes instantaneously induce ordered motion of a dynamical system. We demonstrated the benefits of sSNNs using a musculoskeletal bipedal walker and, moreover, showed that decreasing the spikiness of motor-neuron output reduces adaptability. Stochastic spikes may aid the adaptation of a biological system to sudden perturbations or environmental changes. Our architecture can easily be connected to a conventional NS-NN and may superimpose on-site adaptability on it. Most biological neurons exhibit stochastic, spiking action potentials. However, the benefits of stochastic spikes over continuous signals, other than noise tolerance and energy efficiency, remain largely unknown. In this study, we provide insight into the potential roles of stochastic spikes, which may be beneficial for producing on-site adaptability in biological sensorimotor agents. We developed a platform that enables parametric modulation between the stochastic, discontinuous output of a stochastically spiking neural network (sSNN) and rate-coded smooth output. This platform was applied to a complex musculoskeletal-neural system of a bipedal walker, and we demonstrated how stochastic spikes may help improve the on-site adaptability of a bipedal walker to slippery surfaces or perturbation by random external forces.
We further applied our sSNN platform to more general and simpler sensorimotor agents and demonstrated four basic functions provided by an sSNN: 1) synchronization to a natural frequency, 2) amplification of resonant motion at a natural frequency, 3) enlargement of the basin of the behavioral goal state, and 4) rapid complexity reduction and regular motion-pattern formation. We propose that the benefits of sSNNs are not limited to musculoskeletal dynamics; indeed, a wide range of the stability and adaptability of biological systems may arise from stochastic spiking dynamics.
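The spikiness knob that the platform exposes can be mimicked in a few lines. This toy sketch (my own parametrization, not the authors' architecture) mixes a Bernoulli spike train with its underlying rate so that the mean motor drive is preserved while the output fluctuations grow with spikiness:

```python
import random
import statistics

random.seed(1)

def motor_output(rate_hz, spikiness, dt=0.001, n_bins=5000):
    """Interpolate between rate-coded (spikiness=0) and stochastic spiking
    (spikiness=1) motor output with the same mean drive."""
    p = rate_hz * dt                       # spike probability per bin
    out = []
    for _ in range(n_bins):
        spike = (1.0 / dt) if random.random() < p else 0.0  # unit-area impulse
        out.append(spikiness * spike + (1.0 - spikiness) * rate_hz)
    return out

smooth = motor_output(40.0, 0.0)   # NS-NN-like, rate-coded output
spiky = motor_output(40.0, 1.0)    # sSNN-like, impulsive output
```

Both signals deliver roughly 40 units of drive per second on average, but the spiky one injects large impulsive fluctuations — the ingredient the paper argues induces ordering and adaptability in the driven mechanical system.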
38
Todorov D, Truccolo W. Stability of stochastic finite-size spiking-neuron networks: Comparing mean-field, 1-loop correction and quasi-renewal approximations. Annu Int Conf IEEE Eng Med Biol Soc 2019:4380-4386. PMID: 31946838; DOI: 10.1109/embc.2019.8857101.
Abstract
We examine the stability and qualitative dynamics of stochastic neuronal networks specified as multivariate non-linear Hawkes processes and related point-process generalized linear models that incorporate both auto- and cross-history effects. In particular, we adapt previous theoretical approximations based on mean field and mean field plus 1-loop correction to incorporate absolute refractory periods and other auto-history effects. Furthermore, we extend previous quasi-renewal approximations to the multivariate case, i.e. neuronal networks. The best sensitivity and specificity performance, in terms of predicting stability and divergence to nonphysiologically high firing rates in the examined simulations, was obtained by a variant of the quasi-renewal approximation.
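As a minimal illustration of the mean-field side of such stability analyses (a generic rate fixed-point sketch with an assumed sigmoidal transfer function, not the paper's quasi-renewal treatment of Hawkes processes), one can iterate the self-consistency equation r = φ(W r + h) for a two-population network and check local stability through the Jacobian:

```python
import math

# Two populations (excitatory, inhibitory) with synaptic gain matrix W and
# external drives h; phi is a saturating transfer function (a modeling choice).
W = [[0.8, -1.2],
     [0.9, -0.4]]
h = [1.0, 0.8]

def phi(x):
    return 1.0 / (1.0 + math.exp(-x))      # sigmoidal rate transfer

def phi_prime(x):
    s = phi(x)
    return s * (1.0 - s)

# Mean-field fixed point by direct iteration of r = phi(W r + h).
r = [0.5, 0.5]
for _ in range(500):
    u = [sum(W[i][j] * r[j] for j in range(2)) + h[i] for i in range(2)]
    r = [phi(ui) for ui in u]

# Local stability: eigenvalues of the Jacobian J_ij = phi'(u_i) * W_ij
# of the fixed-point map; spectral radius < 1 means a stable rate state.
u = [sum(W[i][j] * r[j] for j in range(2)) + h[i] for i in range(2)]
J = [[phi_prime(u[i]) * W[i][j] for j in range(2)] for i in range(2)]
tr = J[0][0] + J[1][1]
det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
disc = tr * tr - 4.0 * det
if disc >= 0:
    spectral_radius = max(abs((tr + math.sqrt(disc)) / 2),
                          abs((tr - math.sqrt(disc)) / 2))
else:
    spectral_radius = math.sqrt(det)        # complex pair: |lambda| = sqrt(det)
stable = spectral_radius < 1.0
```

The paper's contribution is precisely what this sketch omits: auto-history effects such as absolute refractoriness, which the quasi-renewal approximation folds into the effective transfer function.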
39
Biophysically grounded mean-field models of neural populations under electrical stimulation. PLoS Comput Biol 2020;16:e1007822. PMID: 32324734; PMCID: PMC7200022; DOI: 10.1371/journal.pcbi.1007822.
Abstract
Electrical stimulation of neural systems is a key tool for understanding neural dynamics and ultimately for developing clinical treatments. Many applications of electrical stimulation affect large populations of neurons. However, computational models of large networks of spiking neurons are inherently hard to simulate and analyze. We evaluate a reduced mean-field model of excitatory and inhibitory adaptive exponential integrate-and-fire (AdEx) neurons which can be used to efficiently study the effects of electrical stimulation on large neural populations. The rich dynamical properties of this basic cortical model are described in detail and validated using large network simulations. Bifurcation diagrams reflecting the network's state reveal asynchronous up- and down-states, bistable regimes, and oscillatory regions corresponding to fast excitation-inhibition and slow excitation-adaptation feedback loops. The biophysical parameters of the AdEx neuron can be coupled to an electric field with realistic field strengths which then can be propagated up to the population description. We show how on the edge of bifurcation, direct electrical inputs cause network state transitions, such as turning on and off oscillations of the population rate. Oscillatory input can frequency-entrain and phase-lock endogenous oscillations. Relatively weak electric field strengths on the order of 1 V/m are able to produce these effects, indicating that field effects are strongly amplified in the network. The effects of time-varying external stimulation are well-predicted by the mean-field model, further underpinning the utility of low-dimensional neural mass models.
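A single-neuron version of the setup can be sketched directly from the AdEx equations. The following toy simulation uses standard textbook AdEx parameters (Brette-Gerstner values) and reduces the field coupling to a hypothetical additive current — far simpler than the paper's biophysical polarization term and population-level propagation — to show a subthreshold neuron being pushed into spiking by the extra input:

```python
import math

def adex_spike_count(I_ext_pA, t_max_ms=500.0, dt=0.1):
    """Euler simulation of an adaptive exponential integrate-and-fire neuron.
    Units: pF, nS, mV, ms, pA."""
    C, gL, EL = 281.0, 30.0, -70.6
    VT, DT = -50.4, 2.0
    tau_w, a, b = 144.0, 4.0, 80.5
    V_reset, V_peak = -70.6, 0.0
    V, w, spikes = EL, 0.0, 0
    for _ in range(int(t_max_ms / dt)):
        # Clamping the exponent is a numerical safeguard against overflow.
        exp_term = gL * DT * math.exp(min((V - VT) / DT, 30.0))
        dV = (-gL * (V - EL) + exp_term - w + I_ext_pA) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                 # spike: reset and increment adaptation
            V = V_reset
            w += b
            spikes += 1
    return spikes

baseline = adex_spike_count(450.0)            # below rheobase: silent
stimulated = adex_spike_count(450.0 + 250.0)  # extra "field-induced" current
```

In the paper, the field enters through a biophysically derived term and is carried up to the mean-field description; the additive current here only illustrates the qualitative effect of stimulation near the edge of a bifurcation.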
40
Schmutz V, Gerstner W, Schwalger T. Mesoscopic population equations for spiking neural networks with synaptic short-term plasticity. J Math Neurosci 2020;10:5. PMID: 32253526; PMCID: PMC7136387; DOI: 10.1186/s13408-020-00082-z.
Abstract
Coarse-graining microscopic models of biological neural networks to obtain mesoscopic models of neural activities is an essential step towards multi-scale models of the brain. Here, we extend a recent theory for mesoscopic population dynamics with static synapses to the case of dynamic synapses exhibiting short-term plasticity (STP). The extended theory offers an approximate mean-field dynamics for the synaptic input currents arising from populations of spiking neurons and synapses undergoing Tsodyks-Markram STP. The approximate mean-field dynamics accounts for both the finite number of synapses and the correlation between the two synaptic variables of the model (utilization and available resources), and its numerical implementation is simple. Comparisons with Monte Carlo simulations of the microscopic model show that, in both feedforward and recurrent networks, the mesoscopic mean-field model accurately reproduces the first- and second-order statistics of the total synaptic input into a postsynaptic neuron and accounts for stochastic switches between Up and Down states and for population spikes. The extended mesoscopic population theory of spiking neural networks with STP may be useful for a systematic reduction of detailed biophysical models of cortical microcircuits to numerically efficient and mathematically tractable mean-field models.
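The microscopic synapse model underlying this theory is the Tsodyks-Markram scheme, which is compact enough to state in code. This event-driven sketch (standard update order: facilitate, transmit, deplete; parameters chosen by me to make the synapse depressing) tracks utilization u and available resources x across a regular presynaptic spike train:

```python
import math

def tm_psc_amplitudes(rate_hz, n_spikes, U=0.5, tau_d=0.5, tau_f=0.05):
    """Relative PSC amplitudes u*x of a Tsodyks-Markram synapse driven at a
    fixed rate. Between spikes u relaxes to U (tau_f) and x recovers to 1
    (tau_d); at each spike u facilitates and x is depleted."""
    dt = 1.0 / rate_hz
    u, x = U, 1.0
    amps = []
    for k in range(n_spikes):
        if k > 0:                                # exact inter-spike relaxation
            u = U + (u - U) * math.exp(-dt / tau_f)
            x = 1.0 + (x - 1.0) * math.exp(-dt / tau_d)
        u = u + U * (1.0 - u)                    # facilitation jump at the spike
        amps.append(u * x)                       # transmitted fraction ~ PSC size
        x = x * (1.0 - u)                        # depletion of resources
    return amps

amps = tm_psc_amplitudes(20.0, 10)               # depressing train at 20 Hz
```

The mesoscopic theory in the paper replaces this per-synapse bookkeeping with mean-field dynamics for the averages of u and x and, crucially, for their correlation across a finite pool of such synapses.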
Affiliation(s)
- Valentin Schmutz
  Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Wulfram Gerstner
  Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Tilo Schwalger
  Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland; Bernstein Center for Computational Neuroscience, Institut für Mathematik, Technische Universität Berlin, Berlin, Germany
41
Carlu M, Chehab O, Dalla Porta L, Depannemaecker D, Héricé C, Jedynak M, Köksal Ersöz E, Muratore P, Souihel S, Capone C, Zerlaut Y, Destexhe A, di Volo M. A mean-field approach to the dynamics of networks of complex neurons, from nonlinear Integrate-and-Fire to Hodgkin-Huxley models. J Neurophysiol 2020;123:1042-1051. PMID: 31851573; PMCID: PMC7099478; DOI: 10.1152/jn.00399.2019.
Abstract
We present a mean-field formalism able to predict the collective dynamics of large networks of conductance-based interacting spiking neurons. We apply this formalism to several neuronal models, from the simplest Adaptive Exponential Integrate-and-Fire model to the more complex Hodgkin-Huxley and Morris-Lecar models. We show that the resulting mean-field models are capable of predicting the correct spontaneous activity of both excitatory and inhibitory neurons in asynchronous irregular regimes, typical of cortical dynamics. Moreover, it is possible to quantitatively predict the population response to external stimuli in the form of external spike trains. This mean-field formalism therefore provides a paradigm to bridge the scale between population dynamics and the microscopic complexity of the individual cells' physiology.

NEW & NOTEWORTHY: Population models are a powerful mathematical tool to study the dynamics of neuronal networks and to simulate the brain at macroscopic scales. We present a mean-field model capable of quantitatively predicting the temporal dynamics of a network of complex spiking neuronal models, from Integrate-and-Fire to Hodgkin-Huxley, thus linking population models to neuron electrophysiology. This opens a perspective on generating biologically realistic mean-field models from electrophysiological recordings.
Affiliation(s)
- M. Carlu
  Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France
- O. Chehab
  École Normale Supérieure Paris-Saclay, France
- L. Dalla Porta
  Institut d'Investigacions Biomèdiques August Pi i Sunyer, Barcelona, Spain
- D. Depannemaecker
  Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France
- C. Héricé
  Strathclyde Institute of Pharmacy and Biomedical Sciences, Glasgow, Scotland, United Kingdom
- M. Jedynak
  Université Grenoble Alpes, Grenoble Institut des Neurosciences and Institut National de la Santé et de la Recherche Médicale (INSERM), U1216, France
- E. Köksal Ersöz
  INSERM, U1099, Rennes, France; MathNeuro Team, Inria Sophia Antipolis Méditerranée, Sophia Antipolis, France
- P. Muratore
  Physics Department, Sapienza University, Rome, Italy
- S. Souihel
  Université Côte d'Azur, Inria Sophia Antipolis Méditerranée, France
- C. Capone
  Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France
- Y. Zerlaut
  Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France
- A. Destexhe
  Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France
- M. di Volo
  Department of Integrative and Computational Neuroscience, Paris-Saclay Institute of Neuroscience, Centre National de la Recherche Scientifique, Gif-sur-Yvette, France; Laboratoire de Physique Théorique et Modélisation, Université de Cergy-Pontoise, Cergy-Pontoise, France
42
Schmidt H, Avitabile D. Bumps and oscillons in networks of spiking neurons. Chaos 2020;30:033133. PMID: 32237760; DOI: 10.1063/1.5135579.
Abstract
We study localized patterns in an exact mean-field description of a spatially extended network of quadratic integrate-and-fire neurons. We investigate conditions for the existence and stability of localized solutions, so-called bumps, and give an analytic estimate for the parameter range where these solutions exist as one or more microscopic network parameters are varied. We develop Galerkin methods for the model equations, which enable numerical bifurcation analysis of stationary and time-periodic spatially extended solutions. We study the emergence of patterns composed of multiple bumps, which are arranged in a snake-and-ladder bifurcation structure when a homogeneous or heterogeneous synaptic kernel is suitably chosen. Furthermore, we examine time-periodic, spatially localized solutions (oscillons) in the presence of external forcing, and in autonomous, recurrently coupled excitatory and inhibitory networks. In both cases, we observe period-doubling cascades leading to chaotic oscillations.
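At a single spatial point, the exact mean-field description referred to here reduces to the two-variable firing-rate/mean-voltage equations of Montbrió, Pazó, and Roxin for quadratic integrate-and-fire populations with Lorentzian heterogeneity. The sketch below integrates that non-spatial system to its fixed point; the parameter values are illustrative, not taken from the paper:

```python
import math

# Exact QIF mean-field equations (Montbrio, Pazo & Roxin 2015):
#   dr/dt = Delta / pi + 2 r v
#   dv/dt = v**2 + eta + J r - (pi r)**2
Delta, eta, J = 1.0, 1.0, 6.0        # heterogeneity width, mean drive, coupling
dt, t_max = 0.001, 100.0

r, v = 0.7, -0.2                     # start near the expected equilibrium
for _ in range(int(t_max / dt)):
    dr = Delta / math.pi + 2.0 * r * v
    dv = v * v + eta + J * r - (math.pi * r) ** 2
    r += dt * dr
    v += dt * dv

# Residuals of the mean-field equations at the final state (zero at a fixed point)
res_r = Delta / math.pi + 2.0 * r * v
res_v = v * v + eta + J * r - (math.pi * r) ** 2
```

For these values the system spirals into a stable focus near r ≈ 0.75; the spatially extended model in the paper couples a continuum of such units through a synaptic kernel, which is what makes bumps and oscillons possible.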
Affiliation(s)
- Helmut Schmidt
  Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a, 04103 Leipzig, Germany
- Daniele Avitabile
  Department of Mathematics, Faculteit der Exacte Wetenschappen, Vrije Universiteit (VU University Amsterdam), De Boelelaan 1081a, 1081 HV Amsterdam, Netherlands; MathNeuro Team, Inria Sophia Antipolis, 2004 Rue des Lucioles, Sophia Antipolis, 06902 Cedex, France
43

44
Koren V, Andrei AR, Hu M, Dragoi V, Obermayer K. Reading-out task variables as a low-dimensional reconstruction of neural spike trains in single trials. PLoS One 2019;14:e0222649. PMID: 31622346; PMCID: PMC6797168; DOI: 10.1371/journal.pone.0222649.
Abstract
We propose a new model of the read-out of spike trains that exploits the multivariate structure of responses of neural ensembles. Adopting the point of view of a read-out neuron that receives synaptic inputs from a population of projecting neurons, synaptic inputs are weighted with a heterogeneous set of weights. We propose that synaptic weights reflect the role of each neuron within the population for the computational task that the network has to solve. In our case, the computational task is discrimination of binary classes of stimuli, and the weights are chosen to maximize the discrimination capacity of the network. We compute synaptic weights as the feature weights of an optimal linear classifier. Once the weights have been learned, they weight spike trains and allow computation of the postsynaptic current that modulates the spiking probability of the read-out unit in real time. We apply the model to parallel spike trains from areas V1 and V4 of the behaving monkey Macaca mulatta while the animal is engaged in a visual discrimination task with binary classes of stimuli. Reading out spike trains with our model discriminates the two classes of stimuli, while the population PSTH entirely fails to do so. Splitting neurons into two subpopulations according to the sign of the weight, we show that the population signals of the two functional subnetworks are negatively correlated. Distinguishing the superficial, middle, and deep layers of the cortex, we show that in both V1 and V4 the superficial layers are the most important in discriminating the binary classes of stimuli.
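The read-out scheme can be caricatured with synthetic spike counts. In this sketch (toy data with hypothetical tuning; a nearest-centroid linear read-out stands in for the regularized optimal linear classifier used in the paper), weights learned on training trials turn held-out population responses into a scalar decision variable:

```python
import random

random.seed(2)

n_neurons, n_train, n_test = 20, 100, 100

def trial(label):
    """Spike counts of the population in one trial; half the neurons are
    tuned to class 1 (hypothetical tuning, Gaussian count noise)."""
    counts = []
    for i in range(n_neurons):
        base = 10.0 if (label == 1 and i < n_neurons // 2) else 5.0
        counts.append(max(0.0, random.gauss(base, 3.0)))
    return counts

train = [(trial(lbl), lbl) for lbl in (0, 1) for _ in range(n_train)]
test = [(trial(lbl), lbl) for lbl in (0, 1) for _ in range(n_test)]

# Read-out weights: difference of class-mean responses (nearest-centroid rule).
mean = {lbl: [0.0] * n_neurons for lbl in (0, 1)}
for x, lbl in train:
    for i, xi in enumerate(x):
        mean[lbl][i] += xi / n_train
w = [mean[1][i] - mean[0][i] for i in range(n_neurons)]
bias = -0.5 * sum(wi * (mean[1][i] + mean[0][i]) for i, wi in enumerate(w))

def decide(x):
    """Weighted sum of counts = the read-out neuron's decision variable."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + bias > 0 else 0

accuracy = sum(decide(x) == lbl for x, lbl in test) / len(test)
```

Neurons with positive and negative weights correspond to the two functional subpopulations the paper analyzes; on real V1/V4 data the weights come from a trained linear classifier rather than class means.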
Affiliation(s)
- Veronika Koren
  Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
- Ariana R. Andrei
  Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, Texas, United States of America
- Ming Hu
  Picower Institute for Learning and Memory, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America
- Valentin Dragoi
  Department of Neurobiology and Anatomy, University of Texas Medical School, Houston, Texas, United States of America
- Klaus Obermayer
  Neural Information Processing Group, Institute of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience, Berlin, Germany
45
Schwalger T, Chizhov AV. Mind the last spike - firing rate models for mesoscopic populations of spiking neurons. Curr Opin Neurobiol 2019;58:155-166. PMID: 31590003; DOI: 10.1016/j.conb.2019.08.003.
Abstract
The dominant modeling framework for understanding cortical computations is the heuristic firing rate model. Despite their success, such models fall short of capturing spike-synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved by the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. These new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
Affiliation(s)
- Tilo Schwalger
  Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany; Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Anton V Chizhov
  Ioffe Institute, 194021 Saint Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, 194223 Saint Petersburg, Russia
46
Gleeson P, Cantarelli M, Marin B, Quintana A, Earnshaw M, Sadeh S, Piasini E, Birgiolas J, Cannon RC, Cayco-Gajic NA, Crook S, Davison AP, Dura-Bernal S, Ecker A, Hines ML, Idili G, Lanore F, Larson SD, Lytton WW, Majumdar A, McDougal RA, Sivagnanam S, Solinas S, Stanislovas R, van Albada SJ, van Geit W, Silver RA. Open Source Brain: A Collaborative Resource for Visualizing, Analyzing, Simulating, and Developing Standardized Models of Neurons and Circuits. Neuron 2019;103:395-411.e5. PMID: 31201122; PMCID: PMC6693896; DOI: 10.1016/j.neuron.2019.05.019.
Abstract
Computational models are powerful tools for exploring the properties of complex biological systems. In neuroscience, data-driven models of neural circuits that span multiple scales are increasingly being used to understand brain function in health and disease. But their adoption and reuse has been limited by the specialist knowledge required to evaluate and use them. To address this, we have developed Open Source Brain, a platform for sharing, viewing, analyzing, and simulating standardized models from different brain regions and species. Model structure and parameters can be automatically visualized and their dynamical properties explored through browser-based simulations. Infrastructure and tools for collaborative interaction, development, and testing are also provided. We demonstrate how existing components can be reused by constructing new models of inhibition-stabilized cortical networks that match recent experimental results. These features of Open Source Brain improve the accessibility, transparency, and reproducibility of models and facilitate their reuse by the wider community.
Affiliation(s)
- Padraig Gleeson
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Matteo Cantarelli
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK; MetaCell Limited, Oxford, UK
- Boris Marin
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK; Centro de Matemática, Computação e Cognição, Universidade Federal do ABC, Santo André, Brazil
- Adrian Quintana
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Matt Earnshaw
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Sadra Sadeh
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Eugenio Piasini
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK; Computational Neuroscience Initiative and Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA, USA
- Justas Birgiolas
  School of Life Sciences, Arizona State University, Tempe, AZ, USA
- N Alex Cayco-Gajic
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Sharon Crook
  School of Life Sciences, Arizona State University, Tempe, AZ, USA; School of Mathematical and Statistical Sciences, Arizona State University, Tempe, AZ, USA
- Andrew P Davison
  Unité de Neuroscience, Information et Complexité, Centre National de la Recherche Scientifique, Paris, France
- András Ecker
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK; Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Michael L Hines
  Department of Neuroscience, Yale School of Medicine, New Haven, CT, USA
- Frederic Lanore
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- William W Lytton
  SUNY Downstate Medical Center and Kings County Hospital, Brooklyn, NY, USA
- Robert A McDougal
  Department of Neuroscience, Yale School of Medicine, New Haven, CT, USA; Center for Medical Informatics, Yale University, New Haven, CT, USA
- Sergio Solinas
  Department of Biomedical Science, University of Sassari, Sassari, Italy; Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Rokas Stanislovas
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
- Sacha J van Albada
  Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6) and JARA-Institut Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Werner van Geit
  Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- R Angus Silver
  Department of Neuroscience, Physiology and Pharmacology, University College London, London, UK
47
Lagzi F, Atay FM, Rotter S. Bifurcation analysis of the dynamics of interacting subnetworks of a spiking network. Sci Rep 2019;9:11397. PMID: 31388027; PMCID: PMC6684592; DOI: 10.1038/s41598-019-47190-9.
Abstract
We analyze the collective dynamics of hierarchically structured networks of densely connected spiking neurons. These networks of subnetworks may represent interactions between cell assemblies or different nuclei in the brain. The dynamical activity pattern that results from these interactions depends on the strength of the synaptic coupling between them. Importantly, the overall dynamics of a brain region in the absence of external input, so-called ongoing brain activity, has been attributed to the dynamics of such interactions. In our study, two different network scenarios are considered: a system with one inhibitory and two excitatory subnetworks, and a network representation with three inhibitory subnetworks. To study the effect of synaptic strength on the global dynamics of the network, two parameters for the relative couplings between these subnetworks are considered. For each case, a bifurcation analysis is performed and the results are compared to large-scale network simulations. Our analysis shows that Generalized Lotka-Volterra (GLV) equations, well known in predator-prey studies, yield a meaningful population-level description of the collective behavior of spiking neuronal interactions in hierarchically structured networks. In particular, we observed a striking equivalence between the bifurcation diagrams of the spiking neuronal networks and their corresponding GLV equations. This study gives new insight into the behavior of neuronal assemblies and suggests possible mechanisms for altering the dynamical patterns of spiking networks by changing the synaptic strength between groups of neurons.
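The population-level description the authors arrive at is a Generalized Lotka-Volterra system, which is easy to simulate directly. This sketch (coupling values chosen by hand to give a stable coexistence fixed point, not taken from the paper) integrates two excitatory populations competing through a shared inhibitory one:

```python
# Generalized Lotka-Volterra dynamics: dx_i/dt = x_i * (b_i + sum_j A_ij x_j)
# for populations [E1, E2, I]. Couplings are illustrative: both E populations
# excite I, I inhibits both E, and every population self-limits.
b = [1.0, 1.0, -1.0]
A = [[-0.5, 0.0, -1.0],
     [0.0, -0.5, -1.0],
     [1.0, 1.0, -0.5]]

x = [0.5, 0.9, 0.3]                    # initial population activities
dt, steps = 0.01, 20000                # forward-Euler integration to t = 200
for _ in range(steps):
    growth = [b[i] + sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
    x = [xi + dt * xi * gi for xi, gi in zip(x, growth)]

# For this coupling choice, the analytical coexistence fixed point solves
# b + A x* = 0 and equals x* = (2/3, 2/3, 2/3).
```

Sweeping the inter-population couplings in A and tracking how this fixed point loses stability is exactly the kind of bifurcation analysis the paper matches against spiking-network simulations.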
Affiliation(s)
- Fereshteh Lagzi
  Bernstein Center Freiburg, Freiburg, Germany; Faculty of Biology, University of Freiburg, Freiburg, Germany
- Fatihcan M Atay
  Department of Mathematics, Bilkent University, Ankara, Turkey
- Stefan Rotter
  Bernstein Center Freiburg, Freiburg, Germany; Faculty of Biology, University of Freiburg, Freiburg, Germany
48
Conductance-Based Refractory Density Approach for a Population of Bursting Neurons. Bull Math Biol 2019;81:4124-4143. PMID: 31313084; DOI: 10.1007/s11538-019-00643-8.
Abstract
The conductance-based refractory density (CBRD) approach is a parsimonious mathematical-computational framework for modelling interacting populations of regular spiking neurons which, however, has not yet been extended to a population of bursting neurons. The canonical CBRD method describes the firing activity of a statistical ensemble of uncoupled Hodgkin-Huxley-like neurons (differentiated by noise) and has demonstrated its validity against experimental data. The present manuscript generalises the CBRD to a population of bursting neurons; in this pilot computational study, we consider the simplest setting, in which each individual neuron is governed by piecewise linear bursting dynamics. The resulting population model makes use of slow-fast analysis, which leads to a novel methodology combining the CBRD with the theory of multiple-timescale dynamics. The main prospect is that it opens novel avenues for mathematical exploration, as well as the derivation of more sophisticated population activity from Hodgkin-Huxley-like bursting neurons, allowing the theory to capture synchronised bursting activity in hyper-excitable brain states (e.g. at the onset of epilepsy).
49
Roe AW. Columnar connectome: toward a mathematics of brain function. Netw Neurosci 2019;3:779-791. PMID: 31410379; PMCID: PMC6663318; DOI: 10.1162/netn_a_00088.
Abstract
Understanding brain networks is important for many fields, including neuroscience, psychology, medicine, and artificial intelligence. To address this fundamental need, there are multiple ongoing connectome projects in the United States, Europe, and Asia producing brain connection maps with resolutions at macro- and microscales. However, still lacking is a mesoscale connectome. This viewpoint (1) explains the need for a mesoscale connectome in the primate brain (the columnar connectome), (2) presents a new method for acquiring such data rapidly on a large scale, and (3) proposes how one might use such data to achieve a mathematics of brain function.
Affiliation(s)
- Anna Wang Roe
  Institute of Interdisciplinary Neuroscience and Technology, Zhejiang University, Hangzhou, China
50
Kuehn C, Tölle JM. A gradient flow formulation for the stochastic Amari neural field model. J Math Biol 2019;79:1227-1252. PMID: 31214776; DOI: 10.1007/s00285-019-01393-w.
Abstract
We study stochastic Amari-type neural field equations, which are mean-field models for neural activity in the cortex. We prove that under certain assumptions on the coupling kernel, the neural field model can be viewed as a gradient flow in a nonlocal Hilbert space. This makes all gradient flow methods available for the analysis, which could previously not be used, as it was not known, whether a rigorous gradient flow formulation exists. We show that the equation is well-posed in the nonlocal Hilbert space in the sense that solutions starting in this space also remain in it for all times and space-time regularity results hold for the case of spatially correlated noise. Uniqueness of invariant measures, ergodic properties for the associated Feller semigroups, and several examples of kernels are also discussed.
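The deterministic Amari equation that the paper lifts to the stochastic setting is ∂_t u(x,t) = -u(x,t) + ∫ w(x-y) f(u(y,t)) dy, and one of its hallmark solutions, a stationary bump, is easy to reproduce numerically. The sketch below (illustrative Mexican-hat kernel and Heaviside firing function of my choosing; nothing here reflects the paper's gradient-flow machinery) relaxes a localized initial condition to a bump on a ring:

```python
import math

n, length = 80, 10.0                   # grid points, domain size (ring)
dx = length / n
theta = 0.3                            # firing threshold of the Heaviside rate

def kernel(d):
    """Mexican-hat coupling: narrow excitation minus broader inhibition."""
    return 3.0 * math.exp(-d * d / 0.36) - 1.5 * math.exp(-d * d / 1.44)

# Precompute ring-distance kernel weights (dx absorbs the integral measure).
W = [[kernel(min(abs(i - j), n - abs(i - j)) * dx) * dx for j in range(n)]
     for i in range(n)]

# Localized initial activation around the middle of the ring.
u = [1.0 if abs(i - n // 2) * dx < 1.0 else 0.0 for i in range(n)]

dt = 0.05
for _ in range(300):                   # forward Euler up to t = 15
    f = [1.0 if ui > theta else 0.0 for ui in u]
    conv = [sum(W[i][j] * f[j] for j in range(n)) for i in range(n)]
    u = [ui + dt * (-ui + ci) for ui, ci in zip(u, conv)]

above = sum(1 for ui in u if ui > theta)   # width of the surviving bump
```

The paper's result concerns the stochastic version of this flow: with spatially correlated noise added, the dynamics can still be written as a gradient flow in a nonlocal Hilbert space, which is what unlocks well-posedness and ergodicity arguments.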
Affiliation(s)
- Christian Kuehn
  Research Unit "Multiscale and Stochastic Dynamics", Faculty of Mathematics, Technical University of Munich, 85748 Garching bei München, Germany
- Jonas M Tölle
  Institut für Mathematik, Universität Augsburg, 86135 Augsburg, Germany