1
Zendrikov D, Paraskevov A. The vitals for steady nucleation maps of spontaneous spiking coherence in autonomous two-dimensional neuronal networks. Neural Netw 2024; 180:106589. PMID: 39217864. DOI: 10.1016/j.neunet.2024.106589.
Abstract
Thin pancake-like neuronal networks cultured on top of a planar microelectrode array have been extensively explored in neuroengineering as a substrate for a mobile robot's control unit, i.e., as a cyborg's brain. Most of these attempts failed due to intricate self-organizing dynamics in the neuronal systems. In particular, the networks may exhibit an emergent spatial map of steady nucleation sites ("n-sites") of spontaneous population spikes. Being unpredictable and independent of the surface electrode locations, the n-sites drastically change the local ability of the network to generate spikes. Here, using a spiking neuronal network model with a generative spatially-embedded connectome, we systematically show in simulations that the number, location, and relative activity of spontaneously formed n-sites ("the vitals") crucially depend on the samplings of three distributions: (1) the network distribution of neuronal excitability, (2) the distribution of connections between neurons of the network, and (3) the distribution of maximal amplitudes of a single synaptic current pulse. Moreover, blocking the dynamics of a small fraction (about 4%) of non-pacemaker neurons having the highest excitability was enough to completely suppress the occurrence of population spikes and their n-sites. This key result is explained theoretically. Remarkably, the n-sites emerge with only short-term synaptic plasticity, i.e., without Hebbian-type plasticity. As the spiking network model used in this study is strictly deterministic, all simulation results can be accurately reproduced. The model, which has already demonstrated a very high richness-to-complexity ratio, can also be directly extended to the three-dimensional case, e.g., for targeting peculiarities of spiking dynamics in cerebral (or brain) organoids. We recommend the model as an excellent illustrative tool for teaching network-level computational neuroscience, complementing a few benchmark models.
Affiliation(s)
- Dmitrii Zendrikov
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, 8057 Zurich, Switzerland.
2
Pietras B. Pulse Shape and Voltage-Dependent Synchronization in Spiking Neuron Networks. Neural Comput 2024; 36:1476-1540. PMID: 39028958. DOI: 10.1162/neco_a_01680.
Abstract
Pulse-coupled spiking neural networks are a powerful tool to gain mechanistic insights into how neurons self-organize to produce coherent collective behavior. These networks use simple spiking neuron models, such as the θ-neuron or the quadratic integrate-and-fire (QIF) neuron, that replicate the essential features of real neural dynamics. Interactions between neurons are modeled with infinitely narrow pulses, or spikes, rather than the more complex dynamics of real synapses. To make these networks biologically more plausible, it has been proposed that they must also account for the finite width of the pulses, which can have a significant impact on the network dynamics. However, the derivation and interpretation of these pulses are contradictory, and the impact of the pulse shape on the network dynamics is largely unexplored. Here, I take a comprehensive approach to pulse coupling in networks of QIF and θ-neurons. I argue that narrow pulses activate voltage-dependent synaptic conductances and show how to implement them in QIF neurons such that their effect can last through the phase after the spike. Using an exact low-dimensional description for networks of globally coupled spiking neurons, I prove for instantaneous interactions that collective oscillations emerge due to an effective coupling through the mean voltage. I analyze the impact of the pulse shape by means of a family of smooth pulse functions with arbitrary finite width and symmetric or asymmetric shapes. For symmetric pulses, the resulting voltage coupling is not very effective in synchronizing neurons, but pulses that are slightly skewed to the phase after the spike readily generate collective oscillations. The results unveil a voltage-dependent spike synchronization mechanism at the heart of emergent collective behavior, which is facilitated by pulses of finite width and complementary to traditional synaptic transmission in spiking neuron networks.
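For orientation, the pulse-coupled QIF dynamics discussed in the abstract can be sketched in a few lines. The toy pair below couples two QIF neurons through exponentially decaying synaptic pulses of finite width; all parameter values are illustrative round numbers, not taken from the paper.

```python
def simulate_qif_pair(I=1.0, J=0.5, tau_s=0.2, dt=1e-3, T=20.0,
                      v_peak=50.0, v_reset=-50.0):
    """Two pulse-coupled quadratic integrate-and-fire (QIF) neurons.

    Each neuron obeys dv/dt = v^2 + I + J*s_other, where s is a synaptic
    variable that decays with time constant tau_s and is kicked by
    1/tau_s at each presynaptic spike: a finite-width pulse, in contrast
    to the delta pulses that the abstract argues are too idealized.
    """
    v = [0.0, 1.0]            # slightly different initial conditions
    s = [0.0, 0.0]
    spikes = [[], []]
    t = 0.0
    while t < T:
        dv = [v[i] ** 2 + I + J * s[1 - i] for i in range(2)]
        for i in range(2):
            v[i] += dt * dv[i]
            s[i] -= dt * s[i] / tau_s
            if v[i] >= v_peak:      # spike: reset and emit a pulse
                v[i] = v_reset
                s[i] += 1.0 / tau_s
                spikes[i].append(t)
        t += dt
    return spikes

spikes = simulate_qif_pair()
rates = [len(sp) / 20.0 for sp in spikes]
```

With excitatory coupling (J > 0) each neuron fires slightly faster than the uncoupled period of roughly pi/sqrt(I); the sketch only shows the mechanics, not the exact mean-field analysis of the paper.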
Affiliation(s)
- Bastian Pietras
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08018, Barcelona, Spain
3
Ashida G, Wang T, Kretzberg J. Integrate-and-fire-type models of the lateral superior olive. PLoS One 2024; 19:e0304832. PMID: 38900820. PMCID: PMC11189240. DOI: 10.1371/journal.pone.0304832.
Abstract
Neurons of the lateral superior olive (LSO) in the auditory brainstem play a fundamental role in binaural sound localization. Previous theoretical studies developed various types of neuronal models to study the physiological functions of the LSO. These models were usually tuned to a small set of physiological data with specific aims in mind. Therefore, it is unclear whether and how they can be related to each other, how widely applicable they are, and which model is suitable for what purposes. In this study, we address these questions for six different single-compartment integrate-and-fire (IF) type LSO models. The models are divided into two groups depending on their subthreshold responses: passive (linear) models with only the leak conductance and active (nonlinear) models with an additional low-voltage-activated potassium conductance that is prevalent in the auditory system. Each of these two groups is further subdivided into three subtypes according to the spike generation mechanism: one with simple threshold-crossing detection and voltage reset, one with threshold-crossing detection plus a current to mimic spike shapes, and one with a depolarizing exponential current for spiking. In our simulations, all six models were driven by identical synaptic inputs and calibrated with common criteria for binaural tuning. The resulting spike rates of the passive models were higher for intensive inputs and lower for temporally structured inputs than those of the active models, confirming the active function of the potassium current. Within each passive or active group, the simulated responses resembled each other, regardless of the spike generation types. These results, in combination with the analysis of computational costs, indicate that an active IF model is more suitable than a passive model for accurately reproducing temporal coding in the LSO. The simulation of realistic spike shapes with an extended spiking mechanism added relatively small computational costs.
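The passive-versus-active distinction can be illustrated with a minimal single-compartment sketch. All parameter values below are illustrative round numbers, not the calibrated values from the paper, and the low-voltage-activated potassium (KLT) activation is simplified to be instantaneous.

```python
import math

def lif_response(I_ext, g_klt=0.0, dt=1e-5, T=0.05):
    """Single-compartment IF neuron in the spirit of the compared LSO
    models: a leak conductance plus an optional low-voltage-activated
    potassium (KLT) conductance with instantaneous activation.
    Units: SI (farads, siemens, volts, amperes, seconds)."""
    C, g_leak, E_leak, E_K = 1e-10, 1e-8, -0.065, -0.090
    v_th, v_reset = -0.050, -0.065
    v = E_leak
    n_spikes = 0
    t = 0.0
    while t < T:
        # sigmoidal KLT activation, half-activated near -58 mV
        w = 1.0 / (1.0 + math.exp(-(v + 0.058) / 0.002))
        dv = (-g_leak * (v - E_leak) - g_klt * w * (v - E_K) + I_ext) / C
        v += dt * dv
        if v >= v_th:
            n_spikes += 1
            v = v_reset
        t += dt
    return n_spikes

passive = lif_response(I_ext=3e-10)               # leak only
active = lif_response(I_ext=3e-10, g_klt=2e-8)    # leak + KLT
```

For this sustained ("intensive") input the KLT conductance opposes depolarization near threshold, so the active model fires fewer spikes than the passive one, qualitatively matching the comparison reported in the abstract.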
Affiliation(s)
- Go Ashida
- Faculty 6, Department of Neuroscience, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Cluster of Excellence "Hearing4all", Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Tiezhi Wang
- Faculty 6, Department of Neuroscience, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Faculty 6, Department of Health Services Research, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Jutta Kretzberg
- Faculty 6, Department of Neuroscience, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Cluster of Excellence "Hearing4all", Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
- Research Center Neurosensory Science, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany
4
Choudhary K, Berberich S, Hahn TTG, McFarland JM, Mehta MR. Spontaneous persistent activity and inactivity in vivo reveals differential cortico-entorhinal functional connectivity. Nat Commun 2024; 15:3542. PMID: 38719802. PMCID: PMC11079062. DOI: 10.1038/s41467-024-47617-6.
Abstract
Understanding the functional connectivity between brain regions and its emergent dynamics is a central challenge. Here we present a theory-experiment hybrid approach involving iteration between a minimal computational model and in vivo electrophysiological measurements. Our model not only predicted spontaneous persistent activity (SPA) during Up-Down-State oscillations, but also inactivity (SPI), which had never been reported. These were confirmed in vivo in the membrane potential of neurons, especially from layer 3 of the medial and lateral entorhinal cortices. The data were then used to constrain two free parameters, yielding a unique, experimentally determined model for each neuron. Analytic and computational analysis of the model generated a dozen quantitative predictions about network dynamics, which were all confirmed in vivo to high accuracy. Our technique predicted functional connectivity, e.g., that recurrent excitation is stronger in the medial than in the lateral entorhinal cortex. This too was confirmed with connectomics data. This technique uncovers how differential cortico-entorhinal dialogue generates SPA and SPI, which could form an energetically efficient working-memory substrate and influence the consolidation of memories during sleep. More broadly, our procedure can reveal the functional connectivity of large networks and a theory of their emergent dynamics.
Affiliation(s)
- Krishna Choudhary
- Department of Physics and Astronomy, University of California, Los Angeles, Los Angeles, CA, USA
- HRL Laboratories, Malibu, CA, USA
- Sven Berberich
- Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Department of Psychiatry and Psychotherapy, University Medical Center, Johannes Gutenberg University, Mainz, Germany
- Mayank R Mehta
- Department of Physics and Astronomy, University of California, Los Angeles, Los Angeles, CA, USA.
- W. M. Keck Center for Neurophysics, University of California, Los Angeles, CA, USA.
- Department of Electrical and Computer Engineering, University of California, Los Angeles, CA, USA.
- Departments of Neurology and Neurobiology, University of California, Los Angeles, Los Angeles, CA, USA.
5
Becker MP, Idiart MAP. Mean-field method for generic conductance-based integrate-and-fire neurons with finite timescales. Phys Rev E 2024; 109:024406. PMID: 38491595. DOI: 10.1103/physreve.109.024406.
Abstract
The construction of transfer functions in theoretical neuroscience plays an important role in determining the spiking rate behavior of neurons in networks. These functions can be obtained through various fitting methods, but the biological relevance of the parameters is not always clear. However, for stationary inputs, such functions can be obtained without the adjustment of free parameters by using mean-field methods. In this work, we expand current Fokker-Planck approaches to account for the concurrent influence of colored and multiplicative noise terms on generic conductance-based integrate-and-fire neurons. We reduce the resulting stochastic system through the application of the diffusion approximation to a one-dimensional Langevin equation. An effective Fokker-Planck equation is then constructed using Fox theory, which is solved numerically using a newly developed double integration procedure to obtain the transfer function and the membrane potential distribution. The solution is capable of reproducing the transfer function and the stationary voltage distribution of simulated neurons across a wide range of parameters. The method can also be easily extended to account for different sources of noise with various multiplicative terms, and, in principle, it can be applied to other types of problems.
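The kind of stochastic system being reduced can be sketched as a conductance-based LIF neuron driven by an Ornstein-Uhlenbeck excitatory conductance (colored noise that enters multiplicatively through the driving force), integrated with Euler-Maruyama. This shows the simulation side only, not the Fokker-Planck solution developed in the paper; all parameters are illustrative.

```python
import math
import random

def simulate_cond_lif(seed=1, dt=1e-4, T=5.0):
    """Conductance-based LIF with an Ornstein-Uhlenbeck excitatory
    conductance g (in units of the leak conductance). The noise is
    colored (correlation time tau_g) and multiplicative, since g
    multiplies the voltage-dependent driving force (E_exc - v)."""
    rng = random.Random(seed)
    tau_m, E_leak, E_exc = 0.02, -0.070, 0.0
    v_th, v_reset = -0.050, -0.065
    g0, sigma_g, tau_g = 0.5, 0.2, 0.005
    v, g = E_leak, g0
    vs = []
    t, n_spikes = 0.0, 0
    while t < T:
        # Euler-Maruyama step of the OU conductance process
        g += dt * (g0 - g) / tau_g \
             + sigma_g * math.sqrt(2 * dt / tau_g) * rng.gauss(0.0, 1.0)
        g = max(g, 0.0)                      # conductances stay nonnegative
        v += dt * (-(v - E_leak) + g * (E_exc - v)) / tau_m
        if v >= v_th:
            n_spikes += 1
            v = v_reset
        vs.append(v)
        t += dt
    mean_v = sum(vs) / len(vs)
    return mean_v, n_spikes

mean_v, n_spikes = simulate_cond_lif()
```

Histogramming `vs` would give the simulated stationary voltage distribution that the paper's double-integration procedure reproduces analytically.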
Affiliation(s)
- Marcelo P Becker
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Marco A P Idiart
- Department of Physics, Institute of Physics, Federal University of Rio Grande do Sul, Porto Alegre, Brazil
6
Laing CR, Omel’chenko OE. Periodic solutions in next generation neural field models. Biol Cybern 2023; 117:259-274. PMID: 37535104. PMCID: PMC10600056. DOI: 10.1007/s00422-023-00969-6.
Abstract
We consider a next generation neural field model which describes the dynamics of a network of theta neurons on a ring. For some parameters the network supports stable time-periodic solutions. Using the fact that the dynamics at each spatial location are described by a complex-valued Riccati equation, we derive a self-consistency equation that such periodic solutions must satisfy. We determine the stability of these solutions, and present numerical results to illustrate the usefulness of this technique. The generality of this approach is demonstrated through its application to several other systems involving delays, a two-population architecture, and networks of Winfree oscillators.
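For context, the theta neuron underlying such next generation models is equivalent to the QIF neuron under a standard change of variables; this is a textbook identity, not the paper's self-consistency derivation:

```latex
\dot{\theta} \;=\; (1-\cos\theta) \;+\; (1+\cos\theta)\,\eta,
\qquad
v \;=\; \tan(\theta/2)
\;\;\Longrightarrow\;\;
\dot{v} \;=\; v^{2} + \eta .
```

A spike corresponds to the phase crossing $\theta=\pi$, i.e., $v \to +\infty$ followed by re-entry from $-\infty$, which is why exact mean-field reductions carry over between the two formulations.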
Affiliation(s)
- Carlo R. Laing
- School of Mathematical and Computational Sciences, Massey University, Private Bag 102-904 NSMC, Auckland, New Zealand
- Oleh E. Omel’chenko
- Institute of Physics and Astronomy, University of Potsdam, Karl-Liebknecht-Str. 24/25, 14476 Potsdam, Germany
7
Chen S, Zhang T, Tappertzhofen S, Yang Y, Valov I. Electrochemical-Memristor-Based Artificial Neurons and Synapses-Fundamentals, Applications, and Challenges. Adv Mater 2023; 35:e2301924. PMID: 37199224. DOI: 10.1002/adma.202301924.
Abstract
Artificial neurons and synapses are considered essential for the progress of future brain-inspired computing based on beyond-von Neumann architectures. Here, a discussion on the common electrochemical fundamentals of biological and artificial cells is provided, focusing on their similarities with the redox-based memristive devices. The driving forces behind the functionalities and the ways to control them by an electrochemical-materials approach are presented. Factors such as the chemical symmetry of the electrodes, doping of the solid electrolyte, concentration gradients, and excess surface energy are discussed as essential to understand, predict, and design artificial neurons and synapses. A variety of two- and three-terminal memristive devices and memristive architectures are presented and their application for solving various problems is shown. The work provides an overview of the current understandings on the complex processes of neural signal generation and transmission in both biological and artificial cells and presents the state-of-the-art applications, including signal transmission between biological and artificial cells. This example showcases the possibility of creating bioelectronic interfaces and integrating artificial circuits into biological systems. Prospects and challenges of the modern technology toward low-power, high-information-density circuits are highlighted.
Affiliation(s)
- Shaochuan Chen
- Institute of Materials in Electrical Engineering 2 (IWE2), RWTH Aachen University, Sommerfeldstraße 24, 52074, Aachen, Germany
- Teng Zhang
- Key Laboratory of Microelectronic Devices and Circuits (MOE), School of Integrated Circuits, Peking University, Beijing, 100871, China
- Stefan Tappertzhofen
- Chair for Micro- and Nanoelectronics, Department of Electrical Engineering and Information Technology, TU Dortmund University, Martin-Schmeisser-Weg 4-6, D-44227, Dortmund, Germany
- Yuchao Yang
- Key Laboratory of Microelectronic Devices and Circuits (MOE), School of Integrated Circuits, Peking University, Beijing, 100871, China
- School of Electronic and Computer Engineering, Peking University, Shenzhen, 518055, China
- Center for Brain Inspired Intelligence, Chinese Institute for Brain Research (CIBR), Beijing, 102206, China
- Ilia Valov
- Peter Grünberg Institute (PGI-7), Forschungszentrum Jülich, Wilhelm-Johnen-Straße, 52425, Jülich, Germany
- Institute of Electrochemistry and Energy Systems "Acad. E. Budewski", Bulgarian Academy of Sciences, Acad. G. Bonchev 10, 1113, Sofia, Bulgaria
8
Chialva U, González Boscá V, Rotstein HG. Low-dimensional models of single neurons: a review. Biol Cybern 2023; 117:163-183. PMID: 37060453. DOI: 10.1007/s00422-023-00960-1.
Abstract
The classical Hodgkin-Huxley (HH) point-neuron model of action potential generation is four-dimensional. It consists of four ordinary differential equations describing the dynamics of the membrane potential and of three gating variables associated with a transient sodium current and a delayed-rectifier potassium current. Conductance-based models of HH type are higher-dimensional extensions of the classical HH model. They include a number of supplementary state variables associated with other ionic current types, and are able to describe additional phenomena such as subthreshold oscillations, mixed-mode oscillations (subthreshold oscillations interspersed with spikes), clustering, and bursting. In this manuscript we discuss biophysically plausible and phenomenological reduced models that preserve the biophysical and/or dynamic description of models of HH type and the ability to produce complex phenomena, but with a lower number of effective dimensions (state variables). We describe several representative models. We also describe systematic and heuristic methods for deriving reduced models from models of HH type.
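A canonical example of such a phenomenological reduction is the two-variable FitzHugh-Nagumo model. The sketch below uses standard textbook parameter values (chosen here for illustration, not taken from the review) and shows the relaxation oscillations that survive the reduction from four dimensions to two.

```python
def fitzhugh_nagumo(I=0.5, eps=0.08, a=0.7, b=0.8, dt=0.01, T=200.0):
    """FitzHugh-Nagumo: a two-variable reduction of HH-type dynamics.
    v is the fast voltage-like variable, w the slow recovery variable
    standing in for the gating dynamics of the full model."""
    v, w = -1.0, -0.5
    vs = []
    t = 0.0
    while t < T:
        v += dt * (v - v ** 3 / 3.0 - w + I)   # fast cubic nullcline
        w += dt * eps * (v + a - b * w)        # slow linear recovery
        vs.append(v)
        t += dt
    return vs

vs = fitzhugh_nagumo()
```

With this input the unique fixed point is unstable, so the trajectory settles onto a limit cycle whose voltage excursions span roughly v in [-2, 2], the spike-like oscillations the reduction is designed to retain.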
Affiliation(s)
- Ulises Chialva
- Departamento de Matemática, Universidad Nacional del Sur and CONICET, Bahía Blanca, Buenos Aires, Argentina
- Horacio G Rotstein
- Federated Department of Biological Sciences, New Jersey Institute of Technology and Rutgers University, Newark, New Jersey, USA.
- Behavioral Neurosciences Program, Rutgers University, Newark, NJ, USA.
- Corresponding Investigators Group, CONICET, Buenos Aires, Argentina.
9
Clusella P, Köksal-Ersöz E, Garcia-Ojalvo J, Ruffini G. Comparison between an exact and a heuristic neural mass model with second-order synapses. Biol Cybern 2023; 117:5-19. PMID: 36454267. PMCID: PMC10160168. DOI: 10.1007/s00422-022-00952-7.
Abstract
Neural mass models (NMMs) are designed to reproduce the collective dynamics of neuronal populations. A common framework for NMMs assumes heuristically that the output firing rate of a neural population can be described by a static nonlinear transfer function (NMM1). However, a recent exact mean-field theory for quadratic integrate-and-fire (QIF) neurons challenges this view by showing that the mean firing rate is not a static function of the neuronal state but follows two coupled nonlinear differential equations (NMM2). Here we analyze and compare these two descriptions in the presence of second-order synaptic dynamics. First, we derive the mathematical equivalence between the two models in the infinitely slow synapse limit, i.e., we show that NMM1 is an approximation of NMM2 in this regime. Next, we evaluate the applicability of this limit in the context of realistic physiological parameter values by analyzing the dynamics of models with inhibitory or excitatory synapses. We show that NMM1 fails to reproduce important dynamical features of the exact model, such as the self-sustained oscillations of an inhibitory interneuron QIF network. Furthermore, in the exact model but not in the limit one, stimulation of a pyramidal cell population induces resonant oscillatory activity whose peak frequency and amplitude increase with the self-coupling gain and the external excitatory input. This may play a role in the enhanced response of densely connected networks to weak uniform inputs, such as the electric fields produced by noninvasive brain stimulation.
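For reference, the exact mean-field description (NMM2) with instantaneous synapses consists of two coupled ODEs for the population firing rate r and mean voltage v, due to Montbrió, Pazó, and Roxin (2015). The sketch below integrates them with illustrative parameters; the paper's analysis adds second-order synaptic dynamics on top of this core.

```python
import math

def simulate_nmm2(eta=-5.0, J=15.0, delta=1.0, I=0.0, dt=1e-3, T=50.0):
    """Exact QIF mean-field ("NMM2") with instantaneous synapses:
        dr/dt = delta/pi + 2*r*v
        dv/dt = v^2 + eta + J*r + I - (pi*r)^2
    eta is the center and delta the width of the Lorentzian excitability
    distribution, J the recurrent coupling. Parameters illustrative."""
    r, v = 0.1, -2.0
    t = 0.0
    while t < T:
        dr = delta / math.pi + 2.0 * r * v
        dv = v * v + eta + J * r + I - (math.pi * r) ** 2
        r += dt * dr
        v += dt * dv
        t += dt
    return r, v

r_ss, v_ss = simulate_nmm2()
```

At any fixed point dr/dt = 0 forces v = -delta/(2*pi*r), so the stationary mean voltage is negative whenever the rate is positive; the firing rate is a dynamical variable here, not a static function of the state, which is exactly the contrast with NMM1 drawn in the abstract.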
Affiliation(s)
- Pau Clusella
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain.
- Elif Köksal-Ersöz
- LTSI - UMR 1099, INSERM, Univ Rennes, Campus Beaulieu, 35000, Rennes, France
- Jordi Garcia-Ojalvo
- Department of Medicine and Life Sciences, Universitat Pompeu Fabra, Barcelona Biomedical Research Park, 08003, Barcelona, Spain
- Giulio Ruffini
- Brain Modeling Department, Neuroelectrics, Av. Tibidabo, 47b, 08035, Barcelona, Spain.
10
Barton A, Volna E, Kotyrba M, Jarusek R. Proposal of a Control Algorithm for Multiagent Cooperation Using Spiking Neural Networks. IEEE Trans Neural Netw Learn Syst 2023; 34:2016-2027. PMID: 34449399. DOI: 10.1109/tnnls.2021.3105800.
Abstract
The study deals with the issue of using spiking neural networks (SNNs) in multiagent systems. The research objective is to propose a control algorithm for the cooperation of a group of agents using SNNs, applying the Izhikevich neuron model and plasticity that depends on the timing of action potentials (spike-timing-dependent plasticity). The proposed method has been verified and experimentally tested, proving numerous advantages over second-generation networks. The advantages and the application in real systems are described in the research conclusions.
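The Izhikevich model applied in the study is compact enough to sketch directly. The following uses the standard regular-spiking parameters from Izhikevich (2003); the constant input current is chosen here for illustration and is not taken from the paper.

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, T=1000.0):
    """Izhikevich neuron with regular-spiking parameters (a, b, c, d).
    v is the membrane potential (mV), u the recovery variable;
    time is in ms, and a spike is registered when v reaches 30 mV."""
    v, u = -65.0, b * (-65.0)
    spike_times = []
    t = 0.0
    while t < T:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:              # spike: reset v, bump recovery
            spike_times.append(t)
            v, u = c, u + d
        t += dt
    return spike_times

spike_times = izhikevich()
```

Swapping the (a, b, c, d) quadruple reproduces the other canonical firing classes (fast-spiking, bursting, etc.), which is what makes this model a popular choice for agent controllers.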
11
Manninen T, Aćimović J, Linne ML. Analysis of Network Models with Neuron-Astrocyte Interactions. Neuroinformatics 2023; 21:375-406. PMID: 36959372. PMCID: PMC10085960. DOI: 10.1007/s12021-023-09622-w.
Abstract
Neural networks, composed of many neurons and governed by complex interactions between them, are a widely accepted formalism for modeling and exploring global dynamics and emergent properties in brain systems. In the past decades, experimental evidence of computationally relevant neuron-astrocyte interactions, as well as the astrocytic modulation of global neural dynamics, have accumulated. These findings motivated advances in computational glioscience and inspired several models integrating mechanisms of neuron-astrocyte interactions into the standard neural network formalism. These models were developed to study, for example, synchronization, information transfer, synaptic plasticity, and hyperexcitability, as well as classification tasks and hardware implementations. We here focus on network models of at least two neurons interacting bidirectionally with at least two astrocytes that include explicitly modeled astrocytic calcium dynamics. In this study, we analyze the evolution of these models and the biophysical, biochemical, cellular, and network mechanisms used to construct them. Based on our analysis, we propose how to systematically describe and categorize interaction schemes between cells in neuron-astrocyte networks. We additionally study the models in view of the existing experimental data and present future perspectives. Our analysis is an important first step towards understanding astrocytic contribution to brain functions. However, more advances are needed to collect comprehensive data about astrocyte morphology and physiology in vivo and to better integrate them in data-driven computational models. Broadening the discussion about theoretical approaches and expanding the computational tools is necessary to better understand astrocytes' roles in brain functions.
Affiliation(s)
- Tiina Manninen
- Faculty of Medicine and Health Technology, Tampere University, Korkeakoulunkatu 3, FI-33720, Tampere, Finland.
- Jugoslava Aćimović
- Faculty of Medicine and Health Technology, Tampere University, Korkeakoulunkatu 3, FI-33720, Tampere, Finland
- Marja-Leena Linne
- Faculty of Medicine and Health Technology, Tampere University, Korkeakoulunkatu 3, FI-33720, Tampere, Finland.
12
Weir JS, Christiansen N, Sandvig A, Sandvig I. Selective inhibition of excitatory synaptic transmission alters the emergent bursting dynamics of in vitro neural networks. Front Neural Circuits 2023; 17:1020487. PMID: 36874945. PMCID: PMC9978115. DOI: 10.3389/fncir.2023.1020487.
Abstract
Neurons in vitro connect to each other and form neural networks that display emergent electrophysiological activity. This activity begins as spontaneous uncorrelated firing in the early phase of development, and as functional excitatory and inhibitory synapses mature, the activity typically emerges as spontaneous network bursts. Network bursts are events of coordinated global activation among many neurons interspersed with periods of silencing and are important for synaptic plasticity, neural information processing, and network computation. While bursting is the consequence of balanced excitatory-inhibitory (E/I) interactions, the functional mechanisms underlying their evolution from physiological to potentially pathophysiological states, such as decreasing or increasing in synchrony, are still poorly understood. Synaptic activity, especially that related to maturity of E/I synaptic transmission, is known to strongly influence these processes. In this study, we used selective chemogenetic inhibition to target and disrupt excitatory synaptic transmission in in vitro neural networks to study functional response and recovery of spontaneous network bursts over time. We found that over time, inhibition resulted in increases in both network burstiness and synchrony. Our results indicate that the disruption in excitatory synaptic transmission during early network development likely affected inhibitory synaptic maturity which resulted in an overall decrease in network inhibition at later stages. These findings lend support to the importance of E/I balance in maintaining physiological bursting dynamics and, conceivably, information processing capacity in neural networks.
Affiliation(s)
- Janelle Shari Weir
- Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway
- Nicholas Christiansen
- Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway
- Axel Sandvig
- Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway
- Department of Neurology and Clinical Neurophysiology, St. Olav's University Hospital, Trondheim, Norway
- Division of Neuro, Head and Neck, Department of Pharmacology and Clinical Neurosciences, Umeå University Hospital, Umeå, Sweden
- Division of Neuro, Head and Neck, Department of Community Medicine and Rehabilitation, Umeå University Hospital, Umeå, Sweden
- Ioanna Sandvig
- Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway
13
Khona M, Fiete IR. Attractor and integrator networks in the brain. Nat Rev Neurosci 2022; 23:744-766. DOI: 10.1038/s41583-022-00642-0.
14
Thivierge JP, Giraud E, Lynn M, Théberge Charbonneau A. Key role of neuronal diversity in structured reservoir computing. Chaos 2022; 32:113130. PMID: 36456321. DOI: 10.1063/5.0111131.
Abstract
Chaotic time series have been captured by reservoir computing models composed of a recurrent neural network whose output weights are trained in a supervised manner. These models, however, are typically limited to randomly connected networks of homogeneous units. Here, we propose a new class of structured reservoir models that incorporates a diversity of cell types and their known connections. In a first version of the model, the reservoir was composed of mean-rate units separated into pyramidal, parvalbumin, and somatostatin cells. Stability analysis of this model revealed two distinct dynamical regimes, namely, (i) an inhibition-stabilized network (ISN) where strong recurrent excitation is balanced by strong inhibition and (ii) a non-ISN network with weak excitation. These results were extended to a leaky integrate-and-fire model that captured different cell types along with their network architecture. ISN and non-ISN reservoir networks were trained to relay and generate a chaotic Lorenz attractor. Despite their increased performance, ISN networks operate in a regime of activity near the limits of stability where external perturbations yield a rapid divergence in output. The proposed framework of structured reservoir computing opens avenues for exploring how neural microcircuits can balance performance and stability when representing time series through distinct dynamical regimes.
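The generic reservoir-computing recipe, a fixed random recurrent network in which only the readout weights are trained, can be sketched as follows. This toy version predicts a sine wave one step ahead with an online least-mean-squares readout; the paper instead trains structured, cell-type-diverse reservoirs on a Lorenz attractor and typically uses batch supervised regression, so everything below is an illustrative assumption.

```python
import math
import random

def train_esn(n_res=30, rho=0.9, t_train=400, lr=0.05, seed=0):
    """Minimal unstructured echo-state reservoir with a trained linear
    readout. Returns the per-step absolute prediction errors."""
    rng = random.Random(seed)
    W = [[rng.gauss(0.0, 1.0) for _ in range(n_res)] for _ in range(n_res)]
    # Scaling by the largest absolute row sum bounds the spectral radius
    # below rho, giving the fading-memory ("echo state") property.
    scale = rho / max(sum(abs(w) for w in row) for row in W)
    W = [[w * scale for w in row] for row in W]
    w_in = [rng.gauss(0.0, 0.5) for _ in range(n_res)]
    w_out = [0.0] * n_res          # only these weights are trained
    x = [0.0] * n_res
    errors = []
    for t in range(t_train):
        u = math.sin(0.1 * t)
        target = math.sin(0.1 * (t + 1))
        x = [math.tanh(sum(W[i][j] * x[j] for j in range(n_res))
                       + w_in[i] * u)
             for i in range(n_res)]
        y = sum(wo * xi for wo, xi in zip(w_out, x))
        e = target - y
        # online least-mean-squares update of the readout
        w_out = [wo + lr * e * xi for wo, xi in zip(w_out, x)]
        errors.append(abs(e))
    return errors

errors = train_esn()
err_early = sum(errors[:50]) / 50
err_late = sum(errors[-50:]) / 50
```

The prediction error shrinks as the readout learns, while the recurrent weights W never change; that frozen-reservoir/trained-readout split is the property the paper's structured variants inherit.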
Affiliation(s)
- Jean-Philippe Thivierge
- University of Ottawa Brain and Mind Research Institute, 451 Smyth Rd., Ottawa, Ontario K1H 8M5, Canada
- Eloïse Giraud
- School of Psychology, University of Ottawa, 156 Jean-Jacques Lussier, Ottawa, Ontario K1N 6N5, Canada
| | - Michael Lynn
- University of Ottawa Brain and Mind Research Institute, 451 Smyth Rd., Ottawa, Ontario K1H 8M5, Canada
15
Srivastava P, Fotiadis P, Parkes L, Bassett DS. The expanding horizons of network neuroscience: From description to prediction and control. Neuroimage 2022; 258:119250. [PMID: 35659996 PMCID: PMC11164099 DOI: 10.1016/j.neuroimage.2022.119250]
Abstract
The field of network neuroscience has emerged as a natural framework for the study of the brain and has been increasingly applied across divergent problems in neuroscience. From a disciplinary perspective, network neuroscience originally emerged as a formal integration of graph theory (from mathematics) and neuroscience (from biology). This early integration afforded marked utility in describing the interconnected nature of neural units, both structurally and functionally, and underscored the relevance of that interconnection for cognition and behavior. But since its inception, the field has not remained static in its methodological composition. Instead, it has grown to use increasingly advanced graph-theoretic tools and to bring in several other disciplinary perspectives, including machine learning and systems engineering, that have proven complementary. In doing so, the problem space amenable to the discipline has expanded markedly. In this review, we discuss three distinct flavors of investigation in state-of-the-art network neuroscience: (i) descriptive network neuroscience, (ii) predictive network neuroscience, and (iii) a perturbative network neuroscience that draws on recent advances in network control theory. In considering each area, we provide a brief summary of the approaches, discuss the nature of the insights obtained, and highlight future directions.
Affiliation(s)
- Pragya Srivastava
- Department of Bioengineering, University of Pennsylvania, Philadelphia PA 19104, USA
- Panagiotis Fotiadis
- Department of Bioengineering, University of Pennsylvania, Philadelphia PA 19104, USA; Department of Neuroscience, University of Pennsylvania, Philadelphia PA 19104, USA
- Linden Parkes
- Department of Bioengineering, University of Pennsylvania, Philadelphia PA 19104, USA
- Dani S Bassett
- Department of Bioengineering, University of Pennsylvania, Philadelphia PA 19104, USA; Department of Physics & Astronomy, University of Pennsylvania, Philadelphia PA 19104, USA; Department of Electrical & Systems Engineering, University of Pennsylvania, Philadelphia PA 19104, USA; Department of Neurology, University of Pennsylvania, Philadelphia PA 19104, USA; Department of Psychiatry, University of Pennsylvania, Philadelphia PA 19104, USA; Santa Fe Institute, Santa Fe NM 87501, USA.
16
Salfenmoser L, Obermayer K. Nonlinear optimal control of a mean-field model of neural population dynamics. Front Comput Neurosci 2022; 16:931121. [PMID: 35990368 PMCID: PMC9382303 DOI: 10.3389/fncom.2022.931121]
Abstract
We apply the framework of nonlinear optimal control to a biophysically realistic neural mass model, which consists of two mutually coupled populations of deterministic excitatory and inhibitory neurons. External control signals are realized by time-dependent inputs to both populations. Optimality is defined by two alternative cost functions that trade the deviation of the controlled variable from its target value against the “strength” of the control, which is quantified by the integrated 1- and 2-norms of the control signal. We focus on a bistable region in state space where one low- (“down state”) and one high-activity (“up state”) stable fixed points coexist. With methods of nonlinear optimal control, we search for the most cost-efficient control function to switch between both activity states. For a broad range of parameters, we find that cost-efficient control strategies consist of a pulse of finite duration to push the state variables only minimally into the basin of attraction of the target state. This strategy only breaks down once we impose time constraints that force the system to switch on a time scale comparable to the duration of the control pulse. Penalizing control strength via the integrated 1-norm (2-norm) yields control inputs targeting one or both populations. However, whether control inputs to the excitatory or the inhibitory population dominate depends on the location in state space relative to the bifurcation lines. Our study highlights the applicability of nonlinear optimal control to better understand neuronal processing under constraints.
17
Exact mean-field models for spiking neural networks with adaptation. J Comput Neurosci 2022; 50:445-469. [PMID: 35834100 DOI: 10.1007/s10827-022-00825-9]
Abstract
Networks of spiking neurons with adaptation have been shown to be able to reproduce a wide range of neural activities, including the emergent population bursting and spike synchrony that underpin brain disorders and normal function. Exact mean-field models derived from spiking neural networks are extremely valuable, as such models can be used to determine how individual neurons and the network they reside within interact to produce macroscopic network behaviours. In this paper, we derive and analyze a set of exact mean-field equations for a neural network with spike frequency adaptation. Specifically, our model is a network of Izhikevich neurons, where each neuron is modeled by a two-dimensional system consisting of a quadratic integrate-and-fire equation plus an equation which implements spike frequency adaptation. Previous work deriving a mean-field model for this type of network relied on the assumption of sufficiently slow dynamics of the adaptation variable. However, this approximation did not succeed in establishing an exact correspondence between the macroscopic description and the realistic neural network, especially when the adaptation time constant was not large. The challenge lies in how to achieve a closed set of mean-field equations with the inclusion of the mean-field dynamics of the adaptation variable. We address this problem by using a Lorentzian ansatz combined with the moment closure approach to arrive at a mean-field system in the thermodynamic limit. The resulting macroscopic description is capable of qualitatively and quantitatively describing the collective dynamics of the neural network, including transitions between states where the individual neurons exhibit asynchronous tonic firing and synchronous bursting. We extend the approach to a network of two populations of neurons and discuss the accuracy and efficacy of our mean-field approximations by examining all assumptions imposed during the derivation. Numerical bifurcation analysis of our mean-field models reveals bifurcations not previously observed in the models, including a novel mechanism for the emergence of bursting in the network. We anticipate our results will provide a tractable and reliable tool to investigate the underlying mechanisms of brain function and dysfunction from the perspective of computational neuroscience.
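The exact mean-field derivation is the paper's contribution, but the microscopic starting point it describes (a network of quadratic integrate-and-fire neurons with a spike-triggered adaptation variable and Lorentzian-distributed excitabilities) can be sketched directly. All parameters below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Network of quadratic integrate-and-fire neurons with spike-triggered
# adaptation and all-to-all delta-pulse coupling (illustrative parameters).
N, dt, steps = 500, 1e-3, 5000
eta = 1.0 + 0.05 * np.tan(np.pi * (rng.random(N) - 0.5))  # Lorentzian excitabilities
J, tau_w, w_jump = 15.0 / N, 100e-3, 0.1
v_peak, v_reset = 100.0, -100.0

v = rng.normal(0.0, 1.0, N)
w = np.zeros(N)
rate = np.zeros(steps)
for t in range(steps):
    spiking = v >= v_peak
    v[spiking] = v_reset                             # reset past the peak
    w[spiking] += w_jump                             # adaptation jump on spiking
    rate[t] = spiking.sum() / (N * dt)               # instantaneous population rate
    v += dt * (v * v + eta - w) + J * spiking.sum()  # QIF dynamics + coupling
    w -= dt * w / tau_w                              # adaptation decays between spikes
```

A mean-field reduction of the kind derived in the paper replaces the `N` coupled equations above with a few macroscopic variables (firing rate, mean voltage, mean adaptation) in the thermodynamic limit.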
18
Triche A, Maida AS, Kumar A. Exploration in neo-Hebbian reinforcement learning: Computational approaches to the exploration-exploitation balance with bio-inspired neural networks. Neural Netw 2022; 151:16-33. [DOI: 10.1016/j.neunet.2022.03.021]
19
Ferdous ZI, Yu A, Zeng Y, Guo X, Yan Z, Berdichevsky Y. Efficient and Accurate Computational Model of Neuron with Spike Frequency Adaptation. Annu Int Conf IEEE Eng Med Biol Soc 2021; 2021:6496-6499. [PMID: 34892598 DOI: 10.1109/embc46164.2021.9629799]
Abstract
Simplified models of neurons are widely used in computational investigations of large networks. One of the most important performance metrics of simplified models is their accuracy in reproducing action potential (spike) timing. In this article, we developed a simple, computationally efficient neuron model by modifying the adaptive exponential integrate-and-fire (AdEx) model [1] with a sigmoid afterhyperpolarization current (Sigmoid AHP). Our model can precisely match the spike times and spike frequency adaptation of cortical pyramidal neurons. The accuracy was similar to that of a more complex, biophysically realistic two-compartment model of the same neurons. This work provides a simplified neuronal model with improved spike timing accuracy for use in modeling of large neural networks.
Clinical Relevance: An accurate and computationally efficient single-neuron model will enable large network modeling of brain regions involved in neurological and psychiatric disorders and may lead to a better understanding of the disorder mechanisms.
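The baseline AdEx model that the paper modifies is standard; a minimal forward-Euler sketch with textbook Brette-Gerstner parameters (the paper's sigmoid-AHP current is omitted) reproduces the spike-frequency adaptation being discussed.

```python
import numpy as np

# Adaptive exponential integrate-and-fire (AdEx) neuron, forward-Euler.
# Textbook Brette-Gerstner parameters; the paper's sigmoid AHP current is omitted.
C, gL, EL = 281e-12, 30e-9, -70.6e-3           # capacitance, leak, resting potential
VT, dT = -50.4e-3, 2e-3                        # exponential threshold and sharpness
tau_w, a, b = 144e-3, 4e-9, 80.5e-12           # adaptation time constant, coupling, jump
V_cut, V_reset = 0.0, -70.6e-3
dt, steps, I = 1e-5, 100_000, 1.5e-9           # 1 s of a 1.5 nA step current

V, w = EL, 0.0
spike_times = []
for t in range(steps):
    dV = (-gL * (V - EL) + gL * dT * np.exp((V - VT) / dT) - w + I) / C
    V += dt * dV
    w += dt * (a * (V - EL) - w) / tau_w
    if V >= V_cut:                             # spike: reset and bump adaptation
        spike_times.append(t * dt)
        V, w = V_reset, w + b

isi = np.diff(spike_times)                     # inter-spike intervals lengthen
```

With a constant step current, the adaptation variable `w` accumulates across spikes, so later inter-spike intervals are longer than the first: the signature of spike-frequency adaptation.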
20
Zendrikov D, Paraskevov A. Emergent population activity in metric-free and metric networks of neurons with stochastic spontaneous spikes and dynamic synapses. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.11.073]
21
Laing CR. Effects of degree distributions in random networks of type-I neurons. Phys Rev E 2021; 103:052305. [PMID: 34134197 DOI: 10.1103/physreve.103.052305]
Abstract
We consider large networks of theta neurons and use the Ott-Antonsen ansatz to derive degree-based mean-field equations governing the expected dynamics of the networks. Assuming random connectivity, we investigate the effects of varying the widths of the in- and out-degree distributions on the dynamics of excitatory or inhibitory synaptically coupled networks and gap junction coupled networks. For synaptically coupled networks, the dynamics are independent of the out-degree distribution. Broadening the in-degree distribution destroys oscillations in inhibitory networks and decreases the range of bistability in excitatory networks. For gap junction coupled neurons, broadening the degree distribution varies the values of parameters at which there is an onset of collective oscillations. Many of the results are shown to also occur in networks of more realistic neurons.
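A bare-bones theta-neuron network conveys the type-I dynamics underlying this study. The sketch below uses all-to-all smooth pulsatile coupling and a Lorentzian drive distribution, whereas the paper's point is precisely the effect of non-trivial in- and out-degree distributions; parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# All-to-all network of theta (type-I) neurons with smooth pulsatile coupling.
# The paper's focus, broad in/out-degree distributions, is deliberately absent here.
N, dt, steps = 400, 1e-3, 4000
eta = 0.2 + 0.1 * np.tan(np.pi * (rng.random(N) - 0.5))  # Lorentzian drive
k = 2.0                                                   # coupling strength

theta = rng.uniform(-np.pi, np.pi, N)
for _ in range(steps):
    I = (k / N) * np.sum(1.0 - np.cos(theta))             # population pulse input
    theta += dt * (1.0 - np.cos(theta) + (1.0 + np.cos(theta)) * (eta + I))

z = abs(np.mean(np.exp(1j * theta)))                      # Kuramoto order parameter
```

A neuron "fires" each time its phase passes through pi; the Ott-Antonsen ansatz used in the paper closes the dynamics of the order parameter `z` exactly for Lorentzian-distributed `eta`.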
Affiliation(s)
- Carlo R Laing
- School of Natural and Computational Sciences, Massey University, Private Bag 102-904 NSMC, Auckland, New Zealand
22
Byrne Á, Ross J, Nicks R, Coombes S. Mean-Field Models for EEG/MEG: From Oscillations to Waves. Brain Topogr 2021; 35:36-53. [PMID: 33993357 PMCID: PMC8813727 DOI: 10.1007/s10548-021-00842-4]
Abstract
Neural mass models have been used since the 1970s to model the coarse-grained activity of large populations of neurons. They have proven especially fruitful for understanding brain rhythms. However, although motivated by neurobiological considerations, they are phenomenological in nature, and cannot hope to recreate some of the rich repertoire of responses seen in real neuronal tissue. Here we consider a simple spiking neuron network model that has recently been shown to admit an exact mean-field description for both synaptic and gap-junction interactions. The mean-field model takes a similar form to a standard neural mass model, with an additional dynamical equation to describe the evolution of within-population synchrony. As well as reviewing the origins of this next generation mass model we discuss its extension to describe an idealised spatially extended planar cortex. To emphasise the usefulness of this model for EEG/MEG modelling we show how it can be used to uncover the role of local gap-junction coupling in shaping large scale synaptic waves.
Affiliation(s)
- Áine Byrne
- School of Mathematics and Statistics, Science Centre, University College Dublin, South Belfield, Dublin 4, Ireland.
- James Ross
- School of Mathematical Sciences, Centre for Mathematical Medicine and Biology, University of Nottingham, Nottingham, NG7 2RD, UK
- Rachel Nicks
- School of Mathematical Sciences, Centre for Mathematical Medicine and Biology, University of Nottingham, Nottingham, NG7 2RD, UK
- Stephen Coombes
- School of Mathematical Sciences, Centre for Mathematical Medicine and Biology, University of Nottingham, Nottingham, NG7 2RD, UK
23
Jüttner B, Henriksen C, Martens EA. Birth and destruction of collective oscillations in a network of two populations of coupled type 1 neurons. Chaos 2021; 31:023141. [PMID: 33653075 DOI: 10.1063/5.0031630]
Abstract
We study the macroscopic dynamics of large networks of excitable type 1 neurons composed of two populations interacting with disparate but symmetric intra- and inter-population coupling strengths. This nonuniform coupling scheme facilitates symmetric equilibria, where both populations display identical firing activity, characterized by either quiescent or spiking behavior, or asymmetric equilibria, where one population is quiescent while the other exhibits spiking behavior. Oscillations in the firing rate are possible if neurons emit pulses with non-zero width but are otherwise quenched. Here, we explore how collective oscillations emerge for two statistically identical neuron populations in the limit of an infinite number of neurons. A detailed analysis reveals how collective oscillations are born and destroyed in various bifurcation scenarios and how they are organized around higher codimension bifurcation points. Since both symmetric and asymmetric equilibria display bistable behavior, a large configuration space with steady and oscillatory behavior is available. Switching between configurations of neural activity is relevant in functional processes such as working memory and the onset of collective oscillations in motor control.
Affiliation(s)
- Benjamin Jüttner
- Department of Applied Mathematics and Computer Science, Technical University of Denmark, 2800 Kgs. Lyngby, Denmark
- Christian Henriksen
- Department of Applied Mathematics and Computer Science, Technical University of Denmark, 2800 Kgs. Lyngby, Denmark
- Erik A Martens
- Department of Applied Mathematics and Computer Science, Technical University of Denmark, 2800 Kgs. Lyngby, Denmark
24
Van Pottelbergh T, Drion G, Sepulchre R. From Biophysical to Integrate-and-Fire Modeling. Neural Comput 2021; 33:563-589. [PMID: 33400899 DOI: 10.1162/neco_a_01353]
Abstract
This article proposes a methodology to extract a low-dimensional integrate-and-fire model from an arbitrarily detailed single-compartment biophysical model. The method aims at relating the modulation of maximal conductance parameters in the biophysical model to the modulation of parameters in the proposed integrate-and-fire model. The approach is illustrated on two well-documented examples of cellular neuromodulation: the transition between type I and type II excitability and the transition between spiking and bursting.
Affiliation(s)
- Guillaume Drion
- Department of Electrical Engineering and Computer Science, University of Liège, 4000 Liège, Belgium
- Rodolphe Sepulchre
- Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, U.K.
25
Montbrió E, Pazó D. Exact Mean-Field Theory Explains the Dual Role of Electrical Synapses in Collective Synchronization. Phys Rev Lett 2020; 125:248101. [PMID: 33412049 DOI: 10.1103/physrevlett.125.248101]
Abstract
Electrical synapses play a major role in setting up neuronal synchronization, but the precise mechanisms whereby these synapses contribute to synchrony are subtle and remain elusive. To investigate these mechanisms mean-field theories for quadratic integrate-and-fire neurons with electrical synapses have been recently put forward. Still, the validity of these theories is controversial since they assume that the neurons produce unrealistic, symmetric spikes, ignoring the well-known impact of spike shape on synchronization. Here, we show that the assumption of symmetric spikes can be relaxed in such theories. The resulting mean-field equations reveal a dual role of electrical synapses: First, they equalize membrane potentials favoring the emergence of synchrony. Second, electrical synapses act as "virtual chemical synapses," which can be either excitatory or inhibitory depending upon the spike shape. Our results offer a precise mathematical explanation of the intricate effect of electrical synapses in collective synchronization. This reconciles previous theoretical and numerical works, and confirms the suitability of recent low-dimensional mean-field theories to investigate electrically coupled neuronal networks.
Affiliation(s)
- Ernest Montbrió
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08003 Barcelona, Spain
- Diego Pazó
- Instituto de Física de Cantabria (IFCA), CSIC-Universidad de Cantabria, 39005 Santander, Spain
26
Bick C, Goodfellow M, Laing CR, Martens EA. Understanding the dynamics of biological and neural oscillator networks through exact mean-field reductions: a review. J Math Neurosci 2020; 10:9. [PMID: 32462281 PMCID: PMC7253574 DOI: 10.1186/s13408-020-00086-9]
Abstract
Many biological and neural systems can be seen as networks of interacting periodic processes. Importantly, their functionality, i.e., whether these networks can perform their function or not, depends on the emerging collective dynamics of the network. Synchrony of oscillations is one of the most prominent examples of such collective behavior and has been associated both with function and dysfunction. Understanding how network structure and interactions, as well as the microscopic properties of individual units, shape the emerging collective dynamics is critical to find factors that lead to malfunction. However, many biological systems such as the brain consist of a large number of dynamical units. Hence, their analysis has either relied on simplified heuristic models on a coarse scale, or the analysis comes at a huge computational cost. Here we review recently introduced approaches, known as the Ott-Antonsen and Watanabe-Strogatz reductions, allowing one to simplify the analysis by bridging small and large scales. Thus, reduced model equations are obtained that exactly describe the collective dynamics for each subpopulation in the oscillator network via few collective variables only. The resulting equations are next-generation models: Rather than being heuristic, they exactly link microscopic and macroscopic descriptions and therefore accurately capture microscopic properties of the underlying system. At the same time, they are sufficiently simple to analyze without great computational effort. In the last decade, these reduction methods have become instrumental in understanding how network structure and interactions shape the collective dynamics and the emergence of synchrony. We review this progress based on concrete examples and outline possible limitations. Finally, we discuss how linking the reduced models with experimental data can guide the way towards the development of new treatment approaches, for example, for neurological disease.
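The flavor of the exact reductions this review covers can be demonstrated on the Kuramoto model with Lorentzian-distributed frequencies, where the two-dimensional Ott-Antonsen equation can be checked directly against a full network simulation (a sketch with illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(3)

# Kuramoto model with Lorentzian frequencies: full N-oscillator network
# versus the exact two-dimensional Ott-Antonsen reduced equation.
N, K, Delta, dt, steps = 10000, 2.0, 0.5, 1e-2, 4000
omega = Delta * np.tan(np.pi * (rng.random(N) - 0.5))   # Lorentzian, centred at 0

phi = rng.uniform(0.0, 2.0 * np.pi, N)
z_red = 0.5 + 0.0j        # start off the unstable incoherent fixed point z = 0
for _ in range(steps):
    z = np.mean(np.exp(1j * phi))                       # network order parameter
    phi += dt * (omega + K * np.imag(z * np.exp(-1j * phi)))
    z_red += dt * (-Delta * z_red + (K / 2) * (z_red - np.conj(z_red) * z_red**2))

r_theory = np.sqrt(1.0 - 2.0 * Delta / K)               # stationary synchrony, K > 2*Delta
```

For `K > 2*Delta`, both the reduced order parameter `|z_red|` and the full network settle near `r_theory`, illustrating how a few collective variables exactly capture the macroscopic dynamics.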
Affiliation(s)
- Christian Bick
- Centre for Systems, Dynamics, and Control, University of Exeter, Exeter, UK.
- Department of Mathematics, University of Exeter, Exeter, UK.
- EPSRC Centre for Predictive Modelling in Healthcare, University of Exeter, Exeter, UK.
- Mathematical Institute, University of Oxford, Oxford, UK.
- Institute for Advanced Study, Technische Universität München, Garching, Germany.
- Marc Goodfellow
- Department of Mathematics, University of Exeter, Exeter, UK
- EPSRC Centre for Predictive Modelling in Healthcare, University of Exeter, Exeter, UK
- Living Systems Institute, University of Exeter, Exeter, UK
- Wellcome Trust Centre for Biomedical Modelling and Analysis, University of Exeter, Exeter, UK
- Carlo R Laing
- School of Natural and Computational Sciences, Massey University, Auckland, New Zealand
- Erik A Martens
- Department of Applied Mathematics and Computer Science, Technical University of Denmark, Kgs. Lyngby, Denmark.
- Department of Biomedical Science, University of Copenhagen, Copenhagen N, Denmark.
- Centre for Translational Neuroscience, University of Copenhagen, Copenhagen N, Denmark.
27
Stapmanns J, Kühn T, Dahmen D, Luu T, Honerkamp C, Helias M. Self-consistent formulations for stochastic nonlinear neuronal dynamics. Phys Rev E 2020; 101:042124. [PMID: 32422832 DOI: 10.1103/physreve.101.042124]
Abstract
Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular classical bifurcation theory cannot be applied. We formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez, and Wschebor approximation from the derivative expansion to the vertex expansion. The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics.
Affiliation(s)
- Jonas Stapmanns
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- Tobias Kühn
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
- David Dahmen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Thomas Luu
- Institut für Kernphysik (IKP-3), Institute for Advanced Simulation (IAS-4) and Jülich Center for Hadron Physics, Jülich Research Centre, Jülich, Germany
- Carsten Honerkamp
- Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany; JARA-FIT, Jülich Aachen Research Alliance-Fundamentals of Future Information Technology, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany; Institute for Theoretical Solid State Physics, RWTH Aachen University, 52074 Aachen, Germany
28
Ganguly C, Chakrabarti S. A Discrete Time Framework for Spike Transfer Process in a Cortical Neuron With Asynchronous EPSP, IPSP, and Variable Threshold. IEEE Trans Neural Syst Rehabil Eng 2020; 28:772-781. [PMID: 32086215 DOI: 10.1109/tnsre.2020.2975203]
Abstract
Interpretation of the high-level cognitive behavior of the human brain requires a comprehensive understanding of the spike transfer process at the neuronal level. Synapses play a major role in transferring spikes from one neuron to another. This paper proposes an expanded leaky integrate-and-fire model of a neuron in a multiple-input, single-output configuration with threshold variability. Asynchronous generation of post-synaptic potentials is considered, and multiple types of excitatory and inhibitory post-synaptic potentials are included in the model. An analytical expression for the membrane potential, incorporating threshold variability and an activity-dependent noise process, has been developed. The model captures several important features of a spiking neuron through a set of well-defined parameters. Simulation results are provided to illustrate various aspects of the proposed model. A functionally scaled version of the model has also been compared with limited experimental data available from the Allen Institute for Brain Science.
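The ingredients named in the abstract can be sketched as a minimal discrete-time leaky integrate-and-fire neuron with a variable (adaptive) threshold, driven by asynchronous excitatory and inhibitory Poisson inputs. This is in the spirit of, but not identical to, the proposed framework; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Discrete-time leaky integrate-and-fire neuron with a variable threshold,
# driven by asynchronous excitatory and inhibitory Poisson inputs
# (illustrative parameters, not the paper's fitted model).
steps, dt = 10000, 1e-3
tau_m, tau_th = 20e-3, 50e-3                  # membrane and threshold time constants
v_rest, th_rest, th_jump = 0.0, 1.0, 0.5
w_exc, w_inh = 0.15, -0.10                    # EPSP / IPSP amplitudes
rate_e, rate_i = 800.0, 200.0                 # input rates (Hz)

v, th, spikes = v_rest, th_rest, 0
for _ in range(steps):
    n_e = rng.poisson(rate_e * dt)            # asynchronous EPSP arrivals
    n_i = rng.poisson(rate_i * dt)            # asynchronous IPSP arrivals
    v += dt * (v_rest - v) / tau_m + w_exc * n_e + w_inh * n_i
    th += dt * (th_rest - th) / tau_th        # threshold relaxes toward rest
    if v >= th:                               # spike: reset potential, raise threshold
        v, th, spikes = v_rest, th + th_jump, spikes + 1
```

Each spike raises the threshold, which then relaxes back; this variable threshold regularizes the output train relative to a fixed-threshold LIF neuron under the same input.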
29
Jensen's force and the statistical mechanics of cortical asynchronous states. Sci Rep 2019; 9:15183. [PMID: 31645611 PMCID: PMC6811577 DOI: 10.1038/s41598-019-51520-2]
Abstract
Cortical networks are shaped by the combined action of excitatory and inhibitory interactions. Among other important functions, inhibition solves the problem of the all-or-none type of response that comes about in purely excitatory networks, allowing the network to operate in regimes of moderate or low activity, between quiescent and saturated regimes. Here, we elucidate a noise-induced effect that we call “Jensen's force” (stemming from the combined effect of excitation/inhibition balance and network sparsity), which is responsible for generating a phase of self-sustained low activity in excitation-inhibition networks. The uncovered phase reproduces the main empirically observed features of cortical networks in the so-called asynchronous state, characterized by low, uncorrelated and highly irregular activity. The parsimonious model analyzed here allows us to resolve a number of long-standing issues, such as proving that activity can be self-sustained even in the complete absence of external stimuli or driving. The simplicity of our approach allows for a deep understanding of asynchronous states and of the phase transitions to the other standard phases it exhibits, opening the door to reconciling the asynchronous-state and critical-state hypotheses within a unified framework. We argue that Jensen's forces are measurable experimentally and might be relevant in contexts beyond neuroscience.
30
Okujeni S, Egert U. Inhomogeneities in Network Structure and Excitability Govern Initiation and Propagation of Spontaneous Burst Activity. Front Neurosci 2019; 13:543. [PMID: 31213971 PMCID: PMC6554329 DOI: 10.3389/fnins.2019.00543]
Abstract
The mesoscale architecture of neuronal networks strongly influences the initiation of spontaneous activity and its pathways of propagation. Spontaneous activity has been studied extensively in networks of cultured cortical neurons that generate complex yet reproducible patterns of synchronous bursting events that resemble the activity dynamics in developing neuronal networks in vivo. Synchronous bursts are mostly thought to be triggered at burst initiation sites due to build-up of noise or by highly active neurons, or to reflect reverberating activity that circulates within larger networks, although neither of these has been observed directly. Inferring such collective dynamics in neuronal populations from electrophysiological recordings crucially depends on the spatial resolution and sampling ratio relative to the size of the networks assessed. Using large-scale microelectrode arrays with 1024 electrodes at 0.3 mm pitch that covered the full extent of in vitro networks on about 1 cm2, we investigated where bursts of spontaneous activity arise and how their propagation patterns relate to the regions of origin, the network's structure, and to the overall distribution of activity. A set of alternating burst initiation zones (BIZ) dominated the initiation of distinct bursting events and triggered specific propagation patterns. Moreover, BIZs were typically located in areas with moderate activity levels, i.e., at transitions between hot and cold spots. The activity-dependent alternation between these zones suggests that the local networks forming the dominating BIZ enter a transient depressed state after several cycles (similar to Eytan et al., 2003), allowing other BIZs to take over temporarily. We propose that inhomogeneities in the network structure define such BIZs and that the depletion of local synaptic resources limit repetitive burst initiation.
Affiliation(s)
- Samora Okujeni
- Biomicrotechnology, IMTEK - Department of Microsystems Engineering, University of Freiburg, Freiburg, Germany
- Ulrich Egert
- Bernstein Center Freiburg, University of Freiburg, Freiburg, Germany
31
Kobak D, Pardo-Vazquez JL, Valente M, Machens CK, Renart A. State-dependent geometry of population activity in rat auditory cortex. eLife 2019; 8:e44526. [PMID: 30969167 PMCID: PMC6491041 DOI: 10.7554/elife.44526] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2018] [Accepted: 04/07/2019] [Indexed: 12/02/2022] Open
Abstract
The accuracy of the neural code depends on the relative embedding of signal and noise in the activity of neural populations. Despite a wealth of theoretical work on population codes, there are few empirical characterizations of the high-dimensional signal and noise subspaces. We studied the geometry of population codes in the rat auditory cortex across brain states along the activation-inactivation continuum, using sounds varying in difference and mean level across the ears. As the cortex becomes more activated, single-hemisphere populations go from preferring contralateral loud sounds to a symmetric preference across lateralizations and intensities, gain-modulation effectively disappears, and the signal and noise subspaces become approximately orthogonal to each other and to the direction corresponding to global activity modulations. Level-invariant decoding of sound lateralization also becomes possible in the active state. Our results provide an empirical foundation for the geometry and state-dependence of cortical population codes.
Affiliation(s)
- Dmitry Kobak
- Champalimaud Center for the Unknown, Lisbon, Portugal
- Institute for Ophthalmic Research, University of Tübingen, Tübingen, Germany
- Jose L Pardo-Vazquez
- Champalimaud Center for the Unknown, Lisbon, Portugal
- Neuroscience and Motor Control Group, University of A Coruña, Coruña, Spain
32
Nazemi PS, Jamali Y. On the Influence of Structural Connectivity on the Correlation Patterns and Network Synchronization. Front Comput Neurosci 2019; 12:105. [PMID: 30670958 PMCID: PMC6332471 DOI: 10.3389/fncom.2018.00105] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2018] [Accepted: 12/12/2018] [Indexed: 01/17/2023] Open
Abstract
Since brain structural connectivity is the foundation of its functionality, in order to understand brain abilities, studying the relation between structural and functional connectivity is essential. Several approaches have been applied to measure the role of structural connectivity in the emergent correlation/synchronization patterns. In this study, we investigate the sensitivity of cross-correlation and synchronization to the coupling strength between neural regions for different network topologies. We model the neural populations with a neural mass model that exhibits oscillatory dynamics. The results highlight that coupling between neural ensembles leads to various cross-correlation patterns and local synchrony even on an ordered network. Moreover, as the network departs from an ordered organization to a small-world architecture, correlation patterns and synchronization dynamics change. Interestingly, within a certain range of synaptic strength, with the structural conditions fixed, different organized patterns are seen for different input signals. This variety switches to a bifurcation region as the synaptic strength increases. We show that topological variation is a major factor in synchronization behavior and leads to alterations in correlated local clusters. We found the coupling strength (between cortical areas) to be especially important at conversions of correlation and synchronization states. Since correlation patterns generate functional connections and transitions of functional connectivity have been related to cognitive operations, these diverse correlation patterns may be considered as different dynamical states corresponding to various cognitive tasks.
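The departure from an ordered organization to a small-world architecture that this abstract examines is commonly modeled with the Watts-Strogatz rewiring procedure. Below is a minimal pure-Python sketch; the network size, degree, and rewiring probabilities are illustrative assumptions, not values from the paper:

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice with k neighbors per node; each lattice edge is
    rewired with probability p (standard Watts-Strogatz construction)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for i in range(n):
        for j in range(1, k // 2 + 1):
            if rng.random() < p:
                new = rng.randrange(n)
                if new != i and new not in adj[i]:
                    old = (i + j) % n
                    adj[i].discard(old)
                    adj[old].discard(i)
                    adj[i].add(new)
                    adj[new].add(i)
    return adj

def mean_clustering(adj):
    """Average local clustering coefficient over all nodes."""
    total = 0.0
    for i, nbrs in adj.items():
        nb = list(nbrs)
        if len(nb) < 2:
            continue
        links = sum(1 for a in range(len(nb)) for b in range(a + 1, len(nb))
                    if nb[b] in adj[nb[a]])
        total += 2.0 * links / (len(nb) * (len(nb) - 1))
    return total / len(adj)

ordered = mean_clustering(watts_strogatz(200, 8, 0.0))
rewired = mean_clustering(watts_strogatz(200, 8, 0.5))
print(ordered, rewired)  # rewiring destroys the lattice's local clustering
```

Sweeping `p` from 0 to 1 traces the ordered-to-small-world-to-random continuum that the correlation patterns above are studied on.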
Affiliation(s)
- Yousef Jamali
- Department of Mathematics, Tarbiat Modares University, Tehran, Iran
33
Kada H, Teramae JN, Tokuda IT. Highly Heterogeneous Excitatory Connections Require Less Amount of Noise to Sustain Firing Activities in Cortical Networks. Front Comput Neurosci 2019; 12:104. [PMID: 30622467 PMCID: PMC6308195 DOI: 10.3389/fncom.2018.00104] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2018] [Accepted: 12/07/2018] [Indexed: 11/17/2022] Open
Abstract
Cortical networks both in vivo and in vitro sustain asynchronous irregular firing at extremely low frequency. To realize such self-sustained activity in neural network models, the balance between excitatory and inhibitory activities is known to be one of the keys. In addition, recent theoretical studies have revealed that another feature commonly observed in cortical networks, i.e., sparse but strong connections amid dense weak connections, plays an essential role. The previous studies, however, have not thoroughly considered the cooperative dynamics between a network of such heterogeneous synaptic connections and intrinsic noise. Noise stimuli, representing the inherent variability of neuronal activities, e.g., the variability of presynaptic discharges, should also be of significant importance for sustaining irregular firing in cortical networks. Here, we numerically demonstrate that a highly heterogeneous distribution, typically of lognormal type, of excitatory-to-excitatory connections reduces the amount of noise required to sustain the network firing activity. In the sense that noise consumes an energy resource, a heterogeneous network requiring less noise can be considered to realize efficient dynamics in the cortex. A noise-driven network of bimodally distributed synapses further shows that many weak and a few very strong synapses are the key feature of the synaptic heterogeneity supporting the network firing activity.
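The "many weak, a few very strong" connectivity described in this abstract is typically modeled by drawing synaptic weights from a lognormal distribution. A minimal sketch, in which the distribution parameters are illustrative assumptions rather than the paper's values:

```python
import random

def sample_lognormal_weights(n, mu=-0.7, sigma=1.0, seed=1):
    """Draw n excitatory synaptic weights from a lognormal distribution.

    mu and sigma (hypothetical values) parameterize the underlying normal;
    the heavy right tail yields many weak and a few very strong synapses.
    """
    rng = random.Random(seed)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

weights = sample_lognormal_weights(10000)
median_w = sorted(weights)[len(weights) // 2]
mean_w = sum(weights) / len(weights)
# Skewness check: the heavy tail pulls the mean above the median.
print(mean_w > median_w)
```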
Affiliation(s)
- Hisashi Kada
- Department of Mechanical Engineering, Ritsumeikan University, Kusatsu-shi, Japan
- Isao T Tokuda
- Department of Mechanical Engineering, Ritsumeikan University, Kusatsu-shi, Japan
34
35
Aamir SA, Muller P, Kiene G, Kriener L, Stradmann Y, Grubl A, Schemmel J, Meier K. A Mixed-Signal Structured AdEx Neuron for Accelerated Neuromorphic Cores. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2018; 12:1027-1037. [PMID: 30047897 DOI: 10.1109/tbcas.2018.2848203] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Here, we describe a multicompartment neuron circuit based on the adaptive-exponential I&F (AdEx) model, developed for the second-generation BrainScaleS hardware. Based on an existing modular leaky integrate-and-fire (LIF) architecture designed in 65-nm CMOS, the circuit features exponential spike generation, neuronal adaptation, intercompartmental connections as well as a conductance-based reset. The design reproduces a diverse set of firing patterns observed in cortical pyramidal neurons. Further, it enables the emulation of sodium and calcium spikes, as well as N-methyl-D-aspartate plateau potentials known from apical and thin dendrites. We characterize the AdEx circuit extensions and exemplify how the interplay between passive and nonlinear active signal processing enhances the computational capabilities of single (but structured) on-chip neurons.
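The AdEx dynamics the circuit emulates can be sketched in software with a simple forward-Euler loop. The parameter set below is the commonly cited Brette & Gerstner (2005) reference set, not the BrainScaleS hardware calibration, and the single-compartment sketch omits the intercompartmental connections described above:

```python
import math

def simulate_adex(I_ext, t_total=200.0, dt=0.1):
    """Forward-Euler sketch of the adaptive exponential I&F (AdEx) neuron.

    Units: pF, nS, mV, ms, pA. Parameters follow the Brette & Gerstner
    (2005) reference set, assumed here for illustration only.
    """
    C, gL, EL = 281.0, 30.0, -70.6             # capacitance, leak, rest
    VT, DT = -50.4, 2.0                        # threshold, slope factor
    tau_w, a, b, Vr = 144.0, 4.0, 80.5, -70.6  # adaptation and reset
    V, w, spikes = EL, 0.0, 0
    for _ in range(int(t_total / dt)):
        dV = (-gL * (V - EL) + gL * DT * math.exp((V - VT) / DT)
              - w + I_ext) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= 0.0:                           # spike: reset V, bump adaptation
            V, w, spikes = Vr, w + b, spikes + 1
    return spikes

print(simulate_adex(800.0))   # suprathreshold drive: repetitive, adapting firing
print(simulate_adex(0.0))     # no drive: silent
```

The exponential term produces the sharp spike upswing, and the adaptation current `w` slows repetitive firing, the two features the mixed-signal circuit reproduces.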
36
Chizhov AV, Zefirov AV, Amakhin DV, Smirnova EY, Zaitsev AV. Minimal model of interictal and ictal discharges "Epileptor-2". PLoS Comput Biol 2018; 14:e1006186. [PMID: 29851959 PMCID: PMC6005638 DOI: 10.1371/journal.pcbi.1006186] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2017] [Revised: 06/18/2018] [Accepted: 05/09/2018] [Indexed: 12/01/2022] Open
Abstract
Seizures occur in a recurrent manner with intermittent states of interictal and ictal discharges (IIDs and IDs). The transitions to and from IDs are determined by a set of processes, including synaptic interaction and ionic dynamics. Although mathematical models of separate types of epileptic discharges have been developed, modeling the transitions between states remains a challenge. A simple generic mathematical model of seizure dynamics (Epileptor) has recently been proposed by Jirsa et al. (2014); however, it is formulated in terms of abstract variables. In this paper, a minimal population-type model of IIDs and IDs is proposed that is as simple to use as the Epileptor, but the suggested model attributes physical meaning to the variables. The model is expressed in ordinary differential equations for extracellular potassium and intracellular sodium concentrations, membrane potential, and short-term synaptic depression variables. A quadratic integrate-and-fire model driven by the population input current is used to reproduce spike trains in a representative neuron. In simulations, potassium accumulation governs the transition from the silent state to the state of an ID. Each ID is composed of clustered IID-like events. The sodium accumulates during discharge and activates the sodium-potassium pump, which terminates the ID by restoring the potassium gradient and thus polarizing the neuronal membranes. The whole-cell and cell-attached recordings of a 4-AP-based in vitro model of epilepsy confirmed the primary model assumptions and predictions. The mathematical analysis revealed that the IID-like events are large-amplitude stochastic oscillations, which in the case of ID generation are controlled by slow oscillations of ionic concentrations. The IDs originate in the conditions of elevated potassium concentrations in a bath solution via a saddle-node-on-invariant-circle-like bifurcation for a non-smooth dynamical system. 
By providing a minimal biophysical description of ionic dynamics and network interactions, the model may serve as a hierarchical base from simple to more complex modeling of seizures. In the pathological conditions of epilepsy, the functioning of the neural network crucially depends on the ionic concentrations inside and outside neurons. The number of factors affecting neuronal activity is large, which is why the development of a minimal model that reproduces typical seizures could structure further experimental and analytical studies of the pathological mechanisms. Here, on the basis of known biophysical models, we present a simple population-type model that includes only four principal variables: the extracellular potassium concentration, the intracellular sodium concentration, the membrane potential, and the synaptic resource diminished by short-term synaptic depression. A simple modeled neuron is used as an observer of the population activity. We validate the model assumptions with in vitro experiments. Our model reproduces ictal and interictal events, where the latter result in bursts of spikes in single neurons and the former represent clusters of spike bursts. Mathematical analysis reveals that the bursts are spontaneous large-amplitude oscillations, which may cluster after a saddle-node-on-invariant-circle bifurcation in pro-epileptic conditions. Our consideration has significant bearing on understanding pathological neuronal network dynamics.
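The representative-neuron mechanism of Epileptor-2 — a quadratic integrate-and-fire (QIF) unit driven by the population input current — can be sketched in a few lines. This dimensionless toy version illustrates only the QIF threshold behavior; it omits the four population variables (ionic concentrations, membrane potential, synaptic resource) of the actual model, and all parameter values are illustrative:

```python
def qif_spike_count(I, t_total=100.0, dt=0.001, v_peak=10.0, v_reset=-10.0):
    """QIF 'observer' neuron: dv/dt = v**2 + I, reset on reaching v_peak.

    For I < 0 the neuron is excitable but silent at rest; for I > 0 it
    fires periodically, faster for stronger drive.
    """
    v, spikes = v_reset, 0
    for _ in range(int(t_total / dt)):
        v += dt * (v * v + I)
        if v >= v_peak:
            v, spikes = v_reset, spikes + 1
    return spikes

print(qif_spike_count(-1.0), qif_spike_count(1.0), qif_spike_count(4.0))
```

In the full model, `I` would be the slowly varying population current, so the observer's spike bursts track the interictal and ictal population events.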
Affiliation(s)
- Anton V. Chizhov
- Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia
- Artyom V. Zefirov
- Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia
- Dmitry V. Amakhin
- Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Elena Yu. Smirnova
- Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Computational Physics Laboratory, Ioffe Institute, Saint Petersburg, Russia
- Aleksey V. Zaitsev
- Laboratory of Molecular Mechanisms of Neural Interactions, Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, Saint Petersburg, Russia
- Institute of Experimental Medicine, Almazov National Medical Research Centre, Saint Petersburg, Russia
37
Manninen T, Aćimović J, Havela R, Teppola H, Linne ML. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures. Front Neuroinform 2018; 12:20. [PMID: 29765315 PMCID: PMC5938413 DOI: 10.3389/fninf.2018.00020] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/01/2018] [Accepted: 04/06/2018] [Indexed: 01/26/2023] Open
Abstract
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, thousands of models are available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them when the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. Replicability and reproducibility issues are considered for computational models that are equally diverse, including models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to comprehend whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs: for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that underline replicability and reproducibility of research results.
Affiliation(s)
- Tiina Manninen
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Jugoslava Aćimović
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Riikka Havela
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Heidi Teppola
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
- Marja-Leena Linne
- Computational Neuroscience Group, BioMediTech Institute and Faculty of Biomedical Sciences and Engineering, Tampere University of Technology, Tampere, Finland
- Laboratory of Signal Processing, Tampere University of Technology, Tampere, Finland
38
Affiliation(s)
- Peter E Latham
- Gatsby Computational Neuroscience Unit, University College London, London, UK
39
Up-Down-Like Background Spiking Can Enhance Neural Information Transmission. eNeuro 2018; 4:eN-TNC-0282-17. [PMID: 29354678 PMCID: PMC5773284 DOI: 10.1523/eneuro.0282-17.2017] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2017] [Revised: 11/15/2017] [Accepted: 11/20/2017] [Indexed: 11/23/2022] Open
Abstract
How neurons transmit information about sensory or internal signals is strongly influenced by ongoing internal activity. Depending on brain state, this background spiking can occur asynchronously or clustered in up states, periods of collective firing that are interspersed by silent down states. Here, we study which effect such up-down (UD) transitions have on signal transmission. In a simple model, we obtain numerical and analytical results for information theoretic measures. We find that, surprisingly, an UD background can benefit information transmission: when background activity is sparse, it is advantageous to distribute spikes into up states rather than uniformly in time. We reproduce the same effect in a more realistic recurrent network and show that signal transmission is further improved by incorporating that up states propagate across cortex as traveling waves. We propose that traveling UD activity might represent a compromise between reducing metabolic strain and maintaining information transmission capabilities.
40
Kim CM, Chow CC. Learning recurrent dynamics in spiking networks. eLife 2018; 7:37124. [PMID: 30234488 PMCID: PMC6195349 DOI: 10.7554/elife.37124] [Citation(s) in RCA: 27] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/29/2018] [Accepted: 09/14/2018] [Indexed: 01/27/2023] Open
Abstract
Spiking activity of neurons engaged in learning and performing a task shows complex spatiotemporal dynamics. While recurrent network models can learn to perform various tasks, the possible range of recurrent dynamics that emerge after learning remains unknown. Here we show that modifying the recurrent connectivity with a recursive least squares algorithm provides sufficient flexibility for the synaptic and spiking rate dynamics of spiking networks to produce a wide range of spatiotemporal activity. We apply the training method to learn arbitrary firing patterns, to stabilize irregular spiking activity in a network of excitatory and inhibitory neurons respecting Dale's law, and to reproduce the heterogeneous spiking rate patterns of cortical neurons engaged in motor planning and movement. We identify sufficient conditions for successful learning, characterize two types of learning errors, and assess the network capacity. Our findings show that synaptically coupled recurrent spiking networks possess a vast computational capability that can support the diverse activity patterns in the brain.
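The recursive least squares (RLS) update at the heart of the training method can be sketched in isolation. The example below applies the recursion to plain streaming linear regression with a hypothetical teacher function, rather than to recurrent synaptic weights; the data and parameters are illustrative:

```python
import random

def rls_fit(samples, dim, alpha=1.0):
    """Recursive least-squares update.

    w is the weight estimate; P approximates the inverse of the
    (regularized) input correlation matrix and shrinks the update
    along well-explored input directions.
    """
    w = [0.0] * dim
    P = [[(1.0 / alpha if i == j else 0.0) for j in range(dim)]
         for i in range(dim)]
    for x, target in samples:
        Px = [sum(P[i][j] * x[j] for j in range(dim)) for i in range(dim)]
        denom = 1.0 + sum(x[i] * Px[i] for i in range(dim))
        k = [Px_i / denom for Px_i in Px]          # gain vector
        err = target - sum(w[i] * x[i] for i in range(dim))
        for i in range(dim):
            w[i] += k[i] * err                     # error-driven update
        for i in range(dim):
            for j in range(dim):
                P[i][j] -= k[i] * Px[j]            # rank-1 downdate of P
    return w

rng = random.Random(0)
data = []
for _ in range(200):
    x = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    data.append((x, 3.0 * x[0] - 2.0 * x[1]))      # hypothetical teacher
w = rls_fit(data, 2)
print([round(wi, 3) for wi in w])
```

In the paper's setting, the same recursion is run per neuron on its presynaptic weights with the target spiking-rate pattern as the teacher signal.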
Affiliation(s)
- Christopher M Kim
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, United States
- Carson C Chow
- Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, United States
41
Devalle F, Roxin A, Montbrió E. Firing rate equations require a spike synchrony mechanism to correctly describe fast oscillations in inhibitory networks. PLoS Comput Biol 2017; 13:e1005881. [PMID: 29287081 PMCID: PMC5764488 DOI: 10.1371/journal.pcbi.1005881] [Citation(s) in RCA: 48] [Impact Index Per Article: 6.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2017] [Revised: 01/11/2018] [Accepted: 11/15/2017] [Indexed: 12/25/2022] Open
Abstract
Recurrently coupled networks of inhibitory neurons robustly generate oscillations in the gamma band. Nonetheless, the corresponding Wilson-Cowan type firing rate equation for such an inhibitory population does not generate such oscillations without an explicit time delay. We show that this discrepancy is due to a voltage-dependent spike-synchronization mechanism inherent in networks of spiking neurons which is not captured by standard firing rate equations. Here we investigate an exact low-dimensional description for a network of heterogeneous canonical Class 1 inhibitory neurons which includes the sub-threshold dynamics crucial for generating synchronous states. In the limit of slow synaptic kinetics the spike-synchrony mechanism is suppressed and the standard Wilson-Cowan equations are formally recovered as long as external inputs are also slow. However, even in this limit synchronous spiking can be elicited by inputs which fluctuate on the time scale of the membrane time constant of the neurons. Our mean-field equations therefore represent an extension of the standard Wilson-Cowan equations in which spike synchrony is also correctly described. Population models describing the average activity of large neuronal ensembles are a powerful mathematical tool to investigate the principles underlying the cooperative function of large neuronal systems. However, these models do not properly describe the phenomenon of spike synchrony in networks of neurons. In particular, they fail to capture the onset of synchronous oscillations in networks of inhibitory neurons. We show that this limitation is due to a voltage-dependent synchronization mechanism which is naturally present in spiking neuron models but not captured by traditional firing rate equations. Here we investigate a novel set of macroscopic equations which incorporate both firing rate and membrane potential dynamics, and that correctly generate fast inhibition-based synchronous oscillations. In the limit of slow synaptic processing, oscillations are suppressed, and the model reduces to an equation formally equivalent to the Wilson-Cowan model.
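The exact low-dimensional description referred to here takes, in its original form (Montbrió, Pazó & Roxin, 2015), the shape of two coupled ODEs for the population firing rate r and mean membrane potential v. The sketch below integrates those two equations with illustrative parameters (not the paper's) and exhibits the voltage-driven ringing that Wilson-Cowan-type rate equations lack:

```python
import math

def simulate_mpr(J=-15.0, eta=5.0, delta=1.0, tau=1.0, t_total=50.0, dt=0.0005):
    """Exact QIF mean-field (Montbrio-Pazo-Roxin) equations:

        tau*dr/dt = Delta/(pi*tau) + 2*r*v
        tau*dv/dt = v**2 + eta + J*tau*r - (pi*tau*r)**2

    J < 0 models recurrent inhibition; parameters are illustrative.
    """
    r, v = 0.01, -2.0
    trace = []
    for _ in range(int(t_total / dt)):
        dr = (delta / (math.pi * tau) + 2.0 * r * v) / tau
        dv = (v * v + eta + J * tau * r - (math.pi * tau * r) ** 2) / tau
        r += dt * dr
        v += dt * dv
        trace.append(r)
    return trace

trace = simulate_mpr()
peaks = sum(1 for i in range(1, len(trace) - 1)
            if trace[i - 1] < trace[i] > trace[i + 1])
print(min(trace) > 0.0, peaks)  # rate stays positive; ringing toward a focus
```

With instantaneous synapses, as here, the oscillations decay toward a fixed point; sustaining them requires the synaptic kinetics discussed in the abstract.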
Affiliation(s)
- Federico Devalle
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Department of Physics, Lancaster University, Lancaster, United Kingdom
- Alex Roxin
- Centre de Recerca Matemàtica, Campus de Bellaterra, Edifici C, Bellaterra, Barcelona, Spain
- Ernest Montbrió
- Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
42
Ashida G, Tollin DJ, Kretzberg J. Physiological models of the lateral superior olive. PLoS Comput Biol 2017; 13:e1005903. [PMID: 29281618 PMCID: PMC5744914 DOI: 10.1371/journal.pcbi.1005903] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/11/2017] [Accepted: 11/28/2017] [Indexed: 01/09/2023] Open
Abstract
In computational biology, modeling is a fundamental tool for formulating, analyzing and predicting complex phenomena. Most neuron models, however, are designed to reproduce certain small sets of empirical data. Hence their outcome is usually not compatible or comparable with other models or datasets, making it unclear how widely applicable such models are. In this study, we investigate these aspects of modeling, namely credibility and generalizability, with a specific focus on auditory neurons involved in the localization of sound sources. The primary cues for binaural sound localization are comprised of interaural time and level differences (ITD/ILD), which are the timing and intensity differences of the sound waves arriving at the two ears. The lateral superior olive (LSO) in the auditory brainstem is one of the locations where such acoustic information is first computed. An LSO neuron receives temporally structured excitatory and inhibitory synaptic inputs that are driven by ipsi- and contralateral sound stimuli, respectively, and changes its spike rate according to binaural acoustic differences. Here we examine seven contemporary models of LSO neurons with different levels of biophysical complexity, from predominantly functional ones (‘shot-noise’ models) to those with more detailed physiological components (variations of integrate-and-fire and Hodgkin-Huxley-type). These models, calibrated to reproduce known monaural and binaural characteristics of LSO, generate largely similar results to each other in simulating ITD and ILD coding. 
Our comparisons of physiological detail, computational efficiency, predictive performances, and further expandability of the models demonstrate (1) that the simplistic, functional LSO models are suitable for applications where low computational costs and mathematical transparency are needed, (2) that more complex models with detailed membrane potential dynamics are necessary for simulation studies where sub-neuronal nonlinear processes play important roles, and (3) that, for general purposes, intermediate models might be a reasonable compromise between simplicity and biological plausibility. Computational models help our understanding of complex biological systems, by identifying their key elements and revealing their operational principles. Close comparisons between model predictions and empirical observations ensure our confidence in a model as a building block for further applications. Most current neuronal models, however, are constructed to replicate only a small specific set of experimental data. Thus, it is usually unclear how these models can be generalized to different datasets and how they compare with each other. In this paper, seven neuronal models are examined that are designed to reproduce known physiological characteristics of auditory neurons involved in the detection of sound source location. Despite their different levels of complexity, the models generate largely similar results when their parameters are tuned with common criteria. Comparisons show that simple models are computationally more efficient and theoretically transparent, and therefore suitable for rigorous mathematical analyses and engineering applications including real-time simulations. In contrast, complex models are necessary for investigating the relationship between underlying biophysical processes and sub- and suprathreshold spiking properties, although they have a large number of unconstrained, unverified parameters. 
Having identified their advantages and drawbacks, these auditory neuron models may readily be used for future studies and applications.
Affiliation(s)
- Go Ashida
- Cluster of Excellence "Hearing4all", Department of Neuroscience, University of Oldenburg, Oldenburg, Germany
- Daniel J Tollin
- Department of Physiology and Biophysics, University of Colorado School of Medicine, Aurora, Colorado, United States of America
- Jutta Kretzberg
- Cluster of Excellence "Hearing4all", Department of Neuroscience, University of Oldenburg, Oldenburg, Germany
43
Chan SC, Mok SY, Ng DWK, Goh SY. The role of neuron-glia interactions in the emergence of ultra-slow oscillations. BIOLOGICAL CYBERNETICS 2017; 111:459-472. [PMID: 29128889 DOI: 10.1007/s00422-017-0740-z] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/15/2016] [Accepted: 10/30/2017] [Indexed: 06/07/2023]
Abstract
Ultra-slow cortical oscillatory activity of 1-100 mHz has been recorded in humans by electroencephalography and in dissociated cultures of cortical rat neurons, but the underlying mechanisms remain to be elucidated. This study presents a computational model of ultra-slow oscillatory activity based on the interaction between neurons and astrocytes. We predict that the frequency of these oscillations closely depends on the activation of astrocytes in the network, which is reflected by oscillations of their intracellular calcium concentrations with periods between tens of seconds and minutes. An increase of intracellular calcium in astrocytes triggers the release of adenosine triphosphate from these cells, which may alter transmission at nearby synapses by increasing or decreasing neurotransmitter release. These results provide theoretical support for the emerging awareness of astrocytes as active players in the regulation of neural activity and identify neuron-astrocyte interactions as a potential primary mechanism for the emergence of ultra-slow cortical oscillations.
Affiliation(s)
- Siow-Cheng Chan
- Lee Kong Chian Faculty of Engineering and Science, Universiti Tunku Abdul Rahman, Jalan Sungai Long, Bandar Sungai Long, Cheras, 43000, Kajang, Selangor, Malaysia
- Siew-Ying Mok
- Lee Kong Chian Faculty of Engineering and Science, Universiti Tunku Abdul Rahman, Jalan Sungai Long, Bandar Sungai Long, Cheras, 43000, Kajang, Selangor, Malaysia
- Danny Wee-Kiat Ng
- Lee Kong Chian Faculty of Engineering and Science, Universiti Tunku Abdul Rahman, Jalan Sungai Long, Bandar Sungai Long, Cheras, 43000, Kajang, Selangor, Malaysia
- Sing-Yau Goh
- Lee Kong Chian Faculty of Engineering and Science, Universiti Tunku Abdul Rahman, Jalan Sungai Long, Bandar Sungai Long, Cheras, 43000, Kajang, Selangor, Malaysia
44
Modeling mesoscopic cortical dynamics using a mean-field model of conductance-based networks of adaptive exponential integrate-and-fire neurons. J Comput Neurosci 2017; 44:45-61. [DOI: 10.1007/s10827-017-0668-2] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2017] [Revised: 09/19/2017] [Accepted: 10/17/2017] [Indexed: 11/26/2022]
45
Byrne Á, Brookes MJ, Coombes S. A mean field model for movement induced changes in the beta rhythm. J Comput Neurosci 2017; 43:143-158. [PMID: 28748303 PMCID: PMC5585324 DOI: 10.1007/s10827-017-0655-7] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/21/2016] [Revised: 03/27/2017] [Accepted: 05/31/2017] [Indexed: 01/29/2023]
Abstract
In electrophysiological recordings of the brain, the transition from high-amplitude to low-amplitude signals is most likely caused by a change in the synchrony of the underlying neuronal population firing patterns. Classic examples of such modulations are the strong stimulus-related oscillatory phenomena known as the movement-related beta decrease (MRBD) and post-movement beta rebound (PMBR). A sharp decrease in neural oscillatory power is observed during movement (MRBD), followed by an increase above baseline on movement cessation (PMBR). MRBD and PMBR represent important neuroscientific phenomena which have been shown to have clinical relevance. Here, we present a parsimonious model for the dynamics of synchrony within a synaptically coupled spiking network that is able to replicate a human MEG power spectrogram showing the evolution from MRBD to PMBR. Importantly, the high-dimensional spiking model has an exact mean-field description in terms of four ordinary differential equations that allows considerable insight to be obtained into the cause of the experimentally observed time lag from movement termination to the onset of PMBR (∼0.5 s), as well as the subsequent long duration of PMBR (∼1-10 s). Our model is the first to predict these commonly observed and robust phenomena and represents a key step in their understanding, in health and disease.
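The exact mean-field reduction mentioned in this abstract belongs to the "next-generation" neural-mass family. As a rough illustration, and not the paper's beta-rhythm model, the sketch below integrates a single-population reduction of this type (two ODEs for the firing rate r and mean voltage v of a heterogeneous quadratic integrate-and-fire network; the paper couples populations to obtain four ODEs). All parameter values here are illustrative assumptions.

```python
import numpy as np

def mean_field(T=40.0, dt=1e-3, tau=1.0, delta=1.0, eta_bar=-5.0, J=15.0, I=0.0):
    """Euler-integrate the two-ODE exact mean field of a QIF network:
    tau*dr/dt = delta/(pi*tau) + 2*r*v
    tau*dv/dt = v**2 + eta_bar + J*tau*r + I - (pi*tau*r)**2
    Returns the rate trace and the final mean voltage."""
    n = int(T / dt)
    r, v = 0.1, -2.0                      # arbitrary initial condition
    rs = np.empty(n)
    for i in range(n):
        dr = delta / (np.pi * tau**2) + 2.0 * r * v / tau
        dv = (v**2 + eta_bar + J * tau * r + I - (np.pi * tau * r)**2) / tau
        r += dt * dr
        v += dt * dv
        rs[i] = r
    return rs, v
```

The reduction is exact in the limit of infinitely many QIF neurons with Lorentzian-distributed excitabilities (center eta_bar, width delta); with the values assumed above the system relaxes through damped oscillations to a stable focus.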
Affiliation(s)
- Áine Byrne
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, University Park, Nottingham, NG7 2RD, UK.
- Matthew J Brookes
- Sir Peter Mansfield Imaging Centre, School of Physics and Astronomy, University of Nottingham, University Park, Nottingham, NG7 2RD, UK
- Stephen Coombes
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, University Park, Nottingham, NG7 2RD, UK
46
Jercog D, Roxin A, Barthó P, Luczak A, Compte A, de la Rocha J. UP-DOWN cortical dynamics reflect state transitions in a bistable network. eLife 2017; 6:22425. [PMID: 28826485 PMCID: PMC5582872 DOI: 10.7554/elife.22425] [Citation(s) in RCA: 85] [Impact Index Per Article: 12.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2016] [Accepted: 07/21/2017] [Indexed: 11/21/2022] Open
Abstract
In the idling brain, neuronal circuits transition between periods of sustained firing (UP state) and quiescence (DOWN state), a pattern the mechanisms of which remain unclear. Here we analyzed spontaneous cortical population activity from anesthetized rats and found that UP and DOWN durations were highly variable and that population rates showed no significant decay during UP periods. We built a network rate model with excitatory (E) and inhibitory (I) populations exhibiting a novel bistable regime between a quiescent and an inhibition-stabilized state of arbitrarily low rate. Fluctuations triggered state transitions, while adaptation in E cells paradoxically caused a marginal decay of E-rate but a marked decay of I-rate in UP periods, a prediction that we validated experimentally. A spiking network implementation further predicted that DOWN-to-UP transitions must be caused by synchronous high-amplitude events. Our findings provide evidence of bistable cortical networks that exhibit non-rhythmic state transitions when the brain rests.
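As a toy illustration of the bistability described above (not the paper's fitted model), the following sketch implements a threshold-linear E-I rate network in which a quiescent DOWN attractor coexists with an inhibition-stabilized UP attractor, and a brief pulse to the E population, standing in for the synchronous high-amplitude events the paper identifies, triggers the DOWN-to-UP transition. All parameters are assumptions chosen only to place the network in the bistable regime.

```python
import numpy as np

def f(x):
    return np.maximum(x, 0.0)            # threshold-linear transfer function

def simulate(kick_at=100.0, kick_amp=4.0, T=400.0, dt=0.1,
             tau_e=10.0, tau_i=2.0,
             w_ee=2.0, w_ei=1.0, w_ie=2.0, w_ii=0.5,
             g_e=-0.2, g_i=-2.0):
    """E-I rate model with negative baseline drives (g_e, g_i), so that
    (E, I) = (0, 0) is a stable DOWN state; a 2-time-unit pulse to E can
    switch the network to the UP fixed point. Returns the E-rate trace."""
    n = int(T / dt)
    E = I = 0.0                          # start in the quiescent DOWN state
    trace = np.empty(n)
    for step in range(n):
        t = step * dt
        pulse = kick_amp if kick_at <= t < kick_at + 2.0 else 0.0
        dE = (-E + f(w_ee * E - w_ei * I + g_e + pulse)) / tau_e
        dI = (-I + f(w_ie * E - w_ii * I + g_i)) / tau_i
        E += dt * dE
        I += dt * dI
        trace[step] = E
    return trace
```

With these assumed weights the UP state sits at (E, I) = (3.4, 3.2); since w_ee > 1, recurrent excitation alone would be unstable there, and the state is stabilized by inhibition, mirroring the inhibition-stabilized regime of the abstract.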
Affiliation(s)
- Daniel Jercog
- Institut d'Investigacions Biomèdiques August Pi i Sunyer, Barcelona, Spain
- Alex Roxin
- Centre de Recerca Matemàtica, Bellaterra, Spain
- Peter Barthó
- MTA TTK NAP B Research Group of Sleep Oscillations, Budapest, Hungary
- Artur Luczak
- Canadian Center for Behavioural Neuroscience, University of Lethbridge, Lethbridge, Canada
- Albert Compte
- Institut d'Investigacions Biomèdiques August Pi i Sunyer, Barcelona, Spain
- Jaime de la Rocha
- Institut d'Investigacions Biomèdiques August Pi i Sunyer, Barcelona, Spain
47
Sanchez-Vives MV, Massimini M, Mattia M. Shaping the Default Activity Pattern of the Cortical Network. Neuron 2017; 94:993-1001. [PMID: 28595056 DOI: 10.1016/j.neuron.2017.05.015] [Citation(s) in RCA: 85] [Impact Index Per Article: 12.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/15/2016] [Revised: 03/20/2017] [Accepted: 05/06/2017] [Indexed: 10/19/2022]
Abstract
Slow oscillations have been suggested as the default emergent activity of the cortical network. This is a low complexity state that integrates neuronal, synaptic, and connectivity properties of the cortex. Shaped by variations of physiological parameters, slow oscillations provide information about the underlying healthy or pathological network. We review how this default activity is shaped, how it acts as a powerful attractor, and how getting out of it is necessary for the brain to recover the levels of complexity associated with conscious states. We propose that slow oscillations provide a robust unifying paradigm for the study of cortical function.
Affiliation(s)
- Maria V Sanchez-Vives
- Systems Neuroscience, IDIBAPS, 08036 Barcelona, Spain; ICREA, 08010 Barcelona, Spain.
48
Vich C, Berg RW, Guillamon A, Ditlevsen S. Estimation of Synaptic Conductances in Presence of Nonlinear Effects Caused by Subthreshold Ionic Currents. Front Comput Neurosci 2017; 11:69. [PMID: 28790909 PMCID: PMC5524927 DOI: 10.3389/fncom.2017.00069] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2016] [Accepted: 07/07/2017] [Indexed: 11/13/2022] Open
Abstract
Subthreshold fluctuations in neuronal membrane potential traces contain nonlinear components, and employing nonlinear models may improve the statistical inference. We propose a new strategy to estimate synaptic conductances, which has been tested using in silico data and applied to in vivo recordings. The model is constructed to capture the nonlinearities caused by subthreshold-activated currents, and the estimation procedure can discern between excitatory and inhibitory conductances using only one membrane potential trace. More precisely, we perform second-order approximations of biophysical models to capture the subthreshold nonlinearities, resulting in quadratic integrate-and-fire models, and apply approximate maximum likelihood estimation where we only suppose that conductances are stationary in a 50-100 ms time window. The results show an improvement over existing procedures for the models tested here.
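To make the estimation idea concrete, here is a deliberately simplified sketch: a subthreshold quadratic integrate-and-fire membrane is simulated with known excitatory and inhibitory conductances, and both are then recovered from the single voltage trace under the stationarity assumption. The paper uses approximate maximum likelihood; the ordinary least-squares stand-in and all parameter values below are assumptions for illustration only.

```python
import numpy as np

C, k, v_r, v_t = 1.0, 0.01, -65.0, -45.0   # capacitance and quadratic-term parameters
E_e, E_i = 0.0, -80.0                      # synaptic reversal potentials (mV)
g_e_true, g_i_true = 0.05, 0.1             # conductances to be recovered

# Forward-simulate a subthreshold voltage trace (Euler; no spikes are reached)
dt, n = 0.1, 500
v = np.empty(n)
v[0] = -70.0
for j in range(n - 1):
    dv = (k * (v[j] - v_r) * (v[j] - v_t)
          - g_e_true * (v[j] - E_e) - g_i_true * (v[j] - E_i)) / C
    v[j + 1] = v[j] + dt * dv

# Move the known quadratic term to the left-hand side, leaving a problem
# that is linear in (g_e, g_i), and solve it by ordinary least squares.
dvdt = np.diff(v) / dt
y = C * dvdt - k * (v[:-1] - v_r) * (v[:-1] - v_t)
X = np.column_stack([-(v[:-1] - E_e), -(v[:-1] - E_i)])
g_e_hat, g_i_hat = np.linalg.lstsq(X, y, rcond=None)[0]
```

The two regressors differ only by the spread of reversal potentials, which is why a voltage trace with sufficient subthreshold variation is needed to separate excitation from inhibition from a single trace.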
Affiliation(s)
- Catalina Vich
- Departament de Matemàtiques i Informàtica, Universitat de les Illes Balears, Palma, Spain
- Rune W Berg
- Center for Neuroscience, University of Copenhagen, Copenhagen, Denmark
- Antoni Guillamon
- Departament de Matemàtiques, Universitat Politècnica de Catalunya, Barcelona, Spain
- Susanne Ditlevsen
- Department of Mathematical Sciences, University of Copenhagen, Copenhagen, Denmark
49
Noisy Juxtacellular Stimulation In Vivo Leads to Reliable Spiking and Reveals High-Frequency Coding in Single Neurons. J Neurosci 2017; 36:11120-11132. [PMID: 27798191 DOI: 10.1523/jneurosci.0787-16.2016] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2016] [Accepted: 09/09/2016] [Indexed: 01/16/2023] Open
Abstract
Single cells in the motor and somatosensory cortex of rats were stimulated in vivo with broadband fluctuating currents applied juxtacellularly. Unlike the DC current steps used previously, fluctuating stimulation currents reliably evoked spike trains with precise timing of individual spikes. Fluctuating currents resulted in strong cellular responses at stimulation frequencies beyond the inverse membrane time constant and the mean firing rate of the neuron. Neuronal firing was associated with high rates of information transmission, even for the high-frequency components of the stimulus. Such response characteristics were also revealed in additional experiments with sinusoidal juxtacellular stimulation. For selected cells, we could reproduce these statistics with compartmental models of varying complexity. We also developed a method to generate Gaussian stimuli that evoke spike trains with prescribed spike times (under the constraint of a certain rate and coefficient of variation) and exemplify its ability to achieve precise and reliable spiking in cortical neurons in vivo. Our results demonstrate a novel method for precise control of spike timing by juxtacellular stimulation, confirm and extend earlier conclusions from ex vivo work about the capacity of cortical neurons to generate precise discharges, and contribute to the understanding of the biophysics of information transfer of single neurons in vivo at high frequencies. SIGNIFICANCE STATEMENT Nanostimulation of single identified neurons in vivo can control spike frequency parametrically and, surprisingly, can even bias the animal's behavioral response. Here, we extend this stimulation protocol to time-dependent broadband noise stimulation in the sensory and motor cortices of rat. In response to such stimuli, we found increased temporal spike-time reliability. The information transmission properties reveal, both experimentally and theoretically, that the neurons support high-frequency stimulation beyond the inverse membrane time constant. Generating a stimulus using the neuron's response properties, we could evoke prescribed spike times with high precision. Our work helps to establish a novel method for precise temporal control of single-cell spiking and provides a simplified biophysical description of single-neuron spiking under time-dependent in vivo-like stimulation.
50
Kirst C, Modes CD, Magnasco MO. Shifting attention to dynamics: Self-reconfiguration of neural networks. Curr Opin Syst Biol 2017. [DOI: 10.1016/j.coisb.2017.04.006] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/04/2023]