1
Pietras B. Pulse Shape and Voltage-Dependent Synchronization in Spiking Neuron Networks. Neural Comput 2024; 36:1476-1540. PMID: 39028958. DOI: 10.1162/neco_a_01680.
Abstract
Pulse-coupled spiking neural networks are a powerful tool to gain mechanistic insights into how neurons self-organize to produce coherent collective behavior. These networks use simple spiking neuron models, such as the θ-neuron or the quadratic integrate-and-fire (QIF) neuron, that replicate the essential features of real neural dynamics. Interactions between neurons are modeled with infinitely narrow pulses, or spikes, rather than the more complex dynamics of real synapses. To make these networks biologically more plausible, it has been proposed that they must also account for the finite width of the pulses, which can have a significant impact on the network dynamics. However, the derivation and interpretation of these pulses are contradictory, and the impact of the pulse shape on the network dynamics is largely unexplored. Here, I take a comprehensive approach to pulse coupling in networks of QIF and θ-neurons. I argue that narrow pulses activate voltage-dependent synaptic conductances and show how to implement them in QIF neurons such that their effect can last through the phase after the spike. Using an exact low-dimensional description for networks of globally coupled spiking neurons, I prove for instantaneous interactions that collective oscillations emerge due to an effective coupling through the mean voltage. I analyze the impact of the pulse shape by means of a family of smooth pulse functions with arbitrary finite width and symmetric or asymmetric shapes. For symmetric pulses, the resulting voltage coupling is not very effective in synchronizing neurons, but pulses that are slightly skewed to the phase after the spike readily generate collective oscillations. The results unveil a voltage-dependent spike synchronization mechanism at the heart of emergent collective behavior, which is facilitated by pulses of finite width and complementary to traditional synaptic transmission in spiking neuron networks.
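The QIF model at the heart of this work is compact enough to sketch. Below is a minimal, illustrative simulation (not the paper's voltage-dependent conductance scheme) of globally coupled QIF neurons in which each spike delivers an instantaneous, uniform voltage kick to the whole population; all parameter values are invented for the example:

```python
import numpy as np

def simulate_qif_network(n=200, t_max=20.0, dt=1e-3, j=1.0, seed=0):
    """Globally coupled quadratic integrate-and-fire (QIF) neurons.

    Spikes act as instantaneous pulses: every spike kicks all voltages
    by j/n (a toy stand-in for delta-pulse coupling)."""
    rng = np.random.default_rng(seed)
    eta = 0.5 + 0.05 * rng.standard_cauchy(n)  # heterogeneous drive
    v = rng.uniform(-1.0, 1.0, n)
    v_peak, v_reset = 10.0, -10.0
    rate = []
    for _ in range(round(t_max / dt)):
        v += dt * (v * v + eta)                # QIF dynamics dV/dt = V^2 + eta
        spiked = v >= v_peak
        k = int(spiked.sum())
        v[spiked] = v_reset                    # reset after spike
        v += j * k / n                         # instantaneous pulse to everyone
        rate.append(k / (n * dt))
    return np.array(rate)

r = simulate_qif_network()
```

The population rate `r` is the quantity whose low-dimensional dynamics the paper analyzes exactly in the mean-field limit.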
Affiliation(s)
- Bastian Pietras
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08018, Barcelona, Spain
2
Paliwal S, Ocker GK, Brinkman BAW. Metastability in networks of nonlinear stochastic integrate-and-fire neurons. arXiv 2024; arXiv:2406.07445v1. PMID: 38947936. PMCID: PMC11213153.
Abstract
Neurons in the brain continuously process the barrage of sensory inputs they receive from the environment. A wide array of experimental work has shown that the collective activity of neural populations encodes and processes this constant bombardment of information. How these collective patterns of activity depend on single-neuron properties is often unclear. Single-neuron recordings have shown that individual neural responses to inputs are nonlinear, which prevents a straightforward extrapolation from single-neuron features to emergent collective states. In this work, we use a field-theoretic formulation of a stochastic leaky integrate-and-fire model to study the impact of nonlinear intensity functions on macroscopic network activity. We show that the interplay between nonlinear spike emission and membrane potential resets can (i) give rise to metastable transitions between active firing-rate states and (ii) enhance or suppress mean firing rates and membrane potentials in opposite directions.
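The "nonlinear intensity function" can be illustrated with a toy stochastic leaky integrate-and-fire neuron in which spikes are emitted probabilistically through an exponential hazard rather than at a hard threshold. This is only a sketch of the model class, not the field-theoretic network studied in the paper; all parameter values are illustrative:

```python
import numpy as np

def simulate_stochastic_lif(i_ext=1.2, tau=10.0, dt=0.1, t_max=2000.0,
                            theta=1.0, delta=0.2, r0=1.0, seed=1):
    """LIF neuron with stochastic spiking: spikes occur with nonlinear
    intensity r0*exp((V - theta)/delta) instead of at a hard threshold."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for step in range(round(t_max / dt)):
        v += dt * (-v + i_ext) / tau                  # leaky integration
        hazard = r0 * np.exp((v - theta) / delta)     # nonlinear intensity
        if rng.random() < 1.0 - np.exp(-hazard * dt): # spike probability in dt
            spikes.append(step * dt)
            v = 0.0                                   # membrane potential reset
    return np.array(spikes)

spk = simulate_stochastic_lif()
```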
Affiliation(s)
- Siddharth Paliwal
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, NY, 11794, USA
- Gabriel Koch Ocker
- Department of Mathematics and Statistics, Boston University, Boston, MA, 02215, USA
- Braden A W Brinkman
- Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, NY, 11794, USA
3
Beninger J, Rossbroich J, Tóth K, Naud R. Functional subtypes of synaptic dynamics in mouse and human. Cell Rep 2024; 43:113785. PMID: 38363673. DOI: 10.1016/j.celrep.2024.113785.
Abstract
Synapses preferentially respond to particular temporal patterns of activity with a large degree of heterogeneity that is informally or tacitly separated into classes. Yet, the precise number and properties of such classes are unclear. Do they exist on a continuum and, if so, when is it appropriate to divide that continuum into functional regions? In a large dataset of glutamatergic cortical connections, we perform model-based characterization to infer the number and characteristics of functionally distinct subtypes of synaptic dynamics. In rodent data, we find five clusters that partially converge with transgenic-associated subtypes. Strikingly, the application of the same clustering method in human data infers a highly similar number of clusters, supportive of stable clustering. This nuanced dictionary of functional subtypes shapes the heterogeneity of cortical synaptic dynamics and provides a lens into the basic motifs of information transmission in the brain.
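The model-based strategy can be caricatured in two steps: fit a parameter vector per synapse, then cluster the parameter vectors. The sketch below uses synthetic two-dimensional "fitted parameters" and a minimal k-means, standing in for the paper's richer synaptic model and clustering pipeline:

```python
import numpy as np

def kmeans(x, k, iters=50, seed=0):
    """Minimal k-means, used here to group fitted synaptic-dynamics
    parameters into putative functional subtypes."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        d = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)                      # assign to nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(0)
    return labels, centers

# toy "fitted parameters": two synthetic subtypes (depressing vs facilitating)
rng = np.random.default_rng(42)
depressing = rng.normal([0.8, 0.1], 0.05, (50, 2))
facilitating = rng.normal([0.2, 0.7], 0.05, (50, 2))
params = np.vstack([depressing, facilitating])
labels, centers = kmeans(params, 2)
```

In practice the number of clusters would itself be inferred (e.g. via a model-selection criterion), which is the hard part the paper addresses.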
Affiliation(s)
- John Beninger
- Center for Neural Dynamics and Artificial Intelligence, University of Ottawa, Ottawa, ON K1H 8M5, Canada; uOttawa Brain and Mind Research Institute, University of Ottawa, Ottawa, ON K1H 8M5, Canada; Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Julian Rossbroich
- Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland; Faculty of Science, University of Basel, Basel, Switzerland
- Katalin Tóth
- Center for Neural Dynamics and Artificial Intelligence, University of Ottawa, Ottawa, ON K1H 8M5, Canada; uOttawa Brain and Mind Research Institute, University of Ottawa, Ottawa, ON K1H 8M5, Canada; Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada
- Richard Naud
- Center for Neural Dynamics and Artificial Intelligence, University of Ottawa, Ottawa, ON K1H 8M5, Canada; uOttawa Brain and Mind Research Institute, University of Ottawa, Ottawa, ON K1H 8M5, Canada; Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON K1H 8M5, Canada; Department of Physics, University of Ottawa, Ottawa, ON K1H 8M5, Canada.
4
Kern FB, Chao ZC. Short-term neuronal and synaptic plasticity act in synergy for deviance detection in spiking networks. PLoS Comput Biol 2023; 19:e1011554. PMID: 37831721. PMCID: PMC10599548. DOI: 10.1371/journal.pcbi.1011554.
Abstract
Sensory areas of cortex respond more strongly to infrequent stimuli when these violate previously established regularities, a phenomenon known as deviance detection (DD). Previous modeling work has mainly attempted to explain DD on the basis of synaptic plasticity. However, a large fraction of cortical neurons also exhibit firing rate adaptation, an underexplored potential mechanism. Here, we investigate DD in a spiking neuronal network model with two types of short-term plasticity, fast synaptic short-term depression (STD) and slower threshold adaptation (TA). We probe the model with an oddball stimulation paradigm and assess DD by evaluating the network responses. We find that TA is sufficient to elicit DD. It achieves this by habituating neurons near the stimulation site that respond earliest to the frequently presented standard stimulus (local fatigue), which diminishes the response and promotes the recovery (global fatigue) of the wider network. Further, we find a synergy effect between STD and TA, where they interact with each other to achieve greater DD than the sum of their individual effects. We show that this synergy is caused by the local fatigue added by STD, which inhibits the global response to the frequently presented stimulus, allowing greater recovery of TA-mediated global fatigue and making the network more responsive to the deviant stimulus. Finally, we show that the magnitude of DD strongly depends on the timescale of stimulation. We conclude that highly predictable information can be encoded in strong local fatigue, which allows greater global recovery and subsequent heightened sensitivity for DD.
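The interplay of the two mechanisms can be illustrated with a single-unit toy model combining a synaptic resource variable (STD) and a slowly decaying threshold (TA); both the equations and the parameter values below are simplified stand-ins for the network model in the paper:

```python
import numpy as np

def repeated_stimulus_response(n_stim=10, isi=100.0, dt=1.0,
                               tau_rec=400.0, u=0.5,
                               tau_theta=1000.0, d_theta=0.3):
    """Response of a unit with short-term depression (resource x) and
    threshold adaptation (theta) to a repeated 'standard' stimulus."""
    x, theta = 1.0, 1.0
    responses, t_next = [], 0.0
    for step in range(round(n_stim * isi / dt)):
        t = step * dt
        x += dt * (1.0 - x) / tau_rec             # fast resource recovery
        theta += dt * (1.0 - theta) / tau_theta   # slow threshold relaxation
        if t >= t_next:
            responses.append(u * x / theta)       # response: release / excitability
            x -= u * x                            # depression consumes resources
            theta += d_theta                      # adaptation raises threshold
            t_next += isi
    return np.array(responses)

resp = repeated_stimulus_response()
```

The response decrement to the repeated standard, and its recovery for a rarely presented deviant, is the core of the deviance-detection effect studied in the paper.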
Affiliation(s)
- Felix Benjamin Kern
- International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan
- Zenas C. Chao
- International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan
5
Arribas DM, Marin-Burgin A, Morelli LG. Adult-born granule cells improve stimulus encoding and discrimination in the dentate gyrus. eLife 2023; 12:e80250. PMID: 37584478. PMCID: PMC10476965. DOI: 10.7554/elife.80250.
Abstract
Heterogeneity plays an important role in diversifying neural responses to support brain function. Adult neurogenesis provides the dentate gyrus with a heterogeneous population of granule cells (GCs) that were born and developed their properties at different times. Immature GCs have intrinsic and synaptic properties distinct from those of mature GCs and are needed for correct encoding and discrimination in spatial tasks. How immature GCs enhance the encoding of information to support these functions is not well understood. Here, we record the responses of GCs of different ages in mouse hippocampal slices to fluctuating current injections to study how they encode stimuli. Immature GCs produce unreliable responses compared to mature GCs, exhibiting imprecise spike timings across repeated stimulation. We use a statistical model to describe the stimulus-response transformation performed by GCs of different ages. We fit this model to the data and obtain parameters that capture GCs' encoding properties. Parameter values from this fit reflect the maturational differences of the population and indicate that immature GCs perform a differential encoding of stimuli. To study how this age heterogeneity influences encoding by a population, we perform stimulus decoding using populations that contain GCs of different ages. We find that, despite their individual unreliability, immature GCs enhance the fidelity of the signal encoded by the population and improve the discrimination of similar time-dependent stimuli. Thus, the observed heterogeneity confers the population with enhanced encoding capabilities.
Affiliation(s)
- Diego M Arribas
- Instituto de Investigacion en Biomedicina de Buenos Aires (IBioBA) – CONICET/Partner Institute of the Max Planck Society, Polo Cientifico Tecnologico, Buenos Aires, Argentina
- Antonia Marin-Burgin
- Instituto de Investigacion en Biomedicina de Buenos Aires (IBioBA) – CONICET/Partner Institute of the Max Planck Society, Polo Cientifico Tecnologico, Buenos Aires, Argentina
- Luis G Morelli
- Instituto de Investigacion en Biomedicina de Buenos Aires (IBioBA) – CONICET/Partner Institute of the Max Planck Society, Polo Cientifico Tecnologico, Buenos Aires, Argentina
- Departamento de Fisica, FCEyN UBA, Ciudad Universitaria, Buenos Aires, Argentina
- Max Planck Institute for Molecular Physiology, Department of Systemic Cell Biology, Dortmund, Germany
6
Chialva U, González Boscá V, Rotstein HG. Low-dimensional models of single neurons: a review. Biol Cybern 2023; 117:163-183. PMID: 37060453. DOI: 10.1007/s00422-023-00960-1.
Abstract
The classical Hodgkin-Huxley (HH) point-neuron model of action potential generation is four-dimensional. It consists of four ordinary differential equations describing the dynamics of the membrane potential and three gating variables associated with a transient sodium current and a delayed-rectifier potassium current. Conductance-based models of HH type are higher-dimensional extensions of the classical HH model. They include a number of supplementary state variables associated with other ionic current types, and are able to describe additional phenomena such as subthreshold oscillations, mixed-mode oscillations (subthreshold oscillations interspersed with spikes), clustering, and bursting. In this manuscript we discuss biophysically plausible and phenomenological reduced models that preserve the biophysical and/or dynamic description of models of HH type and the ability to produce complex phenomena, while requiring a lower number of effective dimensions (state variables). We describe several representative models, as well as systematic and heuristic methods for deriving reduced models from models of HH type.
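A canonical example of such a reduction is the two-dimensional Izhikevich model, which replaces the four HH equations with one quadratic voltage equation and one slow recovery variable while retaining spiking (and, for other parameter choices, bursting). A minimal implementation with the standard regular-spiking parameters:

```python
def izhikevich(i_ext=10.0, t_max=500.0, dt=0.1):
    """Two-dimensional reduced spiking model (Izhikevich 2003 form):
    a quadratic voltage equation plus one recovery variable u."""
    a, b, c, d = 0.02, 0.2, -65.0, 8.0           # regular-spiking parameters
    v, u = -65.0, b * -65.0
    spikes = []
    for step in range(round(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)                # slow recovery variable
        if v >= 30.0:                            # spike cutoff and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spikes = izhikevich()
```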
Affiliation(s)
- Ulises Chialva
- Departamento de Matemática, Universidad Nacional del Sur and CONICET, Bahía Blanca, Buenos Aires, Argentina
- Horacio G Rotstein
- Federated Department of Biological Sciences, New Jersey Institute of Technology and Rutgers University, Newark, New Jersey, USA.
- Behavioral Neurosciences Program, Rutgers University, Newark, NJ, USA.
- Corresponding Investigators Group, CONICET, Buenos Aires, Argentina.
7
Harkin EF, Lynn MB, Payeur A, Boucher JF, Caya-Bissonnette L, Cyr D, Stewart C, Longtin A, Naud R, Béïque JC. Temporal derivative computation in the dorsal raphe network revealed by an experimentally driven augmented integrate-and-fire modeling framework. eLife 2023; 12:72951. PMID: 36655738. PMCID: PMC9977298. DOI: 10.7554/elife.72951.
Abstract
By means of an expansive innervation, the serotonin (5-HT) neurons of the dorsal raphe nucleus (DRN) are positioned to enact coordinated modulation of circuits distributed across the entire brain in order to adaptively regulate behavior. Yet the network computations that emerge from the excitability and connectivity features of the DRN are still poorly understood. To gain insight into these computations, we began by carrying out a detailed electrophysiological characterization of genetically identified mouse 5-HT and somatostatin (SOM) neurons. We next developed a single-neuron modeling framework that combines the realism of Hodgkin-Huxley models with the simplicity and predictive power of generalized integrate-and-fire models. We found that feedforward inhibition of 5-HT neurons by heterogeneous SOM neurons implemented divisive inhibition, while endocannabinoid-mediated modulation of excitatory drive to the DRN increased the gain of 5-HT output. Our most striking finding was that the output of the DRN encodes a mixture of the intensity and temporal derivative of its input, and that the temporal derivative component dominates this mixture precisely when the input is increasing rapidly. This network computation primarily emerged from prominent adaptation mechanisms found in 5-HT neurons, including a previously undescribed dynamic threshold. By applying a bottom-up neural network modeling approach, our results suggest that the DRN is particularly apt to encode input changes over short timescales, reflecting one of the salient emerging computations that dominate its output to regulate behavior.
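The qualitative effect of adaptation producing a temporal-derivative component can be caricatured with a one-variable rate model: subtracting a slowly filtered copy of the input makes the output transiently emphasize input changes. This is a schematic illustration only, not the paper's augmented integrate-and-fire model:

```python
import numpy as np

def adapting_rate_response(stimulus, dt=1.0, tau_a=200.0, g=0.8):
    """Rate-model caricature: a slow adaptation variable subtracts a
    filtered copy of the input, so the output mixes input intensity
    with a high-pass (derivative-like) component."""
    a = stimulus[0] * g                     # start adapted to the baseline
    out = np.empty_like(stimulus)
    for i, s in enumerate(stimulus):
        a += dt * (g * s - a) / tau_a       # slow adaptation tracks the input
        out[i] = max(s - a, 0.0)            # output emphasizes recent changes
    return out

t = np.arange(0, 1000.0, 1.0)
step_input = np.where(t > 300.0, 1.0, 0.2)
r = adapting_rate_response(step_input)
```

The response peaks right after the step (when the input is changing) and then relaxes, mirroring the derivative-dominated regime described in the abstract.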
Affiliation(s)
- Emerson F Harkin
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Michael B Lynn
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Alexandre Payeur
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Department of Physics, University of Ottawa, Ottawa, Canada
- Jean-François Boucher
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Léa Caya-Bissonnette
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Dominic Cyr
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Chloe Stewart
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- André Longtin
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Department of Physics, University of Ottawa, Ottawa, Canada
- Richard Naud
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Department of Physics, University of Ottawa, Ottawa, Canada
- Jean-Claude Béïque
- Brain and Mind Research Institute, Centre for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
8
A survey on dendritic neuron model: Mechanisms, algorithms and practical applications. Neurocomputing 2022. DOI: 10.1016/j.neucom.2021.08.153.
9
A User’s Guide to Generalized Integrate-and-Fire Models. Adv Exp Med Biol 2022; 1359:69-86. DOI: 10.1007/978-3-030-89439-9_3.
10
Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. Biol Cybern 2021; 115:539-562. PMID: 34668051. PMCID: PMC8551127. DOI: 10.1007/s00422-021-00899-1.
Abstract
Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate ("escape noise"). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: For exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike interval densities as well as the population activities obtained from simulations with colored input noise and time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced across the sub- and suprathreshold firing regime compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime.
Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise, enabling progress in the theory of neuronal population dynamics.
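The two noise models being mapped onto each other can be set side by side in simulation: a LIF driven by stochastic input current with a hard threshold, versus a deterministic LIF with an exponential escape-noise hazard. The sketch below uses illustrative parameters and makes no attempt at the paper's actual mapping; it simply generates interspike intervals from each model:

```python
import numpy as np

def lif_input_noise(mu=0.8, sigma=0.3, tau=10.0, dt=0.1, n_spikes=100, seed=3):
    """LIF with noisy input current and a hard threshold at 1."""
    rng = np.random.default_rng(seed)
    v, t, last, isis = 0.0, 0.0, 0.0, []
    while len(isis) < n_spikes:
        v += dt * (mu - v) / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        t += dt
        if v >= 1.0:                               # hard threshold crossing
            isis.append(t - last)
            last, v = t, 0.0
    return np.array(isis)

def lif_escape_noise(mu=0.8, tau=10.0, dt=0.1, delta=0.05, r0=0.5,
                     n_spikes=100, seed=4):
    """Deterministic LIF with stochastic spike emission (escape noise)."""
    rng = np.random.default_rng(seed)
    v, t, last, isis = 0.0, 0.0, 0.0, []
    while len(isis) < n_spikes:
        v += dt * (mu - v) / tau                   # noiseless membrane
        t += dt
        hazard = r0 * np.exp((v - 1.0) / delta)    # voltage-dependent hazard
        if rng.random() < 1.0 - np.exp(-hazard * dt):
            isis.append(t - last)
            last, v = t, 0.0
    return np.array(isis)

isi_in, isi_esc = lif_input_noise(), lif_escape_noise()
```

The paper's contribution is the analytical recipe for choosing the hazard so that the second model reproduces the interval statistics of the first.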
Affiliation(s)
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623, Berlin, Germany.
- Bernstein Center for Computational Neuroscience Berlin, 10115, Berlin, Germany.
11
Salaj D, Subramoney A, Kraisnikovic C, Bellec G, Legenstein R, Maass W. Spike frequency adaptation supports network computations on temporally dispersed information. eLife 2021; 10:e65459. PMID: 34310281. PMCID: PMC8313230. DOI: 10.7554/elife.65459.
Abstract
For solving tasks such as recognizing a song, answering a question, or inverting a sequence of symbols, cortical microcircuits need to integrate and manipulate information that was dispersed over time during the preceding seconds. Creating biologically realistic models for the underlying computations, especially with spiking neurons and for behaviorally relevant integration time spans, is notoriously difficult. We examine the role of spike frequency adaptation in such computations and find that it has a surprisingly large impact. The inclusion of this well-known property of a substantial fraction of neurons in the neocortex - especially in higher areas of the human neocortex - moves the performance of spiking neural network models for computations on network inputs that are temporally dispersed from a fairly low level up to the performance level of the human brain.
Affiliation(s)
- Darjan Salaj
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Anand Subramoney
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Ceca Kraisnikovic
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Guillaume Bellec
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Laboratory of Computational Neuroscience, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Robert Legenstein
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
- Wolfgang Maass
- Institute of Theoretical Computer Science, Graz University of Technology, Graz, Austria
12
An adaptive threshold neuron for recurrent spiking neural networks with nanodevice hardware implementation. Nat Commun 2021; 12:4234. PMID: 34244491. PMCID: PMC8270926. DOI: 10.1038/s41467-021-24427-8.
Abstract
We propose a Double EXponential Adaptive Threshold (DEXAT) neuron model that improves the performance of neuromorphic Recurrent Spiking Neural Networks (RSNNs) by providing faster convergence, higher accuracy, and a flexible long short-term memory. We present a hardware-efficient methodology to realize the DEXAT neurons using tightly coupled circuit-device interactions and experimentally demonstrate the DEXAT neuron block using oxide-based non-filamentary resistive switching devices. Using experimentally extracted parameters we simulate a full RSNN that achieves a classification accuracy of 96.1% on the SMNIST dataset and 91% on the Google Speech Commands (GSC) dataset. We also demonstrate full end-to-end real-time inference for speech recognition using real fabricated resistive memory circuit based DEXAT neurons. Finally, we investigate the impact of nanodevice variability and endurance, illustrating the robustness of DEXAT-based RSNNs.
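The double-exponential threshold idea can be sketched as a LIF whose firing threshold is the sum of a baseline and two spike-triggered components with a fast and a slow decay, giving both quick adaptation and a long memory trace. Parameter values below are illustrative, not those used in the paper or its hardware:

```python
import numpy as np

def dexat_style_neuron(i_ext=1.5, dt=1.0, t_max=2000.0, tau_m=20.0,
                       b0=1.0, tau1=30.0, tau2=300.0, beta1=0.2, beta2=0.2):
    """LIF with a DEXAT-style double-exponential adaptive threshold."""
    v, b1, b2 = 0.0, 0.0, 0.0
    spikes, thresholds = [], []
    for step in range(round(t_max / dt)):
        v += dt * (-v + i_ext) / tau_m
        b1 -= dt * b1 / tau1                   # fast threshold component
        b2 -= dt * b2 / tau2                   # slow threshold component
        thr = b0 + beta1 * b1 + beta2 * b2
        thresholds.append(thr)
        if v >= thr:                           # spike, then raise both components
            spikes.append(step * dt)
            v, b1, b2 = 0.0, b1 + 1.0, b2 + 1.0
    return np.array(spikes), np.array(thresholds)

spikes, thr = dexat_style_neuron()
```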
13
A convolutional neural-network framework for modelling auditory sensory cells and synapses. Commun Biol 2021; 4:827. PMID: 34211095. PMCID: PMC8249591. DOI: 10.1038/s42003-021-02341-5.
Abstract
In classical computational neuroscience, analytical model descriptions are derived from neuronal recordings to mimic the underlying biological system. These neuronal models are typically slow to compute and cannot be integrated within large-scale neuronal simulation frameworks. We present a hybrid machine-learning and computational-neuroscience approach that transforms analytical models of sensory neurons and synapses into deep-neural-network (DNN) neuronal units with the same biophysical properties. Our DNN-model architecture comprises parallel and differentiable equations that can be used for backpropagation in neuro-engineering applications, and offers a simulation run-time improvement factor of 70 and 280 on CPU or GPU systems, respectively. We focused our development on auditory neurons and synapses, and show that our DNN-model architecture can be extended to a variety of existing analytical models. We describe how our approach for auditory models can be applied to other neuron and synapse types to help accelerate the development of large-scale brain networks and DNN-based treatments of the pathological system.
14
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2021; 102:022407. PMID: 32942450. DOI: 10.1103/physreve.102.022407.
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
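The spike-synchronization transients that such firing-rate models must capture can be reproduced with the simplest renewal model named in the abstract, Poisson neurons with absolute refractoriness: a step increase in the hazard transiently aligns refractory periods across the population, producing damped oscillations in the population rate. An illustrative simulation with invented parameters:

```python
import numpy as np

def refractory_poisson_population(n=20000, dt=0.5, t_max=300.0, t_ref=20.0,
                                  rate_lo=0.01, rate_hi=0.08,
                                  t_step=100.0, seed=7):
    """Population of Poisson neurons with absolute refractoriness
    (a renewal model) responding to a step increase in hazard."""
    rng = np.random.default_rng(seed)
    last_spike = -rng.exponential(100.0, n)     # desynchronized initial state
    activity = []
    for step in range(round(t_max / dt)):
        t = step * dt
        lam = rate_hi if t >= t_step else rate_lo
        can_fire = (t - last_spike) >= t_ref    # outside refractory period
        fired = can_fire & (rng.random(n) < lam * dt)
        last_spike[fired] = t
        activity.append(fired.sum() / (n * dt))
    return np.array(activity)

a = refractory_poisson_population()
```

The overshoot and ringing of `a` after the step are exactly the nonstationary features that standard (Markovian) rate models miss and the eigenmode expansion is designed to capture.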
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
15
Marín M, Cruz NC, Ortigosa EM, Sáez-Lara MJ, Garrido JA, Carrillo RR. On the Use of a Multimodal Optimizer for Fitting Neuron Models. Application to the Cerebellar Granule Cell. Front Neuroinform 2021; 15:663797. PMID: 34149387. PMCID: PMC8209370. DOI: 10.3389/fninf.2021.663797.
Abstract
This article extends a recent methodological workflow for creating realistic and computationally efficient neuron models whilst capturing essential aspects of single-neuron dynamics. We overcome the intrinsic limitations of the extant optimization methods by proposing an alternative optimization component based on multimodal algorithms. This approach can natively explore a diverse population of neuron model configurations. In contrast to methods that focus on a single global optimum, the multimodal method directly yields a set of promising solutions for a single but complex multi-feature objective function. The final sparse population of candidate solutions must then be analyzed and evaluated by the expert according to their biological plausibility and their fit to the target features. To illustrate the value of this approach, we base our proposal on the optimization of cerebellar granule cell (GrC) models that replicate the essential properties of the biological cell. Our results show the variability of plausible parameter sets that this type of neuron can adopt while still producing its complex spiking characteristics. Moreover, the set of selected cerebellar GrC models captured spiking dynamics closer to the reference model than the single model obtained with the off-the-shelf parameter optimization algorithms used in our previous article. The method proposed here represents a valuable strategy for adjusting a varied population of realistic and simplified neuron models. It can be applied to other kinds of neuron models and biological contexts.
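The essential difference from single-optimum fitting can be illustrated with a deliberately simple stand-in for a multimodal optimizer: many random restarts of a crude stochastic descent, followed by deduplication of the minima found. On an objective with two equally good minima, this keeps both candidate solutions rather than collapsing to one:

```python
import numpy as np

def multistart_minimize(f, bounds, n_starts=40, iters=200, step=0.1, seed=0):
    """Toy multimodal optimization: random-restart stochastic descent,
    then deduplication of the distinct minima found."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    minima = []
    for _ in range(n_starts):
        x = rng.uniform(lo, hi)
        for _ in range(iters):                  # accept only improving moves
            cand = np.clip(x + rng.normal(0.0, step), lo, hi)
            if f(cand) < f(x):
                x = cand
        if not any(abs(x - m) < 0.2 for m in minima):
            minima.append(x)                    # keep only distinct solutions
    return sorted(minima)

# toy objective with two equally good minima (at x = -1 and x = +1)
f = lambda x: (x ** 2 - 1.0) ** 2
sols = multistart_minimize(f, (-2.0, 2.0))
```

In the article's actual pipeline the objective is a multi-feature fit of granule-cell electrophysiology and the optimizer is a proper multimodal (niching) algorithm; the sketch only conveys why retaining several optima is useful.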
Affiliation(s)
- Milagros Marín: Department of Biochemistry and Molecular Biology I, University of Granada, Granada, Spain
- Nicolás C Cruz: Department of Informatics, University of Almería, ceiA3, Almería, Spain
- Eva M Ortigosa: Department of Computer Architecture and Technology-CITIC, University of Granada, Granada, Spain
- María J Sáez-Lara: Department of Biochemistry and Molecular Biology I, University of Granada, Granada, Spain
- Jesús A Garrido: Department of Computer Architecture and Technology-CITIC, University of Granada, Granada, Spain
- Richard R Carrillo: Department of Computer Architecture and Technology-CITIC, University of Granada, Granada, Spain
16
Rossbroich J, Trotter D, Beninger J, Tóth K, Naud R. Linear-nonlinear cascades capture synaptic dynamics. PLoS Comput Biol 2021; 17:e1008013. [PMID: 33720935] [PMCID: PMC7993773] [DOI: 10.1371/journal.pcbi.1008013]
Abstract
Short-term synaptic dynamics differ markedly across connections and strongly regulate how action potentials communicate information. To model the range of synaptic dynamics observed in experiments, we have developed a flexible mathematical framework based on a linear-nonlinear operation. This model can capture various experimentally observed features of synaptic dynamics and different types of heteroskedasticity. Despite its conceptual simplicity, we show that it is more adaptable than previous models. Combined with a standard maximum likelihood approach, synaptic dynamics can be accurately and efficiently characterized using naturalistic stimulation patterns. These results make explicit that synaptic processing bears algorithmic similarities with information processing in convolutional neural networks.
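The linear-nonlinear operation described in this abstract can be sketched in a few lines: past spikes are filtered by a kernel, and a sigmoid maps the filtered drive to a release efficacy. The exponential kernel and all parameter values below are illustrative assumptions, not the fitted quantities from the paper.

```python
import math

def srp_efficacy(spike_times, b=-0.5, a=-1.5, tau=0.1):
    """Linear-nonlinear sketch of short-term plasticity: past spikes are
    filtered with an exponential kernel (amplitude a, time constant tau),
    and the filtered drive plus a baseline b passes through a sigmoid to
    give the release efficacy at each spike. a < 0 yields depression,
    a > 0 facilitation. All parameter values are illustrative."""
    eff = []
    for i, t in enumerate(spike_times):
        drive = b + sum(a * math.exp(-(t - s) / tau) for s in spike_times[:i])
        eff.append(1.0 / (1.0 + math.exp(-drive)))
    return eff

# a 50 Hz triplet: efficacy falls with a depressing kernel, rises with a
# facilitating one
depressing = srp_efficacy([0.0, 0.02, 0.04], a=-1.5)
facilitating = srp_efficacy([0.0, 0.02, 0.04], a=+1.5)
```

Changing only the sign of the kernel amplitude switches the synapse between depression and facilitation, which is the adaptability the cascade formulation is designed to capture.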
Affiliation(s)
- Julian Rossbroich: Friedrich Miescher Institute for Biomedical Research, Basel, Switzerland
- Daniel Trotter: Department of Physics, University of Ottawa, Ottawa, ON, Canada
- John Beninger: uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON, Canada
- Katalin Tóth: uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON, Canada
- Richard Naud: Department of Physics, University of Ottawa, Ottawa, ON, Canada; uOttawa Brain Mind Institute, Center for Neural Dynamics, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, ON, Canada
17
Fehrman C, Robbins TD, Meliza CD. Nonlinear effects of intrinsic dynamics on temporal encoding in a model of avian auditory cortex. PLoS Comput Biol 2021; 17:e1008768. [PMID: 33617539] [PMCID: PMC7932506] [DOI: 10.1371/journal.pcbi.1008768]
Abstract
Neurons exhibit diverse intrinsic dynamics, which govern how they integrate synaptic inputs to produce spikes. Intrinsic dynamics are often plastic during development and learning, but the effects of these changes on stimulus-encoding properties are not well known. To examine this relationship, we simulated auditory responses to zebra finch song using a linear-dynamical cascade model, which combines a linear spectrotemporal receptive field with a dynamical, conductance-based neuron model, then used generalized linear models to estimate encoding properties from the resulting spike trains. We focused on the effects of a low-threshold potassium current (KLT) that is present in a subset of cells in the zebra finch caudal mesopallium and is affected by early auditory experience. We found that KLT affects both spike adaptation and the temporal filtering properties of the receptive field. The direction of the effects depended on the temporal modulation tuning of the linear (input) stage of the cascade model, indicating a strongly nonlinear relationship. These results suggest that small changes in intrinsic dynamics, in tandem with differences in synaptic connectivity, can have dramatic effects on the tuning of auditory neurons.

Experience-dependent developmental plasticity involves changes not only to synaptic connections, but to voltage-gated currents as well. Using biophysical models, it is straightforward to predict the effects of this intrinsic plasticity on the firing patterns of individual neurons, but it remains difficult to understand the consequences for sensory coding. We investigated this in the context of the zebra finch auditory cortex, where early exposure to a complex acoustic environment causes increased expression of a low-threshold potassium current. We simulated responses to song using a detailed biophysical model and then characterized encoding properties using generalized linear models. This analysis revealed that this potassium current has strong, nonlinear effects on how the model encodes the song's temporal structure, and that the sign of these effects depends on the temporal tuning of the synaptic inputs. This nonlinearity gives intrinsic plasticity broad scope as a mechanism for developmental learning in the auditory system.
Affiliation(s)
- Christof Fehrman: Psychology Department, University of Virginia, Charlottesville, Virginia, United States of America
- Tyler D. Robbins: Cognitive Science Program, University of Virginia, Charlottesville, Virginia, United States of America
- C. Daniel Meliza: Psychology Department, University of Virginia, Charlottesville, Virginia, United States of America; Neuroscience Graduate Program, University of Virginia, Charlottesville, Virginia, United States of America
18
Wybo WA, Jordan J, Ellenberger B, Marti Mengual U, Nevian T, Senn W. Data-driven reduction of dendritic morphologies with preserved dendro-somatic responses. eLife 2021; 10:60936. [PMID: 33494860] [PMCID: PMC7837682] [DOI: 10.7554/elife.60936]
Abstract
Dendrites shape information flow in neurons. Yet, there is little consensus on the level of spatial complexity at which they operate. Through carefully chosen parameter fits, solvable in the least-squares sense, we obtain accurate reduced compartmental models at any level of complexity. We show that (back-propagating) action potentials, Ca2+ spikes, and N-methyl-D-aspartate spikes can all be reproduced with few compartments. We also investigate whether afferent spatial connectivity motifs admit simplification by ablating targeted branches and grouping affected synapses onto the next proximal dendrite. We find that voltage in the remaining branches is reproduced if temporal conductance fluctuations stay below a limit that depends on the average difference in input resistance between the ablated branches and the next proximal dendrite. Furthermore, our methodology fits reduced models directly from experimental data, without requiring morphological reconstructions. We provide software that automates the simplification, eliminating a common hurdle toward including dendritic computations in network models.
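The core idea of fitting reduced compartmental parameters in the least-squares sense can be illustrated with a toy single-compartment (passive RC) example. The step-response form, the grid over the time constant, and the closed-form solve for the resistance are a simplified sketch under assumed parameter values, not the authors' algorithm.

```python
import math

def rc_step(R, tau, I, ts):
    """Voltage step response of a passive RC compartment to current I."""
    return [R * I * (1.0 - math.exp(-t / tau)) for t in ts]

# "data": response of a ground-truth compartment (noise-free for clarity)
ts = [0.001 * k for k in range(1, 101)]          # 1..100 ms
data = rc_step(R=100.0, tau=0.02, I=0.1, ts=ts)  # assumed ground truth

def fit_rc(data, I, ts, tau_grid):
    """Grid over tau; for each tau the best R is a one-parameter
    least-squares problem with a closed-form solution."""
    best = (None, None, float("inf"))
    for tau in tau_grid:
        basis = [I * (1.0 - math.exp(-t / tau)) for t in ts]
        R = sum(b * d for b, d in zip(basis, data)) / sum(b * b for b in basis)
        err = sum((R * b - d) ** 2 for b, d in zip(basis, data))
        if err < best[2]:
            best = (R, tau, err)
    return best[0], best[1]

R_hat, tau_hat = fit_rc(data, I=0.1, ts=ts,
                        tau_grid=[0.005 * k for k in range(1, 21)])
```

The full reduction solves analogous least-squares problems for many coupled compartments at once, but the structure (linear solve inside a search over nonlinear parameters) is the same.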
Affiliation(s)
- Willem AM Wybo: Department of Physiology, University of Bern, Bern, Switzerland
- Jakob Jordan: Department of Physiology, University of Bern, Bern, Switzerland
- Thomas Nevian: Department of Physiology, University of Bern, Bern, Switzerland
- Walter Senn: Department of Physiology, University of Bern, Bern, Switzerland
19
Abstract
Neuromorphic devices and systems have attracted attention as next-generation computing due to their high efficiency in processing complex data. So far, they have been demonstrated using both machine-learning software and complementary metal-oxide-semiconductor-based hardware. However, these approaches have drawbacks in power consumption and learning speed. An energy-efficient neuromorphic computing system requires hardware that can mimic the functions of a brain. Therefore, various materials have been introduced for the development of neuromorphic devices. Here, recent advances in neuromorphic devices are reviewed. First, the functions of biological synapses and neurons are discussed. Also, deep neural networks and spiking neural networks are described. Then, the operation mechanism and the neuromorphic functions of emerging devices are reviewed. Finally, the challenges and prospects for developing neuromorphic devices that use emerging materials are discussed.
Affiliation(s)
- Min-Kyu Kim: Department of Materials Science and Engineering, Pohang University of Science and Technology (POSTECH), Pohang 37673, Republic of Korea
- Youngjun Park: Department of Materials Science and Engineering, Pohang University of Science and Technology (POSTECH), Pohang 37673, Republic of Korea
- Ik-Jyae Kim: Department of Materials Science and Engineering, Pohang University of Science and Technology (POSTECH), Pohang 37673, Republic of Korea
- Jang-Sik Lee: Department of Materials Science and Engineering, Pohang University of Science and Technology (POSTECH), Pohang 37673, Republic of Korea
20
Interaction of neuronal and network mechanisms on firing propagation in a feedforward network. Neurocomputing 2020. [DOI: 10.1016/j.neucom.2020.05.088]
21
Gonçalves PJ, Lueckmann JM, Deistler M, Nonnenmacher M, Öcal K, Bassetto G, Chintaluri C, Podlaski WF, Haddad SA, Vogels TP, Greenberg DS, Macke JH. Training deep neural density estimators to identify mechanistic models of neural dynamics. eLife 2020; 9:e56261. [PMID: 32940606] [PMCID: PMC7581433] [DOI: 10.7554/elife.56261]
Abstract
Mechanistic modeling in neuroscience aims to explain observed phenomena in terms of underlying causes. However, determining which model parameters agree with complex and stochastic neural data presents a significant challenge. We address this challenge with a machine learning tool that uses deep neural density estimators, trained using model simulations, to carry out Bayesian inference and retrieve the full space of parameters compatible with raw data or selected data features. Our method is scalable in parameters and data features and can rapidly analyze new data after initial training. We demonstrate the power and flexibility of our approach on receptive fields, ion channels, and Hodgkin-Huxley models. We also characterize the space of circuit configurations giving rise to rhythmic activity in the crustacean stomatogastric ganglion, and use these results to derive hypotheses for underlying compensation mechanisms. Our approach will help close the gap between data-driven and theory-driven models of neural dynamics.
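The simulate-then-infer logic behind this approach can be illustrated with a minimal rejection-ABC loop standing in for the trained neural density estimator; the toy simulator, prior range, and tolerance below are all invented for illustration and bear no relation to the paper's models.

```python
import random

def simulate_rate(theta, n=100, rng=random):
    """Toy mechanistic model: a neuron fires with exponential ISIs at
    rate theta (Hz); return the empirical firing rate over n intervals."""
    isis = [rng.expovariate(theta) for _ in range(n)]
    return n / sum(isis)

def rejection_abc(observed_rate, prior=(1.0, 30.0), n_draws=5000, eps=0.5):
    """Keep parameter draws whose simulated summary lands near the data.
    Neural density estimators replace this inefficient accept/reject
    step with a network trained on (theta, summary) pairs."""
    rng = random.Random(0)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(*prior)
        if abs(simulate_rate(theta, rng=rng) - observed_rate) < eps:
            accepted.append(theta)
    return accepted

posterior = rejection_abc(observed_rate=10.0)
post_mean = sum(posterior) / len(posterior)   # concentrates near 10 Hz
```

The accepted draws approximate the full space of parameters compatible with the data, the same object the density estimator returns, but at a far higher simulation cost per posterior sample.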
Affiliation(s)
- Pedro J Gonçalves: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Jan-Matthis Lueckmann: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Michael Deistler: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Machine Learning in Science, Excellence Cluster Machine Learning, Tübingen University, Tübingen, Germany
- Marcel Nonnenmacher: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany; Model-Driven Machine Learning, Institute of Coastal Research, Helmholtz Centre Geesthacht, Geesthacht, Germany
- Kaan Öcal: Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany; Mathematical Institute, University of Bonn, Bonn, Germany
- Giacomo Bassetto: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Chaitanya Chintaluri: Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom; Institute of Science and Technology Austria, Klosterneuburg, Austria
- William F Podlaski: Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom
- Sara A Haddad: Max Planck Institute for Brain Research, Frankfurt, Germany
- Tim P Vogels: Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom; Institute of Science and Technology Austria, Klosterneuburg, Austria
- David S Greenberg: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Model-Driven Machine Learning, Institute of Coastal Research, Helmholtz Centre Geesthacht, Geesthacht, Germany
- Jakob H Macke: Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany; Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany; Machine Learning in Science, Excellence Cluster Machine Learning, Tübingen University, Tübingen, Germany; Max Planck Institute for Intelligent Systems, Tübingen, Germany
22
Staiger JF, Petersen CCH. Neuronal Circuits in Barrel Cortex for Whisker Sensory Perception. Physiol Rev 2020; 101:353-415. [PMID: 32816652] [DOI: 10.1152/physrev.00019.2019]
Abstract
The array of whiskers on the snout provides rodents with tactile sensory information relating to the size, shape and texture of objects in their immediate environment. Rodents can use their whiskers to detect stimuli, distinguish textures, locate objects and navigate. Important aspects of whisker sensation are thought to result from neuronal computations in the whisker somatosensory cortex (wS1). Each whisker is individually represented in the somatotopic map of wS1 by an anatomical unit named a 'barrel' (hence also called barrel cortex). This allows precise investigation of sensory processing in the context of a well-defined map. Here, we first review the signaling pathways from the whiskers to wS1, and then discuss current understanding of the various types of excitatory and inhibitory neurons present within wS1. Different classes of cells can be defined according to anatomical, electrophysiological and molecular features. The synaptic connectivity of neurons within local wS1 microcircuits, as well as their long-range interactions and the impact of neuromodulators, are beginning to be understood. Recent technological progress has allowed cell-type-specific connectivity to be related to cell-type-specific activity during whisker-related behaviors. An important goal for future research is to obtain a causal and mechanistic understanding of how selected aspects of tactile sensory information are processed by specific types of neurons in the synaptically connected neuronal networks of wS1 and signaled to downstream brain areas, thus contributing to sensory-guided decision-making.
Affiliation(s)
- Jochen F Staiger: University Medical Center Göttingen, Institute for Neuroanatomy, Göttingen, Germany; Laboratory of Sensory Processing, Faculty of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Carl C H Petersen: University Medical Center Göttingen, Institute for Neuroanatomy, Göttingen, Germany; Laboratory of Sensory Processing, Faculty of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
23
Cremonesi F, Schürmann F. Understanding Computational Costs of Cellular-Level Brain Tissue Simulations Through Analytical Performance Models. Neuroinformatics 2020; 18:407-428. [PMID: 32056104] [PMCID: PMC7338826] [DOI: 10.1007/s12021-019-09451-w]
Abstract
Computational modeling and simulation have become essential tools in the quest to better understand the brain's makeup and to decipher the causal interrelations of its components. The breadth of biochemical and biophysical processes and structures in the brain has led to the development of a large variety of model abstractions and specialized tools, often requiring high performance computing resources for their timely execution. What has been missing so far is an in-depth analysis of the complexity of the computational kernels, hindering a systematic approach to identifying bottlenecks of algorithms and hardware. If whole-brain models are to be achieved on emerging computer generations, models and simulation engines will have to be carefully co-designed for the intrinsic hardware tradeoffs. For the first time, we present a systematic exploration based on analytic performance modeling. We base our analysis on three in silico models, chosen as representative examples of the most widely employed modeling abstractions: current-based point neurons, conductance-based point neurons and conductance-based detailed neurons. We identify that the synaptic modeling formalism, i.e. current- or conductance-based representation, and not the level of morphological detail, is the most significant factor in determining memory bandwidth saturation and shared-memory scaling of in silico models. Even though general purpose computing has, until now, largely been able to deliver high performance, we find that for all types of abstractions, network latency and memory bandwidth will become severe bottlenecks as the number of neurons to be simulated grows. By adapting and extending a performance modeling approach, we deliver a first characterization of the performance landscape of brain tissue simulations, allowing us to pinpoint current bottlenecks for state-of-the-art in silico models, and make projections for future hardware and software requirements.
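A minimal instance of this style of analytic performance reasoning is the classic roofline bound, shown here only to illustrate the memory-bound versus compute-bound distinction the abstract appeals to; the paper's models are considerably more detailed, and the hardware numbers below are made up.

```python
def roofline_gflops(arith_intensity, peak_gflops, mem_bw_gb_s):
    """Attainable performance (GFLOP/s) under the roofline model:
    capped by peak compute, or by arithmetic intensity (FLOP/byte)
    times memory bandwidth, whichever is lower."""
    return min(peak_gflops, arith_intensity * mem_bw_gb_s)

# hypothetical machine: 1000 GFLOP/s peak, 100 GB/s memory bandwidth.
# Point-neuron state updates do few FLOPs per byte of state touched,
# so they land on the memory-bound side of the roofline.
low_intensity = roofline_gflops(0.25, peak_gflops=1000.0, mem_bw_gb_s=100.0)
high_intensity = roofline_gflops(50.0, peak_gflops=1000.0, mem_bw_gb_s=100.0)
```

The low-intensity kernel is limited to 25 GFLOP/s by bandwidth while the high-intensity one reaches the 1000 GFLOP/s compute peak, mirroring the paper's conclusion that memory bandwidth, not arithmetic, saturates first for point-neuron simulations.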
Affiliation(s)
- Francesco Cremonesi: Blue Brain Project, Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Campus Biotech, 1202 Geneva, Switzerland
- Felix Schürmann: Blue Brain Project, Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Campus Biotech, 1202 Geneva, Switzerland
24
Todorov D, Truccolo W. Stability of stochastic finite-size spiking-neuron networks: Comparing mean-field, 1-loop correction and quasi-renewal approximations. Annu Int Conf IEEE Eng Med Biol Soc 2019:4380-4386. [PMID: 31946838] [DOI: 10.1109/embc.2019.8857101]
Abstract
We examine the stability and qualitative dynamics of stochastic neuronal networks specified as multivariate non-linear Hawkes processes and related point-process generalized linear models that incorporate both auto- and cross-history effects. In particular, we adapt previous theoretical approximations based on mean field and mean field plus 1-loop correction to incorporate absolute refractory periods and other auto-history effects. Furthermore, we extend previous quasi-renewal approximations to the multivariate case, i.e. neuronal networks. The best sensitivity and specificity performance, in terms of predicting stability and divergence to nonphysiologically high firing rates in the examined simulations, was obtained by a variant of the quasi-renewal approximation.
25
Matzner A, Gorodetski L, Korngreen A, Bar-Gad I. Dynamic input-dependent encoding of individual basal ganglia neurons. Sci Rep 2020; 10:5833. [PMID: 32242059] [PMCID: PMC7118110] [DOI: 10.1038/s41598-020-62750-0]
Abstract
Computational models are crucial to studying the encoding of individual neurons. Static models are composed of a fixed set of parameters, thus resulting in static encoding properties that do not change under different inputs. Here, we challenge this basic concept which underlies these models. Using generalized linear models, we quantify the encoding and information processing properties of basal ganglia neurons recorded in-vitro. These properties are highly sensitive to the internal state of the neuron due to factors such as dependency on the baseline firing rate. Verification of these experimental results with simulations provides insights into the mechanisms underlying this input-dependent encoding. Thus, static models, which are not context dependent, represent only part of the neuronal encoding capabilities, and are not sufficient to represent the dynamics of a neuron over varying inputs. Input-dependent encoding is crucial for expanding our understanding of neuronal behavior in health and disease and underscores the need for a new generation of dynamic neuronal models.
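The kind of generalized linear model used here to quantify encoding can be sketched as a Bernoulli spike model with a sigmoid link, fit by gradient ascent on the log-likelihood. The synthetic one-dimensional stimulus, the true parameters, and the learning settings are all illustrative assumptions, not the paper's recordings or fitting pipeline.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# synthetic data: spike probability per time bin depends on an input x
rng = random.Random(0)
true_w, true_b = 2.0, -1.0              # hypothetical ground truth
xs = [rng.uniform(-1.0, 1.0) for _ in range(500)]
ys = [1 if rng.random() < sigmoid(true_w * x + true_b) else 0 for x in xs]

def fit_glm(xs, ys, lr=2.0, steps=400):
    """Maximum-likelihood fit of a Bernoulli GLM by full-batch gradient
    ascent; the log-likelihood gradient is (y - p) times the input."""
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = y - sigmoid(w * x + b)
            gw += err * x
            gb += err
        w += lr * gw / n
        b += lr * gb / n
    return w, b

w_hat, b_hat = fit_glm(xs, ys)
```

Refitting such a model under different baseline firing rates is one way to expose the input-dependent changes in encoding that the abstract describes: a truly static neuron would yield the same filter in every condition.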
Affiliation(s)
- Ayala Matzner: The Leslie & Susan Goldschmied (Gonda) Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
- Lilach Gorodetski: Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- Alon Korngreen: The Leslie & Susan Goldschmied (Gonda) Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel; Goodman Faculty of Life Sciences, Bar-Ilan University, Ramat-Gan, Israel
- Izhar Bar-Gad: The Leslie & Susan Goldschmied (Gonda) Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
26
Skaar JEW, Stasik AJ, Hagen E, Ness TV, Einevoll GT. Estimation of neural network model parameters from local field potentials (LFPs). PLoS Comput Biol 2020; 16:e1007725. [PMID: 32155141] [PMCID: PMC7083334] [DOI: 10.1371/journal.pcbi.1007725]
Abstract
Most modeling in systems neuroscience has been descriptive: neural representations, such as ‘receptive fields’, are found by statistically correlating neural activity with sensory input. In the traditional physics approach to modeling, hypotheses are represented by mechanistic models based on the underlying building blocks of the system, and candidate models are validated by comparison with experiments. Until now, validation of mechanistic cortical network models has been based on comparison with neuronal spikes, extracted from the high-frequency part of extracellular electrical potentials. In this computational study we investigated to what extent the low-frequency part of the signal, the local field potential (LFP), can be used to validate and infer properties of mechanistic cortical network models. In particular, we asked whether the LFP can be used to accurately estimate synaptic connection weights in the underlying network. We considered the thoroughly analysed Brunel network, comprising an excitatory and an inhibitory population of recurrently connected leaky integrate-and-fire (LIF) neurons. This model exhibits a high diversity of spiking network dynamics depending on the values of only three network parameters. The LFP generated by the network was computed using a hybrid scheme in which spikes computed from the point-neuron network were replayed on biophysically detailed multicompartmental neurons. We assessed how accurately the three model parameters could be estimated from power spectra of stationary ‘background’ LFP signals by application of convolutional neural nets (CNNs). All network parameters could be estimated very accurately, suggesting that LFPs indeed can be used for network model validation.

Most of what we have learned about brain networks in vivo has come from the measurement of spikes (action potentials) recorded by extracellular electrodes. The low-frequency part of these signals, the local field potential (LFP), contains unique information about how dendrites in neuronal populations integrate synaptic inputs, but has so far played a lesser role. To investigate whether the LFP can be used to validate network models, we computed LFP signals for a recurrent network model (the Brunel network) for which the ground-truth parameters are known. By application of convolutional neural nets (CNNs) we found that the synaptic weights indeed could be accurately estimated from ‘background’ LFP signals, suggesting a future key role for the LFP in the development of network models.
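The estimation step can be caricatured as recovering a single parameter from a noisy power spectrum. Here a grid-search least-squares fit of a Lorentzian corner frequency stands in for the CNN, and the spectral shape, noise level, and frequency range are all invented for illustration.

```python
import math, random

def lorentzian(fc, freqs):
    """Toy LFP-like power spectrum with corner frequency fc (Hz)."""
    return [1.0 / (1.0 + (f / fc) ** 2) for f in freqs]

freqs = [float(f) for f in range(1, 101)]      # 1..100 Hz
rng = random.Random(1)
true_fc = 20.0                                  # assumed ground truth
observed = [p * (1.0 + 0.05 * rng.gauss(0.0, 1.0))
            for p in lorentzian(true_fc, freqs)]

def estimate_fc(observed, freqs, grid):
    """Pick the corner frequency whose template spectrum is closest to
    the observed one in the least-squares sense."""
    best_fc, best_err = None, float("inf")
    for fc in grid:
        err = sum((p - q) ** 2
                  for p, q in zip(lorentzian(fc, freqs), observed))
        if err < best_err:
            best_fc, best_err = fc, err
    return best_fc

fc_hat = estimate_fc(observed, freqs, grid=[0.5 * k for k in range(2, 81)])
```

A CNN generalizes this template matching: instead of a hand-chosen parametric spectrum, it learns the mapping from spectra to parameters directly from simulated examples.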
Affiliation(s)
- Jan-Eirik W. Skaar: Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Espen Hagen: Department of Physics, University of Oslo, Oslo, Norway
- Torbjørn V. Ness: Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway
- Gaute T. Einevoll: Faculty of Science and Technology, Norwegian University of Life Sciences, Ås, Norway; Department of Physics, University of Oslo, Oslo, Norway
27
Inferring and validating mechanistic models of neural microcircuits based on spike-train data. Nat Commun 2019; 10:4933. [PMID: 31666513] [PMCID: PMC6821748] [DOI: 10.1038/s41467-019-12572-0]
Abstract
The interpretation of neuronal spike train recordings often relies on abstract statistical models that allow for principled parameter estimation and model selection but provide only limited insights into underlying microcircuits. In contrast, mechanistic models are useful to interpret microcircuit dynamics, but are rarely quantitatively matched to experimental data due to methodological challenges. Here we present analytical methods to efficiently fit spiking circuit models to single-trial spike trains. Using derived likelihood functions, we statistically infer the mean and variance of hidden inputs, neuronal adaptation properties and connectivity for coupled integrate-and-fire neurons. Comprehensive evaluations on synthetic data, validations using ground truth in-vitro and in-vivo recordings, and comparisons with existing techniques demonstrate that parameter estimation is very accurate and efficient, even for highly subsampled networks. Our methods bridge statistical, data-driven and theoretical, model-based neurosciences at the level of spiking circuits, for the purpose of a quantitative, mechanistic interpretation of recorded neuronal population activity.

It is difficult to fit mechanistic, biophysically constrained circuit models to spike train data from in vivo extracellular recordings. Here the authors present analytical methods that enable efficient parameter estimation for integrate-and-fire circuit models and inference of the underlying connectivity structure in subsampled networks.
28
Schwalger T, Chizhov AV. Mind the last spike - firing rate models for mesoscopic populations of spiking neurons. Curr Opin Neurobiol 2019; 58:155-166. [PMID: 31590003] [DOI: 10.1016/j.conb.2019.08.003]
Abstract
The dominant modeling framework for understanding cortical computations is the heuristic firing rate model. Despite their success, firing rate models fall short of capturing spike synchronization effects, linking to biophysical parameters and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved by the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons - a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. These new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
Affiliation(s)
- Tilo Schwalger: Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany; Institut für Mathematik, Technische Universität Berlin, 10623 Berlin, Germany
- Anton V Chizhov: Ioffe Institute, 194021 Saint-Petersburg, Russia; Sechenov Institute of Evolutionary Physiology and Biochemistry of the Russian Academy of Sciences, 194223 Saint-Petersburg, Russia
29
Naud R, Longtin A. Linking demyelination to compound action potential dispersion with a spike-diffuse-spike approach. J Math Neurosci 2019; 9:3. [PMID: 31147800] [PMCID: PMC6542900] [DOI: 10.1186/s13408-019-0071-6]
Abstract
Establishing and exploiting novel biomarkers of demyelinating diseases requires a mechanistic understanding of axonal propagation. Here, we present a novel computational framework called the stochastic spike-diffuse-spike (SSDS) model for assessing the effects of demyelination on axonal transmission. It models transmission through nodal and internodal compartments with two types of operations: a stochastic integrate-and-fire operation captures nodal excitability, and a linear filtering operation describes internodal propagation. The effects of demyelinated segments on the probability of transmission, transmission delay and spike time jitter are explored. We argue that demyelination-induced impedance mismatch prevents propagation mostly when the action potential leaves a demyelinated region, not when it enters one. In addition, we model sodium channel remodeling as a homeostatic control of nodal excitability. We find that the effects of mild demyelination on transmission probability and delay can be largely counterbalanced by an increase in excitability at the nodes surrounding the demyelination. The spike timing jitter, however, reflects the level of demyelination whether excitability is fixed or allowed to change in compensation. This jitter can accumulate over long axons and lead to a broadening of the compound action potential, linking microscopic defects to a mesoscopic observable. Our findings articulate why action potential jitter and compound action potential dispersion can serve as potential markers of weak and sporadic demyelination.
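The alternation of excitable nodes and linear internodal filtering can be caricatured deterministically: each internode attenuates the pulse peak, and the next node regenerates the spike only if the attenuated peak still crosses threshold. This toy omits the stochastic firing and full cable filtering of the SSDS model; the amplitude, threshold, and attenuation factors are illustrative assumptions.

```python
def propagates(attenuations, amp=2.0, threshold=1.0):
    """Deterministic spike-diffuse-spike cartoon: a spike of peak `amp`
    leaves each node, is scaled by the internodal attenuation factor,
    and is regenerated at the next node only if it exceeds threshold."""
    peak = amp
    for a in attenuations:
        peak *= a                  # internodal (linear) propagation
        if peak < threshold:
            return False           # conduction failure at the next node
        peak = amp                 # node fires: spike fully regenerated
    return True

# ten healthy internodes pass; one strongly demyelinated segment fails
healthy = propagates([0.8] * 10)
demyelinated = propagates([0.8] * 4 + [0.4] + [0.8] * 5)
```

Even this cartoon shows why failure is local to the demyelinated segment: every healthy internode delivers 1.6 to the next node, while the weakened one delivers only 0.8, below threshold.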
Affiliation(s)
- Richard Naud
- Ottawa Brain and Mind Research Institute, Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Canada
- Department of Physics, University of Ottawa, Ottawa, Canada
- André Longtin
- Department of Physics, University of Ottawa, Ottawa, Canada
30
Experimentally-constrained biophysical models of tonic and burst firing modes in thalamocortical neurons. PLoS Comput Biol 2019; 15:e1006753. [PMID: 31095552] [PMCID: PMC6541309] [DOI: 10.1371/journal.pcbi.1006753]
Abstract
Somatosensory thalamocortical (TC) neurons from the ventrobasal (VB) thalamus are central components in the flow of sensory information between the periphery and the cerebral cortex, and participate in the dynamic regulation of thalamocortical states including wakefulness and sleep. This property is reflected at the cellular level by the ability to generate action potentials in two distinct firing modes, called tonic firing and low-threshold bursting. Although the general properties of TC neurons are known, we still lack a detailed characterization of their morphological and electrical properties in the VB thalamus. The aim of this study was to build biophysically-detailed models of VB TC neurons explicitly constrained with experimental data from rats. We recorded the electrical activity of VB neurons (N = 49) and reconstructed morphologies in 3D (N = 50) by applying standardized protocols. After identifying distinct electrical types, we used a multi-objective optimization to fit single neuron electrical models (e-models), which yielded multiple solutions consistent with the experimental data. The models were tested for generalization using electrical stimuli and neuron morphologies not used during fitting. A local sensitivity analysis revealed that the e-models are robust to small parameter changes and that all the parameters were constrained by one or more features. The e-models, when tested in combination with different morphologies, showed that the electrical behavior is substantially preserved when changing dendritic structure and that the e-models were not overfit to a specific morphology. The models and their analysis show that automatic parameter search can be applied to capture complex firing behavior, such as co-existence of tonic firing and low-threshold bursting over a wide range of parameter sets and in combination with different neuron morphologies. 
Thalamocortical neurons are one of the main components of the thalamocortical system, which is implicated in key functions including sensory transmission and the transition between brain states. These functions are reflected at the cellular level by the ability to generate action potentials in two distinct modes, called burst and tonic firing. Biophysically-detailed computational modeling of these cells can provide a tool to understand the role of these neurons within thalamocortical circuitry. We started by collecting single-cell experimental data by applying standardized experimental procedures in brain slices of the rat. Prior work has demonstrated that biological constraints can be integrated using multi-objective optimization to build biologically realistic models of neurons. Here, we employed similar techniques, but extended them to capture the multiple firing modes of thalamic neurons. We compared the model results with additional experimental data, tested their generalization, and quantitatively rejected those that deviated significantly from the experimental variability. These models can be readily integrated in a data-driven pipeline to reconstruct and simulate circuit activity in the thalamocortical system.
31
Einevoll GT, Destexhe A, Diesmann M, Grün S, Jirsa V, de Kamps M, Migliore M, Ness TV, Plesser HE, Schürmann F. The Scientific Case for Brain Simulations. Neuron 2019; 102:735-744. [DOI: 10.1016/j.neuron.2019.03.027]
32
Geminiani A, Casellato C, Locatelli F, Prestori F, Pedrocchi A, D'Angelo E. Complex Dynamics in Simplified Neuronal Models: Reproducing Golgi Cell Electroresponsiveness. Front Neuroinform 2018; 12:88. [PMID: 30559658] [PMCID: PMC6287018] [DOI: 10.3389/fninf.2018.00088]
Abstract
Brain neurons exhibit complex electroresponsive properties – including intrinsic subthreshold oscillations and pacemaking, resonance and phase-reset – which are thought to play a critical role in controlling neural network dynamics. Although these properties emerge from detailed representations of molecular-level mechanisms in “realistic” models, they cannot usually be generated by simplified neuronal models (although these may show spike-frequency adaptation and bursting). We report here that this whole set of properties can be generated by the extended generalized leaky integrate-and-fire (E-GLIF) neuron model. E-GLIF derives from the GLIF model family and is therefore mono-compartmental, keeps the limited computational load typical of a linear low-dimensional system, admits analytical solutions and can be tuned through gradient-descent algorithms. Importantly, E-GLIF is designed to maintain a correspondence between model parameters and neuronal membrane mechanisms through a minimum set of equations. In order to test its potential, E-GLIF was used to model a specific neuron showing rich and complex electroresponsiveness, the cerebellar Golgi cell, and was validated against experimental electrophysiological data recorded from Golgi cells in acute cerebellar slices. During simulations, E-GLIF was activated by stimulus patterns, including current steps and synaptic inputs, identical to those used for the experiments. The results demonstrate that E-GLIF can reproduce the whole set of complex neuronal dynamics typical of these neurons – including intensity-frequency curves, spike-frequency adaptation, post-inhibitory rebound bursting, spontaneous subthreshold oscillations, resonance, and phase-reset – providing a new effective tool to investigate brain dynamics in large-scale simulations.
Affiliation(s)
- Alice Geminiani
- NEARLab, Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
- Claudia Casellato
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Francesca Locatelli
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Francesca Prestori
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Alessandra Pedrocchi
- NEARLab, Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
- Egidio D'Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
33
Setareh H, Deger M, Gerstner W. Excitable neuronal assemblies with adaptation as a building block of brain circuits for velocity-controlled signal propagation. PLoS Comput Biol 2018; 14:e1006216. [PMID: 29979674] [PMCID: PMC6051644] [DOI: 10.1371/journal.pcbi.1006216]
Abstract
The time scale of neuronal network dynamics is determined by synaptic interactions and neuronal signal integration, both of which occur on the time scale of milliseconds. Yet many behaviors like the generation of movements or vocalizations of sounds occur on the much slower time scale of seconds. Here we ask the question of how neuronal networks of the brain can support reliable behavior on this time scale. We argue that excitable neuronal assemblies with spike-frequency adaptation may serve as building blocks that can flexibly adjust the speed of execution of neural circuit function. We show in simulations that a chain of neuronal assemblies can propagate signals reliably, similar to the well-known synfire chain, but with the crucial difference that the propagation speed is slower and tunable to the behaviorally relevant range. Moreover we study a grid of excitable neuronal assemblies as a simplified model of the somatosensory barrel cortex of the mouse and demonstrate that various patterns of experimentally observed spatial activity propagation can be explained. Models of activity propagation in cortical networks have often been based on feedforward structures. Here we propose a model of activity propagation, called excitation chain, which does not need such a feedforward structure. The model is composed of excitable neural assemblies with spike-frequency adaptation, connected bidirectionally in a row or a grid. This prototypical neural circuit can propagate activity forwards, backwards or in both directions. Furthermore, the propagation speed is slow enough to trigger the generation of behaviors on the time scale of hundreds of milliseconds. A two-dimensional variant of the model is able to generate different activity propagation patterns, similar to spontaneous activity and stimulus-evoked responses in anesthetized mouse barrel cortex. 
We propose the excitation chain model as a basic component that can be employed in various ways to create spiking neural circuit models that generate signals on behavioral time scales. In contrast to abstract models of excitable media, our model makes an explicit link to the time scale of neuronal spikes.
Affiliation(s)
- Hesam Setareh
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland
- Moritz Deger
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland
- Institute for Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Köln, Germany
- Wulfram Gerstner
- School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland
34
Venkadesh S, Komendantov AO, Listopad S, Scott EO, De Jong K, Krichmar JL, Ascoli GA. Evolving Simple Models of Diverse Intrinsic Dynamics in Hippocampal Neuron Types. Front Neuroinform 2018; 12:8. [PMID: 29593519] [PMCID: PMC5859109] [DOI: 10.3389/fninf.2018.00008]
Abstract
The diversity of intrinsic dynamics observed in neurons may enhance the computations implemented in the circuit by enriching network-level emergent properties such as synchronization and phase locking. Large-scale spiking network models of entire brain regions offer a platform to test theories of neural computation and cognitive function, providing useful insights on information processing in the nervous system. However, a systematic in-depth investigation requires network simulations to capture the biological intrinsic diversity of individual neurons at a sufficient level of accuracy. The computationally efficient Izhikevich model can qualitatively reproduce a wide range of neuronal behaviors. Previous studies using optimization techniques, however, were less successful in quantitatively matching experimentally recorded voltage traces. In this article, we present an automated pipeline based on evolutionary algorithms to quantitatively reproduce features of various classes of neuronal spike patterns using the Izhikevich model. Employing experimental data from Hippocampome.org, a comprehensive knowledge base of neuron types in the rodent hippocampus, we demonstrate that our approach reliably fits Izhikevich models to nine distinct classes of experimentally recorded spike patterns, including delayed spiking, spiking with adaptation, stuttering, and bursting. Importantly, by leveraging the parameter-exploration capabilities of evolutionary algorithms, and by representing qualitative spike-pattern class definitions in the error landscape, our approach creates several suitable models for each neuron type, exhibiting appropriate feature variability among neurons. Moreover, we demonstrate the flexibility of our methodology by creating multi-compartment Izhikevich models for each neuron type in addition to single-point versions. Although the results presented here focus on hippocampal neuron types, the same strategy is broadly applicable to other neural systems.
Affiliation(s)
- Siva Venkadesh
- Center for Neural Informatics, Structures, and Plasticity, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, United States
- Alexander O Komendantov
- Center for Neural Informatics, Structures, and Plasticity, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, United States
- Stanislav Listopad
- Cognitive Anteater Robotics Laboratory, Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
- Eric O Scott
- Adaptive Systems Laboratory, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, United States
- Kenneth De Jong
- Adaptive Systems Laboratory, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, United States
- Jeffrey L Krichmar
- Cognitive Anteater Robotics Laboratory, Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, United States
- Giorgio A Ascoli
- Center for Neural Informatics, Structures, and Plasticity, Krasnow Institute for Advanced Study, George Mason University, Fairfax, VA, United States
35
Teeter C, Iyer R, Menon V, Gouwens N, Feng D, Berg J, Szafer A, Cain N, Zeng H, Hawrylycz M, Koch C, Mihalas S. Generalized leaky integrate-and-fire models classify multiple neuron types. Nat Commun 2018; 9:709. [PMID: 29459723] [PMCID: PMC5818568] [DOI: 10.1038/s41467-017-02717-4]
Abstract
There is a high diversity of neuronal types in the mammalian neocortex. To facilitate construction of system models with multiple cell types, we generate a database of point models associated with the Allen Cell Types Database. We construct a set of generalized leaky integrate-and-fire (GLIF) models of increasing complexity to reproduce the spiking behaviors of 645 recorded neurons from 16 transgenic lines. The more complex models have an increased capacity to predict the spiking behavior for held-out stimuli. We use unsupervised methods to classify cell types, and find that high-level GLIF model parameters are able to differentiate transgenic lines comparably well to electrophysiological features. The more complex model parameters also have an increased ability to differentiate between transgenic lines. Thus, creating simple models is an effective dimensionality-reduction technique that enables the differentiation of cell types from electrophysiological responses without the need for a priori defined features. This database will provide a set of simplified models of multiple cell types for the community to use in network models.
Affiliation(s)
- Corinne Teeter
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Ramakrishnan Iyer
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Vilas Menon
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Howard Hughes Medical Institute, Janelia Research Campus, 19700 Helix Dr, Ashburn, VA, 20147, USA
- Nathan Gouwens
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- David Feng
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Jim Berg
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Aaron Szafer
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Nicholas Cain
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Hongkui Zeng
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Michael Hawrylycz
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Christof Koch
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
- Stefan Mihalas
- Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, WA, 98109, USA
36
How linear response shaped models of neural circuits and the quest for alternatives. Curr Opin Neurobiol 2017; 46:234-240. [DOI: 10.1016/j.conb.2017.09.001]
37
Inferring cortical function in the mouse visual system through large-scale systems neuroscience. Proc Natl Acad Sci U S A 2016; 113:7337-44. [PMID: 27382147] [DOI: 10.1073/pnas.1512901113]
Abstract
The scientific mission of the Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subject to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.
38
Setareh H, Deger M, Petersen CCH, Gerstner W. Cortical Dynamics in Presence of Assemblies of Densely Connected Weight-Hub Neurons. Front Comput Neurosci 2017; 11:52. [PMID: 28690508] [PMCID: PMC5480278] [DOI: 10.3389/fncom.2017.00052]
Abstract
Experimental measurements of pairwise connection probability of pyramidal neurons together with the distribution of synaptic weights have been used to construct randomly connected model networks. However, several experimental studies suggest that both wiring and synaptic weight structure between neurons show statistics that differ from random networks. Here we study a network containing a subset of neurons which we call weight-hub neurons, that are characterized by strong inward synapses. We propose a connectivity structure for excitatory neurons that contain assemblies of densely connected weight-hub neurons, while the pairwise connection probability and synaptic weight distribution remain consistent with experimental data. Simulations of such a network with generalized integrate-and-fire neurons display regular and irregular slow oscillations akin to experimentally observed up/down state transitions in the activity of cortical neurons with a broad distribution of pairwise spike correlations. Moreover, stimulation of a model network in the presence or absence of assembly structure exhibits responses similar to light-evoked responses of cortical layers in optogenetically modified animals. We conclude that a high connection probability into and within assemblies of excitatory weight-hub neurons, as it likely is present in some but not all cortical layers, changes the dynamics of a layer of cortical microcircuitry significantly.
Affiliation(s)
- Hesam Setareh
- Laboratory of Computational Neuroscience, School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Moritz Deger
- Laboratory of Computational Neuroscience, School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Faculty of Mathematics and Natural Sciences, Institute for Zoology, University of Cologne, Cologne, Germany
- Carl C H Petersen
- Laboratory of Sensory Processing, Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- Laboratory of Computational Neuroscience, School of Computer and Communication Sciences and Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
39
Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957] [PMCID: PMC5415267] [DOI: 10.1371/journal.pcbi.1005507]
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. 
Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
40

41
Robinson BS, Berger TW, Song D. Identification of Stable Spike-Timing-Dependent Plasticity from Spiking Activity with Generalized Multilinear Modeling. Neural Comput 2016; 28:2320-2351. [PMID: 27557101] [DOI: 10.1162/neco_a_00883]
Abstract
Characterization of long-term activity-dependent plasticity from behaviorally driven spiking activity is important for understanding the underlying mechanisms of learning and memory. In this letter, we present a computational framework for quantifying spike-timing-dependent plasticity (STDP) during behavior by identifying a functional plasticity rule solely from spiking activity. First, we formulate a flexible point-process spiking neuron model structure with STDP, which includes functions that characterize the stationary and plastic properties of the neuron. The STDP model includes a novel function for prolonged plasticity induction, as well as a more typical function for synaptic weight change based on the relative timing of input-output spike pairs. Consideration for system stability is incorporated with weight-dependent synaptic modification. Next, we formalize an estimation technique using a generalized multilinear model (GMLM) structure with basis function expansion. The weight-dependent synaptic modification adds a nonlinearity to the model, which is addressed with an iterative unconstrained optimization approach. Finally, we demonstrate successful model estimation on simulated spiking data and show that all model functions can be estimated accurately with this method across a variety of simulation parameters, such as number of inputs, output firing rate, input firing type, and simulation time. Since this approach requires only naturally generated spikes, it can be readily applied to behaving animal studies to characterize the underlying mechanisms of learning and memory.
Affiliation(s)
- Brian S Robinson
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
- Theodore W Berger
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
- Dong Song
- Department of Biomedical Engineering, University of Southern California, Los Angeles, CA 90089, U.S.A.
42
Abstract
As information flows through the brain, neuronal firing progresses from encoding the world as sensed by the animal to driving the motor output of subsequent behavior. One of the more tractable goals of quantitative neuroscience is to develop predictive models that relate the sensory or motor streams with neuronal firing. Here we review and contrast analytical tools used to accomplish this task. We focus on classes of models in which the external variable is compared with one or more feature vectors to extract a low-dimensional representation, the history of spiking and other variables are potentially incorporated, and these factors are nonlinearly transformed to predict the occurrences of spikes. We illustrate these techniques in application to datasets of different degrees of complexity. In particular, we address the fitting of models in the presence of strong correlations in the external variable, as occurs in natural sensory stimuli and in movement. Spectral correlation between predicted and measured spike trains is introduced to contrast the relative success of different methods.
Affiliation(s)
- Johnatan Aljadeff
- Department of Physics, University of California, San Diego, San Diego, CA 92093, USA; Department of Neurobiology, University of Chicago, Chicago, IL 60637, USA
- Benjamin J Lansdell
- Department of Applied Mathematics, University of Washington, Seattle, WA 98195, USA
- Adrienne L Fairhall
- Department of Physiology and Biophysics, University of Washington, Seattle, WA 98195, USA; WRF UW Institute for Neuroengineering, University of Washington, Seattle, WA 98195, USA
- David Kleinfeld
- Department of Physics, University of California, San Diego, San Diego, CA 92093, USA; Section of Neurobiology, University of California, San Diego, La Jolla, CA 92093, USA; Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093, USA
43
Van Geit W, Gevaert M, Chindemi G, Rössert C, Courcol JD, Muller EB, Schürmann F, Segev I, Markram H. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience. Front Neuroinform 2016; 10:17. [PMID: 27375471] [PMCID: PMC4896051] [DOI: 10.3389/fninf.2016.00017]
Abstract
At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases.
Affiliation(s)
- Werner Van Geit
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Michael Gevaert
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Giuseppe Chindemi
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Christian Rössert
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Jean-Denis Courcol
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Eilif B Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Felix Schürmann
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Idan Segev
- Department of Neurobiology, Alexander Silberman Institute of Life Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel; The Edmond and Lily Safra Centre for Brain Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel
- Henry Markram
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Laboratory of Neural Microcircuitry, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
|
44
|
Mensi S, Hagens O, Gerstner W, Pozzorini C. Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons. PLoS Comput Biol 2016; 12:e1004761. [PMID: 26907675 PMCID: PMC4764342 DOI: 10.1371/journal.pcbi.1004761] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2015] [Accepted: 01/19/2016] [Indexed: 11/25/2022] Open
Abstract
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. To this end, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights into the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.
Over the last decades, a variety of simplified spiking models have been shown to achieve a surprisingly high performance in predicting the neuronal responses to in vitro somatic current injections. Because of the complex adaptive behavior featured by cortical neurons, this success is however restricted to limited stimulus ranges: model parameters optimized for a specific input regime are often inappropriate to describe the response to input currents with different statistical properties. In the present study, a new spiking neuron model is introduced that captures single-neuron computation over a wide range of input statistics and explains different aspects of the neuronal dynamics within a single framework. Our results indicate that complex forms of single-neuron adaptation are mediated by the nonlinear dynamics of the firing threshold and that the input-output transformation performed by cortical pyramidal neurons can be intuitively understood in terms of an enhanced Generalized Linear Model in which both the input filter and the spike-history filter adapt to the input statistics.
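The qualitative mechanism — a slowly adapting, voltage-coupled threshold that cancels slow depolarization but lets rapid fluctuations through — can be sketched with a toy simulation. This is a deliberately minimal stand-in, not the paper's full GIF model; the function name and all parameter values below are illustrative assumptions, not fitted values.

```python
import random

def simulate_lif_dynamic_threshold(current, dt=0.1, tau_m=20.0, v_rest=-70.0,
                                   v_reset=-65.0, theta0=-50.0, tau_theta=50.0,
                                   beta=0.5, jump=10.0):
    """Leaky integrate-and-fire neuron with a moving firing threshold.
    Between spikes, theta relaxes slowly (tau_theta) toward the
    voltage-dependent value theta0 + beta * (v - v_rest), a crude stand-in
    for Na+-channel inactivation; each spike also kicks theta up by `jump`
    (spike-triggered adaptation). Because theta tracks only the slow
    component of v, fast fluctuations still reach threshold while a
    sustained depolarization is largely cancelled."""
    v, theta = v_rest, theta0
    spike_times = []
    for step, i_ext in enumerate(current):
        v += dt / tau_m * (-(v - v_rest) + i_ext)
        theta += dt / tau_theta * (theta0 + beta * (v - v_rest) - theta)
        if v >= theta:
            spike_times.append(step * dt)
            v = v_reset
            theta += jump
    return spike_times

# Same mean drive with and without rapid fluctuations: the adapting
# threshold suppresses the response to the constant component only.
rng = random.Random(1)
steady = [30.0] * 5000                                  # 500 ms of drive
noisy = [30.0 + rng.gauss(0.0, 150.0) for _ in steady]  # same mean, fast noise
spikes_steady = simulate_lif_dynamic_threshold(steady)
spikes_noisy = simulate_lif_dynamic_threshold(noisy)
```

The comparison mirrors the abstract's point: with a static threshold, stronger mean input alone would drive firing, whereas the voltage-coupled threshold keeps the neuron selectively responsive to the rapid component of the input.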
Affiliation(s)
- Skander Mensi
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olivier Hagens
- Laboratory of Neural Microcircuitry (LNMC), Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Christian Pozzorini
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
|