1
Souza DLM, Gabrick EC, Protachevicz PR, Borges FS, Trobia J, Iarosz KC, Batista AM, Caldas IL, Lenzi EK. Adaptive exponential integrate-and-fire model with fractal extension. Chaos 2024; 34:023107. [PMID: 38341761] [DOI: 10.1063/5.0176455]
Abstract
The description of neuronal activity has been of great importance in neuroscience, and mathematical models are useful for describing the electrophysiological behavior of neurons. One successful model for this purpose is the Adaptive Exponential Integrate-and-Fire (Adex) model, which is composed of two ordinary differential equations. Usually, the model is considered in its standard formulation, i.e., with integer-order derivatives. In this work, we propose and study a fractal extension of the Adex model, which in simple terms corresponds to replacing the integer-order derivatives with non-integer ones; as non-integer operators, we choose fractal derivatives. We explore the effects of equal and different orders of the fractal derivatives on the firing patterns and mean firing frequency of the neuron described by the Adex model. Previous results suggest that fractal derivatives can provide a more realistic representation because they generalize the standard operators. Our findings show that the fractal order influences the inter-spike intervals and changes the mean firing frequency. In addition, the firing patterns depend not only on the neuronal parameters but also on the orders of the respective fractal operators. As our main conclusion, a fractal order below unity increases the influence of the adaptation mechanism on the spike firing patterns.
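The standard (integer-order) Adex model that this paper extends can be sketched as follows. The parameter values below are common tonic-spiking defaults from the Adex literature, not the values used in the cited paper, and the fractal-derivative replacement itself is not reproduced here.

```python
import math

def simulate_adex(I=800.0, T=500.0, dt=0.05):
    """Euler integration of the standard (integer-order) Adex model.

    Units: pA, pF, nS, mV, ms. Parameter values are common textbook
    defaults, not those of the cited paper.
    """
    C, gL, EL = 281.0, 30.0, -70.6        # capacitance, leak conductance, rest
    VT, DT = -50.4, 2.0                   # threshold and slope factor
    a, b, tau_w = 4.0, 80.5, 144.0        # adaptation parameters
    Vr, Vpeak = -70.6, 0.0                # reset and spike-detection voltages
    V, w = EL, 0.0
    spike_times = []
    t = 0.0
    while t < T:
        # exponential term; argument clipped to avoid overflow near a spike
        exp_term = gL * DT * math.exp(min((V - VT) / DT, 20.0))
        dV = (-gL * (V - EL) + exp_term - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vpeak:                    # spike: reset V, increment adaptation
            V = Vr
            w += b
            spike_times.append(t)
        t += dt
    return spike_times

spikes = simulate_adex()
isis = [t2 - t1 for t1, t2 in zip(spikes, spikes[1:])]
# spike-frequency adaptation: inter-spike intervals lengthen along the train
```

Because the adaptation current w grows by b at every spike and decays slowly, successive inter-spike intervals increase toward a steady value; the paper's point is that fractal orders below one strengthen exactly this adaptation effect.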
Affiliation(s)
- Diogo L M Souza
- Graduate Program in Science, State University of Ponta Grossa, 84030-900 Ponta Grossa, PR, Brazil
- Enrique C Gabrick
- Graduate Program in Science, State University of Ponta Grossa, 84030-900 Ponta Grossa, PR, Brazil
- Department of Physics, Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Potsdam Institute for Climate Impact Research, Telegrafenberg A31, 14473 Potsdam, Germany
- Fernando S Borges
- Department of Physiology and Pharmacology, State University of New York Downstate Health Sciences University, Brooklyn, New York 11203, USA
- Center for Mathematics, Computation, and Cognition, Federal University of ABC, 09606-045 São Bernardo do Campo, SP, Brazil
- José Trobia
- Department of Mathematics and Statistics, State University of Ponta Grossa, 84030-900 Ponta Grossa, Brazil
- Kelly C Iarosz
- University Center UNIFATEB, 84266-010 Telêmaco Borba, PR, Brazil
- Antonio M Batista
- Graduate Program in Science, State University of Ponta Grossa, 84030-900 Ponta Grossa, PR, Brazil
- Institute of Physics, University of São Paulo, 05508-090 São Paulo, SP, Brazil
- Department of Mathematics and Statistics, State University of Ponta Grossa, 84030-900 Ponta Grossa, Brazil
- Iberê L Caldas
- Institute of Physics, University of São Paulo, 05508-090 São Paulo, SP, Brazil
- Ervin K Lenzi
- Graduate Program in Science, State University of Ponta Grossa, 84030-900 Ponta Grossa, PR, Brazil
- Department of Physics, State University of Ponta Grossa, Av. Gen. Carlos Cavalcanti 4748, Ponta Grossa 84030-900, PR, Brazil
2
Yuan C, Li X. Fitting of TC model according to key parameters affecting Parkinson's state based on improved particle swarm optimization algorithm. Sci Rep 2022; 12:13938. [PMID: 35977977] [PMCID: PMC9385711] [DOI: 10.1038/s41598-022-18267-9]
Abstract
Biophysical models contain a large number of parameters, while the spiking characteristics of neurons are related to only a few key parameters. For thalamic neurons, relay reliability is an important characteristic that affects the Parkinsonian state. This paper proposes a method to fit key model parameters based on the spiking characteristics of neurons and improves the traditional particle swarm optimization algorithm: a nonlinear concave function and a logistic chaotic map are combined to adjust the inertia weight of the particles, preventing them from falling into a local optimum during the search or converging prematurely. Three parameters that play an important role in the Parkinsonian state of the thalamic cell model are selected and fitted with the improved particle swarm optimization algorithm. Reconstructing the neuron model with the fitted parameters predicts the spiking trajectories well, which verifies the effectiveness of the fitting method. Comparison with other particle swarm optimization algorithms shows that the proposed algorithm better avoids local optima and converges to the optimal values quickly.
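The abstract does not give the exact inertia-weight schedule, so the sketch below is only a plausible reading of the idea: a nonlinear concave decay of the inertia weight, modulated by a logistic chaotic map, applied to a toy quadratic objective rather than the thalamocortical model.

```python
import numpy as np

def improved_pso(objective, dim=2, n_particles=20, n_iter=100, seed=0):
    """PSO with a chaotic, nonlinearly decaying inertia weight.

    The concave (quadratic) decay and logistic-map modulation below are
    illustrative assumptions, not the exact schedule of the cited paper.
    """
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    x = rng.uniform(lo, hi, (n_particles, dim))       # particle positions
    v = np.zeros((n_particles, dim))                  # particle velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    c1 = c2 = 2.0
    w_max, w_min = 0.9, 0.4
    z = 0.7                                           # logistic-map state
    for t in range(n_iter):
        z = 4.0 * z * (1.0 - z)                       # logistic chaotic map
        # concave decay from w_max to w_min, jittered by the chaotic state
        w = w_min + (w_max - w_min) * (1.0 - (t / n_iter) ** 2) * z
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

sphere = lambda p: float(np.sum(p ** 2))              # toy objective
best_x, best_f = improved_pso(sphere)
```

The chaotic modulation keeps the inertia weight from following a fixed monotone schedule, which is the mechanism the paper credits with escaping local optima.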
Affiliation(s)
- Chunhua Yuan
- School of Automation and Electrical Engineering, Shenyang Ligong University, Shenyang, 110159, China
- Xiangyu Li
- School of Automation and Electrical Engineering, Shenyang Ligong University, Shenyang, 110159, China.
3
Gonçalves PJ, Lueckmann JM, Deistler M, Nonnenmacher M, Öcal K, Bassetto G, Chintaluri C, Podlaski WF, Haddad SA, Vogels TP, Greenberg DS, Macke JH. Training deep neural density estimators to identify mechanistic models of neural dynamics. eLife 2020; 9:e56261. [PMID: 32940606] [PMCID: PMC7581433] [DOI: 10.7554/eLife.56261]
Abstract
Mechanistic modeling in neuroscience aims to explain observed phenomena in terms of underlying causes. However, determining which model parameters agree with complex and stochastic neural data presents a significant challenge. We address this challenge with a machine learning tool that uses deep neural density estimators, trained using model simulations, to carry out Bayesian inference and retrieve the full space of parameters compatible with raw data or selected data features. Our method is scalable in parameters and data features and can rapidly analyze new data after initial training. We demonstrate the power and flexibility of our approach on receptive fields, ion channels, and Hodgkin-Huxley models. We also characterize the space of circuit configurations giving rise to rhythmic activity in the crustacean stomatogastric ganglion, and use these results to derive hypotheses for underlying compensation mechanisms. Our approach will help close the gap between data-driven and theory-driven models of neural dynamics.
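The paper trains amortized neural density estimators on simulations; as a much simpler conceptual stand-in, rejection sampling on a toy simulator illustrates the shared simulation-based inference idea of recovering the set of parameters compatible with an observation. The simulator, prior, and tolerance below are all illustrative, not from the paper.

```python
import numpy as np

def rejection_inference(x_obs, n_sim=20000, eps=0.1, seed=0):
    """Crude rejection sampling on a toy simulator (x = theta + noise):
    keep prior draws whose simulated data land near the observation.
    A conceptual stand-in for the paper's amortized neural density
    estimation; simulator, prior, and tolerance are illustrative."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(-2.0, 2.0, n_sim)        # draws from the prior
    x = theta + rng.normal(0.0, 0.5, n_sim)      # one simulation per draw
    return theta[np.abs(x - x_obs) < eps]        # approximate posterior samples

post = rejection_inference(x_obs=1.0)
# 'post' approximates the posterior over theta given the observation
```

The neural-network approach in the paper replaces the wasteful accept/reject step with a learned conditional density, which is what makes it scale to many parameters and data features.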
Affiliation(s)
- Pedro J Gonçalves
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Jan-Matthis Lueckmann
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Michael Deistler
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Machine Learning in Science, Excellence Cluster Machine Learning, Tübingen University, Tübingen, Germany
- Marcel Nonnenmacher
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Model-Driven Machine Learning, Institute of Coastal Research, Helmholtz Centre Geesthacht, Geesthacht, Germany
- Kaan Öcal
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Mathematical Institute, University of Bonn, Bonn, Germany
- Giacomo Bassetto
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Chaitanya Chintaluri
- Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- William F Podlaski
- Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom
- Sara A Haddad
- Max Planck Institute for Brain Research, Frankfurt, Germany
- Tim P Vogels
- Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, United Kingdom
- Institute of Science and Technology Austria, Klosterneuburg, Austria
- David S Greenberg
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Model-Driven Machine Learning, Institute of Coastal Research, Helmholtz Centre Geesthacht, Geesthacht, Germany
- Jakob H Macke
- Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany
- Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Bonn, Germany
- Machine Learning in Science, Excellence Cluster Machine Learning, Tübingen University, Tübingen, Germany
- Max Planck Institute for Intelligent Systems, Tübingen, Germany
4
Durstewitz D, Koppe G, Meyer-Lindenberg A. Deep neural networks in psychiatry. Mol Psychiatry 2019; 24:1583-1598. [PMID: 30770893] [DOI: 10.1038/s41380-019-0365-9]
Abstract
Machine and deep learning methods, today's core of artificial intelligence, have been applied with increasing success and impact in many commercial and research settings. They are powerful tools for large-scale data analysis, prediction, and classification, especially in very data-rich environments ("big data"), and have started to find their way into medical applications. Here we will first give an overview of machine learning methods, with a focus on deep and recurrent neural networks, their relation to statistics, and the core principles behind them. We will then discuss and review directions along which (deep) neural networks can be, or already have been, applied in the context of psychiatry, and will try to delineate their future potential in this area. We will also comment on an emerging area that so far has been much less well explored: by embedding semantically interpretable computational models of brain dynamics or behavior into a statistical machine learning context, insights into dysfunction beyond mere prediction and classification may be gained. This marriage of computational models with statistical inference, in particular, may offer insights into neural and behavioral mechanisms that could open completely novel avenues for psychiatric treatment.
Affiliation(s)
- Daniel Durstewitz
- Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim/Heidelberg University, 68159, Mannheim, Germany.
- Georgia Koppe
- Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim/Heidelberg University, 68159, Mannheim, Germany
- Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim/Heidelberg University, 68159, Mannheim, Germany
- Andreas Meyer-Lindenberg
- Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim/Heidelberg University, 68159, Mannheim, Germany
5
Koppe G, Toutounji H, Kirsch P, Lis S, Durstewitz D. Identifying nonlinear dynamical systems via generative recurrent neural networks with applications to fMRI. PLoS Comput Biol 2019; 15:e1007263. [PMID: 31433810] [PMCID: PMC6719895] [DOI: 10.1371/journal.pcbi.1007263]
Abstract
A major tenet in theoretical neuroscience is that cognitive and behavioral processes are ultimately implemented in terms of the neural system dynamics. Accordingly, a major aim for the analysis of neurophysiological measurements should lie in the identification of the computational dynamics underlying task processing. Here we advance a state space model (SSM) based on generative piecewise-linear recurrent neural networks (PLRNN) to assess dynamics from neuroimaging data. In contrast to many other nonlinear time series models which have been proposed for reconstructing latent dynamics, our model is easily interpretable in neural terms, amenable to systematic dynamical systems analysis of the resulting set of equations, and can straightforwardly be transformed into an equivalent continuous-time dynamical system. The major contributions of this paper are the introduction of a new observation model suitable for functional magnetic resonance imaging (fMRI) coupled to the latent PLRNN, an efficient stepwise training procedure that forces the latent model to capture the 'true' underlying dynamics rather than just fitting (or predicting) the observations, and an empirical measure based on the Kullback-Leibler divergence to evaluate from empirical time series how well this goal of approximating the underlying dynamics has been achieved. We validate and illustrate the power of our approach on simulated 'ground-truth' dynamical systems as well as on experimental fMRI time series, and demonstrate that the learnt dynamics harbor task-related nonlinear structure that a linear dynamical model fails to capture. Given that fMRI is one of the most common techniques for measuring brain activity non-invasively in human subjects, this approach may provide a novel step toward analyzing aberrant (nonlinear) dynamics for clinical assessment or neuroscientific research.
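The latent step of a piecewise-linear RNN of the general form used in such state space models can be sketched as follows. The fMRI observation model and the stepwise training procedure of the paper are not reproduced; the weights here are small random values chosen only to keep the illustration stable.

```python
import numpy as np

def plrnn_step(z, A, W, h):
    """One latent step of a piecewise-linear RNN:
    z_{t+1} = A z_t + W phi(z_t) + h, with phi = ReLU.
    This is the generic PLRNN latent dynamics; the paper couples it to
    an fMRI observation model, which is omitted here."""
    return A @ z + W @ np.maximum(z, 0.0) + h

rng = np.random.default_rng(0)
d = 5                                              # latent dimension
A = np.diag(rng.uniform(0.2, 0.6, d))              # diagonal linear part
W = 0.05 * rng.standard_normal((d, d))             # nonlinear coupling
np.fill_diagonal(W, 0.0)                           # W is typically off-diagonal
h = 0.1 * rng.standard_normal(d)                   # bias term
z = np.zeros(d)
traj = []
for _ in range(200):                               # roll out the latent dynamics
    z = plrnn_step(z, A, W, h)
    traj.append(z.copy())
traj = np.array(traj)
```

Because the map is linear on each orthant of the state space, fixed points and their stability can be computed in closed form per region, which is what makes this model "amenable to systematic dynamical systems analysis" as the abstract claims.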
Affiliation(s)
- Georgia Koppe
- Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Hazem Toutounji
- Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Peter Kirsch
- Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Stefanie Lis
- Institute for Psychiatric and Psychosomatic Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Daniel Durstewitz
- Department of Theoretical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Faculty of Physics and Astronomy, Heidelberg University, Heidelberg, Germany
6
Geminiani A, Casellato C, Locatelli F, Prestori F, Pedrocchi A, D'Angelo E. Complex Dynamics in Simplified Neuronal Models: Reproducing Golgi Cell Electroresponsiveness. Front Neuroinform 2018; 12:88. [PMID: 30559658] [PMCID: PMC6287018] [DOI: 10.3389/fninf.2018.00088]
Abstract
Brain neurons exhibit complex electroresponsive properties – including intrinsic subthreshold oscillations and pacemaking, resonance and phase-reset – which are thought to play a critical role in controlling neural network dynamics. Although these properties emerge from detailed representations of molecular-level mechanisms in “realistic” models, they cannot usually be generated by simplified neuronal models (although these may show spike-frequency adaptation and bursting). We report here that this whole set of properties can be generated by the extended generalized leaky integrate-and-fire (E-GLIF) neuron model. E-GLIF derives from the GLIF model family and is therefore mono-compartmental, keeps the limited computational load typical of a linear low-dimensional system, admits analytical solutions and can be tuned through gradient-descent algorithms. Importantly, E-GLIF is designed to maintain a correspondence between model parameters and neuronal membrane mechanisms through a minimum set of equations. In order to test its potential, E-GLIF was used to model a specific neuron showing rich and complex electroresponsiveness, the cerebellar Golgi cell, and was validated against experimental electrophysiological data recorded from Golgi cells in acute cerebellar slices. During simulations, E-GLIF was activated by stimulus patterns, including current steps and synaptic inputs, identical to those used for the experiments. The results demonstrate that E-GLIF can reproduce the whole set of complex neuronal dynamics typical of these neurons – including intensity-frequency curves, spike-frequency adaptation, post-inhibitory rebound bursting, spontaneous subthreshold oscillations, resonance, and phase-reset – providing a new effective tool to investigate brain dynamics in large-scale simulations.
Affiliation(s)
- Alice Geminiani
- NEARLab, Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
- Claudia Casellato
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Francesca Locatelli
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Francesca Prestori
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
- Alessandra Pedrocchi
- NEARLab, Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
- Egidio D'Angelo
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy
7
8
Schneider AD. Model Vestibular Nuclei Neurons Can Exhibit a Boosting Nonlinearity Due to an Adaptation Current Regulated by Spike-Triggered Calcium and Calcium-Activated Potassium Channels. PLoS One 2016; 11:e0159300. [PMID: 27427914] [PMCID: PMC4948908] [DOI: 10.1371/journal.pone.0159300]
Abstract
In vitro studies have previously found a class of vestibular nuclei neurons to exhibit a bidirectional afterhyperpolarization (AHP) in their membrane potential, due to calcium and calcium-activated potassium conductances. More recently, in vivo studies found such vestibular neurons to exhibit a boosting nonlinearity in their input-output tuning curves. In this paper, a Hodgkin-Huxley (HH) type neuron model, originally developed to reproduce the in vitro AHP, is shown to produce a boosting nonlinearity similar to that seen in vivo when the calcium conductance is increased. Indicative of a bifurcation, the HH model is reduced to a generalized integrate-and-fire (IF) model that preserves the bifurcation structure and boosting nonlinearity. By then projecting the neuron model's phase-space trajectories into 2D, the underlying geometric mechanism relating the AHP and the boosting nonlinearity is revealed. Further simplifications and approximations are made to derive analytic expressions for the steady-state firing rate as a function of bias current, μ, as well as the gain (i.e., its slope) and the position of its peak at μ = μ*. Finally, although the boosting nonlinearity has not yet been experimentally observed in vitro, testable predictions indicate how it might be found.
9
Spike-Based Bayesian-Hebbian Learning of Temporal Sequences. PLoS Comput Biol 2016; 12:e1004954. [PMID: 27213810] [PMCID: PMC4877102] [DOI: 10.1371/journal.pcbi.1004954]
Abstract
Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model’s feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire model neurons (AdEx). We show that the learning and speed of sequence replay depends on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison. From one moment to the next, in an ever-changing world, and awash in a deluge of sensory data, the brain fluidly guides our actions throughout an astonishing variety of tasks. Processing this ongoing bombardment of information is a fundamental problem faced by its underlying neural circuits. 
Given that the structure of our actions along with the organization of the environment in which they are performed can be intuitively decomposed into sequences of simpler patterns, an encoding strategy reflecting the temporal nature of these patterns should offer an efficient approach for assembling more complex memories and behaviors. We present a model that demonstrates how activity could propagate through recurrent cortical microcircuits as a result of a learning rule based on neurobiologically plausible time courses and dynamics. The model predicts that the interaction between several learning and dynamical processes constitute a compound mnemonic engram that can flexibly generate sequential step-wise increases of activity within neural populations.
10
Hass J, Hertäg L, Durstewitz D. A Detailed Data-Driven Network Model of Prefrontal Cortex Reproduces Key Features of In Vivo Activity. PLoS Comput Biol 2016; 12:e1004930. [PMID: 27203563] [PMCID: PMC4874603] [DOI: 10.1371/journal.pcbi.1004930]
Abstract
The prefrontal cortex is centrally involved in a wide range of cognitive functions and their impairment in psychiatric disorders. Yet, the computational principles that govern the dynamics of prefrontal neural networks, and link their physiological, biochemical and anatomical properties to cognitive functions, are not well understood. Computational models can help to bridge the gap between these different levels of description, provided they are sufficiently constrained by experimental data and capable of predicting key properties of the intact cortex. Here, we present a detailed network model of the prefrontal cortex, based on a simple computationally efficient single neuron model (simpAdEx), with all parameters derived from in vitro electrophysiological and anatomical data. Without additional tuning, this model could be shown to quantitatively reproduce a wide range of measures from in vivo electrophysiological recordings, to a degree where simulated and experimentally observed activities were statistically indistinguishable. These measures include spike train statistics, membrane potential fluctuations, local field potentials, and the transmission of transient stimulus information across layers. We further demonstrate that model predictions are robust against moderate changes in key parameters, and that synaptic heterogeneity is a crucial ingredient to the quantitative reproduction of in vivo-like electrophysiological behavior. Thus, we have produced a physiologically highly valid, in a quantitative sense, yet computationally efficient PFC network model, which helped to identify key properties underlying spike time dynamics as observed in vivo, and can be harvested for in-depth investigation of the links between physiology and cognition. Computational network models are an important tool for linking physiological and neuro-dynamical processes to cognition. 
However, harvesting network models for this purpose may depend less on how much biophysical detail is included than on how well the model captures the functional network physiology. Here, we present the first network model of the prefrontal cortex that not only has its single-neuron properties and anatomical layout tightly constrained by experimental data, but is also able to quantitatively reproduce a large range of spiking, field-potential, and membrane-voltage statistics obtained from in vivo data, without the need for specific parameter tuning. It thus represents a novel computational tool for addressing questions about the neurodynamics of cognition in health and disease.
Affiliation(s)
- Joachim Hass
- Department of Theoretical Neuroscience, Bernstein-Center for Computational Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim of Heidelberg University, Mannheim, Germany
- Loreen Hertäg
- Modelling of Cognitive Processes, Berlin Institute of Technology and Bernstein Center for Computational Neuroscience Berlin, Germany
- Daniel Durstewitz
- Department of Theoretical Neuroscience, Bernstein-Center for Computational Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim of Heidelberg University, Mannheim, Germany
11
Mensi S, Hagens O, Gerstner W, Pozzorini C. Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons. PLoS Comput Biol 2016; 12:e1004761. [PMID: 26907675] [PMCID: PMC4764342] [DOI: 10.1371/journal.pcbi.1004761]
Abstract
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. For that, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights on the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations. Over the last decades, a variety of simplified spiking models have been shown to achieve a surprisingly high performance in predicting the neuronal responses to in vitro somatic current injections. 
Because of the complex adaptive behavior featured by cortical neurons, this success is however restricted to limited stimulus ranges: model parameters optimized for a specific input regime are often inappropriate to describe the response to input currents with different statistical properties. In the present study, a new spiking neuron model is introduced that captures single-neuron computation over a wide range of input statistics and explains different aspects of the neuronal dynamics within a single framework. Our results indicate that complex forms of single neuron adaptation are mediated by the nonlinear dynamics of the firing threshold and that the input-output transformation performed by cortical pyramidal neurons can be intuitively understood in terms of an enhanced Generalized Linear Model in which both the input filter and the spike-history filter adapt to the input statistics.
Affiliation(s)
- Skander Mensi
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Olivier Hagens
- Laboratory of Neural Microcircuitry (LNMC), Brain Mind Institute, School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Wulfram Gerstner
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - Christian Pozzorini
- Laboratory of Computational Neuroscience (LCN), Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- * E-mail:
| |
Collapse
12
Hertäg L, Durstewitz D, Brunel N. Analytical approximations of the firing rate of an adaptive exponential integrate-and-fire neuron in the presence of synaptic noise. Front Comput Neurosci 2014; 8:116. [PMID: 25278872 PMCID: PMC4167001 DOI: 10.3389/fncom.2014.00116] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2014] [Accepted: 08/31/2014] [Indexed: 11/17/2022] Open
Abstract
Computational models offer a unique tool for understanding the network-dynamical mechanisms that mediate between physiological and biophysical properties and behavioral function. A traditional challenge in computational neuroscience, however, is that simple neuronal models that can be studied analytically fail to reproduce the diversity of electrophysiological behaviors seen in real neurons, while detailed neuronal models that do reproduce such diversity are analytically intractable and computationally expensive. A number of intermediate models have been proposed whose aim is to capture the diversity of firing behaviors and spike times of real neurons while entailing the simplest possible mathematical description. One such model is the exponential integrate-and-fire neuron with spike-rate adaptation (aEIF), which consists of two differential equations for the membrane potential (V) and an adaptation current (w). Despite its simplicity, it can reproduce a wide variety of physiologically observed spiking patterns, can be quantitatively fitted to physiological recordings, and, once fitted, can predict spike times on traces not used for model fitting. Here we compute the steady-state firing rate of the aEIF model in the presence of Gaussian synaptic noise, using two approaches. The first is based on the two-dimensional Fokker-Planck equation that describes the (V,w) probability distribution, which is solved using an expansion in the ratio between the time constants of the two variables. The second is based on the firing rate of the EIF model, averaged over the distribution of the w variable. These analytically derived closed-form expressions were tested against simulations of a large variety of model cells quantitatively fitted to in vitro electrophysiological recordings from pyramidal cells and interneurons. Theoretical predictions closely agreed with the firing rates of the simulated cells driven by in vivo-like synaptic noise.
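The brute-force counterpart to the closed-form expressions discussed here is direct Euler-Maruyama simulation of the two aEIF equations under a Gaussian noise current, which gives an empirical steady-state firing rate to check approximations against. The parameter values below are common textbook choices for the AdEx model, not the cell fits used in the paper, and sigma simply sets the noise amplitude in this discretization:

```python
import numpy as np

def adex_firing_rate(mu, sigma, T=2000.0, dt=0.05, seed=0):
    """Euler-Maruyama simulation of the aEIF (AdEx) model driven by a
    Gaussian white-noise current with mean mu (pA); returns the empirical
    firing rate in Hz. Parameters are generic textbook values."""
    C, g_L, E_L = 200.0, 10.0, -70.0        # pF, nS, mV
    V_T, Delta_T = -50.0, 2.0               # mV
    a, b, tau_w = 2.0, 60.0, 100.0          # nS, pA, ms
    V_reset, V_peak = -58.0, 0.0            # mV
    rng = np.random.default_rng(seed)
    V, w, n_spikes = E_L, 0.0, 0
    for _ in range(int(T / dt)):
        # white-noise current; 1/sqrt(dt) gives the diffusion scaling
        I = mu + sigma * rng.standard_normal() / np.sqrt(dt)
        dV = (-g_L*(V - E_L) + g_L*Delta_T*np.exp((V - V_T)/Delta_T) - w + I) / C
        dw = (a*(V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                     # spike: reset V, increment w
            V = V_reset
            w += b
            n_spikes += 1
    return 1000.0 * n_spikes / T
```

Sweeping mu and sigma over the in-vivo-like range and comparing against the closed-form rate expressions is exactly the kind of validation the abstract describes.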
Affiliation(s)
- Loreen Hertäg
- Department of Theoretical Neuroscience, Bernstein Center for Computational Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
- Daniel Durstewitz
- Department of Theoretical Neuroscience, Bernstein Center for Computational Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Faculty of Science and Environment, School of Computing and Mathematics, Plymouth University, Plymouth, UK
- Nicolas Brunel
- Departments of Statistics and Neurobiology, University of Chicago, Chicago, IL, USA
13
Ferguson KA, Huh CYL, Amilhon B, Williams S, Skinner FK. Simple, biologically-constrained CA1 pyramidal cell models using an intact, whole hippocampus context. F1000Res 2014; 3:104. [PMID: 25383182 PMCID: PMC4215760 DOI: 10.12688/f1000research.3894.1] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 05/06/2014] [Indexed: 01/24/2023] Open
Abstract
The hippocampus is a heavily studied brain structure due to its involvement in learning and memory. Detailed models of excitatory, pyramidal cells in hippocampus have been developed using a range of experimental data. These models have been used to help us understand, for example, the effects of synaptic integration and of voltage-gated channel densities and distributions on cellular responses. However, these cellular outputs need to be considered from the perspective of the networks in which they are embedded. In modeling approaches, if cellular representations are too detailed, exploring large network simulations quickly becomes computationally unwieldy. Thus, simple models are preferable, but at the same time they need to have a clear experimental basis so as to allow physiologically based understandings to emerge. In this article, we describe the development of simple models of CA1 pyramidal cells, as derived in a well-defined experimental context of an intact, whole hippocampus preparation expressing population oscillations. These models are based on the intrinsic properties and frequency-current profiles of CA1 pyramidal cells, and can be used to build, fully examine, and analyze large networks.
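A frequency-current (f-I) profile of the kind such simple models are constrained by can be computed directly from an Izhikevich-type neuron, a standard choice for this class of reduced models. The parameters below are generic regular-spiking values from the modeling literature, not the CA1 fits reported in the paper:

```python
import numpy as np

def izhikevich_fI(I_values, T=1000.0, dt=0.1):
    """f-I curve of a simple Izhikevich-type neuron: firing rate (Hz)
    for each injected current I (pA). Parameters are generic
    regular-spiking values, not fits to CA1 pyramidal cells."""
    C, k = 100.0, 0.7            # pF, nS/mV
    v_r, v_t = -60.0, -40.0      # resting and instantaneous threshold, mV
    a, b = 0.03, -2.0            # recovery-variable dynamics
    c, d = -50.0, 100.0          # reset voltage (mV) and recovery jump (pA)
    v_peak = 35.0                # spike cutoff, mV
    rates = []
    for I in I_values:
        v, u, n = v_r, 0.0, 0
        for _ in range(int(T / dt)):
            v += dt * (k*(v - v_r)*(v - v_t) - u + I) / C
            u += dt * a * (b*(v - v_r) - u)
            if v >= v_peak:      # spike: reset and adapt
                v, u, n = c, u + d, n + 1
        rates.append(1000.0 * n / T)
    return np.array(rates)
```

Fitting amounts to adjusting these parameters until the simulated f-I curve and intrinsic properties match the experimentally recorded ones.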
Affiliation(s)
- Katie A Ferguson
- Toronto Western Research Institute, University Health Network, Toronto, Ontario, M5T 2S8, Canada; Department of Physiology, University of Toronto, Toronto, Ontario, M5S 1A1, Canada
- Carey Y L Huh
- Department of Psychiatry, Douglas Mental Health University Institute, McGill University, Montreal, Quebec, H4G 1X6, Canada
- Benedicte Amilhon
- Department of Psychiatry, Douglas Mental Health University Institute, McGill University, Montreal, Quebec, H4G 1X6, Canada
- Sylvain Williams
- Department of Psychiatry, Douglas Mental Health University Institute, McGill University, Montreal, Quebec, H4G 1X6, Canada
- Frances K Skinner
- Toronto Western Research Institute, University Health Network, Toronto, Ontario, M5T 2S8, Canada; Department of Medicine (Neurology), Physiology, University of Toronto, Toronto, Ontario, M5S 1A1, Canada
14
Spanagel R, Durstewitz D, Hansson A, Heinz A, Kiefer F, Köhr G, Matthäus F, Nöthen MM, Noori HR, Obermayer K, Rietschel M, Schloss P, Scholz H, Schumann G, Smolka M, Sommer W, Vengeliene V, Walter H, Wurst W, Zimmermann US, Stringer S, Smits Y, Derks EM. A systems medicine research approach for studying alcohol addiction. Addict Biol 2013; 18:883-96. [PMID: 24283978 DOI: 10.1111/adb.12109] [Citation(s) in RCA: 72] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/11/2022]
Abstract
According to the World Health Organization, about 2 billion people drink alcohol. Excessive alcohol consumption can result in alcohol addiction, which is one of the most prevalent neuropsychiatric diseases afflicting our society today. Prevention and intervention of alcohol binging in adolescents and treatment of alcoholism are major unmet challenges affecting our health-care system and society alike. Our newly formed German SysMedAlcoholism consortium is using a new systems medicine approach and intends (1) to define individual neurobehavioral risk profiles in adolescents that are predictive of alcohol use disorders later in life and (2) to identify new pharmacological targets and molecules for the treatment of alcoholism. To achieve these goals, we will use omics information from epigenomics, genetics, transcriptomics, neurodynamics, global neurochemical connectomes and neuroimaging (IMAGEN; Schumann et al.) to feed mathematical prediction modules provided by two Bernstein Centers for Computational Neuroscience (Berlin and Heidelberg/Mannheim), the results of which will subsequently be functionally validated in independent clinical samples and appropriate animal models. This approach will lead to new early intervention strategies and identify innovative molecules for relapse prevention that will be tested in experimental human studies. This research program will ultimately help consolidate addiction research clusters in Germany that can effectively conduct large clinical trials, implement early intervention strategies and inform political and health-care decision makers.
Affiliation(s)
- Rainer Spanagel
- Institute of Psychopharmacology; Central Institute of Mental Health; Medical Faculty Mannheim; University of Heidelberg; Germany
- Daniel Durstewitz
- Bernstein Center for Computational Neuroscience; Central Institute of Mental Health; Germany
- Anita Hansson
- Institute of Psychopharmacology; Central Institute of Mental Health; Medical Faculty Mannheim; University of Heidelberg; Germany
- Andreas Heinz
- Department of Addictive Behaviour and Addiction Medicine; Central Institute of Mental Health; Germany
- Falk Kiefer
- Department of Genetic Epidemiology in Psychiatry; Central Institute of Mental Health; Germany
- Georg Köhr
- Institute of Psychopharmacology; Central Institute of Mental Health; Medical Faculty Mannheim; University of Heidelberg; Germany
- Markus M. Nöthen
- Department of Psychiatry; Charité University Medical Center; Germany
- Hamid R. Noori
- Institute of Psychopharmacology; Central Institute of Mental Health; Medical Faculty Mannheim; University of Heidelberg; Germany
- Klaus Obermayer
- Institute of Applied Mathematics; University of Heidelberg; Germany
- Marcella Rietschel
- Department of Genomics, Life & Brain Centre; University of Bonn; Germany
- Patrick Schloss
- Neural Information Processing Group; Technical University of Berlin; Germany
- Henrike Scholz
- Behavioral Neurogenetics, Zoological Institute; University of Cologne; Germany
- Gunter Schumann
- MRC-SGDP Centre; Institute of Psychiatry; King's College; UK
- Michael Smolka
- Department of Psychiatry and Psychotherapy; Technical University Dresden; Germany
- Wolfgang Sommer
- Institute of Psychopharmacology; Central Institute of Mental Health; Medical Faculty Mannheim; University of Heidelberg; Germany
- Valentina Vengeliene
- Institute of Psychopharmacology; Central Institute of Mental Health; Medical Faculty Mannheim; University of Heidelberg; Germany
- Henrik Walter
- Department of Addictive Behaviour and Addiction Medicine; Central Institute of Mental Health; Germany
- Wolfgang Wurst
- Institute of Developmental Genetics; Helmholtz Center Munich; Germany
- Uli S. Zimmermann
- Department of Psychiatry and Psychotherapy; Technical University Dresden; Germany
- Sven Stringer
- Psychiatry Department; Academic Medical Center; The Netherlands
- Brain Center Rudolf Magnus; University Medical Center; The Netherlands
- Yannick Smits
- Psychiatry Department; Academic Medical Center; The Netherlands
- Eske M. Derks
- Psychiatry Department; Academic Medical Center; The Netherlands
15
Hass J, Hertäg L, Quiroga Lombard SC, Golovko T, Durstewitz D. A computational model of prefrontal cortex based on physiologically derived cellular parameter distributions. BMC Neurosci 2013. [PMCID: PMC3704620 DOI: 10.1186/1471-2202-14-s1-p116] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022] Open