1. Ramlow L, Falcke M, Lindner B. An integrate-and-fire approach to Ca2+ signaling. Part I: Renewal model. Biophys J 2023; 122:713-736. PMID: 36635961; PMCID: PMC9989887; DOI: 10.1016/j.bpj.2023.01.007
Abstract
In computational neuroscience, integrate-and-fire models capture spike generation by a subthreshold dynamics supplemented with a simple fire-and-reset rule; they allow for a numerically efficient and analytically tractable description of stochastic single-cell as well as network dynamics. Stochastic spiking is also a prominent feature of Ca2+ signaling, which suggests adopting the integrate-and-fire approach for this fundamental biophysical process. The model introduced here consists of two components describing 1) the activity of clusters of inositol-trisphosphate receptor channels and 2) the dynamics of the global Ca2+ concentration in the cytosol. The cluster dynamics is given in terms of a cyclic Markov chain, capturing the puff, i.e., the punctuated release of Ca2+ from intracellular stores. The cytosolic Ca2+ concentration is described by an integrate-and-fire dynamics driven by the puff current. For the cyclic Markov chain we derive expressions for the statistics of the interpuff interval, the single-puff strength, and the puff current, assuming constant cytosolic Ca2+. This assumption is often well justified because cytosolic Ca2+ varies much more slowly than the cluster activity does. Furthermore, because the detailed two-component model is numerically expensive to simulate and difficult to treat analytically, we develop an analytical framework that approximates the driving puff current of the stochastic cytosolic Ca2+ dynamics by a temporally uncorrelated Gaussian noise. This approximation reduces our two-component system to an integrate-and-fire model with a nonlinear drift function and a multiplicative Gaussian white noise, a model that is known to generate a renewal spike train, i.e., a point process with statistically independent interspike intervals. The model allows for fast numerical simulations, permits the derivation of analytical expressions for the rate of Ca2+ spiking and the coefficient of variation of the interspike interval, and yields approximations of the interspike interval density and the spike train power spectrum. Comparison of these statistics to experimental data is discussed.
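The reduced model described above, an integrate-and-fire dynamics with a nonlinear drift and multiplicative white noise plus a fire-and-reset rule, can be illustrated with a minimal Euler-Maruyama sketch. The drift `f` and noise amplitude `g` below are toy placeholders, not the expressions derived in the paper:

```python
import numpy as np

def if_spike_times(f, g, c0, c_th, T, dt, seed=0):
    """Euler-Maruyama integration of dc = f(c) dt + g(c) dW with a
    fire-and-reset rule: when c reaches the threshold c_th, a spike time
    is recorded and c is reset to c0. f and g are illustrative toy
    functions, not the paper's derived drift/diffusion."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    noise = np.sqrt(dt) * rng.standard_normal(n)  # pre-drawn Wiener increments
    c, spikes = c0, []
    for i in range(n):
        c += f(c) * dt + g(c) * noise[i]
        if c >= c_th:
            spikes.append(i * dt)
            c = c0
    return np.array(spikes)

spikes = if_spike_times(f=lambda c: 1.0 - c,
                        g=lambda c: 0.2 * np.sqrt(max(c, 0.0)),
                        c0=0.1, c_th=0.9, T=2000.0, dt=0.01)
isi = np.diff(spikes)
rate = len(spikes) / 2000.0
cv = isi.std() / isi.mean()  # coefficient of variation of the interspike interval
```

From a long simulated trajectory, the spike rate and the coefficient of variation of the interspike interval can then be compared against analytical predictions of the kind derived in the paper.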
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department of Humboldt University Berlin, Berlin, Germany.
- Martin Falcke
- Physics Department of Humboldt University Berlin, Berlin, Germany; Max Delbrück Center for Molecular Medicine, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Physics Department of Humboldt University Berlin, Berlin, Germany
2. Franzen J, Ramlow L, Lindner B. The steady state and response to a periodic stimulation of the firing rate for a theta neuron with correlated noise. J Comput Neurosci 2023; 51:107-128. PMID: 36273087; PMCID: PMC9840600; DOI: 10.1007/s10827-022-00836-6
Abstract
The stochastic activity of neurons is caused by various sources of correlated fluctuations and can be described in terms of simplified, yet biophysically grounded, integrate-and-fire models. One paradigmatic model is the quadratic integrate-and-fire model and its equivalent phase description, the theta neuron. Here we study the theta neuron model driven by a correlated Ornstein-Uhlenbeck noise and by periodic stimuli. We apply the matrix-continued-fraction method to the associated Fokker-Planck equation to develop an efficient numerical scheme to determine the stationary firing rate as well as the stimulus-induced modulation of the instantaneous firing rate. For the stationary case, we identify the conditions under which the firing rate decreases or increases through the effect of the colored noise and compare our results to existing analytical approximations for limit cases. For an additional periodic signal we demonstrate how the linear and nonlinear response terms can be computed and report resonant behavior for some of them. We extend the method to the case of two periodic signals, generally with incommensurable frequencies, and present a particular case for which a strong mixed response to both signals is observed, i.e., where the response to the sum of signals differs significantly from the sum of responses to the single signals. We provide Python code for our computational method: https://github.com/jannikfranzen/theta_neuron.
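The model class studied here can be sketched with a direct stochastic simulation of the theta neuron driven by Ornstein-Uhlenbeck noise (the authors' actual code lives at the linked repository; the parameter names and values below are illustrative, not the paper's notation):

```python
import numpy as np

def theta_neuron_ou_spikes(mu, tau, diff, T, dt, seed=1):
    """Theta neuron driven by an Ornstein-Uhlenbeck noise eta (sketch):
        dtheta = [(1 - cos theta) + (1 + cos theta) (mu + eta)] dt,
        deta   = -(eta / tau) dt + (sqrt(2*diff) / tau) dW.
    A spike is registered whenever theta passes an odd multiple of pi."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    xi = rng.standard_normal(n)
    theta, eta, next_cross = 0.0, 0.0, np.pi
    spikes = []
    for i in range(n):
        theta += ((1 - np.cos(theta)) + (1 + np.cos(theta)) * (mu + eta)) * dt
        eta += -(eta / tau) * dt + np.sqrt(2 * diff * dt) / tau * xi[i]
        if theta >= next_cross:
            spikes.append(i * dt)
            next_cross += 2 * np.pi
    return np.array(spikes)

spikes = theta_neuron_ou_spikes(mu=0.5, tau=1.0, diff=0.1, T=1000.0, dt=0.005)
r0 = len(spikes) / 1000.0  # crude estimate of the stationary firing rate
```

Such brute-force estimates of the stationary rate are exactly what the matrix-continued-fraction scheme described in the abstract computes far more efficiently and accurately.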
Affiliation(s)
- Jannik Franzen
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Lukas Ramlow
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
- Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
3. Layer M, Senk J, Essink S, van Meegen A, Bos H, Helias M. NNMT: Mean-Field Based Analysis Tools for Neuronal Network Models. Front Neuroinform 2022; 16:835657. PMID: 35712677; PMCID: PMC9196133; DOI: 10.3389/fninf.2022.835657
Abstract
Mean-field theory of neuronal networks has led to numerous advances in our analytical and intuitive understanding of their dynamics during the past decades. To make mean-field-based analysis tools more accessible, we implemented an extensible, easy-to-use open-source Python toolbox that collects a variety of mean-field methods for the leaky integrate-and-fire neuron model. The Neuronal Network Mean-field Toolbox (NNMT) in its current state allows for estimating properties of large neuronal networks, such as firing rates, power spectra, and dynamical stability in mean-field and linear response approximation, without running simulations. In this article, we describe how the toolbox is implemented, show how it is used to reproduce results of previous studies, and discuss different use cases, such as parameter space explorations or mapping different network models. Although the initial version of the toolbox focuses on methods for leaky integrate-and-fire neurons, its structure is designed to be open and extensible. It aims to provide a platform for collecting analytical methods for neuronal network model analysis, such that the neuroscientific community can take maximal advantage of them.
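The kind of quantity such a toolbox computes can be illustrated by the classic diffusion-approximation (Siegert) formula for the stationary firing rate of a leaky integrate-and-fire neuron. The sketch below uses a plain trapezoidal rule and illustrative parameters; it is a generic mean-field computation of the sort NNMT implements, not NNMT's own API:

```python
import numpy as np
from math import erf, sqrt, pi

def lif_rate_siegert(mu, sigma, v_th, v_r, tau_m, tau_ref, n=4001):
    """Stationary rate of a leaky integrate-and-fire neuron in the diffusion
    approximation (Siegert formula):
        r = [ tau_ref + tau_m sqrt(pi) Int_{y_r}^{y_th} e^{y^2} (1 + erf y) dy ]^{-1},
    with y_r = (v_r - mu)/sigma and y_th = (v_th - mu)/sigma, evaluated here
    with a simple trapezoidal rule."""
    y = np.linspace((v_r - mu) / sigma, (v_th - mu) / sigma, n)
    integrand = np.exp(y ** 2) * (1.0 + np.array([erf(yi) for yi in y]))
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(y))
    return 1.0 / (tau_ref + tau_m * sqrt(pi) * integral)

# Noise-driven regime (subthreshold mean input); voltages in mV, times in s.
r = lif_rate_siegert(mu=10.0, sigma=5.0, v_th=15.0, v_r=0.0, tau_m=0.02, tau_ref=0.002)
```

Evaluating such formulas self-consistently across recurrently coupled populations is what allows firing rates and stability to be estimated without running network simulations.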
Affiliation(s)
- Moritz Layer
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Johanna Senk
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Simon Essink
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Alexander van Meegen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Institute of Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany
- Hannah Bos
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
4. Holzhausen K, Ramlow L, Pu S, Thomas PJ, Lindner B. Mean-return-time phase of a stochastic oscillator provides an approximate renewal description for the associated point process. Biol Cybern 2022; 116:235-251. PMID: 35166932; PMCID: PMC9068687; DOI: 10.1007/s00422-022-00920-1
Abstract
Stochastic oscillations can be characterized by a corresponding point process; this is a common practice in computational neuroscience, where oscillations of the membrane voltage under the influence of noise are often analyzed in terms of the interspike interval statistics, specifically the distribution and correlation of intervals between subsequent threshold-crossing times. More generally, crossing times and the corresponding interval sequences can be introduced for different kinds of stochastic oscillators that have been used to model variability of rhythmic activity in biological systems. In this paper we show that if we use the so-called mean-return-time (MRT) phase isochrons (introduced by Schwabedal and Pikovsky) to count the cycles of a stochastic oscillator with Markovian dynamics, the interphase interval sequence does not show any linear correlations, i.e., the corresponding sequence of passage times forms approximately a renewal point process. We first outline the general mathematical argument for this finding and illustrate it numerically for three models of increasing complexity: (i) the isotropic Guckenheimer-Schwabedal-Pikovsky oscillator that displays positive interspike interval (ISI) correlations if rotations are counted by passing the spoke of a wheel; (ii) the adaptive leaky integrate-and-fire model with white Gaussian noise that shows negative interspike interval correlations when spikes are counted in the usual way by the passage of a voltage threshold; (iii) a Hodgkin-Huxley model with channel noise (in the diffusion approximation represented by Gaussian noise) that exhibits weak but statistically significant interspike interval correlations, again for spikes counted when passing a voltage threshold. For all these models, linear correlations between intervals vanish when we count rotations by the passage of an MRT isochron. We finally discuss that the removal of interval correlations does not change the long-term variability and its effect on information transmission, especially in the neural context.
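The renewal property at issue here is conveniently quantified by the serial correlation coefficient of the interval sequence. A minimal estimator, applied to synthetic data standing in for the oscillator models, might look like:

```python
import numpy as np

def serial_correlation(intervals, lag=1):
    """Lag-k serial correlation coefficient rho_k = Cov(T_i, T_{i+k}) / Var(T_i)
    of an interval sequence. A renewal point process has rho_k ~ 0 for all
    k >= 1, up to finite-sample fluctuations."""
    x = np.asarray(intervals, dtype=float) - np.mean(intervals)
    return np.mean(x[:-lag] * x[lag:]) / np.var(x)

rng = np.random.default_rng(2)
renewal = rng.exponential(1.0, 100_000)     # independent (renewal) intervals
ma = 0.5 * (renewal[:-1] + renewal[1:])     # moving average: rho_1 = 0.5
rho_renewal = serial_correlation(renewal)   # approximately zero
rho_ma = serial_correlation(ma)             # clearly positive
```

Counting cycles by MRT-isochron crossings, as in the paper, drives exactly this statistic to zero for all lags.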
Affiliation(s)
- Konstantin Holzhausen
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Shusen Pu
- Department of Biomedical Engineering, 5814 Stevenson Center, Vanderbilt University, Nashville, TN 37215 USA
- Peter J. Thomas
- Department of Mathematics, Applied Mathematics, and Statistics, 212 Yost Hall, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, Ohio, USA
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
5. Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. Biol Cybern 2021; 115:539-562. PMID: 34668051; PMCID: PMC8551127; DOI: 10.1007/s00422-021-00899-1
Abstract
Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate ("escape noise"). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: for exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike interval densities as well as the population activities obtained from simulations with colored input noise and time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced across the sub- and suprathreshold firing regimes compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime. Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise, enabling progress in the theory of neuronal population dynamics.
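The escape-noise mechanism itself is simple to sketch: given a hazard rate λ(V), a spike is emitted in a time step of length dt with probability 1 − exp(−λ(V) dt). The exponential hazard below is a common textbook choice and the parameters are illustrative; it is not the level-crossing-derived hazard constructed in the paper:

```python
import numpy as np

def escape_noise_spikes(hazard, v_traj, dt, seed=3):
    """Stochastic spike emission from a (here deterministic) voltage
    trajectory: in each time step a spike occurs with probability
    1 - exp(-hazard(v) * dt). No reset is applied, so this is an
    inhomogeneous point-process sketch."""
    rng = np.random.default_rng(seed)
    p = 1.0 - np.exp(-hazard(v_traj) * dt)
    return np.nonzero(rng.random(len(v_traj)) < p)[0] * dt

dt = 0.001
t = np.arange(0.0, 100.0, dt)
v = -2.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t)   # subthreshold mean voltage
hazard = lambda u: 10.0 * np.exp(2.0 * u)      # exponential escape rate (textbook form)
spikes = escape_noise_spikes(hazard, v, dt)
```

The mapping derived in the paper amounts to choosing the hazard so that the resulting FPT statistics match those of the underlying colored-input-noise model.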
Affiliation(s)
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623, Berlin, Germany.
- Bernstein Center for Computational Neuroscience Berlin, 10115, Berlin, Germany.
6. Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. PMID: 34449771; PMCID: PMC8428727; DOI: 10.1371/journal.pcbi.1009261
Abstract
The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed, and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore, we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel's time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability.
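One of the mechanisms studied here, spike-triggered adaptation with white noise producing negative lag-1 interval correlations, can be reproduced with a minimal simulation (parameters are illustrative, not taken from the paper):

```python
import numpy as np

def lif_adapt_isi(mu, delta, tau_a, diff, T, dt, seed=4):
    """Leaky integrate-and-fire neuron with a spike-triggered adaptation
    current a:
        dv = (mu - v - a) dt + sqrt(2*diff) dW,   da = -(a / tau_a) dt,
    with v -> 0 and a -> a + delta whenever v crosses the threshold v = 1.
    Returns the sequence of interspike intervals."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    noise = np.sqrt(2 * diff * dt) * rng.standard_normal(n)
    v, a, spikes = 0.0, 0.0, []
    for i in range(n):
        v += (mu - v - a) * dt + noise[i]
        a -= a / tau_a * dt
        if v >= 1.0:
            spikes.append(i * dt)
            v = 0.0
            a += delta
    return np.diff(spikes)

isi = lif_adapt_isi(mu=2.0, delta=1.0, tau_a=5.0, diff=0.1, T=5000.0, dt=0.01)
x = isi - isi.mean()
rho1 = np.mean(x[:-1] * x[1:]) / np.var(isi)  # lag-1 serial correlation coefficient
```

With white noise only, a long interval lets the adaptation variable decay, favoring a short next interval, hence the negative lag-1 correlation; adding colored noise, as in the paper's theory, contributes the second geometric sequence to the correlation pattern.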
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt University zu Berlin, Berlin, Germany
7. Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2020; 102:022407. PMID: 32942450; DOI: 10.1103/PhysRevE.102.022407
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
8. Mankin R, Rekker A, Paekivi S. Statistical moments of the interspike intervals for a neuron model driven by trichotomous noise. Phys Rev E 2021; 103:062201. PMID: 34271748; DOI: 10.1103/PhysRevE.103.062201
Abstract
The influence of a colored three-level input noise (trichotomous noise) on the spike generation of a perfect integrate-and-fire (PIF) model of neurons is studied. Using a first-passage-time formulation, exact expressions for the Laplace transform of the output interspike interval (ISI) density and for the statistical moments of the ISIs (such as the coefficient of variation, the skewness, the serial correlation coefficient, and the Fano factor) are derived. To model the anomalous subdiffusion that can arise from, e.g., the trapping properties of dendritic spines, the model is extended by including a random operational time in the form of an inverse strictly increasing Lévy-type subordinator, and exact formulas for ISI statistics are given for this case as well. In particular, it is shown that in some parameter regimes the ISI density exhibits a trimodal structure. The results for the extended model show that the ISI serial correlation coefficient and the Fano factor are nonmonotonic with respect to the input current, which indicates that at an intermediate value of the input current the variability of the output spike trains is minimal. Similarities and differences between the behavior of the presented models and the previously investigated PIF models driven by dichotomous noise are also discussed.
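A brute-force check of such ISI statistics is straightforward: simulate a perfect integrate-and-fire neuron driven by a three-level Markov noise and estimate the interval statistics. The symmetric switching scheme below, where the noise jumps at a fixed rate to a level drawn uniformly from the three states, is one simple illustrative choice, not the paper's exact trichotomous transition structure:

```python
import numpy as np

def pif_trichotomous_isi(mu, amp, nu, T, dt, seed=5):
    """Perfect integrate-and-fire neuron dv = (mu + z(t)) dt with a
    three-level noise z in {-amp, 0, +amp}. At rate nu, z jumps to a level
    drawn uniformly from the three states (illustrative symmetric scheme).
    A spike resets v from the threshold v = 1 to 0."""
    rng = np.random.default_rng(seed)
    levels = (-amp, 0.0, amp)
    n = int(T / dt)
    jumps = rng.random(n) < nu * dt          # pre-drawn switching events
    new_states = rng.integers(0, 3, n)
    z, v, spikes = 0.0, 0.0, []
    for i in range(n):
        if jumps[i]:
            z = levels[new_states[i]]
        v += (mu + z) * dt
        if v >= 1.0:
            spikes.append(i * dt)
            v = 0.0
    return np.diff(spikes)

isi = pif_trichotomous_isi(mu=1.0, amp=0.8, nu=0.5, T=5000.0, dt=0.01)
cv = isi.std() / isi.mean()  # coefficient of variation of the ISIs
```

The exact Laplace-transform expressions derived in the paper give these moments without simulation; a sampled estimate like this one is mainly useful as a consistency check.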
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Astrid Rekker
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Sander Paekivi
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
9. Mankin R, Rekker A. Effects of transient subordinators on the firing statistics of a neuron model driven by dichotomous noise. Phys Rev E 2020; 102:012103. PMID: 32794976; DOI: 10.1103/PhysRevE.102.012103
Abstract
The behavior of a stochastic perfect integrate-and-fire (PIF) model of neurons is considered. The effect of temporally correlated random activity of synaptic inputs is modeled as a combination of an asymmetric dichotomous noise and a random operational time in the form of an inverse strictly increasing Lévy-type subordinator. Using a first-passage-time formulation, we find exact expressions for the output interspike interval (ISI) statistics. In particular, it is shown that in some parameter regimes the ISI density exhibits a multimodal structure. Moreover, it is demonstrated that the coefficient of variation, the serial correlation coefficient, and the Fano factor display a nonmonotonic dependence on the mean input current μ, i.e., the regularity of the ISIs is maximized at an intermediate value of μ. The features of spike statistics analytically revealed in our study are compared with previously obtained results for a perfect integrate-and-fire neuron model driven by dichotomous noise (without subordination).
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Astrid Rekker
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
10. Ponzi A, Barton SJ, Bunner KD, Rangel-Barajas C, Zhang ES, Miller BR, Rebec GV, Kozloski J. Striatal network modeling in Huntington's Disease. PLoS Comput Biol 2020; 16:e1007648. PMID: 32302302; PMCID: PMC7197869; DOI: 10.1371/journal.pcbi.1007648
Abstract
Medium spiny neurons (MSNs) comprise over 90% of cells in the striatum. In vivo MSNs display coherent burst firing cell assembly activity patterns, even though isolated MSNs do not burst fire intrinsically. This activity is important for the learning and execution of action sequences and is characteristically dysregulated in Huntington's Disease (HD). However, how dysregulation is caused by the various neural pathologies affecting MSNs in HD is unknown. Previous modeling work using simple cell models has shown that cell assembly activity patterns can emerge as a result of MSN inhibitory network interactions. Here, by directly estimating MSN network model parameters from single unit spiking data, we show that a network composed of much more physiologically detailed MSNs provides an excellent quantitative fit to wild type (WT) mouse spiking data, but only when network parameters are appropriate for the striatum. We find the WT MSN network is situated in a regime close to a transition from stable to strongly fluctuating network dynamics. This regime facilitates the generation of low-dimensional slowly varying coherent activity patterns and confers high sensitivity to variations in cortical driving. By re-estimating the model on HD spiking data we discover network parameter modifications are consistent across three very different types of HD mutant mouse models (YAC128, Q175, R6/2). In striking agreement with the known pathophysiology we find feedforward excitatory drive is reduced in HD compared to WT mice, while recurrent inhibition also shows phenotype dependency. We show that these modifications shift the HD MSN network to a sub-optimal regime where higher dimensional incoherent rapidly fluctuating activity predominates. Our results provide insight into a diverse range of experimental findings in HD, including cognitive and motor symptoms, and may suggest new avenues for treatment.
Affiliation(s)
- Adam Ponzi
- IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
- Scott J. Barton
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Kendra D. Bunner
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Claudia Rangel-Barajas
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Emily S. Zhang
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- Benjamin R. Miller
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- George V. Rebec
- Program in Neuroscience, Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, United States of America
- James Kozloski
- IBM Research, Computational Biology Center, Thomas J. Watson Research Laboratories, Yorktown Heights, New York, United States of America
11. Schmutz V, Gerstner W, Schwalger T. Mesoscopic population equations for spiking neural networks with synaptic short-term plasticity. J Math Neurosci 2020; 10:5. PMID: 32253526; PMCID: PMC7136387; DOI: 10.1186/s13408-020-00082-z
Abstract
Coarse-graining microscopic models of biological neural networks to obtain mesoscopic models of neural activities is an essential step towards multi-scale models of the brain. Here, we extend a recent theory for mesoscopic population dynamics with static synapses to the case of dynamic synapses exhibiting short-term plasticity (STP). The extended theory offers an approximate mean-field dynamics for the synaptic input currents arising from populations of spiking neurons and synapses undergoing Tsodyks-Markram STP. The approximate mean-field dynamics accounts for both the finite number of synapses and the correlation between the two synaptic variables of the model (utilization and available resources), and its numerical implementation is simple. Comparisons with Monte Carlo simulations of the microscopic model show that in both feedforward and recurrent networks, the mesoscopic mean-field model accurately reproduces the first- and second-order statistics of the total synaptic input into a postsynaptic neuron and accounts for stochastic switches between Up and Down states and for population spikes. The extended mesoscopic population theory of spiking neural networks with STP may be useful for a systematic reduction of detailed biophysical models of cortical microcircuits to numerically efficient and mathematically tractable mean-field models.
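The single-synapse Tsodyks-Markram dynamics underlying the mesoscopic theory can be sketched as a discrete-time update at presynaptic spike times. The ordering of the facilitation and depression updates below follows one common convention and is illustrative rather than the paper's exact formulation:

```python
import numpy as np

def tsodyks_markram_amps(spike_times, U, tau_d, tau_f):
    """Deterministic Tsodyks-Markram synapse sketch: between spikes the
    utilization u relaxes toward its baseline U and the resources x recover
    toward 1; at each presynaptic spike u facilitates by U*(1 - u), a
    fraction u*x of the resources is released (the returned amplitude),
    and x is depleted accordingly."""
    u, x, last_t = U, 1.0, None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)  # depression recovery
            u = U + (u - U) * np.exp(-dt / tau_f)      # facilitation decay
        u = u + U * (1.0 - u)   # facilitation jump at the spike
        amps.append(u * x)      # released fraction = effective amplitude
        x = x * (1.0 - u)       # resource depletion after release
        last_t = t
    return np.array(amps)

# Regular 20 Hz train at a depressing synapse (tau_d >> tau_f): amplitudes decay.
amps = tsodyks_markram_amps(spike_times=np.arange(0.0, 1.0, 0.05),
                            U=0.5, tau_d=0.8, tau_f=0.05)
```

The mesoscopic theory of the paper tracks the means and correlations of u and x across a finite population of such synapses instead of simulating each one.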
Affiliation(s)
- Valentin Schmutz
- Brain Mind Institute, École Polytechnique Féderale de Lausanne (EPFL), Lausanne, Switzerland.
- Wulfram Gerstner
- Brain Mind Institute, École Polytechnique Féderale de Lausanne (EPFL), Lausanne, Switzerland
- Tilo Schwalger
- Brain Mind Institute, École Polytechnique Féderale de Lausanne (EPFL), Lausanne, Switzerland
- Bernstein Center for Computational Neuroscience, Institut für Mathematik, Technische Universität Berlin, Berlin, Germany
12. van Vreeswijk C, Farkhooi F. Fredholm theory for the mean first-passage time of integrate-and-fire oscillators with colored noise input. Phys Rev E 2019; 100:060402. PMID: 31962454; DOI: 10.1103/PhysRevE.100.060402
Abstract
We develop a method to investigate the effect of noise timescales on the first-passage time of nonlinear oscillators. Using Fredholm theory, we derive an exact integral equation for the mean event rate of a leaky integrate-and-fire oscillator that receives constant input and temporally correlated noise. Furthermore, we show that Fredholm theory provides a unified framework to determine the system's scaling behavior for small and large noise timescales. In this framework, the leading-order and higher-order asymptotic corrections for slow and fast noise emerge naturally. We show that the scaling behavior in the two limits is not reciprocal. We discuss further how this approach can be extended to study the first-passage time in a general class of nonlinear oscillators driven by colored noise at arbitrary timescales.
Affiliation(s)
- Carl van Vreeswijk
- Centre de Neurophysique Physiologie et Pathologie, Paris Descartes University and CNRS UMR 8002 INCC, 75006 Paris, France
- Farzad Farkhooi
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, 10115 Berlin, Germany

13
Mattia M, Biggio M, Galluzzi A, Storace M. Dimensional reduction in networks of non-Markovian spiking neurons: Equivalence of synaptic filtering and heterogeneous propagation delays. PLoS Comput Biol 2019; 15:e1007404. [PMID: 31593569] [PMCID: PMC6799936] [DOI: 10.1371/journal.pcbi.1007404]
Abstract
Message passing between components of a distributed physical system is non-instantaneous and contributes to determining the time scales of the emerging collective dynamics. In biological neuron networks this is due in part to local synaptic filtering of exchanged spikes, and in part to the distribution of axonal transmission delays. How differently these two kinds of communication protocols affect the network dynamics is still an open issue, due to the difficulties in dealing with the non-Markovian nature of synaptic transmission. Here, we develop a mean-field dimensional reduction yielding an effective Markovian dynamics of the population density of the neuronal membrane potential, valid under the hypothesis of small fluctuations of the synaptic current. Within this limit, the resulting theory allows us to prove the formal equivalence between the two transmission mechanisms, holding for any synaptic time scale, integrate-and-fire neuron model, and spike emission regime, and for different network states even when the number of neurons is finite. The equivalence holds even for larger fluctuations of the synaptic input, if white noise currents are incorporated to model other possible biological features such as ionic channel stochasticity.
14
Qu G, Fan B, Fu X, Yu Y. The Impact of Frequency Scale on the Response Sensitivity and Reliability of Cortical Neurons to 1/fβ Input Signals. Front Cell Neurosci 2019; 13:311. [PMID: 31354432] [PMCID: PMC6637762] [DOI: 10.3389/fncel.2019.00311]
Abstract
Which intrinsic features of fluctuating input signals drive neurons to maximal excitation is a crucial question in neural coding. In this article, we examined both experimentally and theoretically the cortical neuronal responsivity (including firing rate and spike timing reliability) to input signals with different intrinsic correlational statistics (e.g., white noise with a 1/f0 power spectrum, pink 1/f noise, and brown 1/f2 noise) and different frequency ranges. Our results revealed that the response sensitivity and reliability of cortical neurons is much higher in response to 1/f noise stimuli with long-term correlations than to 1/f0 stimuli with short-term correlations over a broad frequency range, and also higher than to 1/f2 stimuli for all frequency ranges. In addition, we found that neuronal sensitivity diverges in opposite directions for 1/f noise and 1/f0 white noise as a function of the cutoff frequency of the input signal. As the cutoff frequency is progressively increased from 50 to 1,000 Hz, the neuronal responsiveness increases gradually for 1/f noise but decreases exponentially for white noise. Computational simulations of a general cortical model revealed that neuronal sensitivity and reliability to input signal statistics is dominated mainly by fast sodium inactivation, potassium activation, and the membrane time constant.
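The stimulus classes compared in this study (1/f0, 1/f, 1/f2) can be generated by shaping the spectrum of white noise. Below is a naive pure-Python DFT sketch, intended only to make the construction concrete (in practice one would use numpy.fft.ifft; the function name and seed are assumptions).

```python
import cmath, math, random

def powerlaw_noise(n, beta, seed=0):
    """Generate n samples of 1/f^beta noise (beta=0 white, 1 pink,
    2 brown) by assigning amplitude k**(-beta/2) and a random phase to
    each positive frequency, then inverse-transforming. O(n^2) DFT,
    for illustration only."""
    rng = random.Random(seed)
    spec = [0j] * n
    for k in range(1, (n + 1) // 2):        # skip DC and (even n) Nyquist
        amp = k ** (-beta / 2.0)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        spec[k] = amp * cmath.exp(1j * phase)
        spec[n - k] = spec[k].conjugate()   # Hermitian symmetry -> real signal
    # naive inverse DFT
    out = []
    for t in range(n):
        s = sum(spec[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n))
        out.append(s.real / n)
    return out
```

Because the DC component is zero, each generated trace has (numerically) zero mean; a low-pass cutoff like the 50-1,000 Hz range used in the study corresponds to zeroing spectral bins above the cutoff before the inverse transform.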
Affiliation(s)
- Yuguo Yu
- State Key Laboratory of Medical Neurobiology, School of Life Science, Human Phenome Institute, Institute of Brain Science, Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China

15
Puelma Touzel M, Wolf F. Statistical mechanics of spike events underlying phase space partitioning and sequence codes in large-scale models of neural circuits. Phys Rev E 2019; 99:052402. [PMID: 31212548] [DOI: 10.1103/physreve.99.052402]
Abstract
Cortical circuits operate in an inhibition-dominated regime of spiking activity. Recently, it was found that spiking circuit models in this regime can, despite disordered connectivity and asynchronous irregular activity, exhibit a locally stable dynamics that may be used for neural computation. A lack of suitable mathematical tools has so far precluded analytical insight into this phase. Here we present analytical methods tailored to the granularity of spike-based interactions for analyzing attractor geometry in high-dimensional spiking dynamics. We apply them to reveal the properties of the complex geometry of trajectories of population spiking activity in a canonical model of locally stable spiking dynamics. We find that attractor basin boundaries are the preimages of spike-time collision events involving connected neurons. These spike-based instabilities control the divergence rate of neighboring basins and have no equivalent in rate-based models. They are located, according to the disordered connectivity, at a random subset of edges in a hypercube representation of the phase space. Iterating these edges backward under the stable dynamics induces a partition refinement on this space that converges to the attractor basins. We formulate a statistical theory of the locations of such events relative to attracting trajectories via a tractable representation of local trajectory ensembles. Averaging over the disorder, we derive the basin diameter distribution, whose characteristic scale emerges from the relative strengths of the stabilizing inhibitory coupling and the destabilizing spike interactions. Our study provides an approach to analytically dissect how connectivity, coupling strength, and single-neuron dynamics shape the phase-space geometry in the locally stable regime of spiking neural circuit dynamics.
Affiliation(s)
- Maximilian Puelma Touzel
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany and Mila, Université de Montréal, Montréal, Quebec, Canada H2S 3H1
- Fred Wolf
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany; Faculty of Physics, Georg August University, 37077 Göttingen, Germany; Bernstein Center for Computational Neuroscience, 37077 Göttingen, Germany; and Kavli Institute for Theoretical Physics, University of California, Santa Barbara, Santa Barbara, California 93106-4111, USA

16
Baker C, Ebsch C, Lampl I, Rosenbaum R. Correlated states in balanced neuronal networks. Phys Rev E 2019; 99:052414. [PMID: 31212573] [DOI: 10.1103/physreve.99.052414]
Abstract
Understanding the magnitude and structure of interneuronal correlations and their relationship to synaptic connectivity structure is an important and difficult problem in computational neuroscience. Early studies showed that neuronal network models with excitatory-inhibitory balance naturally create very weak spike train correlations, defining the "asynchronous state." Later work showed that, under some connectivity structures, balanced networks can produce larger correlations between some neuron pairs, even when the average correlation is very small. All of these previous studies assume that the local network receives feedforward synaptic input from a population of uncorrelated spike trains. We show that when the spike trains providing feedforward input are correlated, the downstream recurrent network produces much larger correlations. We provide an in-depth analysis of the resulting "correlated state" in balanced networks and show that, unlike the asynchronous state, it produces a tight excitatory-inhibitory balance consistent with in vivo cortical recordings.
Affiliation(s)
- Cody Baker
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Christopher Ebsch
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Ilan Lampl
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, 7610001, Israel
- Robert Rosenbaum
- Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana 46556, USA
- Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana 46556, USA

17
Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E 2019; 99:032402. [PMID: 30999498] [DOI: 10.1103/physreve.99.032402]
Abstract
We study temporal correlations of interspike intervals, quantified by the network-averaged serial correlation coefficient (SCC), in networks of both current- and conductance-based purely inhibitory integrate-and-fire neurons. Numerical simulations reveal transitions to negative SCCs at intermediate values of bias current drive and network size. As bias drive and network size are increased past these values, the SCC returns to zero. The SCC is maximally negative at an intermediate value of the network oscillation strength. The dependence of the SCC on two canonical schemes for synaptic connectivity is studied, and it is shown that the results occur robustly in both schemes. For conductance-based synapses, the SCC becomes negative at the onset of both a fast and slow coherent network oscillation. We then show by means of offline simulations using prerecorded network activity that a neuron's SCC is highly sensitive to its number of presynaptic inputs. Finally, we devise a noise-reduced diffusion approximation for current-based networks that accounts for the observed temporal correlation transitions.
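The serial correlation coefficient studied here is, for a single spike train, the lag-k Pearson correlation of the interspike-interval sequence, SCC(k) = cov(T_i, T_{i+k}) / var(T_i); a minimal estimator (the function name is ours):

```python
def serial_correlation(isis, lag=1):
    """Serial correlation coefficient of an ISI sequence at a given lag:
    covariance of intervals lag apart, normalized by the ISI variance.
    A renewal process gives values near zero for all lags >= 1."""
    n = len(isis) - lag
    m = sum(isis) / len(isis)
    var = sum((t - m) ** 2 for t in isis) / len(isis)
    cov = sum((isis[i] - m) * (isis[i + lag] - m) for i in range(n)) / n
    return cov / var
```

As a sanity check, a perfectly alternating short-long ISI sequence has SCC(1) = -1 and SCC(2) = +1; the network-averaged SCC of the paper is the mean of this quantity over neurons.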
Affiliation(s)
- Wilhelm Braun
- Neural Network Dynamics and Computation, Institut für Genetik, Universität Bonn, Kirschallee 1, 53115 Bonn, Germany
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada

18
Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. [PMID: 29933363] [PMCID: PMC6039054] [DOI: 10.1371/journal.pcbi.1006232]
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks. Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. 
Here we focus on changes of strength over timescales of less than a second, caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restocking of packaged neurotransmitter. Here we extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
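A minimal stochastic sketch of the vesicle-depletion picture described above, a release-site model of the kind whose occupancy statistics the paper treats analytically (this is a Monte Carlo toy, not the authors' framework; function name and parameters are assumptions):

```python
import random, math

def depressing_synapse(spike_times, n_sites=5, p_release=0.5,
                       tau_refill=0.5, seed=3):
    """Stochastic vesicle-depletion synapse: each of n_sites holds at
    most one vesicle, releases it independently with probability
    p_release per presynaptic spike, and refills (if empty) with time
    constant tau_refill. Returns vesicles released per spike."""
    rng = random.Random(seed)
    occupied = [True] * n_sites
    last_t = None
    released = []
    for t in spike_times:
        if last_t is not None:
            # empty sites refill in the interval since the last spike
            p_refill = 1.0 - math.exp(-(t - last_t) / tau_refill)
            occupied = [o or (rng.random() < p_refill) for o in occupied]
        count = 0
        for i, o in enumerate(occupied):
            if o and rng.random() < p_release:
                occupied[i] = False   # vesicle released, site now empty
                count += 1
        released.append(count)
        last_t = t
    return released
```

Feeding in a bursty (temporally correlated) spike train instead of a regular one makes the depletion-driven depression of successive release counts visible, which is the transmission effect the paper quantifies.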
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom

19
Richard A, Orio P, Tanré E. An integrate-and-fire model to generate spike trains with long-range dependence. J Comput Neurosci 2018; 44:297-312. [PMID: 29574632] [DOI: 10.1007/s10827-018-0680-1]
Abstract
Long-range dependence (LRD) has been observed in a variety of phenomena in nature, and for several years also in the spiking activity of neurons. Often, this is interpreted as originating from a non-Markovian system. Here we show that a purely Markovian integrate-and-fire (IF) model, with a noisy slow adaptation term, can generate interspike intervals (ISIs) that appear to exhibit LRD. However, a proper analysis shows that this is not the case asymptotically. For comparison, we also consider a new model of an individual IF neuron driven by fractional (non-Markovian) noise. The correlations of its spike trains are studied and proven to have LRD, unlike those of classical IF models. On the other hand, to correctly measure long-range dependence, it is usually necessary to know whether the data are stationary. Thus, a methodology to evaluate stationarity of the ISIs is presented and applied to the various IF models. We explain how Markovian IF models may seem to have LRD because of non-stationarities.
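A standard way to probe apparent LRD in an ISI sequence, in the spirit of the analysis discussed here, is the aggregated-variance estimate of the Hurst exponent: the variance of block means scales as m^(2H-2) with block size m, so H near 0.5 indicates no long-range dependence. The rough estimator below is illustrative (function name and block sizes are our choices), and it is exactly the kind of estimator that non-stationarity can fool.

```python
import math

def aggregated_variance_hurst(x, block_sizes=(1, 2, 4, 8, 16)):
    """Aggregated-variance Hurst estimator: least-squares slope of
    log(var of block means) vs log(block size), mapped to H via
    slope = 2H - 2. Values of H > 0.5 suggest long-range dependence."""
    logs_m, logs_v = [], []
    for m in block_sizes:
        k = len(x) // m
        means = [sum(x[i * m:(i + 1) * m]) / m for i in range(k)]
        mu = sum(means) / k
        var = sum((v - mu) ** 2 for v in means) / k
        if var > 0:
            logs_m.append(math.log(m))
            logs_v.append(math.log(var))
    n = len(logs_m)
    mx, my = sum(logs_m) / n, sum(logs_v) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(logs_m, logs_v))
             / sum((a - mx) ** 2 for a in logs_m))
    return 1.0 + slope / 2.0
```

Applied to independent (renewal-like) intervals, the estimate hovers around 0.5; a slow drift in the mean ISI pushes it towards 1, mimicking LRD, which is the pitfall the paper analyzes.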
Affiliation(s)
- Alexandre Richard
- CentraleSupélec, Université Paris-Saclay, Laboratoire MICS et Fédération CNRS - FR3487, Gif-sur-Yvette, France
- Patricio Orio
- Instituto de Neurociencia, Facultad de Ciencias, Universidad de Valparaíso, Valparaiso, Chile
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Valparaiso, Chile
- Etienne Tanré
- Université Côte d'Azur, Inria, 2004 Route des Lucioles BP 93, 06902, Sophia-Antipolis, France

20
Pena RFO, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks. Front Comput Neurosci 2018; 12:9. [PMID: 29551968] [PMCID: PMC5840464] [DOI: 10.3389/fncom.2018.00009]
Abstract
Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. 
These approximations seem to be justified over a broad range of parameters as indicated by comparison with simulation results of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
Affiliation(s)
- Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Sebastian Vellmer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Davide Bernardi
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany

21
Mankin R, Paekivi S. Memory-induced resonancelike suppression of spike generation in a resonate-and-fire neuron model. Phys Rev E 2018; 97:012125. [PMID: 29448468] [DOI: 10.1103/physreve.97.012125]
Abstract
The behavior of a stochastic resonate-and-fire neuron model based on a reduction of a fractional noise-driven generalized Langevin equation (GLE) with a power-law memory kernel is considered. The effect of temporally correlated random activity of synaptic inputs, which arise from other neurons forming local and distant networks, is modeled as an additive fractional Gaussian noise in the GLE. Using a first-passage-time formulation, in certain system parameter domains exact expressions for the output interspike interval (ISI) density and for the survival probability (the probability that a spike is not generated) are derived and their dependence on input parameters, especially on the memory exponent, is analyzed. In the case of external white noise, it is shown that at intermediate values of the memory exponent the survival probability is significantly enhanced in comparison with the cases of strong and weak memory, which causes a resonancelike suppression of the probability of spike generation as a function of the memory exponent. Moreover, an examination of the dependence of multimodality in the ISI distribution on input parameters shows that there exists a critical memory exponent α_{c}≈0.402, which marks a dynamical transition in the behavior of the system. That phenomenon is illustrated by a phase diagram describing the emergence of three qualitatively different structures of the ISI distribution. Similarities and differences between the behavior of the model under internal and external noise are also discussed.
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Sander Paekivi
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia

22
Rajdl K, Lansky P, Kostal L. Entropy factor for randomness quantification in neuronal data. Neural Netw 2017; 95:57-65. [DOI: 10.1016/j.neunet.2017.07.016]
23
Doose J, Lindner B. Evoking prescribed spike times in stochastic neurons. Phys Rev E 2017; 96:032109. [PMID: 29346970] [DOI: 10.1103/physreve.96.032109]
Abstract
Single cell stimulation in vivo is a powerful tool to investigate the properties of single neurons and their functionality in neural networks. We present a method to determine a cell-specific stimulus that reliably evokes a prescribed spike train with high temporal precision of action potentials. We test the performance of this stimulus in simulations for two different stochastic neuron models. For a broad range of parameters and a neuron firing with intermediate firing rates (20-40 Hz) the reliability in evoking the prescribed spike train is close to its theoretical maximum that is mainly determined by the level of intrinsic noise.
Affiliation(s)
- Jens Doose
- Bernstein Center for Computational Neuroscience, Berlin 10115, Germany and Physics Department of the Humboldt University Berlin, Berlin 12489, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin 10115, Germany and Physics Department of the Humboldt University Berlin, Berlin 12489, Germany

24
Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, Shea-Brown E. From the statistics of connectivity to the statistics of spike times in neuronal networks. Curr Opin Neurobiol 2017; 46:109-119. [PMID: 28863386] [DOI: 10.1016/j.conb.2017.07.011]
Abstract
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits. The first is that local features of network connectivity can be surprisingly effective in predicting global statistics of activity across a network. The second is that, for the important case of large networks with excitatory-inhibitory balance, correlated spiking persists or vanishes depending on the spatial scales of recurrent and feedforward connectivity. We close by showing how these ideas, together with plasticity rules, can help to close the loop between network structure and activity statistics.
Affiliation(s)
- Yu Hu
- Center for Brain Science, Harvard University, United States
- Michael A Buice
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, United States; Center for the Neural Basis of Cognition, Pittsburgh, United States
- Krešimir Josić
- Department of Mathematics, University of Houston, United States; Department of Biology and Biochemistry, University of Houston, United States; Department of BioSciences, Rice University, United States
- Robert Rosenbaum
- Department of Mathematics, University of Notre Dame, United States
- Eric Shea-Brown
- Allen Institute for Brain Science, United States; Department of Applied Mathematics, University of Washington, United States; Department of Physiology and Biophysics, and University of Washington Institute for Neuroengineering, United States

25
Augustin M, Ladenbauer J, Baumann F, Obermayer K. Low-dimensional spike rate models derived from networks of adaptive integrate-and-fire neurons: Comparison and implementation. PLoS Comput Biol 2017. [PMID: 28644841] [PMCID: PMC5507472] [DOI: 10.1371/journal.pcbi.1005545]
Abstract
The spiking activity of single neurons can be well described by a nonlinear integrate-and-fire model that includes somatic adaptation. When exposed to fluctuating inputs, sparsely coupled populations of these model neurons exhibit stochastic collective dynamics that can be effectively characterized using the Fokker-Planck equation. This approach, however, leads to a model with an infinite-dimensional state space and non-standard boundary conditions. Here we derive from that description four simple models for the spike rate dynamics in terms of low-dimensional ordinary differential equations, using two different reduction techniques: one uses the spectral decomposition of the Fokker-Planck operator; the other is based on a cascade of two linear filters and a nonlinearity, which are determined from the Fokker-Planck equation and semi-analytically approximated. We evaluate the reduced models for a wide range of biologically plausible input statistics and find that both approximation approaches lead to spike rate models that accurately reproduce the spiking behavior of the underlying adaptive integrate-and-fire population. In particular, the cascade-based models are overall the most accurate and robust, especially in the sensitive region of rapidly changing input. For the mean-driven regime, however, when input fluctuations are not too strong and fast, the best performing model is based on the spectral decomposition. The low-dimensional models also well reproduce stable oscillatory spike rate dynamics that are generated either by recurrent synaptic excitation and neuronal adaptation or through delayed inhibitory synaptic feedback. The computational demands of the reduced models are very low, but the implementation complexity differs between the different model variants.
Therefore we have made available, as open source software, implementations that numerically integrate the low-dimensional spike rate models as well as the Fokker-Planck partial differential equation in efficient ways for arbitrary model parametrizations. The derived spike rate descriptions retain a direct link to the properties of single neurons, allow for convenient mathematical analyses of network states, and are well suited for application in neural mass/mean-field based brain network models. Characterizing the dynamics of biophysically modeled, large neuronal networks usually involves extensive numerical simulations. As an alternative to this expensive procedure we propose efficient models that describe the network activity in terms of a few ordinary differential equations. These systems are simple to solve and allow for convenient investigations of asynchronous, oscillatory or chaotic network states, because linear stability analyses and powerful related methods are readily applicable. We build upon two research lines on which substantial efforts have been exerted in the last two decades: (i) the development of single neuron models of reduced complexity that can accurately reproduce a large repertoire of observed neuronal behavior, and (ii) different approaches to approximate the Fokker-Planck equation that represents the collective dynamics of large neuronal networks. We combine these advances and extend recent approximation methods of the latter kind to obtain spike rate models that surprisingly well reproduce the macroscopic dynamics of the underlying neuronal network. At the same time the microscopic properties are retained through the single neuron model parameters. To enable fast adoption we have released an efficient Python implementation as open source software under a free license.
Affiliation(s)
- Moritz Augustin
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Josef Ladenbauer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, École Normale Supérieure, Paris, France
- Fabian Baumann
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Klaus Obermayer
- Department of Software Engineering and Theoretical Computer Science, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany

26
Droste F, Lindner B. Exact analytical results for integrate-and-fire neurons driven by excitatory shot noise. J Comput Neurosci 2017; 43:81-91. [PMID: 28585050 DOI: 10.1007/s10827-017-0649-5] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2017] [Revised: 04/14/2017] [Accepted: 05/04/2017] [Indexed: 11/24/2022]
Abstract
A neuron receives input from other neurons via electrical pulses, so-called spikes. The pulse-like nature of the input is frequently neglected in analytical studies; instead, the input is usually approximated as Gaussian. Recent experimental studies have shown, however, that an assumption underlying this approximation is often not met: individual presynaptic spikes can have a significant effect on a neuron's dynamics. It is thus desirable to explicitly account for the pulse-like nature of neural input, i.e., to consider neurons driven by shot noise - a long-standing problem that is mathematically challenging. In this work, we exploit the fact that excitatory shot noise with exponentially distributed weights can be obtained as a limit case of dichotomous noise, a Markovian two-state process. This allows us to obtain novel exact expressions for the stationary voltage density and the moments of the interspike-interval density of general integrate-and-fire neurons driven by such an input. For the special case of leaky integrate-and-fire neurons, we also give expressions for the power spectrum and the linear response to a signal. We verify and illustrate our expressions by comparison to simulations of leaky, quadratic, and exponential integrate-and-fire neurons.
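The input class treated in this paper is straightforward to sample. The Monte-Carlo sketch below (illustrative parameters, not those of the study) drives a leaky integrate-and-fire neuron with a Poisson kick train whose amplitudes are exponentially distributed, and reports the resulting interspike-interval statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_shot_noise(rate_in=5.0, mean_a=0.3, tau_m=1.0, v_th=1.0, v_r=0.0,
                   T=1000.0, dt=1e-3):
    """ISIs of an LIF neuron kicked by a Poisson train (rate rate_in) with
    exponentially distributed amplitudes (mean mean_a)."""
    n = int(T / dt)
    kick = rng.random(n) < rate_in * dt                 # Poisson input bins
    amp = np.where(kick, rng.exponential(mean_a, size=n), 0.0)
    v, t_last, isis = v_r, 0.0, []
    for j in range(n):
        v += -v / tau_m * dt + amp[j]                   # leak plus shot noise
        if v >= v_th:                                   # fire-and-reset rule
            t = j * dt
            isis.append(t - t_last)
            t_last, v = t, v_r
    return np.array(isis)

isis = lif_shot_noise()
print(f"{len(isis)} spikes, rate = {1 / isis.mean():.2f}, "
      f"CV = {isis.std() / isis.mean():.2f}")
```

Such simulations are what the exact expressions of the paper can be checked against; the analytical results themselves are not reproduced here.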
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Newtonstr. 15, 12489 Berlin, Germany
27
Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 68] [Impact Index Per Article: 9.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. 
Such population models are widely used to model neuroimaging data such as EEG, MEG, or fMRI signals. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and will ultimately link microscopic and macroscopic activity patterns.
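A toy calculation illustrates why finite-size fluctuations matter at the mesoscopic scale: for N independent Poisson neurons with rate r, the empirical population activity A_N(t) = (spike count)/(N·dt) has variance r/(N·dt), so its fluctuations shrink like 1/sqrt(N). This is only a back-of-the-envelope check, not the coupled population equations derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
r, dt, steps = 10.0, 1e-3, 200_000   # rate (Hz), bin width (s), number of bins

def activity_std(N):
    """Empirical std of the population activity of N independent
    Poisson neurons, A_N = counts / (N * dt)."""
    counts = rng.poisson(N * r * dt, size=steps)
    return (counts / (N * dt)).std()

s100, s1000 = activity_std(100), activity_std(1000)
print(f"std(A_N): N=100 -> {s100:.1f} Hz, N=1000 -> {s1000:.1f} Hz")
print(f"ratio ~ {s100 / s1000:.2f} (expected ~ sqrt(10) ~ 3.16)")
```

The tenfold increase in N reduces the activity fluctuations by sqrt(10), which is exactly the finite-size scaling that a mesoscopic theory must capture and a deterministic rate model ignores.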
28
Droste F, Lindner B. Exact results for power spectrum and susceptibility of a leaky integrate-and-fire neuron with two-state noise. Phys Rev E 2017; 95:012411. [PMID: 28208429 DOI: 10.1103/physreve.95.012411] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2016] [Indexed: 11/07/2022]
Abstract
The response properties of excitable systems driven by colored noise are of great interest, but are usually mathematically accessible only via approximations. For this reason, dichotomous noise, a rare example of a colored noise that often leads to analytically tractable problems, has been extensively used in the study of stochastic systems. Here, we calculate exact expressions for the power spectrum and the susceptibility of a leaky integrate-and-fire neuron driven by asymmetric dichotomous noise. While our results are in excellent agreement with simulations, they also highlight a limitation of using dichotomous noise as a simple model for more complex fluctuations: both the power spectrum and the susceptibility exhibit an undamped periodic structure, the origin of which we discuss in detail.
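A telegraph-noise-driven LIF neuron is easy to simulate, which is useful for checking exact results of this kind. The sketch below uses illustrative parameters (one noise state suprathreshold, one subthreshold) and estimates only ISI statistics; the paper's analytical power spectrum and susceptibility are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def lif_dichotomous(mu=0.8, eta=(0.6, -0.4), k=(2.0, 3.0),
                    tau_m=1.0, v_th=1.0, v_r=0.0, T=1000.0, dt=1e-3):
    """ISIs of an LIF neuron driven by asymmetric telegraph noise eta[state],
    which leaves state i with switching rate k[i]."""
    n = int(T / dt)
    u = rng.random(n)                 # pre-drawn uniforms for switching events
    v, state, t_last, isis = v_r, 0, 0.0, []
    for j in range(n):
        if u[j] < k[state] * dt:      # Markovian two-state switching
            state = 1 - state
        v += dt * (-v / tau_m + mu + eta[state])
        if v >= v_th:                 # fire-and-reset rule
            t = j * dt
            isis.append(t - t_last)
            t_last, v = t, v_r
    return np.array(isis)

isis = lif_dichotomous()
print(f"{len(isis)} spikes, mean ISI = {isis.mean():.2f}, "
      f"CV = {isis.std() / isis.mean():.2f}")
```

With these parameters the neuron fires regularly in the high-input state and stays silent in the low-input state, the regime in which the noise-induced periodic structure discussed in the abstract is most pronounced.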
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany and Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany and Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
29
Braun W, Thul R. Sign changes as a universal concept in first-passage-time calculations. Phys Rev E 2017; 95:012114. [PMID: 28208500 DOI: 10.1103/physreve.95.012114] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2016] [Indexed: 06/06/2023]
Abstract
First-passage-time problems are ubiquitous across many fields of study, including transport processes in semiconductors and biological synapses, evolutionary game theory and percolation. Despite their prominence, first-passage-time calculations have proven to be particularly challenging. Analytical results to date have often been obtained under strong conditions, leaving most of the exploration of first-passage-time problems to direct numerical computations. Here we present an analytical approach that allows the derivation of first-passage-time distributions for the wide class of nondifferentiable Gaussian processes. We demonstrate that the concept of sign changes naturally generalizes the common practice of counting crossings to determine first-passage events. Our method works across a wide range of time-dependent boundaries and noise strengths, thus alleviating common hurdles in first-passage-time calculations.
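The counting idea can be demonstrated on the simplest case, drifted Brownian motion, whose mean first-passage time over a level a is a/mu. The sketch below locates first passages as the first sign change of the boundary-shifted path; it only illustrates the counting concept, not the paper's analytical treatment of general nondifferentiable Gaussian processes.

```python
import numpy as np

rng = np.random.default_rng(3)

def first_passage_times(mu=1.0, sigma=0.5, a=1.0, dt=1e-3,
                        n_paths=2000, t_max=10.0):
    """First-passage times of drifted Brownian motion over level a, located
    as the first sign change of the boundary-shifted path x(t) - a."""
    n = int(t_max / dt)
    fpts = []
    for _ in range(n_paths):
        incr = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
        x = np.cumsum(incr)                        # discretely sampled path
        s = np.sign(x - a)
        changes = np.nonzero(s[:-1] != s[1:])[0]   # indices of sign changes
        if changes.size:
            fpts.append((changes[0] + 1) * dt)     # first sign change = FPT
    return np.array(fpts)

fpts = first_passage_times()
print(f"mean FPT = {fpts.mean():.3f} (theory: a/mu = 1.000)")
```

For a time-dependent boundary b(t) the same code applies with `x - a` replaced by `x - b(t)`, which is the generality emphasized in the abstract.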
Affiliation(s)
- Wilhelm Braun
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, Ottawa, Canada K1N 6N5
- Rüdiger Thul
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, United Kingdom
30
Deniz T, Rotter S. Solving the two-dimensional Fokker-Planck equation for strongly correlated neurons. Phys Rev E 2017; 95:012412. [PMID: 28208505 DOI: 10.1103/physreve.95.012412] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2016] [Indexed: 06/06/2023]
Abstract
Pairs of neurons in brain networks often share much of the input they receive from other neurons. Due to essential nonlinearities of the neuronal dynamics, the consequences for the correlation of the output spike trains are generally not well understood. Here we analyze the case of two leaky integrate-and-fire neurons using an approach which is nonperturbative with respect to the degree of input correlation. Our treatment covers both weakly and strongly correlated dynamics, generalizing previous results based on linear response theory.
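The setup can be probed by direct simulation: two LIF neurons receive Gaussian inputs sharing a fraction c of their variance, and the output correlation is read off from spike counts. The parameters below are illustrative; the paper's nonperturbative two-dimensional Fokker-Planck treatment is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(6)

def count_corr(c=0.5, mu=1.2, sigma=0.4, v_th=1.0,
               dt=2e-3, T=1000.0, win=5.0):
    """Spike-count correlation of two LIF neurons (tau_m = 1) whose Gaussian
    inputs share a fraction c of their variance."""
    n = int(T / dt)
    sq = sigma * np.sqrt(dt)
    common = rng.standard_normal(n)           # shared input component
    p0 = rng.standard_normal(n)               # private components
    p1 = rng.standard_normal(n)
    sc, sp = np.sqrt(c), np.sqrt(1.0 - c)
    v0 = v1 = 0.0
    s0, s1 = [], []
    for j in range(n):
        v0 += dt * (mu - v0) + sq * (sc * common[j] + sp * p0[j])
        v1 += dt * (mu - v1) + sq * (sc * common[j] + sp * p1[j])
        if v0 >= v_th:
            s0.append(j * dt)
            v0 = 0.0
        if v1 >= v_th:
            s1.append(j * dt)
            v1 = 0.0
    bins = np.arange(0.0, T + win, win)
    n0, _ = np.histogram(s0, bins)
    n1, _ = np.histogram(s1, bins)
    return np.corrcoef(n0, n1)[0, 1]

rho = count_corr()
print(f"input correlation c = 0.5 -> spike-count correlation ~ {rho:.2f}")
```

The output correlation is positive but generally differs from the input correlation c; quantifying this nonlinear correlation transfer, including for large c, is what the cited analysis provides.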
Affiliation(s)
- Taşkın Deniz
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Hansastraße 9a, 79104 Freiburg, Germany
- Stefan Rotter
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Hansastraße 9a, 79104 Freiburg, Germany
31
Mankin R, Rekker A. Response to a periodic stimulus in a perfect integrate-and-fire neuron model driven by colored noise. Phys Rev E 2016; 94:062103. [PMID: 28085436 DOI: 10.1103/physreve.94.062103] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2016] [Indexed: 06/06/2023]
Abstract
The output interspike interval statistics of a stochastic perfect integrate-and-fire neuron model driven by an additive exogenous periodic stimulus are considered. The effect of temporally correlated random activity of synaptic inputs is modeled by an additive symmetric dichotomous noise. Using a first-passage-time formulation, exact expressions for the output interspike interval density and for the serial correlation coefficient are derived in the nonsteady regime, and their dependence on input parameters (e.g., the noise correlation time and amplitude as well as the frequency of the input current) is analyzed. It is shown that the interplay of periodic forcing and colored noise can cause a variety of nonequilibrium cooperation effects: sign reversals of the interspike interval correlations as a function of the noise-switching rate and of the forcing frequency, a power-law-like decay of the oscillations of the serial correlation coefficients in the long-lag limit, and amplification of the output signal modulation in the instantaneous firing rate of the neural response. The features of the spike statistics in the limits of slow and fast noise are also discussed.
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Astrid Rekker
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
32
D'Onofrio G, Pirozzi E. Successive spike times predicted by a stochastic neuronal model with a variable input signal. MATHEMATICAL BIOSCIENCES AND ENGINEERING : MBE 2016; 13:495-507. [PMID: 27106180 DOI: 10.3934/mbe.2016003] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
Two different stochastic processes are used to model the evolution of the membrane voltage of a neuron exposed to a time-varying input signal. The first process is an inhomogeneous Ornstein-Uhlenbeck process, and its first passage time through a constant threshold is used to model the first spike time after the signal onset. The second process is a Gauss-Markov process identified by a particular mean function dependent on the first passage time of the first process. It is shown that the second process is also of a diffusion type. The probability density function of the maximum between the first passage time of the first and the second process is considered to approximate the distribution of the second spike time. Results obtained by simulation are compared with numerical and asymptotic approximations. A general equation to model successive spike times is given. Finally, examples with specific input signals are provided.
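A brute-force version of the first-spike-time problem described here is straightforward: Euler-Maruyama paths of an Ornstein-Uhlenbeck process with a sinusoidal input signal, stopped at a constant threshold. The signal and all parameters below are illustrative, not those of the cited model.

```python
import numpy as np

rng = np.random.default_rng(4)

def first_spike_times(theta=1.0, mu0=1.2, A=0.3, f=1.0, sigma=0.3,
                      v_th=1.0, dt=1e-3, t_max=10.0, n_paths=2000):
    """First-passage times of an inhomogeneous OU process (sinusoidal drive)
    through a constant threshold, all paths simulated in parallel."""
    n = int(t_max / dt)
    drive = mu0 + A * np.sin(2 * np.pi * f * np.arange(n) * dt)
    v = np.zeros(n_paths)
    fpt = np.full(n_paths, np.nan)
    alive = np.ones(n_paths, dtype=bool)       # paths not yet crossed
    for j in range(n):
        v = (v + dt * (-theta * v + drive[j])
               + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
        crossed = alive & (v >= v_th)
        fpt[crossed] = (j + 1) * dt
        alive &= ~crossed
        if not alive.any():
            break
    return fpt[~np.isnan(fpt)]

fst = first_spike_times()
print(f"{fst.size}/2000 paths crossed; "
      f"median first-spike time = {np.median(fst):.2f}")
```

The empirical first-passage-time histogram obtained this way is what the paper's numerical and asymptotic approximations of the first (and, via the Gauss-Markov construction, second) spike-time densities would be compared against.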
Affiliation(s)
- Giuseppe D'Onofrio
- Dipartimento di Matematica e Applicazioni, Università degli Studi di Napoli Federico II, Via Cinthia, Monte S. Angelo, Napoli, 80126, Italy
33
34
Mankin R, Lumi N. Statistics of a leaky integrate-and-fire model of neurons driven by dichotomous noise. Phys Rev E 2016; 93:052143. [PMID: 27300865 DOI: 10.1103/physreve.93.052143] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2016] [Indexed: 06/06/2023]
Abstract
The behavior of a stochastic leaky integrate-and-fire model of neurons is considered. The effect of temporally correlated random neuronal input is modeled as a colored two-level (dichotomous) Markovian noise. Relying on the Riemann method, exact expressions for the output interspike interval density and for the serial correlation coefficient are derived, and their dependence on noise parameters (such as correlation time and amplitude) is analyzed. In particular, noise-induced sign reversals and a resonance-like amplification of the kurtosis of the interspike interval distribution are established. The features of spike statistics, analytically revealed in our study, are compared with recently obtained results for a perfect integrate-and-fire neuron model.
Affiliation(s)
- Romi Mankin
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
- Neeme Lumi
- School of Natural Sciences and Health, Tallinn University, 29 Narva Road, 10120 Tallinn, Estonia
35
Rosenbaum R. A Diffusion Approximation and Numerical Methods for Adaptive Neuron Models with Stochastic Inputs. Front Comput Neurosci 2016; 10:39. [PMID: 27148036 PMCID: PMC4840919 DOI: 10.3389/fncom.2016.00039] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2015] [Accepted: 04/04/2016] [Indexed: 11/16/2022] Open
Abstract
Characterizing the spiking statistics of neurons receiving noisy synaptic input is a central problem in computational neuroscience. Monte Carlo approaches to this problem are computationally expensive and often fail to provide mechanistic insight. Thus, the field has seen the development of mathematical and numerical approaches, often relying on a Fokker-Planck formalism. These approaches force a compromise between biological realism, accuracy and computational efficiency. In this article we develop an extension of existing diffusion approximations to more accurately approximate the response of neurons with adaptation currents and noisy synaptic currents. The implementation refines existing numerical schemes for solving the associated Fokker-Planck equations to improve computational efficiency and accuracy. Computer code implementing the developed algorithms is made available to the public.
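The classical diffusion approximation underlying such schemes replaces a Poisson synaptic input (rate r_in, jump size a) by Gaussian white noise with matched mean mu = r_in·a and intensity sigma² = r_in·a². The sketch below checks this match on a subthreshold (non-spiking) leaky membrane with illustrative parameters; it does not include the adaptation currents that the article's refined approximation addresses.

```python
import numpy as np

rng = np.random.default_rng(5)

tau_m, dt, n, burn = 0.02, 1e-4, 200_000, 20_000
r_in, a = 2000.0, 0.005              # input rate (Hz) and jump size
mu, sig2 = r_in * a, r_in * a**2     # diffusion approximation: matched moments

kicks = a * rng.poisson(r_in * dt, size=n)         # shot-noise input per bin
xi = np.sqrt(sig2 * dt) * rng.standard_normal(n)   # matched Gaussian input
v_shot = np.empty(n)
v_diff = np.empty(n)
vs = vd = 0.0
for j in range(n):
    vs += dt * (-vs / tau_m) + kicks[j]
    vd += dt * (-vd / tau_m + mu) + xi[j]
    v_shot[j] = vs
    v_diff[j] = vd

m_s, m_d = v_shot[burn:].mean(), v_diff[burn:].mean()
s_s, s_d = v_shot[burn:].std(), v_diff[burn:].std()
print(f"mean: shot {m_s:.3f} vs diffusion {m_d:.3f} (theory {mu * tau_m:.3f})")
print(f"std : shot {s_s:.4f} vs diffusion {s_d:.4f}")
```

In this high-rate, small-amplitude regime the two membranes agree in both mean and standard deviation; the approximation degrades for sparse, large kicks, which is one motivation for the refinements discussed in the article.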
Affiliation(s)
- Robert Rosenbaum
- Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, IN, USA
36
Schwalger T, Lindner B. Analytical approach to an integrate-and-fire model with spike-triggered adaptation. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:062703. [PMID: 26764723 DOI: 10.1103/physreve.92.062703] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/31/2015] [Indexed: 06/05/2023]
Abstract
The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
Affiliation(s)
- Tilo Schwalger
- Brain Mind Institute, École Polytechnique Fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
37
Schuecker J, Diesmann M, Helias M. Modulated escape from a metastable state driven by colored noise. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:052119. [PMID: 26651659 DOI: 10.1103/physreve.92.052119] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/06/2014] [Indexed: 06/05/2023]
Abstract
Many phenomena in nature are described by excitable systems driven by colored noise. The temporal correlations in the fluctuations hinder an analytical treatment. Here we present a general method of reduction to a white-noise system, capturing the color of the noise by effective and time-dependent boundary conditions. We apply the formalism to a model of the excitability of neuronal membranes, the leaky integrate-and-fire neuron model, revealing an analytical expression for the linear response of the system valid up to moderate frequencies. The closed-form analytical expression enables the characterization of the response properties of such excitable units and the assessment of oscillations emerging in networks thereof.
Affiliation(s)
- Jannis Schuecker
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
38
Marcoux CM, Clarke SE, Nesse WH, Longtin A, Maler L. Balanced ionotropic receptor dynamics support signal estimation via voltage-dependent membrane noise. J Neurophysiol 2015; 115:530-45. [PMID: 26561607 DOI: 10.1152/jn.00786.2015] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2015] [Accepted: 11/10/2015] [Indexed: 11/22/2022] Open
Abstract
Encoding behaviorally relevant stimuli in a noisy background is critical for animals to survive in their natural environment. We identify core biophysical and synaptic mechanisms that permit the encoding of low-frequency signals in pyramidal neurons of the weakly electric fish Apteronotus leptorhynchus, an animal that can accurately encode even minuscule amplitude modulations of its self-generated electric field. We demonstrate that slow NMDA receptor (NMDA-R)-mediated excitatory postsynaptic potentials (EPSPs) are able to summate over many interspike intervals (ISIs) of the primary electrosensory afferents (EAs), effectively eliminating the baseline EA ISI correlations from the pyramidal cell input. Together with a dynamic balance of NMDA-R and GABA-A-R currents, this permits stimulus-evoked changes in EA spiking to be transmitted efficiently to target electrosensory lobe (ELL) pyramidal cells, for encoding low-frequency signals. Interestingly, AMPA-R activity is depressed and appears to play a negligible role in the generation of action potentials. Instead, we hypothesize that cell-intrinsic voltage-dependent membrane noise supports the encoding of perithreshold sensory input; this noise drives a significant proportion of pyramidal cell spikes. Together, these mechanisms may be sufficient for the ELL to encode signals near the threshold of behavioral detection.
Affiliation(s)
- Curtis M Marcoux
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Stephen E Clarke
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada
- William H Nesse
- Department of Mathematics, University of Utah, Salt Lake City, Utah
- Andre Longtin
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Department of Physics, University of Ottawa, Ottawa, Ontario, Canada; Brain and Mind Institute and Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
- Leonard Maler
- Department of Cellular and Molecular Medicine, University of Ottawa, Ottawa, Ontario, Canada; Brain and Mind Institute and Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada
39
Wieland S, Bernardi D, Schwalger T, Lindner B. Slow fluctuations in recurrent networks of spiking neurons. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:040901. [PMID: 26565154 DOI: 10.1103/physreve.92.040901] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/24/2015] [Indexed: 06/05/2023]
Abstract
Networks of fast nonlinear elements may display slow fluctuations if interactions are strong. We find a transition in the long-term variability of a sparse recurrent network of perfect integrate-and-fire neurons at which the Fano factor switches from zero to infinity and the correlation time is minimized. This corresponds to a bifurcation in a linear map arising from the self-consistency of temporal input and output statistics. More realistic neural dynamics with a leak current and refractory period lead to smoothed transitions and modified critical couplings that can be theoretically predicted.
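The self-consistency argument can be caricatured by a one-dimensional linear map for the long-term variance of the network activity: iteration converges for slope g < 1 and diverges for g > 1, mirroring the described switch of the Fano factor from zero to infinity at the bifurcation. The map and its coefficients are purely illustrative, not the self-consistency map derived in the paper.

```python
# Toy linear map q -> g*q + q0 for a self-consistently determined
# long-term variance q; coefficients are illustrative placeholders.
def iterate_map(g, q0=1.0, n=100):
    q = 0.0
    for _ in range(n):
        q = g * q + q0
    return q

# Below the critical slope the iteration settles at q0/(1-g); above it,
# the variance grows without bound.
print(f"g=0.5 -> q ~ {iterate_map(0.5):.2f} (finite fixed point q0/(1-g) = 2)")
print(f"g=1.2 -> q ~ {iterate_map(1.2):.2e} (diverging)")
```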
Affiliation(s)
- Stefan Wieland
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Tilo Schwalger
- School of Computer and Communication Sciences and School of Life Sciences, Brain Mind Institute, École Polytechnique Fédérale de Lausanne, Station 15, 1015 Lausanne EPFL, Switzerland
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstrasse 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstrasse 15, 12489 Berlin, Germany