1
Kiessling L, Lindner B. Extraction of parameters of a stochastic integrate-and-fire model with adaptation from voltage recordings. Biological Cybernetics 2024; 119:2. [PMID: 39738681] [DOI: 10.1007/s00422-024-01000-2] [Received: 05/07/2024] [Accepted: 11/21/2024] [Indexed: 01/02/2025]
Abstract
Integrate-and-fire models are an important class of phenomenological neuronal models that are frequently used in computational studies of single neural activity, population activity, and recurrent neural networks. If these models are used to understand and interpret electrophysiological data, it is important to reliably estimate the values of the model's parameters. However, there are no standard methods for the parameter estimation of integrate-and-fire models. Here, we identify the model parameters of an adaptive integrate-and-fire neuron with temporally correlated noise by analyzing membrane potential and spike trains in response to a current step. Explicit formulas for the parameters are analytically derived by stationary and time-dependent ensemble averaging of the model dynamics. Specifically, we give mathematical expressions for the adaptation time constant, the adaptation strength, the membrane time constant, and the mean constant input current. These theoretical predictions are validated by numerical simulations for a broad range of system parameters. Importantly, we demonstrate that the parameters can be extracted by using only a modest number of trials. This is particularly encouraging, as the number of trials in experimental settings is often limited. Hence, our formulas may be useful for the extraction of effective parameters from neurophysiological data obtained from standard current-step experiments.
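One ingredient of such step-response averaging can be illustrated with a toy sketch: if the trial-averaged voltage relaxes exponentially toward its new steady state after the step, the membrane time constant follows from a log-linear fit. The snippet below (plain NumPy; all parameter values and names are illustrative, not taken from the paper) recovers a known time constant from synthetic data.

```python
import numpy as np

# Toy illustration (not the paper's estimator): if the trial-averaged voltage
# after a current step follows V(t) = V_inf + (V0 - V_inf) * exp(-t / tau),
# the membrane time constant tau can be read off a log-linear fit.
def estimate_tau(t, v_mean, v_inf):
    y = np.log(v_inf - v_mean)        # log of the decaying distance to V_inf
    slope, _ = np.polyfit(t, y, 1)    # slope = -1 / tau
    return -1.0 / slope

rng = np.random.default_rng(0)
tau_true, v0, v_inf = 20.0, -70.0, -55.0            # ms, mV (illustrative)
t = np.linspace(0.0, 60.0, 300)
trials = v_inf + (v0 - v_inf) * np.exp(-t / tau_true) \
    + 0.05 * rng.standard_normal((100, t.size))     # 100 noisy trials
tau_hat = estimate_tau(t, trials.mean(axis=0), v_inf)
```

With only 100 trials the estimate typically lands within a few percent of the true 20 ms, consistent with the abstract's point that a modest number of trials can suffice.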
Affiliation(s)
- Lilli Kiessling
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany.
- Physics Department, Technische Universität Berlin, Hardenbergstr. 36, 10623 Berlin, Germany.
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
2
Ramlow L, Lindner B. Noise intensity of a Markov chain. Phys Rev E 2024; 110:014139. [PMID: 39161007] [DOI: 10.1103/physreve.110.014139] [Received: 02/16/2024] [Accepted: 07/08/2024] [Indexed: 08/21/2024]
Abstract
Stochastic transitions between discrete microscopic states play an important role in many physical and biological systems. Often these transitions lead to fluctuations on a macroscopic scale. A classic example from neuroscience is the stochastic opening and closing of ion channels and the resulting fluctuations in membrane current. When the microscopic transitions are fast, the macroscopic fluctuations are nearly uncorrelated and can be fully characterized by their mean and noise intensity. We show how, for an arbitrary Markov chain, the noise intensity can be determined from an algebraic equation, based on the transition rate matrix; these results are in agreement with earlier results from the theory of zero-frequency noise in quantum mechanical and classical systems. We demonstrate the validity of the theory using an analytically tractable two-state Markovian dichotomous noise, an eight-state model for a calcium channel subunit (De Young-Keizer model), and Markov models of the voltage-gated sodium and potassium channels as they appear in a stochastic version of the Hodgkin-Huxley model.
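For the simplest case mentioned in the abstract, symmetric two-state dichotomous (telegraph) noise x(t) = ±a with switching rate r in each direction, the noise intensity is known in closed form, D = a²/(2r), and can be checked numerically against the long-time variance of the integrated noise, Var[∫₀ᵀ x dt] ≈ 2DT. A minimal sketch (plain NumPy; parameter values are illustrative):

```python
import numpy as np

# Symmetric telegraph noise x(t) = +/- a, switching rate r in each direction.
# Its noise intensity is D = a^2 / (2 r); we estimate it from the relation
# Var[ integral_0^T x(t) dt ] ~ 2 * D * T for T much longer than 1 / (2 r).
rng = np.random.default_rng(1)
a, r = 1.0, 0.5
dt, T, n_trials = 0.01, 200.0, 400
n_steps = int(T / dt)

x = np.where(rng.random(n_trials) < 0.5, a, -a)   # stationary initial states
s = np.zeros(n_trials)                            # running integrals
for _ in range(n_steps):
    s += x * dt
    flip = rng.random(n_trials) < r * dt          # switch with probability r*dt
    x = np.where(flip, -x, x)

D_est = s.var() / (2.0 * T)
D_theory = a**2 / (2.0 * r)                       # = 1.0 for these values
```

The Monte Carlo estimate agrees with the closed-form value to within the sampling error of the 400 trials.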
3
Franzen J, Ramlow L, Lindner B. The steady state and response to a periodic stimulation of the firing rate for a theta neuron with correlated noise. J Comput Neurosci 2023; 51:107-128. [PMID: 36273087] [PMCID: PMC9840600] [DOI: 10.1007/s10827-022-00836-6] [Received: 01/11/2022] [Revised: 07/29/2022] [Accepted: 09/01/2022] [Indexed: 01/18/2023]
Abstract
The stochastic activity of neurons is caused by various sources of correlated fluctuations and can be described in terms of simplified, yet biophysically grounded, integrate-and-fire models. One paradigmatic model is the quadratic integrate-and-fire model and its equivalent phase description, the theta neuron. Here we study the theta neuron model driven by a correlated Ornstein-Uhlenbeck noise and by periodic stimuli. We apply the matrix-continued-fraction method to the associated Fokker-Planck equation to develop an efficient numerical scheme to determine the stationary firing rate as well as the stimulus-induced modulation of the instantaneous firing rate. For the stationary case, we identify the conditions under which the firing rate decreases or increases through the effect of the colored noise and compare our results to existing analytical approximations for limit cases. For an additional periodic signal, we demonstrate how the linear and nonlinear response terms can be computed and report resonant behavior for some of them. We extend the method to the case of two periodic signals, generally with incommensurable frequencies, and present a particular case for which a strong mixed response to both signals is observed, i.e., where the response to the sum of signals differs significantly from the sum of responses to the single signals. We provide Python code for our computational method: https://github.com/jannikfranzen/theta_neuron
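As a minimal illustration of the model class studied here (not of the matrix-continued-fraction method itself), the sketch below integrates a theta neuron driven by Ornstein-Uhlenbeck noise with Euler-Maruyama and counts phase crossings of π to estimate the stationary firing rate; all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch: theta neuron driven by Ornstein-Uhlenbeck (OU) noise, integrated
# with Euler-Maruyama.  All parameter values here are illustrative.
#   dtheta/dt = (1 - cos theta) + (1 + cos theta) * (mu + eta)
#   deta      = -(eta / tau_c) dt + sqrt(2 sigma^2 dt / tau_c) * N(0, 1)
# A spike is registered whenever the phase theta crosses pi.
rng = np.random.default_rng(4)
mu, tau_c, sigma = 0.5, 1.0, 0.2
dt, t_max = 1e-3, 200.0
n = int(t_max / dt)
xi = np.sqrt(2 * sigma**2 * dt / tau_c) * rng.standard_normal(n)

theta, eta, spikes = 0.0, 0.0, 0
for i in range(n):
    theta += dt * ((1 - np.cos(theta)) + (1 + np.cos(theta)) * (mu + eta))
    eta += -dt * eta / tau_c + xi[i]
    if theta > np.pi:
        theta -= 2 * np.pi     # wrap the phase: one full rotation = one spike
        spikes += 1

rate = spikes / t_max          # deterministic limit would be sqrt(mu)/pi
```

For weak noise the simulated rate stays close to the deterministic suprathreshold value sqrt(mu)/π ≈ 0.23 for these parameters.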
Affiliation(s)
- Jannik Franzen
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Lukas Ramlow
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
- Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
4
Sidhu RS, Johnson EC, Jones DL, Ratnam R. A dynamic spike threshold with correlated noise predicts observed patterns of negative interval correlations in neuronal spike trains. Biological Cybernetics 2022; 116:611-633. [PMID: 36244004] [PMCID: PMC9691502] [DOI: 10.1007/s00422-022-00946-5] [Received: 12/10/2021] [Accepted: 09/26/2022] [Indexed: 06/16/2023]
Abstract
Negative correlations in the sequential evolution of interspike intervals (ISIs) are a signature of memory in neuronal spike trains. They provide coding benefits, including firing-rate stabilization, improved detectability of weak sensory signals, and enhanced transmission of information by improving the signal-to-noise ratio. Primary electrosensory afferent spike trains in weakly electric fish fall into two categories based on the pattern of ISI correlations: non-bursting units have negative correlations which remain negative but decay to zero with increasing lags (Type I ISI correlations), and bursting units have oscillatory (alternating-sign) correlations which damp to zero with increasing lags (Type II ISI correlations). Here, we predict and match observed ISI correlations in these afferents using a stochastic dynamic threshold model. We determine the ISI correlation function as a function of an arbitrary discrete noise correlation function [Formula: see text], where k is a multiple of the mean ISI. The function permits forward and inverse calculations of the correlation function. Both types of correlation functions can be generated by adding colored noise to the spike threshold, with Type I correlations generated by slow noise and Type II correlations by fast noise. A first-order autoregressive (AR) process with a single parameter is sufficient to predict and accurately match both types of afferent ISI correlation functions, with the type determined by the sign of the AR parameter. The predicted and experimentally observed correlations are in geometric progression. The theory predicts that the limiting sum of ISI correlations is [Formula: see text], yielding a perfect DC block in the power spectrum of the spike train. Observed ISI correlations from afferents have a limiting sum that is slightly larger, at [Formula: see text] ([Formula: see text]). We conclude that the underlying process for generating ISIs may be a simple combination of low-order AR and moving-average processes, and we discuss the results from the perspective of optimal coding.
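The serial correlation coefficient (SCC) at lag k used throughout this line of work is ρ_k = Cov(I_i, I_{i+k}) / Var(I_i) for the ISI sequence {I_i}. The sketch below is illustrative only (it generates surrogate ISIs directly from an AR(1) process rather than from the paper's dynamic-threshold model); for AR(1), ρ_k = φ^k, so a negative AR parameter φ yields the alternating-sign, geometrically damped pattern described above as Type II.

```python
import numpy as np

# Serial correlation coefficient of an ISI sequence at lag k:
#   rho_k = Cov(I_i, I_{i+k}) / Var(I_i)
def scc(isis, k):
    d = isis - isis.mean()
    return np.mean(d[:-k] * d[k:]) / isis.var()

# Surrogate ISIs from an AR(1) process z_i = phi * z_{i-1} + eps_i
# (an illustrative stand-in, not the paper's dynamic-threshold model).
rng = np.random.default_rng(2)
phi, n = -0.5, 200_000
z = np.zeros(n)
eps = rng.standard_normal(n)
for i in range(1, n):
    z[i] = phi * z[i - 1] + eps[i]
isis = 10.0 + z                     # shift to a positive mean interval

rho1, rho2 = scc(isis, 1), scc(isis, 2)   # geometric: phi, phi**2, ...
```

With φ = -0.5 the estimated SCCs alternate in sign and shrink geometrically, matching ρ_k = φ^k.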
Affiliation(s)
- Robin S Sidhu
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Erik C Johnson
- The Johns Hopkins University Applied Physics Laboratory, Laurel, MD, USA
- Douglas L Jones
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Rama Ratnam
- Division of Biological and Life Sciences, School of Arts and Sciences, Ahmedabad University, Ahmedabad, Gujarat, India.
5
Stimulus presentation can enhance spiking irregularity across subcortical and cortical regions. PLoS Comput Biol 2022; 18:e1010256. [PMID: 35789328] [PMCID: PMC9286274] [DOI: 10.1371/journal.pcbi.1010256] [Received: 08/18/2021] [Revised: 07/15/2022] [Accepted: 05/27/2022] [Indexed: 11/24/2022]
Abstract
Stimulus presentation is believed to quench neural response variability as measured by the Fano factor (FF). However, the relative contributions of within-trial spike irregularity and trial-to-trial rate variability to FF fluctuations have remained elusive. Here, we introduce a principled approach for accurate estimation of spiking irregularity and rate variability in time for doubly stochastic point processes. Consistent with previous evidence, the analysis showed a stimulus-induced reduction in rate variability across multiple cortical and subcortical areas. However, unlike what was previously thought, spiking irregularity was not constant in time but could be enhanced by factors such as bursting, abating the quench in the post-stimulus FF. Simulations confirmed the plausibility of a time-varying spiking irregularity arising from within- and between-pool correlations of excitatory and inhibitory neural inputs. By accurately parsing neural variability, our approach reveals previously unnoticed changes in neural response variability and constrains candidate mechanisms that give rise to the observed rate variability and spiking irregularity within brain regions. Mounting evidence suggests that neural response variability is important for understanding and constraining the underlying neural mechanisms in a given brain area. Here, by analyzing responses across multiple brain areas and by using a principled method for parsing variability components into rate variability and spiking irregularity, we show that, unlike what was previously thought, the event-related quench of variability is not a brain-wide phenomenon, and that point-process variability and nonrenewal bursting can enhance post-stimulus spiking irregularity across certain cortical and subcortical regions. We propose possible presynaptic mechanisms that may underlie the observed heterogeneities in spiking variability across the brain.
6
Ramlow L, Lindner B. Interspike interval correlations in neuron models with adaptation and correlated noise. PLoS Comput Biol 2021; 17:e1009261. [PMID: 34449771] [PMCID: PMC8428727] [DOI: 10.1371/journal.pcbi.1009261] [Received: 05/28/2020] [Revised: 09/09/2021] [Accepted: 07/08/2021] [Indexed: 11/19/2022]
Abstract
The generation of neural action potentials (spikes) is random but may nevertheless result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed, and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass-filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases, including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore, we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel's time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model, which demonstrates its broad applicability.
Affiliation(s)
- Lukas Ramlow
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
7
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2020; 102:022407. [PMID: 32942450] [DOI: 10.1103/physreve.102.022407] [Received: 03/20/2020] [Accepted: 06/29/2020] [Indexed: 11/07/2022]
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
8
Pu S, Thomas PJ. Resolving molecular contributions of ion channel noise to interspike interval variability through stochastic shielding. Biological Cybernetics 2021; 115:267-302. [PMID: 34021802] [DOI: 10.1007/s00422-021-00877-7] [Received: 10/29/2020] [Accepted: 05/04/2021] [Indexed: 06/12/2023]
Abstract
Molecular fluctuations can lead to macroscopically observable effects. The random gating of ion channels in the membrane of a nerve cell provides an important example. The contributions of independent noise sources to the variability of action potential timing have not previously been studied at the level of molecular transitions within a conductance-based model ion-state graph. Here we study a stochastic Langevin model for the Hodgkin-Huxley (HH) system based on a detailed representation of the underlying channel-state Markov process, the "[Formula: see text]D model" introduced in (Pu and Thomas in Neural Computation 32(10):1775-1835, 2020). We show how to resolve the individual contributions that each transition in the ion channel graph makes to the variance of the interspike interval (ISI). We extend the mean return time (MRT) phase reduction developed in (Cao et al. in SIAM J Appl Math 80(1):422-447, 2020) to the second moment of the return time from an MRT isochron to itself. Because fixed-voltage spike detection triggers do not correspond to MRT isochrons, the inter-phase interval (IPI) variance only approximates the ISI variance. We find that the IPI variance and ISI variance agree to within a few percent when both can be computed. Moreover, we prove rigorously, and show numerically, that our expression for the IPI variance is accurate in the small-noise (large system size) regime; our theory is exact in the limit of small noise. By selectively including the noise associated with only those few transitions responsible for most of the ISI variance, our analysis extends the stochastic shielding (SS) paradigm (Schmandt and Galán in Phys Rev Lett 109(11):118101, 2012) from the stationary voltage-clamp case to the current-clamp case. We show numerically that the SS approximation has a high degree of accuracy even for larger, physiologically relevant noise levels. Finally, we demonstrate that the ISI variance is not an unambiguously defined quantity but depends on the choice of voltage level set as the spike-detection threshold. We find a small but significant increase in ISI variance with higher spike-detection voltage, both for simulated stochastic HH data and for voltage traces recorded in in vitro experiments. In contrast, the IPI variance is invariant with respect to the choice of isochron used as a trigger for counting "spikes."
Affiliation(s)
- Shusen Pu
- Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University, Cleveland, OH, USA.
- Department of Biomedical Engineering, Vanderbilt University, Nashville, TN, USA.
- Peter J Thomas
- Department of Mathematics, Applied Mathematics, and Statistics, Case Western Reserve University, Cleveland, OH, USA
- Department of Biology, Case Western Reserve University, Cleveland, OH, USA
- Department of Cognitive Science, Case Western Reserve University, Cleveland, OH, USA
- Department of Data and Computer Science, Case Western Reserve University, Cleveland, OH, USA
- Department of Electrical, Control, and Systems Engineering, Case Western Reserve University, Cleveland, OH, USA
9
Braun HA. Stochasticity Versus Determinacy in Neurobiology: From Ion Channels to the Question of the "Free Will". Front Syst Neurosci 2021; 15:629436. [PMID: 34122020] [PMCID: PMC8190656] [DOI: 10.3389/fnsys.2021.629436] [Received: 11/14/2020] [Accepted: 04/06/2021] [Indexed: 11/13/2022]
Abstract
If one accepts that decisions are made by the brain and that neuronal mechanisms obey deterministic physical laws, it is hard to deny what some brain researchers postulate, such as "We do not do what we want, but we want what we do" and "We should stop talking about freedom. Our actions are determined by physical laws." This point of view has been substantially supported by spectacular neurophysiological experiments demonstrating action-related brain activity (readiness potentials, blood oxygen level-dependent signals) occurring up to several seconds before an individual becomes aware of his/her decision to perform the action. This report aims to counter the deterministic argument for the absence of free will by using experimental data, supplemented by computer simulations, to demonstrate that biological systems, specifically brain functions, are built on randomness as a principle, introduced already at the lowest level of neuronal information processing: the opening and closing of ion channels. Switching between open and closed states follows physiological laws but also makes use of randomness, which is apparently introduced by Brownian motion, principally unavoidable under all life-compatible conditions. Ion-channel stochasticity, manifested as noise, is not smoothed out toward higher functional levels but can even be amplified by appropriate adjustment of the system's non-linearities. Examples shall be given to illustrate how stochasticity can propagate from ion channels to single-neuron action potentials, to neuronal network dynamics, to the interactions between different brain nuclei, up to the control of autonomic functions. It is proposed that this intrinsic stochasticity helps to keep the brain in a flexible state to explore diverse alternatives as a prerequisite of free decision-making.
Affiliation(s)
- Hans Albert Braun
- Neurodynamics Group, Institute of Physiology and Pathophysiology, Philipps University of Marburg, Marburg, Germany
10
Knoll G, Lindner B. Recurrence-mediated suprathreshold stochastic resonance. J Comput Neurosci 2021; 49:407-418. [PMID: 34003421] [PMCID: PMC8556192] [DOI: 10.1007/s10827-021-00788-3] [Received: 02/09/2021] [Revised: 04/21/2021] [Accepted: 04/26/2021] [Indexed: 11/29/2022]
Abstract
It has previously been shown that the encoding of time-dependent signals by feedforward networks (FFNs) of processing units exhibits suprathreshold stochastic resonance (SSR), an optimal signal transmission for a finite level of independent, individual stochasticity in the single units. In this study, a recurrent spiking network is simulated to demonstrate that SSR can also be caused by network noise in place of intrinsic noise. The level of autonomously generated fluctuations in the network can be controlled by the strength of synapses, and hence the coding fraction (our measure of information transmission) exhibits a maximum as a function of the synaptic coupling strength. The presence of a coding peak at an optimal coupling strength is robust over a wide range of individual, network, and signal parameters, although the optimal strength and peak magnitude depend on the parameter being varied. We also perform control experiments with an FFN, illustrating that the optimized coding fraction is due to the change in noise level and not to other effects entailed by changing the coupling strength. These results also indicate that the non-white (temporally correlated) network noise in general provides an extra boost to encoding performance compared to the FFN driven by intrinsic white-noise fluctuations.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
11
Schleimer JH, Hesse J, Contreras SA, Schreiber S. Firing statistics in the bistable regime of neurons with homoclinic spike generation. Phys Rev E 2021; 103:012407. [PMID: 33601551] [DOI: 10.1103/physreve.103.012407] [Received: 04/30/2019] [Accepted: 11/20/2020] [Indexed: 11/07/2022]
Abstract
Neuronal voltage dynamics of regularly firing neurons typically has one stable attractor: either a fixed point (as in the subthreshold regime) or a limit cycle that defines the tonic firing of action potentials (in the suprathreshold regime). In two of the three spike-onset bifurcation sequences that are known to give rise to all-or-none-type action potentials, however, the resting-state fixed point and limit-cycle spiking can coexist in an intermediate regime, resulting in bistable dynamics. Here, noise can induce switches between the attractors, i.e., between rest and spiking, and thus increase the variability of the spike train compared to neurons with only one stable attractor. Qualitative features of the resulting spike statistics depend on the spike-onset bifurcation. This paper focuses on the creation of the spiking limit cycle via the saddle-homoclinic orbit (HOM) bifurcation and derives interspike interval (ISI) densities for a conductance-based neuron model in the bistable regime. The ISI densities of bistable homoclinic neurons are found to be unimodal yet distinct from the inverse Gaussian distribution associated with the saddle-node-on-invariant-cycle bifurcation. It is demonstrated that for the HOM bifurcation the transition between rest and spiking is mainly determined along the downstroke of the action potential, a dynamical feature that is not captured by the commonly used reset neuron models. The deduced spike statistics can help to identify HOM dynamics in experimental data.
Affiliation(s)
- Jan-Hendrik Schleimer
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Philippstrasse 13, Haus 4, 10115 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Janina Hesse
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Philippstrasse 13, Haus 4, 10115 Berlin, Germany
- MSH Medical School Hamburg, Am Kaiserkai 1, 20457 Hamburg, Germany
- Susana Andrea Contreras
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Philippstrasse 13, Haus 4, 10115 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
- Susanne Schreiber
- Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Philippstrasse 13, Haus 4, 10115 Berlin, Germany
- Bernstein Center for Computational Neuroscience, 10115 Berlin, Germany
12
Rajdl K, Lansky P, Kostal L. Fano Factor: A Potentially Useful Information. Front Comput Neurosci 2020; 14:569049. [PMID: 33328945] [PMCID: PMC7718036] [DOI: 10.3389/fncom.2020.569049] [Received: 06/02/2020] [Accepted: 10/07/2020] [Indexed: 12/03/2022]
Abstract
The Fano factor, defined as the variance-to-mean ratio of spike counts in a time window, is often used to measure the variability of neuronal spike trains. However, despite its transparent definition, careless use of the Fano factor can easily lead to distorted or even wrong results. One of the problems is the unclear dependence of the Fano factor on the spiking rate, which is often neglected or handled insufficiently. In this paper we aim to explore this problem in more detail and to study the possible solution, which is to evaluate the Fano factor in the operational time. We use equilibrium renewal and Markov renewal processes as spike train models to describe the method in detail, and we provide an illustration on experimental data.
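For reference, the quantity under discussion is simply the variance-to-mean ratio of spike counts in windows of length T. A minimal estimator is sketched below (plain NumPy; a homogeneous Poisson train, for which FF = 1 at every T, serves as the sanity check, and all parameter values are illustrative):

```python
import numpy as np

# Fano factor FF(T) = Var[N(T)] / E[N(T)] from spike counts in windows of
# length T.  For a homogeneous Poisson process, FF(T) = 1 for every T.
def fano_factor(spike_times, T, t_max):
    edges = np.arange(0.0, t_max + T, T)          # window boundaries
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.var() / counts.mean()

rng = np.random.default_rng(3)
rate, t_max = 5.0, 10_000.0
n_spikes = rng.poisson(rate * t_max)              # total count over [0, t_max]
spikes = np.sort(rng.uniform(0.0, t_max, n_spikes))

ff = fano_factor(spikes, T=1.0, t_max=t_max)      # expect a value near 1
```

For rate-modulated (doubly stochastic) trains, the same estimator applied in operational time, i.e., after rescaling time by the integrated rate, is the correction the abstract advocates.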
Affiliation(s)
- Kamil Rajdl
- Laboratory of Computational Neuroscience, Institute of Physiology, Academy of Sciences of the Czech Republic, Prague, Czechia
- Petr Lansky
- Laboratory of Computational Neuroscience, Institute of Physiology, Academy of Sciences of the Czech Republic, Prague, Czechia
- Lubomir Kostal
- Laboratory of Computational Neuroscience, Institute of Physiology, Academy of Sciences of the Czech Republic, Prague, Czechia
13
Bernardi D, Lindner B. Receiver operating characteristic curves for a simple stochastic process that carries a static signal. Phys Rev E 2020; 101:062132. [PMID: 32688497] [DOI: 10.1103/physreve.101.062132] [Received: 04/17/2020] [Accepted: 05/14/2020] [Indexed: 11/07/2022]
Abstract
The detection of a weak signal in the presence of noise is an important problem that is often studied in terms of the receiver operating characteristic (ROC) curve, in which the probability of correct detection is plotted against the probability of a false positive. This kind of analysis is typically applied to the situation in which signal and noise are stochastic variables; often, however, the detection problem also emerges in a context in which both signal and noise are stochastic processes and the (correct or false) detection is said to take place when the process crosses a threshold in a given time window. Here we consider the problem for the combination of a static signal that has to be detected against a dynamic noise process, the well-known Ornstein-Uhlenbeck process. We give exact (but difficult-to-evaluate) quadrature expressions for the rates of false positives and correct detections, systematically investigate a simple sampling approximation suggested earlier, compare it to an approximation by Stratonovich for the limit of a high threshold, and briefly explore the case of a multiplicative signal; all theoretical results are compared to extensive numerical simulations of the corresponding Langevin equation. Our results demonstrate that the sampling approximation provides a reasonable description of the ROC curve for this system, and they clarify limit cases of the ROC curve.
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
14
Abstract
This study presents a computational model to reproduce the biological dynamics of "listening to music." A biologically plausible model of periodicity pitch detection is proposed and simulated. Periodicity pitch is computed across a range of the auditory spectrum. Periodicity pitch is detected from subsets of activated auditory nerve fibers (ANFs). These activate connected model octopus cells, which trigger model neurons detecting onsets and offsets; thence model interval-tuned neurons are innervated at the right interval times; and finally, a set of common interval-detecting neurons indicate pitch. Octopus cells rhythmically spike with the pitch periodicity of the sound. Batteries of interval-tuned neurons stopwatch-like measure the inter-spike intervals of the octopus cells by coding interval durations as first spike latencies (FSLs). The FSL-triggered spikes synchronously coincide through a monolayer spiking neural network at the corresponding receiver pitch neurons.
Affiliation(s)
- Frank Klefenz
- Fraunhofer Institute for Digital Media Technology IDMT, Ilmenau, Germany
- Tamas Harczos
- Fraunhofer Institute for Digital Media Technology IDMT, Ilmenau, Germany
- Auditory Neuroscience and Optogenetics Laboratory, German Primate Center, Göttingen, Germany
- audifon GmbH & Co. KG, Kölleda, Germany
15
Bostner Ž, Knoll G, Lindner B. Information filtering by coincidence detection of synchronous population output: analytical approaches to the coherence function of a two-stage neural system. BIOLOGICAL CYBERNETICS 2020; 114:403-418. [PMID: 32583370 PMCID: PMC7326833 DOI: 10.1007/s00422-020-00838-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/25/2020] [Accepted: 05/18/2020] [Indexed: 06/11/2023]
Abstract
Information about time-dependent sensory stimuli is encoded in the activity of neural populations; distinct aspects of the stimulus are read out by different types of neurons: while overall information is perceived by integrator cells, so-called coincidence detector cells are driven mainly by the synchronous activity in the population that encodes predominantly high-frequency content of the input signal (high-pass information filtering). Previously, an analytically accessible statistic called the partial synchronous output was introduced as a proxy for the coincidence detector cell's output in order to approximate its information transmission. In the first part of the current paper, we compare the information filtering properties (specifically, the coherence function) of this proxy to those of a simple coincidence detector neuron. We show that the latter's coherence function can indeed be well-approximated by the partial synchronous output with a time scale and threshold criterion that are related approximately linearly to the membrane time constant and firing threshold of the coincidence detector cell. In the second part of the paper, we propose an alternative theory for the spectral measures (including the coherence) of the coincidence detector cell that combines linear-response theory for shot-noise driven integrate-and-fire neurons with a novel perturbation ansatz for the spectra of spike-trains driven by colored noise. We demonstrate how the variability of the synaptic weights for connections from the population to the coincidence detector can shape the information transmission of the entire two-stage system.
Affiliation(s)
- Žiga Bostner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Gregory Knoll
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
16
Betkiewicz R, Lindner B, Nawrot MP. Circuit and Cellular Mechanisms Facilitate the Transformation from Dense to Sparse Coding in the Insect Olfactory System. eNeuro 2020; 7:ENEURO.0305-18.2020. [PMID: 32132095 PMCID: PMC7294456 DOI: 10.1523/eneuro.0305-18.2020] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/04/2018] [Revised: 10/31/2019] [Accepted: 02/19/2020] [Indexed: 11/21/2022] Open
Abstract
Transformations between sensory representations are shaped by neural mechanisms at the cellular and the circuit level. In the insect olfactory system, the encoding of odor information undergoes a transition from a dense spatiotemporal population code in the antennal lobe to a sparse code in the mushroom body. However, the exact mechanisms shaping odor representations and their role in sensory processing are incompletely identified. Here, we investigate the transformation from dense to sparse odor representations in a spiking model of the insect olfactory system, focusing on two ubiquitous neural mechanisms: spike frequency adaptation at the cellular level and lateral inhibition at the circuit level. We find that cellular adaptation is essential for sparse representations in time (temporal sparseness), while lateral inhibition regulates sparseness in the neuronal space (population sparseness). The interplay of both mechanisms shapes spatiotemporal odor representations, which are optimized for the discrimination of odors during stimulus onset and offset. Response pattern correlation across different stimuli showed a nonmonotonic dependence on the strength of lateral inhibition with an optimum at intermediate levels, which is explained by two counteracting mechanisms. In addition, we find that odor identity is stored on a prolonged timescale in the adaptation levels but not in the spiking activity of the principal cells of the mushroom body, providing a testable hypothesis for the location of the so-called odor trace.
Affiliation(s)
- Rinaldo Betkiewicz
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Computational Systems Neuroscience, Institute of Zoology, University of Cologne, 50674 Cologne, Germany
- Department of Physics, Humboldt University Berlin, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Department of Physics, Humboldt University Berlin, 12489 Berlin, Germany
- Martin P Nawrot
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Computational Systems Neuroscience, Institute of Zoology, University of Cologne, 50674 Cologne, Germany
17
Kähne M, Rüdiger S, Kihara AH, Lindner B. Gap junctions set the speed and nucleation rate of stage I retinal waves. PLoS Comput Biol 2019; 15:e1006355. [PMID: 31034472 PMCID: PMC6508742 DOI: 10.1371/journal.pcbi.1006355] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2018] [Revised: 05/09/2019] [Accepted: 11/27/2018] [Indexed: 11/18/2022] Open
Abstract
Spontaneous waves in the developing retina are essential in the formation of the retinotopic mapping in the visual system. From experiments in rabbits, it is known that the earliest type of retinal waves (stage I) is nucleated spontaneously, propagates at a speed of 451±91 μm/sec and relies on gap junction coupling between ganglion cells. Because gap junctions (electrical synapses) have short integration times, it has been argued that they cannot set the low speed of stage I retinal waves. Here, we present a theoretical study of a two-dimensional neural network of the ganglion cell layer with gap junction coupling and intrinsic noise. We demonstrate that this model can explain observed nucleation rates as well as the comparatively slow propagation speed of the waves. From the interaction between two coupled neurons, we estimate the wave speed in the model network. Furthermore, using simulations of small networks of neurons (N≤260), we estimate the nucleation rate in the form of an Arrhenius escape rate. These results allow for informed simulations of a realistically sized network, yielding values of the gap junction coupling and the intrinsic noise level that are in a physiologically plausible range.
Affiliation(s)
- Malte Kähne
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Sten Rüdiger
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Institut für Physik, Humboldt-Universität zu Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
18
Orio P, Gatica M, Herzog R, Maidana JP, Castro S, Xu K. Chaos versus noise as drivers of multistability in neural networks. CHAOS (WOODBURY, N.Y.) 2018; 28:106321. [PMID: 30384618 DOI: 10.1063/1.5043447] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/03/2023]
Abstract
The multistable behavior of neural networks is actively being studied as a landmark of ongoing cerebral activity, reported in both functional Magnetic Resonance Imaging (fMRI) and electro- or magnetoencephalography recordings. This consists of a continuous jumping between different partially synchronized states in the absence of external stimuli. It is thought to be an important mechanism for dealing with sensory novelty and to allow for efficient coding of information in an ever-changing surrounding environment. Many advances have been made to understand how network topology, connection delays, and noise can contribute to building this dynamic. Little or no attention, however, has been paid to the difference between local chaotic and stochastic influences on the switching between different network states. Using a conductance-based neural model that can have chaotic dynamics, we showed that a network can show multistable dynamics in a certain range of global connectivity strength and under deterministic conditions. In the present work, we characterize the multistable dynamics when the networks are, in addition to chaotic, subject to ion channel stochasticity in the form of multiplicative (channel) or additive (current) noise. We calculate the Functional Connectivity Dynamics matrix by comparing the Functional Connectivity (FC) matrices that describe the pair-wise phase synchronization in a moving window fashion and performing clustering of FCs. Moderate noise can enhance the multistable behavior that is evoked by chaos, resulting in more heterogeneous synchronization patterns, while more intense noise abolishes multistability. In networks composed of nonchaotic nodes, some noise can induce multistability in an otherwise synchronized, nonchaotic network. Finally, we found the same results regardless of the multiplicative or additive nature of noise.
Affiliation(s)
- Patricio Orio
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Pje Harrington 287, 2360103 Valparaíso, Chile
- Marilyn Gatica
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Pje Harrington 287, 2360103 Valparaíso, Chile
- Rubén Herzog
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Pje Harrington 287, 2360103 Valparaíso, Chile
- Jean Paul Maidana
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Pje Harrington 287, 2360103 Valparaíso, Chile
- Samy Castro
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Pje Harrington 287, 2360103 Valparaíso, Chile
- Kesheng Xu
- Centro Interdisciplinario de Neurociencia de Valparaíso, Universidad de Valparaíso, Pje Harrington 287, 2360103 Valparaíso, Chile
19
Beiran M, Kruscha A, Benda J, Lindner B. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations. J Comput Neurosci 2017; 44:189-202. [PMID: 29222729 DOI: 10.1007/s10827-017-0674-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Revised: 11/08/2017] [Accepted: 11/12/2017] [Indexed: 11/29/2022]
Abstract
We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: i) a homogeneous population whose units receive independent noise and ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons unveil that a non-zero amount of both noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can render similarly profitable effects for neuronal populations as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions of the coherence function for the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
Affiliation(s)
- Manuel Beiran
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, Département Études Cognitives, École Normale Supérieure, INSERM, PSL Research University, Paris, France
- Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
- Jan Benda
- Institute for Neurobiology, Eberhard Karls Universität, Tübingen, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
20
Rajdl K, Lansky P, Kostal L. Entropy factor for randomness quantification in neuronal data. Neural Netw 2017; 95:57-65. [DOI: 10.1016/j.neunet.2017.07.016] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/28/2017] [Revised: 07/27/2017] [Accepted: 07/28/2017] [Indexed: 11/28/2022]
21
Braun W, Thul R, Longtin A. Evolution of moments and correlations in nonrenewal escape-time processes. Phys Rev E 2017; 95:052127. [PMID: 28618562 DOI: 10.1103/physreve.95.052127] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/05/2016] [Indexed: 06/07/2023]
Abstract
The theoretical description of nonrenewal stochastic systems is a challenge. Analytical results are often not available or can be obtained only under strong conditions, limiting their applicability. Also, numerical results have mostly been obtained by ad hoc Monte Carlo simulations, which are usually computationally expensive when a high degree of accuracy is needed. To gain quantitative insight into these systems under general conditions, we here introduce a numerical iterated first-passage time approach based on solving the time-dependent Fokker-Planck equation (FPE) to describe the statistics of nonrenewal stochastic systems. We illustrate the approach using spike-triggered neuronal adaptation in the leaky and perfect integrate-and-fire model, respectively. The transition to stationarity of first-passage time moments and their sequential correlations occur on a nontrivial time scale that depends on all system parameters. Surprisingly this is so for both single exponential and scale-free power-law adaptation. The method works beyond the small noise and time-scale separation approximations. It shows excellent agreement with direct Monte Carlo simulations, which allow for the computation of transient and stationary distributions. We compare different methods to compute the evolution of the moments and serial correlation coefficients (SCCs) and discuss the challenge of reliably computing the SCCs, which we find to be very sensitive to numerical inaccuracies for both the leaky and perfect integrate-and-fire models. In conclusion, our methods provide a general picture of nonrenewal dynamics in a wide range of stochastic systems exhibiting short- and long-range correlations.
Affiliation(s)
- Wilhelm Braun
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- University of Ottawa Brain and Mind Research Institute, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
- Rüdiger Thul
- Centre for Mathematical Medicine and Biology, School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD, United Kingdom
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- University of Ottawa Brain and Mind Research Institute, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5, Canada
22
Schwalger T, Deger M, Gerstner W. Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Comput Biol 2017; 13:e1005507. [PMID: 28422957 PMCID: PMC5415267 DOI: 10.1371/journal.pcbi.1005507] [Citation(s) in RCA: 72] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/04/2016] [Revised: 05/03/2017] [Accepted: 04/07/2017] [Indexed: 11/22/2022] Open
Abstract
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations. Understanding the brain requires mathematical models on different spatial scales. On the “microscopic” level of nerve cells, neural spike trains can be well predicted by phenomenological spiking neuron models. On a coarse scale, neural activity can be modeled by phenomenological equations that summarize the total activity of many thousands of neurons. 
Such population models are widely used to model neuroimaging data such as EEG, MEG or fMRI data. However, it is largely unknown how large-scale models are connected to an underlying microscale model. Linking the scales is vital for a correct description of rapid changes and fluctuations of the population activity, and is crucial for multiscale brain models. The challenge is to treat realistic spiking dynamics as well as fluctuations arising from the finite number of neurons. We obtained such a link by deriving stochastic population equations on the mesoscopic scale of 100–1000 neurons from an underlying microscopic model. These equations can be efficiently integrated and reproduce results of a microscopic simulation while achieving a high speed-up factor. We expect that our novel population theory on the mesoscopic scale will be instrumental for understanding experimental data on information processing in the brain, and ultimately link microscopic and macroscopic activity patterns.
23
Norman SE, Butera RJ, Canavier CC. Stochastic slowly adapting ionic currents may provide a decorrelation mechanism for neural oscillators by causing wander in the intrinsic period. J Neurophysiol 2016; 116:1189-98. [PMID: 27281746 DOI: 10.1152/jn.00193.2016] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2016] [Accepted: 06/01/2016] [Indexed: 11/22/2022] Open
Abstract
Oscillatory neurons integrate their synaptic inputs in fundamentally different ways than normally quiescent neurons. We show that the oscillation period of invertebrate endogenous pacemaker neurons wanders, producing random fluctuations in the interspike intervals (ISI) on a time scale of seconds to minutes, which decorrelates pairs of neurons in hybrid circuits constructed using the dynamic clamp. The autocorrelation of the ISI sequence remained high for many ISIs, but the autocorrelation of the ΔISI series had on average a single nonzero value, which was negative at a lag of one interval. We reproduced these results using a simple integrate and fire (IF) model with a stochastic population of channels carrying an adaptation current with a stochastic component that was integrated with a slow time scale, suggesting that a similar population of channels underlies the observed wander in the period. Using autoregressive integrated moving average (ARIMA) models, we found that a single integrator and a single moving average with a negative coefficient could simulate both the experimental data and the IF model. Feeding white noise into an integrator with a slow time constant is sufficient to produce the autocorrelation structure of the ISI series. Moreover, the moving average clearly accounted for the autocorrelation structure of the ΔISI series and is biophysically implemented in the IF model using slow stochastic adaptation. The observed autocorrelation structure may be a neural signature of slow stochastic adaptation, and wander generated in this manner may be a general mechanism for limiting episodes of synchronized activity in the nervous system.
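The reported autocorrelation signature (slowly wandering interspike intervals, with the ΔISI series showing a single negative value at lag one) can be reproduced qualitatively with a minimal sketch: a perfect integrate-and-fire neuron driven by fast white noise plus a slow Ornstein-Uhlenbeck current standing in for the slow stochastic adaptation channels. This is not the authors' model; every parameter value here is invented for illustration.

```python
import numpy as np

def simulate_isis(mu=1.0, d=0.01, sigma_a=0.15, tau_a=20.0, v_th=1.0,
                  dt=0.02, n_spikes=2000, seed=1):
    """Perfect integrate-and-fire neuron, dv/dt = mu - a + white noise,
    where a is a slow Ornstein-Uhlenbeck 'adaptation-like' current
    (hypothetical parameter values, chosen only for illustration)."""
    rng = np.random.default_rng(seed)
    v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
    isis = []
    for _ in range(2_000_000):           # hard cap to guarantee termination
        # slow stochastic current (OU process, time constant tau_a)
        a += -a / tau_a * dt + np.sqrt(2 * sigma_a**2 / tau_a * dt) * rng.standard_normal()
        # membrane integrates drive minus slow current plus fast white noise
        v += (mu - a) * dt + np.sqrt(2 * d * dt) * rng.standard_normal()
        t += dt
        if v >= v_th:                    # spike: record interval and reset
            isis.append(t - t_last)
            t_last, v = t, 0.0
            if len(isis) == n_spikes:
                break
    return np.asarray(isis)

def lag1_corr(x):
    """Serial correlation coefficient at lag one."""
    x = x - x.mean()
    return float((x[:-1] * x[1:]).mean() / (x * x).mean())

isis = simulate_isis()
r_isi = lag1_corr(isis)             # slow wander: positive ISI correlation
r_disi = lag1_corr(np.diff(isis))   # differenced series: negative at lag one
```

The slow current makes successive ISIs positively correlated, while differencing the series leaves a negative lag-one coefficient, mirroring the autocorrelation structure described in the abstract.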
Affiliation(s)
- Sharon E Norman
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia; Bioengineering Graduate Program, Georgia Institute of Technology, Atlanta, Georgia
- Robert J Butera
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia; Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, Atlanta, Georgia
- Carmen C Canavier
- Neuroscience Center of Excellence, Louisiana State University Health Sciences Center, New Orleans, Louisiana; and Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center, New Orleans, Louisiana
24
Blankenburg S, Lindner B. The effect of positive interspike interval correlations on neuronal information transmission. MATHEMATICAL BIOSCIENCES AND ENGINEERING : MBE 2016; 13:461-481. [PMID: 27106183 DOI: 10.3934/mbe.2016001] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
Experimentally it is known that some neurons encode preferentially information about low-frequency (slow) components of a time-dependent stimulus while others prefer intermediate or high-frequency (fast) components. Accordingly, neurons can be categorized as low-pass, band-pass or high-pass information filters. Mechanisms of information filtering at the cellular and the network levels have been suggested. Here we propose yet another mechanism, based on noise shaping due to spontaneous non-renewal spiking statistics. We compare two integrate-and-fire models with threshold noise that differ solely in their interspike interval (ISI) correlations: the renewal model generates independent ISIs, whereas the non-renewal model exhibits positive correlations between adjacent ISIs. For these simplified neuron models we analytically calculate ISI density and power spectrum of the spontaneous spike train as well as approximations for input-output cross-spectrum and spike-train power spectrum in the presence of a broad-band Gaussian stimulus. This yields the spectral coherence, an approximate frequency-resolved measure of information transmission. We demonstrate that for low spiking variability the renewal model acts as a low-pass filter of information (coherence has a global maximum at zero frequency), whereas the non-renewal model displays a pronounced maximum of the coherence at non-vanishing frequency and thus can be regarded as a band-pass filter of information.
Affiliation(s)
- Sven Blankenburg
- Bernstein Center for Computational Neuroscience Berlin, Berlin 10115, Germany.
26
Schwalger T, Lindner B. Analytical approach to an integrate-and-fire model with spike-triggered adaptation. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:062703. [PMID: 26764723 DOI: 10.1103/physreve.92.062703] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/31/2015] [Indexed: 06/05/2023]
Abstract
The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
Affiliation(s)
- Tilo Schwalger
- Brain Mind Institute, École Polytechnique Féderale de Lausanne (EPFL) Station 15, CH-1015 Lausanne, Switzerland
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
27
Huang Y, Rüdiger S, Shuai J. Accurate Langevin approaches to simulate Markovian channel dynamics. Phys Biol 2015; 12:061001. [PMID: 26403205 DOI: 10.1088/1478-3975/12/6/061001] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
The stochasticity of ion-channel dynamics is significant for physiological processes on neuronal cell membranes. Microscopic simulations of ion-channel gating with Markov chains can be considered an accurate standard. However, such Markovian simulations are computationally demanding for membrane areas of physiologically relevant sizes, which makes the noise-approximating, or Langevin equation, methods advantageous in many cases. In this review, we discuss the Langevin-like approaches, including the channel-based and simplified subunit-based stochastic differential equations proposed by Fox and Lu, and the effective Langevin approaches in which colored noise is added to deterministic differential equations. In the framework of Fox and Lu's classical models, several variants of numerical algorithms, which have been recently developed to improve accuracy as well as efficiency, are also discussed. Through the comparison of different simulation algorithms of ion-channel noise with the standard Markovian simulation, we aim to reveal the extent to which the existing Langevin-like methods approximate results obtained with Markovian methods. Open questions for future studies are also discussed.
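A minimal sketch of the channel-based Langevin approach in the spirit of Fox and Lu, here for a single population of two-state channels: the drift is the mean gating kinetics, and the noise variance is the total transition rate divided by the channel number. The rates and channel count are invented for illustration and are not from the review.

```python
import numpy as np

def fox_lu_two_state(alpha=1.0, beta=2.0, n_ch=500, dt=0.01,
                     n_steps=100_000, seed=7):
    """Channel-based Langevin equation for the open fraction x of n_ch
    independent two-state channels with opening rate alpha and closing
    rate beta (hypothetical values, chosen only for illustration)."""
    rng = np.random.default_rng(seed)
    x = alpha / (alpha + beta)          # start at the stationary mean
    xs = np.empty(n_steps)
    for i in range(n_steps):
        drift = alpha * (1.0 - x) - beta * x
        # multiplicative noise: total transition rate over channel number
        diff = np.sqrt((alpha * (1.0 - x) + beta * x) * dt / n_ch)
        x = np.clip(x + drift * dt + diff * rng.standard_normal(), 0.0, 1.0)
        xs[i] = x
    return xs

xs = fox_lu_two_state()
p_open = 1.0 / 3.0                       # stationary mean alpha/(alpha+beta)
binom_var = p_open * (1 - p_open) / 500  # Markovian (binomial) benchmark
```

For these rates the Langevin mean should approximate the stationary open probability alpha/(alpha+beta), and its variance should approximate the binomial variance p(1-p)/N of the exact Markov simulation, which is the sense in which such Langevin-like methods reproduce Markovian results.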
Affiliation(s)
- Yandong Huang
- Department of Physics, Xiamen University, Xiamen 361005, People's Republic of China
28
Eberhard MJB, Schleimer JH, Schreiber S, Ronacher B. A temperature rise reduces trial-to-trial variability of locust auditory neuron responses. J Neurophysiol 2015; 114:1424-37. [PMID: 26041833 DOI: 10.1152/jn.00980.2014] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2014] [Accepted: 06/03/2015] [Indexed: 11/22/2022] Open
Abstract
The neurophysiology of ectothermic animals, such as insects, is affected by environmental temperature, as their body temperature fluctuates with ambient conditions. Changes in temperature alter properties of neurons and, consequently, have an impact on the processing of information. Nevertheless, nervous system function is often maintained over a broad temperature range, exhibiting a surprising robustness to variations in temperature. A special problem arises for acoustically communicating insects, as in these animals mate recognition and mate localization typically rely on the decoding of fast amplitude modulations in calling and courtship songs. In the auditory periphery, however, temporal resolution is constrained by intrinsic neuronal noise. Such noise predominantly arises from the stochasticity of ion channel gating and potentially impairs the processing of sensory signals. On the basis of intracellular recordings of locust auditory neurons, we show that intrinsic neuronal variability on the level of spikes is reduced with increasing temperature. We use a detailed mathematical model including stochastic ion channel gating to shed light on the underlying biophysical mechanisms in auditory receptor neurons: because of a redistribution of channel-induced current noise toward higher frequencies and specifics of the temperature dependence of the membrane impedance, membrane potential noise is indeed reduced at higher temperatures. This finding holds under generic conditions and physiologically plausible assumptions on the temperature dependence of the channels' kinetics and peak conductances. We demonstrate that the identified mechanism also can explain the experimentally observed reduction of spike timing variability at higher temperatures.
Affiliation(s)
- Monika J B Eberhard
- Department of Biology, Behavioural Physiology Group, Humboldt-Universität zu Berlin, Berlin, Germany
- Jan-Hendrik Schleimer
- Department of Biology, Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Berlin, Germany; and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Susanne Schreiber
- Department of Biology, Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Berlin, Germany; and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Bernhard Ronacher
- Department of Biology, Behavioural Physiology Group, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
29
Statistical structure of neural spiking under non-Poissonian or other non-white stimulation. J Comput Neurosci 2015; 39:29-51. [PMID: 25936628] [DOI: 10.1007/s10827-015-0560-x] [Citation(s) in RCA: 43] [Impact Index Per Article: 4.3]
Abstract
Nerve cells in the brain generate sequences of action potentials with complex statistics. Theoretical attempts to understand these statistics were largely limited to the case of temporally uncorrelated input (Poissonian shot noise) from the neurons in the surrounding network. However, the stimulation from thousands of other neurons has various sorts of temporal structure. First, input spike trains are temporally correlated because their firing rates can carry complex signals and because of cell-intrinsic properties like neural refractoriness, bursting, or adaptation. Second, at the connections between neurons, the synapses, usage-dependent changes in the synaptic weight (short-term plasticity) further shape the correlation structure of the effective input to the cell. From the theoretical side, it is poorly understood how such correlated stimuli, so-called colored noise, affect the spike-train statistics. In particular, no standard method exists to solve the associated first-passage-time problem for the interspike-interval statistics with arbitrarily colored noise. Assuming that input fluctuations are weaker than the mean neuronal drive, we derive simple formulas for the essential interspike-interval statistics for a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network. We verify our theory by numerical simulations for three paradigmatic situations that lead to input correlations: (i) rate-coded naturalistic stimuli in presynaptic spike trains; (ii) presynaptic refractoriness or bursting; (iii) synaptic short-term plasticity. In all cases, we find severe effects on the interval statistics. Our results provide a framework for the interpretation of firing statistics measured in vivo in the brain.
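The colored-noise setting described in this abstract is straightforward to explore numerically. The sketch below (my own illustration — all parameter values are invented, and the paper's analytical formulas are not reproduced here) drives a leaky integrate-and-fire neuron with a suprathreshold mean current plus Ornstein-Uhlenbeck noise and measures the interspike-interval (ISI) statistics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler-Maruyama simulation of a leaky integrate-and-fire neuron driven by a
# constant suprathreshold current plus weak Ornstein-Uhlenbeck ("colored")
# noise. Parameters are illustrative, not taken from the cited paper.
dt, T = 1e-4, 50.0           # time step [s], total simulated time [s]
tau_m, mu = 0.01, 1.5        # membrane time constant [s], mean drive (threshold = 1)
tau_c, D = 0.05, 0.02        # noise correlation time [s], noise intensity
v, eta, t = 0.0, 0.0, 0.0
spikes = []
for _ in range(int(T / dt)):
    t += dt
    v += dt * (mu - v + eta) / tau_m
    # OU update: correlation time tau_c, stationary variance D / tau_c
    eta += -dt * eta / tau_c + np.sqrt(2 * D * dt) / tau_c * rng.standard_normal()
    if v >= 1.0:             # threshold crossing -> spike and reset
        spikes.append(t)
        v = 0.0

isi = np.diff(spikes)
cv = isi.std() / isi.mean()
# lag-1 serial correlation coefficient; nonzero values are the fingerprint
# of input correlations that outlast a single interspike interval
rho1 = np.corrcoef(isi[:-1], isi[1:])[0, 1]
print(f"mean ISI = {isi.mean():.4f} s, CV = {cv:.3f}, rho_1 = {rho1:.3f}")
```

Because the noise correlation time (50 ms) exceeds the mean ISI here, neighboring intervals tend to be positively correlated, which is exactly the regime where renewal theory fails and results like those in the cited paper are needed.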
30
Shiau L, Schwalger T, Lindner B. Interspike interval correlation in a stochastic exponential integrate-and-fire model with subthreshold and spike-triggered adaptation. J Comput Neurosci 2015; 38:589-600. [DOI: 10.1007/s10827-015-0558-4] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.2]
31
Müller-Hansen F, Droste F, Lindner B. Statistics of a neuron model driven by asymmetric colored noise. Phys Rev E Stat Nonlin Soft Matter Phys 2015; 91:022718. [PMID: 25768542] [DOI: 10.1103/physreve.91.022718] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4]
Abstract
Irregular firing of neurons can be modeled as a stochastic process. Here we study the perfect integrate-and-fire neuron driven by dichotomous noise, a Markovian process that jumps between two states (i.e., possesses a non-Gaussian statistics) and exhibits nonvanishing temporal correlations (i.e., represents a colored noise). Specifically, we consider asymmetric dichotomous noise with two different transition rates. Using a first-passage-time formulation, we derive exact expressions for the probability density and the serial correlation coefficient of the interspike interval (time interval between two subsequent neural action potentials) and the power spectrum of the spike train. Furthermore, we extend the model by including additional Gaussian white noise, and we give approximations for the interspike interval (ISI) statistics in this case. Numerical simulations are used to validate the exact analytical results for pure dichotomous noise, and to test the approximations of the ISI statistics when Gaussian white noise is included. The results may help to understand how correlations and asymmetry of noise and signals in nerve cells shape neuronal firing statistics.
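The model class in this abstract — a perfect integrate-and-fire (PIF) neuron driven by asymmetric dichotomous (telegraph) noise — can be simulated in a few lines. The sketch below is a numerical companion only (parameters and rates are invented for illustration; the paper's exact first-passage-time expressions are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)

# Perfect integrate-and-fire neuron driven by asymmetric dichotomous noise:
# the input eta(t) jumps between +sigma and -sigma with different switching
# rates, so it is both non-Gaussian and temporally correlated.
dt, T = 1e-3, 1000.0
mu, sigma = 1.0, 0.5          # base drive and noise amplitude (mu > sigma keeps the drive positive)
k_plus, k_minus = 20.0, 5.0   # rates of leaving the + and - states (asymmetry)
state = 1                     # start in the + state
v, t, spikes = 0.0, 0.0, []
for _ in range(int(T / dt)):
    t += dt
    v += dt * (mu + sigma * state)      # PIF: pure integration, no leak
    rate = k_plus if state == 1 else k_minus
    if rng.random() < rate * dt:        # Markov switching of the telegraph state
        state = -state
    if v >= 1.0:                        # threshold crossing -> spike and reset
        spikes.append(t)
        v = 0.0

isi = np.diff(spikes)
rho1 = np.corrcoef(isi[:-1], isi[1:])[0, 1]
print(f"{len(isi)} intervals, mean = {isi.mean():.3f} s, "
      f"CV = {isi.std()/isi.mean():.3f}, rho_1 = {rho1:+.3f}")
```

For the PIF the mean ISI is simply the threshold divided by the time-averaged drive (here 1 / 0.7 ≈ 1.43 s, since the stationary probability of the + state is k_minus/(k_plus + k_minus) = 0.2), which gives a quick sanity check on the simulation.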
Affiliation(s)
- Finn Müller-Hansen
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Freie Universität Berlin, Arnimallee 14, 14195 Berlin, Germany
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstraße 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstraße 15, 12489 Berlin, Germany
32
Stochastic representations of ion channel kinetics and exact stochastic simulation of neuronal dynamics. J Comput Neurosci 2014; 38:67-82. [PMID: 25408289] [DOI: 10.1007/s10827-014-0528-2] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.2]
Abstract
In this paper we provide two representations for stochastic ion channel kinetics, and compare the performance of exact simulation with a commonly used numerical approximation strategy. The first representation we present is a random time change representation, popularized by Thomas Kurtz, with the second being analogous to a "Gillespie" representation. Exact stochastic algorithms are provided for the different representations, which are preferable to either (a) fixed time step or (b) piecewise constant propensity algorithms, which still appear in the literature. As examples, we provide versions of the exact algorithms for the Morris-Lecar conductance based model, and detail the error induced, both in a weak and a strong sense, by the use of approximate algorithms on this model. We include ready-to-use implementations of the random time change algorithm in both XPP and Matlab. Finally, through the consideration of parametric sensitivity analysis, we show how the representations presented here are useful in the development of further computational methods. The general representations and simulation strategies provided here are known in other parts of the sciences, but less so in the present setting.
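The "Gillespie" representation mentioned in this abstract is easiest to see on a toy system. The sketch below (my own minimal stand-in, not the paper's Morris-Lecar implementation) runs an exact event-driven simulation of a population of two-state ion channels with constant opening/closing rates and compares the time-averaged open fraction with the deterministic steady state:

```python
import numpy as np

rng = np.random.default_rng(3)

# Exact (Gillespie-style) simulation of N two-state channels, closed <-> open,
# with voltage-independent rates alpha (opening) and beta (closing).
alpha, beta = 40.0, 10.0   # transition rates [1/s]
N, T = 200, 20.0           # number of channels, simulated time [s]
n_open, t = 0, 0.0
t_hist, n_hist = [0.0], [0]
while t < T:
    a_open = alpha * (N - n_open)        # propensity: a closed channel opens
    a_close = beta * n_open              # propensity: an open channel closes
    a_total = a_open + a_close
    t += rng.exponential(1.0 / a_total)  # exact exponential waiting time
    if rng.random() < a_open / a_total:  # choose which reaction fired
        n_open += 1
    else:
        n_open -= 1
    t_hist.append(t)
    n_hist.append(n_open)

# time-averaged open fraction vs. the deterministic value alpha / (alpha + beta)
dts = np.diff(t_hist)
open_frac = np.sum(np.array(n_hist[:-1]) * dts) / (N * t_hist[-1])
print(f"simulated open fraction = {open_frac:.3f}, "
      f"deterministic steady state = {alpha / (alpha + beta):.3f}")
```

Unlike fixed-time-step or piecewise-constant-propensity schemes, this procedure introduces no discretization error: every switching event is placed at an exactly sampled time.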
33
Computational themes of peripheral processing in the auditory pathway of insects. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2014; 201:39-50. [PMID: 25358727] [DOI: 10.1007/s00359-014-0956-5] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1]
Abstract
Hearing in insects serves to gain information in the context of mate finding, predator avoidance or host localization. For these goals, the auditory pathways of insects represent the computational substrate for object recognition and localization. Before these higher level computations can be executed in more central parts of the nervous system, the signals need to be preprocessed in the auditory periphery. Here, we review peripheral preprocessing along four computational themes rather than discussing specific physiological mechanisms: (1) control of sensitivity by adaptation, (2) recoding of amplitude modulations of an acoustic signal into a labeled-line code, (3) frequency processing, and (4) conditioning for binaural processing. Along these lines, we review evidence for canonical computations carried out in the peripheral auditory pathway and show that despite the vast diversity of insect hearing, signal processing is governed by common computational motifs and principles.
34
Rowat PF, Greenwood PE. The ISI distribution of the stochastic Hodgkin-Huxley neuron. Front Comput Neurosci 2014; 8:111. [PMID: 25339894] [PMCID: PMC4189387] [DOI: 10.3389/fncom.2014.00111] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.3]
Abstract
The simulation of ion-channel noise has an important role in computational neuroscience. In recent years several approximate methods of carrying out this simulation have been published, based on stochastic differential equations, and all giving slightly different results. The obvious, and essential, question is: which method is the most accurate and which is most computationally efficient? Here we make a contribution to the answer. We compare interspike interval histograms from simulated data using four different approximate stochastic differential equation (SDE) models of the stochastic Hodgkin-Huxley neuron, as well as the exact Markov chain model simulated by the Gillespie algorithm. One of the recent SDE models is the same as the Kurtz approximation first published in 1978. All the models considered give similar ISI histograms over a wide range of deterministic and stochastic input. Three features of these histograms are an initial peak, followed by one or more bumps, and then an exponential tail. We explore how these features depend on deterministic input and on level of channel noise, and explain the results using the stochastic dynamics of the model. We conclude with a rough ranking of the four SDE models with respect to the similarity of their ISI histograms to the histogram of the exact Markov chain model.
Affiliation(s)
- Peter F Rowat
- Institute for Neural Computation, University of California San Diego, La Jolla, CA, USA
35
Dummer B, Wieland S, Lindner B. Self-consistent determination of the spike-train power spectrum in a neural network with sparse connectivity. Front Comput Neurosci 2014; 8:104. [PMID: 25278869] [PMCID: PMC4166962] [DOI: 10.3389/fncom.2014.00104] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.5]
Abstract
A major source of random variability in cortical networks is the quasi-random arrival of presynaptic action potentials from many other cells. In network studies as well as in the study of the response properties of single cells embedded in a network, synaptic background input is often approximated by Poissonian spike trains. However, the output statistics of the cells are in most cases far from Poissonian. This is inconsistent with the assumption of similar spike-train statistics for pre- and postsynaptic cells in a recurrent network. Here we tackle this problem for the popular class of integrate-and-fire neurons and study the self-consistent statistics of input and output spectra of neural spike trains. Instead of actually simulating a large network, we use an iterative scheme in which we simulate a single neuron over several generations. In each generation, the neuron is stimulated with surrogate stochastic input that has statistics similar to the output of the previous generation. For the surrogate input, we employ two distinct approximations: (i) a superposition of renewal spike trains with the same interspike interval density as observed in the previous generation and (ii) a Gaussian current with a power spectrum proportional to that observed in the previous generation. For input parameters that correspond to balanced input in the network, both the renewal and the Gaussian iteration procedures converge quickly and yield comparable results for the self-consistent spike-train power spectrum. We compare our results to large-scale simulations of a random sparsely connected network of leaky integrate-and-fire neurons (Brunel, 2000) and show that in the asynchronous regime, close to a state of balanced synaptic input from the network, our iterative schemes provide an excellent approximation to the autocorrelation of spike trains in the recurrent network.
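The renewal-surrogate iteration (scheme (i) in this abstract) can be sketched in miniature. The code below is a heavily simplified illustration, not the paper's method: it uses delta-pulse synapses, a single excitatory population, invented parameters, and it pins the surrogate input rate to a fixed value so that only the shape of the interval statistics is iterated.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_lif(input_isis, n_inputs=100, w=0.02, mu=0.8, tau=0.01,
                 dt=1e-4, T=10.0):
    """Drive a LIF neuron with n_inputs independent renewal spike trains whose
    ISIs are drawn with replacement from input_isis; return the output ISIs."""
    # next spike time of each presynaptic train (random initial phase)
    next_spk = rng.choice(input_isis, n_inputs) * rng.random(n_inputs)
    v, t, out = 0.0, 0.0, []
    for _ in range(int(T / dt)):
        t += dt
        kicks = next_spk <= t
        v += w * kicks.sum()                            # delta-pulse synapses
        next_spk[kicks] = t + rng.choice(input_isis, kicks.sum())
        v += dt * (mu - v) / tau                        # leaky integration
        if v >= 1.0:                                    # spike and reset
            out.append(t)
            v = 0.0
    return np.diff(out)

isis = np.full(2000, 0.02)   # generation 0: periodic 50 Hz surrogate input
for gen in range(4):
    # rescale to a fixed 50 Hz input rate; only the ISI *shape* is iterated
    isis = simulate_lif(isis / isis.mean() * 0.02)
    cv = isis.std() / isis.mean()
    print(f"generation {gen + 1}: rate = {1/isis.mean():.1f} Hz, CV = {cv:.2f}")
```

Starting from perfectly regular input, the output CV changes over the first generations and then settles, which is the toy analogue of the fast convergence to a self-consistent spike-train statistics reported in the paper.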
Affiliation(s)
- Benjamin Dummer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Stefan Wieland
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
36
Norman SE, Butera RJ. Interspike intervals as a discrete time series with history and randomness. BMC Neurosci 2014. [PMCID: PMC4126387] [DOI: 10.1186/1471-2202-15-s1-p195] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
37
Roemschied FA, Eberhard MJ, Schleimer JH, Ronacher B, Schreiber S. Cell-intrinsic mechanisms of temperature compensation in a grasshopper sensory receptor neuron. eLife 2014; 3:e02078. [PMID: 24843016] [PMCID: PMC4012639] [DOI: 10.7554/elife.02078] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.1]
Abstract
Changes in temperature affect biochemical reaction rates and, consequently, neural processing. The nervous systems of poikilothermic animals must have evolved mechanisms enabling them to retain their functionality under varying temperatures. Auditory receptor neurons of grasshoppers respond to sound in a surprisingly temperature-compensated manner: firing rates depend moderately on temperature, with average Q10 values around 1.5. Analysis of conductance-based neuron models reveals that temperature compensation of spike generation can be achieved solely relying on cell-intrinsic processes and despite a strong dependence of ion conductances on temperature. Remarkably, this type of temperature compensation need not come at an additional metabolic cost of spike generation. Firing rate-based information transfer is likely to increase with temperature, and we derive predictions for an optimal temperature dependence of the tympanal transduction process fostering temperature compensation. The example of auditory receptor neurons demonstrates how neurons may exploit single-cell mechanisms to cope with multiple constraints in parallel.
Affiliation(s)
- Frederic A Roemschied
- Institute of Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Monika JB Eberhard
- Behavioral Physiology Group, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Jan-Hendrik Schleimer
- Institute of Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Bernhard Ronacher
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; Behavioral Physiology Group, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Susanne Schreiber
- Institute of Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany; Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
38
Thounaojam US, Cui J, Norman SE, Butera RJ, Canavier CC. Slow noise in the period of a biological oscillator underlies gradual trends and abrupt transitions in phasic relationships in hybrid neural networks. PLoS Comput Biol 2014; 10:e1003622. [PMID: 24830924] [PMCID: PMC4022488] [DOI: 10.1371/journal.pcbi.1003622] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0]
Abstract
In order to study the ability of coupled neural oscillators to synchronize in the presence of intrinsic as opposed to synaptic noise, we constructed hybrid circuits consisting of one biological and one computational model neuron with reciprocal synaptic inhibition using the dynamic clamp. Uncoupled, both neurons fired periodic trains of action potentials. Most coupled circuits exhibited qualitative changes between one-to-one phase-locking with fairly constant phasic relationships and phase slipping with a constant progression in the phasic relationships across cycles. The phase resetting curve (PRC) and intrinsic periods were measured for both neurons, and used to construct a map of the firing intervals for both the coupled and externally forced (PRC measurement) conditions. For the coupled network, a stable fixed point of the map predicted phase locking, and its absence produced phase slipping. Repetitive application of the map was used to calibrate different noise models to simultaneously fit the noise level in the measurement of the PRC and the dynamics of the hybrid circuit experiments. Only a noise model that added history-dependent variability to the intrinsic period could fit both data sets with the same parameter values, as well as capture bifurcations in the fixed points of the map that cause switching between slipping and locking. We conclude that the biological neurons in our study have slowly fluctuating stochastic dynamics that confer history dependence on the period. Theoretical results to date on the behavior of ensembles of noisy biological oscillators may require re-evaluation to account for transitions induced by slow noise dynamics.
Affiliation(s)
- Umeshkanta S. Thounaojam
- Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center, New Orleans, Louisiana, United States of America
- Jianxia Cui
- BioCircuits Institute, University of California, San Diego, La Jolla, California, United States of America
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia, United States of America
- Sharon E. Norman
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia, United States of America
- Robert J. Butera
- School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia, United States of America
- Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, Georgia, United States of America
- Carmen C. Canavier
- Department of Cell Biology and Anatomy, Louisiana State University Health Sciences Center, New Orleans, Louisiana, United States of America
- Neuroscience Center of Excellence, Louisiana State University Health Sciences Center, New Orleans, Louisiana, United States of America
39
Ocker GK, Doiron B. Kv7 channels regulate pairwise spiking covariability in health and disease. J Neurophysiol 2014; 112:340-52. [PMID: 24790164] [DOI: 10.1152/jn.00084.2014] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8]
Abstract
Low-threshold M currents are mediated by the Kv7 family of potassium channels. Kv7 channels are important regulators of spiking activity, having a direct influence on the firing rate, spike time variability, and filter properties of neurons. How Kv7 channels affect the joint spiking activity of populations of neurons is an important and open area of study. Using a combination of computational simulations and analytic calculations, we show that the activation of Kv7 conductances reduces the covariability between spike trains of pairs of neurons driven by common inputs. This reduction is beyond that explained by the lowering of firing rates and involves an active cancellation of common fluctuations in the membrane potentials of the cell pair. Our theory shows that the excess covariance reduction is due to a Kv7-induced shift from low-pass to band-pass filtering of the single neuron spike train response. Dysfunction of Kv7 conductances is related to a number of neurological diseases characterized by both elevated firing rates and increased network-wide correlations. We show how changes in the activation or strength of Kv7 conductances give rise to excess correlations that cannot be compensated for by synaptic scaling or homeostatic modulation of passive membrane properties. In contrast, modulation of Kv7 activation parameters consistent with pharmacological treatments for certain hyperactivity disorders can restore normal firing rates and spiking correlations. Our results provide key insights into how regulation of a ubiquitous potassium channel class can control the coordination of population spiking activity.
Affiliation(s)
- Gabriel Koch Ocker
- Department of Neuroscience, University of Pittsburgh, Pittsburgh, Pennsylvania; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania
40
Schwalger T, Lindner B. Patterns of interval correlations in neural oscillators with adaptation. Front Comput Neurosci 2013; 7:164. [PMID: 24348372] [PMCID: PMC3843362] [DOI: 10.3389/fncom.2013.00164] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.8]
Abstract
Neural firing is often subject to negative feedback by adaptation currents. These currents can induce strong correlations among the time intervals between spikes. Here we study analytically the interval correlations of a broad class of noisy neural oscillators with spike-triggered adaptation of arbitrary strength and time scale. Our weak-noise theory provides a general relation between the correlations and the phase-response curve (PRC) of the oscillator, proves anti-correlations between neighboring intervals for adapting neurons with type I PRC and identifies a single order parameter that determines the qualitative pattern of correlations. Monotonically decaying or oscillating correlation structures can be related to qualitatively different voltage traces after spiking, which can be explained by the phase plane geometry. At high firing rates, the long-term variability of the spike train associated with the cumulative interval correlations becomes small, independent of model details. Our results are verified by comparison with stochastic simulations of the exponential, leaky, and generalized integrate-and-fire models with adaptation.
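The anti-correlations between neighboring intervals predicted for adapting neurons are easy to reproduce numerically. The sketch below (an illustrative stochastic LIF with spike-triggered adaptation and invented parameters, not the paper's general weak-noise theory) estimates the serial correlation coefficients of the ISI sequence:

```python
import numpy as np

rng = np.random.default_rng(5)

# Leaky integrate-and-fire neuron with a spike-triggered adaptation variable a
# and white noise. Negative lag-1 ISI correlations are the expected signature
# of the adaptation feedback. All parameter values are illustrative.
dt, T = 1e-4, 100.0
tau_m, tau_a = 0.01, 0.1        # membrane and adaptation time constants [s]
mu, D, delta_a = 2.0, 0.1, 0.3  # drive, noise intensity, adaptation kick
v, a, t, spikes = 0.0, 0.0, 0.0, []
for _ in range(int(T / dt)):
    t += dt
    v += dt * (mu - v - a) / tau_m + np.sqrt(2 * D * dt / tau_m) * rng.standard_normal()
    a += -dt * a / tau_a            # adaptation decays between spikes
    if v >= 1.0:                    # spike: reset voltage, increment adaptation
        spikes.append(t)
        v = 0.0
        a += delta_a

isi = np.diff(spikes)
for lag in (1, 2, 3):
    rho = np.corrcoef(isi[:-lag], isi[lag:])[0, 1]
    print(f"rho_{lag} = {rho:+.3f}")
```

A long interval lets the adaptation variable decay, which shortens the next interval (and vice versa), producing the negative lag-1 correlation; the correlation pattern over higher lags depends on the adaptation time scale, as analyzed in the paper.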
Affiliation(s)
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
41
Bauermeister C, Schwalger T, Russell DF, Neiman AB, Lindner B. Characteristic effects of stochastic oscillatory forcing on neural firing: analytical theory and comparison to paddlefish electroreceptor data. PLoS Comput Biol 2013; 9:e1003170. [PMID: 23966844] [PMCID: PMC3744407] [DOI: 10.1371/journal.pcbi.1003170] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.3]
Abstract
Stochastic signals with pronounced oscillatory components are frequently encountered in neural systems. Input currents to a neuron in the form of stochastic oscillations could be of exogenous origin, e.g. sensory input or synaptic input from a network rhythm. They shape spike firing statistics in a characteristic way, which we explore theoretically in this report. We consider a perfect integrate-and-fire neuron that is stimulated by a constant base current (to drive regular spontaneous firing), along with Gaussian narrow-band noise (a simple example of stochastic oscillations), and a broadband noise. We derive expressions for the nth-order interval distribution, its variance, and the serial correlation coefficients of the interspike intervals (ISIs) and confirm these analytical results by computer simulations. The theory is then applied to experimental data from electroreceptors of paddlefish, which have two distinct types of internal noisy oscillators, one forcing the other. The theory provides an analytical description of their afferent spiking statistics during spontaneous firing, and replicates a pronounced dependence of ISI serial correlation coefficients on the relative frequency of the driving oscillations, and furthermore allows extraction of certain parameters of the intrinsic oscillators embedded in these electroreceptors.
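The model class treated analytically here — a perfect IF neuron with a constant base current, Gaussian narrow-band noise, and broadband noise — can be simulated directly. In the sketch below (parameters invented; the narrow-band noise is generated as a white-noise-driven damped harmonic oscillator, one standard way to realize such a process), the ISI serial correlation coefficients are estimated over several lags:

```python
import numpy as np

rng = np.random.default_rng(6)

# Perfect integrate-and-fire neuron driven by a constant base current,
# Gaussian narrow-band noise x(t) (stochastic harmonic oscillator), and
# weak broadband (white) noise. Illustrative parameters only.
dt, T = 1e-3, 2000.0
mu = 1.0                        # base current -> ~1 Hz spontaneous firing
f0, gamma, eps = 0.8, 0.5, 0.3  # oscillation frequency [Hz], damping, coupling
D = 0.005                       # broadband noise intensity
omega = 2 * np.pi * f0
x, y = 0.0, 0.0                 # oscillator state (position, velocity)
v, t, spikes = 0.0, 0.0, []
for _ in range(int(T / dt)):
    t += dt
    # damped harmonic oscillator driven by white noise (narrow-band spectrum)
    x += dt * y
    y += dt * (-2 * gamma * y - omega**2 * x) \
         + omega * np.sqrt(2 * gamma * dt) * rng.standard_normal()
    v += dt * (mu + eps * x) + np.sqrt(2 * D * dt) * rng.standard_normal()
    if v >= 1.0:                # spike; subtract threshold (perfect IF)
        spikes.append(t)
        v -= 1.0

isi = np.diff(spikes)
for lag in (1, 2, 3, 4):
    rho = np.corrcoef(isi[:-lag], isi[lag:])[0, 1]
    print(f"rho_{lag} = {rho:+.3f}")
```

Varying the ratio of the driving frequency f0 to the firing rate changes the sign and the oscillatory pattern of the serial correlation coefficients, which is the dependence the paper exploits to extract oscillator parameters from paddlefish electroreceptor data.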
Affiliation(s)
- Tilo Schwalger
- Max-Planck-Institute for the Physics of Complex Systems, Dresden, Germany
- Bernstein Center for Computational Neuroscience and Physics Department of Humboldt University, Berlin, Germany
- David F. Russell
- Department of Biological Sciences and Neuroscience Program, Ohio University, Athens, Ohio, United States of America
- Alexander B. Neiman
- Department of Physics and Astronomy and Neuroscience Program, Ohio University, Athens, Ohio, United States of America
- Benjamin Lindner
- Max-Planck-Institute for the Physics of Complex Systems, Dresden, Germany
- Bernstein Center for Computational Neuroscience and Physics Department of Humboldt University, Berlin, Germany