1
Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biol Cybern 2024; 118:7-19. PMID: 38261004; PMCID: PMC11068698; DOI: 10.1007/s00422-023-00982-9.
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation if the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner (Phys. Rev. Lett. 2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive an exact (although more complicated) fluctuation-response relation (FRR) for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and an outlook on open problems.
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
2
Zhang C, Revah O, Wolf F, Neef A. Dynamic Gain Decomposition Reveals Functional Effects of Dendrites, Ion Channels, and Input Statistics in Population Coding. J Neurosci 2024; 44:e0799232023. PMID: 38286625; PMCID: PMC10977021; DOI: 10.1523/jneurosci.0799-23.2023.
Abstract
Modern, high-density neuronal recordings reveal at ever higher precision how information is represented by neural populations. Still, we lack the tools to understand these processes bottom-up, emerging from the biophysical properties of neurons, synapses, and network structure. The concept of the dynamic gain function, a spectrally resolved approximation of a population's coding capability, has the potential to link cell-level properties to network-level performance. The concept is useful but also complex: the dynamic gain's shape is co-determined by axonal and somato-dendritic parameters as well as by the population's operating regime, and this complexity has so far precluded an understanding of any individual parameter's impact. Here, we decomposed the dynamic gain function into three components corresponding to separate signal transformations. This allowed attribution of network-level encoding features to specific cell-level parameters. Applying the method to data from real neurons and biophysically plausible models, we found: (1) the encoding bandwidth of real neurons, approximately 400 Hz, is constrained by the voltage dependence of axonal currents during early action potential initiation; (2) state-of-the-art models only achieve encoding bandwidths around 100 Hz and are limited mainly by subthreshold processes instead; (3) large dendrites and low-threshold potassium currents modulate the bandwidth by shaping the subthreshold stimulus-to-voltage transformation. Our decomposition provides physiological interpretations of changes in the dynamic gain curve, for instance during spectrinopathies and neurodegeneration. By pinpointing shortcomings of current models, it also guides inference of neuron models best suited for large-scale network simulations.
Affiliation(s)
- Chenfei Zhang
- Institute of Science and Technology for Brain-Inspired Intelligence, Shanghai 200433, People's Republic of China
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
- Göttingen Campus Institute for Dynamics of Biological Networks, 37073 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, 37073 Göttingen, Germany
- Omer Revah
- Koret School of Veterinary Medicine, Hebrew University of Jerusalem, 7610001 Rehovot, Israel
- Fred Wolf
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
- Göttingen Campus Institute for Dynamics of Biological Networks, 37073 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, 37073 Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, 37077 Göttingen, Germany
- Max Planck Institute of Multidisciplinary Sciences, 37077 Göttingen, Germany
- Center for Biostructural Imaging of Neurodegeneration, 37075 Göttingen, Germany
- Andreas Neef
- Max Planck Institute for Dynamics and Self-Organization, 37077 Göttingen, Germany
- Göttingen Campus Institute for Dynamics of Biological Networks, 37073 Göttingen, Germany
- Bernstein Center for Computational Neuroscience, 37073 Göttingen, Germany
- Institute for the Dynamics of Complex Systems, University of Göttingen, 37077 Göttingen, Germany
- Max Planck Institute of Multidisciplinary Sciences, 37077 Göttingen, Germany
- Institute for Auditory Neuroscience and InnerEarLab University Medical Center Göttingen, 37075 Göttingen, Germany
3
Schieferstein N, Schwalger T, Lindner B, Kempter R. Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations. PLoS Comput Biol 2024; 20:e1011886. PMID: 38377147; PMCID: PMC10923461; DOI: 10.1371/journal.pcbi.1011886.
Abstract
Hippocampal ripple oscillations have been implicated in important cognitive functions such as memory consolidation and planning. Multiple computational models have been proposed to explain the emergence of ripple oscillations, relying either on excitation or inhibition as the main pacemaker. Nevertheless, the generating mechanism of ripples remains unclear. An interesting dynamical feature of experimentally measured ripples, which may advance model selection, is intra-ripple frequency accommodation (IFA): a decay of the instantaneous ripple frequency over the course of a ripple event. So far, only a feedback-based inhibition-first model, which relies on delayed inhibitory synaptic coupling, has been shown to reproduce IFA. Here we use an analytical mean-field approach and numerical simulations of a leaky integrate-and-fire spiking network to explain the mechanism of IFA. We develop a drift-based approximation for the oscillation dynamics of the population rate and the mean membrane potential of interneurons under strong excitatory drive and strong inhibitory coupling. For IFA, the speed at which the excitatory drive changes is critical. We demonstrate that IFA arises due to a speed-dependent hysteresis effect in the dynamics of the mean membrane potential, when the interneurons receive transient, sharp wave-associated excitation. We thus predict that the IFA asymmetry vanishes in the limit of slowly changing drive, but is otherwise a robust feature of the feedback-based inhibition-first ripple model.
Affiliation(s)
- Natalie Schieferstein
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Tilo Schwalger
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt-Universität zu Berlin, Berlin, Germany
- Richard Kempter
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Einstein Center for Neurosciences, Berlin, Germany
4
Schlungbaum M, Lindner B. Detecting a periodic signal by a population of spiking neurons in the weakly nonlinear response regime. Eur Phys J E Soft Matter 2023; 46:108. PMID: 37930460; PMCID: PMC10627932; DOI: 10.1140/epje/s10189-023-00371-x.
Abstract
Motivated by experimental observations, we investigate a variant of the cocktail party problem: the detection of a weak periodic stimulus in the presence of fluctuations and another periodic stimulus which is stronger than the periodic signal to be detected. Specifically, we study the response of a population of stochastic leaky integrate-and-fire (LIF) neurons to two periodic signals and focus in particular on the question of whether the presence of one of the stimuli can be detected from the population activity. As a detection criterion, we use a simple threshold-crossing of the population activity over a certain time window. We show by means of the receiver operating characteristic (ROC) that the detectability depends only weakly on the time window of observation but rather strongly on the stimulus amplitude. Counterintuitively, the detection of the weak periodic signal can be facilitated by the presence of a strong periodic input current, depending on the frequencies of the two signals and on the dynamical regime in which the neurons operate. Besides numerical simulations of the model, we present an analytical approximation for the ROC curve that is based on the weakly nonlinear response theory for a stochastic LIF neuron.
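The detection scheme described in this abstract can be sketched numerically: a population of uncoupled noisy LIF neurons receives a sinusoidal current, a stimulus is declared present whenever the population activity exceeds a threshold somewhere in the observation window, and sweeping the threshold traces out the ROC curve. All parameter values below are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def population_activity(n, t_max, dt, eps, f, mu=0.6, d=0.02):
    """Population activity (fraction of neurons firing per bin, divided by dt)
    of n uncoupled LIF neurons obeying v' = -v + mu + eps*sin(2 pi f t) + noise,
    with threshold 1 and reset to 0."""
    steps = int(t_max / dt)
    v = rng.uniform(0.0, 1.0, n)
    a = np.empty(steps)
    for i in range(steps):
        s = eps * np.sin(2.0 * np.pi * f * i * dt)
        v += dt * (-v + mu + s) + np.sqrt(2.0 * d * dt) * rng.standard_normal(n)
        fired = v >= 1.0
        v[fired] = 0.0                                   # reset after a spike
        a[i] = fired.mean() / dt
    return a

# A stimulus is "detected" if the activity crosses a threshold in the window.
trials = 30
null_max = np.array([population_activity(100, 20.0, 0.05, 0.0, 0.1).max()
                     for _ in range(trials)])            # signal absent
sig_max = np.array([population_activity(100, 20.0, 0.05, 0.6, 0.1).max()
                    for _ in range(trials)])             # signal present

# ROC: sweep the detection threshold over the observed range of maxima.
thresholds = np.linspace(0.0, max(null_max.max(), sig_max.max()), 20)
fp = np.array([(null_max > th).mean() for th in thresholds])   # false positives
hit = np.array([(sig_max > th).mean() for th in thresholds])   # correct detections
```

Because the detector compares the windowed maximum of the activity to a threshold, both rates are necessarily non-increasing functions of the threshold, which is a quick sanity check on the simulation.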
Affiliation(s)
- Maria Schlungbaum
- Physics Department, Humboldt University Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
5
Vinci GV, Benzi R, Mattia M. Self-Consistent Stochastic Dynamics for Finite-Size Networks of Spiking Neurons. Phys Rev Lett 2023; 130:097402. PMID: 36930929; DOI: 10.1103/physrevlett.130.097402.
Abstract
Despite the huge number of neurons composing a brain network, ongoing activity of local cell assemblies is intrinsically stochastic. Fluctuations in their instantaneous rate of spike firing ν(t) scale with the size of the assembly and persist in isolated networks, i.e., in the absence of external sources of noise. Although deterministic chaos due to the quenched disorder of the synaptic couplings underlies this seemingly stochastic dynamics, an effective theory for the network dynamics of a finite assembly of spiking neurons is lacking. Here, we fill this gap by extending the so-called population density approach including an activity- and size-dependent stochastic source in the Fokker-Planck equation for the membrane potential density. The finite-size noise embedded in this stochastic partial differential equation is analytically characterized leading to a self-consistent and nonperturbative description of ν(t) valid for a wide class of spiking neuron networks. Power spectra of ν(t) are found to be in excellent agreement with those from detailed simulations both in the linear regime and across a synchronization phase transition, when a size-dependent smearing of the critical dynamics emerges.
Affiliation(s)
- Gianni V Vinci
- Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
- PhD Program in Physics, Dept. of Physics, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Roberto Benzi
- Dept. of Physics and INFN, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Centro Ricerche "E. Fermi," 00184, Roma, Italy
- Maurizio Mattia
- Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
6
Franzen J, Ramlow L, Lindner B. The steady state and response to a periodic stimulation of the firing rate for a theta neuron with correlated noise. J Comput Neurosci 2023; 51:107-128. PMID: 36273087; PMCID: PMC9840600; DOI: 10.1007/s10827-022-00836-6.
Abstract
The stochastic activity of neurons is caused by various sources of correlated fluctuations and can be described in terms of simplified, yet biophysically grounded, integrate-and-fire models. One paradigmatic model is the quadratic integrate-and-fire model and its equivalent phase description by the theta neuron. Here we study the theta neuron model driven by a correlated Ornstein-Uhlenbeck noise and by periodic stimuli. We apply the matrix-continued-fraction method to the associated Fokker-Planck equation to develop an efficient numerical scheme to determine the stationary firing rate as well as the stimulus-induced modulation of the instantaneous firing rate. For the stationary case, we identify the conditions under which the firing rate decreases or increases by the effect of the colored noise and compare our results to existing analytical approximations for limit cases. For an additional periodic signal we demonstrate how the linear and nonlinear response terms can be computed and report resonant behavior for some of them. We extend the method to the case of two periodic signals, generally with incommensurable frequencies, and present a particular case for which a strong mixed response to both signals is observed, i.e. where the response to the sum of signals differs significantly from the sum of responses to the single signals. We provide Python code for our computational method: https://github.com/jannikfranzen/theta_neuron .
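As a simple numerical counterpart to the matrix-continued-fraction scheme (a sketch under assumed parameters, not the authors' method or code), the stationary firing rate of the theta neuron with Ornstein-Uhlenbeck input can be estimated by direct Euler integration; the noiseless suprathreshold rate √I/π provides a convenient consistency check.

```python
import numpy as np

def theta_neuron_rate(i_const, sigma, tau, t_max=200.0, dt=1e-3, seed=0):
    """Firing rate of a theta neuron dtheta/dt = (1 - cos(theta))
    + (1 + cos(theta))*(i_const + eta(t)), with eta an Ornstein-Uhlenbeck
    process (correlation time tau, stationary standard deviation sigma).
    A spike is counted whenever theta passes pi; theta is then wrapped by 2 pi."""
    rng = np.random.default_rng(seed)
    steps = int(t_max / dt)
    xi = sigma * np.sqrt(2.0 * dt / tau) * rng.standard_normal(steps)
    theta, eta, spikes = 0.0, 0.0, 0
    for k in range(steps):
        c = np.cos(theta)
        theta += dt * ((1.0 - c) + (1.0 + c) * (i_const + eta))
        eta += -eta * dt / tau + xi[k]
        if theta >= np.pi:        # spike: phase crosses pi
            theta -= 2.0 * np.pi
            spikes += 1
    return spikes / t_max

r_det = theta_neuron_rate(1.0, 0.0, 1.0)    # noiseless: exact rate is 1/pi
r_ou = theta_neuron_rate(1.0, 0.3, 1.0)     # with colored (OU) noise
```

For constant input I = 1 and no noise the right-hand side is identically 2, so the period is π and the rate 1/π, which the simulation should reproduce up to discretization error.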
Affiliation(s)
- Jannik Franzen
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Lukas Ramlow
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
- Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstr. 15, Berlin, 12489 Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, Berlin, 10115 Germany
7
Lindner B. Fluctuation-Dissipation Relations for Spiking Neurons. Phys Rev Lett 2022; 129:198101. PMID: 36399734; DOI: 10.1103/physrevlett.129.198101.
Abstract
Spontaneous fluctuations and stimulus response are essential features of neural functioning, but how they are connected is poorly understood. I derive fluctuation-dissipation relations (FDR) between the spontaneous spike and voltage correlations and the firing rate susceptibility for (i) the leaky integrate-and-fire (IF) model with white noise and (ii) an IF model with arbitrary voltage dependence, an adaptation current, and correlated noise. The FDRs can be used to derive thus far unknown statistics analytically [model (i)] or the otherwise inaccessible intrinsic noise statistics [model (ii)].
Affiliation(s)
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
8
Layer M, Senk J, Essink S, van Meegen A, Bos H, Helias M. NNMT: Mean-Field Based Analysis Tools for Neuronal Network Models. Front Neuroinform 2022; 16:835657. PMID: 35712677; PMCID: PMC9196133; DOI: 10.3389/fninf.2022.835657.
Abstract
Mean-field theory of neuronal networks has led to numerous advances in our analytical and intuitive understanding of their dynamics during the past decades. In order to make mean-field based analysis tools more accessible, we implemented an extensible, easy-to-use open-source Python toolbox that collects a variety of mean-field methods for the leaky integrate-and-fire neuron model. The Neuronal Network Mean-field Toolbox (NNMT) in its current state allows for estimating properties of large neuronal networks, such as firing rates, power spectra, and dynamical stability in mean-field and linear response approximation, without running simulations. In this article, we describe how the toolbox is implemented, show how it is used to reproduce results of previous studies, and discuss different use-cases, such as parameter space explorations, or mapping different network models. Although the initial version of the toolbox focuses on methods for leaky integrate-and-fire neurons, its structure is designed to be open and extensible. It aims to provide a platform for collecting analytical methods for neuronal network model analysis, such that the neuroscientific community can take maximal advantage of them.
Affiliation(s)
- Moritz Layer
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Johanna Senk
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Simon Essink
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- RWTH Aachen University, Aachen, Germany
- Alexander van Meegen
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Institute of Zoology, Faculty of Mathematics and Natural Sciences, University of Cologne, Cologne, Germany
- Hannah Bos
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA-Institute Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
9
Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. PMID: 35590546; DOI: 10.1103/physreve.105.044411.
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
10
11
Dodda A, Das S. Demonstration of Stochastic Resonance, Population Coding, and Population Voting Using Artificial MoS2 Based Synapses. ACS Nano 2021; 15:16172-16182. PMID: 34648278; DOI: 10.1021/acsnano.1c05042.
Abstract
Fast detection of weak signals at low energy expenditure is a challenging but inescapable task for the evolutionary success of animals that survive in resource-constrained environments. This task is accomplished by the sensory nervous system by exploiting the synergy between three astounding neural phenomena, namely, stochastic resonance (SR), population coding (PC), and population voting (PV). In SR, the constructive role of synaptic noise is exploited for the detection of otherwise invisible signals. In PC, the redundancy in a neural population is exploited to reduce the detection latency. Finally, PV ensures unambiguous signal detection even in the presence of excessive noise. Here we adopt a similar strategy and experimentally demonstrate how a population of stochastic artificial neurons based on monolayer MoS2 field-effect transistors (FETs) can use an optimum amount of white Gaussian noise and population voting to detect invisible signals at a frugal energy expenditure (tens of nanojoules). Our findings can aid remote sensing in the emerging era of the Internet of Things (IoT), which thrives on energy efficiency.
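The SR mechanism exploited here can be illustrated, independently of the device physics, with a bare threshold element: a subthreshold sinusoid produces no output without noise, is best transmitted at an intermediate noise level, and is washed out again when the noise is too strong. Thresholds and noise levels below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.arange(0.0, 500.0, 0.01)
signal = 0.8 * np.sin(2.0 * np.pi * t)   # subthreshold: peak 0.8 < threshold 1.0

def signal_transfer(noise_std):
    """Zero-lag correlation between the sinusoid and the binary output of a
    threshold element that fires whenever signal + noise exceeds 1."""
    out = (signal + noise_std * rng.standard_normal(t.size) > 1.0).astype(float)
    return float(np.mean(out * np.sin(2.0 * np.pi * t)))

c_none = signal_transfer(0.0)   # no noise: the signal never crosses threshold
c_opt = signal_transfer(0.5)    # intermediate noise: crossings track the peaks
c_big = signal_transfer(5.0)    # excessive noise: output almost signal-independent
```

The non-monotonic pattern c_none < c_opt and c_big < c_opt is the SR signature: an intermediate, "optimum" amount of noise maximizes signal transfer.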
Affiliation(s)
- Akhil Dodda
- Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, Pennsylvania 16802, United States
- Saptarshi Das
- Department of Engineering Science and Mechanics, Pennsylvania State University, University Park, Pennsylvania 16802, United States
- Department of Materials Science and Engineering, Pennsylvania State University, University Park, Pennsylvania 16802, United States
- Materials Research Institute, Pennsylvania State University, University Park, Pennsylvania 16802, United States
12
Barta T, Kostal L. Regular spiking in high-conductance states: The essential role of inhibition. Phys Rev E 2021; 103:022408. PMID: 33736083; DOI: 10.1103/physreve.103.022408.
Abstract
Strong inhibitory input to neurons, which occurs in balanced states of neural networks, increases synaptic current fluctuations. This has led to the assumption that inhibition contributes to the high spike-firing irregularity observed in vivo. We used single compartment neuronal models with time-correlated (due to synaptic filtering) and state-dependent (due to reversal potentials) input to demonstrate that inhibitory input acts to decrease membrane potential fluctuations, a result that cannot be achieved with simplified neural input models. To clarify the effects on spike-firing regularity, we used models with different spike-firing adaptation mechanisms, and we observed that the addition of inhibition increased firing regularity in models with dynamic firing thresholds and decreased firing regularity if spike-firing adaptation was implemented through ionic currents or not at all. This fluctuation-stabilization mechanism provides an alternative perspective on the importance of strong inhibitory inputs observed in balanced states of neural networks, and it highlights the key roles of biologically plausible inputs and specific adaptation mechanisms in neuronal modeling.
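A minimal sketch of the shunting effect underlying this result (a toy model with assumed parameters, not the authors' simulation): adding an inhibitory conductance while re-tuning excitation so that the mean voltage stays near -60 mV raises the total conductance and, with the conductance-noise amplitude held fixed, lowers the membrane-potential fluctuations.

```python
import numpy as np

def voltage_std(g_e_mean, g_i_mean, t_max=1000.0, dt=0.01, seed=1):
    """Standard deviation of the subthreshold membrane potential (mV) of a
    passive membrane, C dV/dt = -g_L (V-E_L) - g_e(t) (V-E_e) - g_i(t) (V-E_i),
    with C = g_L = 1 and Ornstein-Uhlenbeck conductances (clipped at zero)."""
    rng = np.random.default_rng(seed)
    e_l, e_e, e_i, tau_s, sig = -70.0, 0.0, -80.0, 0.2, 0.1
    steps = int(t_max / dt)
    amp = sig * np.sqrt(2.0 * dt / tau_s)
    xi_e = amp * rng.standard_normal(steps)
    xi_i = (amp if g_i_mean > 0 else 0.0) * rng.standard_normal(steps)
    v, g_e, g_i = -60.0, g_e_mean, g_i_mean
    trace = np.empty(steps)
    for k in range(steps):
        v += dt * (-(v - e_l) - g_e * (v - e_e) - g_i * (v - e_i))
        g_e = max(g_e + dt * (g_e_mean - g_e) / tau_s + xi_e[k], 0.0)
        g_i = max(g_i + dt * (g_i_mean - g_i) / tau_s + xi_i[k], 0.0)
        trace[k] = v
    return float(trace[steps // 10:].std())   # discard the initial transient

# Both settings give a mean voltage near -60 mV (set by the reversal potentials):
std_low_inh = voltage_std(g_e_mean=1.0 / 6.0, g_i_mean=0.0)  # excitation only
std_high_inh = voltage_std(g_e_mean=0.5, g_i_mean=1.0)       # high-conductance
```

The high-conductance case both shortens the effective membrane time constant and divides the synaptic current fluctuations by a larger total conductance, which is the fluctuation-stabilization the abstract describes.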
Affiliation(s)
- Tomas Barta
- Institute of Physiology of the Czech Academy of Sciences, 14220 Prague, Czech Republic
- Charles University, First Medical Faculty, 12108 Prague, Czech Republic
- Institute of Ecology and Environmental Sciences, INRAE, 78026 Versailles, France
- Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, 14220 Prague, Czech Republic
13
Cofré R, Maldonado C, Cessac B. Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics. Entropy (Basel) 2020; 22:E1330. PMID: 33266513; PMCID: PMC7712217; DOI: 10.3390/e22111330.
Abstract
The Thermodynamic Formalism provides a rigorous mathematical framework for studying quantitative and qualitative aspects of dynamical systems. At its core, there is a variational principle that corresponds, in its simplest form, to the Maximum Entropy principle. It is used as a statistical inference procedure to represent, by specific probability measures (Gibbs measures), the collective behaviour of complex systems. This framework has found applications in different domains of science. In particular, it has been fruitful and influential in neurosciences. In this article, we review how the Thermodynamic Formalism can be exploited in the field of theoretical neuroscience, as a conceptual and operational tool, in order to link the dynamics of interacting neurons and the statistics of action potentials from either experimental data or mathematical models. We comment on perspectives and open problems in theoretical neuroscience that could be addressed within this formalism.
Affiliation(s)
- Rodrigo Cofré
- CIMFAV-Ingemat, Facultad de Ingeniería, Universidad de Valparaíso, Valparaíso 2340000, Chile
- Cesar Maldonado
- IPICYT/División de Matemáticas Aplicadas, San Luis Potosí 78216, Mexico
- Bruno Cessac
- Inria Biovision team and Neuromod Institute, Université Côte d'Azur, 06901 CEDEX, France
14
Bernardi D, Lindner B. Receiver operating characteristic curves for a simple stochastic process that carries a static signal. Phys Rev E 2020; 101:062132. PMID: 32688497; DOI: 10.1103/physreve.101.062132.
Abstract
The detection of a weak signal in the presence of noise is an important problem that is often studied in terms of the receiver operating characteristic (ROC) curve, in which the probability of correct detection is plotted against the probability of a false positive. This kind of analysis is typically applied to the situation in which signal and noise are stochastic variables; the detection problem, however, also often arises in a context in which both signal and noise are stochastic processes and the (correct or false) detection is said to take place when the process crosses a threshold in a given time window. Here we consider the problem of a static signal that has to be detected against a dynamic noise process, the well-known Ornstein-Uhlenbeck process. We give exact (but difficult to evaluate) quadrature expressions for the detection rates of false positives and correct detections, investigate systematically a simple sampling approximation suggested earlier, compare to an approximation by Stratonovich for the limit of a high threshold, and briefly explore the case of a multiplicative signal; all theoretical results are compared to extensive numerical simulations of the corresponding Langevin equation. Our results demonstrate that the sampling approximation provides a reasonable description of the ROC curve for this system, and they clarify limit cases of the ROC curve.
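The setup lends itself to a compact Langevin sketch (illustrative parameters, not those of the paper): an Ornstein-Uhlenbeck process with or without an added static signal is integrated by the Euler-Maruyama method, and a detection is registered whenever the process exceeds the threshold within the observation window.

```python
import numpy as np

rng = np.random.default_rng(2)

def crossed(signal, b=2.0, d=0.5, t_win=5.0, dt=0.01):
    """One Euler-Maruyama realization of dx = (signal - x) dt + sqrt(2 d) dW,
    started from the stationary distribution; returns True if x reaches the
    threshold b anywhere within the observation window t_win."""
    n = int(t_win / dt)
    x = signal + np.sqrt(d) * rng.standard_normal()   # stationary initial value
    dw = np.sqrt(2.0 * d * dt) * rng.standard_normal(n)
    for k in range(n):
        if x >= b:
            return True
        x += dt * (signal - x) + dw[k]
    return bool(x >= b)

trials = 400
fp_rate = np.mean([crossed(0.0) for _ in range(trials)])   # OU noise alone
hit_rate = np.mean([crossed(1.0) for _ in range(trials)])  # static signal added
```

Repeating this for a range of thresholds b yields the empirical ROC curve against which the quadrature and sampling approximations can be checked.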
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, 12489 Berlin, Germany
15
Bostner Ž, Knoll G, Lindner B. Information filtering by coincidence detection of synchronous population output: analytical approaches to the coherence function of a two-stage neural system. Biol Cybern 2020; 114:403-418. PMID: 32583370; PMCID: PMC7326833; DOI: 10.1007/s00422-020-00838-6.
Abstract
Information about time-dependent sensory stimuli is encoded in the activity of neural populations; distinct aspects of the stimulus are read out by different types of neurons: while overall information is perceived by integrator cells, so-called coincidence detector cells are driven mainly by the synchronous activity in the population that encodes predominantly high-frequency content of the input signal (high-pass information filtering). Previously, an analytically accessible statistic called the partial synchronous output was introduced as a proxy for the coincidence detector cell's output in order to approximate its information transmission. In the first part of the current paper, we compare the information filtering properties (specifically, the coherence function) of this proxy to those of a simple coincidence detector neuron. We show that the latter's coherence function can indeed be well-approximated by the partial synchronous output with a time scale and threshold criterion that are related approximately linearly to the membrane time constant and firing threshold of the coincidence detector cell. In the second part of the paper, we propose an alternative theory for the spectral measures (including the coherence) of the coincidence detector cell that combines linear-response theory for shot-noise driven integrate-and-fire neurons with a novel perturbation ansatz for the spectra of spike-trains driven by colored noise. We demonstrate how the variability of the synaptic weights for connections from the population to the coincidence detector can shape the information transmission of the entire two-stage system.
Collapse
Affiliation(s)
- Žiga Bostner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
| | - Gregory Knoll
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
| | - Benjamin Lindner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
| |
Collapse
|
16
|
Fu Y, Kang Y, Chen G. Stochastic Resonance Based Visual Perception Using Spiking Neural Networks. Front Comput Neurosci 2020; 14:24. [PMID: 32499690 PMCID: PMC7242793 DOI: 10.3389/fncom.2020.00024] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2019] [Accepted: 03/17/2020] [Indexed: 01/20/2023] Open
Abstract
Our aim is to propose an efficient algorithm for enhancing the contrast of dark images based on the principle of stochastic resonance in a global feedback spiking network of integrate-and-fire neurons. By linear approximation and direct simulation, we disclose the dependence of the peak signal-to-noise ratio on the spiking threshold and the feedback coupling strength. Based on this theoretical analysis, we then develop a dynamical system algorithm for enhancing dark images. In the new algorithm, an explicit formula is given for how to choose a suitable spiking threshold for the images to be enhanced, and a more effective quantifying index, the variance of the image, is used to replace the commonly used measure. Numerical tests verify the efficiency of the new algorithm. The investigation provides a good example of the application of stochastic resonance, and it might be useful for explaining the biophysical mechanism behind visual perception.
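A minimal sketch of the stochastic-resonance mechanism underlying this approach, reduced to a single integrate-and-fire unit (hypothetical parameters; the paper's algorithm uses a globally coupled feedback network acting on image data): with a subthreshold periodic input the neuron is silent without noise, and noise enables signal-carrying spikes.

```python
import math
import random

def lif_spike_count(noise_std, amp=0.15, mu=0.8, theta=1.0,
                    tau=1.0, dt=0.01, T=200.0, seed=1):
    """Spike count of a leaky integrate-and-fire neuron driven by a
    subthreshold periodic signal plus white noise (Euler-Maruyama)."""
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    for k in range(int(T / dt)):
        s = amp * math.sin(2 * math.pi * 0.05 * k * dt)   # weak signal
        v += dt * (-v + mu + s) / tau                     # leaky integration
        v += noise_std * math.sqrt(dt) * rng.gauss(0, 1)  # additive noise
        if v >= theta:                                    # threshold crossing
            spikes += 1
            v = 0.0                                       # reset
    return spikes

# mu + amp < theta: without noise the neuron never reaches threshold,
# so any firing is noise-assisted transmission of the signal.
print(lif_spike_count(0.0), lif_spike_count(0.5))
```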
Collapse
Affiliation(s)
- Yuxuan Fu
- Department of Applied Mathematics, School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China
| | - Yanmei Kang
- Department of Applied Mathematics, School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China
| | - Guanrong Chen
- Department of Electrical Engineering, City University of Hong Kong, Hong Kong, China
| |
Collapse
|
17
|
Feng T. Signal-to-noise ratio gain via correlated noise in an ensemble of noisy neurons. J Biol Syst 2020. [DOI: 10.1142/s0218339020500059] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
The collective response of an ensemble of leaky integrate-and-fire neurons induced by local correlated noise is investigated theoretically. Based on linear response theory, we derive the analytic expression of the signal-to-noise ratio (SNR). Numerical results show that the amplitude of internal noise can be increased up to an optimal value at which the output SNR reaches a maximum. Interestingly, we find that correlated noise between the nearest neurons can lead to an appreciable SNR gain. We also show that the SNR can reach unity under the condition that the correlated noise between the nearest neurons is negative. This nonlinear amplification of the SNR gain in an ensemble of noisy neurons can be related to the array stochastic resonance (SR) phenomenon. Furthermore, we show that the SNR gain can be optimized by tuning the number of neuron units and the frequency and amplitude of the weak periodic signal.
Collapse
Affiliation(s)
- Tianquan Feng
- College of Teacher Education, Nanjing Normal University, Nanjing 210023, P. R. China
- State Key Laboratory of Bioelectronics, School of Biological Science and Medical Engineering, Southeast University, Nanjing 210096, P. R. China
| |
Collapse
|
18
|
Gowers RP, Timofeeva Y, Richardson MJE. Low-rate firing limit for neurons with axon, soma and dendrites driven by spatially distributed stochastic synapses. PLoS Comput Biol 2020; 16:e1007175. [PMID: 32310936 PMCID: PMC7217482 DOI: 10.1371/journal.pcbi.1007175] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2019] [Revised: 05/12/2020] [Accepted: 01/27/2020] [Indexed: 11/18/2022] Open
Abstract
Analytical forms for neuronal firing rates are important theoretical tools for the analysis of network states. Since the 1960s, the majority of approaches have treated neurons as being electrically compact and therefore isopotential. These approaches have yielded considerable insight into how single-cell properties affect network activity; however, many neuronal classes, such as cortical pyramidal cells, are electrically extended objects. Calculation of the complex flow of electrical activity driven by stochastic spatio-temporal synaptic input streams in these structures has presented a significant analytical challenge. Here we demonstrate that an extension of the level-crossing method of Rice, previously used for compact cells, provides a general framework for approximating the firing rate of neurons with spatial structure. Even for simple models, the analytical approximations derived exhibit a surprising richness, including: independence of the firing rate from the electrotonic length for certain models, but with a form distinct from that of the point-like leaky integrate-and-fire model; a non-monotonic dependence of the firing rate on the number of dendrites receiving synaptic drive; a significant effect of the axonal and somatic load on the firing rate; and the role that the trigger position on the axon for spike initiation has on firing properties. The approach necessitates only calculating the mean and variances of the non-thresholded voltage and its rate of change in neuronal structures subject to spatio-temporal synaptic fluctuations. The combination of simplicity and generality promises a framework that can be built upon to incorporate increasing levels of biophysical detail and extend beyond the low-rate firing limit treated in this paper.
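The level-crossing method of Rice mentioned above rests on the upcrossing-rate formula for a smooth, stationary Gaussian process, nu(theta) = (sigma_vdot / sigma_v) / (2*pi) * exp(-(theta - mu)^2 / (2 * sigma_v^2)). A toy numerical self-check on twice-filtered Gaussian noise (a stand-in for a spatially smoothed voltage; all parameters hypothetical):

```python
import math
import random

# Twice-filtered Gaussian noise: eta is an OU process, v low-pass
# filters eta, so v is differentiable and Rice's formula applies.
rng = random.Random(2)
tau1, tau2, dt, T = 2.0, 1.0, 0.01, 3000.0
eta, v = 0.0, 0.0
vs, vdots = [], []
for k in range(int(T / dt)):
    eta += -eta / tau1 * dt + math.sqrt(dt) * rng.gauss(0, 1)
    vdot = (eta - v) / tau2          # exact derivative of v
    v += vdot * dt
    if k * dt > 20.0:                # discard the initial transient
        vs.append(v)
        vdots.append(vdot)

n = len(vs)
sig_v = math.sqrt(sum(x * x for x in vs) / n)
sig_vd = math.sqrt(sum(x * x for x in vdots) / n)
theta = sig_v                        # threshold one std. dev. above the mean
ups = sum(1 for a, b in zip(vs, vs[1:]) if a < theta <= b)
rate_sim = ups / (n * dt)
rate_rice = (sig_vd / sig_v) / (2 * math.pi) * math.exp(-0.5)
print(rate_sim, rate_rice)           # counted rate vs. Rice prediction
```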
Collapse
Affiliation(s)
- Robert P. Gowers
- Mathematics for Real-World Systems Centre for Doctoral Training, University of Warwick, Coventry, United Kingdom
- Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Philippstrasse 13, Haus 4, Berlin, Germany
| | - Yulia Timofeeva
- Department of Computer Science, University of Warwick, Coventry, United Kingdom
- Department of Clinical and Experimental Epilepsy, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom
| | | |
Collapse
|
19
|
Herfurth T, Tchumatchenko T. Information transmission of mean and variance coding in integrate-and-fire neurons. Phys Rev E 2019; 99:032420. [PMID: 30999481 DOI: 10.1103/physreve.99.032420] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2017] [Indexed: 11/07/2022]
Abstract
Neurons process information by translating continuous signals into patterns of discrete spike times. An open question is how much information these spike times contain about signals which modulate either the mean or the variance of the somatic currents in neurons, as is observed experimentally. Here we calculate the exact information contained in discrete spike times about a continuous signal in both encoding strategies. We show that the information content about mean modulating signals is generally substantially larger than about variance modulating signals for biological parameters. Our analysis further reveals that higher information transmission is associated with a larger proportion of nonlinear signal encoding. Our study measures the complete information content of mean and variance coding and provides a method to determine what fraction of the total information is linearly decodable.
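For context, the "linearly decodable" fraction mentioned at the end of this abstract is conventionally measured against the coherence-based lower bound on the mutual information rate (a standard relation in this literature, stated here for orientation rather than as the paper's exact calculation):

```latex
I_{\mathrm{LB}} \;=\; -\int_{0}^{\infty} \log_{2}\!\bigl(1 - C(f)\bigr)\,\mathrm{d}f,
\qquad
C(f) \;=\; \frac{\lvert S_{xs}(f)\rvert^{2}}{S_{xx}(f)\,S_{ss}(f)},
```

where $S_{xs}$ is the cross-spectrum between the spike train $x$ and the signal $s$, and $S_{xx}$, $S_{ss}$ are the corresponding power spectra. Comparing $I_{\mathrm{LB}}$ with the exact information isolates the nonlinearly encoded remainder.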
Collapse
Affiliation(s)
- Tim Herfurth
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
| | - Tatjana Tchumatchenko
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
| |
Collapse
|
20
|
di Volo M, Romagnoni A, Capone C, Destexhe A. Biologically Realistic Mean-Field Models of Conductance-Based Networks of Spiking Neurons with Adaptation. Neural Comput 2019; 31:653-680. [DOI: 10.1162/neco_a_01173] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Accurate population models are needed to build very large-scale neural models, but their derivation is difficult for realistic networks of neurons, in particular when nonlinear properties are involved, such as conductance-based interactions and spike-frequency adaptation. Here, we consider such models based on networks of adaptive exponential integrate-and-fire excitatory and inhibitory neurons. Using a master equation formalism, we derive a mean-field model of such networks and compare it to the full network dynamics. The mean-field model is capable of correctly predicting the average spontaneous activity levels in asynchronous irregular regimes similar to in vivo activity. It also captures the transient temporal response of the network to complex external inputs. Finally, the mean-field model is also able to quantitatively describe regimes where high- and low-activity states alternate (up-down state dynamics), leading to slow oscillations. We conclude that such mean-field models are biologically realistic in the sense that they can capture both spontaneous and evoked activity, and they naturally appear as candidates to build very large-scale models involving multiple brain areas.
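For orientation, a drastically simplified rate-plus-adaptation caricature of a population mean-field model (a generic threshold-linear sketch with hypothetical parameters, not the paper's master-equation formalism for adaptive exponential integrate-and-fire networks):

```python
def relax(nu0, w0, ext, J=0.5, b=2.0, tau=0.01, tau_w=0.5,
          dt=0.001, T=5.0):
    """Relax a rate + adaptation mean-field model to its fixed point.
    nu: population rate; w: adaptation variable subtracted from input."""
    nu, w = nu0, w0
    for _ in range(int(T / dt)):
        drive = ext + J * nu - w        # external + recurrent - adaptation
        f = max(drive, 0.0)             # threshold-linear transfer function
        nu += dt * (f - nu) / tau       # fast rate dynamics
        w += dt * (b * nu - w) / tau_w  # slow adaptation dynamics
    return nu, w

nu, w = relax(0.0, 0.0, ext=5.0)
# For positive drive the fixed point is nu* = ext / (1 - J + b):
print(nu, 5.0 / (1 - 0.5 + 2.0))
```

Spike-frequency adaptation enters here only as a slow negative feedback on the rate; the paper's formalism additionally captures conductance-based interactions and finite-size fluctuations.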
Collapse
Affiliation(s)
- Matteo di Volo
- Unité de Neuroscience, Information et Complexité, CNRS FRE 3693, 91198 Gif sur Yvette, France
| | - Alberto Romagnoni
- Centre de Recherche sur l'inflammation UMR 1149, Inserm-Université Paris Diderot, 75018 Paris, France, and Data Team, Departement d'informatique de l'Ecole normale supérieure, CNRS, PSL Research University, 75005 Paris, France, and European Institute for Theoretical Neuroscience, 75012 Paris, France
| | - Cristiano Capone
- European Institute for Theoretical Neuroscience, 75012 Paris, France, and INFN Sezione di Roma, Rome 00185, Italy
| | - Alain Destexhe
- Unité de Neuroscience, Information et Complexité, CNRS FRE 3693, 91198 Gif sur Yvette, France, and European Institute for Theoretical Neuroscience, 75012 Paris, France
| |
Collapse
|
21
|
Multiplicative noise is beneficial for the transmission of sensory signals in simple neuron models. Biosystems 2019; 178:25-31. [PMID: 30735693 DOI: 10.1016/j.biosystems.2019.02.002] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2018] [Revised: 01/27/2019] [Accepted: 02/04/2019] [Indexed: 11/23/2022]
Abstract
We study simple integrate-and-fire type models with multiplicative noise and consider the transmission of a weak and slow signal, i.e. a signal that evokes a small modulation of the instantaneous firing rate on time scales that are much larger than the membrane time scale and the mean interspike interval. The specific question of interest is whether and how the state-dependence of the noise can be optimized with respect to information transmission. First, in a simple model in which the noise intensity varies linearly with the state variable, we show analytically that multiplicative fluctuations may benefit the signal transfer and we elucidate the mechanism for this improvement. In a conductance-based integrate-and-fire model with synaptically filtered shot-noise input, we show by means of extended numerical simulations that also in a biophysically more relevant situation, multiplicative noise can enhance the signal-to-noise ratio. Our results shed light on a so far unexplored aspect of stochastic signal transmission in neural systems.
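A minimal sketch of the first model class described above: a leaky integrate-and-fire neuron whose noise intensity grows linearly with the state variable (all parameters hypothetical). This only illustrates the state dependence itself, not the information-transmission benefit analyzed in the paper.

```python
import math
import random

def fire_rate(a, D0=0.05, mu=0.7, theta=1.0, dt=0.01, T=2000.0, seed=3):
    """Firing rate of a leaky IF neuron with state-dependent noise
    intensity D(v) = D0 * (1 + a * max(v, 0))."""
    rng = random.Random(seed)
    v, n = 0.0, 0
    for _ in range(int(T / dt)):
        D = D0 * (1.0 + a * max(v, 0.0))    # multiplicative noise intensity
        v += dt * (mu - v) + math.sqrt(2.0 * D * dt) * rng.gauss(0, 1)
        if v >= theta:                       # fire and reset
            n += 1
            v = 0.0
    return n / T

r_add = fire_rate(0.0)    # a = 0: ordinary additive (state-independent) noise
r_mul = fire_rate(2.0)    # fluctuations strongest near threshold
print(r_add, r_mul)
```

With these values the multiplicative neuron fires faster because its fluctuations are largest where they matter, close to threshold; the paper asks the sharper question of how such state dependence shapes the transfer of a slow signal.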
Collapse
|
22
|
Nomura R, Liang YZ, Morita K, Fujiwara K, Ikeguchi T. Threshold-varying integrate-and-fire model reproduces distributions of spontaneous blink intervals. PLoS One 2018; 13:e0206528. [PMID: 30376565 PMCID: PMC6207319 DOI: 10.1371/journal.pone.0206528] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2018] [Accepted: 10/15/2018] [Indexed: 01/20/2023] Open
Abstract
Spontaneous blinking is one of the most frequent human behaviours. While attentionally guided blinking may benefit human survival, the function of spontaneous frequent blinking in cognitive processes is poorly understood. To model human spontaneous blinking, we proposed a leaky integrate-and-fire model with a variable threshold which is assumed to represent physiological fluctuations during cognitive tasks. The proposed model is capable of reproducing bimodal, normal, and widespread peak-less distributions of inter-blink intervals, as well as the more common positively skewed distributions. For bimodal distributions, the temporal positions of the two peaks depend on the baseline and the amplitude of the fluctuating threshold function. Parameters that reproduce experimentally derived bimodal distributions suggest that relatively slow oscillations (0.11–0.25 Hz) govern blink elicitations. The results also suggest that changes in blink rates would reflect fluctuations of the threshold regulated by human internal states.
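The proposed mechanism, an integrate-and-fire unit whose threshold fluctuates slowly relative to the membrane time, can be sketched as follows (dimensionless, hypothetical parameters): events cluster in the troughs of the oscillating threshold, which is what produces multi-peaked interval distributions.

```python
import math
import random

def blink_intervals(f_osc=0.15, base=1.0, amp=0.25, mu=0.9,
                    D=0.01, dt=0.01, T=2000.0, seed=4):
    """Inter-event intervals of a leaky integrate-and-fire 'blink
    generator' with slowly oscillating threshold
    theta(t) = base + amp * sin(2*pi*f_osc*t)."""
    rng = random.Random(seed)
    v, last, intervals = 0.0, 0.0, []
    for k in range(int(T / dt)):
        t = k * dt
        v += dt * (mu - v) + math.sqrt(2 * D * dt) * rng.gauss(0, 1)
        if v >= base + amp * math.sin(2 * math.pi * f_osc * t):
            intervals.append(t - last)   # 'blink' when v crosses theta(t)
            last = t
            v = 0.0
    return intervals

ivs = blink_intervals()
print(len(ivs), min(ivs), max(ivs))
```

Because the threshold dips below the mean drive only during part of each cycle, intervals concentrate near multiples of the oscillation period, the qualitative ingredient behind the bimodal distributions described above.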
Collapse
Affiliation(s)
- Ryota Nomura
- Graduate School of Engineering, Tokyo University of Science, Tokyo, Japan
- Graduate School of Education, The University of Tokyo, Tokyo, Japan
| | - Ying-Zong Liang
- Graduate School of Engineering, The University of Tokyo, Tokyo, Japan
| | - Kenji Morita
- Graduate School of Education, The University of Tokyo, Tokyo, Japan
| | - Kantaro Fujiwara
- International Research Center for Neurointelligence, The University of Tokyo, Tokyo, Japan
| | - Tohru Ikeguchi
- Graduate School of Engineering, Tokyo University of Science, Tokyo, Japan
- Faculty of Engineering, Tokyo University of Science, Tokyo, Japan
| |
Collapse
|
23
|
Feng T, Chen Q, Yi M, Xiao Z. Improvement of signal-to-noise ratio in parallel neuron arrays with spatially nearest neighbor correlated noise. PLoS One 2018; 13:e0200890. [PMID: 30021023 PMCID: PMC6051645 DOI: 10.1371/journal.pone.0200890] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2017] [Accepted: 07/04/2018] [Indexed: 11/18/2022] Open
Abstract
We theoretically investigate the signal-to-noise ratio (SNR) of a parallel array of leaky integrate-and-fire (LIF) neurons that receives a weak periodic signal in the presence of spatially nearest-neighbor correlated noise. By using linear response theory, we derive the analytic expression of the SNR. The results show that the amplitude of internal noise can be increased up to an optimal value, which corresponds to a maximum SNR. Given the existence of spatially nearest-neighbor correlated noise in the neural ensemble, the SNR gain of the collective ensemble response can exceed unity, especially for a negative correlation. This nonlinear collective phenomenon of SNR gain amplification may be related to array stochastic resonance. In addition, we show that the SNR can be improved by varying the number of neurons and the frequency and amplitude of the weak periodic signal. We expect that this investigation will be useful for both controlling the collective response of neurons and enhancing weak signal transmission.
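One simple way to realize noise with a prescribed nearest-neighbor correlation on a ring of units is a short moving average of independent Gaussians. Note that this construction also produces a small lag-2 correlation (b^2 / (1 + 2 b^2)), so it only approximates strictly nearest-neighbor coupling; parameters are hypothetical.

```python
import random

def ring_noise_corr(b, n_units=400, n_t=500, seed=6):
    """Empirical nearest-neighbor correlation of noise built by the
    moving average xi_i = e_i + b*(e_{i-1} + e_{i+1}) on a ring."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_t):
        e = [rng.gauss(0, 1) for _ in range(n_units)]
        xi = [e[i] + b * (e[i - 1] + e[(i + 1) % n_units])
              for i in range(n_units)]
        for i in range(n_units):
            num += xi[i] * xi[(i + 1) % n_units]   # lag-1 products
            den += xi[i] * xi[i]                    # variance products
    return num / den

b = -0.35                               # negative nearest-neighbor correlation
rho_theory = 2 * b / (1 + 2 * b * b)    # cov = 2b, var = 1 + 2b^2
rho_emp = ring_noise_corr(b)
print(rho_emp, rho_theory)
```

A negative b yields the negatively correlated neighbor noise for which the abstract reports the strongest SNR gain.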
Collapse
Affiliation(s)
- Tianquan Feng
- College of Teacher Education, Nanjing Normal University, Nanjing, China
- State Key Laboratory of Bioelectronics, School of Biological Science and Medical Engineering, Southeast University, Nanjing, China
| | - Qingrong Chen
- School of Psychology, Nanjing Normal University, Nanjing, China
| | - Ming Yi
- College of Sciences, Huazhong Agricultural University, Wuhan, China
| | - Zhongdang Xiao
- State Key Laboratory of Bioelectronics, School of Biological Science and Medical Engineering, Southeast University, Nanjing, China
| |
Collapse
|
24
|
Up-Down-Like Background Spiking Can Enhance Neural Information Transmission. eNeuro 2018; 4:eN-TNC-0282-17. [PMID: 29354678 PMCID: PMC5773284 DOI: 10.1523/eneuro.0282-17.2017] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/10/2017] [Revised: 11/15/2017] [Accepted: 11/20/2017] [Indexed: 11/23/2022] Open
Abstract
How neurons transmit information about sensory or internal signals is strongly influenced by ongoing internal activity. Depending on brain state, this background spiking can occur asynchronously or clustered in up states, periods of collective firing that are interspersed by silent down states. Here, we study which effect such up-down (UD) transitions have on signal transmission. In a simple model, we obtain numerical and analytical results for information theoretic measures. We find that, surprisingly, an UD background can benefit information transmission: when background activity is sparse, it is advantageous to distribute spikes into up states rather than uniformly in time. We reproduce the same effect in a more realistic recurrent network and show that signal transmission is further improved by incorporating that up states propagate across cortex as traveling waves. We propose that traveling UD activity might represent a compromise between reducing metabolic strain and maintaining information transmission capabilities.
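The clustering of background spikes into up states can be mimicked by a Poisson process gated by a two-state telegraph process. At matched mean rate, the gated process has a much larger spike-count variance (Fano factor), which is the raw material for the effect studied above (all parameters hypothetical).

```python
import random

def fano(updown, r_up=10.0, p_up=0.5, dwell=1.0, win=1.0,
         n_win=2000, dt=0.01, seed=8):
    """Fano factor of spike counts in windows of length win.
    updown=True: Poisson spikes at rate r_up only during 'up' states of
    a telegraph process (mean dwell time dwell in each state);
    updown=False: constant rate r_up * p_up (same average rate)."""
    rng = random.Random(seed)
    up = True
    counts = []
    for _ in range(n_win):
        c = 0
        for _ in range(int(win / dt)):
            if updown:
                if rng.random() < dt / dwell:   # stochastic state switch
                    up = not up
                rate = r_up if up else 0.0
            else:
                rate = r_up * p_up
            if rng.random() < rate * dt:        # Bernoulli spike in this step
                c += 1
        counts.append(c)
    m = sum(counts) / n_win
    var = sum((c - m) ** 2 for c in counts) / n_win
    return var / m

print(fano(False), fano(True))   # up-down gating inflates the count variance
```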
Collapse
|
25
|
Beiran M, Kruscha A, Benda J, Lindner B. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations. J Comput Neurosci 2017; 44:189-202. [PMID: 29222729 DOI: 10.1007/s10827-017-0674-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2017] [Revised: 11/08/2017] [Accepted: 11/12/2017] [Indexed: 11/29/2022]
Abstract
We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: i) a homogeneous population whose units receive independent noise and ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons reveal that a non-zero amount of noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can be as beneficial for neuronal populations as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions of the coherence function for the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
Collapse
Affiliation(s)
- Manuel Beiran
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, Département Études Cognitives, École Normale Supérieure, INSERM, PSL Research University, Paris, France
| | - Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
| | - Jan Benda
- Institute for Neurobiology, Eberhard Karls Universität, Tübingen, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
| |
Collapse
|
26
|
Shomali SR, Ahmadabadi MN, Shimazaki H, Rasuli SN. How does transient signaling input affect the spike timing of postsynaptic neuron near the threshold regime: an analytical study. J Comput Neurosci 2017; 44:147-171. [PMID: 29192377 PMCID: PMC5851711 DOI: 10.1007/s10827-017-0664-6] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2016] [Revised: 07/14/2017] [Accepted: 09/11/2017] [Indexed: 11/05/2022]
Abstract
The noisy threshold regime, where even a small set of presynaptic neurons can significantly affect postsynaptic spike-timing, is suggested as a key requisite for computation in neurons with high variability. It also has been proposed that signals under the noisy conditions are successfully transferred by a few strong synapses and/or by an assembly of nearly synchronous synaptic activities. We analytically investigate the impact of a transient signaling input on a leaky integrate-and-fire postsynaptic neuron that receives background noise near the threshold regime. The signaling input models a single strong synapse or a set of synchronous synapses, while the background noise represents many weak synapses. We find an analytic solution that explains how the first-passage time (ISI) density is changed by transient signaling input. The analysis allows us to connect properties of the signaling input like spike timing and amplitude with postsynaptic first-passage time density in a noisy environment. Based on the analytic solution, we calculate the Fisher information with respect to the signaling input's amplitude. For a wide range of amplitudes, we observe a non-monotonic behavior of the Fisher information as a function of background noise. Moreover, the Fisher information depends non-trivially on the signaling input's amplitude: varying the amplitude, we observe a single maximum at high levels of background noise, which splits into two maxima in the low-noise regime. This finding demonstrates the benefit of the analytic solution in investigating signal transfer by neurons.
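A Monte Carlo sketch of the setup: the first-passage (spike) time of a leaky integrate-and-fire neuron with background noise, with and without a transient input modeled as an instantaneous voltage kick (hypothetical parameters; the paper treats this analytically rather than by simulation).

```python
import math
import random

def mean_fpt(kick, t_kick=0.5, mu=0.8, D=0.05, theta=1.0,
             dt=0.002, trials=500, seed=7):
    """Mean first-passage time to threshold from reset, with an
    optional instantaneous kick added to the voltage at t_kick."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        v, t, kicked = 0.0, 0.0, False
        while v < theta and t < 200.0:       # safety cap on trial length
            if not kicked and t >= t_kick:
                v += kick                    # transient signaling input
                kicked = True
            v += dt * (mu - v) + math.sqrt(2 * D * dt) * rng.gauss(0, 1)
            t += dt
        total += t
    return total / trials

print(mean_fpt(0.0))   # background noise only
print(mean_fpt(0.3))   # excitatory kick shortens the mean passage time
```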
Collapse
Affiliation(s)
- Safura Rashid Shomali
- School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5746 (1954851167), Tehran, Iran.
| | - Majid Nili Ahmadabadi
- Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, 14395-515, Iran
| | - Hideaki Shimazaki
- Graduate School of Informatics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto, 606-8501, Japan
- Honda Research Institute Japan, Honcho 8-1, Wako-shi, Saitama, 351-0188, Japan
| | - Seyyed Nader Rasuli
- Department of Physics, University of Guilan, Rasht, 41335-1914, Iran
- School of Physics, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran, Iran
| |
Collapse
|
27
|
How linear response shaped models of neural circuits and the quest for alternatives. Curr Opin Neurobiol 2017; 46:234-240. [DOI: 10.1016/j.conb.2017.09.001] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2017] [Accepted: 09/07/2017] [Indexed: 11/23/2022]
|
28
|
Noisy Juxtacellular Stimulation In Vivo Leads to Reliable Spiking and Reveals High-Frequency Coding in Single Neurons. J Neurosci 2017; 36:11120-11132. [PMID: 27798191 DOI: 10.1523/jneurosci.0787-16.2016] [Citation(s) in RCA: 23] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2016] [Accepted: 09/09/2016] [Indexed: 01/16/2023] Open
Abstract
Single cells in the motor and somatosensory cortex of rats were stimulated in vivo with broadband fluctuating currents applied juxtacellularly. Unlike the DC current steps used previously, fluctuating stimulation currents reliably evoked spike trains with precise timing of individual spikes. Fluctuating currents resulted in strong cellular responses at stimulation frequencies beyond the inverse membrane time constant and the mean firing rate of the neuron. Neuronal firing was associated with high rates of information transmission, even for the high-frequency components of the stimulus. Such response characteristics were also revealed in additional experiments with sinusoidal juxtacellular stimulation. For selected cells, we could reproduce these statistics with compartmental models of varying complexity. We also developed a method to generate Gaussian stimuli that evoke spike trains with prescribed spike times (under the constraint of a certain rate and coefficient of variation) and exemplify its ability to achieve precise and reliable spiking in cortical neurons in vivo. Our results demonstrate a novel method for precise control of spike timing by juxtacellular stimulation, confirm and extend earlier conclusions from ex vivo work about the capacity of cortical neurons to generate precise discharges, and contribute to the understanding of the biophysics of information transfer of single neurons in vivo at high frequencies. SIGNIFICANCE STATEMENT Nanostimulation of single identified neurons in vivo can control spike frequency parametrically and, surprisingly, can even bias the animal's behavioral response. Here, we extend this stimulation protocol to time-dependent broadband noise stimulation in sensory and motor cortices of rat. In response to such stimuli, we found increased temporal spike-time reliability. The information transmission properties reveal, both experimentally and theoretically, that the neurons support high-frequency stimulation beyond the inverse membrane time. Generating a stimulus using the neuron's response properties, we could evoke prescribed spike times with high precision. Our work helps to establish a novel method for precise temporal control of single-cell spiking and provides a simplified biophysical description of single-neuron spiking under time-dependent in vivo-like stimulation.
Collapse
|
29
|
Kühn T, Helias M. Locking of correlated neural activity to ongoing oscillations. PLoS Comput Biol 2017; 13:e1005534. [PMID: 28604771 PMCID: PMC5484611 DOI: 10.1371/journal.pcbi.1005534] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2016] [Revised: 06/26/2017] [Accepted: 04/26/2017] [Indexed: 02/01/2023] Open
Abstract
Population-wide oscillations are ubiquitously observed in mesoscopic signals of cortical activity. In these network states a global oscillatory cycle modulates the propensity of neurons to fire. Synchronous activation of neurons has been hypothesized to be a separate channel of information processing in the brain. A salient question is therefore if and how oscillations interact with spike synchrony and to what extent these channels can be considered separate. Experiments indeed showed that correlated spiking co-modulates with the static firing rate and is also tightly locked to the phase of beta-oscillations. While the dependence of correlations on the mean rate is well understood in feed-forward networks, it remains unclear why and by which mechanisms correlations tightly lock to an oscillatory cycle. We here demonstrate that such correlated activation of pairs of neurons is qualitatively explained by periodically-driven random networks. We identify the mechanisms by which covariances depend on a driving periodic stimulus. Mean-field theory combined with linear response theory yields closed-form expressions for the cyclostationary mean activities and pairwise zero-time-lag covariances of binary recurrent random networks. Two distinct mechanisms cause time-dependent covariances: the modulation of the susceptibility of single neurons (via the external input and network feedback) and the time-varying variances of single unit activities. For some parameters, the effectively inhibitory recurrent feedback leads to resonant covariances even if mean activities show non-resonant behavior. Our analytical results open the question of time-modulated synchronous activity to a quantitative analysis.
Collapse
Affiliation(s)
- Tobias Kühn
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
30
|
Droste F, Lindner B. Exact analytical results for integrate-and-fire neurons driven by excitatory shot noise. J Comput Neurosci 2017; 43:81-91. [PMID: 28585050 DOI: 10.1007/s10827-017-0649-5] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2017] [Revised: 04/14/2017] [Accepted: 05/04/2017] [Indexed: 11/24/2022]
Abstract
A neuron receives input from other neurons via electrical pulses, so-called spikes. The pulse-like nature of the input is frequently neglected in analytical studies; instead, the input is usually approximated to be Gaussian. Recent experimental studies have shown, however, that an assumption underlying this approximation is often not met: Individual presynaptic spikes can have a significant effect on a neuron's dynamics. It is thus desirable to explicitly account for the pulse-like nature of neural input, i.e. consider neurons driven by a shot noise - a long-standing problem that is mathematically challenging. In this work, we exploit the fact that excitatory shot noise with exponentially distributed weights can be obtained as a limit case of dichotomous noise, a Markovian two-state process. This allows us to obtain novel exact expressions for the stationary voltage density and the moments of the interspike-interval density of general integrate-and-fire neurons driven by such an input. For the special case of leaky integrate-and-fire neurons, we also give expressions for the power spectrum and the linear response to a signal. We verify and illustrate our expressions by comparison to simulations of leaky-, quadratic- and exponential integrate-and-fire neurons.
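A simulation sketch of the model class treated exactly in the paper: a leaky integrate-and-fire neuron driven by excitatory Poisson shot noise with exponentially distributed amplitudes (hypothetical parameters; the paper derives closed-form expressions where this sketch only estimates ISI statistics numerically).

```python
import random

def shot_noise_lif(rate_in=3.0, mean_w=0.2, tau=1.0, theta=1.0,
                   dt=0.001, T=200.0, seed=11):
    """Interspike intervals of a leaky IF neuron driven by excitatory
    Poisson shot noise with exponentially distributed weights."""
    rng = random.Random(seed)
    v, last, isis = 0.0, 0.0, []
    for k in range(int(T / dt)):
        v -= v / tau * dt                     # leak between input spikes
        if rng.random() < rate_in * dt:       # Poisson input spike
            v += rng.expovariate(1.0 / mean_w)  # exponential jump amplitude
        if v >= theta:                        # output spike and reset
            t = (k + 1) * dt
            isis.append(t - last)
            last = t
            v = 0.0
    return isis

isis = shot_noise_lif()
mean_isi = sum(isis) / len(isis)
print(len(isis), mean_isi)
```

With the mean input rate chosen subthreshold (rate_in * mean_w * tau < theta), firing is driven by the discrete, occasionally large jumps, exactly the regime where the Gaussian (diffusion) approximation criticized in the abstract becomes questionable.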
Collapse
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Newtonstr. 15, 12489, Berlin, Germany.
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Newtonstr. 15, 12489, Berlin, Germany
| |
Collapse
|
31
|
When do correlations increase with firing rates in recurrent networks? PLoS Comput Biol 2017; 13:e1005506. [PMID: 28448499 PMCID: PMC5426798 DOI: 10.1371/journal.pcbi.1005506] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/19/2016] [Revised: 05/11/2017] [Accepted: 04/07/2017] [Indexed: 02/04/2023] Open
Abstract
A central question in neuroscience is to understand how noisy firing patterns are used to transmit information. Because neural spiking is noisy, spiking patterns are often quantified via pairwise correlations, or the probability that two cells will spike coincidentally, above and beyond their baseline firing rate. One observation frequently made in experiments is that correlations can increase systematically with firing rate. Theoretical studies have determined that stimulus-dependent correlations that increase with firing rate can have beneficial effects on information coding; however, we still have an incomplete understanding of what circuit mechanisms do, or do not, produce this correlation-firing rate relationship. Here, we studied the relationship between pairwise correlations and firing rates in recurrently coupled excitatory-inhibitory spiking networks with conductance-based synapses. We found that with stronger excitatory coupling, a positive relationship emerged between pairwise correlations and firing rates. To explain these findings, we used linear response theory to predict the full correlation matrix and to decompose correlations in terms of graph motifs. We then used this decomposition to explain why covariation of correlations with firing rate—a relationship previously explained in feedforward networks driven by correlated input—emerges in some recurrent networks but not in others. Furthermore, when correlations covary with firing rate, this relationship is reflected in low-rank structure in the correlation matrix.

A central question in neuroscience is to understand how noisy firing patterns are used to transmit information. We quantify spiking patterns by using pairwise correlations, or the probability that two cells will spike coincidentally, above and beyond their baseline firing rate. One observation frequently made in experiments is that correlations can increase systematically with firing rate.
Recent studies of a type of output cell in mouse retina found this relationship; furthermore, they determined that the increase of correlation with firing rate helped the cells encode information, provided the correlations were stimulus-dependent. Several theoretical studies have explored this basic structure, and found that it is generally beneficial to modulate correlations in this way. However—aside from mouse retinal cells referenced here—we do not yet have many examples of real neural circuits that show this correlation-firing rate pattern, so we do not know what common features (or mechanisms) might occur between them. In this study, we address this question via a computational model. We set up a computational model with features representative of a generic cortical network, to see whether correlations would increase with firing rate. To produce different firing patterns, we varied excitatory coupling. We found that with stronger excitatory coupling, there was a positive relationship between pairwise correlations and firing rates. We used a network linear response theory to show why correlations could increase with firing rates in some networks, but not in others; this could be explained by how cells responded to fluctuations in inhibitory conductances.
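The feedforward core of this correlation-firing rate story can be sketched in a few lines: two binary spike trains that share a fraction of a common Poisson-like input develop positive pairwise correlation. This is a toy illustration, not the conductance-based recurrent network of the paper; the shared-input fraction `c` and the rates are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)
c, rate, n_bins = 0.3, 0.1, 100_000   # shared-input fraction, spike prob per bin

# each neuron spikes if the common source or its own private source fires
common = rng.random(n_bins) < rate * c
own1 = rng.random(n_bins) < rate * (1 - c)
own2 = rng.random(n_bins) < rate * (1 - c)
s1 = (common | own1).astype(float)
s2 = (common | own2).astype(float)

# pairwise (per-bin) spike correlation induced by the shared input
rho = np.corrcoef(s1, s2)[0, 1]
```

In a recurrent network the effective shared input is itself generated by the dynamics, which is why, as the paper shows, the relationship appears in some networks but not others.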
Collapse
|
32
|
Droste F, Lindner B. Exact results for power spectrum and susceptibility of a leaky integrate-and-fire neuron with two-state noise. Phys Rev E 2017; 95:012411. [PMID: 28208429 DOI: 10.1103/physreve.95.012411] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/04/2016] [Indexed: 11/07/2022]
Abstract
The response properties of excitable systems driven by colored noise are of great interest, but are usually mathematically only accessible via approximations. For this reason, dichotomous noise, a rare example of a colored noise leading often to analytically tractable problems, has been extensively used in the study of stochastic systems. Here, we calculate exact expressions for the power spectrum and the susceptibility of a leaky integrate-and-fire neuron driven by asymmetric dichotomous noise. While our results are in excellent agreement with simulations, they also highlight a limitation of using dichotomous noise as a simple model for more complex fluctuations: Both power spectrum and susceptibility exhibit an undamped periodic structure, the origin of which we discuss in detail.
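Asymmetric dichotomous (telegraph) noise is simple to generate, which makes results like these easy to probe numerically. Below is a minimal sketch of a leaky integrate-and-fire neuron driven by such a two-state process; all rates, amplitudes, and neuron parameters are illustrative assumptions, not the paper's, and the spectra and susceptibility are what the paper computes exactly.

```python
import numpy as np

def telegraph_noise(k_plus, k_minus, a_plus, a_minus, dt, n_steps, seed=0):
    """Two-state Markov (telegraph) noise: leaves state a_plus with rate
    k_plus and state a_minus with rate k_minus."""
    rng = np.random.default_rng(seed)
    state, out = a_plus, np.empty(n_steps)
    for i in range(n_steps):
        out[i] = state
        rate = k_plus if state == a_plus else k_minus
        if rng.random() < rate * dt:       # switch with probability rate*dt
            state = a_minus if state == a_plus else a_plus
    return out

eta = telegraph_noise(k_plus=1.0, k_minus=2.0, a_plus=1.5, a_minus=-0.5,
                      dt=1e-3, n_steps=500_000)

# drive a leaky integrate-and-fire neuron (tau_m = 1) with the dichotomous input
v, spikes, mu, dt = 0.0, 0, 0.8, 1e-3
for x in eta:
    v += dt * (mu - v + x)
    if v >= 1.0:                           # threshold crossing: spike and reset
        spikes += 1
        v = 0.0
```

The stationary occupancy of the plus state is k_minus/(k_plus + k_minus) = 2/3, so the time-averaged noise should approach (2/3)(1.5) + (1/3)(-0.5) ≈ 0.83; the neuron fires only during plus-state episodes, which is the kind of two-state structure behind the undamped periodicities discussed above.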
Collapse
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany and Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany and Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| |
Collapse
|
33
|
Deniz T, Rotter S. Solving the two-dimensional Fokker-Planck equation for strongly correlated neurons. Phys Rev E 2017; 95:012412. [PMID: 28208505 DOI: 10.1103/physreve.95.012412] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2016] [Indexed: 06/06/2023]
Abstract
Pairs of neurons in brain networks often share much of the input they receive from other neurons. Due to essential nonlinearities of the neuronal dynamics, the consequences for the correlation of the output spike trains are generally not well understood. Here we analyze the case of two leaky integrate-and-fire neurons using an approach which is nonperturbative with respect to the degree of input correlation. Our treatment covers both weakly and strongly correlated dynamics, generalizing previous results based on linear response theory.
Collapse
Affiliation(s)
- Taşkın Deniz
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Hansastraße 9a, 79104 Freiburg, Germany
| | - Stefan Rotter
- Bernstein Center Freiburg & Faculty of Biology, University of Freiburg, Hansastraße 9a, 79104 Freiburg, Germany
| |
Collapse
|
34
|
Bos H, Diesmann M, Helias M. Identifying Anatomical Origins of Coexisting Oscillations in the Cortical Microcircuit. PLoS Comput Biol 2016; 12:e1005132. [PMID: 27736873 PMCID: PMC5063581 DOI: 10.1371/journal.pcbi.1005132] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/03/2015] [Accepted: 09/06/2016] [Indexed: 12/20/2022] Open
Abstract
Oscillations are omnipresent in neural population signals, like multi-unit recordings, EEG/MEG, and the local field potential. They have been linked to the population firing rate of neurons, with individual neurons firing in a close-to-irregular fashion at low rates. Using a combination of mean-field and linear response theory we predict the spectra generated in a layered microcircuit model of V1, composed of leaky integrate-and-fire neurons and based on connectivity compiled from anatomical and electrophysiological studies. The model exhibits low- and high-γ oscillations visible in all populations. Since locally generated frequencies are imposed onto other populations, the origin of the oscillations cannot be deduced from the spectra. We develop a universally applicable systematic approach that identifies the anatomical circuits underlying the generation of oscillations in a given network. Based on a theoretical reduction of the dynamics, we derive a sensitivity measure resulting in a frequency-dependent connectivity map that reveals connections crucial for the peak amplitude and frequency of the observed oscillations and identifies the minimal circuit generating a given frequency. The low-γ peak turns out to be generated in a sub-circuit located in layers 2/3 and 4, while the high-γ peak emerges from the inter-neurons in layer 4. Connections within and onto layer 5 are found to regulate slow rate fluctuations. We further demonstrate how small perturbations of the crucial connections have significant impact on the population spectra, while the impairment of other connections leaves the dynamics on the population level unaltered. The study uncovers connections where mechanisms controlling the spectra of the cortical microcircuit are most effective.

Recordings of brain activity show multiple coexisting oscillations.
The generation of these oscillations has so far only been investigated in generic one- and two-population networks, neglecting their embedment into larger systems. We introduce a method that determines the mechanisms and sub-circuits generating oscillations in structured spiking networks. Analyzing a multi-layered model of the cortical microcircuit, we trace back characteristic oscillations to experimentally observed connectivity patterns. The approach exposes the influence of individual connections on frequency and amplitude of these oscillations and therefore reveals locations, where biological mechanisms controlling oscillations and experimental manipulations have the largest impact. The new analytical tool replaces parameter scans in computationally expensive models, guides circuit design, and can be employed to validate connectivity data.
Collapse
Affiliation(s)
- Hannah Bos
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- * E-mail:
| | - Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
35
|
Kruscha A, Lindner B. Partial synchronous output of a neuronal population under weak common noise: Analytical approaches to the correlation statistics. Phys Rev E 2016; 94:022422. [PMID: 27627347 DOI: 10.1103/physreve.94.022422] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/20/2016] [Indexed: 06/06/2023]
Abstract
We consider a homogeneous population of stochastic neurons that are driven by weak common noise (stimulus). To capture and analyze the joint firing events within the population, we introduce the partial synchronous output of the population. This is a time series defined by the events that at least a fixed fraction γ∈[0,1] of the population fires simultaneously within a small time interval. For this partial synchronous output we develop two analytical approaches to the correlation statistics. In the Gaussian approach we represent the synchronous output as a nonlinear transformation of the summed population activity and approximate the latter by a Gaussian process. In the combinatorial approach the synchronous output is represented by products of box-filtered spike trains of the single neurons. In both approaches we use linear-response theory to derive approximations for statistical measures that hold true for weak common noise. In particular, we calculate the mean value and power spectrum of the synchronous output and the cross-spectrum between synchronous output and common noise. We apply our results to the leaky integrate-and-fire neuron model and compare them to numerical simulations. The combinatorial approach is shown to provide a more accurate description of the statistics for small populations, whereas the Gaussian approximation yields compact formulas that work well for a sufficiently large population size. In particular, in the Gaussian approximation all statistical measures reveal a symmetry in the synchrony threshold γ around the mean value of the population activity. Our results may contribute to a better understanding of the role of coincidence detection in neural signal processing.
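The partial synchronous output itself is straightforward to construct from simulated spike trains. Here is a small sketch using independent Bernoulli spike trains; the population size, per-bin firing probability, and synchrony threshold γ are illustrative assumptions, and the analytical approximations of the paper concern the correlation statistics of exactly this kind of time series.

```python
import numpy as np

def partial_synchronous_output(spike_trains, gamma):
    """Binary time series that is 1 whenever at least a fraction gamma of
    the population fires in the same time bin.
    spike_trains: (n_neurons, n_bins) array of 0/1 entries."""
    n_neurons = spike_trains.shape[0]
    activity = spike_trains.sum(axis=0)          # summed population activity
    return (activity >= gamma * n_neurons).astype(int)

rng = np.random.default_rng(1)
# 50 independent neurons, spike probability 0.05 per bin (no common noise)
trains = (rng.random((50, 10_000)) < 0.05).astype(int)
sync = partial_synchronous_output(trains, gamma=0.1)
```

A weak common stimulus shifts the summed activity and thereby the mean of `sync`; the cross-spectrum between this output and the stimulus is one of the quantities the paper derives.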
Collapse
Affiliation(s)
- Alexandra Kruscha
- Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany and Institute for Physics, Humboldt-Universität zu Berlin, Berlin, 12489, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany and Institute for Physics, Humboldt-Universität zu Berlin, Berlin, 12489, Germany
| |
Collapse
|
36
|
Doiron B, Litwin-Kumar A, Rosenbaum R, Ocker GK, Josić K. The mechanics of state-dependent neural correlations. Nat Neurosci 2016; 19:383-93. [PMID: 26906505 DOI: 10.1038/nn.4242] [Citation(s) in RCA: 173] [Impact Index Per Article: 21.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2015] [Accepted: 01/12/2016] [Indexed: 12/12/2022]
Abstract
Simultaneous recordings from large neural populations are becoming increasingly common. An important feature of population activity is the trial-to-trial correlated fluctuation of spike train outputs from recorded neuron pairs. Similar to the firing rate of single neurons, correlated activity can be modulated by a number of factors, from changes in arousal and attentional state to learning and task engagement. However, the physiological mechanisms that underlie these changes are not fully understood. We review recent theoretical results that identify three separate mechanisms that modulate spike train correlations: changes in input correlations, internal fluctuations and the transfer function of single neurons. We first examine these mechanisms in feedforward pathways and then show how the same approach can explain the modulation of correlations in recurrent networks. Such mechanistic constraints on the modulation of population activity will be important in statistical analyses of high-dimensional neural data.
Collapse
Affiliation(s)
- Brent Doiron
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA
| | - Ashok Litwin-Kumar
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Center for Theoretical Neuroscience, Columbia University, New York, New York, USA
| | - Robert Rosenbaum
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Department of Applied and Computational Mathematics and Statistics, University of Notre Dame, Notre Dame, Indiana, USA; Interdisciplinary Center for Network Science and Applications, University of Notre Dame, Notre Dame, Indiana, USA
| | - Gabriel K Ocker
- Department of Mathematics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA; Center for the Neural Basis of Cognition, Pittsburgh, Pennsylvania, USA; Allen Institute for Brain Science, Seattle, Washington, USA
| | - Krešimir Josić
- Department of Mathematics, University of Houston, Houston, Texas, USA; Department of Biology and Biochemistry, University of Houston, Houston, Texas, USA
| |
Collapse
|
37
|
|
38
|
Bier M, Lisowski B, Gudowska-Nowak E. Phase transitions and entropies for synchronizing oscillators. Phys Rev E 2016; 93:012143. [PMID: 26871059 DOI: 10.1103/physreve.93.012143] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2015] [Indexed: 06/05/2023]
Abstract
We study a generic model of coupled oscillators. In the model there is competition between phase synchronization and diffusive effects. For a model with a finite number of states we derive how a phase transition occurs when the coupling parameter is varied. The phase transition is characterized by a symmetry breaking and a discontinuity in the first derivative of the order parameter. We quantitatively account for how the synchronized pulse is a low-entropy structure that facilitates the production of more entropy by the system as a whole. For a model with many states we apply a continuum approximation and derive a potential Burgers' equation for a propagating pulse. No phase transition occurs in that case. However, positive entropy production by diffusive effects still exceeds negative entropy production by the shock formation.
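The competition between coupling-driven synchronization and dispersion can be illustrated with the standard mean-field Kuramoto model, a close relative of the generic coupled-oscillator model studied here. This sketch is not the authors' finite-state model, and all parameters are assumptions; it only shows the qualitative transition in the order parameter r as coupling is varied.

```python
import numpy as np

def kuramoto_order(K, n=500, dt=0.01, steps=5000, spread=1.0, seed=4):
    """Mean-field Kuramoto model: r = |<exp(i*theta)>| measures phase
    synchronization and grows once the coupling K exceeds a critical value."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, spread, n)          # heterogeneous natural frequencies
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(steps):
        z = np.exp(1j * theta).mean()           # complex order parameter
        theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
    return float(np.abs(np.exp(1j * theta).mean()))

r_low, r_high = kuramoto_order(K=0.5), kuramoto_order(K=4.0)
```

Below the critical coupling r stays at the finite-size floor of order 1/sqrt(n); above it, a macroscopic synchronized cluster forms, the kind of symmetry breaking whose entropic cost and benefit the paper quantifies.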
Collapse
Affiliation(s)
- Martin Bier
- M. Smoluchowski Institute of Physics, Jagiellonian University, ul. Łojasiewicza 11, 30-348 Kraków, Poland
- Department of Physics, East Carolina University, Greenville, North Carolina 27858, USA
| | - Bartosz Lisowski
- M. Smoluchowski Institute of Physics, Jagiellonian University, ul. Łojasiewicza 11, 30-348 Kraków, Poland
- Unit of Pharmacoepidemiology and Pharmacoeconomics, Faculty of Pharmacy, Jagiellonian University Medical College, ul. Medyczna 9, 30-688 Kraków, Poland
| | - Ewa Gudowska-Nowak
- M. Smoluchowski Institute of Physics, Jagiellonian University, ul. Łojasiewicza 11, 30-348 Kraków, Poland
- Mark Kac Center for Complex Systems Research and Malopolska Center of Biotechnology, Jagiellonian University, Gronostajowa 7A, 30-387 Kraków, Poland
| |
Collapse
|
39
|
Puelma Touzel M, Wolf F. Complete Firing-Rate Response of Neurons with Complex Intrinsic Dynamics. PLoS Comput Biol 2015; 11:e1004636. [PMID: 26720924 PMCID: PMC4697854 DOI: 10.1371/journal.pcbi.1004636] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2015] [Accepted: 10/29/2015] [Indexed: 11/23/2022] Open
Abstract
The response of a neuronal population over a space of inputs depends on the intrinsic properties of its constituent neurons. Two main modes of single neuron dynamics–integration and resonance–have been distinguished. While resonator cell types exist in a variety of brain areas, few models incorporate this feature and fewer have investigated its effects. To understand better how a resonator’s frequency preference emerges from its intrinsic dynamics and contributes to its local area’s population firing rate dynamics, we analyze the dynamic gain of an analytically solvable two-degree-of-freedom neuron model. In the Fokker-Planck approach, the dynamic gain is intractable. The alternative Gauss-Rice approach lifts the resetting of the voltage after a spike. This allows us to derive a complete expression for the dynamic gain of a resonator neuron model in terms of a cascade of filters on the input. We find six distinct response types and use them to fully characterize the routes to resonance across all values of the relevant timescales. We find that resonance arises primarily due to slow adaptation with an intrinsic frequency acting to sharpen and adjust the location of the resonant peak. We determine the parameter regions for the existence of an intrinsic frequency and for subthreshold and spiking resonance, finding all possible intersections of the three. The expressions and analysis presented here provide an account of how intrinsic neuron dynamics shape dynamic population response properties and can facilitate the construction of an exact theory of correlations and stability of population activity in networks containing populations of resonator neurons.

Dynamic gain, the amount by which features at specific frequencies in the input to a neuron are amplified or attenuated in its output spiking, is fundamental for the encoding of information by neural populations.
Most studies of dynamic gain have focused on neurons without intrinsic degrees of freedom exhibiting integrator-type subthreshold dynamics. Many neuron types in the brain, however, exhibit complex subthreshold dynamics such as resonance, found for instance in cortical interneurons, stellate cells, and mitral cells. A resonator neuron has at least two degrees of freedom for which the classical Fokker-Planck approach to calculating the dynamic gain is largely intractable. Here, we lift the voltage-reset rule after a spike, allowing us to derive a complete expression of the dynamic gain of a resonator neuron model. We find the gain can exhibit only six shapes. The resonant ones have peaks that become large due to intrinsic adaptation and become sharp due to an intrinsic frequency. A resonance can nevertheless result from either property. The analysis presented here helps explain how intrinsic neuron dynamics shape population-level response properties and provides a powerful tool for developing theories of inter-neuron correlations and dynamic responses of neural populations.
Collapse
Affiliation(s)
- Maximilian Puelma Touzel
- Department for Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization, Goettingen, Germany
- Bernstein Center for Computational Neuroscience, Goettingen, Germany
- Institute for Nonlinear Dynamics, Georg-August University School of Science, Goettingen, Germany
- * E-mail:
| | - Fred Wolf
- Department for Nonlinear Dynamics, Max Planck Institute for Dynamics and Self-Organization, Goettingen, Germany
- Bernstein Center for Computational Neuroscience, Goettingen, Germany
- Institute for Nonlinear Dynamics, Georg-August University School of Science, Goettingen, Germany
- Kavli Institute for Theoretical Physics, University of California Santa Barbara, Santa Barbara, California, United States of America
| |
Collapse
|
40
|
Kruscha A, Lindner B. Spike-count distribution in a neuronal population under weak common stimulation. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:052817. [PMID: 26651754 DOI: 10.1103/physreve.92.052817] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/27/2015] [Indexed: 06/05/2023]
Abstract
We study the probability distribution of the number of synchronous action potentials (spike count) in a model network consisting of a homogeneous neural population that is driven by a common time-dependent stimulus. We derive two analytical approximations for the count statistics, which are based on linear response theory and hold true for weak input correlations. Comparison to numerical simulations of populations of integrate-and-fire neurons in different parameter regimes reveals that our theory correctly predicts how much a weak common stimulus increases the probability of common firing and of common silence in the neural population.
Collapse
Affiliation(s)
- Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin and Institute of Physics, Humboldt University Berlin, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin and Institute of Physics, Humboldt University Berlin, Germany
| |
Collapse
|
41
|
Schuecker J, Diesmann M, Helias M. Modulated escape from a metastable state driven by colored noise. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:052119. [PMID: 26651659 DOI: 10.1103/physreve.92.052119] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/06/2014] [Indexed: 06/05/2023]
Abstract
Many phenomena in nature are described by excitable systems driven by colored noise. The temporal correlations in the fluctuations hinder an analytical treatment. We here present a general method of reduction to a white-noise system, capturing the color of the noise by effective and time-dependent boundary conditions. We apply the formalism to a model of the excitability of neuronal membranes, the leaky integrate-and-fire neuron model, revealing an analytical expression for the linear response of the system valid up to moderate frequencies. The closed form analytical expression enables the characterization of the response properties of such excitable units and the assessment of oscillations emerging in networks thereof.
Collapse
Affiliation(s)
- Jannis Schuecker
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
| | - Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen University, Aachen, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany
- Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany
| |
Collapse
|
42
|
Wei W, Wolf F, Wang XJ. Impact of membrane bistability on dynamical response of neuronal populations. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:032726. [PMID: 26465517 DOI: 10.1103/physreve.92.032726] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/05/2015] [Indexed: 06/05/2023]
Abstract
Neurons in many brain areas can develop a pronounced depolarized state of membrane potential (up state) in addition to the normal hyperpolarized state near the resting potential. The influence of the up state on signal encoding, however, is not well investigated. Here we construct a one-dimensional bistable neuron model and calculate the linear dynamical response to noisy oscillatory inputs analytically. We find that with the appearance of an up state, the transmission function is enhanced by the emergence of a local maximum at some optimal frequency and the phase lag relative to the input signal is reduced. We characterize the dependence of the enhancement of frequency response on intrinsic dynamics and on the occupancy of the up state.
Collapse
Affiliation(s)
- Wei Wei
- Center for Neural Science, New York University, New York, New York 10003, USA
- Department of Neurobiology and Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, Connecticut 06520, USA
| | - Fred Wolf
- Max Planck Institute for Dynamics and Self-Organization and Bernstein Center for Computational Neuroscience, D-37077 Göttingen, Germany
| | - Xiao-Jing Wang
- Center for Neural Science, New York University, New York, New York 10003, USA
- Department of Neurobiology and Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, Connecticut 06520, USA
- NYU-ECNU Institute of Brain and Cognitive Science, NYU Shanghai, Shanghai, China
| |
Collapse
|
43
|
Abstract
The attenuation of neuronal voltage responses to high-frequency current inputs by the membrane capacitance is believed to limit single-cell bandwidth. However, neuronal populations subject to stochastic fluctuations can follow inputs beyond this limit. We investigated this apparent paradox theoretically and experimentally using Purkinje cells in the cerebellum, a motor structure that benefits from rapid information transfer. We analyzed the modulation of firing in response to the somatic injection of sinusoidal currents. Computational modeling suggested that, instead of decreasing with frequency, modulation amplitude can increase up to high frequencies because of cellular morphology. Electrophysiological measurements in adult rat slices confirmed this prediction and displayed a marked resonance at 200 Hz. We elucidated the underlying mechanism, showing that the two-compartment morphology of the Purkinje cell, interacting with a simple spiking mechanism and dendritic fluctuations, is sufficient to create high-frequency signal amplification. This mechanism, which we term morphology-induced resonance, is selective for somatic inputs, which in the Purkinje cell are exclusively inhibitory. The resonance sensitizes Purkinje cells in the frequency range of population oscillations observed in vivo.
Collapse
|
44
|
Double-well dynamics of noise-driven control activation in human intermittent control: the case of stick balancing. Cogn Process 2015; 16:351-8. [PMID: 25925132 DOI: 10.1007/s10339-015-0653-5] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2014] [Accepted: 04/07/2015] [Indexed: 10/23/2022]
Abstract
When facing a task of balancing a dynamic system near an unstable equilibrium, humans often adopt an intermittent control strategy: Instead of continuously controlling the system, they repeatedly switch the control on and off. A paradigmatic example of such a task is stick balancing. Despite the simplicity of the task itself, the complexity of human intermittent control dynamics in stick balancing still puzzles researchers in motor control. Here we attempt to model one of the key mechanisms of human intermittent control, control activation, using as an example the task of overdamped stick balancing. In doing so, we focus on the concept of noise-driven activation, a more general alternative to the conventional threshold-driven activation. We describe control activation as a random walk in an energy potential, which changes in response to the state of the controlled system. By way of numerical simulations, we show that the developed model captures the core properties of human control activation observed previously in the experiments on overdamped stick balancing. Our results demonstrate that the double-well potential model provides a tractable mathematical description of human control activation at least in the considered task and suggest that the adopted approach can potentially aid in understanding human intermittent control in more complex processes.
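In its simplest form, noise-driven activation of this kind reduces to an overdamped particle in a double-well potential. The following sketch uses an assumed quartic potential and an illustrative noise strength (not the paper's state-dependent potential) to show the hallmark behavior: long dwell times in one well interrupted by noise-driven switches.

```python
import numpy as np

def double_well_walk(noise=0.6, dt=1e-3, n_steps=200_000, seed=2):
    """Euler-Maruyama simulation of an overdamped random walk in the
    double-well potential U(x) = x**4/4 - x**2/2 (wells at x = -1 and x = +1)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = -1.0                                  # start in the left well
    for i in range(1, n_steps):
        drift = x[i - 1] - x[i - 1] ** 3         # -U'(x)
        x[i] = x[i - 1] + drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
    return x

x = double_well_walk()
wells = np.sign(x[np.abs(x) > 0.5])              # which well, ignoring the barrier region
switches = int(np.count_nonzero(np.diff(wells)))
```

With barrier height ΔU = 1/4 and noise intensity D = noise**2/2, the switching rate follows a Kramers-type law proportional to exp(-ΔU/D), so weaker noise makes activation events exponentially rarer; coupling the potential shape to the controlled system's state is what turns this into a model of control activation.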
Collapse
|
45
|
Abstract
The time course of behaviorally relevant environmental events sets temporal constraints on neuronal processing. How does the mammalian brain make use of the increasingly complex networks of the neocortex, while making decisions and executing behavioral reactions within a reasonable time? The key parameter determining the speed of computations in neuronal networks is the time interval that neuronal ensembles need to process changes at their input and communicate the results of this processing to downstream neurons. Theoretical analysis identified basic requirements for fast processing: use of neuronal populations for encoding, background activity, and fast onset dynamics of action potentials in neurons. Experimental evidence shows that populations of neocortical neurons fulfil these requirements. Indeed, they can change firing rate in response to input perturbations very quickly, within 1 to 3 ms, and encode high-frequency components of the input by phase-locking their spiking to frequencies up to 300 to 1000 Hz. This implies that the time unit of computations by cortical ensembles is only a few milliseconds, 1 to 3 ms, which is considerably faster than the membrane time constant of individual neurons. The ability of cortical neuronal ensembles to communicate on a millisecond time scale allows for complex, multiple-step processing and precise coordination of neuronal activity in parallel processing streams, while keeping the speed of behavioral reactions within environmentally set temporal constraints.
Collapse
|
46
|
Lagzi F, Rotter S. A Markov model for the temporal dynamics of balanced random networks of finite size. Front Comput Neurosci 2014; 8:142. [PMID: 25520644 PMCID: PMC4253948 DOI: 10.3389/fncom.2014.00142] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2014] [Accepted: 10/20/2014] [Indexed: 11/21/2022] Open
Abstract
The balanced state of recurrent networks of excitatory and inhibitory spiking neurons is characterized by fluctuations of population activity about an attractive fixed point. Numerical simulations show that these dynamics are essentially nonlinear, and the intrinsic noise (self-generated fluctuations) in networks of finite size is state-dependent. Therefore, stochastic differential equations with additive noise of fixed amplitude cannot provide an adequate description of the stochastic dynamics. The noise model should, rather, result from a self-consistent description of the network dynamics. Here, we consider a two-state Markovian neuron model, where spikes correspond to transitions from the active state to the refractory state. Excitatory and inhibitory input to this neuron affects the transition rates between the two states. The corresponding nonlinear dependencies can be identified directly from numerical simulations of networks of leaky integrate-and-fire neurons, discretized at a time resolution in the sub-millisecond range. Deterministic mean-field equations, and a noise component that depends on the dynamic state of the network, are obtained from this model. The resulting stochastic model reflects the behavior observed in numerical simulations quite well, irrespective of the size of the network. In particular, a strong temporal correlation between the two populations, a hallmark of the balanced state in random recurrent networks, is well represented by our model. Numerical simulations of such networks show that a log-normal distribution of short-term spike counts is a property of balanced random networks with fixed in-degree that has not been considered before, and our model shares this statistical property. Furthermore, the reconstruction of the flow from simulated time series suggests that the mean-field dynamics of finite-size networks are essentially of Wilson-Cowan type. We expect that this novel nonlinear stochastic model of the interaction between neuronal populations also opens new doors to analyze the joint dynamics of multiple interacting networks.
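The two-state picture can be illustrated with a toy finite-size simulation in which each spike is an active-to-refractory transition. This sketch does not use the rate functions identified in the paper: the firing rate is given an arbitrary increasing dependence on the active fraction, and all parameters (N, tau_r, the rate constants) are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 1000        # network size (illustrative)
dt, steps = 1e-4, 20000
tau_r = 5e-3    # mean dwell time in the refractory state (illustrative)

a = 0.8         # fraction of neurons in the active state
spike_counts = []
for _ in range(steps):
    # spiking = active -> refractory transition; the rate's dependence on the
    # active fraction is an arbitrary stand-in for the measured nonlinearity
    nu = 20.0 + 380.0 * a**2
    n_active = int(a * N)
    fired = rng.binomial(n_active, 1.0 - np.exp(-nu * dt))
    recovered = rng.binomial(N - n_active, 1.0 - np.exp(-dt / tau_r))
    a += (recovered - fired) / N
    spike_counts.append(fired)

print("active fraction stayed in (0, 1):", 0.0 < a < 1.0)
print("mean spikes per step:", round(float(np.mean(spike_counts)), 2))
```

The finite-size noise here is binomial and hence state-dependent, in the spirit of the self-consistent noise model argued for above; the active fraction relaxes to the fixed point where firing and recovery balance.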
Collapse
Affiliation(s)
- Fereshteh Lagzi
- Bernstein Center Freiburg and Faculty of Biology, University of Freiburg, Freiburg, Germany
| | | |
Collapse
|
47
|
Droste F, Lindner B. Integrate-and-fire neurons driven by asymmetric dichotomous noise. BIOLOGICAL CYBERNETICS 2014; 108:825-843. [PMID: 25037240 DOI: 10.1007/s00422-014-0621-7] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/31/2014] [Accepted: 07/08/2014] [Indexed: 06/03/2023]
Abstract
We consider a general integrate-and-fire (IF) neuron driven by asymmetric dichotomous noise. In contrast to the Gaussian white noise usually used in the so-called diffusion approximation, this noise is colored, i.e., it exhibits temporal correlations. We give an analytical expression for the stationary voltage distribution of a neuron receiving such noise and derive recursive relations for the moments of the first passage time density, which allow us to calculate the firing rate and the coefficient of variation of interspike intervals. We study how correlations in the input affect the rate and regularity of firing under variation of the model's parameters for leaky and quadratic IF neurons. Further, we consider the limit of small correlation times and find lowest order corrections to the first passage time moments to be proportional to the square root of the correlation time. We show analytically that to this lowest order, correlations always lead to a decrease in firing rate for a leaky IF neuron. All theoretical expressions are compared to simulations of leaky and quadratic IF neurons.
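As a rough illustration of the setup, here is an Euler simulation of a leaky IF neuron driven by asymmetric two-state (telegraph) noise, estimating the firing rate and the coefficient of variation of interspike intervals. The amplitudes and switching rates are invented for the sketch and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

dt, T = 1e-3, 100.0
mu, v_th, v_reset = 0.8, 1.0, 0.0   # base current, threshold, reset (tau_m = 1, illustrative)
eta_plus, eta_minus = 0.6, 0.3      # asymmetric noise amplitudes (illustrative)
k_plus, k_minus = 5.0, 10.0         # switching rates out of the +/- states (illustrative)

v, state, t, last_spike = 0.0, +1, 0.0, 0.0
isis = []
for _ in range(int(T / dt)):
    t += dt
    eta = eta_plus if state > 0 else -eta_minus
    v += (-v + mu + eta) * dt                 # leaky integration
    if rng.random() < (k_plus if state > 0 else k_minus) * dt:
        state = -state                        # dichotomous noise flips state
    if v >= v_th:                             # fire-and-reset rule
        v = v_reset
        isis.append(t - last_spike)
        last_spike = t

isis = np.array(isis)
print("rate:", round(len(isis) / T, 3), "CV:", round(float(isis.std() / isis.mean()), 3))
```

With these placeholder values the time-averaged drive exceeds threshold, so the neuron fires regularly; the asymmetry of the noise shows up in the ISI variability rather than in the mean drive alone.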
Collapse
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
| | | |
Collapse
|
48
|
Schuecker J, Diesmann M, Helias M. The transfer function of the LIF model: from white to filtered noise. BMC Neurosci 2014. [PMCID: PMC4125055 DOI: 10.1186/1471-2202-15-s1-p146] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022] Open
|
49
|
Kriener B, Helias M, Rotter S, Diesmann M, Einevoll GT. How pattern formation in ring networks of excitatory and inhibitory spiking neurons depends on the input current regime. Front Comput Neurosci 2014; 7:187. [PMID: 24501591 PMCID: PMC3882721 DOI: 10.3389/fncom.2013.00187] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/12/2013] [Accepted: 12/09/2013] [Indexed: 11/13/2022] Open
Abstract
Pattern formation, i.e., the generation of an inhomogeneous spatial activity distribution in a dynamical system with translation invariant structure, is a well-studied phenomenon in neuronal network dynamics, specifically in neural field models. These are population models to describe the spatio-temporal dynamics of large groups of neurons in terms of macroscopic variables such as population firing rates. Though neural field models are often deduced from and equipped with biophysically meaningful properties, a direct mapping to simulations of individual spiking neuron populations is rarely considered. Neurons have a distinct identity defined by their action on their postsynaptic targets. In its simplest form they act either excitatorily or inhibitorily. When the distribution of neuron identities is assumed to be periodic, pattern formation can be observed, given the coupling strength is supracritical, i.e., larger than a critical weight. We find that this critical weight is strongly dependent on the characteristics of the neuronal input, i.e., it depends on whether neurons are mean- or fluctuation-driven, and different limits in linearizing the full non-linear system apply in order to assess stability. In particular, if neurons are mean-driven, the linearization has a very simple form and becomes independent of both the fixed point firing rate and the variance of the input current, while in the very strongly fluctuation-driven regime the fixed point rate, as well as the input mean and variance, are important parameters in the determination of the critical weight. We demonstrate that, interestingly, even in "intermediate" regimes, when the system is technically fluctuation-driven, the simple linearization neglecting the variance of the input can yield a better prediction of the critical coupling strength. We moreover analyze the effects of structural randomness by rewiring individual synapses or redistributing weights, as well as coarse-graining, on the formation of inhomogeneous activity patterns.
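For a flavor of the simple mean-driven linearization, one can read off a critical coupling from the Fourier spectrum of the effective ring profile: a spatial mode grows once the gain times its Fourier component exceeds one. The cosine profile and unit gain below are illustrative stand-ins, not the spiking-network quantities of the paper.

```python
import numpy as np

N = 200
x = np.arange(N)
# effective coupling profile on the ring from the periodic arrangement of
# excitatory and inhibitory identities (cosine shape assumed for this sketch)
profile = np.cos(2.0 * np.pi * x / N)
gain = 1.0   # slope of the linearized rate function (illustrative)

def critical_weight(profile, gain):
    """Weight at which the largest nonuniform Fourier mode becomes unstable."""
    spectrum = np.abs(np.fft.fft(profile)) / len(profile)
    spectrum[0] = 0.0   # exclude the homogeneous mode
    return 1.0 / (gain * spectrum.max())

print(round(critical_weight(profile, gain), 3))  # cosine profile -> 2.0
```

A pure cosine has a single nonuniform Fourier pair of magnitude 1/2, so the critical weight is 2/gain; richer profiles or fluctuation-driven corrections would shift this value.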
Collapse
Affiliation(s)
- Birgit Kriener
- Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, Ås, Norway
| | - Moritz Helias
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany
| | - Stefan Rotter
- Faculty of Biology, University of Freiburg, Freiburg, Germany; Bernstein Center Freiburg, University of Freiburg, Freiburg, Germany
| | - Markus Diesmann
- Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany; Medical Faculty, RWTH Aachen University, Aachen, Germany
| | - Gaute T Einevoll
- Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, Ås, Norway
| |
Collapse
|
50
|
Low-Pass Filtering of Information in the Leaky Integrate-and-Fire Neuron Driven by White Noise. ACTA ACUST UNITED AC 2013. [DOI: 10.1007/978-3-319-02925-2_22] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/07/2023]
|