1
Puttkammer F, Lindner B. Fluctuation-response relations for integrate-and-fire models with an absolute refractory period. Biological Cybernetics 2024; 118:7-19. PMID: 38261004; PMCID: PMC11068698; DOI: 10.1007/s00422-023-00982-9.
Abstract
We study the problem of relating the spontaneous fluctuations of a stochastic integrate-and-fire (IF) model to the response of the instantaneous firing rate to time-dependent stimulation when the IF model is endowed with a non-vanishing refractory period and a finite (stereotypical) spike shape. This seemingly harmless addition to the model is shown to complicate the analysis put forward by Lindner, Phys. Rev. Lett. (2022), i.e., the incorporation of the reset into the model equation, the Rice-like averaging of the stochastic differential equation, and the application of the Furutsu-Novikov theorem. We derive a still exact (although more complicated) fluctuation-response relation (FRR) for an IF model with a refractory state and white Gaussian background noise. We also briefly discuss an approximation for the case of colored Gaussian noise and conclude with a summary and an outlook on open problems.
Affiliation(s)
- Friedrich Puttkammer
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
2
Richardson MJE. Linear and nonlinear integrate-and-fire neurons driven by synaptic shot noise with reversal potentials. Phys Rev E 2024; 109:024407. PMID: 38491664; DOI: 10.1103/PhysRevE.109.024407.
Abstract
The steady-state firing rate and firing-rate response of the leaky and exponential integrate-and-fire models receiving synaptic shot noise with excitatory and inhibitory reversal potentials are examined. For the particular case where the underlying synaptic conductances are exponentially distributed, it is shown that the master equation for a population of such model neurons can be reduced from an integrodifferential form to a more tractable set of three differential equations. The system is nevertheless more challenging analytically than for current-based synapses: analytical results are provided where possible, together with an efficient numerical scheme and code for the remaining quantities. The increased tractability of the framework developed here supports an ongoing critical comparison between models in which synapses are treated with and without reversal potentials, such as recently in the context of networks with balanced excitatory and inhibitory conductances.
Affiliation(s)
- Magnus J E Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry CV4 7AL, United Kingdom
3
Lindner B. Fluctuation-Dissipation Relations for Spiking Neurons. Physical Review Letters 2022; 129:198101. PMID: 36399734; DOI: 10.1103/PhysRevLett.129.198101.
Abstract
Spontaneous fluctuations and stimulus response are essential features of neural functioning, but how they are connected is poorly understood. I derive fluctuation-dissipation relations (FDR) between the spontaneous spike and voltage correlations and the firing rate susceptibility for (i) the leaky integrate-and-fire (IF) model with white noise and (ii) an IF model with arbitrary voltage dependence, an adaptation current, and correlated noise. The FDRs can be used to derive thus far unknown statistics analytically [model (i)] or the otherwise inaccessible intrinsic noise statistics [model (ii)].
Affiliation(s)
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstraße 15, 12489 Berlin, Germany
4
Bidari S, El Hady A, Davidson JD, Kilpatrick ZP. Stochastic dynamics of social patch foraging decisions. Physical Review Research 2022; 4:033128. PMID: 36090768; PMCID: PMC9461581; DOI: 10.1103/PhysRevResearch.4.033128.
Abstract
Animals typically forage in groups. Social foraging can help animals avoid predation and decrease their uncertainty about the richness of food resources. Despite this, theoretical mechanistic models of patch foraging have overwhelmingly focused on the behavior of single foragers. In this study, we develop a mechanistic model that accounts for the behavior of individuals foraging together and departing food patches following an evidence accumulation process. Each individual's belief about patch quality is represented by a stochastically accumulating variable, which is coupled to another's belief to represent the transfer of information. We consider a cohesive group, and model information sharing by considering both intermittent pulsatile coupling (only communicate decision to leave) and continuous diffusive coupling (communicate throughout the deliberation process). Groups employing pulsatile coupling can obtain higher foraging efficiency, which depends more strongly on the coupling parameter compared to those using diffusive coupling. Conversely, groups using diffusive coupling are more robust to changes and heterogeneities in belief weighting and departure criteria. Efficiency is measured by a reward rate function that balances the amount of energy accumulated against the time spent in a patch, computed by solving an ordered first passage time problem for the patch departures of each individual. Using synthetic departure time data, we can distinguish between the two modes of communication and identify the model parameters. Our model establishes a social patch foraging framework to identify deliberative decision strategies and forms of social communication, and to allow model fitting to field data from foraging animal groups.
Affiliation(s)
- Subekshya Bidari
- Department of Applied Mathematics, University of Colorado Boulder, Boulder, Colorado 80309, USA
- Ahmed El Hady
- Princeton Neuroscience Institute, Princeton, New Jersey 08540, USA
- Department of Collective Behavior, Max Planck Institute for Animal Behavior, Konstanz D-78457, Germany
- Cluster for Advanced Study of Collective Behavior, Max Planck Institute for Animal Behavior, Konstanz D-78457, Germany
- Jacob D. Davidson
- Department of Collective Behavior, Max Planck Institute for Animal Behavior, Konstanz D-78457, Germany
- Zachary P. Kilpatrick
- Department of Applied Mathematics, University of Colorado Boulder, Boulder, Colorado 80309, USA
5
Metzner C, Krauss P. Dynamics and Information Import in Recurrent Neural Networks. Front Comput Neurosci 2022; 16:876315. PMID: 35573264; PMCID: PMC9091337; DOI: 10.3389/fncom.2022.876315.
Abstract
Recurrent neural networks (RNNs) are complex dynamical systems, capable of ongoing activity without any driving input. The long-term behavior of free-running RNNs, described by periodic, chaotic and fixed point attractors, is controlled by the statistics of the neural connection weights, such as the density d of non-zero connections, or the balance b between excitatory and inhibitory connections. However, for information processing purposes, RNNs need to receive external input signals, and it is not clear which of the dynamical regimes is optimal for this information import. We use both the average correlations C and the mutual information I between the momentary input vector and the next system state vector as quantitative measures of information import and analyze their dependence on the balance and density of the network. Remarkably, both resulting phase diagrams C(b, d) and I(b, d) are highly consistent, pointing to a link between the dynamical systems and the information-processing approach to complex systems. Information import is maximal not at the "edge of chaos," which is optimally suited for computation, but surprisingly in the low-density chaotic regime and at the border between the chaotic and fixed point regime. Moreover, we find a completely new type of resonance phenomenon, which we call "Import Resonance" (IR), where the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its external input. IR complements previously found Recurrence Resonance (RR), where correlation and mutual information of successive system states peak for a certain amplitude of noise added to the system. Both IR and RR can be exploited to optimize information processing in artificial neural networks and might also play a crucial role in biological neural systems.
Affiliation(s)
- Claus Metzner
- Neuroscience Lab, University Hospital Erlangen, Erlangen, Germany
- Patrick Krauss
- Neuroscience Lab, University Hospital Erlangen, Erlangen, Germany
- Cognitive Computational Neuroscience Group, Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, Germany
- Pattern Recognition Lab, Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, Germany
6
Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. PMID: 35590546; DOI: 10.1103/PhysRevE.105.044411.
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
8
Pietras B, Gallice N, Schwalger T. Low-dimensional firing-rate dynamics for populations of renewal-type spiking neurons. Phys Rev E 2020; 102:022407. PMID: 32942450; DOI: 10.1103/PhysRevE.102.022407.
Abstract
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects and nonstationary responses of the population activity to rapidly changing stimuli. Here we derive low-dimensional firing-rate models for homogeneous populations of neurons modeled as time-dependent renewal processes. The class of renewal neurons includes integrate-and-fire models driven by white noise and has been frequently used to model neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues characterizing the timescales of the firing rate dynamics and the Laplace transform of the interspike interval density, for which explicit expressions are available for many renewal models. Retaining only the first eigenmode already yields a reliable low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness and other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for alternative firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
Affiliation(s)
- Bastian Pietras
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
- Noé Gallice
- Brain Mind Institute, École polytechnique fédérale de Lausanne (EPFL), Station 15, CH-1015 Lausanne, Switzerland
- Tilo Schwalger
- Institute of Mathematics, Technical University Berlin, 10623 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, 10115 Berlin, Germany
9
Baspinar E, Schülen L, Olmi S, Zakharova A. Coherence resonance in neuronal populations: Mean-field versus network model. Phys Rev E 2021; 103:032308. PMID: 33862689; DOI: 10.1103/PhysRevE.103.032308.
Abstract
The counterintuitive phenomenon of coherence resonance describes a nonmonotonic behavior of the regularity of noise-induced oscillations in the excitable regime, leading to an optimal response in terms of regularity of the excited oscillations for an intermediate noise intensity. We study this phenomenon in populations of FitzHugh-Nagumo (FHN) neurons with different coupling architectures. For networks of FHN systems in an excitable regime, coherence resonance has been previously analyzed numerically. Here we focus on an analytical approach studying the mean-field limits of the globally and locally coupled populations. The mean-field limit refers to an averaged behavior of a complex network as the number of elements goes to infinity. We apply the mean-field approach to the globally coupled FHN network. Further, we derive a mean-field limit approximating the locally coupled FHN network with low noise intensities. We study the effects of the coupling strength and noise intensity on coherence resonance for both the network and the mean-field models. We compare the results of the mean-field and network frameworks and find good agreement in the globally coupled case, where the correspondence between the two approaches is sufficiently good to capture the emergence of coherence resonance, as well as of anticoherence resonance.
Affiliation(s)
- Emre Baspinar
- Inria Sophia Antipolis Méditerranée Research Centre, 2004 Route des Lucioles, 06902 Valbonne, France
- Leonhard Schülen
- Institut für Theoretische Physik, Technische Universität Berlin, Hardenbergstraße 36, 10623 Berlin, Germany
- Simona Olmi (joint senior author)
- Inria Sophia Antipolis Méditerranée Research Centre, 2004 Route des Lucioles, 06902 Valbonne, France
- CNR - Consiglio Nazionale delle Ricerche - Istituto dei Sistemi complessi, 50019 Sesto Fiorentino, Italy
- Anna Zakharova (joint senior author)
- Institut für Theoretische Physik, Technische Universität Berlin, Hardenbergstraße 36, 10623 Berlin, Germany
10
D’Huys O, Veltz R, Dolcemascolo A, Marino F, Barland S. Canard resonance: on noise-induced ordering of trajectories in heterogeneous networks of slow-fast systems. JPhys Photonics 2021. DOI: 10.1088/2515-7647/abcbe3.
Abstract
We analyse the dynamics of a network of semiconductor lasers coupled via their mean intensity through a non-linear optoelectronic feedback loop. We establish experimentally the excitable character of a single node, which stems from the slow-fast nature of the system, adequately described by a set of rate equations with three well separated time scales. Beyond the excitable regime, the system undergoes relaxation oscillations where the nodes display canard dynamics. We show numerically that, without noise, the coupled system follows an intricate canard trajectory, with the nodes switching on one by one. While incorporating noise leads to a better correspondence between numerical simulations and experimental data, it also has an unexpected ordering effect on the canard orbit, causing the nodes to switch on closer together in time. We find that the dispersion of the trajectories of the network nodes in phase space is minimized for a non-zero noise strength, and call this phenomenon canard resonance.
11
Zhu J. Unified mechanism of inverse stochastic resonance for monostability and bistability in Hindmarsh-Rose neuron. Chaos 2021; 31:033119. PMID: 33810765; DOI: 10.1063/5.0041410.
Abstract
Noise is ubiquitous and has been verified to play constructive roles in various systems, among which the inverse stochastic resonance (ISR) has aroused much attention in contrast to positive effects such as stochastic resonance. The ISR has been observed in both bistable and monostable systems for which the mechanisms are revealed as noise-induced biased switching and noise-enhanced stability, respectively. In this paper, we investigate the ISR phenomenon in the monostable and bistable Hindmarsh-Rose neurons within a unified framework of large deviation theory. The critical noise strengths for both cases can be obtained by matching the timescales between noise-induced boundary crossing and the limit cycle. Furthermore, different stages of ISR are revealed by the bursting frequency distribution, where the gradual increase of the peak bursting frequency can also be explained within the same framework. The perspective and results in this paper may shed some light on the understanding of the noise-induced complex phenomena in stochastic dynamical systems.
Affiliation(s)
- Jinjie Zhu
- School of Mechanical Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
12
Abstract
Power spectra of spike trains reveal important properties of neuronal behavior. They exhibit several peaks, whose shape and position depend on applied stimuli and intrinsic biophysical properties, such as input current density and channel noise. The position of the spectral peaks in the frequency domain is not straightforwardly predictable from statistical averages of the interspike intervals, especially when stochastic behavior prevails. In this work, we provide a model for the neuronal power spectrum, obtained from the Discrete Fourier Transform and expressed as a series of expected values of sinusoidal terms. The first term of the series allows us to estimate the frequencies of the spectral peaks to a maximum error of a few Hz, and to interpret why they are not harmonics of the first peak frequency. Thus, the simple expression of the proposed power spectral density (PSD) model makes it a powerful interpretative tool of PSD shape, and also useful for neurophysiological studies aimed at extracting information on neuronal behavior from spike train spectra.
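As a point of comparison for the spectral model summarized above, the raw spike-train PSD that such a model describes can be estimated directly with the Discrete Fourier Transform. The sketch below is illustrative only: the spike train, bin width, and jitter are invented for the example, and the code does not implement the paper's series expansion.

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 1e-3                      # bin width in seconds (hypothetical)
T = 20.0                       # total duration in seconds
n_bins = int(round(T / dt))

# Toy spike train: roughly periodic firing near 50 Hz with timing jitter
isis = 0.02 + 0.002 * rng.standard_normal(1500)
spike_times = np.cumsum(isis)
spike_times = spike_times[spike_times < T]
x = np.zeros(n_bins)
x[(spike_times / dt).astype(int)] = 1.0 / dt   # delta-like spikes

# One-sided periodogram via the DFT (averaged over trials in practice)
X = np.fft.rfft(x - x.mean()) * dt
freqs = np.fft.rfftfreq(n_bins, dt)
psd = np.abs(X) ** 2 / T

peak_freq = freqs[np.argmax(psd[1:]) + 1]
print(f"spectral peak near {peak_freq:.1f} Hz")
```

For this renewal train with a 20 ms mean interval, the first spectral peak sits near the mean firing rate of about 50 Hz, while the higher peaks broaden with the jitter, consistent with the abstract's point that the peaks need not be exact harmonics of the first.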
13
Bostner Ž, Knoll G, Lindner B. Information filtering by coincidence detection of synchronous population output: analytical approaches to the coherence function of a two-stage neural system. Biological Cybernetics 2020; 114:403-418. PMID: 32583370; PMCID: PMC7326833; DOI: 10.1007/s00422-020-00838-6.
Abstract
Information about time-dependent sensory stimuli is encoded in the activity of neural populations; distinct aspects of the stimulus are read out by different types of neurons: while overall information is perceived by integrator cells, so-called coincidence detector cells are driven mainly by the synchronous activity in the population that encodes predominantly high-frequency content of the input signal (high-pass information filtering). Previously, an analytically accessible statistic called the partial synchronous output was introduced as a proxy for the coincidence detector cell's output in order to approximate its information transmission. In the first part of the current paper, we compare the information filtering properties (specifically, the coherence function) of this proxy to those of a simple coincidence detector neuron. We show that the latter's coherence function can indeed be well-approximated by the partial synchronous output with a time scale and threshold criterion that are related approximately linearly to the membrane time constant and firing threshold of the coincidence detector cell. In the second part of the paper, we propose an alternative theory for the spectral measures (including the coherence) of the coincidence detector cell that combines linear-response theory for shot-noise driven integrate-and-fire neurons with a novel perturbation ansatz for the spectra of spike-trains driven by colored noise. We demonstrate how the variability of the synaptic weights for connections from the population to the coincidence detector can shape the information transmission of the entire two-stage system.
Affiliation(s)
- Žiga Bostner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Gregory Knoll
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
14
Herfurth T, Tchumatchenko T. Quantifying encoding redundancy induced by rate correlations in Poisson neurons. Phys Rev E 2019; 99:042402. PMID: 31108645; DOI: 10.1103/PhysRevE.99.042402.
Abstract
Temporal correlations in neuronal spike trains are known to introduce redundancy to stimulus encoding. However, exact methods to describe how these correlations impact neural information transmission quantitatively are lacking. Here, we provide a general measure for the information carried by correlated rate modulations only, neglecting other spike correlations, and use it to investigate the effect of rate correlations on encoding redundancy. We derive it analytically by calculating the mutual information between a time-correlated, rate modulating signal and the resulting spikes of Poisson neurons. Whereas this information is determined by spike autocorrelations only, the redundancy in information encoding due to rate correlations depends on both the distribution and the autocorrelation of the rate histogram. We further demonstrate that at very small signal strengths the information carried by rate correlated spikes becomes identical to that of independent spikes, in effect measuring the signal modulation depth. In contrast, a vanishing signal correlation time maximizes information but does not generally yield the information of independent spikes. Overall, our study sheds light on the role of signal-induced temporal correlations for neural coding, by providing insight into how signal features shape redundancy and by establishing mathematical links between existing methods.
Affiliation(s)
- Tim Herfurth
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
- Tatjana Tchumatchenko
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
15
Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E 2019; 99:032402. PMID: 30999498; DOI: 10.1103/PhysRevE.99.032402.
Abstract
We study temporal correlations of interspike intervals, quantified by the network-averaged serial correlation coefficient (SCC), in networks of both current- and conductance-based purely inhibitory integrate-and-fire neurons. Numerical simulations reveal transitions to negative SCCs at intermediate values of bias current drive and network size. As bias drive and network size are increased past these values, the SCC returns to zero. The SCC is maximally negative at an intermediate value of the network oscillation strength. The dependence of the SCC on two canonical schemes for synaptic connectivity is studied, and it is shown that the results occur robustly in both schemes. For conductance-based synapses, the SCC becomes negative at the onset of both a fast and slow coherent network oscillation. We then show by means of offline simulations using prerecorded network activity that a neuron's SCC is highly sensitive to its number of presynaptic inputs. Finally, we devise a noise-reduced diffusion approximation for current-based networks that accounts for the observed temporal correlation transitions.
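The network-averaged SCC discussed above is built from the standard per-train statistic rho_k = Cov(T_i, T_{i+k}) / Var(T_i). A minimal sketch of estimating it from a single ISI sequence follows; the two surrogate ISI models are invented for illustration and are not the network simulations of the paper.

```python
import numpy as np

def serial_correlation(isis, lag=1):
    """Estimate rho_lag = Cov(T_i, T_{i+lag}) / Var(T_i) from one ISI sequence."""
    d = np.asarray(isis, dtype=float) - np.mean(isis)
    return np.dot(d[:-lag], d[lag:]) / np.dot(d, d)

rng = np.random.default_rng(1)

# Renewal ISIs (independent intervals): SCC at lag 1 is close to zero
renewal = rng.exponential(1.0, 100000)

# Anticorrelated ISIs: adjacent intervals share a jitter term with opposite
# sign, so a long interval tends to follow a short one (theoretical rho_1 = -0.5)
e = rng.random(100001)
anticorrelated = 1.0 + e[1:] - e[:-1]

print(serial_correlation(renewal), serial_correlation(anticorrelated))
```

Negative lag-one SCCs of this kind are the signature that the simulations above report at intermediate bias drive and network size.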
Affiliation(s)
- Wilhelm Braun
- Neural Network Dynamics and Computation, Institut für Genetik, Universität Bonn, Kirschallee 1, 53115 Bonn, Germany
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
- André Longtin
- Department of Physics and Centre for Neural Dynamics, University of Ottawa, 598 King Edward, Ottawa K1N 6N5, Canada
16
Voronenko SO, Lindner B. Improved lower bound for the mutual information between signal and neural spike count. Biological Cybernetics 2018; 112:523-538. PMID: 30155699; DOI: 10.1007/s00422-018-0779-5.
Abstract
The mutual information between a stimulus signal and the spike count of a stochastic neuron is in many cases difficult to determine. Therefore, it is often approximated by a lower bound formula that involves linear correlations between input and output only. Here, we improve the linear lower bound for the mutual information by incorporating nonlinear correlations. For the special case of a Gaussian output variable with nonlinear signal dependencies of mean and variance we also derive an exact integral formula for the full mutual information. In our numerical analysis, we first compare the linear and nonlinear lower bounds and the exact integral formula for two different Gaussian models and show under which conditions the nonlinear lower bound provides a significant improvement to the linear approximation. We then inspect two neuron models, the leaky integrate-and-fire model with white Gaussian noise and the Na-K model with channel noise. We show that for certain firing regimes and for intermediate signal strengths the nonlinear lower bound can provide a substantial improvement compared to the linear lower bound. Our results demonstrate the importance of nonlinear input-output correlations for neural information transmission and provide a simple nonlinear approximation for the mutual information that can be applied to more complicated neuron models as well as to experimental data.
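For jointly Gaussian statistics, the linear lower bound mentioned above reduces to I_LB = -(1/2) log2(1 - rho^2), with rho the correlation coefficient between signal and spike count. The sketch below evaluates it for an invented Poisson toy encoder; the rate parameters are hypothetical, and the paper's nonlinear improvement is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy encoder: a Gaussian signal s linearly modulates a Poisson rate
n_trials = 200000
s = rng.standard_normal(n_trials)
rate = np.clip(10.0 + 4.0 * s, 0.0, None)   # rectified to keep rates valid
counts = rng.poisson(rate)

# Linear (Gaussian) lower bound on the mutual information, in bits
rho = np.corrcoef(s, counts)[0, 1]
i_lb = -0.5 * np.log2(1.0 - rho**2)
print(f"rho = {rho:.3f}, linear lower bound = {i_lb:.3f} bits")
```

The bound counts only the linearly correlated part of the dependence; signal-dependent variance, as in the Poisson counts here, is the kind of nonlinear structure the improved bound is designed to capture.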
Affiliation(s)
- Sergej O Voronenko
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
17
D'Onofrio G, Tamborrino M, Lansky P. The Jacobi diffusion process as a neuronal model. Chaos 2018; 28:103119. PMID: 30384666; DOI: 10.1063/1.5051494.
Abstract
The Jacobi process is a stochastic diffusion characterized by a linear drift and a special form of multiplicative noise which keeps the process confined between two boundaries. One example of such a process can be obtained as the diffusion limit of Stein's model of membrane depolarization, which includes both excitatory and inhibitory reversal potentials. The reversal potentials create the two boundaries between which the process is confined. Solving the first-passage-time problem for the Jacobi process, we found closed-form expressions for the mean, variance, and third moment that are easy to implement numerically. The first two moments are used here to determine the role played by the parameters of the neuronal model; namely, the effect of multiplicative noise on the output of the Jacobi neuronal model with input-dependent parameters is examined in detail and compared with the properties of the generic Jacobi diffusion. It appears that the dependence of the model parameters on the rate of inhibition is of primary importance for observing a change in the slope of the response curves. This dependence also affects the variability of the output, as reflected by the coefficient of variation, which often takes values larger than one and is not always a monotonic function of the excitation rate.
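To make the confinement concrete, the sketch below integrates one common parameterization of a Jacobi diffusion, dV = -theta (V - mu) dt + sigma sqrt((V - V_I)(V_E - V)) dW, with the Euler-Maruyama scheme. All parameter values are hypothetical and chosen for illustration; this is not the calibrated neuronal model of the paper, whose input-dependent parameters are its main point.

```python
import numpy as np

rng = np.random.default_rng(3)

V_I, V_E = -10.0, 100.0         # inhibitory / excitatory reversal potentials
theta, mu, sigma = 0.5, 20.0, 0.1
dt, n_steps = 1e-3, 200000

noise = rng.standard_normal(n_steps)
v = np.empty(n_steps)
v[0] = 0.0
for i in range(1, n_steps):
    drift = -theta * (v[i - 1] - mu)
    diffusion = sigma * np.sqrt((v[i - 1] - V_I) * (V_E - v[i - 1]))
    v[i] = v[i - 1] + drift * dt + diffusion * np.sqrt(dt) * noise[i]
    v[i] = min(max(v[i], V_I), V_E)   # guard against discretization overshoot

print(f"range [{v.min():.2f}, {v.max():.2f}], mean {v.mean():.2f}")
```

The multiplicative factor vanishes at both reversal potentials, so the exact process never crosses them; the clamp only compensates for the finite time step of the discretization.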
Affiliation(s)
- Giuseppe D'Onofrio
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
- Massimiliano Tamborrino
- Johannes Kepler University Linz, Institute for Stochastics, Altenbergerstraße 69, 4040 Linz, Austria
- Petr Lansky
- Institute of Physiology of the Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
18
Zhang L, Fan D, Wang Q, Baier G. Effects of brain-derived neurotrophic factor and noise on transitions in temporal lobe epilepsy in a hippocampal network. Chaos 2018; 28:106322. [PMID: 30384669 DOI: 10.1063/1.5036690]
Abstract
Brain-derived neurotrophic factor (BDNF) has recently been implicated in the modulation of receptor activation leading to dynamic state transitions in temporal lobe epilepsy (TLE). In addition, the crucial role of neuronal noise in these transitions has been studied in electrophysiological experiments. However, the precise role of these factors during seizure generation in TLE is not known. Building on a previously proposed model of an epileptogenic hippocampal network, we included the actions of BDNF-regulated receptors and intrinsic noise. We found that the effects of both BDNF and noise can increase the activation of N-methyl-D-aspartate receptors, leading to excessive Ca2+ flux, which induces abnormal fast spiking and bursting. Our results indicate that the combined effects have a strong influence on the seizure-generating network, resulting in higher firing frequency and amplitude. As correlations between firing neurons increase, the synchronization of the entire network increases, a marker of the ictogenic transition from normal to seizure-like dynamics. Our work on the effects of BDNF dynamics in a noisy environment might lead to an improved model-based understanding of the pathological mechanisms in TLE.
Affiliation(s)
- Liyuan Zhang
- Department of Dynamics and Control, Beihang University, 100191 Beijing, China
- Denggui Fan
- School of Mathematics and Physics, University of Science and Technology Beijing, 100083 Beijing, China
- Qingyun Wang
- Department of Dynamics and Control, Beihang University, 100191 Beijing, China
- Gerold Baier
- Cell and Developmental Biology, University College London, London WC1E 6BT, United Kingdom
19
Bird AD, Richardson MJE. Transmission of temporally correlated spike trains through synapses with short-term depression. PLoS Comput Biol 2018; 14:e1006232. [PMID: 29933363 PMCID: PMC6039054 DOI: 10.1371/journal.pcbi.1006232]
Abstract
Short-term synaptic depression, caused by depletion of releasable neurotransmitter, modulates the strength of neuronal connections in a history-dependent manner. Quantifying the statistics of synaptic transmission requires stochastic models that link probabilistic neurotransmitter release with presynaptic spike-train statistics. Common approaches are to model the presynaptic spike train as either regular or a memory-less Poisson process: few analytical results are available that describe depressing synapses when the afferent spike train has more complex, temporally correlated statistics such as bursts. Here we present a series of analytical results—from vesicle release-site occupancy statistics, via neurotransmitter release, to the post-synaptic voltage mean and variance—for depressing synapses driven by correlated presynaptic spike trains. The class of presynaptic drive considered is that fully characterised by the inter-spike-interval distribution and encompasses a broad range of models used for neuronal circuit and network analyses, such as integrate-and-fire models with a complete post-spike reset and receiving sufficiently short-time correlated drive. We further demonstrate that the derived post-synaptic voltage mean and variance allow for a simple and accurate approximation of the firing rate of the post-synaptic neuron, using the exponential integrate-and-fire model as an example. These results extend the level of biological detail included in models of synaptic transmission and will allow for the incorporation of more complex and physiologically relevant firing patterns into future studies of neuronal networks. Synapses between neurons transmit signals with strengths that vary with the history of their activity, over scales from milliseconds to decades. Short-term changes in synaptic strength modulate and sculpt ongoing neuronal activity, whereas long-term changes underpin memory formation. 
Here we focus on changes of strength over timescales of less than a second caused by transitory depletion of the neurotransmitters that carry signals across the synapse. Neurotransmitters are stored in small vesicles that release their contents, with a certain probability, when the presynaptic neuron is active. Once a vesicle has been used it is replenished after a variable delay. There is therefore a complex interaction between the pattern of incoming signals to the synapse and the probabilistic release and restock of packaged neurotransmitter. Here we extend existing models to examine how correlated synaptic activity is transmitted through synapses and affects the voltage fluctuations and firing rate of the target neuron. Our results provide a framework that will allow for the inclusion of biophysically realistic synaptic behaviour in studies of neuronal circuits.
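A minimal sketch of the depletion mechanism described above, reduced to a single release site with a fixed release probability and exponentially distributed refill delays. Names and parameter values are ours for illustration; the paper's model is more general (multiple sites, correlated drive).

```python
import math
import random

def transmit(spike_times, p_release=0.5, tau_recovery=0.3, seed=2):
    """Stochastic single-release-site model of short-term depression.

    Each presynaptic spike releases neurotransmitter with probability
    p_release if the site is occupied; an empty site is refilled after an
    exponentially distributed delay with mean tau_recovery (seconds).
    Returns the spike times at which release actually occurred.
    """
    rng = random.Random(seed)
    refill_at = -math.inf  # time at which the site becomes occupied again
    releases = []
    for t in spike_times:
        occupied = t >= refill_at
        if occupied and rng.random() < p_release:
            releases.append(t)
            refill_at = t + rng.expovariate(1.0 / tau_recovery)
    return releases
```

For a dense spike train the site is frequently empty, so the released fraction falls below p_release — the history-dependent depression described above.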
Affiliation(s)
- Alex D. Bird
- Warwick Systems Biology Centre, University of Warwick, Coventry, United Kingdom
- Ernst Strüngmann Institute for Neuroscience, Max Planck Society, Frankfurt, Germany
- Frankfurt Institute for Advanced Studies, Frankfurt, Germany
- Magnus J. E. Richardson
- Warwick Mathematics Institute, University of Warwick, Coventry, United Kingdom
20
D'Onofrio G, Lansky P, Pirozzi E. On two diffusion neuronal models with multiplicative noise: The mean first-passage time properties. Chaos 2018; 28:043103. [PMID: 31906649 DOI: 10.1063/1.5009574]
Abstract
Two diffusion processes with multiplicative noise, able to model the changes in the neuronal membrane depolarization between two consecutive spikes of a single neuron, are considered and compared. The processes have the same deterministic part but different stochastic components. The differences in the state-dependent variabilities, their asymptotic distributions, and the properties of the first-passage time across a constant threshold are investigated. Closed-form expressions for the mean of the first-passage time of both processes are derived and applied to determine the role played by the parameters involved in the model. It is shown that for some values of the input parameters, the higher variability, given by the second moment, does not imply a shorter mean first-passage time. The reason for this can be found in the complete shape of the stationary distribution of the two processes. Applications outside neuroscience are also mentioned.
Affiliation(s)
- G D'Onofrio
- Institute of Physiology, Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
- P Lansky
- Institute of Physiology, Czech Academy of Sciences, Videnska 1083, 14220 Prague 4, Czech Republic
- E Pirozzi
- Dipartimento di Matematica e Applicazioni, University of Napoli Federico II, Via Cintia, 80126 Napoli, Italy
21
Pena RFO, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks. Front Comput Neurosci 2018; 12:9. [PMID: 29551968 PMCID: PMC5840464 DOI: 10.3389/fncom.2018.00009]
Abstract
Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. 
These approximations seem to be justified over a broad range of parameters as indicated by comparison with simulation results of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
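The self-consistency condition described above can be caricatured as a fixed-point iteration on a discretized spectrum: the output spectrum of a representative neuron driven by noise with spectrum S is fed back as the next input spectrum until the two agree. The toy update below is a stand-in contraction with a known fixed point, not the paper's single-neuron simulation; all names are ours.

```python
def iterate_to_self_consistency(update, s0, max_iter=1000, tol=1e-12):
    """Fixed-point iteration S_{k+1} = update(S_k) on a discretized
    spectrum, stopping when successive iterates agree to within `tol`."""
    s = list(s0)
    for _ in range(max_iter):
        s_new = update(s)
        err = max(abs(a - b) for a, b in zip(s, s_new))
        s = s_new
        if err < tol:
            break
    return s

def toy_update(s, target=(1.0, 0.5, 0.25), mix=0.5):
    """Stand-in for the neuron's input-to-output spectral map: a simple
    contraction whose fixed point is `target`. In the actual scheme this
    step is a simulation of one representative neuron per class, driven
    by surrogate noise with the current spectrum."""
    return [mix * t + (1.0 - mix) * v for t, v in zip(target, s)]
```

Averaging several such iterates, as the paper does for strongly recurrent networks, damps oscillatory instabilities of the plain iteration.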
Affiliation(s)
- Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Sebastian Vellmer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Davide Bernardi
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
22
Beiran M, Kruscha A, Benda J, Lindner B. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations. J Comput Neurosci 2017; 44:189-202. [PMID: 29222729 DOI: 10.1007/s10827-017-0674-4]
Abstract
We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: (i) a homogeneous population whose units receive independent noise and (ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons reveal that a non-zero amount of noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can be as beneficial for neuronal populations as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus, such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions for the coherence function in the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
Affiliation(s)
- Manuel Beiran
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Group for Neural Theory, Laboratoire de Neurosciences Cognitives, Département Études Cognitives, École Normale Supérieure, INSERM, PSL Research University, Paris, France
- Alexandra Kruscha
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
- Jan Benda
- Institute for Neurobiology, Eberhard Karls Universität, Tübingen, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Physics Department, Humboldt-Universität zu Berlin, Berlin, Germany
23
Tiwari I, Dave D, Phogat R, Khera N, Parmananda P. An alternate protocol to achieve stochastic and deterministic resonances. Chaos 2017; 27:103112. [PMID: 29092418 DOI: 10.1063/1.4995329]
Abstract
Periodic and aperiodic stochastic resonance (SR) and deterministic resonance (DR) are studied in this paper. To check for the ubiquity of the phenomena, two unrelated systems, namely, FitzHugh-Nagumo and a particle in a bistable potential well, are studied. Instead of the conventional scenario of varying the noise amplitude (in the case of SR) or the chaotic signal amplitude (in the case of DR), a tunable system parameter ("a" in the case of the FitzHugh-Nagumo model and the damping coefficient "j" in the bistable model) is regulated. The operating values of these parameters are defined as the "setpoint" of the system throughout the present work. Our results indicate that there exists an optimal value of the setpoint for which maximum information transfer between the input and the output signals takes place. This information transfer from the input sub-threshold signal to the output dynamics is quantified by the normalised cross-correlation coefficient (|CCC|). |CCC| as a function of the setpoint exhibits a unimodal variation which is characteristic of SR (or DR). Furthermore, |CCC| is computed for a grid of noise (or chaotic signal) amplitude and setpoint values. The heat map of |CCC| over this grid yields the presence of a resonance region in the noise-setpoint plane for which the maximum enhancement of the input sub-threshold signal is observed. This resonance region could possibly be used to explain how organisms maintain their signal detection efficacy with fluctuating amounts of noise present in their environment. Interestingly, the method of regulating the setpoint without changing the noise amplitude was not able to induce coherence resonance (CR). A possible, qualitative reasoning for this is provided.
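The normalised cross-correlation coefficient used above can be computed as the absolute Pearson correlation between the sub-threshold input signal and the system's output; a plain-Python version (our notation, not the paper's code):

```python
import math

def abs_cross_correlation(x, y):
    """|CCC|: magnitude of the normalised (Pearson) cross-correlation
    between an input signal x and an output signal y of equal length."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return abs(cov) / math.sqrt(var_x * var_y)
```

Sweeping the setpoint (or a noise-amplitude/setpoint grid) and plotting this quantity reproduces the unimodal resonance curves and heat maps described above.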
Affiliation(s)
- Ishant Tiwari
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- Darshil Dave
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- Richa Phogat
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- Neev Khera
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- P Parmananda
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
24
Droste F, Lindner B. Exact analytical results for integrate-and-fire neurons driven by excitatory shot noise. J Comput Neurosci 2017; 43:81-91. [PMID: 28585050 DOI: 10.1007/s10827-017-0649-5]
Abstract
A neuron receives input from other neurons via electrical pulses, so-called spikes. The pulse-like nature of the input is frequently neglected in analytical studies; instead, the input is usually approximated as Gaussian. Recent experimental studies have shown, however, that an assumption underlying this approximation is often not met: individual presynaptic spikes can have a significant effect on a neuron's dynamics. It is thus desirable to explicitly account for the pulse-like nature of neural input, i.e., to consider neurons driven by shot noise, a long-standing problem that is mathematically challenging. In this work, we exploit the fact that excitatory shot noise with exponentially distributed weights can be obtained as a limit case of dichotomous noise, a Markovian two-state process. This allows us to obtain novel exact expressions for the stationary voltage density and the moments of the interspike-interval density of general integrate-and-fire neurons driven by such an input. For the special case of leaky integrate-and-fire neurons, we also give expressions for the power spectrum and the linear response to a signal. We verify and illustrate our expressions by comparison to simulations of leaky, quadratic, and exponential integrate-and-fire neurons.
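A simulation sketch of the setting analyzed above: a leaky integrate-and-fire neuron driven by excitatory Poisson shot noise with exponentially distributed jump amplitudes. Parameter names, values, and units are illustrative assumptions, not the paper's.

```python
import random

def lif_shot_noise(rate_in=3.0, mean_weight=0.3, v_th=1.0, v_reset=0.0,
                   tau_m=1.0, dt=1e-3, t_max=200.0, seed=3):
    """Leaky IF neuron driven by excitatory Poisson shot noise.

    Input spikes arrive at rate rate_in; each adds an exponentially
    distributed jump (mean mean_weight) to the voltage, which otherwise
    decays with time constant tau_m. Returns the output spike times.
    """
    rng = random.Random(seed)
    v, t = v_reset, 0.0
    next_input = rng.expovariate(rate_in)  # time of the next input spike
    spikes = []
    while t < t_max:
        v += -v / tau_m * dt               # leak between input events
        t += dt
        while next_input <= t:             # apply all jumps due by now
            v += rng.expovariate(1.0 / mean_weight)
            next_input += rng.expovariate(rate_in)
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return spikes
```

Histograms of interspike intervals from such runs are what the paper's exact moment and spectral expressions can be checked against.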
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstr 15, 12489, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstr 15, 12489, Berlin, Germany
25
Exact firing time statistics of neurons driven by discrete inhibitory noise. Sci Rep 2017; 7:1577. [PMID: 28484244 PMCID: PMC5431561 DOI: 10.1038/s41598-017-01658-8]
Abstract
Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory presynaptic neurons, which determines the firing activity of the stimulated neuron. In order to investigate the influence of inhibitory stimulation on the firing time statistics, we consider leaky integrate-and-fire neurons subject to inhibitory instantaneous postsynaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation, and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude but apply as well to finite-amplitude postsynaptic potentials, thus being able to capture the effect of rare and large spikes. The developed methods also reproduce the average firing properties of heterogeneous neuronal populations.
26
Synchronous spikes are necessary but not sufficient for a synchrony code in populations of spiking neurons. Proc Natl Acad Sci U S A 2017; 114:E1977-E1985. [PMID: 28202729 DOI: 10.1073/pnas.1615561114]
Abstract
Synchronous activity in populations of neurons potentially encodes special stimulus features. Selective readout of either synchronous or asynchronous activity allows the formation of two streams of information processing. Theoretical work predicts that such a synchrony code is a fundamental feature of populations of spiking neurons if they operate in specific noise and stimulus regimes. Here we experimentally test the theoretical predictions by quantifying and comparing neuronal response properties in tuberous and ampullary electroreceptor afferents of the weakly electric fish Apteronotus leptorhynchus. These related systems show similar levels of synchronous activity, but a synchrony code is established only in the more irregularly firing tuberous afferents, not in the more regularly firing ampullary afferents. The mere existence of synchronous activity is thus not sufficient for a synchrony code. Single-cell features such as the irregularity of spiking and the frequency dependence of the neuron's transfer function determine whether synchronous spikes possess a distinct meaning for the encoding of time-dependent signals.
27
Droste F, Lindner B. Exact results for power spectrum and susceptibility of a leaky integrate-and-fire neuron with two-state noise. Phys Rev E 2017; 95:012411. [PMID: 28208429 DOI: 10.1103/physreve.95.012411]
Abstract
The response properties of excitable systems driven by colored noise are of great interest but are usually mathematically accessible only via approximations. For this reason, dichotomous noise, a rare example of a colored noise often leading to analytically tractable problems, has been extensively used in the study of stochastic systems. Here, we calculate exact expressions for the power spectrum and the susceptibility of a leaky integrate-and-fire neuron driven by asymmetric dichotomous noise. While our results are in excellent agreement with simulations, they also highlight a limitation of using dichotomous noise as a simple model for more complex fluctuations: both the power spectrum and the susceptibility exhibit an undamped periodic structure, the origin of which we discuss in detail.
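The asymmetric dichotomous (telegraph) noise used above is a two-state Markov process; a minimal generator is sketched below. Rates, amplitudes, and the time step are illustrative choices of ours.

```python
import random

def dichotomous_noise(n_steps, dt=1e-3, k_plus=5.0, k_minus=2.0,
                      a_plus=1.0, a_minus=-0.5, seed=4):
    """Asymmetric two-state Markov (telegraph) noise on a time grid.

    The process leaves the plus state (value a_plus) at rate k_plus and
    the minus state (value a_minus) at rate k_minus, so its stationary
    probability of being in the plus state is k_minus / (k_plus + k_minus).
    """
    rng = random.Random(seed)
    state = a_plus
    out = []
    for _ in range(n_steps):
        rate = k_plus if state == a_plus else k_minus
        if rng.random() < rate * dt:  # switching probability per step
            state = a_minus if state == a_plus else a_plus
        out.append(state)
    return out
```

Feeding such a trace as the input current of a leaky integrate-and-fire simulation gives the driven system whose spectra the paper computes exactly.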
Affiliation(s)
- Felix Droste
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Haus 2, Philippstrasse 13, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
28
Weak electric fields detectability in a noisy neural network. Cogn Neurodyn 2016; 11:81-90. [PMID: 28174614 DOI: 10.1007/s11571-016-9409-x]
Abstract
We systematically investigate the detectability of a weak electric field in a noisy neural network based on the Izhikevich neuron model. The neural network is composed of excitatory and inhibitory neurons with a ratio similar to that in the mammalian neocortex, and the axonal conduction delays between neurons are also considered. It is found that the noise intensity can modulate the detectability of the weak electric field. A stochastic resonance (SR) phenomenon induced by white noise is observed when the weak electric field is added to the network. Interestingly, SR almost disappears when the connections between neurons are removed, suggesting the amplifying effect of neural coupling on the synchronization of neuronal spiking. Furthermore, network parameters such as the connection probability, the synaptic coupling strength, the scale of the neuron population, and the neuron heterogeneity can also affect the detectability of the weak electric field. Finally, the model sensitivity is studied in detail, and the results show that the neural network model has an optimal region for the detectability of the weak electric field signal.
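The Izhikevich model underlying the network above has a compact two-variable update. The sketch below uses the standard regular-spiking parameters with a constant driving current purely for illustration; the network coupling, delays, noise, and field term of the paper are omitted.

```python
def izhikevich_step(v, u, i_ext, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5):
    """One Euler step (dt in ms) of the Izhikevich neuron:
    dv/dt = 0.04 v^2 + 5 v + 140 - u + I,  du/dt = a (b v - u),
    with the reset v -> c, u -> u + d after a spike (v >= 30 mV)."""
    fired = v >= 30.0
    if fired:
        v, u = c, u + d
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
    u += dt * a * (b * v - u)
    return v, u, fired

def count_spikes(i_ext=10.0, n_steps=2000):
    """Count spikes of a single neuron under constant drive (illustrative)."""
    v, u = -65.0, -13.0  # start at rest, u = b * v
    n = 0
    for _ in range(n_steps):
        v, u, fired = izhikevich_step(v, u, i_ext)
        n += fired
    return n
```

In the network setting each neuron would additionally receive synaptic input, noise, and the weak periodic field through `i_ext`.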
29
Levakova M, Tamborrino M, Kostal L, Lansky P. Presynaptic Spontaneous Activity Enhances the Accuracy of Latency Coding. Neural Comput 2016; 28:2162-80. [PMID: 27557098 DOI: 10.1162/neco_a_00880]
Abstract
The time to the first spike after stimulus onset typically varies with the stimulation intensity. Experimental evidence suggests that neural systems use such response latency to encode information about the stimulus. We investigate the decoding accuracy of the latency code in relation to the level of noise in the form of presynaptic spontaneous activity. Paradoxically, the optimal performance is achieved at a nonzero level of noise and suprathreshold stimulus intensities. We argue that this phenomenon results from the influence of the spontaneous activity on the stabilization of the membrane potential in the absence of stimulation. The reported decoding accuracy improvement represents a novel manifestation of the noise-aided signal enhancement.
Affiliation(s)
- Marie Levakova
- Institute of Physiology of the Czech Academy of Sciences, 142 20 Prague 4, Czech Republic
- Lubomir Kostal
- Institute of Physiology of the Czech Academy of Sciences, Prague 4, Czech Republic
- Petr Lansky
- Institute of Physiology of the Czech Academy of Sciences, Prague 4, Czech Republic
30
Tiwari I, Phogat R, Parmananda P, Ocampo-Espindola JL, Rivera M. Intrinsic periodic and aperiodic stochastic resonance in an electrochemical cell. Phys Rev E 2016; 94:022210. [PMID: 27627301 DOI: 10.1103/physreve.94.022210]
Abstract
In this paper we show the interaction of a composite of a periodic or aperiodic signal and intrinsic electrochemical noise with the nonlinear dynamics of an electrochemical cell configured to study the corrosion of iron in an acidic medium. The anodic voltage setpoint (V_{0}) in the cell is chosen such that the anodic current (I) exhibits excitable fixed-point behavior in the absence of noise. The subthreshold periodic (aperiodic) signal consists of a train of rectangular pulses with a fixed amplitude and width, separated by regular (irregular) time intervals. The irregular time intervals chosen are of deterministic and stochastic origins. The amplitude of the intrinsic internal noise, regulated by the concentration of chloride ions, is then monotonically increased, and the provoked dynamics are analyzed. The curves of the signal-to-noise ratio and the cross-correlation coefficient versus the chloride ion concentration have a unimodal shape, indicating the emergence of an intrinsic periodic or aperiodic stochastic resonance. The abscissae of the maxima of these unimodal curves correspond to the optimum value of intrinsic noise where maximum regularity of the invoked dynamics is observed. In the particular case of the intrinsic periodic stochastic resonance, scanning electron microscope images of the electrode metal surfaces are shown for certain values of chloride ion concentration. These images qualitatively corroborate the emergence of order as a result of the interaction between the nonlinear dynamics and the composite signal.
Affiliation(s)
- Ishant Tiwari
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- Richa Phogat
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- P Parmananda
- Department of Physics, Indian Institute of Technology, Bombay, Powai, Mumbai-400 076, India
- J L Ocampo-Espindola
- Centro de Investigación en Ciencias-(IICBA), UAEM, Avenida Universidad 1001, 62209 Cuernavaca, Morelos, México
- M Rivera
- Centro de Investigación en Ciencias-(IICBA), UAEM, Avenida Universidad 1001, 62209 Cuernavaca, Morelos, México
31
Kruscha A, Lindner B. Partial synchronous output of a neuronal population under weak common noise: Analytical approaches to the correlation statistics. Phys Rev E 2016; 94:022422. [PMID: 27627347 DOI: 10.1103/physreve.94.022422]
Abstract
We consider a homogeneous population of stochastic neurons that are driven by weak common noise (stimulus). To capture and analyze the joint firing events within the population, we introduce the partial synchronous output of the population. This is a time series defined by the events in which at least a fixed fraction γ∈[0,1] of the population fires simultaneously within a small time interval. For this partial synchronous output we develop two analytical approaches to the correlation statistics. In the Gaussian approach we represent the synchronous output as a nonlinear transformation of the summed population activity and approximate the latter by a Gaussian process. In the combinatorial approach the synchronous output is represented by products of box-filtered spike trains of the single neurons. In both approaches we use linear-response theory to derive approximations for statistical measures that hold true for weak common noise. In particular, we calculate the mean value and power spectrum of the synchronous output and the cross-spectrum between the synchronous output and the common noise. We apply our results to the leaky integrate-and-fire neuron model and compare them to numerical simulations. The combinatorial approach is shown to provide a more accurate description of the statistics for small populations, whereas the Gaussian approximation yields compact formulas that work well for a sufficiently large population size. In particular, in the Gaussian approximation all statistical measures reveal a symmetry in the synchrony threshold γ around the mean value of the population activity. Our results may contribute to a better understanding of the role of coincidence detection in neural signal processing.
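The partial synchronous output defined above can be extracted from a set of spike trains by thresholding the per-bin population count; the bin width and edge handling below are our choices for illustration.

```python
def partial_synchronous_output(spike_trains, gamma, bin_width, t_max):
    """Binary time series of partial-synchrony events.

    A bin gets the value 1 if at least a fraction `gamma` of the neurons
    in `spike_trains` (list of sorted spike-time lists) fire within it;
    each neuron is counted at most once per bin.
    """
    n_bins = round(t_max / bin_width)  # round to avoid float truncation
    counts = [0] * n_bins
    for train in spike_trains:
        seen = set()
        for t in train:
            b = int(t / bin_width)
            if 0 <= b < n_bins and b not in seen:
                seen.add(b)
                counts[b] += 1
    threshold = gamma * len(spike_trains)
    return [1 if c >= threshold else 0 for c in counts]
```

The power spectrum of this binary series, and its cross-spectrum with the common stimulus, are the quantities the two analytical approaches approximate.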
Collapse
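The partial synchronous output defined in this abstract can be illustrated with a minimal numerical sketch (not code from the paper; the binning scheme and the toy spike data are assumptions): bin the population activity and set the synchrony series to 1 wherever at least a fraction γ of neurons fire in the same bin.

```python
import numpy as np

def partial_synchronous_output(spikes, gamma):
    """Binary synchrony time series: 1 in every time bin where at
    least a fraction gamma of the population fires, else 0.
    spikes: (N_neurons, N_bins) binary array of binned spike trains."""
    active_fraction = spikes.mean(axis=0)  # fraction of neurons firing per bin
    return (active_fraction >= gamma).astype(int)

# toy example: 4 neurons, 5 time bins
spikes = np.array([[1, 0, 1, 0, 0],
                   [1, 0, 0, 0, 1],
                   [0, 0, 1, 0, 1],
                   [1, 0, 1, 0, 0]])
sync = partial_synchronous_output(spikes, gamma=0.75)
# only bins 0 and 2 reach the 75% synchrony threshold
```

The power spectrum and cross-spectrum discussed in the abstract would then be computed from this 0/1 series rather than from the summed activity itself.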
Affiliation(s)
- Alexandra Kruscha
- Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany and Institute for Physics, Humboldt-Universität zu Berlin, Berlin, 12489, Germany
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany and Institute for Physics, Humboldt-Universität zu Berlin, Berlin, 12489, Germany
| |
Collapse
|
32
|
|
33
|
Surace SC, Pfister JP. A Statistical Model for In Vivo Neuronal Dynamics. PLoS One 2015; 10:e0142435. [PMID: 26571371 PMCID: PMC4646699 DOI: 10.1371/journal.pone.0142435] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2015] [Accepted: 10/21/2015] [Indexed: 11/19/2022] Open
Abstract
Single neuron models have a long tradition in computational neuroscience. Detailed biophysical models such as the Hodgkin-Huxley model, as well as simplified neuron models such as the class of integrate-and-fire models, relate the input current to the membrane potential of the neuron. These models have been extensively fitted to in vitro data, where the input current is controlled. They are, however, of little use when it comes to characterizing intracellular in vivo recordings, since the input to the neuron is not known. Here we propose a novel single neuron model that characterizes the statistical properties of in vivo recordings. More specifically, we propose a stochastic process in which the subthreshold membrane potential follows a Gaussian process and the spike emission intensity depends nonlinearly on the membrane potential as well as on the spiking history. We first show that the model has a rich dynamical repertoire, since it can capture arbitrary subthreshold autocovariance functions, firing-rate adaptation, as well as arbitrary shapes of the action potential. We then show that this model can be efficiently fitted to data without overfitting. We finally show that this model can be used to characterize and therefore precisely compare various intracellular in vivo recordings from different animals and experimental conditions.
Collapse
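A toy instance of this model class can be sketched as follows (illustrative only, not the authors' model or code: the Ornstein-Uhlenbeck choice of Gaussian process, the exponential intensity, and all parameter values are assumptions, and the dependence on spiking history is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_spikes(T=20.0, dt=0.001, tau=0.02, sigma=0.5,
                  lam0=5.0, beta=2.0):
    """Gaussian-process (here: Ornstein-Uhlenbeck) subthreshold voltage
    plus a spike intensity depending nonlinearly (exponentially) on the
    voltage; spikes are drawn by a Bernoulli approximation per time step."""
    n = int(T / dt)
    v = np.zeros(n)
    for i in range(1, n):
        v[i] = v[i - 1] - v[i - 1] / tau * dt \
            + sigma * np.sqrt(2 * dt / tau) * rng.standard_normal()
    lam = lam0 * np.exp(beta * v)                        # conditional intensity
    spikes = rng.random(n) < np.minimum(lam * dt, 1.0)   # Bernoulli thinning
    return v, spikes

v, spikes = sample_spikes()
```

Fitting the real model to data would additionally require the history-dependent intensity term and a likelihood-based estimation procedure, neither of which is sketched here.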
Affiliation(s)
- Simone Carlo Surace
- Department of Physiology, University of Bern, Bern, Switzerland
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- * E-mail:
| | - Jean-Pascal Pfister
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
| |
Collapse
|
34
|
Blankenburg S, Wu W, Lindner B, Schreiber S. Information filtering in resonant neurons. J Comput Neurosci 2015; 39:349-70. [DOI: 10.1007/s10827-015-0580-6] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/26/2015] [Revised: 09/23/2015] [Accepted: 09/29/2015] [Indexed: 10/22/2022]
|
35
|
Kim JH, Lee HJ, Min CH, Lee KJ. Coherence resonance in bursting neural networks. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 92:042701. [PMID: 26565266 DOI: 10.1103/physreve.92.042701] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/26/2015] [Indexed: 06/05/2023]
Abstract
Synchronized neural bursts are one of the most noticeable dynamic features of neural networks, being essential for various phenomena in neuroscience, yet their complex dynamics are not well understood. With extrinsic electrical and optical manipulations on cultured neural networks, we demonstrate that the regularity (or randomness) of burst sequences is in many cases determined by a (few) low-dimensional attractor(s) working under strong neural noise. Moreover, there is an optimal level of noise strength at which the regularity of the interburst interval sequence becomes maximal, a phenomenon of coherence resonance. The experimental observations are successfully reproduced through computer simulations on a well-established neural network model, suggesting that the same phenomena may occur in many in vivo as well as in vitro neural networks.
Collapse
Affiliation(s)
- June Hoan Kim
- Department of Physics, Korea University, Seoul 136-713, Korea
| | - Ho Jun Lee
- Department of Physics, Korea University, Seoul 136-713, Korea
| | - Cheol Hong Min
- Department of Physics, Korea University, Seoul 136-713, Korea
| | - Kyoung J Lee
- Department of Physics, Korea University, Seoul 136-713, Korea
| |
Collapse
|
36
|
Amro RM, Lindner B, Neiman AB. Phase Diffusion in Unequally Noisy Coupled Oscillators. PHYSICAL REVIEW LETTERS 2015; 115:034101. [PMID: 26230796 DOI: 10.1103/physrevlett.115.034101] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/28/2015] [Indexed: 06/04/2023]
Abstract
We consider the dynamics of two directionally coupled unequally noisy oscillators, the first oscillator being noisier than the second oscillator. We derive analytically the phase diffusion coefficient of both oscillators in a heterogeneous setup (different frequencies, coupling coefficients, and intrinsic noise intensities) and show that the phase coherence of the second oscillator depends in a nonmonotonic fashion on the noise intensity of the first oscillator: as the first oscillator becomes less coherent, i.e., worse, the second one becomes more coherent, i.e., better. This surprising effect is related to the statistics of the first oscillator which provides a source of noise for the second oscillator, that is non-Gaussian, bounded, and possesses a finite bandwidth. We verify that the effect is robust by numerical simulations of two coupled FitzHugh-Nagumo models.
Collapse
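The directionally coupled setup described above can be mimicked numerically (a sketch under assumed dynamics: two phase oscillators with one-way sine coupling and Euler-Maruyama integration; all parameter values are arbitrary, and this is not the authors' analytical derivation):

```python
import numpy as np

rng = np.random.default_rng(0)

def phase_diffusion(D1, D2=0.01, w1=1.0, w2=1.0, kappa=0.5,
                    T=200.0, dt=0.01, trials=200):
    """Simulate two directionally coupled noisy phase oscillators
    (oscillator 1 drives oscillator 2) and return the effective phase
    diffusion coefficient Var[phi2(T)]/(2T) of the driven oscillator."""
    n = int(T / dt)
    phi1 = np.zeros(trials)
    phi2 = np.zeros(trials)
    s1 = np.sqrt(2 * D1 * dt)
    s2 = np.sqrt(2 * D2 * dt)
    for _ in range(n):
        dphi1 = w1 * dt + s1 * rng.standard_normal(trials)
        phi2 += (w2 + kappa * np.sin(phi1 - phi2)) * dt \
            + s2 * rng.standard_normal(trials)
        phi1 += dphi1          # update after phi2 uses the old value
    return phi2.var() / (2 * T)
```

Scanning `D1` over several values would expose any nonmonotonic dependence of the second oscillator's coherence on the first oscillator's noise intensity, which is the effect reported in the paper.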
Affiliation(s)
- Rami M Amro
- Department of Physics and Astronomy, Ohio University, Athens, Ohio 45701, USA
| | - Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstraße 13, Haus 2, 10115 Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| | - Alexander B Neiman
- Department of Physics and Astronomy, Ohio University, Athens, Ohio 45701, USA
| |
Collapse
|
37
|
da Silva LA, Vilela RD. Colored noise and memory effects on formal spiking neuron models. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 91:062702. [PMID: 26172731 DOI: 10.1103/physreve.91.062702] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/25/2015] [Indexed: 06/04/2023]
Abstract
Simplified neuronal models capture the essence of the electrical activity of a generic neuron and are more tractable from the computational point of view than higher-dimensional models such as the Hodgkin-Huxley one. In this work, we propose a generalized resonate-and-fire model described by a generalized Langevin equation that takes into account memory effects and colored noise. We perform a comprehensive numerical analysis to study the dynamics and the point process statistics of the proposed model, highlighting interesting new features such as (i) nonmonotonic behavior (emergence of peak structures, enhanced by the choice of colored noise characteristic time scale) of the coefficient of variation (CV) as a function of the memory characteristic time scale, (ii) colored-noise-induced shift in the CV, and (iii) emergence and suppression of multimodality in the interspike interval (ISI) distribution due to memory-induced subthreshold oscillations. Moreover, in the noise-induced spike regime, we study how memory and colored noise affect the coherence resonance (CR) phenomenon. We find that for sufficiently long memory, not only is CR suppressed but also the minimum of the CV-versus-noise-intensity curve that characterizes the presence of CR may be replaced by a maximum. These features allow us to interpret the interplay between memory and colored noise as an effective control mechanism for neuronal variability. Since both variability and nontrivial temporal patterns in the ISI distribution are ubiquitous in biological cells, we hope the present model can be useful in modeling real aspects of neurons.
Collapse
Affiliation(s)
- L A da Silva
- Centro de Matemática, Computação e Cognição, UFABC, Santo André-SP, Brazil
| | - R D Vilela
- Centro de Matemática, Computação e Cognição, UFABC, Santo André-SP, Brazil
| |
Collapse
|
38
|
Braun W, Matthews PC, Thul R. First-passage times in integrate-and-fire neurons with stochastic thresholds. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2015; 91:052701. [PMID: 26066193 DOI: 10.1103/physreve.91.052701] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/10/2014] [Indexed: 06/04/2023]
Abstract
We consider a leaky integrate-and-fire neuron with deterministic subthreshold dynamics and a firing threshold that evolves as an Ornstein-Uhlenbeck process. The formulation of this minimal model is motivated by the experimentally observed widespread variation of neural firing thresholds. We show numerically that the mean first-passage time can depend nonmonotonically on the noise amplitude. For sufficiently large values of the correlation time of the stochastic threshold the mean first-passage time is maximal for nonvanishing noise. We provide an explanation for this effect by analytically transforming the original model into a first-passage-time problem for Brownian motion. This transformation also allows for a perturbative calculation of the first-passage-time histograms. In turn this provides quantitative insights into the mechanisms that lead to the nonmonotonic behavior of the mean first-passage time. The perturbation expansion is in excellent agreement with direct numerical simulations. The approach developed here can be applied to any deterministic subthreshold dynamics and any Gauss-Markov processes for the firing threshold. This opens up the possibility of incorporating biophysically detailed components into the subthreshold dynamics, rendering our approach a powerful framework that sits between traditional integrate-and-fire models and complex mechanistic descriptions of neural dynamics.
Collapse
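A direct simulation of the minimal model can be sketched as follows (illustrative assumptions throughout: the subthreshold dynamics v' = mu - v, the threshold parameters, and the time step are chosen for demonstration and are not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_fpt(sigma, mu=1.5, theta0=1.0, tau=5.0,
             dt=0.001, t_max=50.0, trials=500):
    """Mean first-passage time of a deterministic leaky integrator
    v' = mu - v (started at v = 0) to an Ornstein-Uhlenbeck threshold
    theta(t) with mean theta0, correlation time tau, and amplitude sigma."""
    n = int(t_max / dt)
    fpts = np.full(trials, t_max)           # censored at t_max
    v = np.zeros(trials)
    theta = np.full(trials, theta0)
    alive = np.ones(trials, dtype=bool)
    for i in range(n):
        v[alive] += (mu - v[alive]) * dt
        theta[alive] += -(theta[alive] - theta0) / tau * dt \
            + sigma * np.sqrt(2 * dt / tau) * rng.standard_normal(alive.sum())
        crossed = alive & (v >= theta)
        fpts[crossed] = (i + 1) * dt
        alive &= ~crossed
        if not alive.any():
            break
    return fpts.mean()
```

With `sigma = 0` the threshold stays at `theta0` and the passage time reduces to the deterministic value ln(mu/(mu - theta0)) ≈ 1.10 for these defaults; scanning `sigma` would probe the nonmonotonic dependence reported in the abstract.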
Affiliation(s)
- Wilhelm Braun
- School of Mathematical Sciences and Centre for Mathematical Medicine and Biology, University of Nottingham, University Park, Nottingham, NG7 2RD, United Kingdom
| | - Paul C Matthews
- School of Mathematical Sciences and Centre for Mathematical Medicine and Biology, University of Nottingham, University Park, Nottingham, NG7 2RD, United Kingdom
| | - Rüdiger Thul
- School of Mathematical Sciences and Centre for Mathematical Medicine and Biology, University of Nottingham, University Park, Nottingham, NG7 2RD, United Kingdom
| |
Collapse
|
39
|
McDonnell MD, Iannella N, To MS, Tuckwell HC, Jost J, Gutkin BS, Ward LM. A review of methods for identifying stochastic resonance in simulations of single neuron models. NETWORK (BRISTOL, ENGLAND) 2015; 26:35-71. [PMID: 25760433 DOI: 10.3109/0954898x.2014.990064] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/04/2023]
Abstract
Stochastic resonance (SR) is said to be observed when the presence of noise in a nonlinear system enables an output signal from the system to better represent some feature of an input signal than it does in the absence of noise. The effect has been observed in models of individual neurons, and in experiments performed on real neural systems. Despite the ubiquity of biophysical sources of stochastic noise in the nervous system, however, it has not yet been established whether neuronal computation mechanisms involved in performance of specific functions such as perception or learning might exploit such noise as an integral component, such that removal of the noise would diminish performance of these functions. In this paper we revisit the methods used to demonstrate stochastic resonance in models of single neurons. This includes a previously unreported observation in a multicompartmental model of a CA1-pyramidal cell. We also discuss, as a contrast to these classical studies, a form of 'stochastic facilitation', known as inverse stochastic resonance. We draw on the reviewed examples to argue why new approaches to studying 'stochastic facilitation' in neural systems need to be developed.
Collapse
Affiliation(s)
- Mark D McDonnell
- Computational and Theoretical Neuroscience Laboratory, Institute for Telecommunications Research, University of South Australia , Mawson Lakes, SA , Australia
| | | | | | | | | | | | | |
Collapse
|
40
|
Tsigkri-DeSmedt N, Hizanidis J, Hövel P, Provata A. Multi-chimera States in the Leaky Integrate-and-Fire Model. ACTA ACUST UNITED AC 2015. [DOI: 10.1016/j.procs.2015.11.004] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
|
41
|
Dummer B, Wieland S, Lindner B. Self-consistent determination of the spike-train power spectrum in a neural network with sparse connectivity. Front Comput Neurosci 2014; 8:104. [PMID: 25278869 PMCID: PMC4166962 DOI: 10.3389/fncom.2014.00104] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2014] [Accepted: 08/13/2014] [Indexed: 11/13/2022] Open
Abstract
A major source of random variability in cortical networks is the quasi-random arrival of presynaptic action potentials from many other cells. In network studies as well as in the study of the response properties of single cells embedded in a network, synaptic background input is often approximated by Poissonian spike trains. However, the output statistics of the cells are in most cases far from Poisson. This is inconsistent with the assumption of similar spike-train statistics for pre- and postsynaptic cells in a recurrent network. Here we tackle this problem for the popular class of integrate-and-fire neurons and study self-consistent statistics of input and output spectra of neural spike trains. Instead of actually using a large network, we use an iterative scheme, in which we simulate a single neuron over several generations. In each of these generations, the neuron is stimulated with surrogate stochastic input whose statistics are similar to those of the output of the previous generation. For the surrogate input, we employ two distinct approximations: (i) a superposition of renewal spike trains with the same interspike interval density as observed in the previous generation and (ii) a Gaussian current with a power spectrum proportional to that observed in the previous generation. For input parameters that correspond to balanced input in the network, both the renewal and the Gaussian iteration procedures converge quickly and yield comparable results for the self-consistent spike-train power spectrum. We compare our results to large-scale simulations of a random sparsely connected network of leaky integrate-and-fire neurons (Brunel, 2000) and show that in the asynchronous regime close to a state of balanced synaptic input from the network, our iterative schemes provide an excellent approximation to the autocorrelation of spike trains in the recurrent network.
Collapse
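The Gaussian variant of the iterative scheme can be sketched roughly as below (a simplified illustration, not the authors' code: the LIF parameters, the noise-intensity normalization, and the handling of the DC bin are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def lif_spike_train(noise, mu=1.2, dt=0.01, v_th=1.0, v_r=0.0):
    """Leaky integrate-and-fire neuron driven by a given noise trace;
    returns a binned 0/1 spike train."""
    v = 0.0
    spikes = np.zeros(noise.size)
    for i, xi in enumerate(noise):
        v += (mu - v) * dt + xi * np.sqrt(dt)
        if v >= v_th:
            spikes[i] = 1.0
            v = v_r
    return spikes

def gaussian_iteration(generations=4, n=2 ** 14, D=0.2):
    """In each generation, drive the neuron with Gaussian surrogate
    noise whose power spectrum is proportional to the previous
    generation's output spike-train spectrum (self-consistent loop)."""
    spec = np.ones(n // 2 + 1)                   # generation 0: white noise
    for _ in range(generations):
        W = np.fft.rfft(rng.standard_normal(n))
        noise = np.fft.irfft(W * np.sqrt(spec), n)   # colored surrogate
        noise *= np.sqrt(2 * D) / noise.std()        # fix noise intensity
        s = lif_spike_train(noise)
        s = s - s.mean()                             # centered spike train
        spec = np.abs(np.fft.rfft(s)) ** 2 / n       # new output spectrum
        spec[0] = spec[1]                            # avoid zero-power DC bin
    return spec

spec = gaussian_iteration()
```

After a few generations the input and output spectra stop changing appreciably, which is the convergence behavior the abstract describes for balanced-input parameters.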
Affiliation(s)
- Benjamin Dummer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience Berlin, Germany ; Department of Physics, Humboldt Universität zu Berlin Berlin, Germany
| | - Stefan Wieland
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience Berlin, Germany ; Department of Physics, Humboldt Universität zu Berlin Berlin, Germany
| | - Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience Berlin, Germany ; Department of Physics, Humboldt Universität zu Berlin Berlin, Germany
| |
Collapse
|
42
|
Kromer JA, Lindner B, Schimansky-Geier L. Event-triggered feedback in noise-driven phase oscillators. PHYSICAL REVIEW. E, STATISTICAL, NONLINEAR, AND SOFT MATTER PHYSICS 2014; 89:032138. [PMID: 24730820 DOI: 10.1103/physreve.89.032138] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/05/2013] [Indexed: 06/03/2023]
Abstract
Using a stochastic nonlinear phase oscillator model, we study the effect of event-triggered feedback on the statistics of interevent intervals. Events are associated with the start of a new cycle. The feedback is modeled by an instantaneous increase (positive feedback) or decrease (negative feedback) of the oscillator frequency whenever an event occurs, followed by an exponential decay on a slow time scale. In addition to the known excitable and oscillatory regimes, which are separated by a saddle-node on invariant circle bifurcation, positive feedback can lead to bistable dynamics and a change of the system's excitability. The feedback also has a strong effect on noise-induced phenomena like coherence resonance or anticoherence resonance. Both positive and negative feedback can lead to more regular output for particular noise strengths. Finally, we investigate serial correlations in the sequence of interevent intervals that occur due to the additional slow dynamics. We derive approximations for the serial correlation coefficient and show that positive feedback results in extended positive interval correlations, whereas negative feedback yields short-ranging negative correlations. Investigating the interplay of feedback and the nonlinear phase dynamics close to the bifurcation, we find that correlations are most pronounced for optimal feedback strengths.
Collapse
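The feedback mechanism described above can be simulated with a short sketch (assumed dynamics: a constant-frequency noisy phase with an additive frequency offset that is kicked at each cycle completion and decays exponentially; parameters are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

def interevent_intervals(eps, omega=1.0, tau=10.0, D=0.05,
                         dt=0.001, n_events=200):
    """Noisy phase oscillator with event-triggered feedback: each
    completed cycle kicks the frequency offset a by eps (positive or
    negative feedback), which then decays with time constant tau."""
    phi, a, t, last = 0.0, 0.0, 0.0, 0.0
    noise_amp = np.sqrt(2 * D * dt)
    intervals = []
    while len(intervals) < n_events:
        phi += (omega + a) * dt + noise_amp * rng.standard_normal()
        a -= a / tau * dt
        t += dt
        if phi >= 2 * np.pi:          # event: a new cycle begins
            phi -= 2 * np.pi
            a += eps                  # event-triggered feedback kick
            intervals.append(t - last)
            last = t
    return np.array(intervals)
```

The serial correlation coefficient studied in the paper could then be estimated from the interval sequence, e.g. via `np.corrcoef(iv[:-1], iv[1:])[0, 1]` for lag one.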
Affiliation(s)
- Justus A Kromer
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany
| | - Benjamin Lindner
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| | - Lutz Schimansky-Geier
- Department of Physics, Humboldt-Universität zu Berlin, Newtonstrasse 15, 12489 Berlin, Germany and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
| |
Collapse
|
43
|
Xie J, Wang Z. Effect of inhibitory feedback on correlated firing of spiking neural network. Cogn Neurodyn 2014; 7:325-31. [PMID: 24427208 DOI: 10.1007/s11571-013-9241-5] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2012] [Revised: 12/20/2012] [Accepted: 01/02/2013] [Indexed: 11/26/2022] Open
Abstract
Understanding the properties and mechanisms that generate different forms of correlation is critical for determining their role in cortical processing. Research on the retina, visual cortex, sensory cortex, and computational models has suggested that fast correlation with high temporal precision appears consistent with common input, and correlation on a slow time scale likely involves feedback. Based on a feedback spiking neural network model, we investigate the role of inhibitory feedback in shaping correlations on a time scale of 100 ms. Notably, the relationship between the correlation coefficient and inhibitory feedback strength is non-monotonic. Further, computational simulations show how firing rate and oscillatory activity form the basis of the mechanisms underlying this relationship. When the mean firing rate is held fixed, the correlation coefficient increases monotonically with inhibitory feedback, but the correlation coefficient keeps decreasing when the network has no oscillatory activity. Our findings reveal that two opposing effects of the inhibitory feedback on the firing activity of the network contribute to the non-monotonic relationship between the correlation coefficient and the strength of the inhibitory feedback. The inhibitory feedback affects the correlated firing activity by modulating the intensity and regularity of the spike trains. Finally, the non-monotonic relationship is replicated with varying transmission delays and different spatial network structures, demonstrating the universality of the results.
Collapse
Affiliation(s)
- Jinli Xie
- School of Electrical Engineering, University of Jinan, Jinan, 250022 Shandong China
| | - Zhijie Wang
- College of Information Science and Technology, Donghua University, Shanghai, 201620 China
| |
Collapse
|
44
|
Tait AN, Nahmias MA, Tian Y, Shastri BJ, Prucnal PR. Photonic Neuromorphic Signal Processing and Computing. NANOPHOTONIC INFORMATION PHYSICS 2014. [DOI: 10.1007/978-3-642-40224-1_8] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
|
45
|
Low-Pass Filtering of Information in the Leaky Integrate-and-Fire Neuron Driven by White Noise. ACTA ACUST UNITED AC 2013. [DOI: 10.1007/978-3-319-02925-2_22] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/07/2023]
|
46
|
Joint distribution of first exit times of a two dimensional Wiener process with jumps with application to a pair of coupled neurons. Math Biosci 2013; 245:61-9. [PMID: 23816927 DOI: 10.1016/j.mbs.2013.06.005] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2012] [Revised: 06/18/2013] [Accepted: 06/19/2013] [Indexed: 11/23/2022]
Abstract
Motivated by a neuronal modeling problem, a bivariate Wiener process with two independent components is considered. Each component evolves independently until one of them reaches a threshold value. If the first component crosses the threshold value, it is reset while the dynamics of the other component remains unchanged. If instead the second component crosses its threshold, the first one undergoes a jump of constant amplitude, while the second component is reset to its starting value and its evolution restarts. Both processes then evolve again until one of them reaches its boundary. In this work, the coupling of the first exit times of the two connected processes is studied.
Collapse
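The reset-and-jump mechanism can be made concrete with a small simulation (a sketch only: drifts, noise amplitudes, thresholds, and the jump size are assumed values, and the time discretization is naive Euler):

```python
import numpy as np

rng = np.random.default_rng(3)

def coupled_exit_times(mu=(0.5, 0.5), sigma=(1.0, 1.0),
                       thr=(1.0, 1.0), jump=0.3,
                       dt=0.001, n_events=50):
    """Two independent drifted Wiener components: component 1 resets
    to 0 when it crosses its threshold; when component 2 crosses, it
    resets and component 1 receives a constant upward jump.
    Returns the exit-time sequences of both components."""
    mu = np.asarray(mu)
    sigma = np.asarray(sigma)
    x = np.zeros(2)
    t = 0.0
    times = ([], [])
    while len(times[0]) < n_events and len(times[1]) < n_events:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(2)
        t += dt
        if x[0] >= thr[0]:
            times[0].append(t)
            x[0] = 0.0
        if x[1] >= thr[1]:
            times[1].append(t)
            x[1] = 0.0
            x[0] += jump        # coupling: jump induced in component 1
    return times

t1, t2 = coupled_exit_times()
```

The dependence between the two exit-time sequences (the object of study in the paper) could then be examined, for example, by correlating the interval sequences of the two components.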
|
47
|
Information filtering by synchronous spikes in a neural population. J Comput Neurosci 2012; 34:285-301. [PMID: 22968549 PMCID: PMC3605500 DOI: 10.1007/s10827-012-0421-9] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/14/2012] [Revised: 07/12/2012] [Accepted: 08/08/2012] [Indexed: 10/27/2022]
Abstract
Information about time-dependent sensory stimuli is encoded by the spike trains of neurons. Here we consider a population of uncoupled but noisy neurons (each subject to some intrinsic noise) that are driven by a common broadband signal. We ask specifically how much information is encoded in the synchronous activity of the population and how this information transfer is distributed with respect to frequency bands. In order to obtain some insight into the mechanism of information filtering effects found previously in the literature, we develop a mathematical framework to calculate the coherence of the synchronous output with the common stimulus for populations of simple neuron models. Within this framework, the synchronous activity is treated as the product of filtered versions of the spike trains of a subset of neurons. We compare our results for the simple cases of (1) a Poisson neuron with a rate modulation and (2) an LIF neuron with intrinsic white current noise and a current stimulus. For the Poisson neuron, formulas are particularly simple but show only a low-pass behavior of the coherence of synchronous activity. For the LIF model, in contrast, the coherence function of the synchronous activity shows a clear peak at high frequencies, comparable to recent experimental findings. We uncover the mechanism for this shift in the maximum of the coherence and discuss some biological implications of our findings.
Collapse
|
48
|
Deemyad T, Kroeger J, Chacron MJ. Sub- and suprathreshold adaptation currents have opposite effects on frequency tuning. J Physiol 2012; 590:4839-58. [PMID: 22733663 DOI: 10.1113/jphysiol.2012.234401] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022] Open
Abstract
Natural stimuli are often characterized by statistics that can vary over orders of magnitude. Experiments have shown that sensory neurons continuously adapt their responses to changes in these statistics, thereby optimizing information transmission. However, such adaptation can also alter the neuronal transfer function by attenuating if not eliminating responses to the low-frequency components of time-varying stimuli, which can create ambiguity in the neural code. We recorded from electrosensory pyramidal neurons before and after pharmacological inactivation of either calcium-activated (I(AHP)) or KCNQ voltage-gated potassium currents (I(M)). We found that blocking each current decreased adaptation in a similar fashion but led to opposite changes in the neuronal transfer function. Indeed, blocking I(AHP) increased while blocking I(M) instead decreased the response to low temporal frequencies. To understand this surprising result, we built a mathematical model incorporating each channel type. This model predicted that these differential effects could be accounted for by differential activation properties. Our results show that the mechanisms that mediate adaptation can either increase or decrease the response to low-frequency stimuli. As such, they suggest that the nervous system resolves ambiguity resulting from adaptation through independent control of adaptation and the neuronal transfer function.
Collapse
Affiliation(s)
- Tara Deemyad
- Department of Physiology, McGill University, 3655 Sir William Osler, room 1137, Montreal, QC, H3G 1Y6, Canada
| | | | | |
Collapse
|
49
|
|
50
|
Guo D. Inhibition of rhythmic spiking by colored noise in neural systems. Cogn Neurodyn 2011; 5:293-300. [PMID: 22942918 DOI: 10.1007/s11571-011-9160-2] [Citation(s) in RCA: 46] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2011] [Revised: 06/11/2011] [Accepted: 06/14/2011] [Indexed: 11/27/2022] Open
Abstract
In this paper we study the effect of colored noise on the rhythmic spiking activity of neural systems. The phenomenon of so-called inverse stochastic resonance, in which noise of appropriate intensity suppresses the spiking activity in neural systems, is clearly observed in a special parameter regime. We find that the inhibition effect of colored noise is stronger than that of Gaussian white noise. Furthermore, our simulation results show that the inhibition effect of colored noise provides a useful mechanism for the generation of synchronized bursts in the type-2 mixed-feed-forward-feedback loop neuronal network motif, which indicates that such an inhibition effect might have some biological implications.
Collapse
Affiliation(s)
- Daqing Guo
- School of Electronic Engineering, University of Electronic Science and Technology of China, Chengdu, 610054 People's Republic of China
| |
Collapse
|