1
Knoll G, Lindner B. Information transmission in recurrent networks: Consequences of network noise for synchronous and asynchronous signal encoding. Phys Rev E 2022; 105:044411. [PMID: 35590546] [DOI: 10.1103/physreve.105.044411]
Abstract
Information about natural time-dependent stimuli encoded by the sensory periphery or communication between cortical networks may span a large frequency range or be localized to a smaller frequency band. Biological systems have been shown to multiplex such disparate broadband and narrow-band signals and then discriminate them in later populations by employing either an integration (low-pass) or coincidence detection (bandpass) encoding strategy. Analytical expressions have been developed for both encoding methods in feedforward populations of uncoupled neurons and confirm that the integration of a population's output low-pass filters the information, whereas synchronous output encodes less information overall and retains signal information in a selected frequency band. The present study extends the theory to recurrent networks and shows that recurrence may sharpen the synchronous bandpass filter. The frequency of the pass band is significantly influenced by the synaptic strengths, especially for inhibition-dominated networks. Synchronous information transfer is also increased when network models take into account heterogeneity that arises from the stochastic distribution of the synaptic weights.
Affiliation(s)
- Gregory Knoll
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany and Physics Department of Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
2
Bostner Ž, Knoll G, Lindner B. Information filtering by coincidence detection of synchronous population output: analytical approaches to the coherence function of a two-stage neural system. Biol Cybern 2020; 114:403-418. [PMID: 32583370] [PMCID: PMC7326833] [DOI: 10.1007/s00422-020-00838-6]
Abstract
Information about time-dependent sensory stimuli is encoded in the activity of neural populations; distinct aspects of the stimulus are read out by different types of neurons: while overall information is perceived by integrator cells, so-called coincidence detector cells are driven mainly by the synchronous activity in the population that encodes predominantly high-frequency content of the input signal (high-pass information filtering). Previously, an analytically accessible statistic called the partial synchronous output was introduced as a proxy for the coincidence detector cell's output in order to approximate its information transmission. In the first part of the current paper, we compare the information filtering properties (specifically, the coherence function) of this proxy to those of a simple coincidence detector neuron. We show that the latter's coherence function can indeed be well-approximated by the partial synchronous output with a time scale and threshold criterion that are related approximately linearly to the membrane time constant and firing threshold of the coincidence detector cell. In the second part of the paper, we propose an alternative theory for the spectral measures (including the coherence) of the coincidence detector cell that combines linear-response theory for shot-noise driven integrate-and-fire neurons with a novel perturbation ansatz for the spectra of spike-trains driven by colored noise. We demonstrate how the variability of the synaptic weights for connections from the population to the coincidence detector can shape the information transmission of the entire two-stage system.
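The "partial synchronous output" used as a proxy here can be sketched in a few lines: a population of Poisson neurons shares a rate-modulating signal, and the synchronous output keeps only those time bins in which at least a fraction gamma of the population fires. All parameters below (population size, rates, threshold gamma) are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Population of N Poisson neurons sharing a common rate-modulating signal.
N, T, dt = 50, 20.0, 1e-3                      # neurons, duration (s), bin (s)
t = np.arange(0.0, T, dt)
r0, eps, f = 20.0, 10.0, 5.0                   # base rate (Hz), modulation (Hz), signal freq (Hz)
rate = r0 + eps * np.sin(2 * np.pi * f * t)

# Independent Bernoulli thinning of the shared time-dependent rate.
spikes = rng.random((N, t.size)) < rate * dt   # boolean array, shape (N, bins)

# Partial synchronous output: bins in which at least a fraction gamma of the
# population fires within one coincidence window (here a single bin).
gamma = 0.1
sync = spikes.sum(axis=0) >= gamma * N

print(spikes.sum(), sync.sum())
```

The synchronous output is far sparser than the summed population activity; computing the coherence of each with the sinusoidal signal would reproduce the low-pass versus band-pass contrast discussed above.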
Affiliation(s)
- Žiga Bostner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Gregory Knoll
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
- Benjamin Lindner
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin, Germany
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115 Berlin, Germany
3
Herfurth T, Tchumatchenko T. Quantifying encoding redundancy induced by rate correlations in Poisson neurons. Phys Rev E 2019; 99:042402. [PMID: 31108645] [DOI: 10.1103/physreve.99.042402]
Abstract
Temporal correlations in neuronal spike trains are known to introduce redundancy to stimulus encoding. However, exact methods to describe how these correlations impact neural information transmission quantitatively are lacking. Here, we provide a general measure for the information carried by correlated rate modulations only, neglecting other spike correlations, and use it to investigate the effect of rate correlations on encoding redundancy. We derive it analytically by calculating the mutual information between a time-correlated, rate modulating signal and the resulting spikes of Poisson neurons. Whereas this information is determined by spike autocorrelations only, the redundancy in information encoding due to rate correlations depends on both the distribution and the autocorrelation of the rate histogram. We further demonstrate that at very small signal strengths the information carried by rate correlated spikes becomes identical to that of independent spikes, in effect measuring the signal modulation depth. In contrast, a vanishing signal correlation time maximizes information but does not generally yield the information of independent spikes. Overall, our study sheds light on the role of signal-induced temporal correlations for neural coding, by providing insight into how signal features shape redundancy and by establishing mathematical links between existing methods.
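A minimal way to see rate-induced spike correlations (an illustrative sketch with made-up parameters, not the paper's analytical measure) is a doubly stochastic Poisson process: an Ornstein-Uhlenbeck rate modulation drives Poisson spiking, and the binned spike counts inherit a positive lag-one correlation from the rate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Doubly stochastic (Cox) process: an Ornstein-Uhlenbeck rate modulation
# around a base rate drives Poisson spiking, so the spike train inherits the
# temporal correlations of the rate.  All parameters are illustrative.
dt, n = 1e-3, 200_000                        # 1 ms grid, 200 s total
r0, tau, sigma = 20.0, 0.1, 10.0             # base rate (Hz), corr. time (s), rate s.d. (Hz)
a = np.exp(-dt / tau)
b = sigma * np.sqrt(1.0 - a**2)
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):                        # exact OU update on the time grid
    x[i] = a * x[i - 1] + b * rng.standard_normal()
rate = np.clip(r0 + x, 0.0, None)            # rates cannot be negative

spikes = rng.random(n) < rate * dt
counts = spikes.reshape(-1, 10).sum(axis=1)  # 10 ms counting bins

d = counts - counts.mean()
rho = np.mean(d[:-1] * d[1:]) / np.var(counts)  # lag-1 count correlation
print(rho)
```

For a homogeneous Poisson process the same estimator would fluctuate around zero; the positive value here is purely signal-induced, which is exactly the redundancy source the abstract quantifies.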
Affiliation(s)
- Tim Herfurth
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
- Tatjana Tchumatchenko
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
4
Herfurth T, Tchumatchenko T. Information transmission of mean and variance coding in integrate-and-fire neurons. Phys Rev E 2019; 99:032420. [PMID: 30999481] [DOI: 10.1103/physreve.99.032420]
Abstract
Neurons process information by translating continuous signals into patterns of discrete spike times. An open question is how much information these spike times contain about signals which modulate either the mean or the variance of the somatic currents in neurons, as is observed experimentally. Here we calculate the exact information contained in discrete spike times about a continuous signal in both encoding strategies. We show that the information content about mean modulating signals is generally substantially larger than about variance modulating signals for biological parameters. Our analysis further reveals that higher information transmission is associated with a larger proportion of nonlinear signal encoding. Our study measures the complete information content of mean and variance coding and provides a method to determine what fraction of the total information is linearly decodable.
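The two encoding strategies can be contrasted in a toy simulation: a leaky integrate-and-fire neuron (Euler-Maruyama discretization; all parameters hypothetical, not the authors' exact model) driven by a slow binary signal that modulates either the mean or the standard deviation of its input current.

```python
import numpy as np

rng = np.random.default_rng(2)

def lif_spikes(signal, mode, dt=1e-4, tau=0.02, vth=1.0, vr=0.0):
    """Leaky integrate-and-fire driven by a signal in 'mean' or 'variance' mode."""
    noise = rng.standard_normal(signal.size)
    v, out = 0.0, []
    for i, s in enumerate(signal):
        mu, sigma = (1.2 + 0.3 * s, 0.5) if mode == "mean" else (1.2, 0.5 + 0.3 * s)
        v += dt * (mu - v) / tau + sigma * np.sqrt(dt / tau) * noise[i]
        if v >= vth:                      # threshold crossing: spike and reset
            out.append(i * dt)
            v = vr
    return np.array(out)

n = 200_000                               # 20 s at dt = 0.1 ms
s = np.sign(np.sin(2 * np.pi * 1.0 * np.arange(n) * 1e-4))  # 1 Hz square wave

mean_coded = lif_spikes(s, "mean")
var_coded = lif_spikes(s, "variance")
print(len(mean_coded), len(var_coded))
```

Cross-spectra between s and the two spike trains would then quantify how much of the transmitted information is linearly decodable in each mode, which is the comparison the abstract describes.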
Affiliation(s)
- Tim Herfurth
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
- Tatjana Tchumatchenko
- Max Planck Institute for Brain Research, Theory of Neural Dynamics, Max-von-Laue-Strasse 4, 60438 Frankfurt, Germany
5
Voronenko SO, Lindner B. Improved lower bound for the mutual information between signal and neural spike count. Biol Cybern 2018; 112:523-538. [PMID: 30155699] [DOI: 10.1007/s00422-018-0779-5]
Abstract
The mutual information between a stimulus signal and the spike count of a stochastic neuron is in many cases difficult to determine. Therefore, it is often approximated by a lower bound formula that involves linear correlations between input and output only. Here, we improve the linear lower bound for the mutual information by incorporating nonlinear correlations. For the special case of a Gaussian output variable with nonlinear signal dependencies of mean and variance we also derive an exact integral formula for the full mutual information. In our numerical analysis, we first compare the linear and nonlinear lower bounds and the exact integral formula for two different Gaussian models and show under which conditions the nonlinear lower bound provides a significant improvement to the linear approximation. We then inspect two neuron models, the leaky integrate-and-fire model with white Gaussian noise and the Na-K model with channel noise. We show that for certain firing regimes and for intermediate signal strengths the nonlinear lower bound can provide a substantial improvement compared to the linear lower bound. Our results demonstrate the importance of nonlinear input-output correlations for neural information transmission and provide a simple nonlinear approximation for the mutual information that can be applied to more complicated neuron models as well as to experimental data.
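The role of nonlinear correlations is easy to see in a toy channel whose input-output relation is purely quadratic: the standard linear lower bound I_LB = -(1/2) ln(1 - rho^2) (in nats) is then near zero even though information is clearly transmitted, while correlating a nonlinear feature of the input with the output recovers a positive bound. This is only an illustration of the idea, not the paper's improved formula.

```python
import numpy as np

rng = np.random.default_rng(3)

# Purely nonlinear channel: the output depends on x only through x^2,
# so the linear input-output correlation vanishes.
n = 100_000
x = rng.standard_normal(n)
y = x**2 + 0.5 * rng.standard_normal(n)

def lin_bound(a, b):
    """Gaussian linear lower bound -0.5*ln(1 - rho^2) in nats."""
    rho = np.corrcoef(a, b)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

print(lin_bound(x, y))       # near zero: x and y are linearly uncorrelated
print(lin_bound(x * x, y))   # positive: the nonlinear correlation carries information
```

Incorporating such nonlinear input features is the spirit in which the linear bound is improved here.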
Affiliation(s)
- Sergej O Voronenko
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany.
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany.
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Philippstr. 13, Haus 2, 10115, Berlin, Germany
- Physics Department, Humboldt University Berlin, Newtonstr. 15, 12489, Berlin, Germany
6
Sub-threshold signal encoding in coupled FitzHugh-Nagumo neurons. Sci Rep 2018; 8:8276. [PMID: 29844354] [PMCID: PMC5974132] [DOI: 10.1038/s41598-018-26618-8]
Abstract
Despite intensive research, the mechanisms underlying the neural code remain poorly understood. Recent work has focused on the response of a single neuron to a weak, sub-threshold periodic signal. By simulating the stochastic FitzHugh-Nagumo (FHN) model and then using a symbolic method to analyze the firing activity, preferred and infrequent spike patterns (defined by the relative timing of the spikes) were detected, whose probabilities encode information about the signal. As not individual neurons but neuronal populations are responsible for sensory coding and information transfer, a relevant question is how a second neuron, which does not perceive the signal, affects the detection and the encoding of the signal, done by the first neuron. Through simulations of two stochastic FHN neurons we show that the encoding of a sub-threshold signal in symbolic spike patterns is a plausible mechanism. The neuron that perceives the signal fires a spike train that, despite having an almost random temporal structure, has preferred and infrequent patterns which carry information about the signal. Our findings could be relevant for sensory systems composed by two noisy neurons, when only one detects a weak external input.
7
Pena RFO, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks. Front Comput Neurosci 2018; 12:9. [PMID: 29551968] [PMCID: PMC5840464] [DOI: 10.3389/fncom.2018.00009]
Abstract
Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. These approximations seem to be justified over a broad range of parameters as indicated by comparison with simulation results of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
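The flavor of the self-consistency condition can be caricatured in a scalar setting: instead of matching whole power spectra, one only requires that the firing rate a neuron produces equals the rate assumed for its presynaptic partners. The sigmoidal transfer function, synaptic parameters, and damping below are hypothetical stand-ins, not the paper's scheme.

```python
import numpy as np

def transfer(mu):
    """Hypothetical smooth f-I curve (Hz); a stand-in for an integrate-and-fire rate function."""
    return 40.0 / (1.0 + np.exp(-(mu - 10.0)))

K, J, mu_ext = 100, -0.05, 12.0      # inhibition-dominated recurrent input
r = 5.0                              # initial guess for the network rate (Hz)
for _ in range(500):                 # damped fixed-point iteration
    r = 0.9 * r + 0.1 * transfer(mu_ext + K * J * r)
print(r)                             # self-consistent rate: r = transfer(mu_ext + K*J*r)
```

The spectral scheme iterates in the same spirit, but the unknown is the full spike-train power spectrum and each update runs stochastic single-neuron simulations; the damping (here the 0.9/0.1 mixing) plays the same stabilizing role as the averaging procedure mentioned in the abstract.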
Affiliation(s)
- Rodrigo F O Pena
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Sebastian Vellmer
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Davide Bernardi
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
- Antonio C Roque
- Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
- Benjamin Lindner
- Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
8
Synchronous spikes are necessary but not sufficient for a synchrony code in populations of spiking neurons. Proc Natl Acad Sci U S A 2017; 114:E1977-E1985. [PMID: 28202729] [DOI: 10.1073/pnas.1615561114]
Abstract
Synchronous activity in populations of neurons potentially encodes special stimulus features. Selective readout of either synchronous or asynchronous activity allows formation of two streams of information processing. Theoretical work predicts that such a synchrony code is a fundamental feature of populations of spiking neurons if they operate in specific noise and stimulus regimes. Here we experimentally test the theoretical predictions by quantifying and comparing neuronal response properties in tuberous and ampullary electroreceptor afferents of the weakly electric fish Apteronotus leptorhynchus. These related systems show similar levels of synchronous activity, but a synchrony code is established only in the more irregularly firing tuberous afferents, not in the more regularly firing ampullary afferents. The mere existence of synchronous activity is thus not sufficient for a synchrony code. Single-cell features such as the irregularity of spiking and the frequency dependence of the neuron's transfer function determine whether synchronous spikes possess a distinct meaning for the encoding of time-dependent signals.
9
Blankenburg S, Lindner B. The effect of positive interspike interval correlations on neuronal information transmission. Math Biosci Eng 2016; 13:461-481. [PMID: 27106183] [DOI: 10.3934/mbe.2016001]
Abstract
Experimentally it is known that some neurons encode preferentially information about low-frequency (slow) components of a time-dependent stimulus while others prefer intermediate or high-frequency (fast) components. Accordingly, neurons can be categorized as low-pass, band-pass or high-pass information filters. Mechanisms of information filtering at the cellular and the network levels have been suggested. Here we propose yet another mechanism, based on noise shaping due to spontaneous non-renewal spiking statistics. We compare two integrate-and-fire models with threshold noise that differ solely in their interspike interval (ISI) correlations: the renewal model generates independent ISIs, whereas the non-renewal model exhibits positive correlations between adjacent ISIs. For these simplified neuron models we analytically calculate ISI density and power spectrum of the spontaneous spike train as well as approximations for input-output cross-spectrum and spike-train power spectrum in the presence of a broad-band Gaussian stimulus. This yields the spectral coherence, an approximate frequency-resolved measure of information transmission. We demonstrate that for low spiking variability the renewal model acts as a low-pass filter of information (coherence has a global maximum at zero frequency), whereas the non-renewal model displays a pronounced maximum of the coherence at non-vanishing frequency and thus can be regarded as a band-pass filter of information.
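The renewal/non-renewal contrast can be emulated directly at the level of interval sequences (an illustrative construction, not the authors' threshold-noise model): two ISI sequences with identical marginal mean and variance, one independent and one AR(1)-correlated.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two ISI sequences with matched marginal statistics: a renewal sequence
# (independent intervals) and a non-renewal one whose intervals follow a
# unit-variance AR(1) process, giving positive adjacent-interval correlations.
n, mean_isi, sd_isi, phi = 50_000, 0.05, 0.01, 0.6
renewal = mean_isi + sd_isi * rng.standard_normal(n)

z = rng.standard_normal(n)
x = np.empty(n)
x[0] = z[0]
for i in range(1, n):
    x[i] = phi * x[i - 1] + np.sqrt(1.0 - phi**2) * z[i]
nonrenewal = mean_isi + sd_isi * x

def rho1(isi):
    """Serial correlation coefficient of adjacent interspike intervals."""
    d = isi - isi.mean()
    return np.mean(d[:-1] * d[1:]) / np.var(isi)

print(rho1(renewal), rho1(nonrenewal))
```

Spike trains built by cumulatively summing these two sequences would then show the noise-shaping effect: the positive adjacent-interval correlations redistribute spike-train power across frequencies and can turn a low-pass coherence into a band-pass one, as the abstract reports.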
Affiliation(s)
- Sven Blankenburg
- Bernstein Center for Computational Neuroscience Berlin, Berlin 10115, Germany.
10
11
Baker CA, Huck KR, Carlson BA. Peripheral sensory coding through oscillatory synchrony in weakly electric fish. eLife 2015; 4:e08163. [PMID: 26238277] [PMCID: PMC4522468] [DOI: 10.7554/elife.08163]
Abstract
Adaptations to an organism's environment often involve sensory system modifications. In this study, we address how evolutionary divergence in sensory perception relates to the physiological coding of stimuli. Mormyrid fishes that can detect subtle variations in electric communication signals encode signal waveform into spike-timing differences between sensory receptors. In contrast, the receptors of species insensitive to waveform variation produce spontaneously oscillating potentials. We found that oscillating receptors respond to electric pulses by resetting their phase, resulting in transient synchrony among receptors that encodes signal timing and location, but not waveform. These receptors were most sensitive to frequencies found only in the collective signals of groups of conspecifics, and this was correlated with increased behavioral responses to these frequencies. Thus, different perceptual capabilities correspond to different receptor physiologies. We hypothesize that these divergent mechanisms represent adaptations for different social environments. Our findings provide the first evidence for sensory coding through oscillatory synchrony.
Affiliation(s)
- Christa A Baker
- Department of Biology, Washington University in St. Louis, St. Louis, United States
- Kevin R Huck
- Department of Biology, Washington University in St. Louis, St. Louis, United States
- Bruce A Carlson
- Department of Biology, Washington University in St. Louis, St. Louis, United States
12
Bernardi D, Lindner B. A frequency-resolved mutual information rate and its application to neural systems. J Neurophysiol 2014; 113:1342-1357. [PMID: 25475346] [DOI: 10.1152/jn.00354.2014]
Abstract
The encoding and processing of time-dependent signals into sequences of action potentials of sensory neurons is still a challenging theoretical problem. Although, with some effort, it is possible to quantify the flow of information in the model-free framework of Shannon's information theory, this yields just a single number, the mutual information rate. This rate does not indicate which aspects of the stimulus are encoded. Several studies have identified mechanisms at the cellular and network level leading to low- or high-pass filtering of information, i.e., the selective coding of slow or fast stimulus components. However, these findings rely on an approximation, specifically, on the qualitative behavior of the coherence function, an approximate frequency-resolved measure of information flow, whose quality is generally unknown. Here, we develop an assumption-free method to measure a frequency-resolved information rate about a time-dependent Gaussian stimulus. We demonstrate its application for three paradigmatic descriptions of neural firing: an inhomogeneous Poisson process that carries a signal in its instantaneous firing rate; an integrator neuron (stochastic integrate-and-fire model) driven by a time-dependent stimulus; and the synchronous spikes fired by two commonly driven integrator neurons. In agreement with previous coherence-based estimates, we find that Poisson and integrate-and-fire neurons are broadband and low-pass filters of information, respectively. The band-pass information filtering observed in the coherence of synchronous spikes is confirmed by our frequency-resolved information measure in some but not all parameter configurations. Our results also explicitly show how the response-response coherence can fail as an upper bound on the information rate.
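The coherence-based quantity that this frequency-resolved measure is compared against can be estimated with plain segment-averaged FFTs; the associated lower bound on the mutual information rate is R_LB = -integral df log2(1 - C(f)). The sketch below uses a generic linear low-pass response with output noise (hypothetical parameters), not a spiking model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Broadband Gaussian stimulus s and a noisy first-order low-pass response y.
dt, nseg, lseg = 1e-3, 200, 1024
s = rng.standard_normal(nseg * lseg)

tau = 0.01                                   # filter time constant (s)
a = np.exp(-dt / tau)
y = np.empty_like(s)
y[0] = s[0]
for i in range(1, s.size):
    y[i] = a * y[i - 1] + (1.0 - a) * s[i]
y += 0.05 * rng.standard_normal(s.size)      # white output noise

# Magnitude-squared coherence C(f), estimated by averaging over segments.
S = np.fft.rfft(s.reshape(nseg, lseg), axis=1)
Y = np.fft.rfft(y.reshape(nseg, lseg), axis=1)
Ssy = np.mean(S * np.conj(Y), axis=0)
C = np.abs(Ssy) ** 2 / (np.mean(np.abs(S) ** 2, axis=0) * np.mean(np.abs(Y) ** 2, axis=0))

# Coherence-based lower bound on the information rate (bits per second).
f = np.fft.rfftfreq(lseg, dt)
rate_lb = -np.sum(np.log2(1.0 - C)) * (f[1] - f[0])
print(rate_lb)
```

Replacing y with a spike train (binned at dt) turns this estimator into the approximate frequency-resolved measure whose quality the paper's assumption-free method is designed to assess.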
Affiliation(s)
- Davide Bernardi
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; and Physics Department, Humboldt University Berlin, Berlin, Germany
- Benjamin Lindner
- Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany; and Physics Department, Humboldt University Berlin, Berlin, Germany
13
Bauermeister C, Schwalger T, Russell DF, Neiman AB, Lindner B. Characteristic effects of stochastic oscillatory forcing on neural firing: analytical theory and comparison to paddlefish electroreceptor data. PLoS Comput Biol 2013; 9:e1003170. [PMID: 23966844] [PMCID: PMC3744407] [DOI: 10.1371/journal.pcbi.1003170]
Abstract
Stochastic signals with pronounced oscillatory components are frequently encountered in neural systems. Input currents to a neuron in the form of stochastic oscillations could be of exogenous origin, e.g. sensory input or synaptic input from a network rhythm. They shape spike firing statistics in a characteristic way, which we explore theoretically in this report. We consider a perfect integrate-and-fire neuron that is stimulated by a constant base current (to drive regular spontaneous firing), along with Gaussian narrow-band noise (a simple example of stochastic oscillations), and a broadband noise. We derive expressions for the nth-order interval distribution, its variance, and the serial correlation coefficients of the interspike intervals (ISIs) and confirm these analytical results by computer simulations. The theory is then applied to experimental data from electroreceptors of paddlefish, which have two distinct types of internal noisy oscillators, one forcing the other. The theory provides an analytical description of their afferent spiking statistics during spontaneous firing, and replicates a pronounced dependence of ISI serial correlation coefficients on the relative frequency of the driving oscillations, and furthermore allows extraction of certain parameters of the intrinsic oscillators embedded in these electroreceptors.
Affiliation(s)
- Tilo Schwalger
- Max-Planck-Institute for the Physics of Complex Systems, Dresden, Germany
- Bernstein Center for Computational Neuroscience and Physics Department of Humboldt University, Berlin, Germany
- David F. Russell
- Department of Biological Sciences and Neuroscience Program, Ohio University, Athens, Ohio, United States of America
- Alexander B. Neiman
- Department of Physics and Astronomy and Neuroscience Program, Ohio University, Athens, Ohio, United States of America
- Benjamin Lindner
- Max-Planck-Institute for the Physics of Complex Systems, Dresden, Germany
- Bernstein Center for Computational Neuroscience and Physics Department of Humboldt University, Berlin, Germany
14
Influence of biophysical properties on temporal filters in a sensory neuron. BMC Neurosci 2013. [PMCID: PMC3704666] [DOI: 10.1186/1471-2202-14-s1-p347]
15
Bahar S, Neiman AB, Jung P, Kurths J, Schimansky-Geier L, Showalter K. Introduction to Focus Issue: nonlinear and stochastic physics in biology. Chaos 2011; 21:047501. [PMID: 22225375] [DOI: 10.1063/1.3671647]
Abstract
Frank Moss was a leading figure in the study of nonlinear and stochastic processes in biological systems. His work, particularly in the area of stochastic resonance, has been highly influential to the interdisciplinary scientific community. This Focus Issue pays tribute to Moss with articles that describe the most recent advances in the field he helped to create. In this Introduction, we review Moss's seminal scientific contributions and introduce the articles that make up this Focus Issue.
Affiliation(s)
- Sonya Bahar
- Department of Physics and Astronomy and Center for Neurodynamics, University of Missouri at St. Louis, St. Louis, Missouri 63121, USA