201
Statistics of the electrosensory input in the freely swimming weakly electric fish Apteronotus leptorhynchus. J Neurosci 2013; 33:13758-72. [PMID: 23966697] [DOI: 10.1523/jneurosci.0998-13.2013]
Abstract
The neural computations underlying sensory-guided behaviors can best be understood in view of the sensory stimuli to be processed under natural conditions. This input is often actively shaped by the movements of the animal and its sensory receptors. Little is known about natural sensory scene statistics taking into account the concomitant movement of sensory receptors in freely moving animals. South American weakly electric fish use a self-generated quasi-sinusoidal electric field for electrolocation and electrocommunication. Thousands of cutaneous electroreceptors detect changes in the transdermal potential (TDP) as the fish interact with conspecifics and the environment. Despite substantial knowledge about the circuitry and physiology of the electrosensory system, the statistical properties of the electrosensory input evoked by natural swimming movements have never been measured directly. Using underwater wireless telemetry, we recorded the TDP of Apteronotus leptorhynchus as they swam freely by themselves and during interaction with a conspecific. Swimming movements caused low-frequency TDP amplitude modulations (AMs). Interacting with a conspecific caused additional AMs around the difference frequency of their electric fields, with the amplitude of the AMs (envelope) varying at low frequencies due to mutual movements. Both AMs and envelopes showed a power-law relationship with frequency, indicating spectral scale invariance. Combining a computational model of the electric field with video tracking of movements, we show that specific swimming patterns cause characteristic spatiotemporal sensory input correlations that contain information that may be used by the brain to guide behavior.
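The power-law (scale-invariant) spectra reported for the AMs and envelopes can be illustrated with a short sketch: synthesize a signal with a 1/f-type spectrum as a stand-in for the recorded modulations, then estimate the spectral exponent by a straight-line fit in log-log coordinates. The signal, frequency band, and exponent below are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic signal with a 1/f^beta power spectrum (beta = 1), standing in
# for a recorded amplitude-modulation trace.
n, beta = 2 ** 14, 1.0
freqs = np.fft.rfftfreq(n, d=1.0)
spectrum = np.zeros(len(freqs), dtype=complex)
spectrum[1:] = freqs[1:] ** (-beta / 2) * np.exp(2j * np.pi * rng.random(len(freqs) - 1))
signal = np.fft.irfft(spectrum, n)

# Estimate the spectral exponent by linear regression in log-log coordinates.
power = np.abs(np.fft.rfft(signal)) ** 2
band = (freqs > 1e-3) & (freqs < 1e-1)
slope, intercept = np.polyfit(np.log10(freqs[band]), np.log10(power[band]), 1)
print(f"estimated spectral exponent: {-slope:.2f}")
```

The fitted slope of the log-log periodogram recovers the exponent; on real recordings one would restrict the fit to the band where the power law actually holds.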
202
Abstract
Adaptation is a fundamental computational motif in neural processing. To maintain stable perception in the face of rapidly shifting input, neural systems must extract relevant information from background fluctuations under many different contexts. Many neural systems are able to adjust their input-output properties such that an input's ability to trigger a response depends on the size of that input relative to its local statistical context. This "gain-scaling" strategy has been shown to be an efficient coding strategy. We report here that this property emerges during early development as an intrinsic property of single neurons in mouse sensorimotor cortex, coinciding with the disappearance of spontaneous waves of network activity, and can be modulated by changing the balance of spike-generating currents. Simultaneously, developing neurons move toward a common intrinsic operating point and a stable ratio of spike-generating currents. This developmental trajectory occurs in the absence of sensory input or spontaneous network activity. Through a combination of electrophysiology and modeling, we demonstrate that developing cortical neurons develop the ability to perform nearly perfect gain scaling by virtue of the maturing spike-generating currents alone. We use reduced single neuron models to identify the conditions for this property to hold.
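Gain scaling as described here can be sketched with a toy neuron whose input is divided by the local stimulus standard deviation before a fixed nonlinearity. The sigmoid and its parameters are hypothetical, not the paper's conductance-based model.

```python
import math

def gain_scaled_response(stimulus, local_sd, gain=4.0):
    # Divisive normalization by the local standard deviation, followed by a
    # fixed sigmoid: only the *relative* size of the input matters.
    return 1.0 / (1.0 + math.exp(-gain * stimulus / local_sd))

# A 2-SD input produces the same output in a low- and a high-variance context.
weak_context = gain_scaled_response(stimulus=0.2, local_sd=0.1)
strong_context = gain_scaled_response(stimulus=2.0, local_sd=1.0)
print(weak_context, strong_context)  # identical responses by construction
```

In the paper this invariance emerges from the maturing spike-generating currents themselves rather than from an explicit division, but the input-output consequence is the same.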
203
Farkhooi F, Froese A, Muller E, Menzel R, Nawrot MP. Cellular adaptation facilitates sparse and reliable coding in sensory pathways. PLoS Comput Biol 2013; 9:e1003251. [PMID: 24098101] [PMCID: PMC3789775] [DOI: 10.1371/journal.pcbi.1003251]
Abstract
Most neurons in peripheral sensory pathways initially respond vigorously when a preferred stimulus is presented, but adapt as stimulation continues. It is unclear how this phenomenon affects stimulus coding in the later stages of sensory processing. Here, we show that a temporally sparse and reliable stimulus representation develops naturally in sequential stages of a sensory network with adapting neurons. As a modeling framework we employ a mean-field approach together with an adaptive population density treatment, accompanied by numerical simulations of spiking neural networks. We find that cellular adaptation plays a critical role in the dynamic reduction of the trial-by-trial variability of cortical spike responses by transiently suppressing self-generated fast fluctuations in the cortical balanced network. This provides an explanation for a widespread cortical phenomenon by a simple mechanism. We further show that in the insect olfactory system cellular adaptation is sufficient to explain the emergence of the temporally sparse and reliable stimulus representation in the mushroom body. Our results reveal a generic, biophysically plausible mechanism that can explain the emergence of a temporally sparse and reliable stimulus representation within a sequential processing architecture.
Affiliation(s)
- Farzad Farkhooi
- Neuroinformatics & Theoretical Neuroscience, Freie Universität Berlin, and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
- Anja Froese
- Institut für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Eilif Muller
- Blue Brain Project, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Randolf Menzel
- Institut für Biologie-Neurobiologie, Freie Universität Berlin, Berlin, Germany
- Martin P. Nawrot
- Neuroinformatics & Theoretical Neuroscience, Freie Universität Berlin, and Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
204
Edelman M. Universal fractional map and cascade of bifurcations type attractors. Chaos 2013; 23:033127. [PMID: 24089963] [DOI: 10.1063/1.4819165]
Abstract
We modified the way in which the Universal Map is obtained in regular dynamics to derive the Universal α-Family of Maps, which depends on a single parameter α>0, the order of the fractional derivative in the nonlinear fractional differential equation describing a system experiencing periodic kicks. We consider two particular α-families, corresponding to the Standard and Logistic Maps. For fractional α<2, in the range of parameter values where regular dynamics undergoes the period-doubling cascade of bifurcations from regular to chaotic motion, the corresponding fractional systems demonstrate a new type of attractor: cascade-of-bifurcations-type trajectories.
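The flavor of a map with power-law memory can be sketched generically. The update rule below is a template of the kind of history-dependent iteration studied for fractional maps, not Edelman's exact Universal α-Family equations; the logistic-type nonlinearity and small gain are illustrative choices.

```python
import math

def memory_map(G, x0, alpha, n_steps):
    # x_{n+1} = x0 - (1/Gamma(alpha)) * sum_k G(x_k) * (n - k + 1)**(alpha - 1):
    # every past state contributes, weighted by a power of its age, so the
    # orbit is non-Markovian, unlike an ordinary one-step map.
    xs = [x0]
    c = 1.0 / math.gamma(alpha)
    for n in range(n_steps):
        acc = sum(G(x) * (n - k + 1) ** (alpha - 1) for k, x in enumerate(xs))
        xs.append(x0 - c * acc)
    return xs

# Logistic-type nonlinearity with a small illustrative gain.
traj = memory_map(lambda x: 0.1 * x * (1.0 - x), x0=0.5, alpha=1.5, n_steps=20)
```

Because the full history enters each step, the cost of iterating grows with time, which is precisely what makes the asymptotic behavior of such maps (including the cascade-of-bifurcations-type trajectories) nontrivial.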
Affiliation(s)
- M Edelman
- Department of Physics, Stern College at Yeshiva University, 245 Lexington Ave, New York, New York 10016, USA and Courant Institute of Mathematical Sciences, New York University, 251 Mercer St., New York, New York 10012, USA
205
206
Imbalance between excitation and inhibition in the somatosensory cortex produces postadaptation facilitation. J Neurosci 2013; 33:8463-71. [PMID: 23658183] [DOI: 10.1523/jneurosci.4845-12.2013]
Abstract
Adaptation is typically associated with attenuation of the neuronal response during sustained or repetitive sensory stimulation, followed by a gradual recovery of the response to its baseline level thereafter. Here, we examined the process of recovery from sensory adaptation in layer IV cells of the rat barrel cortex using in vivo intracellular recordings. Surprisingly, in approximately one-third of the cells, the response to a test stimulus delivered a few hundred milliseconds after the adapting stimulation was significantly facilitated. Recordings under different holding potentials revealed that the enhanced response was the result of an imbalance between excitation and inhibition, where a faster recovery of excitation compared with inhibition facilitated the response. Hence, our data provide the first mechanistic explanation of sensory facilitation after adaptation and suggest that adaptation increases the sensitivity of cortical neurons to sensory stimulation by altering the balance between excitation and inhibition.
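The proposed mechanism, excitation recovering from adaptation faster than inhibition, can be illustrated with a toy calculation. The exponential recovery and all time constants and conductances below are illustrative, not measured values from the study.

```python
import math

def recovered(t, tau):
    # Fraction of a conductance recovered t ms after the adapting stimulus,
    # assuming simple exponential recovery (an illustrative choice).
    return 1.0 - math.exp(-t / tau)

def net_drive(t, tau_e=100.0, tau_i=400.0, g_e=1.0, g_i=0.6):
    # Excitation recovers faster than inhibition (tau_e < tau_i), so the
    # E/I balance is transiently tilted toward excitation after adaptation.
    return g_e * recovered(t, tau_e) - g_i * recovered(t, tau_i)

baseline = 1.0 - 0.6  # net drive with both conductances fully recovered
peak = max(net_drive(t) for t in range(1, 2001))
print(peak > baseline)  # a transient facilitation window exists
```

The net drive overshoots its fully recovered baseline at intermediate delays, which is the window in which a test stimulus would be facilitated.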
207
Speed-invariant encoding of looming object distance requires power law spike rate adaptation. Proc Natl Acad Sci U S A 2013; 110:13624-9. [PMID: 23898185] [DOI: 10.1073/pnas.1306428110]
Abstract
Neural representations of a moving object's distance and approach speed are essential for determining appropriate orienting responses, such as those observed in the localization behaviors of the weakly electric fish, Apteronotus leptorhynchus. We demonstrate that a power law form of spike rate adaptation transforms an electroreceptor afferent's response to "looming" object motion, effectively parsing information about distance and approach speed into distinct measures of the firing rate. Neurons with dynamics characterized by fixed time scales are shown to confound estimates of object distance and speed. Conversely, power law adaptation modifies an electroreceptor afferent's response according to the time scales present in the stimulus, generating a rate code for looming object distance that is invariant to speed and acceleration. Consequently, estimates of both object distance and approach speed can be uniquely determined from an electroreceptor afferent's firing rate, a multiplexed neural code operating over the extended time scales associated with behaviorally relevant stimuli.
208
209
Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci 2013; 16:942-8. [PMID: 23749146] [DOI: 10.1038/nn.3431]
Abstract
Spike-frequency adaptation (SFA) is widespread in the CNS, but its function remains unclear. In neocortical pyramidal neurons, adaptation manifests itself by an increase in the firing threshold and by adaptation currents triggered after each spike. Combining electrophysiological recordings in mice with modeling, we found that these adaptation processes lasted for more than 20 s and decayed over multiple timescales according to a power law. The power-law decay associated with adaptation mirrored and canceled the temporal correlations of input current received in vivo at the somata of layer 2/3 somatosensory pyramidal neurons. These findings suggest that, in the cortex, SFA causes temporal decorrelation of output spikes (temporal whitening), an energy-efficient coding procedure that, at high signal-to-noise ratio, improves the information transfer.
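Power-law adaptation spanning many timescales is commonly approximated by a bank of exponential processes with log-spaced time constants. The sketch below verifies that such a sum reproduces t^(-beta), discretizing the identity t^(-beta) = (1/Gamma(beta)) * integral of lam^(beta-1) * exp(-lam*t) over lam; the exponent and rate grid are illustrative, not fitted adaptation parameters.

```python
import numpy as np
from math import gamma

beta = 0.8                       # illustrative power-law exponent
lam = np.logspace(-4, 3, 120)    # log-spaced rates (inverse time constants)
dlog = np.log(lam[1] / lam[0])   # uniform spacing in log(rate)

def power_law_approx(t):
    # Substituting lam = e^u turns the integral into sum(lam**beta * exp(-lam*t)) * du,
    # i.e. a weighted bank of exponential relaxations.
    return np.sum(lam ** beta * np.exp(-lam * t)) * dlog / gamma(beta)

t = np.linspace(0.1, 10.0, 50)
approx = np.array([power_law_approx(ti) for ti in t])
rel_err = np.max(np.abs(approx - t ** -beta) / t ** -beta)
print(f"max relative error over two decades: {rel_err:.4f}")
```

This is why a handful of biophysical adaptation processes with widely spread time constants can behave, over the observed range, like the single power law reported here.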
210
Abstract
Cognitive neuroscience boils down to describing the ways in which cognitive function results from brain activity. In turn, brain activity shows complex fluctuations, with structure at many spatio-temporal scales. Exactly how cognitive function inherits the physical dimensions of neural activity, though, is highly non-trivial, and so are generally the corresponding dimensions of cognitive phenomena. As for any physical phenomenon, when studying cognitive function, the first conceptual step should be that of establishing its dimensions. Here, we provide a systematic presentation of the temporal aspects of task-related brain activity, from the smallest scale of the brain imaging technique's resolution, to the observation time of a given experiment, through the characteristic time scales of the process under study. We first review some standard assumptions on the temporal scales of cognitive function. In spite of their general use, these assumptions hold true to a high degree of approximation for many cognitive (viz. fast perceptual) processes, but have their limitations for other ones (e.g., thinking or reasoning). We define in a rigorous way the temporal quantifiers of cognition at all scales, and illustrate how they qualitatively vary as a function of the properties of the cognitive process under study. We propose that each phenomenon should be approached with its own set of theoretical, methodological and analytical tools. In particular, we show that when treating cognitive processes such as thinking or reasoning, complex properties of ongoing brain activity, which can be drastically simplified when considering fast (e.g., perceptual) processes, start playing a major role, and not only characterize the temporal properties of task-related brain activity, but also determine the conditions for proper observation of the phenomena. Finally, some implications on the design of experiments, data analyses, and the choice of recording parameters are discussed.
Affiliation(s)
- David Papo
- Center for Biomedical Technology, Universidad Politécnica de Madrid, Madrid, Spain
211
Vosika ZB, Lazovic GM, Misevic GN, Simic-Krstic JB. Fractional calculus model of electrical impedance applied to human skin. PLoS One 2013; 8:e59483. [PMID: 23577065] [PMCID: PMC3618342] [DOI: 10.1371/journal.pone.0059483]
Abstract
Fractional calculus is a mathematical approach dealing with derivatives and integrals of arbitrary and complex orders. It therefore adds a new dimension for understanding and describing the basic nature and behavior of complex systems in an improved way. Here we use fractional calculus for modeling the electrical properties of biological systems. We derived a new class of generalized models for electrical impedance and applied them to human skin by fitting experimental data. The primary model introduces new generalizations of: 1) the Weyl fractional derivative operator, 2) the Cole equation, and 3) the constant phase element (CPE). These generalizations are described by a novel equation that introduces a parameter related to remnant memory and corrects four essential parameters. We further generalized the single generalized element by introducing a specific partial sum of a Maclaurin series determined by these parameters. We defined the individual primary model elements and their serial combination models by the appropriate equations and electrical schemes. The Cole equation is a special case of our generalized class of models. Previous bioimpedance data analyses of living systems using the basic Cole and serial Cole models show significant imprecisions. Our new class of models considerably improves the quality of fitting, evaluated by mean square errors, for bioimpedance data obtained from human skin. Our models, with the new parameters appearing in the specific partial sum of the Maclaurin series, also extend the representation, understanding, and description of the electrical properties of complex systems in terms of remnant memory effects.
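For reference, the classic Cole impedance, the special case generalized by this paper, can be written down and evaluated directly. Parameter values below are illustrative, not skin-fit values from the study.

```python
import numpy as np

def cole_impedance(omega, r0, rinf, tau, alpha):
    # Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)**alpha); alpha = 1 recovers
    # an ordinary single-time-constant (Debye/RC) relaxation, while
    # 0 < alpha < 1 gives the depressed-arc behavior typical of tissue.
    return rinf + (r0 - rinf) / (1.0 + (1j * omega * tau) ** alpha)

w = np.logspace(-2, 6, 200)   # rad/s
z = cole_impedance(w, r0=1e5, rinf=1e3, tau=1e-3, alpha=0.8)
print(abs(z[0]), abs(z[-1]))  # ~R0 at low frequency, approaching Rinf at high
```

Plotting -Im(Z) against Re(Z) over this sweep traces the familiar depressed circular arc whose depression angle is set by alpha.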
Affiliation(s)
- Zoran B. Vosika
- Department of Biomedical Engineering, Faculty of Mechanical Engineering at University of Belgrade, Belgrade, Serbia
- Goran M. Lazovic
- Department of Mathematics, Faculty of Mechanical Engineering at University of Belgrade, Belgrade, Serbia
- Jovana B. Simic-Krstic
- Department of Biomedical Engineering, Faculty of Mechanical Engineering at University of Belgrade, Belgrade, Serbia
212
Papo D. Why should cognitive neuroscientists study the brain's resting state? Front Hum Neurosci 2013; 7:45. [PMID: 23431277] [PMCID: PMC3576622] [DOI: 10.3389/fnhum.2013.00045]
Affiliation(s)
- David Papo
- Center for Biomedical Technology, Universidad Politécnica de Madrid, Madrid, Spain
213
Yu J, Hu C, Jiang H. α-stability and α-synchronization for fractional-order neural networks. Neural Netw 2012; 35:82-7. [DOI: 10.1016/j.neunet.2012.07.009]
214
Nonlinear dynamics and chaos in fractional-order neural networks. Neural Netw 2012; 32:245-56. [DOI: 10.1016/j.neunet.2012.02.030]
215
Parallel coding of first- and second-order stimulus attributes by midbrain electrosensory neurons. J Neurosci 2012; 32:5510-24. [PMID: 22514313] [DOI: 10.1523/jneurosci.0478-12.2012]
Abstract
Natural stimuli often have time-varying first-order (i.e., mean) and second-order (i.e., variance) attributes that each carry critical information for perception and can vary independently over orders of magnitude. Experiments have shown that sensory systems continuously adapt their responses based on changes in each of these attributes. This adaptation creates ambiguity in the neural code as multiple stimuli may elicit the same neural response. While parallel processing of first- and second-order attributes by separate neural pathways is sufficient to remove this ambiguity, the existence of such pathways and the neural circuits that mediate their emergence have not been uncovered to date. We recorded the responses of midbrain electrosensory neurons in the weakly electric fish Apteronotus leptorhynchus to stimuli with first- and second-order attributes that varied independently in time. We found three distinct groups of midbrain neurons: the first group responded to both first- and second-order attributes, the second group responded selectively to first-order attributes, and the last group responded selectively to second-order attributes. In contrast, all afferent hindbrain neurons responded to both first- and second-order attributes. Using computational analyses, we show how inputs from a heterogeneous population of ON- and OFF-type afferent neurons are combined to give rise to response selectivity to either first- or second-order stimulus attributes in midbrain neurons. Our study thus uncovers, for the first time, generic and widely applicable mechanisms by which parallel processing of first- and second-order stimulus attributes emerges in the brain.
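How ON- and OFF-type responses might be combined to expose a second-order attribute can be sketched with half-wave rectification: summing the two pathways full-wave rectifies the signal, and low-pass filtering then recovers the envelope. This is a toy signal-processing illustration, not the paper's circuit model; carrier and envelope frequencies are made up.

```python
import numpy as np

t = np.arange(2000) / 2000.0                       # 1 s at 2 kHz
carrier = np.sin(2 * np.pi * 40 * t)               # fast first-order signal
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)   # slow second-order attribute
stimulus = envelope * carrier

# ON cells respond to increases, OFF cells to decreases (half-wave rectification).
on_response = np.maximum(stimulus, 0.0)
off_response = np.maximum(-stimulus, 0.0)

# Summing the pathways full-wave rectifies the signal; a moving average
# (low-pass filter) then recovers the envelope up to the factor 2/pi that
# a rectified sine averages to, which we divide back out.
rectified = on_response + off_response
kernel = np.ones(100) / 100.0                      # two carrier periods long
estimate = np.convolve(rectified, kernel, mode="same") * (np.pi / 2)
```

The estimate tracks the true envelope away from the edges of the trace, showing why a heterogeneous ON/OFF afferent population plus a nonlinearity is sufficient raw material for an envelope-selective downstream code.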
216
Rathour RK, Narayanan R. Influence fields: a quantitative framework for representation and analysis of active dendrites. J Neurophysiol 2012; 107:2313-34. [DOI: 10.1152/jn.00846.2011]
Abstract
Neuronal dendrites express numerous voltage-gated ion channels (VGICs), typically with spatial gradients in their densities and properties. Dendritic VGICs, their gradients, and their plasticity endow neurons with information processing capabilities that are higher than those of neurons with passive dendrites. Despite this, frameworks that incorporate dendritic VGICs and their plasticity into neurophysiological and learning theory models have been far and few. Here, we develop a generalized quantitative framework to analyze the extent of influence of a spatially localized VGIC conductance on different physiological properties along the entire stretch of a neuron. Employing this framework, we show that the extent of influence of a VGIC conductance is largely independent of the conductance magnitude but is heavily dependent on the specific physiological property and background conductances. Morphologically, our analyses demonstrate that the influences of different VGIC conductances located on an oblique dendrite are confined within that oblique dendrite, thus providing further credence to the postulate that dendritic branches act as independent computational units. Furthermore, distinguishing between active and passive propagation of signals within a neuron, we demonstrate that the influence of a VGIC conductance is spatially confined only when propagation is active. Finally, we reconstruct functional gradients from VGIC conductance gradients using influence fields and demonstrate that the cumulative contribution of VGIC conductances in adjacent compartments plays a critical role in determining physiological properties at a given location. We suggest that our framework provides a quantitative basis for unraveling the roles of dendritic VGICs and their plasticity in neural coding, learning, and homeostasis.
217
Lankheet MJM, Klink PC, Borghuis BG, Noest AJ. Spike-interval triggered averaging reveals a quasi-periodic spiking alternative for stochastic resonance in catfish electroreceptors. PLoS One 2012; 7:e32786. [PMID: 22403709] [PMCID: PMC3293861] [DOI: 10.1371/journal.pone.0032786]
Abstract
Catfish detect and identify invisible prey by sensing their ultra-weak electric fields with electroreceptors. Any neuron that deals with small-amplitude input has to overcome sensitivity limitations arising from inherent threshold non-linearities in spike-generation mechanisms. Many sensory cells solve this issue with stochastic resonance, in which a moderate amount of intrinsic noise causes irregular spontaneous spiking activity with a probability that is modulated by the input signal. Here we show that catfish electroreceptors have adopted a fundamentally different strategy. Using a reverse correlation technique in which we take spike interval durations into account, we show that the electroreceptors generate a supra-threshold bias current that results in quasi-periodically produced spikes. In this regime stimuli modulate the interval between successive spikes rather than the instantaneous probability for a spike. This alternative for stochastic resonance combines threshold-free sensitivity for weak stimuli with similar sensitivity for excitations and inhibitions based on single interspike intervals.
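The quasi-periodic regime can be caricatured by a perfect integrator driven by a suprathreshold bias: the stimulus modulates the interval between spikes rather than the probability of spiking, with symmetric sensitivity to weak excitation and inhibition. This is a toy pacemaker, not the catfish receptor model, and all parameters are illustrative.

```python
def interspike_interval(bias, stimulus, threshold=1.0):
    # A perfect integrator charging at rate (bias + stimulus) fires each time
    # it reaches threshold, so the interspike interval is
    # threshold / (bias + stimulus): weak inputs of either sign shift spike
    # timing smoothly, with no threshold nonlinearity to overcome.
    return threshold / (bias + stimulus)

base = interspike_interval(bias=5.0, stimulus=0.0)
excited = interspike_interval(bias=5.0, stimulus=0.1)
inhibited = interspike_interval(bias=5.0, stimulus=-0.1)
print(base - excited, inhibited - base)  # both positive and nearly equal
```

For stimuli much weaker than the bias the interval shift is approximately linear in the stimulus, which is the threshold-free sensitivity the abstract contrasts with stochastic resonance.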
218
Goychuk I. Viscoelastic subdiffusion: generalized Langevin equation approach. Advances in Chemical Physics 2012. [DOI: 10.1002/9781118197714.ch5]
219
Bin G. Multiple solutions for a class of fractional boundary value problems. Abstract and Applied Analysis 2012; 2012:1-16. [DOI: 10.1155/2012/468980]
Affiliation(s)
- Ge Bin
- Department of Mathematics, Harbin Engineering University, Harbin 150001, China
220
Cardanobile S, Rotter S. Emergent properties of interacting populations of spiking neurons. Front Comput Neurosci 2011; 5:59. [PMID: 22207844] [PMCID: PMC3245521] [DOI: 10.3389/fncom.2011.00059]
Abstract
Dynamic neuronal networks are a key paradigm of increasing importance in brain research, concerned with the functional analysis of biological neuronal networks and, at the same time, with the synthesis of artificial brain-like systems. In this context, neuronal network models serve as mathematical tools to understand the function of brains, but they might as well develop into future tools for enhancing certain functions of our nervous system. Here, we present and discuss our recent achievements in developing multiplicative point processes into a viable mathematical framework for spiking network modeling. The perspective is that the dynamic behavior of these neuronal networks is faithfully reflected by a set of non-linear rate equations, describing all interactions on the population level. These equations are similar in structure to Lotka-Volterra equations, well known by their use in modeling predator-prey relations in population biology, but abundant applications to economic theory have also been described. We present a number of biologically relevant examples for spiking network function, which can be studied with the help of the aforementioned correspondence between spike trains and specific systems of non-linear coupled ordinary differential equations. We claim that, enabled by the use of multiplicative point processes, we can make essential contributions to a more thorough understanding of the dynamical properties of interacting neuronal populations.
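The correspondence with Lotka-Volterra-type equations can be sketched by integrating dr_i/dt = r_i(I_i + sum_j W_ij r_j) for a small excitatory-inhibitory pair. The weights and inputs below are illustrative, not fitted to any network in the paper.

```python
import numpy as np

def simulate_rates(weights, inputs, r0, dt=1e-3, steps=40_000):
    # Forward-Euler integration of Lotka-Volterra-type rate equations:
    #     dr_i/dt = r_i * (I_i + sum_j W_ij r_j)
    # The multiplicative form keeps rates nonnegative for small dt.
    r = np.array(r0, dtype=float)
    for _ in range(steps):
        r += dt * r * (inputs + weights @ r)
    return r

# One excitatory and one inhibitory population (illustrative parameters).
W = np.array([[0.0, -2.0],    # E is suppressed by I
              [1.0, -1.0]])   # I is driven by E and limits itself
I_ext = np.array([1.0, -0.5])
rates = simulate_rates(W, I_ext, r0=[0.5, 0.5])
print(rates)  # spirals into the positive fixed point (1.0, 0.5)
```

Setting the derivatives to zero gives the fixed point analytically (r_I = 1/2 from the first equation, then r_E = 1 from the second), and the predator-like self-limitation of the inhibitory population makes it stable.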
221
Tell me something interesting: context dependent adaptation in somatosensory cortex. J Neurosci Methods 2011; 210:35-48. [PMID: 22186665] [DOI: 10.1016/j.jneumeth.2011.12.003]
Abstract
It is widely accepted that, through a process of adaptation, cells adjust their sensitivity in accordance with prevailing stimulus conditions. However, in two recent studies exploring adaptation in the rodent inferior colliculus and somatosensory cortex, neurons did not adapt towards the global mean, but rather became most sensitive to inputs located towards the edge of the stimulus distribution, with greater intensity than the mean. We re-examined electrophysiological data from the somatosensory study with the purpose of exploring the underlying encoding strategies. We found that neural gain tended to decrease as stimulus variance increased. Following adaptation to changes in the global mean, neuronal output was scaled such that the relationship between firing rate and local, rather than global, differences in stimulus intensity was maintained. The majority of cells responded to large, positive deviations in stimulus amplitude, with a small number responding to both positive and negative changes in stimulus intensity. Adaptation to the global mean was replicated in a model neuron by incorporating both spike-rate adaptation and tonic inhibition, which increased in proportion to the stimulus mean. Adaptation to stimulus variance was replicated by approximating the output of a population of neurons adapted to the global mean and using it to drive a layer of recurrently connected depressing synapses. Within the barrel cortex, adaptation ensures that neurons are able to encode both overall levels of variance and large deviations in the input. This is achieved through a combination of gain modulation and a shift in sensitivity towards intensity levels greater than the mean.
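The subtractive adaptation mechanism described here, tonic inhibition growing with the stimulus mean, can be sketched in a few lines. The rectified-linear response and parameters are illustrative, not the paper's model neuron.

```python
import numpy as np

rng = np.random.default_rng(1)

def adapted_response(stimulus, gain=1.0):
    # Tonic inhibition grows in proportion to the stimulus mean, so the
    # output tracks local deviations from that mean rather than absolute
    # intensity; rectified because firing rates cannot be negative.
    tonic_inhibition = np.mean(stimulus)
    return np.maximum(gain * (stimulus - tonic_inhibition), 0.0)

deviations = rng.normal(0.0, 1.0, 1000)
low_mean = adapted_response(2.0 + deviations)
high_mean = adapted_response(10.0 + deviations)
print(np.allclose(low_mean, high_mean))  # the deviation code is preserved
```

Shifting the global mean leaves the response to local deviations unchanged, which is the mean-adaptation behavior the model neuron in the paper reproduces; variance adaptation requires the additional synaptic-depression stage.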
222
Linaro D, Storace M, Mattia M. Inferring network dynamics and neuron properties from population recordings. Front Comput Neurosci 2011; 5:43. [PMID: 22016731] [PMCID: PMC3191764] [DOI: 10.3389/fncom.2011.00043]
Abstract
Understanding the computational capabilities of the nervous system means to “identify” its emergent multiscale dynamics. For this purpose, we propose a novel model-driven identification procedure and apply it to sparsely connected populations of excitatory integrate-and-fire neurons with spike frequency adaptation (SFA). Our method does not characterize the system from its microscopic elements in a bottom-up fashion, and does not resort to any linearization. We investigate networks as a whole, inferring their properties from the response dynamics of the instantaneous discharge rate to brief and aspecific supra-threshold stimulations. While several available methods assume generic expressions for the system as a black box, we adopt a mean-field theory for the evolution of the network transparently parameterized by identified elements (such as dynamic timescales), which are in turn non-trivially related to single-neuron properties. In particular, from the elicited transient responses, the input–output gain function of the neurons in the network is extracted and direct links to the microscopic level are made available: indeed, we show how to extract the decay time constant of the SFA, the absolute refractory period and the average synaptic efficacy. In addition and contrary to previous attempts, our method captures the system dynamics across bifurcations separating qualitatively different dynamical regimes. The robustness and the generality of the methodology is tested on controlled simulations, reporting a good agreement between theoretically expected and identified values. The assumptions behind the underlying theoretical framework make the method readily applicable to biological preparations like cultured neuron networks and in vitro brain slices.
Affiliation(s)
- Daniele Linaro
- Department of Biophysical and Electronic Engineering, University of Genoa, Genoa, Italy
223
Abstract
Metaphors of Computation and Information have tended to divert attention from the intrinsic modes of neural system function, uncontaminated by the observer's role in the collection and interpretation of experimental data. Recognizing the self-referential mode of function, and the propensity for self-organization to critical states, requires a fundamentally new orientation, based on Complex System Dynamics as non-ergodic, non-stationary processes with inverse-power-law statistical distributions. Accordingly, local cooperative processes, intrinsic to neural structures and of fractal nature, call for applying Fractional Calculus and models of Random Walks with long-term memory in Theoretical Neuroscience studies.
Affiliation(s)
- Gerhard Werner
- Department of Biomedical Engineering, University of Texas at Austin, Austin, TX, USA
224
Naud R, Gerhard F, Mensi S, Gerstner W. Improved similarity measures for small sets of spike trains. Neural Comput 2011; 23:3016-69. [PMID: 21919785] [DOI: 10.1162/neco_a_00208]
Abstract
Multiple measures have been developed to quantify the similarity between two spike trains. These measures have been used for quantifying the mismatch between neuron models and experiments, as well as for classifying neuronal responses in neuroprosthetic devices and electrophysiological experiments. Frequently, only a few spike trains are available in each class. We derive analytical expressions for the small-sample bias present when comparing estimators of the time-dependent firing intensity. We then exploit analogies between the comparison of firing intensities and previously used spike train metrics, and show that improved spike train measures can be successfully used for fitting neuron models to experimental data, for comparing spike trains, and for classifying spike train data. In classification tasks, the improved similarity measures can increase the recovered information. We demonstrate that when similarity measures are used for fitting mathematical models, all previous methods systematically underestimate the noise. Finally, we show a striking implication of this deterministic bias by reevaluating the results of the single-neuron prediction challenge.
Affiliation(s)
- Richard Naud
- Brain Mind Institute and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne, Switzerland.
225
Druckmann S, Berger TK, Schürmann F, Hill S, Markram H, Segev I. Effective stimuli for constructing reliable neuron models. PLoS Comput Biol 2011; 7:e1002133. [PMID: 21876663 PMCID: PMC3158041 DOI: 10.1371/journal.pcbi.1002133] [Citation(s) in RCA: 36] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/28/2010] [Accepted: 06/08/2011] [Indexed: 11/19/2022] Open
Abstract
The rich dynamical nature of neurons poses major conceptual and technical challenges for unraveling their nonlinear membrane properties. Traditionally, various current waveforms have been injected at the soma to probe neuron dynamics, but the rationale for selecting specific stimuli has never been rigorously justified. The present experimental and theoretical study proposes a novel framework, inspired by learning theory, for objectively selecting the stimuli that best unravel the neuron's dynamics. The efficacy of stimuli is assessed in terms of their ability to constrain the parameter space of biophysically detailed conductance-based models that faithfully replicate the neuron's dynamics, as attested by their ability to generalize well to the neuron's response to novel experimental stimuli. We used this framework to evaluate a variety of stimuli in cortical neurons of different types, ages and animals. Despite their simplicity, a set of stimuli consisting of step and ramp current pulses outperforms synaptic-like noisy stimuli in revealing the dynamics of these neurons. The general framework that we propose paves the way for defining, evaluating and standardizing effective electrical probing of neurons, and will thus lay the foundation for a much deeper understanding of the electrical nature of these highly sophisticated, non-linear devices and of the neuronal networks that they compose.
Affiliation(s)
- Shaul Druckmann
- Interdisciplinary Center for Neural Computation, Hebrew University of Jerusalem, Jerusalem, Israel
- Edmond and Lily Safra Center for Brain Sciences and Department of Neurobiology, Institute of Life Sciences, Hebrew University of Jerusalem, Jerusalem, Israel
- Thomas K. Berger
- Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Felix Schürmann
- Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Sean Hill
- Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Henry Markram
- Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
- Idan Segev
- Interdisciplinary Center for Neural Computation, Hebrew University of Jerusalem, Jerusalem, Israel
- Edmond and Lily Safra Center for Brain Sciences and Department of Neurobiology, Institute of Life Sciences, Hebrew University of Jerusalem, Jerusalem, Israel
226
Rowekamp RJ, Sharpee TO. Analyzing multicomponent receptive fields from neural responses to natural stimuli. Network (Bristol, England) 2011; 22:45-73. [PMID: 21780916 PMCID: PMC3251001 DOI: 10.3109/0954898x.2011.566303] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 11/18/2010] [Accepted: 02/22/2011] [Indexed: 05/31/2023]
Abstract
The challenge of building increasingly better models of neural responses to natural stimuli is to accurately estimate the multiple stimulus features that may jointly affect the neural spike probability. The selectivity for combinations of features is thought to be crucial for achieving classical properties of neural responses such as contrast invariance. The joint search for these multiple stimulus features is difficult because estimating spike probability as a multidimensional function of stimulus projections onto candidate relevant dimensions is subject to the curse of dimensionality. An attractive alternative is to search for relevant dimensions sequentially, as in projection pursuit regression. Here we demonstrate using analytic arguments and simulations of model cells that different types of sequential search strategies exhibit systematic biases when used with natural stimuli. Simulations show that joint optimization is feasible for up to three dimensions with current algorithms. When applied to the responses of V1 neurons to natural scenes, models based on three jointly optimized dimensions had better predictive power in a majority of cases compared to dimensions optimized sequentially, with different sequential methods yielding comparable results. Thus, although the curse of dimensionality remains, at least several relevant dimensions can be estimated by joint information maximization.
Affiliation(s)
- Ryan J Rowekamp
- Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037, USA
227
Farkhooi F, Muller E, Nawrot MP. Adaptation reduces variability of the neuronal population code. Phys Rev E Stat Nonlin Soft Matter Phys 2011; 83:050905. [PMID: 21728481 DOI: 10.1103/physreve.83.050905] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/16/2010] [Revised: 03/22/2011] [Indexed: 05/31/2023]
Abstract
Sequences of events in noise-driven excitable systems with slow variables often show serial correlations among their intervals of events. Here, we employ a master equation for generalized non-renewal processes to calculate the interval and count statistics of superimposed processes governed by a slow adaptation variable. For an ensemble of neurons with spike-frequency adaptation, this results in the regularization of the population activity and an enhanced postsynaptic signal decoding. We confirm our theoretical results in a population of cortical neurons recorded in vivo.
Affiliation(s)
- Farzad Farkhooi
- Neuroinformatics and Theoretical Neuroscience, Freie Universität Berlin and BCCN-Berlin, Berlin, Germany.
228
Abstract
Although neuronal excitability is well understood and accurately modeled over timescales of up to hundreds of milliseconds, it is currently unclear whether extrapolating from this limited duration to longer, behaviorally relevant timescales is appropriate. Here we used an extracellular recording and stimulation paradigm that extends the duration of single-neuron electrophysiological experiments, exposing the dynamics of excitability in individual cultured cortical neurons over timescales hitherto inaccessible. We show that the long-term neuronal excitability dynamics are unstable and dominated by critical fluctuations, intermittency, scale-invariant rate statistics, and long memory. These intrinsic dynamics bound the firing rate over extended timescales, in contrast to the observed short-term neuronal response to stimulation onset. Furthermore, the activity of a neuron over extended timescales shows transitions between quasi-stable modes, each characterized by a typical response pattern. As in the case of rate statistics, the short-term onset response pattern that often serves to functionally define a given neuron is not indicative of its long-term ongoing response. These observations question the validity of describing neuronal excitability based on temporally restricted electrophysiological data, calling for in-depth exploration of activity over wider temporal scales. Such extended experiments will probably entail a different kind of neuronal model, accounting for the unbounded range, from milliseconds up.
229
Bohte SM. Error-Backpropagation in Networks of Fractionally Predictive Spiking Neurons. Lecture Notes in Computer Science 2011. [DOI: 10.1007/978-3-642-21735-7_8] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/03/2022]
230
Engelmann J, Gertz S, Goulet J, Schuh A, von der Emde G. Coding of Stimuli by Ampullary Afferents in Gnathonemus petersii. J Neurophysiol 2010; 104:1955-68. [DOI: 10.1152/jn.00503.2009] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
Weakly electric fish use electroreception for both active and passive electrolocation and for electrocommunication. While both active and passive electrolocation systems are prominent in weakly electric Mormyriform fishes, knowledge of their passive electrolocation ability is still scarce. To better estimate the contribution of passive electric sensing to the orientation toward electric stimuli in weakly electric fishes, we investigated frequency tuning by applying classical input–output characterization and stimulus-reconstruction methods to reveal the encoding capabilities of ampullary receptor afferents. Ampullary receptor afferents were most sensitive (threshold: 40 μV/cm) at low frequencies (<10 Hz) and appear to be tuned to a mix of amplitude and slope of the input signals. The low-frequency tuning was corroborated by behavioral experiments, but behavioral thresholds were one order of magnitude higher. Integrating simultaneously recorded afferents of similar frequency tuning strongly enhanced signal-to-noise ratios and increased mutual information rates, but did not increase the range of frequencies detectable by the system. Theoretically, the neuronal integration of input from receptors experiencing opposite polarities of a stimulus (left and right side of the fish) was shown to enhance encoding of such stimuli, including an increase of bandwidth. Covariance and coherence analysis showed that spiking of ampullary afferents is sufficiently explained by the spike-triggered average, i.e., receptors respond to a single linear feature of the stimulus. Our data support the notion of a division of labor between the active and passive electrosensory systems in weakly electric fishes based on frequency tuning. Future experiments will address the role of central convergence of ampullary input, which we expect to lead to higher sensitivity and encoding power of the system.
Affiliation(s)
- J. Engelmann
- University of Bonn, Institute for Zoology, Neuroethology—Sensory Ecology, Bonn, Germany
- University of Bielefeld, Faculty of Biology, Active Sensing, Bielefeld, Germany
- S. Gertz
- University of Bonn, Institute for Zoology, Neuroethology—Sensory Ecology, Bonn, Germany
- J. Goulet
- Physik Department, TU München and Bernstein Center for Computational Neuroscience, Garching, Germany
- Radboud University Nijmegen, Donders Institute for Brain Cognition and Behaviour, Nijmegen, The Netherlands
- A. Schuh
- University of Bonn, Institute for Zoology, Neuroethology—Sensory Ecology, Bonn, Germany
- G. von der Emde
- University of Bonn, Institute for Zoology, Neuroethology—Sensory Ecology, Bonn, Germany
231
Acoustic experience but not attention modifies neural population phase expressed in human primary auditory cortex. Hear Res 2010; 269:81-94. [DOI: 10.1016/j.heares.2010.07.001] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/08/2010] [Revised: 06/07/2010] [Accepted: 07/05/2010] [Indexed: 11/21/2022]
232
Werner G. Fractals in the nervous system: conceptual implications for theoretical neuroscience. Front Physiol 2010; 1:15. [PMID: 21423358 PMCID: PMC3059969 DOI: 10.3389/fphys.2010.00015] [Citation(s) in RCA: 91] [Impact Index Per Article: 6.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/18/2010] [Accepted: 06/05/2010] [Indexed: 11/15/2022] Open
Abstract
This essay is presented with two principal objectives in mind: first, to document the prevalence of fractals at all levels of the nervous system, giving credence to the notion of their functional relevance; and second, to draw attention to the as yet still unresolved issues of the detailed relationships among power-law scaling, self-similarity, and self-organized criticality. As regards criticality, I will document that it has become a pivotal reference point in Neurodynamics. Furthermore, I will emphasize the not yet fully appreciated significance of allometric control processes. For dynamic fractals, I will assemble reasons for attributing to them the capacity to adapt task execution to contextual changes across a range of scales. The final Section consists of general reflections on the implications of the reviewed data, and identifies what appear to be issues of fundamental importance for future research in the rapidly evolving topic of this review.
Affiliation(s)
- Gerhard Werner
- Department of Biomedical Engineering, University of Texas at Austin, Austin, TX, USA.
233
Abstract
Temporal derivatives are computed by a wide variety of neural circuits, but the problem of performing this computation accurately has received little theoretical study. Here we systematically compare the performance of diverse networks that calculate derivatives using cell-intrinsic adaptation and synaptic depression dynamics, feedforward network dynamics, and recurrent network dynamics. Examples of each type of network are compared by quantifying the errors they introduce into the calculation and their rejection of high-frequency input noise. This comparison is based on both analytical methods and numerical simulations with spiking leaky-integrate-and-fire (LIF) neurons. Both adapting and feedforward-network circuits provide good performance for signals with frequency bands that are well matched to the time constants of postsynaptic current decay and adaptation, respectively. The synaptic depression circuit performs similarly to the adaptation circuit, although strictly speaking, precisely linear differentiation based on synaptic depression is not possible, because depression scales synaptic weights multiplicatively. Feedback circuits introduce greater errors than functionally equivalent feedforward circuits, but they have the useful property that their dynamics are determined by feedback strength. For this reason, these circuits are better suited for calculating the derivatives of signals that evolve on timescales outside the range of membrane dynamics and, possibly, for providing the wide range of timescales needed for precise fractional-order differentiation.
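The feedforward differentiation scheme compared in this abstract can be sketched in a few lines (our illustrative reconstruction, not the paper's code; the time constant `tau` and the test signal are invented): subtracting a low-pass-filtered copy of a signal from the signal itself approximates its derivative for frequencies well below 1/(2πτ), which is exactly the band-matching condition the abstract describes.

```python
import numpy as np

def lowpass(x, dt, tau):
    """One-pole low-pass filter (Euler discretization)."""
    y = np.zeros_like(x)
    a = dt / tau
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def feedforward_derivative(x, dt, tau):
    """Approximate dx/dt by subtracting a low-pass-filtered copy of x.
    Accurate for signal frequencies well below 1 / (2*pi*tau)."""
    return (x - lowpass(x, dt, tau)) / tau

dt, tau = 1e-4, 5e-3
t = np.arange(0.0, 1.0, dt)
f_sig = 2.0                                          # 2 Hz test signal
x = np.sin(2 * np.pi * f_sig * t)
dx_est = feedforward_derivative(x, dt, tau)
dx_true = 2 * np.pi * f_sig * np.cos(2 * np.pi * f_sig * t)
# discard the initial filter transient before measuring the error
err = np.max(np.abs(dx_est[500:] - dx_true[500:]))
```

As the abstract notes for feedforward circuits, shrinking `tau` widens the usable band but amplifies high-frequency input noise, so `tau` trades accuracy against noise rejection.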
Affiliation(s)
- Bryan P Tripp
- Centre for Theoretical Neuroscience, University of Waterloo, Ontario, Canada.
234
Ganmor E, Katz Y, Lampl I. Intensity-dependent adaptation of cortical and thalamic neurons is controlled by brainstem circuits of the sensory pathway. Neuron 2010; 66:273-86. [PMID: 20435003 DOI: 10.1016/j.neuron.2010.03.032] [Citation(s) in RCA: 45] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/24/2010] [Indexed: 11/17/2022]
Abstract
Current views of sensory adaptation in the rat somatosensory system suggest that it results mainly from short-term synaptic depression. Experimental and theoretical studies predict that increasing the intensity of sensory stimulation, followed by an increase in firing probability at early sensory stages, is expected to attenuate the response at later stages disproportionately more than weaker stimuli, due to greater depletion of synaptic resources and the relatively slow recovery process. This may lead to coding ambiguity of stimulus intensity during adaptation. In contrast, we found that increasing the intensity of repetitive whisker stimulation entails less adaptation in cortical neurons. In a series of recordings, from the trigeminal ganglion to the thalamus, we pinpointed the source of the unexpected pattern of adaptation to the brainstem trigeminal complex. We suggest that low-level sensory processing counterbalances later effects of short-term synaptic depression by increasing the throughput of high-intensity sensory inputs.
Affiliation(s)
- Elad Ganmor
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
235
Multiple timescale encoding of slowly varying whisker stimulus envelope in cortical and thalamic neurons in vivo. J Neurosci 2010; 30:5071-7. [PMID: 20371827 DOI: 10.1523/jneurosci.2193-09.2010] [Citation(s) in RCA: 63] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Adaptive processes over many timescales endow neurons with sensitivity to stimulus changes over a similarly wide range of scales. Although spike timing of single neurons can precisely signal rapid fluctuations in their inputs, the mean firing rate can convey information about slower-varying properties of the stimulus. Here, we investigate the firing rate response to a slowly varying envelope of whisker motion in two processing stages of the rat vibrissa pathway. The whiskers of anesthetized rats were moved through a noise trajectory with an amplitude that was sinusoidally modulated at one of several frequencies. In thalamic neurons, we found that the rate response to the stimulus envelope was also sinusoidal, with an approximately frequency-independent phase advance with respect to the input. Responses in cortex were similar but with a phase shift that was about three times larger, consistent with a larger amount of rate adaptation. These response properties can be described as a linear transformation of the input for which a single parameter quantifies the phase shift as well as the degree of adaptation. These results are reproduced by a model of adapting neurons connected by synapses with short-term plasticity, showing that the observed linear response and phase lead can be built up from a network that includes a sequence of nonlinear adapting elements. Our study elucidates how slowly varying envelope information under passive stimulation is preserved and transformed through the vibrissa processing pathway.
236
Soudry D, Meir R. History-dependent Dynamics in a Generic Model of Ion Channels - an Analytic Study. Front Comput Neurosci 2010; 4. [PMID: 20725633 PMCID: PMC2916672 DOI: 10.3389/fncom.2010.00003] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2009] [Accepted: 03/02/2010] [Indexed: 01/21/2023] Open
Abstract
Recent experiments have demonstrated that the timescale of adaptation of single neurons and ion channel populations to stimuli slows down as the length of stimulation increases; in fact, no upper bound on these timescales seems to exist in such systems. Furthermore, patch-clamp experiments on single ion channels have hinted at the existence of large, mostly unobservable, inactivation state spaces within a single ion channel. This raises the question of the relation between this multitude of inactivation states and the observed behavior. In this work we propose a minimal model for ion channel dynamics which does not assume any specific structure of the inactivation state space. The model is simple enough to render an analytical study possible. This leads to a clear and concise explanation of the experimentally observed exponential history-dependent relaxation in sodium channels in a voltage-clamp setting, and shows that their recovery rate from slow inactivation must be voltage dependent. Furthermore, we predict that history-dependent relaxation cannot be created by overly sparse spiking activity. While the model was created with ion channel populations in mind, its simplicity and generality make it a good starting point for modeling similar effects in other systems, and for scaling up to higher levels such as single neurons, which are also known to exhibit multiple timescales.
Affiliation(s)
- Daniel Soudry
- Department of Electrical Engineering, Technion, Haifa, Israel
237
Panzeri S, Brunel N, Logothetis NK, Kayser C. Sensory neural codes using multiplexed temporal scales. Trends Neurosci 2010; 33:111-20. [PMID: 20045201 DOI: 10.1016/j.tins.2009.12.001] [Citation(s) in RCA: 301] [Impact Index Per Article: 21.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/10/2009] [Revised: 10/28/2009] [Accepted: 12/03/2009] [Indexed: 10/20/2022]
Abstract
Determining how neuronal activity represents sensory information is central for understanding perception. Recent work shows that neural responses at different timescales can encode different stimulus attributes, resulting in a temporal multiplexing of sensory information. Multiplexing increases the encoding capacity of neural responses, enables disambiguation of stimuli that cannot be discriminated at a single response timescale, and makes sensory representations stable to the presence of variability in the sensory world. Thus, as we discuss here, temporal multiplexing could be a key strategy used by the brain to form an information-rich and stable representation of the environment.
Affiliation(s)
- Stefano Panzeri
- Robotics, Brain and Cognitive Sciences Department, Italian Institute of Technology, Via Morego 30, 16163 Genova, Italy.
238
Abstract
Many membrane channels and receptors exhibit an adaptive, or desensitized, response to a strong sustained input stimulus. A key mechanism that underlies this response is the slow, activity-dependent removal of responding molecules to a pool which is unavailable to respond immediately to the input. This mechanism is implemented in different ways in various biological systems and has traditionally been studied separately for each. Here we highlight the common aspects of this principle, shared by many biological systems, and suggest a unifying theoretical framework. We study theoretically a class of models which describes the general mechanism and allows us to distinguish its universal from its system-specific features. We show that under general conditions, regardless of the details of kinetics, molecule availability encodes an average over past activity and feeds back multiplicatively on the system output. The kinetics of recovery from unavailability determines the effective memory kernel inside the feedback branch, giving rise to a variety of system-specific forms of adaptive response (precise or input-dependent, exponential or power-law) as special cases of the same model.
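The shared principle described in this abstract lends itself to a compact numerical sketch (our construction with invented rate constants, not the authors' model): available molecules are depleted in proportion to activity, recover with first-order kinetics, and gate the output multiplicatively.

```python
import numpy as np

def adaptive_response(stim, dt=1e-3, k_deplete=5.0, tau_rec=0.5):
    """Availability A(t) of responding molecules:
        dA/dt = (1 - A) / tau_rec - k_deplete * A * stim(t)
    The output is the input gated multiplicatively by availability."""
    A = np.ones(len(stim))
    out = np.zeros(len(stim))
    for i in range(1, len(stim)):
        dA = (1.0 - A[i - 1]) / tau_rec - k_deplete * A[i - 1] * stim[i - 1]
        A[i] = A[i - 1] + dt * dA
        out[i] = A[i] * stim[i]
    return A, out

# step stimulus: the onset response is large, then adapts as the pool depletes
stim = np.zeros(2000)
stim[500:] = 1.0
A, out = adaptive_response(stim)
```

With first-order recovery as here, adaptation is exponential; the abstract's point is that swapping in a heavier-tailed recovery kernel yields power-law adaptation from the same multiplicative-feedback structure.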
239
Miller KJ, Sorensen LB, Ojemann JG, den Nijs M. Power-law scaling in the brain surface electric potential. PLoS Comput Biol 2009; 5:e1000609. [PMID: 20019800 PMCID: PMC2787015 DOI: 10.1371/journal.pcbi.1000609] [Citation(s) in RCA: 464] [Impact Index Per Article: 30.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2009] [Accepted: 11/12/2009] [Indexed: 11/25/2022] Open
Abstract
Recent studies have identified broadband phenomena in the electric potentials produced by the brain. We report the finding of power-law scaling in these signals using subdural electrocorticographic recordings from the surface of human cortex. The power spectral density (PSD) of the electric potential has the power-law form P(f) ≈ A f^(−χ) from 80 to 500 Hz. This scaling index, χ = 4.0 ± 0.1, is conserved across subjects, cortical area, and local neural activity levels. The shape of the PSD does not change with increases in local cortical activity, but the amplitude, A, increases. We observe a "knee" in the spectra at f0 ≈ 75 Hz, implying the existence of a characteristic timescale τ = (2π f0)^(−1) ≈ 2–4 ms. Below f0, we explore two-power-law forms of the PSD, and demonstrate that there are activity-related fluctuations in the amplitude of a power-law process lying beneath the alpha/beta rhythms. Finally, we illustrate through simulation how small-scale, simplified neuronal models could lead to these power-law observations. This suggests a new paradigm of non-oscillatory, "asynchronous," scale-free changes in cortical potentials, corresponding to changes in mean population-averaged firing rate, to complement the prevalent "synchronous" rhythm-based paradigm.
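The power-law estimate quoted above (χ ≈ 4 above the knee) can be reproduced on synthetic data by a log-log linear fit of the periodogram. This is a sketch under assumed parameters, not the paper's analysis pipeline: only χ = 4 and the rough frequency band come from the abstract; the sampling rate, record length, and fitting band are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthesize a signal with PSD ~ f^(-chi) by shaping white noise in Fourier space
fs, n, chi_true = 1000.0, 2**16, 4.0
freqs = np.fft.rfftfreq(n, 1 / fs)
shape = np.zeros_like(freqs)
shape[1:] = freqs[1:] ** (-chi_true / 2)    # amplitude ~ f^(-chi/2) -> power ~ f^(-chi)
spectrum = shape * (rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size))
x = np.fft.irfft(spectrum, n)

# unnormalized periodogram (normalization does not affect the slope)
psd = np.abs(np.fft.rfft(x)) ** 2

# fit log10(PSD) = log10(A) - chi * log10(f) over an assumed band above the knee
band = (freqs >= 80) & (freqs <= 400)
slope, intercept = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), 1)
chi_est = -slope
```

The recovered `chi_est` is close to 4; fitting the raw periodogram in log-log coordinates shifts the intercept but leaves the slope unbiased, which is all the scaling index needs.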
Affiliation(s)
- Kai J Miller
- Department of Physics, University of Washington, Seattle, Washington, USA.
240
Zilany MSA, Bruce IC, Nelson PC, Carney LH. A phenomenological model of the synapse between the inner hair cell and auditory nerve: long-term adaptation with power-law dynamics. J Acoust Soc Am 2009; 126:2390-412. [PMID: 19894822 PMCID: PMC2787068 DOI: 10.1121/1.3238250] [Citation(s) in RCA: 194] [Impact Index Per Article: 12.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/09/2023]
Abstract
There is growing evidence that the dynamics of biological systems that appear to be exponential over short time courses are in some cases better described over the long-term by power-law dynamics. A model of rate adaptation at the synapse between inner hair cells and auditory-nerve (AN) fibers that includes both exponential and power-law dynamics is presented here. Exponentially adapting components with rapid and short-term time constants, which are mainly responsible for shaping onset responses, are followed by two parallel paths with power-law adaptation that provide slowly and rapidly adapting responses. The slowly adapting power-law component significantly improves predictions of the recovery of the AN response after stimulus offset. The faster power-law adaptation is necessary to account for the "additivity" of rate in response to stimuli with amplitude increments. The proposed model is capable of accurately predicting several sets of AN data, including amplitude-modulation transfer functions, long-term adaptation, forward masking, and adaptation to increments and decrements in the amplitude of an ongoing stimulus.
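To make the exponential-versus-power-law contrast above concrete, here is a toy comparison of the two kernel shapes (illustrative values only; the published model's actual components and parameters are in the paper):

```python
import numpy as np

t = np.linspace(0.0, 10.0, 10001)      # time after stimulus offset, seconds
tau, beta = 0.1, 1.0

k_exp = np.exp(-t / tau)               # exponential recovery kernel
k_pow = (1.0 + t / tau) ** (-beta)     # power-law kernel with matched initial decay rate

# one second after offset the exponential has essentially vanished,
# while the power-law component still carries substantial memory
i1 = np.searchsorted(t, 1.0)
ratio_1s = k_pow[i1] / k_exp[i1]
```

Both kernels start with the same decay rate at t = 0, so the difference only appears over the long term; this is why the slowly adapting power-law path is needed to capture recovery of the auditory-nerve response long after stimulus offset.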
Affiliation(s)
- Muhammad S A Zilany
- Department of Biomedical Engineering, University of Rochester, NY 14642, USA
241
Network-state modulation of power-law frequency-scaling in visual cortical neurons. PLoS Comput Biol 2009; 5:e1000519. [PMID: 19779556 PMCID: PMC2740863 DOI: 10.1371/journal.pcbi.1000519] [Citation(s) in RCA: 59] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2009] [Accepted: 08/25/2009] [Indexed: 11/19/2022] Open
Abstract
Various types of neural-based signals, such as EEG, local field potentials and intracellular synaptic potentials, integrate multiple sources of activity distributed across large assemblies. They have in common a power-law frequency-scaling structure at high frequencies, but it is still unclear whether this scaling property is dominated by intrinsic neuronal properties or by network activity. The latter case is particularly interesting because if frequency-scaling reflects the network state it could be used to characterize the functional impact of the connectivity. In intracellularly recorded neurons of cat primary visual cortex in vivo, the power spectral density of Vm activity displays a power-law structure at high frequencies with a fractional scaling exponent. We show that this exponent is not constant, but depends on the visual statistics used to drive the network. To investigate the determinants of this frequency-scaling, we considered a generic recurrent model of cortex receiving a retinotopically organized external input. Similarly to the in vivo case, our in computo simulations show that the scaling exponent reflects the correlation level imposed in the input. This systematic dependence was also replicated at the single cell level, by controlling independently, in a parametric way, the strength and the temporal decay of the pairwise correlation between presynaptic inputs. This last model was implemented in vitro by imposing the correlation control in artificial presynaptic spike trains through dynamic-clamp techniques. These in vitro manipulations induced a modulation of the scaling exponent, similar to that observed in vivo and predicted in computo. We conclude that the frequency-scaling exponent of the Vm reflects stimulus-driven correlations in the cortical network activity. 
Therefore, we propose that the scaling exponent could be used to read out the “effective” connectivity responsible for the dynamical signature of the population signals measured at different integration levels, from Vm to LFP, EEG and fMRI. Intracellular recording of neocortical neurons provides an opportunity to characterize the statistical signature of the synaptic bombardment to which a cell is submitted. Indeed, the membrane potential displays intense fluctuations which reflect the cumulative activity of thousands of input neurons. In sensory cortical areas, this measure could be used to estimate the correlational structure of the external drive. We show that changes in the statistical properties of network activity, namely the local correlation between neurons, can be detected by analyzing the power spectral density (PSD) of the subthreshold membrane potential. These PSDs can be fitted by a power-law function 1/f^α in the upper temporal frequency range. In vivo recordings in primary visual cortex show that the α exponent varies with the statistics of the sensory input. Most remarkably, the exponent observed in the ongoing activity is indistinguishable from that evoked by natural visual statistics. These results are emulated by models which demonstrate that the exponent α is determined by the local level of correlation imposed on the recurrent network activity. Similar relationships are also reproduced in cortical neurons recorded in vitro with artificial synaptic inputs, by controlling in computo the level of correlation in real time.
242
Wark B, Fairhall A, Rieke F. Timescales of inference in visual adaptation. Neuron 2009; 61:750-61. [PMID: 19285471] [DOI: 10.1016/j.neuron.2009.01.019] [Citation(s) in RCA: 136] [Impact Index Per Article: 9.1] [Received: 03/11/2008] [Revised: 10/23/2008] [Accepted: 01/22/2009] [Indexed: 10/21/2022]
Abstract
Adaptation is a hallmark of sensory function. Adapting optimally requires matching the dynamics of adaptation to those of changes in the stimulus distribution. Here we show that the dynamics of adaptation in the responses of mouse retinal ganglion cells depend on stimulus history. We hypothesized that the accumulation of evidence for a change in the stimulus distribution controls the dynamics of adaptation, and developed a model for adaptation as an ongoing inference problem. Guided by predictions of this model, we found that the dynamics of adaptation depend on the discriminability of the change in stimulus distribution and that the retina exploits information contained in properties of the stimulus beyond the mean and variance to adapt more quickly when possible.
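The idea of adaptation as ongoing inference, with speed set by the discriminability of the change, can be illustrated with a sequential log-likelihood-ratio accumulator (a generic change-detection sketch, not the authors' model; the threshold and parameters are arbitrary):

```python
import numpy as np

def detection_latency(sigma_pre, sigma_post, threshold=10.0, seed=1):
    """Samples switch from N(0, sigma_pre^2) to N(0, sigma_post^2) at t = 0.
    Accumulate the log-likelihood ratio (post vs. pre) over incoming samples
    and report how many are needed before the evidence crosses `threshold`."""
    rng = np.random.default_rng(seed)
    llr, t = 0.0, 0
    while llr < threshold and t < 100_000:
        x = rng.normal(0.0, sigma_post)           # a post-switch sample
        # log N(x; 0, sigma_post^2) - log N(x; 0, sigma_pre^2)
        llr += (np.log(sigma_pre / sigma_post)
                + 0.5 * x**2 * (1.0 / sigma_pre**2 - 1.0 / sigma_post**2))
        t += 1
    return t

fast = detection_latency(1.0, 4.0)   # large, highly discriminable variance step
slow = detection_latency(1.0, 1.5)   # subtle step: evidence accumulates slowly
print(fast, slow)
```

A large variance step crosses threshold within a few samples, while a subtle one takes far longer, echoing the finding that the dynamics of adaptation track the discriminability of the change.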
Affiliation(s)
- Barry Wark
- Graduate Program in Neurobiology and Behavior, University of Washington, Seattle, WA 98195, USA
243
Marom S. Adaptive transition rates in excitable membranes. Front Comput Neurosci 2009; 3:2. [PMID: 19225576] [PMCID: PMC2644617] [DOI: 10.3389/neuro.10.002.2009] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.9] [Received: 11/02/2009] [Accepted: 02/01/2009] [Indexed: 11/13/2022]
Abstract
Adaptation of activity in excitable membranes occurs over a wide range of timescales. Standard computational approaches handle this wide temporal range in terms of multiple states and related reaction rates emanating from the complexity of ionic channels. The study described here takes a different (perhaps complementary) approach, interpreting ion-channel kinetics in terms of population dynamics. I show that adaptation in excitable membranes is reducible to a simple logistic-like equation in which the essential nonlinearity is replaced by a feedback loop between the history of activation and an adaptive transition rate that is sensitive to a single dimension of the space of inactive states. This physiologically measurable dimension contributes to the stability of the system and serves as a powerful modulator of input–output relations that depends on the patterns of prior activity: an intrinsic scale-free mechanism for cellular adaptation that emerges from the microscopic biophysical properties of the ion channels of excitable membranes.
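One way to picture a history-dependent transition rate of this kind is a two-variable caricature: an "availability" fraction of channels is depleted by activity, and its recovery rate is slowed by a slowly accumulating depth of inactivation (a toy discretization for intuition only, not Marom's actual formulation; all rate constants are arbitrary):

```python
def simulate_availability(stim, dt=0.01, recover=0.05, inact=1.0, feedback=0.5):
    """Toy availability dynamics: activity drives channels into inactive
    states, and the effective recovery rate slows as the inactivation pool
    deepens -- a history-dependent (adaptive) transition rate."""
    A, depth = 1.0, 0.0               # available fraction; inactivation depth
    trace = []
    for s in stim:                    # s in {0, 1}: stimulation off/on
        use = inact * s * A                                  # depletion flux
        rec = recover / (1.0 + feedback * depth) * (1.0 - A) # slowed recovery
        A += dt * (rec - use)
        depth += dt * (use - 0.1 * depth)  # slow memory of prior activity
        trace.append(A)
    return trace

short = simulate_availability([1] * 100 + [0] * 200)  # brief drive, then rest
long_ = simulate_availability([1] * 500 + [0] * 200)  # long drive, then rest
print(round(short[-1], 2), round(long_[-1], 2))  # recovery lags the longer history
```

After the longer stimulation history the system recovers from a deeper pool of inactive states, so the same rest period restores less availability: prior activity modulates the input–output relation.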
Affiliation(s)
- Shimon Marom
- Department of Physiology in the Faculty of Medicine and the Network Biology Research Laboratories, Technion - Israel Institute of Technology Haifa, Israel.
244
Higgs MH, Spain WJ. Conditional bursting enhances resonant firing in neocortical layer 2-3 pyramidal neurons. J Neurosci 2009; 29:1285-99. [PMID: 19193876] [PMCID: PMC6666063] [DOI: 10.1523/jneurosci.3728-08.2009] [Citation(s) in RCA: 68] [Impact Index Per Article: 4.5] [Received: 08/06/2008] [Revised: 12/10/2008] [Accepted: 12/10/2008] [Indexed: 11/21/2022]
Abstract
The frequency response properties of neurons are critical for signal transmission and control of network oscillations. At subthreshold membrane potential, some neurons show resonance caused by voltage-gated channels. During action potential firing, resonance of the spike output may arise from subthreshold mechanisms and/or spike-dependent currents that cause afterhyperpolarizations (AHPs) and afterdepolarizations (ADPs). Layer 2-3 pyramidal neurons (L2-3 PNs) have a fast ADP that can trigger bursts. The present study investigated what stimuli elicit bursting in these cells and whether bursts transmit specific frequency components of the synaptic input, leading to resonance at particular frequencies. We found that two-spike bursts are triggered by step onsets, sine waves in two frequency bands, and noise. Using noise adjusted to elicit firing at approximately 10 Hz, we measured the gain for modulation of the time-varying firing rate as a function of stimulus frequency, finding a primary peak (7-16 Hz) and a high-frequency resonance (250-450 Hz). Gain was also measured separately for single and burst spikes. For a given spike rate, bursts provided higher gain at the primary peak and lower gain at intermediate frequencies, sharpening the high-frequency resonance. Suppression of bursting using automated current feedback weakened the primary and high-frequency resonances. The primary resonance was also influenced by the SK channel-mediated medium AHP (mAHP), because the SK blocker apamin reduced the sharpness of the primary peak. Our results suggest that resonance in L2-3 PNs depends on burst firing and the mAHP. Bursting enhances resonance in two distinct frequency bands.
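Frequency-resolved gain of the kind measured here is commonly estimated as the stimulus–response cross-spectrum normalized by the stimulus power spectrum. A minimal single-trial sketch (not the authors' pipeline; in practice one averages spectra over trials or segments, and the small constant is an illustrative guard against near-empty bins):

```python
import numpy as np

def rate_gain(stim, resp, fs):
    """Estimate the stimulus-to-response gain |G(f)| as the magnitude of the
    cross-spectrum divided by the stimulus power spectrum (single trial).
    `resp` can be a binned spike train or a time-varying firing rate."""
    s = np.fft.rfft(stim - np.mean(stim))
    r = np.fft.rfft(resp - np.mean(resp))
    freqs = np.fft.rfftfreq(len(stim), d=1.0 / fs)
    gain = np.abs(np.conj(s) * r) / (np.abs(s) ** 2 + 1e-12)
    return freqs, gain

rng = np.random.default_rng(0)
stim = rng.standard_normal(4096)
resp = 0.4 * stim                      # a perfectly linear 'neuron', for illustration
freqs, gain = rate_gain(stim, resp, fs=1000.0)
print(np.round(gain[1:4], 3))          # flat gain of 0.4 for this linear case
```

A resonant cell, by contrast, would show peaks in `gain` at the preferred bands (here, the primary 7-16 Hz peak and the 250-450 Hz resonance).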
Affiliation(s)
- Matthew H. Higgs
- Neurology Section, Veterans Affairs Puget Sound Health Care System, Seattle, Washington 98108, and
- Departments of Physiology and Biophysics and
- William J. Spain
- Neurology Section, Veterans Affairs Puget Sound Health Care System, Seattle, Washington 98108, and
- Departments of Physiology and Biophysics and
- Neurology, University of Washington, Seattle, Washington 98195
245
Zheng L, Nikolaev A, Wardill TJ, O'Kane CJ, de Polavieja GG, Juusola M. Network adaptation improves temporal representation of naturalistic stimuli in Drosophila eye: I dynamics. PLoS One 2009; 4:e4307. [PMID: 19180196] [PMCID: PMC2628724] [DOI: 10.1371/journal.pone.0004307] [Citation(s) in RCA: 43] [Impact Index Per Article: 2.9] [Received: 08/25/2008] [Accepted: 12/23/2008] [Indexed: 12/17/2022]
Abstract
Because of the limited processing capacity of eyes, retinal networks must adapt constantly to best present the ever-changing visual world to the brain. However, we still know little about how adaptation in retinal networks shapes neural encoding of changing information. To study this question, we recorded voltage responses from photoreceptors (R1–R6) and their output neurons (LMCs) in the Drosophila eye to repeated patterns of contrast values collected from natural scenes. By analyzing the continuous photoreceptor-to-LMC transformations of these graded-potential neurons, we show that adaptation dynamically improves the efficiency of coding. In particular, adaptation enhances both the frequency and amplitude distributions of LMC output by improving sensitivity to under-represented signals within seconds. Moreover, the signal-to-noise ratio of LMC output increases on the same timescale. We suggest that these coding properties can be used to study network adaptation with the genetic tools available in Drosophila, as shown in a companion paper (Part II).
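Signal-to-noise ratio for responses to a repeated stimulus is typically computed by treating the across-trial mean as signal and the per-trial residuals as noise. A minimal sketch under that standard definition (not the authors' exact pipeline; the sinusoidal "stimulus" is an illustrative assumption):

```python
import numpy as np

def snr_spectrum(trials):
    """trials: (n_repeats, n_samples) responses to one repeated stimulus.
    Signal = across-trial mean; noise = per-trial residuals around it."""
    trials = np.asarray(trials, dtype=float)
    signal = trials.mean(axis=0)
    noise = trials - signal
    s_pow = np.abs(np.fft.rfft(signal)) ** 2
    n_pow = np.mean(np.abs(np.fft.rfft(noise, axis=1)) ** 2, axis=0)
    return s_pow / (n_pow + 1e-12)

rng = np.random.default_rng(0)
n, reps = 1024, 20
t = np.arange(n)
driven = np.sin(2.0 * np.pi * 8 * t / n)        # all stimulus energy in bin 8
trials = driven + 0.5 * rng.standard_normal((reps, n))
snr = snr_spectrum(trials)
print(snr[8] > snr[7])   # the driven bin stands far above the noise floor
```

An adaptation-driven SNR increase of the sort reported here would appear as this ratio growing across successive blocks of repeats.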
Affiliation(s)
- Lei Zheng
- Department of Biomedical Science, University of Sheffield, Sheffield, United Kingdom
- Anton Nikolaev
- Department of Biomedical Science, University of Sheffield, Sheffield, United Kingdom
- Trevor J. Wardill
- Department of Biomedical Science, University of Sheffield, Sheffield, United Kingdom
- Cahir J. O'Kane
- Department of Genetics, University of Cambridge, Cambridge, United Kingdom
- Gonzalo G. de Polavieja
- Department of Theoretical Physics, Universidad Autónoma de Madrid, Madrid, Spain
- Instituto ‘Nicolás Cabrera’ de Física de Materiales, Universidad Autónoma de Madrid, Madrid, Spain
- Mikko Juusola
- Department of Biomedical Science, University of Sheffield, Sheffield, United Kingdom
- State Key Laboratory of Cognitive Neuroscience, Beijing Normal University, Beijing, China