1. Pamplona D, Hilgen G, Hennig MH, Cessac B, Sernagor E, Kornprobst P. Receptive field estimation in large visual neuron assemblies using a super-resolution approach. J Neurophysiol 2022; 127:1334-1347. PMID: 35235437. DOI: 10.1152/jn.00076.2021.
Abstract
Computing the spike-triggered average (STA) is a simple method for estimating the linear receptive fields (RFs) of sensory neurons. For random, uncorrelated stimuli the STA provides an unbiased RF estimate, but in practice white noise is not a feasible stimulus, as it usually evokes only weak responses. For visual stimuli, images of randomly modulated blocks of pixels are therefore often used instead. This solution naturally limits the resolution at which an RF can be obtained. Here we show that this limitation can be overcome with a simple super-resolution technique. We define a novel type of stimulus, the Shifted White Noise (SWN), by introducing random spatial shifts into the usual stimulus in order to increase the resolution of the measurements. In simulated data, the average error using the SWN was 1.7 times smaller than when using the classical stimulus, with successful mapping of 2.3 times more neurons, covering a broader range of RF sizes. Moreover, successful RF mapping was achieved with short recordings of about one minute of activity, more than 10 times more efficient than the classical white noise stimulus. In recordings from mouse retinal ganglion cells with large-scale microelectrode arrays, we could map 18 times more RFs, covering a broader range of sizes. In summary, we show that randomly shifting the usual white noise stimulus significantly improves RF estimation and requires only short recordings. It is straightforward to extend this method into the time dimension and to adapt it to other sensory modalities.
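A minimal in-silico sketch of the approach (illustrative only, not the authors' code; the Gaussian receptive field, the rectified-Poisson neuron, and all sizes are assumptions for the demo) builds block stimuli with a random per-frame shift and computes the STA on the fine pixel grid:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Shifted white noise": an 8x8 checkerboard of random +/-1 blocks, upsampled
# to a 32x32 fine grid and given a random sub-block shift on every frame.
n_frames, n_blocks, block = 5000, 8, 4
fine = n_blocks * block

frames = np.empty((n_frames, fine, fine))
for k in range(n_frames):
    coarse = rng.choice([-1.0, 1.0], size=(n_blocks, n_blocks))
    img = np.kron(coarse, np.ones((block, block)))
    dy, dx = rng.integers(0, block, size=2)      # the random spatial shift
    frames[k] = np.roll(np.roll(img, dy, axis=0), dx, axis=1)

# Toy LN neuron: Gaussian RF on the fine grid, rectified drive, Poisson counts.
y, x = np.mgrid[:fine, :fine]
rf = np.exp(-((y - 16) ** 2 + (x - 16) ** 2) / (2 * 3.0 ** 2))
drive = frames.reshape(n_frames, -1) @ rf.ravel()
spikes = rng.poisson(0.05 * np.maximum(drive, 0.0))

# STA at full fine-pixel resolution: the spike-weighted average frame.
sta = (spikes[:, None, None] * frames).sum(axis=0) / max(spikes.sum(), 1)
peak = np.unravel_index(np.argmax(sta), sta.shape)
```

Because of the per-frame shifts, every fine pixel varies across frames, so the STA can localize the RF below the block resolution.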
Affiliations
- Daniela Pamplona: Ecole Nationale Supérieure de Techniques Avancées, Institut Polytechnique de Paris, Palaiseau, France; Université Côte d'Azur, Inria, France
- Gerrit Hilgen: Biosciences Institute, Faculty of Medical Sciences, Newcastle University, Newcastle upon Tyne, United Kingdom; Applied Sciences, Health and Life Sciences, Northumbria University, Newcastle upon Tyne, United Kingdom
- Matthias Helge Hennig: Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh, Edinburgh, United Kingdom
- Evelyne Sernagor: Biosciences Institute, Faculty of Medical Sciences, Newcastle University, Newcastle upon Tyne, United Kingdom
2. Williams AH, Poole B, Maheswaranathan N, Dhawale AK, Fisher T, Wilson CD, Brann DH, Trautmann EM, Ryu S, Shusterman R, Rinberg D, Ölveczky BP, Shenoy KV, Ganguli S. Discovering Precise Temporal Patterns in Large-Scale Neural Recordings through Robust and Interpretable Time Warping. Neuron 2019; 105:246-259.e8. PMID: 31786013. DOI: 10.1016/j.neuron.2019.10.020.
Abstract
Though the temporal precision of neural computation has been studied intensively, a data-driven determination of this precision remains a fundamental challenge. Reproducible spike patterns may be obscured on single trials by uncontrolled temporal variability in behavior and cognition and may not be time locked to measurable signatures in behavior or local field potentials (LFP). To overcome these challenges, we describe a general-purpose time warping framework that reveals precise spike-time patterns in an unsupervised manner, even when these patterns are decoupled from behavior or are temporally stretched across single trials. We demonstrate this method across diverse systems: cued reaching in nonhuman primates, motor sequence production in rats, and olfaction in mice. This approach flexibly uncovers diverse dynamical firing patterns, including pulsatile responses to behavioral events, LFP-aligned oscillatory spiking, and even unanticipated patterns, such as 7 Hz oscillations in rat motor cortex that are not time locked to measured behaviors or LFP.
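The simplest member of this model family, shift-only alignment, can be sketched as follows (a toy simulation, not the authors' implementation; the Gaussian rate bump, noise level, and jitter range are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: every trial contains the same firing-rate bump, jittered in time.
n_trials, T = 40, 120
true_shift = rng.integers(-15, 16, size=n_trials)
template = np.exp(-0.5 * ((np.arange(T) - 60) / 4.0) ** 2)
trials = np.stack([np.roll(template, s) for s in true_shift])
trials += 0.1 * rng.standard_normal(trials.shape)

# Shift-only alignment: iteratively cross-correlate each trial against the
# current trial average and re-estimate the per-trial shift.
lags = np.arange(-20, 21)
shifts = np.zeros(n_trials, dtype=int)
for _ in range(3):
    avg = np.stack([np.roll(tr, -s) for tr, s in zip(trials, shifts)]).mean(0)
    for i, tr in enumerate(trials):
        xc = [np.dot(np.roll(tr, -s), avg) for s in lags]
        shifts[i] = lags[np.argmax(xc)]

aligned = np.stack([np.roll(tr, -s) for tr, s in zip(trials, shifts)])
```

After alignment, the bump peaks line up across trials and the trial average becomes sharp instead of smeared; the full method generalizes this to piecewise-linear warps.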
Affiliations
- Alex H Williams: Neuroscience Program, Stanford University, Stanford, CA 94305, USA
- Ben Poole: Google Brain, Google Inc., Mountain View, CA 94043, USA
- Ashesh K Dhawale: Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA; Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Tucker Fisher: Neuroscience Program, Stanford University, Stanford, CA 94305, USA
- Christopher D Wilson: Neuroscience Institute, New York University School of Medicine, New York, NY 10016, USA
- David H Brann: Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA; Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Eric M Trautmann: Neuroscience Program, Stanford University, Stanford, CA 94305, USA; Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA
- Stephen Ryu: Electrical Engineering Department, Stanford University, Stanford, CA 94305, USA; Department of Neurosurgery, Palo Alto Medical Foundation, Palo Alto, CA 94301, USA
- Roman Shusterman: Institute of Neuroscience, University of Oregon, Eugene, OR 97403, USA
- Dmitry Rinberg: Neuroscience Institute, New York University School of Medicine, New York, NY 10016, USA; Center for Neural Science, New York University, New York, NY 10016, USA
- Bence P Ölveczky: Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA; Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Krishna V Shenoy: Neurobiology Department, Stanford University, Stanford, CA 94305, USA; Electrical Engineering Department, Stanford University, Stanford, CA 94305, USA; Bioengineering Department, Stanford University, Stanford, CA 94305, USA; Bio-X Program, Stanford University, Stanford, CA 94305, USA; Wu Tsai Stanford Neurosciences Institute, Stanford University, Stanford, CA 94305, USA; Howard Hughes Medical Institute, Stanford University, Stanford, CA 94305, USA
- Surya Ganguli: Applied Physics Department, Stanford University, Stanford, CA 94305, USA; Neurobiology Department, Stanford University, Stanford, CA 94305, USA; Electrical Engineering Department, Stanford University, Stanford, CA 94305, USA; Bio-X Program, Stanford University, Stanford, CA 94305, USA; Wu Tsai Stanford Neurosciences Institute, Stanford University, Stanford, CA 94305, USA; Google Brain, Google Inc., Mountain View, CA 94043, USA
3. Lawlor PN, Perich MG, Miller LE, Kording KP. Linear-Nonlinear-Time-Warp-Poisson models of neural activity. J Comput Neurosci 2018; 45:173-191. PMID: 30294750. DOI: 10.1007/s10827-018-0696-6.
Abstract
Prominent models of spike trains assume only one source of variability - stochastic (Poisson) spiking - when stimuli and behavior are fixed. However, spike trains may also reflect variability due to internal processes such as planning. For example, we can plan a movement at one point in time and execute it at some arbitrary later time. Neurons involved in planning may thus share an underlying time course that is not precisely locked to the actual movement. Here we combine the standard Linear-Nonlinear-Poisson (LNP) model with Dynamic Time Warping (DTW) to account for shared temporal variability. When applied to recordings from macaque premotor cortex, we find that time warping considerably improves predictions of neural activity. We suggest that such temporal variability is a widespread phenomenon in the brain which should be modeled.
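The time-warping ingredient can be illustrated with a textbook dynamic-time-warping distance between firing-rate profiles (a generic sketch, not the paper's LNP-coupled warping; the rate profiles below are synthetic):

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0.0, 1.0, 100)
rate = np.exp(-0.5 * ((t - 0.5) / 0.05) ** 2)            # template rate bump
shifted = np.exp(-0.5 * ((t - 0.6) / 0.05) ** 2)         # same bump, later in time
different = 0.5 * np.exp(-0.5 * ((t - 0.5) / 0.2) ** 2)  # broader, weaker bump

d_shift = dtw(rate, shifted)
d_diff = dtw(rate, different)
```

DTW absorbs the latency shift, so the shifted bump scores as far more similar to the template than a genuinely different profile does; the paper embeds this idea inside an LNP likelihood.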
Affiliations
- Patrick N Lawlor: Division of Child Neurology, Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Lee E Miller: Department of Physiology, Northwestern University, Chicago, IL, USA
- Konrad P Kording: Departments of Bioengineering and Neuroscience, University of Pennsylvania, Philadelphia, PA, USA
4. Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates. PLoS One 2017; 12:e0183914. PMID: 28877194. PMCID: PMC5587334. DOI: 10.1371/journal.pone.0183914.
Abstract
Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis.
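The two-step correction can be sketched roughly as follows (a hypothetical illustration on synthetic data; the threshold values and the BFS clustering are assumptions, not the paper's exact procedure):

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

# Toy STRF estimate: a smooth excitatory subfield buried in estimation noise
# (30 frequency bins x 40 time bins; all numbers illustrative).
f, t = np.mgrid[:30, :40]
strf = 5.0 * np.exp(-((f - 15) ** 2 / 18.0 + (t - 20) ** 2 / 32.0))
strf = strf + rng.standard_normal(strf.shape)

def cluster_mass_threshold(m, gain_thresh=1.5, mass_thresh=20.0):
    """Keep only contiguous supra-threshold clusters whose summed pixel
    values exceed a cluster-mass criterion (the two-step correction)."""
    above = m > gain_thresh
    out = np.zeros_like(m)
    seen = np.zeros(m.shape, dtype=bool)
    for i in range(m.shape[0]):
        for j in range(m.shape[1]):
            if not above[i, j] or seen[i, j]:
                continue
            cluster, queue = [], deque([(i, j)])
            seen[i, j] = True
            while queue:                      # BFS over 4-connected pixels
                y, x = queue.popleft()
                cluster.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < m.shape[0] and 0 <= nx < m.shape[1]
                            and above[ny, nx] and not seen[ny, nx]):
                        seen[ny, nx] = True
                        queue.append((ny, nx))
            if sum(m[c] for c in cluster) >= mass_thresh:
                for c in cluster:
                    out[c] = m[c]
    return out

cleaned = cluster_mass_threshold(strf)
```

Isolated noise pixels pass the gain threshold but fail the cluster-mass criterion, while the contiguous subfield survives.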
5. Sharpee TO. How Invariant Feature Selectivity Is Achieved in Cortex. Front Synaptic Neurosci 2016; 8:26. PMID: 27601991. PMCID: PMC4993779. DOI: 10.3389/fnsyn.2016.00026.
Abstract
Parsing the visual scene into objects is paramount to survival. Yet, how this is accomplished by the nervous system remains largely unknown, even in the comparatively well understood visual system. It is especially unclear how detailed peripheral signal representations are transformed into the object-oriented representations that are independent of object position and are provided by the final stages of visual processing. This perspective discusses advances in computational algorithms for fitting large-scale models that make it possible to reconstruct the intermediate steps of visual processing based on neural responses to natural stimuli. In particular, it is now possible to characterize how different types of position invariance, such as local (also known as phase invariance) and more global, are interleaved with nonlinear operations to allow for coding of curved contours. Neurons in the mid-level visual area V4 exhibit selectivity to pairs of even- and odd-symmetric profiles along curved contours. Such pairing is reminiscent of the response properties of complex cells in the primary visual cortex (V1) and suggests specific ways in which V1 signals are transformed within subsequent visual cortical areas. These examples illustrate that large-scale models fitted to neural responses to natural stimuli can provide generative models of successive stages of sensory processing.
Affiliations
- Tatyana O. Sharpee: Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA, USA
6. Keller CH, Takahashi TT. Spike timing precision changes with spike rate adaptation in the owl's auditory space map. J Neurophysiol 2015; 114:2204-2219. PMID: 26269555. PMCID: PMC4600961. DOI: 10.1152/jn.00442.2015.
Abstract
Spike rate adaptation (SRA) is a continuing change of responsiveness to ongoing stimuli, which is ubiquitous across species and levels of sensory systems. Under SRA, auditory responses to constant stimuli change over time, relaxing toward a long-term rate often over multiple timescales. With more variable stimuli, SRA causes the dependence of spike rate on sound pressure level to shift toward the mean level of recent stimulus history. A model based on subtractive adaptation (Benda J, Hennig RM. J Comput Neurosci 24: 113-136, 2008) shows that changes in spike rate and level dependence are mechanistically linked. Space-specific neurons in the barn owl's midbrain, when recorded under ketamine-diazepam anesthesia, showed these classical characteristics of SRA, while at the same time exhibiting changes in spike timing precision. Abrupt level increases of sinusoidally amplitude-modulated (SAM) noise initially led to spiking at higher rates with lower temporal precision. Spike rate and precision relaxed toward their long-term values with a time course similar to SRA, results that were also replicated by the subtractive model. Stimuli whose amplitude modulations (AMs) were not synchronous across carrier frequency evoked spikes in response to stimulus envelopes of a particular shape, characterized by the spectrotemporal receptive field (STRF). Again, abrupt stimulus level changes initially disrupted the temporal precision of spiking, which then relaxed along with SRA. We suggest that shifts in latency associated with stimulus level changes may differ between carrier frequency bands and underlie decreased spike precision. Thus SRA is manifest not simply as a change in spike rate but also as a change in the temporal precision of spiking.
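A subtractive-adaptation model of this kind can be sketched in a few lines (an illustrative Euler integration in the spirit of the cited model; the parameters and the relaxation rule are assumptions):

```python
import numpy as np

# Generic subtractive adaptation: rate r(t) = g * [s(t) - a(t)]_+ , with
# tau * da/dt = s(t) - a(t), so the adaptation variable a relaxes toward
# the recent mean input level (toy parameters throughout).
dt, tau, g, steps = 1.0, 200.0, 1.0, 2000          # dt, tau in ms
s = np.where(np.arange(steps) < 1000, 1.0, 3.0)    # abrupt step in level

a, rates = 0.0, []
for k in range(steps):
    rates.append(g * max(s[k] - a, 0.0))
    a += dt / tau * (s[k] - a)
rates = np.array(rates)
```

The level step produces a transient rate surge that relaxes back as `a` tracks the new mean, the classic signature of SRA described above.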
7. Liu JK, Gollisch T. Spike-Triggered Covariance Analysis Reveals Phenomenological Diversity of Contrast Adaptation in the Retina. PLoS Comput Biol 2015; 11:e1004425. PMID: 26230927. PMCID: PMC4521887. DOI: 10.1371/journal.pcbi.1004425.
Abstract
When visual contrast changes, retinal ganglion cells adapt by adjusting their sensitivity as well as their temporal filtering characteristics. The latter has classically been described by contrast-induced gain changes that depend on temporal frequency. Here, we explored a new perspective on contrast-induced changes in temporal filtering by using spike-triggered covariance analysis to extract multiple parallel temporal filters for individual ganglion cells. Based on multielectrode-array recordings from ganglion cells in the isolated salamander retina, we found that contrast adaptation of temporal filtering can largely be captured by contrast-invariant sets of filters with contrast-dependent weights. Moreover, differences among the ganglion cells in the filter sets and their contrast-dependent contributions allowed us to phenomenologically distinguish three types of filter changes. The first type is characterized by newly emerging features at higher contrast, which can be reproduced by computational models that contain response-triggered gain-control mechanisms. The second type follows from stronger adaptation in the Off pathway as compared to the On pathway in On-Off-type ganglion cells. Finally, we found that, in a subset of neurons, contrast-induced filter changes are governed by particularly strong spike-timing dynamics, in particular by pronounced stimulus-dependent latency shifts that can be observed in these cells. Together, our results show that the contrast dependence of temporal filtering in retinal ganglion cells has a multifaceted phenomenology and that a multi-filter analysis can provide a useful basis for capturing the underlying signal-processing dynamics.
Our sensory systems have to process stimuli under a wide range of environmental conditions. To cope with this challenge, the involved neurons adapt by adjusting their signal processing to the recently encountered intensity range. In the visual system, one finds, for example, that higher visual contrast leads to changes in how visual signals are temporally filtered, making signal processing faster and more band-pass-like at higher contrast. By analyzing signals from neurons in the retina of salamanders, we here found that these adaptation effects can be described by a fixed set of filters, independent of contrast, whose relative contributions change with contrast. Also, we found that different phenomena contribute to this adaptation. In particular, some cells change their relative sensitivity to light increments and light decrements, whereas other cells are influenced by a strong contrast-dependence of the exact timing of their responses. Our results show that contrast adaptation in the retina is not an entirely homogeneous phenomenon, and that models with multiple filters can help in characterizing sensory adaptation.
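The core spike-triggered covariance computation can be sketched on a toy energy-model neuron (synthetic data; the quadrature filters and the threshold nonlinearity are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Quadrature pair of temporal filters; exactly orthogonal on this grid.
T, n = 20, 50000
t = np.arange(T)
f1 = np.sin(2 * np.pi * t / T)
f2 = np.cos(2 * np.pi * t / T)
f1 /= np.linalg.norm(f1)
f2 /= np.linalg.norm(f2)

# Energy-model neuron: spikes on large squared projections, so the STA is
# ~flat and the filters are only visible in the spike-triggered covariance.
stim = rng.standard_normal((n, T))          # Gaussian white-noise snippets
drive = (stim @ f1) ** 2 + (stim @ f2) ** 2
spike = drive > np.quantile(drive, 0.9)     # spike on the top 10% of drive

# STC: covariance of the spike-triggered ensemble minus the prior (identity).
stc = np.cov(stim[spike], rowvar=False) - np.eye(T)
eigval, eigvec = np.linalg.eigh(stc)
v1, v2 = eigvec[:, -1], eigvec[:, -2]       # largest-eigenvalue axes
```

The two leading eigenvectors recover the plane spanned by the true filters, which is the starting point for the multi-filter analysis described in the abstract.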
Affiliations
- Jian K. Liu: Department of Ophthalmology, University Medical Center Göttingen, Göttingen, Germany; Bernstein Center for Computational Neuroscience Göttingen, Göttingen, Germany
- Tim Gollisch: Department of Ophthalmology, University Medical Center Göttingen, Göttingen, Germany; Bernstein Center for Computational Neuroscience Göttingen, Göttingen, Germany
8. Kollmorgen S, Hahnloser RHR. Dynamic alignment models for neural coding. PLoS Comput Biol 2014; 10:e1003508. PMID: 24625448. PMCID: PMC3952821. DOI: 10.1371/journal.pcbi.1003508.
Abstract
Recently, there have been remarkable advances in modeling the relationships between the sensory environment, neuronal responses, and behavior. However, most models cannot encompass variable stimulus-response relationships such as varying response latencies and state or context dependence of the neural code. Here, we consider response modeling as a dynamic alignment problem and model stimulus and response jointly by a mixed pair hidden Markov model (MPH). In MPHs, multiple stimulus-response relationships (e.g., receptive fields) are represented by different states or groups of states in a Markov chain. Each stimulus-response relationship features temporal flexibility, allowing modeling of variable response latencies, including noisy ones. We derive algorithms for learning of MPH parameters and for inference of spike response probabilities. We show that some linear-nonlinear Poisson cascade (LNP) models are a special case of MPHs. We demonstrate the efficiency and usefulness of MPHs in simulations of both jittered and switching spike responses to white noise and natural stimuli. Furthermore, we apply MPHs to extracellular single and multi-unit data recorded in cortical brain areas of singing birds to showcase a novel method for estimating response lag distributions. MPHs allow simultaneous estimation of receptive fields, latency statistics, and hidden state dynamics and so can help to uncover complex stimulus-response relationships that are subject to variable timing and involve diverse neural codes.
The brain computes using electrical discharges of nerve cells, so-called spikes. Specific sensory stimuli, for instance tones, often lead to specific spiking patterns. The same is true for behavior: specific motor actions are generated by specific spiking patterns. The relationship between neural activity and stimuli or motor actions can be difficult to infer because of dynamic dependencies and hidden nonlinearities. For instance, in a freely behaving animal a neuron could exhibit variable levels of sensory and motor involvement depending on the state of the animal and on current motor plans, a situation that cannot be accounted for by many existing models. Here we present a new type of model that is specifically designed to cope with such changing regularities. We outline the mathematical framework and show, through computer simulations and application to recorded neural data, how MPHs can advance our understanding of stimulus-response relationships.
Affiliations
- Sepp Kollmorgen: Institute of Neuroinformatics, University of Zurich/ETH Zurich, Zurich, Switzerland
9. Rajan K, Bialek W. Maximally informative "stimulus energies" in the analysis of neural responses to natural signals. PLoS One 2013; 8:e71959. PMID: 24250780. PMCID: PMC3826732. DOI: 10.1371/journal.pone.0071959.
Abstract
The concept of feature selectivity in sensory signal processing can be formalized as dimensionality reduction: in a stimulus space of very high dimensions, neurons respond only to variations within some smaller, relevant subspace. But if neural responses exhibit invariances, then the relevant subspace typically cannot be reached by a Euclidean projection of the original stimulus. We argue that, in several cases, we can make progress by appealing to the simplest nonlinear construction, identifying the relevant variables as quadratic forms, or “stimulus energies.” Natural examples include non–phase–locked cells in the auditory system, complex cells in the visual cortex, and motion–sensitive neurons in the visual system. Generalizing the idea of maximally informative dimensions, we show that one can search for kernels of the relevant quadratic forms by maximizing the mutual information between the stimulus energy and the arrival times of action potentials. Simple implementations of this idea successfully recover the underlying properties of model neurons even when the number of parameters in the kernel is comparable to the number of action potentials and stimuli are completely natural. We explore several generalizations that allow us to incorporate plausible structure into the kernel and thereby restrict the number of parameters. We hope that this approach will add significantly to the set of tools available for the analysis of neural responses to complex, naturalistic stimuli.
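Why a quadratic "stimulus energy" captures invariances that no single linear projection can is visible in a tiny deterministic example (illustrative; the kernel and stimuli are assumed, in the spirit of complex-cell phase invariance):

```python
import numpy as np

# Rank-2 quadratic kernel built from an even/odd (quadrature) filter pair.
T = 32
t = np.arange(T)
f_even = np.cos(2 * np.pi * t / T)
f_odd = np.sin(2 * np.pi * t / T)
f_even /= np.linalg.norm(f_even)
f_odd /= np.linalg.norm(f_odd)
Q = np.outer(f_even, f_even) + np.outer(f_odd, f_odd)

s0 = np.cos(2 * np.pi * t / T)           # grating-like stimulus
s1 = np.cos(2 * np.pi * t / T + 1.3)     # same stimulus, phase-shifted

# A linear projection changes with phase; the stimulus energy s^T Q s does not.
lin0, lin1 = s0 @ f_even, s1 @ f_even
e0, e1 = s0 @ Q @ s0, s1 @ Q @ s1
```

The cited method searches for the kernel `Q` directly, by maximizing the mutual information between this energy and spike arrival times.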
Affiliations
- Kanaka Rajan: Joseph Henry Laboratories of Physics and Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, New Jersey, United States of America
- William Bialek: Joseph Henry Laboratories of Physics and Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, New Jersey, United States of America
10. Smith C, Paninski L. Computing loss of efficiency in optimal Bayesian decoders given noisy or incomplete spike trains. Network (Bristol, England) 2013; 24:75-98. PMID: 23742213. DOI: 10.3109/0954898x.2013.789568.
Abstract
We investigate Bayesian methods for optimal decoding of noisy or incompletely-observed spike trains. Information about neural identity or temporal resolution may be lost during spike detection and sorting, or spike times measured near the soma may be corrupted with noise due to stochastic membrane channel effects in the axon. We focus on neural encoding models in which the (discrete) neural state evolves according to stimulus-dependent Markovian dynamics. Such models are sufficiently flexible that we may incorporate realistic stimulus encoding and spiking dynamics, but nonetheless permit exact computation via efficient hidden Markov model forward-backward methods. We analyze two types of signal degradation. First, we quantify the information lost due to jitter or downsampling in the spike-times. Second, we quantify the information lost when knowledge of the identities of different spiking neurons is corrupted. In each case the methods introduced here make it possible to quantify the dependence of the information loss on biophysical parameters such as firing rate, spike jitter amplitude, spike observation noise, etc. In particular, decoders that model the probability distribution of spike-neuron assignments significantly outperform decoders that use only the most likely spike assignments, and are ignorant of the posterior spike assignment uncertainty.
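The exact computation rests on standard hidden Markov machinery; here is a minimal forward-filtering sketch for a two-state neuron with Poisson emissions (toy transition matrix and rates, not the paper's models):

```python
import numpy as np
from math import exp, factorial

# Two-state Markov neuron ("down"/"up") emitting Poisson spike counts per bin.
A = np.array([[0.95, 0.05],
              [0.10, 0.90]])        # state-transition probabilities
rates = np.array([0.2, 5.0])        # mean counts per bin in each state
pi0 = np.array([0.5, 0.5])          # initial state distribution

def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

def forward_posterior(counts):
    """Filtering posterior p(state_t | counts_{0..t}), normalized each step."""
    alpha = pi0 * np.array([poisson_pmf(counts[0], r) for r in rates])
    alpha /= alpha.sum()
    post = [alpha.copy()]
    for k in counts[1:]:
        alpha = (alpha @ A) * np.array([poisson_pmf(k, r) for r in rates])
        alpha /= alpha.sum()
        post.append(alpha.copy())
    return np.array(post)

counts = [0, 0, 1, 6, 4, 5, 0, 0]   # a burst in the middle of the train
post = forward_posterior(counts)
```

The same forward-backward recursions remain exact when the emission model is extended to jittered, downsampled, or mis-assigned spikes, which is what makes the information-loss calculations tractable.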
Affiliations
- Carl Smith: Department of Chemistry, Columbia University, New York, NY 10027, USA
11. Gollisch T, Herz AVM. The iso-response method: measuring neuronal stimulus integration with closed-loop experiments. Front Neural Circuits 2012; 6:104. PMID: 23267315. PMCID: PMC3525953. DOI: 10.3389/fncir.2012.00104.
Abstract
Throughout the nervous system, neurons integrate high-dimensional input streams and transform them into an output of their own. This integration of incoming signals involves filtering processes and complex non-linear operations. The shapes of these filters and non-linearities determine the computational features of single neurons and their functional roles within larger networks. A detailed characterization of signal integration is thus a central ingredient to understanding information processing in neural circuits. Conventional methods for measuring single-neuron response properties, such as reverse correlation, however, are often limited by the implicit assumption that stimulus integration occurs in a linear fashion. Here, we review a conceptual and experimental alternative that is based on exploring the space of those sensory stimuli that result in the same neural output. As demonstrated by recent results in the auditory and visual system, such iso-response stimuli can be used to identify the non-linearities relevant for stimulus integration, disentangle consecutive neural processing steps, and determine their characteristics with unprecedented precision. Automated closed-loop experiments are crucial for this advance, allowing rapid search strategies for identifying iso-response stimuli during experiments. Prime targets for the method are feed-forward neural signaling chains in sensory systems, but the method has also been successfully applied to feedback systems. Depending on the specific question, “iso-response” may refer to a predefined firing rate, single-spike probability, first-spike latency, or other output measures. Examples from different studies show that substantial progress in understanding neural dynamics and coding can be achieved once rapid online data analysis and stimulus generation, adaptive sampling, and computational modeling are tightly integrated into experiments.
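The closed-loop search can be mimicked in silico (a hypothetical stand-in response function replaces the neuron; real experiments query the cell online):

```python
import numpy as np

def response(x, y):
    """Stand-in for the neuron under study (quadratic integration)."""
    return np.sqrt(x ** 2 + y ** 2)

# For each direction in the two-component stimulus space, bisect on overall
# intensity until the response hits the target; the resulting points trace
# an iso-response contour (here a circle, exposing the quadratic rule).
target = 1.0
contour = []
for theta in np.linspace(0.1, np.pi / 2 - 0.1, 9):
    lo, hi = 0.0, 10.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        r = response(mid * np.cos(theta), mid * np.sin(theta))
        lo, hi = (mid, hi) if r < target else (lo, mid)
    contour.append((mid * np.cos(theta), mid * np.sin(theta)))
contour = np.array(contour)
```

The shape of the contour (circular, linear, or otherwise) is what diagnoses the integration rule; in practice the bisection is driven by online spike counts rather than a known function.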
Affiliations
- Tim Gollisch: Department of Ophthalmology and Bernstein Center for Computational Neuroscience Göttingen, University Medical Center Göttingen, Göttingen, Germany
12. Samengo I, Gollisch T. Spike-triggered covariance: geometric proof, symmetry properties, and extension beyond Gaussian stimuli. J Comput Neurosci 2012; 34:137-161. PMID: 22798148. PMCID: PMC3558678. DOI: 10.1007/s10827-012-0411-y.
Abstract
The space of sensory stimuli is complex and high-dimensional. Yet, single neurons in sensory systems are typically affected by only a small subset of the vast space of all possible stimuli. A proper understanding of the input–output transformation represented by a given cell therefore requires the identification of the subset of stimuli that are relevant in shaping the neuronal response. As an extension to the commonly-used spike-triggered average, the analysis of the spike-triggered covariance matrix provides a systematic methodology to detect relevant stimuli. As originally designed, the consistency of this method is guaranteed only if stimuli are drawn from a Gaussian distribution. Here we present a geometric proof of consistency, which provides insight into the foundations of the method, in particular, into the crucial role played by the geometry of stimulus space and symmetries in the stimulus–response relation. This approach leads to a natural extension of the applicability of the spike-triggered covariance technique to arbitrary spherical or elliptic stimulus distributions. The extension only requires a subtle modification of the original prescription. Furthermore, we present a new resampling method for assessing statistical significance of identified relevant stimuli, applicable to spherical and elliptic stimulus distributions. Finally, we exemplify the modified method and compare it to other prescriptions given in the literature.
Affiliations
- Inés Samengo: Centro Atómico Bariloche and Instituto Balseiro, (8400) San Carlos de Bariloche, Río Negro, Argentina
- Tim Gollisch: Department of Ophthalmology and Bernstein Center for Computational Neuroscience Göttingen, Georg-August University Göttingen, 37073 Göttingen, Germany
13. Goulet J, van Hemmen JL, Jung SN, Chagnaud BP, Scholze B, Engelmann J. Temporal precision and reliability in the velocity regime of a hair-cell sensory system: the mechanosensory lateral line of goldfish, Carassius auratus. J Neurophysiol 2012; 107:2581-2593. PMID: 22378175. DOI: 10.1152/jn.01073.2011.
Abstract
Fish and aquatic frogs detect minute water motion by means of a specialized mechanosensory system, the lateral line. Ubiquitous in fish, the lateral-line system is characterized by hair-cell based sensory structures across the fish's surface called neuromasts. These neuromasts occur free-standing on the skin as superficial neuromasts (SN) or are recessed into canals as canal neuromasts. SNs respond to rapid changes of water velocity in a small layer of fluid around the fish, including the so-called boundary layer. Although omnipresent, the boundary layer's impact on the SN response is still a matter of debate. Using an information-theoretic approach to this sensory system for the first time, we investigated the encoding capabilities of SN afferents. Combining covariance analysis, phase analysis, and modeling of recorded neuronal responses of primary lateral line afferents, we show that encoding by the SNs is adequately described as a linear, velocity-responsive mechanism. Afferent responses display a bimodal distribution of opposite Wiener kernels that likely reflects the two hair-cell populations within a given neuromast. Using frozen noise stimuli, we further demonstrate that SN afferents respond in an extremely precise manner and with high reproducibility across a broad frequency band (10-150 Hz), revealing that an optimal decoder would need to rely extensively on a temporal code. This was further substantiated by means of signal reconstruction from spike trains that were time-shifted with respect to the original. On average, a time shift of 3.5 ms was enough to diminish the encoding capabilities of primary afferents by 70%. Our results further demonstrate that the SNs' encoding capability is linearly related to the stimulus outside the boundary layer, and that the boundary layer can therefore be neglected when interpreting lateral line responses of SN afferents to hydrodynamic stimuli.
Affiliation(s)
- Julie Goulet
- Univ. of Bielefeld, AG Active Sensing, 33501 Bielefeld, Germany.
14
Sharpee TO, Nagel KI, Doupe AJ. Two-dimensional adaptation in the auditory forebrain. J Neurophysiol 2011; 106:1841-61. [PMID: 21753019] [PMCID: PMC3296429] [DOI: 10.1152/jn.00905.2010]
Abstract
Sensory neurons exhibit two universal properties: sensitivity to multiple stimulus dimensions, and adaptation to stimulus statistics. How adaptation affects encoding along primary dimensions is well characterized for most sensory pathways, but if and how it affects secondary dimensions is less clear. We studied these effects for neurons in the avian equivalent of primary auditory cortex, responding to temporally modulated sounds. We showed that the firing rate of single neurons in field L was affected by at least two components of the time-varying sound log-amplitude. When overall sound amplitude was low, neural responses were based on nonlinear combinations of the mean log-amplitude and its rate of change (first time differential). At high mean sound amplitude, the two relevant stimulus features became the first and second time derivatives of the sound log-amplitude. Thus a strikingly systematic relationship between dimensions was conserved across changes in stimulus intensity, whereby one of the relevant dimensions approximated the time differential of the other dimension. In contrast to stimulus mean, increases in stimulus variance did not change relevant dimensions, but selectively increased the contribution of the second dimension to neural firing, illustrating a new adaptive behavior enabled by multidimensional encoding. Finally, we demonstrated theoretically that inclusion of time differentials as additional stimulus features, as seen so prominently in the single-neuron responses studied here, is a useful strategy for encoding naturalistic stimuli, because it can lower the necessary sampling rate while maintaining the robustness of stimulus reconstruction to correlated noise.
Affiliation(s)
- Tatyana O Sharpee
- The Crick-Jacobs Center for Theoretical and Computational Biology, Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, and the Center for Theoretical Biological Physics, University of California, San Diego, La Jolla, CA, USA.
15
Chang TR, Chiu TW, Sun X, Poon PWF. Modeling frequency modulated responses of midbrain auditory neurons based on trigger features and artificial neural networks. Brain Res 2011; 1434:90-101. [PMID: 22035565] [DOI: 10.1016/j.brainres.2011.09.042]
Abstract
Frequency modulation (FM) is an important building block of communication signals for animals and humans. Attempts to predict the responses of central neurons to FM sounds have not been very successful, although success here could bring insights into the underlying neural mechanisms. Here we propose a new method to predict the responses of FM-sensitive neurons in the auditory midbrain. First we recorded single-unit responses in anesthetized rats using a random FM tone to construct their spectro-temporal receptive fields (STRFs). Training of an artificial neural network on responses to a second random FM tone was based on temporal information derived from the STRF. Specifically, the time window covered by the presumed trigger feature and its delay to spike occurrence were used to train a finite impulse response neural network (FIRNN) to respond to this random FM. Finally, we tested the model's performance in predicting the response to another, similar stimulus (a third random FM tone). We found good performance in predicting the timing of responses, though less so the response magnitudes. Furthermore, the weighting function of the FIRNN showed temporal 'bumps' suggesting temporal integration of synaptic inputs from different frequency laminae. This article is part of a Special Issue entitled: Neural Coding.
Affiliation(s)
- T R Chang
- Dept. of Computer Sciences and Information Engineering, Southern Taiwan University, Tainan, Taiwan.
16
Aldworth ZN, Dimitrov AG, Cummins GI, Gedeon T, Miller JP. Temporal encoding in a nervous system. PLoS Comput Biol 2011; 7:e1002041. [PMID: 21573206] [PMCID: PMC3088658] [DOI: 10.1371/journal.pcbi.1002041]
Abstract
We examined the extent to which temporal encoding may be implemented by single neurons in the cercal sensory system of the house cricket Acheta domesticus. We found that these neurons exhibit a greater-than-expected coding capacity, due in part to an increased precision in brief patterns of action potentials. We developed linear and non-linear models for decoding the activity of these neurons, and found that the stimuli associated with short-interval patterns of spikes (ISIs of 8 ms or less) were predicted better by second-order models than by linear models. Finally, we characterized the difference between these linear and second-order models in a low-dimensional subspace, and showed that modifying the linear models along only a few dimensions improved their predictive power to parity with the second-order models. Together these results show that single neurons are capable of using temporal patterns of spikes as fundamental symbols in their neural code, and that they communicate specific stimulus distributions to subsequent neural structures. The information coding schemes used within nervous systems have been the focus of an entire field within neuroscience. An unresolved issue within the general coding problem is the determination of the neural "symbols" with which information is encoded in neural spike trains, analogous to the determination of the nucleotide sequences used to represent proteins in molecular biology. The goal of our study was to determine whether pairs of consecutive action potentials contain more or different information about the stimuli that elicit them than would be predicted from an analysis of individual action potentials. We developed linear and non-linear coding models and used likelihood analysis to address this question for sensory interneurons in the cricket cercal sensory system.
Our results show that these neurons' spike trains can be decomposed into sequences of two neural symbols: isolated single spikes and short-interval spike doublets. Given the ubiquitous nature of similar neural activity reported in other systems, we suspect that the implementation of such temporal encoding schemes may be widespread across animal phyla. Knowledge of the basic coding units used by single cells will help in building the large-scale neural network models necessary for understanding how nervous systems function.
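The two-symbol decomposition described above (isolated spikes versus short-interval doublets) can be sketched in a few lines. The 8 ms ISI threshold is taken from the abstract, while the function name and the greedy left-to-right pairing rule are illustrative assumptions, not the authors' actual procedure:

```python
import numpy as np

def classify_symbols(spike_times_ms, max_doublet_isi=8.0):
    """Split a spike train into the two symbol classes suggested by the paper:
    short-interval doublets (ISI <= max_doublet_isi) and isolated single spikes.
    The 8 ms threshold follows the ISI criterion quoted in the abstract."""
    t = np.asarray(spike_times_ms, dtype=float)
    singles, doublets, i = [], [], 0
    while i < len(t):
        if i + 1 < len(t) and t[i + 1] - t[i] <= max_doublet_isi:
            doublets.append((t[i], t[i + 1]))
            i += 2  # consume both spikes of the doublet
        else:
            singles.append(t[i])
            i += 1
    return singles, doublets

singles, doublets = classify_symbols([5.0, 40.0, 46.0, 100.0, 103.0, 200.0])
print(len(singles), len(doublets))  # 2 singles, 2 doublets
```

A decoder built on such a decomposition would then assign each symbol class its own stimulus distribution, rather than treating every spike identically.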
Affiliation(s)
- Zane N Aldworth
- Center for Computational Biology, Montana State University, Bozeman, Montana, United States of America.
17
Should spikes be treated with equal weightings in the generation of spectro-temporal receptive fields? J Physiol Paris 2009; 104:215-22. [PMID: 19941954] [DOI: 10.1016/j.jphysparis.2009.11.026]
Abstract
Knowledge of the trigger features of central auditory neurons is important for understanding speech processing. Spectro-temporal receptive fields (STRFs) obtained using random stimuli and spike-triggered averaging allow visualization of trigger features, which often appear blurry in the time-versus-frequency plot. For clearer visualization, we previously developed a dejittering algorithm to sharpen trigger features in the STRFs of FM-sensitive cells. Here we extended this algorithm to segregate spikes, based on their dejitter values, into two groups, normal and outlying, and to construct their STRFs separately. We found that while the STRF of the normal-jitter group resembled the full trigger feature in the original STRF, that of the outlying-jitter group resembled a different or partial trigger feature. The extended algorithm thus allows the extraction of other, weaker trigger features. Given the presence of different trigger features in a given cell, we propose that in the generation of STRFs, evoked spikes should not be treated indiscriminately with equal weightings.
18
Dimitrov AG, Sheiko MA, Baker J, Yen SC. Spatial and temporal jitter distort estimated functional properties of visual sensory neurons. J Comput Neurosci 2009; 27:309-19. [PMID: 19353259] [DOI: 10.1007/s10827-009-0144-8]
Abstract
The functional properties of neural sensory cells or small neural ensembles are often characterized by analyzing response-conditioned stimulus ensembles. Many widely used analytical methods, like receptive fields (RF), Wiener kernels or spatio-temporal receptive fields (STRF), rely on simple statistics of those ensembles. They also tend to rely on simple noise models for the residuals of the conditional ensembles. However, in many cases the response-conditioned stimulus set has more complex structure. If not taken explicitly into account, it can bias the estimates of many simple statistics, and lead to erroneous conclusions about the functionality of a neural sensory system. In this article, we consider sensory noise in the visual system generated by small stimulus shifts in two dimensions (2 spatial or 1-space 1-time jitter). We model this noise as the action of a set of translations onto the stimulus that leave the response invariant. The analysis demonstrates that the spike-triggered average is a biased estimator of the model mean, and provides a de-biasing method. We apply this approach to observations from the stimulus/response characteristics of cells in the cat visual cortex and provide improved estimates of the structure of visual receptive fields. In several cases the new estimates differ substantially from the classic receptive fields, to a degree that may require re-evaluation of the functional description of the associated cells.
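The central bias the abstract describes, namely that jitter blurs a naive spike-triggered average, can be illustrated with a toy simulation. The Gaussian filter shape, jitter range, and noise level below are arbitrary choices for illustration, not the paper's model or its de-biasing method:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 41
x = np.arange(n)
true_rf = np.exp(-0.5 * ((x - n // 2) / 2.0) ** 2)  # narrow Gaussian "receptive field"

# Each spike-triggered stimulus is modeled as the RF plus noise,
# randomly shifted in space (the jitter acting as a translation).
n_spikes = 5000
jitters = rng.integers(-4, 5, size=n_spikes)
samples = np.stack([np.roll(true_rf, j) for j in jitters])
samples += rng.normal(0.0, 0.5, size=samples.shape)

sta = samples.mean(axis=0)  # the naive spike-triggered average

# The STA is the true RF convolved with the jitter distribution:
# same total mass, but a lower peak and a wider profile than the true RF.
print(round(true_rf.max(), 3), round(sta.max(), 3))
```

De-biasing amounts to estimating the jitter distribution and deconvolving it from the STA, which is the general direction the paper's correction takes.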
Affiliation(s)
- Alexander G Dimitrov
- Center for Computational Biology, Montana State University, Bozeman, MT 59717, USA.
19
Kouh M, Sharpee TO. Estimating linear-nonlinear models using Renyi divergences. Network 2009; 20:49-68. [PMID: 19568981] [PMCID: PMC2782376] [DOI: 10.1080/09548980902950891]
Abstract
This article compares a family of methods for characterizing neural feature selectivity using natural stimuli in the framework of the linear-nonlinear model. In this model, the spike probability depends in a nonlinear way on a small number of stimulus dimensions. The relevant stimulus dimensions can be found by optimizing a Rényi divergence that quantifies a change in the stimulus distribution associated with the arrival of single spikes. Generally, good reconstructions can be obtained based on optimization of Rényi divergence of any order, even in the limit of small numbers of spikes. However, the smallest error is obtained when the Rényi divergence of order 1 is optimized. This type of optimization is equivalent to information maximization, and is shown to saturate the Cramer-Rao bound describing the smallest error allowed for any unbiased method. We also discuss conditions under which information maximization provides a convenient way to perform maximum likelihood estimation of linear-nonlinear models from neural data.
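The order-1 case, the KL divergence between the spike-conditioned and prior distributions of the projected stimulus, can be sketched on simulated data. The LN neuron, the histogram-based divergence estimate, and the function names below are illustrative assumptions rather than the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
dim, n = 20, 20000
stim = rng.normal(size=(n, dim))             # white Gaussian stimuli

w_true = np.zeros(dim)
w_true[3] = 1.0                              # the neuron's (hidden) relevant dimension
p_spike = 1.0 / (1.0 + np.exp(-3.0 * (stim @ w_true - 1.0)))  # LN: sigmoid nonlinearity
spikes = rng.random(n) < p_spike

def kl_score(w):
    """KL divergence (order-1 Renyi divergence) between the spike-conditioned
    and prior distributions of the stimulus projected onto w, via histograms."""
    proj = stim @ (w / np.linalg.norm(w))
    bins = np.linspace(proj.min(), proj.max(), 30)
    p, _ = np.histogram(proj[spikes], bins=bins, density=True)
    q, _ = np.histogram(proj, bins=bins, density=True)
    mask = (p > 0) & (q > 0)
    dx = bins[1] - bins[0]
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

# Projections onto the true dimension carry far more divergence than
# projections onto a random direction.
score_true = kl_score(w_true)
score_rand = kl_score(rng.normal(size=dim))
```

Optimizing this score over `w` (rather than evaluating it at two fixed directions) is what recovers the relevant dimension in practice.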
Affiliation(s)
- Minjoon Kouh
- The Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037; The Center for Theoretical Biological Physics, University of California, San Diego, La Jolla, CA
- Tatyana O. Sharpee
- The Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, La Jolla, CA 92037; The Center for Theoretical Biological Physics, University of California, San Diego, La Jolla, CA
20
Gollisch T, Meister M. Modeling convergent ON and OFF pathways in the early visual system. Biol Cybern 2008; 99:263-278. [PMID: 19011919] [PMCID: PMC2784078] [DOI: 10.1007/s00422-008-0252-y]
Abstract
For understanding the computation and function of single neurons in sensory systems, one needs to investigate how sensory stimuli are related to a neuron's response and which biological mechanisms underlie this relationship. Mathematical models of the stimulus-response relationship have proved very useful in approaching these issues in a systematic, quantitative way. A starting point for many such analyses has been provided by phenomenological "linear-nonlinear" (LN) models, which comprise a linear filter followed by a static nonlinear transformation. The linear filter is often associated with the neuron's receptive field. However, the structure of the receptive field is generally a result of inputs from many presynaptic neurons, which may form parallel signal processing pathways. In the retina, for example, certain ganglion cells receive excitatory inputs from ON-type as well as OFF-type bipolar cells. Recent experiments have shown that the convergence of these pathways leads to intriguing response characteristics that cannot be captured by a single linear filter. One approach to adjust the LN model to the biological circuit structure is to use multiple parallel filters that capture ON and OFF bipolar inputs. Here, we review these new developments in modeling neuronal responses in the early visual system and provide details about one particular technique for obtaining the required sets of parallel filters from experimental data.
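The parallel-filter idea can be sketched as a minimal two-pathway LN model. The filter shapes and the choice of half-wave rectification below are illustrative assumptions, not the specific filters recovered in the paper:

```python
import numpy as np

T = 20                                       # temporal filter length (samples)
t = np.arange(T)
f_on = np.exp(-t / 4.0) * np.sin(t / 2.0)    # hypothetical biphasic ON filter
f_off = -f_on                                # mirror-image OFF filter

def rect(x):
    return np.maximum(x, 0.0)                # static half-wave rectification

def rate_two_pathway(stim):
    """LN-style model with two parallel rectified subunits (ON + OFF),
    summed before the output, as in the convergent-pathway picture."""
    g_on = np.convolve(stim, f_on, mode="full")[:len(stim)]
    g_off = np.convolve(stim, f_off, mode="full")[:len(stim)]
    return rect(g_on) + rect(g_off)

# A bright step and a dark step drive this model equally strongly,
# a response pattern that a single linear filter followed by one
# static nonlinearity cannot reproduce.
bright = np.zeros(100)
bright[50:] = 1.0
dark = -bright
r_bright = rate_two_pathway(bright).max()
r_dark = rate_two_pathway(dark).max()
```

With `f_off = -f_on` the summed output equals `|g_on|`, which is why the two step polarities give identical peak responses here; real ON and OFF filters need not be exact mirror images.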
Affiliation(s)
- Tim Gollisch
- Visual Coding Group, Max Planck Institute of Neurobiology, Am Klopferspitz 18, 82152 Martinsried, Germany
- Markus Meister
- Department of Molecular and Cellular Biology, Center for Brain Science, Harvard University, 16 Divinity Ave, Cambridge, MA 02138 USA
21
Eyherabide HG, Rokem A, Herz AVM, Samengo I. Burst firing is a neural code in an insect auditory system. Front Comput Neurosci 2008; 2:3. [PMID: 18946533] [PMCID: PMC2525941] [DOI: 10.3389/neuro.10.003.2008]
Abstract
Various classes of neurons alternate between high-frequency discharges and silent intervals. This phenomenon is called burst firing. To analyze burst activity in an insect system, grasshopper auditory receptor neurons were recorded in vivo for several distinct stimulus types. The experimental data show that both burst probability and burst characteristics are strongly influenced by temporal modulations of the acoustic stimulus. The tendency to burst, hence, is not only determined by cell-intrinsic processes, but also by their interaction with the stimulus time course. We study this interaction quantitatively and observe that bursts containing a certain number of spikes occur shortly after stimulus deflections of specific intensity and duration. Our findings suggest a sparse neural code where information about the stimulus is represented by the number of spikes per burst, irrespective of the detailed interspike-interval structure within a burst. This compact representation cannot be interpreted as a firing-rate code. An information-theoretical analysis reveals that the number of spikes per burst reliably conveys information about the amplitude and duration of sound transients, whereas their time of occurrence is reflected by the burst onset time. The investigated neurons encode almost half of the total transmitted information in burst activity.
Affiliation(s)
- Hugo G Eyherabide
- Institute for Theoretical Biology, Department of Biology, Humboldt Universität Berlin, Germany
22
Siveke I, Leibold C, Grothe B. Spectral composition of concurrent noise affects neuronal sensitivity to interaural time differences of tones in the dorsal nucleus of the lateral lemniscus. J Neurophysiol 2007; 98:2705-15. [PMID: 17699697] [DOI: 10.1152/jn.00275.2007]
Abstract
We are regularly exposed to several concurrent sounds, producing a mixture of binaural cues. The neuronal mechanisms underlying the localization of concurrent sounds are not well understood. The major binaural cues for localizing low-frequency sounds in the horizontal plane are interaural time differences (ITDs). Auditory brain stem neurons encode ITDs by firing maximally in response to "favorable" ITDs and weakly or not at all in response to "unfavorable" ITDs. We recorded from ITD-sensitive neurons in the dorsal nucleus of the lateral lemniscus (DNLL) while presenting pure tones at different ITDs embedded in noise. We found that increasing levels of concurrent white noise suppressed the maximal response rate to tones with favorable ITDs and slightly enhanced the response rate to tones with unfavorable ITDs. Nevertheless, most of the neurons maintained ITD sensitivity to tones even for noise intensities equal to that of the tone. Using concurrent noise with a spectral composition in which the neuron's excitatory frequencies are omitted reduced the maximal response similar to that obtained with concurrent white noise. This finding indicates that the decrease of the maximal rate is mediated by suppressive cross-frequency interactions, which we also observed during monaural stimulation with additional white noise. In contrast, the enhancement of the firing rate to tones at unfavorable ITD might be due to early binaural interactions (e.g., at the level of the superior olive). A simple simulation corroborates this interpretation. Taken together, these findings suggest that the spectral composition of a concurrent sound strongly influences the spatial processing of ITD-sensitive DNLL neurons.
Affiliation(s)
- Ida Siveke
- Division of Neurobiology, Department Biology II, Ludwig-Maximilians-Universität München, Germany
23
Herz AVM, Gollisch T, Machens CK, Jaeger D. Modeling single-neuron dynamics and computations: a balance of detail and abstraction. Science 2006; 314:80-5. [PMID: 17023649] [DOI: 10.1126/science.1127240]
Abstract
The fundamental building block of every nervous system is the single neuron. Understanding how these exquisitely structured elements operate is an integral part of the quest to solve the mysteries of the brain. Quantitative mathematical models have proved to be an indispensable tool in pursuing this goal. We review recent advances and examine how single-cell models on five levels of complexity, from black-box approaches to detailed compartmental simulations, address key questions about neural dynamics and signal processing.
Affiliation(s)
- Andreas V M Herz
- Bernstein Center for Computational Neuroscience Berlin and Humboldt-Universität zu Berlin, Berlin 10099, Germany.
24
Dimitrov AG, Gedeon T. Effects of stimulus transformations on estimates of sensory neuron selectivity. J Comput Neurosci 2006; 20:265-83. [PMID: 16683207] [DOI: 10.1007/s10827-006-6357-1]
Abstract
Stimulus selectivity of sensory systems is often characterized by analyzing response-conditioned stimulus ensembles. However, in many cases these response-triggered stimulus sets have structure that is more complex than assumed. If present and not taken into account, this structure will bias the estimates of many simple statistics and distort the estimated stimulus selectivity of a neural sensory system. We present an approach that mitigates these problems by modeling some of the response-conditioned stimulus structure as being generated by a set of transformations acting on a simple stimulus distribution. This approach corrects the estimates of key statistics and counters biases introduced by the transformations. In cases involving temporal spike jitter or spatial jitter of images, the main observed effects of the transformations are blurring of the conditional mean and the introduction of artefacts in the spectral decomposition of the conditional covariance matrix. We illustrate this approach by analyzing and correcting a set of model stimuli perturbed by temporal and spatial jitter, and apply it to neurophysiological data from the cricket cercal sensory system to correct the effects of temporal jitter.
Affiliation(s)
- Alexander G Dimitrov
- Center for Computational Biology, Montana State University, Bozeman, Montana, USA.