1
Dou H, Wang H, Liu S, Huang J, Liu Z, Zhou T, Yang Y. Form Properties of Moving Targets Bias Smooth Pursuit Target Selection in Monkeys. Neurosci Bull 2023; 39:1246-1262. PMID: 36689042; PMCID: PMC10387034; DOI: 10.1007/s12264-023-01022-z.
Abstract
During natural viewing, we often recognize multiple objects, detect their motion, and select one object as the target to track. It remains to be determined how such behavior is guided by the integration of visual form and motion perception. To address this, we studied how monkeys made a choice to track moving targets with different forms by smooth pursuit eye movements in a two-target task. We found that pursuit responses were biased toward the motion direction of a target with a hole. By computing the relative weighting, we found that the target with a hole exhibited a larger weight for vector computation. The global hole feature dominated other form properties. This dominance failed to account for changes in pursuit responses to a target with different forms moving singly. These findings suggest that the integration of visual form and motion perception can reshape the competition in sensorimotor networks to guide behavioral selection.
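The relative-weighting computation described in this abstract amounts to a weighted vector average of the two targets' motion vectors. A minimal sketch (the function name and weight values are illustrative, not the authors' fitting procedure):

```python
import numpy as np

def pursuit_prediction(v1, v2, w1):
    """Weighted vector average of two target velocities (deg/s).

    w1 is the relative weight of target 1; target 2 gets 1 - w1.
    w1 = 0.5 is a pure vector average; w1 -> 1 approaches winner-take-all.
    """
    v1, v2 = np.asarray(v1, dtype=float), np.asarray(v2, dtype=float)
    return w1 * v1 + (1.0 - w1) * v2

# Two targets moving rightward and upward at 10 deg/s:
print(pursuit_prediction([10, 0], [0, 10], 0.5))  # equal weights: [5. 5.]
print(pursuit_prediction([10, 0], [0, 10], 0.7))  # biased toward target 1: [7. 3.]
```

A weight above 0.5 for the target with a hole would bias the predicted pursuit vector toward that target's direction, which is the form of bias the study reports.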
Affiliation(s)
- Huixi Dou
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, 100101, China
- Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, 230088, China
- Huan Wang
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, 100101, China
- Sainan Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, 100101, China
- Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, 230026, China
- Jun Huang
- Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, 230088, China
- Zuxiang Liu
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, 100101, China
- Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, 230088, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
- Tiangang Zhou
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, 100101, China
- Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, 230088, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
- Yan Yang
- State Key Laboratory of Brain and Cognitive Science, Institute of Biophysics, Chinese Academy of Sciences, Beijing, 100101, China
- Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, 230088, China
- University of Chinese Academy of Sciences, Beijing, 100049, China
2
Barthélemy FV, Fleuriet J, Perrinet LU, Masson GS. A behavioral receptive field for ocular following in monkeys: Spatial summation and its spatial frequency tuning. eNeuro 2022; 9:ENEURO.0374-21.2022. PMID: 35760525; PMCID: PMC9275147; DOI: 10.1523/eneuro.0374-21.2022.
Abstract
In human and non-human primates, reflexive tracking eye movements can be initiated at very short latency in response to a rapid shift of the image. Previous studies in humans have shown that only a part of the central visual field is optimal for driving ocular following responses. Here, we investigated spatial summation of motion information across a wide range of spatial frequencies and speeds of drifting gratings by recording short-latency ocular following responses in macaque monkeys. We show that the optimal stimulus size for driving ocular responses covers a small (<20° diameter), central part of the visual field that shrinks with higher spatial frequency. This signature of linear motion integration remains invariant with speed and temporal frequency. For low and medium spatial frequencies, we found a strong suppressive influence from surround motion, evidenced by a decrease of response amplitude for stimulus sizes larger than optimal. Such suppression disappears with gratings at high frequencies. The contribution of peripheral motion was investigated by presenting grating annuli of increasing eccentricity. We observed an exponential decay of response amplitude with grating eccentricity, the decrease being faster for higher spatial frequencies. Weaker surround suppression can thus be explained by sparser eccentric inputs at high frequencies. A Difference-of-Gaussians model best renders the antagonistic contributions of peripheral and central motions. Its best-fit parameters coincide with several well-known spatial properties of area MT neuronal populations. These results describe the mechanism by which central motion information is automatically integrated in a context-dependent manner to drive ocular responses.
Significance Statement: Ocular following is driven by visual motion at ultra-short latency in both humans and monkeys. Its dynamics reflect the properties of low-level motion integration. Here, we show that a strong center-surround suppression mechanism modulates initial eye velocity. Its spatial properties depend upon the spatial frequency of the visual input but are insensitive to either its temporal frequency or speed. These properties are best described with a Difference-of-Gaussians model of spatial integration. The model parameters reflect many spatial characteristics of motion-sensitive neuronal populations in monkey area MT. Our results further outline the computational properties of the behavioral receptive field underpinning automatic, context-dependent motion integration.
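A Difference-of-Gaussians model of the kind invoked here can be sketched numerically: an excitatory center and a larger suppressive surround are each integrated out to the stimulus radius, so the predicted response grows with size, peaks at an optimum, and then declines. The parameter values below are illustrative placeholders, not the paper's best-fit values:

```python
from math import erf

def dog_response(diameter, ke=1.0, se=5.0, ki=0.5, si=20.0):
    """Difference-of-Gaussians prediction of response vs. stimulus diameter.

    ke/se: gain and size (deg) of the excitatory center.
    ki/si: gain and size (deg) of the suppressive surround.
    erf() gives the integral of each Gaussian out to the stimulus radius.
    """
    r = diameter / 2.0
    return ke * erf(r / se) - ki * erf(r / si)

# Response grows with size, peaks at an optimum, then is surround-suppressed:
small, mid, large = (dog_response(d) for d in (5, 20, 80))
```

Shrinking the center size `se` (mimicking a higher spatial frequency) moves the optimum toward smaller stimuli, the dependence the abstract describes.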
Affiliation(s)
- Frédéric V Barthélemy
- Institut de Neurosciences de la Timone, UMR7289, CNRS & Aix-Marseille Université, 13385 Marseille, France
- Jérôme Fleuriet
- Institut de Neurosciences de la Timone, UMR7289, CNRS & Aix-Marseille Université, 13385 Marseille, France
- Assistance Publique-Hôpitaux de Paris, Intensive Care Unit, Raymond Poincaré Hospital, Garches, France
- Laurent U Perrinet
- Institut de Neurosciences de la Timone, UMR7289, CNRS & Aix-Marseille Université, 13385 Marseille, France
- Guillaume S Masson
- Institut de Neurosciences de la Timone, UMR7289, CNRS & Aix-Marseille Université, 13385 Marseille, France
3
Yoshimoto S, Hayasaka T. Common and independent processing of visual motion perception and oculomotor response. J Vis 2022; 22:6. PMID: 35293955; PMCID: PMC8944401; DOI: 10.1167/jov.22.4.6.
Abstract
Visual motion signals are used not only to drive motion perception but also to elicit oculomotor responses. A fundamental question is whether perceptual and oculomotor processing of motion signals shares a common mechanism. This study aimed to address this question using visual motion priming, in which the perceived direction of a directionally ambiguous stimulus is biased in the same (positive priming) or opposite (negative priming) direction as that of a priming stimulus. The priming effect depends on the duration of the priming stimulus. It is assumed that positive and negative priming are mediated by high- and low-level motion systems, respectively. Participants were asked to judge the perceived direction of a π-phase-shifted test grating presented after a smoothly drifting priming grating of varied durations. Their eye movements were measured while the test grating was presented. Perception and eye movements were discrepant under positive priming and correlated under negative priming on a trial-by-trial basis when an interstimulus interval was inserted between the priming and test stimuli, indicating that the eye movements were evoked by the test stimulus per se. These findings suggest that perceptual and oculomotor responses are induced by a common mechanism at a low level of motion processing but by independent mechanisms at a high level of motion processing.
Affiliation(s)
- Sanae Yoshimoto
- School of Integrated Arts and Sciences, Hiroshima University, Hiroshima, Japan
- Tomoyuki Hayasaka
- School of Integrated Arts and Sciences, Hiroshima University, Hiroshima, Japan
4
Spatiotemporal Filter for Visual Motion Integration from Pursuit Eye Movements in Humans and Monkeys. J Neurosci 2016; 37:1394-1412. PMID: 28003348; DOI: 10.1523/jneurosci.2682-16.2016.
Abstract
Despite the enduring interest in motion integration, a direct measure of the space-time filter that the brain imposes on a visual scene has been elusive. This is perhaps because of the challenge of estimating a 3D function from perceptual reports in psychophysical tasks. We take a different approach. We exploit the close connection between visual motion estimates and smooth pursuit eye movements to measure stimulus-response correlations across space and time, computing the linear space-time filter for global motion direction in humans and monkeys. Although derived from eye movements, we find that the filter predicts perceptual motion estimates quite well. To distinguish visual from motor contributions to the temporal duration of the pursuit motion filter, we recorded single-unit responses in the monkey middle temporal cortical area (MT). We find that pursuit response delays are consistent with the distribution of cortical neuron latencies and that temporal motion integration for pursuit is consistent with a short integration MT subpopulation. Remarkably, the visual system appears to preferentially weight motion signals across a narrow range of foveal eccentricities rather than uniformly over the whole visual field, with a transiently enhanced contribution from locations along the direction of motion. We find that the visual system is most sensitive to motion falling at approximately one-third the radius of the stimulus aperture. Hypothesizing that the visual drive for pursuit is related to the filtered motion energy in a motion stimulus, we compare measured and predicted eye acceleration across several other target forms.
Significance Statement: A compact model of the spatial and temporal processing underlying global motion perception has been elusive. We used visually driven smooth eye movements to find the 3D space-time function that best predicts both eye movements and perception of translating dot patterns. We found that the visual system does not appear to use all available motion signals uniformly, but rather weights motion preferentially in a narrow band at approximately one-third the radius of the stimulus. Although not universal, the filter predicts responses to other types of stimuli, demonstrating a remarkable degree of generalization that may lead to a deeper understanding of visual motion processing.
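The stimulus-response correlation approach used here for recovering a linear filter can be sketched on synthetic data: for a white-noise input, the cross-correlation between response and stimulus at each lag is proportional to the filter weight at that lag. The filter and signal below are invented for illustration, not the measured pursuit filter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear temporal filter: weights over the last 5 time steps
true_filter = np.array([0.0, 0.1, 0.4, 0.3, 0.1])

# White-noise motion input and the response of a purely linear system
stim = rng.normal(size=5000)
resp = np.convolve(stim, true_filter, mode="full")[:stim.size]

# For unit-variance white noise, E[resp(t) * stim(t - k)] equals filter[k],
# so the empirical cross-correlation recovers the filter lag by lag.
lags = len(true_filter)
est = np.array([resp[k:] @ stim[:stim.size - k] / (stim.size - k)
                for k in range(lags)])
# est should closely match true_filter
```

The same logic extends to space-time: correlating eye velocity against local motion at each spatial location and temporal lag yields the 3D filter the paper estimates.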
5
Sheliga BM, Quaia C, FitzGibbon EJ, Cumming BG. Ocular-following responses to white noise stimuli in humans reveal a novel nonlinearity that results from temporal sampling. J Vis 2016; 16:8. PMID: 26762277; PMCID: PMC4743714; DOI: 10.1167/16.1.8.
Abstract
White noise stimuli are frequently used to study the visual processing of broadband images in the laboratory. A common goal is to describe how responses are derived from Fourier components in the image. We investigated this issue by recording the ocular-following responses (OFRs) to white noise stimuli in human subjects. For a given speed we compared OFRs to unfiltered white noise with those to noise filtered with band-pass filters and notch filters. Removing components with low spatial frequency (SF) reduced OFR magnitudes, and the SF associated with the greatest reduction matched the SF that produced the maximal response when presented alone. This reduction declined rapidly with SF, compatible with a winner-take-all operation. Removing higher SF components increased OFR magnitudes. For higher speeds this effect became larger and propagated toward lower SFs. All of these effects were quantitatively well described by a model that combined two factors: (a) an excitatory drive that reflected the OFRs to individual Fourier components and (b) a suppression by higher SF channels where the temporal sampling of the display led to flicker. This nonlinear interaction has an important practical implication: Even with high refresh rates (150 Hz), the temporal sampling introduced by visual displays has a significant impact on visual processing. For instance, we show that this distorts speed tuning curves, shifting the peak to lower speeds. Careful attention to spectral content, in the light of this nonlinearity, is necessary to minimize the resulting artifact when using white noise patterns undergoing apparent motion.
6
Quaia C, Optican LM, Cumming BG. A Motion-from-Form Mechanism Contributes to Extracting Pattern Motion from Plaids. J Neurosci 2016; 36:3903-18. PMID: 27053199; PMCID: PMC4821905; DOI: 10.1523/jneurosci.3398-15.2016.
Abstract
Since the discovery of neurons selective for pattern motion direction in primate middle temporal area MT (Albright, 1984; Movshon et al., 1985), the neural computation of this signal has been the subject of intense study. The bulk of this work has explored responses to plaids obtained by summing two drifting sinusoidal gratings. Unfortunately, with these stimuli, many different mechanisms are similarly effective at extracting pattern motion. We devised a new set of stimuli, obtained by summing two random line stimuli with different orientations. This allowed several novel manipulations, including generating plaids that do not contain rigid 2D motion. Importantly, these stimuli do not engage most of the previously proposed mechanisms. We then recorded the ocular following responses that such stimuli induce in human subjects. We found that pattern motion is computed even with stimuli that do not cohere perceptually, including those without rigid motion, and even when the two gratings are presented separately to the two eyes. Moderate temporal and/or spatial separation of the gratings impairs the computation. We show that, of the models proposed so far, only those based on the intersection-of-constraints rule, embedding a motion-from-form mechanism (in which orientation signals are used in the computation of motion direction signals), can account for our results. At least for the eye movements reported here, a motion-from-form mechanism is thus involved in one of the most basic functions of the visual motion system: extracting motion direction from complex scenes.
Significance Statement: Anatomical considerations led to the proposal that visual function is organized in separate processing streams: one (ventral) devoted to form and one (dorsal) devoted to motion. Several experimental results have challenged this view, arguing in favor of a more integrated view of visual processing. Here we add to this body of work, supporting a role for form information even in a function (extracting pattern motion direction from complex scenes) for which decisive evidence for the involvement of form signals has been lacking.
Affiliation(s)
- Christian Quaia
- Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Department of Health and Human Services, Bethesda, Maryland 20892
- Lance M Optican
- Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Department of Health and Human Services, Bethesda, Maryland 20892
- Bruce G Cumming
- Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Department of Health and Human Services, Bethesda, Maryland 20892
7
Glasser DM, Tadin D. Modularity in the motion system: independent oculomotor and perceptual processing of brief moving stimuli. J Vis 2014; 14:28. PMID: 24665091; DOI: 10.1167/14.3.28.
Abstract
In addition to motion perception per se, we utilize motion information for a wide range of brain functions. These varied functions place different demands on the visual system, and therefore a stimulus that provides useful information for one function may be inadequate for another. For example, the direction of motion of large high-contrast stimuli is difficult to discriminate perceptually, but other studies have shown that such stimuli are highly effective at eliciting directional oculomotor responses such as the ocular following response (OFR). Here, we investigated the degree of independence between perceptual and oculomotor processing by determining whether perceptually suppressed moving stimuli can nonetheless evoke reliable eye movements. We measured reflexively evoked tracking eye movements while observers discriminated the motion direction of large high-contrast stimuli. To quantify the discrimination ability of the oculomotor system, we used signal detection theory to generate associated oculometric functions. The results showed that oculomotor sensitivity to motion direction is not predicted by perceptual sensitivity to the same stimuli. In fact, in several cases oculomotor responses were more reliable than perceptual responses. Moreover, a trial-by-trial analysis indicated that, for stimuli tested in this study, oculomotor processing was statistically independent from perceptual processing. Evidently, perceptual and oculomotor responses reflect the activity of independent dissociable mechanisms despite operating on the same input. While results of this kind have traditionally been interpreted in the framework of perception versus action, we propose that these differences reflect a more general principle of modularity.
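The oculometric analysis described here follows standard signal detection theory: trial-by-trial eye-velocity readouts on leftward vs. rightward motion trials are treated like two response distributions, and discriminability is summarized as d′. A minimal sketch on synthetic data (the distributions and function name are invented for illustration, not the study's measurements):

```python
import numpy as np
from statistics import NormalDist

def oculometric_dprime(v_right, v_left):
    """d' for discriminating motion direction from trial-by-trial eye velocity.

    v_right / v_left: horizontal eye-velocity samples on rightward vs.
    leftward motion trials. Assumes roughly equal-variance Gaussians.
    """
    pooled_sd = np.sqrt((np.var(v_right, ddof=1) + np.var(v_left, ddof=1)) / 2)
    return (np.mean(v_right) - np.mean(v_left)) / pooled_sd

rng = np.random.default_rng(0)
v_right = rng.normal(+1.0, 1.0, size=2000)  # deg/s, rightward trials
v_left = rng.normal(-1.0, 1.0, size=2000)   # deg/s, leftward trials

d = oculometric_dprime(v_right, v_left)
# Proportion correct of an ideal observer reading out eye velocity:
pc = NormalDist().cdf(d / 2)
```

Comparing this oculometric d′ with the psychometric d′ for the same stimuli is what allows the dissociation the authors report.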
8
Abstract
Active sensation poses unique challenges to sensory systems because moving the sensor necessarily alters the input sensory stream. Sensory input quality is additionally compromised if the sensor moves rapidly, as during rapid eye movements, making the period immediately after the movement critical for recovering reliable sensation. Here, we studied this immediate postmovement interval for the case of microsaccades during fixation, which rapidly jitter the "sensor" exactly when it is being voluntarily stabilized to maintain clear vision. We characterized retinal-image slip in monkeys immediately after microsaccades by analyzing postmovement ocular drifts. We observed enhanced ocular drifts by up to ~28% relative to premicrosaccade levels, and for up to ~50 ms after movement end. Moreover, we used a technique to trigger full-field image motion contingent on real-time microsaccade detection, and we used the initial ocular following response to this motion as a proxy for changes in early visual motion processing caused by microsaccades. When the full-field image motion started during microsaccades, ocular following was strongly suppressed, consistent with detrimental retinal effects of the movements. However, when the motion started after microsaccades, there was up to ~73% increase in ocular following speed, suggesting an enhanced motion sensitivity. These results suggest that the interface between even the smallest possible saccades and "fixation" includes a period of faster than usual image slip, as well as an enhanced responsiveness to image motion, and that both of these phenomena need to be considered when interpreting the pervasive neural and perceptual modulations frequently observed around the time of microsaccades.
9
Sheliga BM, Quaia C, Cumming BG, Fitzgibbon EJ. Spatial summation properties of the human ocular following response (OFR): dependence upon the spatial frequency of the stimulus. Vision Res 2012; 68:1-13. PMID: 22819728; PMCID: PMC3430370; DOI: 10.1016/j.visres.2012.07.006.
Abstract
Ocular following responses (OFRs) are the initial tracking eye movements that can be elicited at ultra-short latency by sudden motion of a textured pattern. The OFR magnitude depends upon stimulus size, and also upon the spatial frequency (SF) of sine-wave gratings. Here we investigate the interaction of size and SF. We recorded initial OFRs in human subjects when 1D vertical sine-wave gratings were subject to horizontal motion. Gratings were restricted to elongated horizontal apertures ("strips") aligned with the axis of motion. In Experiment 1, the SF and the height of a single strip were manipulated. The magnitude of the OFR increased with strip height up to some optimum value, while strip heights greater than this optimum produced smaller responses. This effect was strongly dependent on SF: the optimum strip height was smaller for higher SFs. To explore the underlying mechanism, Experiment 2 measured OFRs to stimuli composed of two thin horizontal strips (one in the upper visual field, the other in the lower visual field) whose vertical separation varied 32-fold. Stimuli of different sizes can be reconstructed from the sum of such horizontal strips. We found that the OFRs in Experiment 1 were smaller than the sum of the responses to the component stimuli, but greater than the average of those responses. We defined an averaging coefficient that described whether a given response was closer to the sum or to the average. For any one SF, the averaging coefficients were similar over a wide range of stimulus sizes, while they varied considerably (7-fold) for stimuli of different SFs.
Affiliation(s)
- B M Sheliga
- Laboratory of Sensorimotor Research, National Eye Institute, National Institutes of Health, Bethesda, MD 20892, USA
10
Facilitative integration of local motion signals in the peripheral visual field observed in monkey ocular following responses. Neurosci Res 2012; 74:48-58. DOI: 10.1016/j.neures.2012.06.003.
11
Abstract
Ocular following responses (OFRs) are tracking eye movements elicited at ultrashort latency by the sudden movement of a textured pattern. Here we report the results of our study of their dependency on the spatial arrangement of the motion stimulus. Unlike previous studies that looked at the effect of stimulus size, we investigated the impact of stimulus location and how two distinct stimuli, presented together, collectively determine the OFR. We used as stimuli vertical gratings that moved in the horizontal direction and that were confined to either one or two 0.58° high strips, spanning the width of the screen. We found that the response to individual strips varied as a function of the location and spatial frequency (SF) of the stimulus. The response decreased as the stimulus eccentricity increased, but this relationship was more accentuated at high than at low spatial frequencies. We also found that when pairs of stimuli were presented, nearby stimuli interacted strongly, so that the response to the pair was barely larger than the response to a single strip in the pair. This suppressive effect faded away as the separation between the strips increased. The variation of the suppressive interaction with strip separation, paired with the dependency on eccentricity of the responses to single strips, caused the peak response for strip pairs to be achieved at a specific separation, which varied as a function of SF.
Affiliation(s)
- Christian Quaia
- Laboratory of Sensorimotor Research, National Eye Institute, Bethesda, MD 20815, USA
12
Masson GS, Perrinet LU. The behavioral receptive field underlying motion integration for primate tracking eye movements. Neurosci Biobehav Rev 2012; 36:1-25. DOI: 10.1016/j.neubiorev.2011.03.009.
13
Niu YQ, Lisberger SG. Sensory versus motor loci for integration of multiple motion signals in smooth pursuit eye movements and human motion perception. J Neurophysiol 2011; 106:741-53. PMID: 21593392; DOI: 10.1152/jn.01025.2010.
Abstract
We have investigated how visual motion signals are integrated for smooth pursuit eye movements by measuring the initiation of pursuit in monkeys for pairs of moving stimuli of the same or differing luminance. The initiation of pursuit for pairs of stimuli of the same luminance could be accounted for as a vector average of the responses to the two stimuli singly. When stimuli comprised two superimposed patches of moving dot textures, the brighter stimulus suppressed the inputs from the dimmer stimulus, so that the initiation of pursuit became winner-take-all when the luminance ratio of the two stimuli was 8 or greater. The dominance of the brighter stimulus could not be attributed to either the latency difference or the ratio of the eye accelerations for the bright and dim stimuli presented singly. When stimuli comprised either spot targets or two patches of dots moving across separate locations in the visual field, the brighter stimulus had a much weaker suppressive influence; the initiation of pursuit could be accounted for by nearly equal vector averaging of the responses to the two stimuli singly. The suppressive effects of the brighter stimulus also appeared in human perceptual judgments, but again only for superimposed stimuli. We conclude that one locus of the interaction of two moving visual stimuli is shared by perception and action and resides in local inhibitory connections in the visual cortex. A second locus resides deeper in sensory-motor processing and may be more closely related to action selection than to stimulus selection.
Affiliation(s)
- Yu-Qiong Niu
- Department of Physiology, Howard Hughes Medical Institute, University of California, San Francisco, California, USA
14
Distribution of optokinetic sensitivity across the retina of mice in relation to eye orientation. Neuroscience 2010; 168:200-8. DOI: 10.1016/j.neuroscience.2010.03.025.
15
Sheliga BM, Fitzgibbon EJ, Miles FA. The initial torsional Ocular Following Response (tOFR) in humans: a response to the total motion energy in the stimulus? J Vis 2009; 9:2.1-38. PMID: 20053093; DOI: 10.1167/9.12.2.
Abstract
We recorded the initial torsional Ocular Following Responses (tOFRs) elicited at short latency by visual images that occupied the frontal plane and rotated about the lines of sight. With 1-D radial gratings, the local spatio-temporal characteristics of these tOFRs closely resembled those we previously reported for the hOFRs to horizontal motion with 1-D vertical gratings. When the 1-D radial grating was subdivided into a number of concentric annuli, each with the same radial thickness, tOFRs were less than predicted from the sum of the responses to the individual annuli: spatial normalization. However, the normalization was much weaker than that which we previously reported for the hOFRs. Further, when the number, thickness, and contrast of these concentric annuli were varied systematically, the latency and magnitude of the tOFRs were well described by single monotonic functions when plotted against the product of the total area of the annuli and the square of their Michelson contrast ("A*C²"), consistent with the hypothesis that the onset and magnitude of the initial tOFR are determined by the total motion energy in the stimulus. When our previously published hOFR data were plotted against A*C², a single monotonic function sufficed to describe the latency but not the magnitude.
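The total-motion-energy hypothesis reduces the stimulus to a single drive variable, the product of area and squared Michelson contrast. A minimal sketch (the function name and numbers are illustrative):

```python
def motion_energy_drive(area, contrast):
    """Proxy for the stimulus drive to the tOFR: A * C^2.

    area: total stimulus area (e.g. deg^2); contrast: Michelson contrast (0..1).
    Stimuli with equal A*C^2 are predicted to yield equal response magnitudes.
    """
    return area * contrast ** 2

# Halving the contrast must be offset by a 4-fold increase in area
# to keep the predicted drive constant:
drive_full = motion_energy_drive(10.0, 1.0)  # 10.0
drive_half = motion_energy_drive(40.0, 0.5)  # 10.0
```

On this account, latency and magnitude collapse onto single monotonic functions of the drive, which is the pattern the annulus manipulations revealed.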
Affiliation(s)
- B M Sheliga
- Laboratory of Sensorimotor Research, National Eye Institute, Bethesda, MD, USA
16
Taki M, Miura K, Tabata H, Hisa Y, Kawano K. The effects of prolonged viewing of motion on short-latency ocular following responses. Exp Brain Res 2009; 195:195-205. PMID: 19308363; DOI: 10.1007/s00221-009-1768-7.
Abstract
The adaptive effects of prolonged viewing of conditioning motion on ocular following responses (OFRs) elicited by brief test motion of a random-dot pattern were studied in humans. We found that the OFRs were significantly reduced when the directions of the conditioning and test motions were the same. The effect of conditioning motion was still observed when the speeds of the conditioning and test motions did not match. The effect was larger when the conditioning duration was longer, and decayed over time with increased temporal separation between the conditioning and test periods. These results are consistent with the characteristics of motion adaptation on the initial smooth pursuit responses. We also obtained data suggesting that the persistence of the effect depends on visual stimulation in the time between the conditioning and test periods, and that the presence of a stationary visual stimulus facilitates recovery from the motion adaptation.
Affiliation(s)
- Masakatsu Taki
- Department of Integrative Brain Science, Graduate School of Medicine, Kyoto University, Kyoto, Japan