1
Yan C, Chen Y, Zhang Y, Kong L, Durgin FH, Li Z. EXPRESS: Perceptual scale expansion: A natural design for improving the precision of motor control. Q J Exp Psychol (Hove) 2022:17470218221115075. PMID: 35866338. DOI: 10.1177/17470218221115075.
Abstract
Space perception is systematically biased. Few theories of spatial bias address the possible functional advantages of mechanisms that produce spatial biases. The scale expansion hypothesis proposes that many spatial biases are due to the perceptual expansion of visual angles, which acts somewhat like a natural magnifying glass in vision. The present study examined the idea that visual expansion may improve motor precision (i.e., reduce motor variability) in movements when using closed-loop control but not when using open-loop control. Experiment 1 tested this idea in an online tracking task (closed-loop control), whereas Experiment 2 tested it in a fast-hitting task (open-loop control). The results were consistent with the hypothesis. To rule out the effect of the task difference (i.e., tracking vs. fast hitting), Experiment 3 examined the effect of visual expansion on the variability of motor performance in a line-reproduction task. The control type (closed-loop or open-loop) was manipulated by the form of visual feedback (online or offline). The results were again consistent with the present assumption. Taken together, the present data suggest that perceptual expansion in vision improves motor control precision when using closed-loop control (but not when using open-loop control), which supports the scale-expansion hypothesis. In addition, the present findings also improve our understanding of how visual error amplification affects motor control.
Affiliation(s)
- Chenyu Yan
- Department of Psychology and Behavioral Sciences, Zhejiang University (these authors contributed equally as first authors)
- Yilin Chen
- Department of Psychology and Behavioral Sciences, Zhejiang University
- Yu Zhang
- Department of Psychology and Behavioral Sciences, Zhejiang University
- Linghang Kong
- Department of Psychology and Behavioral Sciences, Zhejiang University
- Zhi Li
- Department of Psychology and Behavioral Sciences, Zhejiang University
2
Zhang J, Yang X, Jin Z, Li L. Distance Estimation in Virtual Reality Is Affected by Both the Virtual and the Real-World Environments. Iperception 2021; 12:20416695211023956. PMID: 34211686. PMCID: PMC8216372. DOI: 10.1177/20416695211023956.
Abstract
The experience of virtual reality (VR) is unique in that observers occupy a real-world location while browsing a virtual scene. Previous studies have investigated the effect of the virtual environment on distance estimation, but it is unclear how the real-world environment influences distance estimation in VR. Here, we measured distance estimation using a bisection method (Experiment 1) and a blind-walking method (Experiments 2 and 3). Participants performed distance judgments in VR, which rendered either virtual indoor or outdoor scenes, and the experiments were carried out in either real-world indoor or outdoor locations. In the bisection experiment, judged distance was greater in the virtual outdoor scene than in the virtual indoor scene, whereas the real-world environment had no impact on distance judgments estimated by bisection. In the blind-walking experiments, judged distance was greater in the real-world outdoor location than in the real-world indoor location, whereas the virtual environment had no impact on distance judgments estimated by blind-walking. Overall, our results suggest that both the virtual and the real-world environments affect distance judgment in VR. In particular, the real-world environment in which a person is physically located during a VR experience influences that person's distance estimation in VR.
Affiliation(s)
- Junjun Zhang
- MOE Key Lab for Neuroinformation, The Clinical Hospital of Chengdu Brain Science Institute, University of Electronic Science and Technology of China, Chengdu, China
- Xiaoyan Yang
- MOE Key Lab for Neuroinformation, The Clinical Hospital of Chengdu Brain Science Institute, University of Electronic Science and Technology of China, Chengdu, China
- Zhenlan Jin
- MOE Key Lab for Neuroinformation, The Clinical Hospital of Chengdu Brain Science Institute, University of Electronic Science and Technology of China, Chengdu, China
- Ling Li
- MOE Key Lab for Neuroinformation, The Clinical Hospital of Chengdu Brain Science Institute, University of Electronic Science and Technology of China, Chengdu, China
3
Kelly SA. Blind-Walking Behavior in the Dark Affected by Previewing the Testing Space. Perception 2019; 48:1058-1078. PMID: 31554477. DOI: 10.1177/0301006619876446.
Abstract
Visual environments affect egocentric distance perception under full-cue conditions. In this study, the effect of three spatial layouts was tested on the perceived location of a self-illuminated single target viewed in the dark. Blind-walking (BW) estimates underestimated target distance in all testing spaces, as expected, but were foreshortened significantly more in the shortest of the three testing rooms. Additional experiments revealed that neither changes in the perceived angle of declination nor changes in perceived eye height were responsible for this effect. The possibility that subjects made cognitive adjustments to BW behavior to reduce physical risk was assessed by remeasuring target locations in the three rooms with magnitude estimation and by comparing the BW results of subjects who had previewed the testing space with those of subjects who had not. The results support the conclusion that the effect of spatial layout is likely due to cognitive adjustments to BW behavior. They also indicate that the perceived angle of declination is always overestimated by a factor of at least 1.5. These results can be interpreted within the context of a theory of space perception called the angular expansion theory (AET).
Affiliation(s)
- Susan A Kelly
- Department of Vision Sciences, Illinois College of Optometry, Chicago, IL, USA
4
The role of top-down knowledge about environmental context in egocentric distance judgments. Atten Percept Psychophys 2019; 80:586-599. PMID: 29204865. DOI: 10.3758/s13414-017-1461-z.
Abstract
Judgments of egocentric distances in well-lit natural environments can differ substantially in indoor versus outdoor contexts. Visual cues (e.g., linear perspective, texture gradients) no doubt play a strong role in context-dependent judgments when cues are abundant. Here we investigated a possible top-down influence on distance judgments that might play a unique role under conditions of perceptual uncertainty: assumptions or knowledge that one is indoors or outdoors. We presented targets in a large outdoor field and in an indoor classroom. To control visual distance and depth cues between the environments, we restricted the field of view by using a 14-deg aperture. Evidence of context effects depended on the response mode: Blindfolded-walking responses were systematically shorter indoors than outdoors, whereas verbal and size gesture judgments showed no context effects. These results suggest that top-down knowledge about the environmental context does not strongly influence visually perceived egocentric distance. However, this knowledge can operate as an output-level bias, such that blindfolded-walking responses are shorter when observers' top-down knowledge indicates that they are indoors and when the size of the room is uncertain.
5
Etchemendy PE, Spiousas I, Vergara R. Relationship Between Auditory Context and Visual Distance Perception: Effect of Musical Expertise in the Ability to Translate Reverberation Cues Into Room-Size Perception. Perception 2018; 47:873-880. PMID: 29759044. DOI: 10.1177/0301006618776225.
Abstract
In a recently published work by our group [Scientific Reports, 7, 7189 (2017)], we performed experiments on visual distance perception in two dark rooms with extremely different reverberation times: one anechoic (T ∼ 0.12 s) and the other reverberant (T ∼ 4 s). The perceived distance of the targets was systematically greater in the reverberant room than in the anechoic chamber. Participants also provided auditorily perceived room-size ratings, which were greater for the reverberant room. Our hypothesis was that distance estimates are affected by room size, resulting in farther responses for the room perceived as larger. Of central importance to the task was the subjects' ability to infer room size from reverberation. In this article, we report a post hoc analysis showing that participants with musical expertise were better able to extract and translate reverberation cues into room-size information than nonmusicians. However, the degree to which musical expertise affects visual distance estimates remains unclear.
Affiliation(s)
- Pablo E Etchemendy
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
- Ignacio Spiousas
- Department of Psychiatry, Douglas Mental Health University Institute, McGill University, Montreal, Canada; BRAMS Laboratory, Centre for Research on Brain, Language and Music (CRBLM), Montreal, Canada
- Ramiro Vergara
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
6
Etchemendy PE, Abregú E, Calcagno ER, Eguia MC, Vechiatti N, Iasi F, Vergara RO. Auditory environmental context affects visual distance perception. Sci Rep 2017; 7:7189. PMID: 28775372. PMCID: PMC5543138. DOI: 10.1038/s41598-017-06495-3.
Abstract
In this article, we show that visual distance perception (VDP) is influenced by the auditory environmental context through reverberation-related cues. We performed two VDP experiments in two dark rooms with extremely different reverberation times: an anechoic chamber and a reverberant room. Subjects assigned to the reverberant room perceived the targets farther than subjects assigned to the anechoic chamber. Also, we found a positive correlation between the maximum perceived distance and the auditorily perceived room size. We next performed a second experiment in which the same subjects of Experiment 1 were interchanged between rooms. We found that subjects preserved the responses from the previous experiment provided they were compatible with the present perception of the environment; if not, perceived distance was biased towards the auditorily perceived boundaries of the room. Results of both experiments show that the auditory environment can influence VDP, presumably through reverberation cues related to the perception of room size.
Affiliation(s)
- Pablo E Etchemendy
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, B1876BXD, Bernal, Buenos Aires, Argentina
- Ezequiel Abregú
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, B1876BXD, Bernal, Buenos Aires, Argentina
- Esteban R Calcagno
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, B1876BXD, Bernal, Buenos Aires, Argentina
- Manuel C Eguia
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, B1876BXD, Bernal, Buenos Aires, Argentina
- Nilda Vechiatti
- Laboratorio de Acústica y Luminotecnia, Comisión de Investigaciones Científicas de la Provincia de Buenos Aires, Cno. Centenario e/505 y 508, M. B. Gonnet, Buenos Aires, Argentina
- Federico Iasi
- Laboratorio de Acústica y Luminotecnia, Comisión de Investigaciones Científicas de la Provincia de Buenos Aires, Cno. Centenario e/505 y 508, M. B. Gonnet, Buenos Aires, Argentina
- Ramiro O Vergara
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, B1876BXD, Bernal, Buenos Aires, Argentina
7
Abstract
Visual cognition in our 3D world requires understanding how we accurately localize objects in 2D and depth, and what influence both types of location information have on visual processing. Spatial location is known to play a special role in visual processing, but most of these findings have focused on the special role of 2D location. One such phenomenon is the spatial congruency bias (Golomb, Kupitz, & Thiemann, 2014), where 2D location biases judgments of object features but features do not bias location judgments. This paradigm has recently been used to compare different types of location information in terms of how much they bias different types of features. Here we used this paradigm to ask a related question: whether 2D and depth-from-disparity location bias localization judgments for each other. We found that presenting two objects in the same 2D location biased position-in-depth judgments, but presenting two objects at the same depth (disparity) did not bias 2D location judgments. We conclude that an object's 2D location may be automatically incorporated into perception of its depth location, but not vice versa, which is consistent with a fundamentally special role for 2D location in visual processing.
Affiliation(s)
- Nonie J. Finlayson
- Department of Psychology, Center for Cognitive & Brain Sciences, The Ohio State University, Columbus, OH 43210, USA
8
Wallin CP, Gajewski DA, Teplitz RW, Mihelic Jaidzeka S, Philbeck JW. The Roles for Prior Visual Experience and Age on the Extraction of Egocentric Distance. J Gerontol B Psychol Sci Soc Sci 2017; 72:91-99. PMID: 27473147. PMCID: PMC5156495. DOI: 10.1093/geronb/gbw089.
Abstract
Objectives: In a well-lit room, observers can generate well-constrained estimates of the distance to an object on the floor even with just a fleeting glimpse. Performance under these conditions is typically characterized by some underestimation but improves when observers have previewed the room. Such evidence suggests that information extracted from longer durations may be stored to contribute to the perception of distance at limited time frames. Here, we examined the possibility that this stored information is used differentially across age. Specifically, we posited that older adults would rely more than younger adults on information gathered and stored at longer glimpses to judge the distance of briefly glimpsed objects.
Method: We collected distance judgments from younger and older adults after brief target glimpses. Half of the participants were provided 20-s previews of the testing room in advance; the other half received no preview.
Results: Performance benefits were observed for all individuals with prior visual experience, and these were moderately more pronounced for the older adults.
Discussion: The results suggest that observers store contextual information gained from longer viewing durations to aid in the perception of distance at brief glimpses, and that this memory becomes more important with age.
Affiliation(s)
- Courtney P Wallin
- Department of Psychology, The George Washington University, District of Columbia
- Daniel A Gajewski
- Department of Psychology, The George Washington University, District of Columbia
- Rebeca W Teplitz
- Department of Psychology, The George Washington University, District of Columbia
- John W Philbeck
- Department of Psychology, The George Washington University, District of Columbia
9
Ratzlaff M, Nawrot M. A Pursuit Theory Account for the Perception of Common Motion in Motion Parallax. Perception 2016; 45:991-1007. PMID: 27060180. PMCID: PMC4990516. DOI: 10.1177/0301006616643679.
Abstract
The visual system uses an extraretinal pursuit eye-movement signal to disambiguate the perception of depth from motion parallax. Visual motion in the same direction as the pursuit is perceived as nearer in depth, whereas visual motion in the direction opposite to the pursuit is perceived as farther in depth. This explanation of depth sign applies to either an allocentric frame of reference centered on the fixation point or an egocentric frame of reference centered on the observer. A related problem is that of depth order when two stimuli have a common direction of motion. The first psychophysical study determined whether perception of egocentric depth order is adequately explained by a model employing an allocentric framework, especially when the motion parallax stimuli have common rather than divergent motion. A second study determined whether a reversal in perceived depth order, produced by a reduction in pursuit velocity, is also explained by this allocentric model. The results show that an allocentric model can explain both the egocentric perception of depth order with common motion and the perceptual depth-order reversal created by a reduction in pursuit velocity. We conclude that an egocentric model is not the only explanation for perceived depth order in these common-motion conditions.
Affiliation(s)
- Michael Ratzlaff
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
- Mark Nawrot
- Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND, USA
10
Gunzelmann G, Lyon DR. Constructing representations of spatial location from briefly presented displays. Cogn Process 2016; 18:81-85. PMID: 27465806. DOI: 10.1007/s10339-016-0775-4.
Abstract
Spatial memory and reasoning rely heavily on allocentric (often map-like) representations of spatial knowledge. While research has documented many ways in which spatial information can be represented in allocentric form, less is known about how such representations are constructed. For example: Are the very early, pre-attentive parts of the process hard-wired, or can they be altered by experience? We addressed this issue by presenting sub-saccadic (53 ms) masked stimuli consisting of a target among one to three reference features. We then shifted the location of the feature array, and asked participants to identify the target's new relative location. Experience altered feature processing even when the display duration was too short to allow attention re-allocation. The results demonstrate the importance of early perceptual processes in the creation of representations of spatial location, and the malleability of those processes based on experience and expectations.
Affiliation(s)
- Glenn Gunzelmann
- Air Force Research Laboratory, Cognitive Models and Agents Branch, Wright Patterson AFB, 2620 Q St. - Building 852, Dayton, OH, 45433, USA
- Don R Lyon
- Oak Ridge Institute for Science and Education, Wright Patterson Air Force Base, USA
11
Klein BJ, Li Z, Durgin FH. Large perceptual distortions of locomotor action space occur in ground-based coordinates: Angular expansion and the large-scale horizontal-vertical illusion. J Exp Psychol Hum Percept Perform 2015; 42:581-593. PMID: 26594884. DOI: 10.1037/xhp0000173.
Abstract
What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides to dissociate egocentric from allocentric reference frames. In Experiment 1, it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space.
Affiliation(s)
- Zhi Li
- Department of Psychology, Swarthmore College
12
Philbeck JW, Witt JK. Action-specific influences on perception and postperceptual processes: Present controversies and future directions. Psychol Bull 2015; 141:1120-1144. PMID: 26501227. PMCID: PMC4621785. DOI: 10.1037/a0039738.
Abstract
The action-specific perception account holds that people perceive the environment in terms of their ability to act in it. In this view, for example, decreased ability to climb a hill because of fatigue makes the hill visually appear to be steeper. Though influential, this account has not been universally accepted, and in fact a heated controversy has emerged. The opposing view holds that action capability has little or no influence on perception. Heretofore, the debate has been quite polarized, with efforts largely being focused on supporting one view and dismantling the other. We argue here that polarized debate can impede scientific progress and that the search for similarities between 2 sides of a debate can sharpen the theoretical focus of both sides and illuminate important avenues for future research. In this article, we present a synthetic review of this debate, drawing from the literatures of both approaches, to clarify both the surprising similarities and the core differences between them. We critically evaluate existing evidence, discuss possible mechanisms of action-specific effects, and make recommendations for future research. A primary focus of future work will involve not only the development of methods that guard against action-specific postperceptual effects but also development of concrete, well-constrained underlying mechanisms. The criteria for what constitutes acceptable control of postperceptual effects and what constitutes an appropriately specific mechanism vary between approaches, and bridging this gap is a central challenge for future research.
13
Abstract
The angular declination of a target with respect to eye level is known to be an important cue to egocentric distance when objects are viewed or can be assumed to be resting on the ground. When targets are fixated, angular declination and the direction of the gaze with respect to eye level have the same objective value. However, any situation that limits the time available to shift gaze could leave to-be-localized objects outside the fovea, and, in these cases, the objective values would differ. Nevertheless, angular declination and gaze declination are often conflated, and the role for retinal eccentricity in egocentric distance judgments is unknown. We report two experiments demonstrating that gaze declination is sufficient to support judgments of distance, even when extraretinal signals are all that are provided by the stimulus and task environment. Additional experiments showed no accuracy costs for extrafoveally viewed targets and no systematic impact of foveal or peripheral biases, although a drop in precision was observed for the most retinally eccentric targets. The results demonstrate the remarkable utility of target direction, relative to eye level, for judging distance (signaled by angular declination and/or gaze declination) and are consonant with the idea that detection of the target is sufficient to capitalize on the angular declination of floor-level targets (regardless of the direction of gaze).
14
Gajewski DA, Wallin CP, Philbeck JW. The Effects of Age and Set Size on the Fast Extraction of Egocentric Distance. Visual Cognition 2015; 23:957-988. PMID: 27398065. DOI: 10.1080/13506285.2015.1132803.
Abstract
Angular direction is a source of information about the distance to floor-level objects that can be extracted from brief glimpses (near one's threshold for detection). Age and set size are two factors known to impact the viewing time needed to directionally localize an object, and these were posited to similarly govern the extraction of distance. The question here was whether viewing durations sufficient to support object detection (controlled for age and set size) would also be sufficient to support well-constrained judgments of distance. Regardless of viewing duration, distance judgments were more accurate (less biased towards underestimation) when multiple potential targets were presented, suggesting that the relative angular declinations between the objects are an additional source of useful information. Distance judgments were more precise with additional viewing time, but the benefit did not depend on set size and accuracy did not improve with longer viewing durations. The overall pattern suggests that distance can be efficiently derived from direction for floor-level objects. Controlling for age-related differences in the viewing time needed to support detection was sufficient to support distal localization but only when brief and longer glimpse trials were interspersed. Information extracted from longer glimpse trials presumably supported performance on subsequent trials when viewing time was more limited. This outcome suggests a particularly important role for prior visual experience in distance judgments for older observers.
Affiliation(s)
- Daniel A Gajewski
- Department of Psychology, The George Washington University, Washington, DC
- Courtney P Wallin
- Department of Psychology, The George Washington University, Washington, DC
- John W Philbeck
- Department of Psychology, The George Washington University, Washington, DC
15
Abstract
Tachistoscopic presentation of scenes has been valuable for studying the emerging properties of visual scene representations. The spatial aspects of this work have generally been focused on the conceptual locations (e.g., next to the refrigerator) and directional locations of objects in 2-D arrays and/or images. Less is known about how the perceived egocentric distance of objects develops. Here we describe a novel system for presenting brief glimpses of a real-world environment, followed by a mask. The system includes projectors with mechanical shutters for projecting the fixation and masking images, a set of LED floodlights for illuminating the environment, and computer-controlled electronics to set the timing and initiate the process. Because a real environment is used, most visual distance and depth cues can be manipulated using traditional methods. The system is inexpensive, robust, and its components are readily available in the marketplace. This article describes the system and the timing characteristics of each component. We verified the system's ability to control exposure to time scales as low as a few milliseconds.
16
Gajewski DA, Wallin CP, Philbeck JW. Gaze behavior and the perception of egocentric distance. J Vis 2014; 14:20. PMID: 24453346. PMCID: PMC3900371. DOI: 10.1167/14.1.20.
Abstract
The ground plane is thought to be an important reference for localizing objects, particularly when angular declination is informative, as it is for objects seen resting at floor level. A potential role for eye movements has been implicated by the idea that information about the nearby ground is required to localize objects more distant, and by the fact that the time course for the extraction of distance extends beyond the duration of a typical eye fixation. To test this potential role, eye movements were monitored when participants previewed targets. Distance estimates were provided by walking without vision to the remembered target location (blind walking) or by verbal report. We found that a strategy of holding the gaze steady on the object was as frequent as one where the region between the observer and object was fixated. There was no performance advantage associated with making eye movements in an observational study (Experiment 1) or when an eye-movement strategy was manipulated experimentally (Experiment 2). Observers were extracting useful information covertly, however. In Experiments 3 through 5, obscuring the nearby ground plane had a modest impact on performance; obscuring the walls and ceiling was more detrimental. The results suggest that these alternate surfaces provide useful information when judging the distance to objects within indoor environments. Critically, they constrain the role for the nearby ground plane in theories of egocentric distance perception.
Affiliation(s)
- Daniel A. Gajewski
- Department of Psychology, George Washington University, Washington, DC, USA
- Courtney P. Wallin
- Department of Psychology, George Washington University, Washington, DC, USA
- John W. Philbeck
- Department of Psychology, George Washington University, Washington, DC, USA