1. Liu C, Ma S, Liu Y, Wang Y, Song W. Depth Perception in Optical See-Through Augmented Reality: Investigating the Impact of Texture Density, Luminance Contrast, and Color Contrast. IEEE Transactions on Visualization and Computer Graphics 2024; 30:7266-7276. PMID: 39255102. DOI: 10.1109/tvcg.2024.3456162.
Abstract
Immersive augmented reality (AR) systems require precise depth registration between virtual objects and the real scene. Prior studies have emphasized the efficacy of surface texture as a depth cue that enhances depth perception across various media, including real scenes, virtual reality, and AR. However, these studies predominantly focus on black-and-white textures, leaving a gap in understanding the effectiveness of colored textures. To address this gap and further explore texture-related factors in AR, a series of experiments investigated the effects of different texture cues on depth perception using the perceptual matching method. Findings indicate that absolute depth error increases as contrast decreases under black-and-white textures. Moreover, textures with higher color contrast also improve the accuracy of depth judgments in AR. No significant effect of texture density on depth perception was observed. The findings serve as a theoretical reference for texture design in AR, aiding the optimization of virtual-real registration.
2. Shayman CS, McCracken MK, Finney HC, Katsanevas AM, Fino PC, Stefanucci JK, Creem-Regehr SH. Effects of older age on visual and self-motion sensory cue integration in navigation. Exp Brain Res 2024; 242:1277-1289. PMID: 38548892. PMCID: PMC11111325. DOI: 10.1007/s00221-024-06818-7.
Abstract
Older adults demonstrate impairments in navigation that cannot be explained by general cognitive and motor declines. Previous work has shown that older adults may combine sensory cues during navigation differently than younger adults, though this work has largely been done in dark environments where sensory integration may differ from full-cue environments. Here, we test whether aging adults optimally combine cues from two sensory systems critical for navigation: vision (landmarks) and body-based self-motion cues. Participants completed a homing (triangle completion) task using immersive virtual reality to offer the ability to navigate in a well-lit environment including visibility of the ground plane. An optimal model, based on principles of maximum-likelihood estimation, predicts that precision in homing should increase with multisensory information in a manner consistent with each individual sensory cue's perceived reliability (measured by variability). We found that well-aging adults (with normal or corrected-to-normal sensory acuity and active lifestyles) were more variable and less accurate than younger adults during navigation. Both older and younger adults relied more on their visual systems than a maximum likelihood estimation model would suggest. Overall, younger adults' visual weighting matched the model's predictions whereas older adults showed sub-optimal sensory weighting. In addition, high inter-individual differences were seen in both younger and older adults. These results suggest that older adults do not optimally weight each sensory system when combined during navigation, and that older adults may benefit from interventions that help them recalibrate the combination of visual and self-motion cues for navigation.
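The optimal model referenced in this abstract is the standard maximum-likelihood (reliability-weighted) cue-combination rule, in which each cue is weighted by the inverse of its variance. A minimal sketch of that rule, not the authors' analysis code; the function name and example values are illustrative:

```python
import math

def mle_combine(sigma_visual, sigma_self_motion):
    """Reliability-weighted (maximum-likelihood) combination of two cues.

    Each cue's reliability is the inverse of its variance; the optimal
    weight on a cue is proportional to its reliability, and the combined
    estimate is at least as precise as the better single cue.
    """
    r_v = 1.0 / sigma_visual ** 2        # reliability of the visual cue
    r_s = 1.0 / sigma_self_motion ** 2   # reliability of the self-motion cue
    w_v = r_v / (r_v + r_s)              # optimal weight on vision
    sigma_combined = math.sqrt(1.0 / (r_v + r_s))
    return w_v, sigma_combined

# Example: vision twice as variable as self-motion
w_v, sigma_c = mle_combine(2.0, 1.0)
print(w_v)       # optimal weight on vision: 0.2
print(sigma_c)   # combined SD, smaller than either single-cue SD
```

Under this rule, "over-weighting vision" means observers behave as if `w_v` were larger than the value their own single-cue variabilities predict.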
Affiliation(s)
- Corey S Shayman: Department of Psychology, University of Utah, 380 S. 1500 E. Room 502, Salt Lake City, UT, 84112, USA; Interdisciplinary Program in Neuroscience, University of Utah, Salt Lake City, USA
- Maggie K McCracken: Department of Psychology, University of Utah, 380 S. 1500 E. Room 502, Salt Lake City, UT, 84112, USA
- Hunter C Finney: Department of Psychology, University of Utah, 380 S. 1500 E. Room 502, Salt Lake City, UT, 84112, USA
- Andoni M Katsanevas: Department of Psychology, University of Utah, 380 S. 1500 E. Room 502, Salt Lake City, UT, 84112, USA
- Peter C Fino: Department of Health and Kinesiology, University of Utah, Salt Lake City, USA
- Jeanine K Stefanucci: Department of Psychology, University of Utah, 380 S. 1500 E. Room 502, Salt Lake City, UT, 84112, USA
- Sarah H Creem-Regehr: Department of Psychology, University of Utah, 380 S. 1500 E. Room 502, Salt Lake City, UT, 84112, USA
3. Yamamoto N, Nightingale M. How well do we do social distancing? Q J Exp Psychol (Hove) 2024; 77:1106-1112. PMID: 37542430. PMCID: PMC11032622. DOI: 10.1177/17470218231195247.
Abstract
During the coronavirus disease 2019 (COVID-19) pandemic, many jurisdictions around the world introduced a "social distance" rule under which people were instructed to keep a certain distance from others. Generally, this rule was implemented simply by telling people how many metres or feet of separation to keep, without precise instructions as to how the specified distance could be measured. Consequently, the rule is effective only to the extent that people can gauge this distance through their space perception. To examine the effectiveness of the rule from this point of view, this study empirically investigated how much distance people leave from another person when relying on their perception of that distance. Participants (N = 153) were asked to stand exactly 1.5 m away from a researcher. Although the mean of the resulting interpersonal distances was close to the correct 1.5 m, there were large individual differences. These results suggest that some people would not stay sufficiently far from others even when they intend to practise proper social distancing. Given this outcome, it is suggested that official health advice include measures that compensate for this tendency.
Affiliation(s)
- Naohide Yamamoto: School of Psychology and Counselling, Queensland University of Technology (QUT), Brisbane, QLD, Australia; Centre for Vision and Eye Research, Queensland University of Technology (QUT), Brisbane, QLD, Australia
- Mia Nightingale: School of Psychology and Counselling, Queensland University of Technology (QUT), Brisbane, QLD, Australia
4. Liu S, Kersten DJ, Legge GE. Effect of expansive optic flow and lateral motion parallax on depth estimation with normal and artificially reduced acuity. J Vis 2023; 23:3. PMID: 37801321. PMCID: PMC10561791. DOI: 10.1167/jov.23.12.3. Open access.
Abstract
When an observer moves in space, the retinal projection of a stationary object either expands if the motion is toward the object or shifts horizontally if the motion contains a lateral component. This study examined the impact of expansive optic flow and lateral motion parallax on the accuracy of depth perception for observers with normal or artificially reduced acuity and asked whether any benefit is due to the continuous motion or to the discrete object image displacement. Stationary participants viewed a virtual room on a computer screen. They used an on-screen slider to estimate the depth of a target object relative to a reference object after seeing 2-second videos simulating five conditions: static viewing, expansive optic flow, and lateral motion parallax in either continuous motion or image displacement. Ten participants viewed the stimuli with normal acuity in Experiment 1 and 11 with three levels of artificially reduced acuity in Experiment 2. Linear regression models represented the relationship between the depth estimates of participants and the ground truth. Lateral motion parallax produced more accurate depth estimates than expansive optic flow and static viewing. Depth perception with continuous motion was more accurate than that with displacement under mild and moderate, but not severe, acuity reduction. For observers with both normal and artificially reduced acuity, lateral motion parallax was more helpful for object depth estimation than expansive optic flow, and continuous motion parallax was more helpful than object image displacement.
Affiliation(s)
- Siyun Liu: Institute of Biophysics, Chinese Academy of Sciences, Beijing, China; Department of Psychology, University of Minnesota, Minneapolis, MN, USA
- Daniel J Kersten: Department of Psychology, University of Minnesota, Minneapolis, MN, USA
- Gordon E Legge: Department of Psychology, University of Minnesota, Minneapolis, MN, USA
5. Creem-Regehr SH, Barhorst-Cates EM, Tarampi MR, Rand KM, Legge GE. How can basic research on spatial cognition enhance the visual accessibility of architecture for people with low vision? Cognitive Research: Principles and Implications 2021; 6:3. PMID: 33411062. PMCID: PMC7790979. DOI: 10.1186/s41235-020-00265-y.
Abstract
People with visual impairment often rely on their residual vision when interacting with their spatial environments. The goal of visual accessibility is to design spaces that allow for safe travel for the large and growing population of people who have uncorrectable vision loss, enabling full participation in modern society. This paper defines the functional challenges in perception and spatial cognition with restricted visual information and reviews a body of empirical work on low vision perception of spaces on both local and global navigational scales. We evaluate how the results of this work can provide insights into the complex problem that architects face in the design of visually accessible spaces.
Affiliation(s)
- Margaret R Tarampi: Department of Psychology, University of Hartford, West Hartford, CT, USA
- Kristina M Rand: Department of Psychology, University of Utah, Salt Lake City, UT, USA
- Gordon E Legge: Department of Psychology, University of Minnesota, Minneapolis, MN, USA
6. Going the distance and beyond: simulated low vision increases perception of distance traveled during locomotion. Psychological Research 2018; 83:1349-1362. PMID: 29680863. DOI: 10.1007/s00426-018-1019-2.
Abstract
In a series of experiments, we tested the hypothesis that severely degraded viewing conditions during locomotion distort the perception of distance traveled. Some research suggests that there is little-to-no systematic error in perceiving closer distances from a static viewpoint with severely degraded acuity and contrast sensitivity (which we refer to as blur). However, several related areas of research, extending across domains of perception, attention, and spatial learning, suggest that degraded acuity and contrast sensitivity would affect estimates of distance traveled during locomotion. In a first experiment, we measured estimates of distance traveled in a real-world locomotion task and found that distances were overestimated with blur compared to normal vision using two measures: verbal reports and visual matching (Experiments 1a, 1b, and 1c). In Experiment 2, participants indicated their estimate of the length of a previously traveled path by actively walking an equivalent distance in a viewing condition that either matched their initial path (e.g., blur/blur) or did not match (e.g., blur/normal). Overestimation in blur was found only when participants learned the path in blur and made estimates in normal vision (not in matched blur learning/judgment trials), further suggesting a reliance on dynamic visual information in estimates of distance traveled. In Experiment 3, we found evidence that perception of speed is similarly affected: speed experienced in wheelchair locomotion was overestimated during blur compared to normal vision. Taken together, our results demonstrate that severely degraded acuity and contrast sensitivity may increase people's tendency to overestimate distance traveled, perhaps because of an increased perception of the speed of self-motion.
7. Foreshortening produces errors in the perception of angles pictured as on the ground. Atten Percept Psychophys 2015; 78:309-316. PMID: 26537919. DOI: 10.3758/s13414-015-1012-4.
Abstract
Observers viewed pictures of a simulated ground plane and judged the orientation of lines pictured as lying on the ground. We presented three lines at a time and manipulated three factors: (1) the declination of the lines below the horizon (depicting distance to the target angles), (2) their length, and (3) whether or not they converged to a point on the horizon. Only the first factor had a substantial effect on these errors. We conclude that perspective foreshortening in pictures produces errors in perceived 3-D orientation. Our explanation is based on the different rates of change of elevation and azimuth with distance.
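The geometry behind this explanation can be illustrated with a small pinhole-projection sketch. Assuming a horizontal line of sight from eye height h above the ground and a vertical picture plane (an illustration of linear perspective, not the authors' model or stimuli), a ground line making angle alpha with the depth axis is pictured at an angle beta satisfying tan(beta) = (z/h) * tan(alpha) at distance z, so pictured angles are increasingly distorted away from the depth axis as distance grows:

```python
import math

def pictured_angle_deg(ground_angle_deg, distance, eye_height):
    """Image-plane angle (measured from the pictured depth axis) of a
    ground line that makes `ground_angle_deg` with the depth axis, seen
    from `eye_height` above the ground at `distance`. Under linear
    perspective, tan(pictured) = (distance / eye_height) * tan(ground).
    """
    t = (distance / eye_height) * math.tan(math.radians(ground_angle_deg))
    return math.degrees(math.atan(t))

# A 45-degree ground angle is pictured veridically only when the viewing
# distance equals the eye height; farther away, foreshortening pushes the
# pictured angle toward 90 degrees.
print(pictured_angle_deg(45, 1.6, 1.6))  # 45.0 (approximately)
print(pictured_angle_deg(45, 8.0, 1.6))  # ~78.7
```

This matches the abstract's main result: only the declination factor (which indexes distance) substantially affected errors, because distance alone scales the foreshortening.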
8.
Abstract
The angular declination of a target with respect to eye level is known to be an important cue to egocentric distance when objects are viewed or can be assumed to be resting on the ground. When targets are fixated, angular declination and the direction of the gaze with respect to eye level have the same objective value. However, any situation that limits the time available to shift gaze could leave to-be-localized objects outside the fovea, and, in these cases, the objective values would differ. Nevertheless, angular declination and gaze declination are often conflated, and the role for retinal eccentricity in egocentric distance judgments is unknown. We report two experiments demonstrating that gaze declination is sufficient to support judgments of distance, even when extraretinal signals are all that are provided by the stimulus and task environment. Additional experiments showed no accuracy costs for extrafoveally viewed targets and no systematic impact of foveal or peripheral biases, although a drop in precision was observed for the most retinally eccentric targets. The results demonstrate the remarkable utility of target direction, relative to eye level, for judging distance (signaled by angular declination and/or gaze declination) and are consonant with the idea that detection of the target is sufficient to capitalize on the angular declination of floor-level targets (regardless of the direction of gaze).
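The angular-declination cue described here follows a simple relation for targets resting on the ground: egocentric distance equals eye height divided by the tangent of the target's declination below eye level. A minimal sketch of that relation; the function name and example values are illustrative, not taken from the study:

```python
import math

def distance_from_declination(eye_height_m, declination_deg):
    """Egocentric distance (in metres) to a ground-level target from its
    angular declination below eye level: distance = eye height / tan(decl).
    Smaller declinations (targets nearer the horizon) mean greater distance.
    """
    return eye_height_m / math.tan(math.radians(declination_deg))

# A target 45 degrees below eye level lies one eye-height away
print(distance_from_declination(1.6, 45.0))  # 1.6 (approximately)
```

Because this mapping needs only the target's direction relative to eye level, it is consistent with the abstract's point that merely detecting a floor-level target (foveally or extrafoveally) suffices to exploit the cue.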
9. Rand KM, Creem-Regehr SH, Thompson WB. Spatial learning while navigating with severely degraded viewing: The role of attention and mobility monitoring. J Exp Psychol Hum Percept Perform 2015; 41:649-664. PMID: 25706766. DOI: 10.1037/xhp0000040.
Abstract
The ability to navigate without getting lost is an important aspect of quality of life. In 5 studies, we evaluated how spatial learning is affected by the increased demands of keeping oneself safe while walking with degraded vision (mobility monitoring). We proposed that safe low vision mobility requires attentional resources, providing competition for those needed to learn a new environment. In Experiments 1 and 2, participants navigated along paths in a real-world indoor environment with simulated degraded vision or normal vision. Memory for object locations seen along the paths was better with normal compared with degraded vision. With degraded vision, memory was better when participants were guided by an experimenter (low monitoring demands) versus unguided (high monitoring demands). In Experiments 3 and 4, participants walked while performing an auditory task. Auditory task performance was superior with normal compared with degraded vision. With degraded vision, auditory task performance was better when guided compared with unguided. In Experiment 5, participants performed both the spatial learning and auditory tasks under degraded vision. Results showed that attention mediates the relationship between mobility-monitoring demands and spatial learning. These studies suggest that more attention is required and spatial learning is impaired when navigating with degraded viewing.
10. Wnuczko M, Kennedy JM. Pointing to azimuths and elevations of targets: blind and blindfolded-sighted. Perception 2014; 43:117-128. PMID: 24919348. DOI: 10.1068/p7605.
Abstract
Three groups of observers pointed to target circles in a path on the ground, in two parallel rows. Participants in one group viewed the circles and then pointed blindfolded. Those in a second group were blindfolded and then touched the circles with a stick while walking past them. Volunteers in the third group were blind adults, a diverse group, who also used a stick to detect the circles. For all three groups, as distance to the circles increased, pointing azimuths shrank and elevations increased. We suggest that directions to targets on major environmental surfaces may be appreciated similarly by the blind and sighted. We challenge the assumption that the principle of convergence to the horizon, available through vision because of the way in which visual angle decreases on the retina, is not available through touch.
11. Confinement has no effect on visual space perception: The results of the Mars-500 experiment. Atten Percept Psychophys 2013; 76:438-451. DOI: 10.3758/s13414-013-0594-y.