1. Yamamoto N, Nightingale M. How well do we do social distancing? Q J Exp Psychol (Hove) 2024; 77:1106-1112. PMID: 37542430; PMCID: PMC11032622; DOI: 10.1177/17470218231195247.
Abstract
During the coronavirus disease 2019 (COVID-19) pandemic, many jurisdictions around the world introduced a "social distance" rule under which people are instructed to keep a certain distance from others. Generally, this rule is implemented simply by telling people how many metres or feet of separation should be kept, without giving them precise instructions as to how the specified distance can be measured. Consequently, the rule is effective only to the extent that people are able to gauge this distance through their space perception. To examine the effectiveness of the rule from this point of view, this study empirically investigated how much distance people would leave from another person when they relied on their perception of this distance. Participants (N = 153) were asked to stand exactly 1.5 m away from a researcher; the resultant interpersonal distances showed that while their mean was close to the correct 1.5 m, they exhibited large individual differences. These results suggest that many people would not stay sufficiently far from others even when they intend to practise proper social distancing. Given this outcome, it is suggested that official health advice include measures that compensate for this tendency.
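The practical upshot of this abstract — that an accurate mean still leaves many individuals too close — can be illustrated with a toy calculation. The sketch below is not from the paper; it merely assumes that produced distances are roughly normally distributed, with hypothetical spread values, to show how individual variability translates into a share of people standing inside the required 1.5 m.

```python
from statistics import NormalDist

def share_too_close(mean_m: float, sd_m: float, required_m: float = 1.5) -> float:
    """Fraction of people expected to stand closer than the required distance,
    assuming the distances people produce are roughly Normal(mean, sd)."""
    return NormalDist(mean_m, sd_m).cdf(required_m)

# With the mean exactly on target, half of all encounters still fall short;
# shifting the mean outward reduces, but does not eliminate, that share.
print(share_too_close(1.5, 0.3))   # 0.5
print(share_too_close(1.6, 0.3))
```

Under these made-up assumptions, even a population whose mean distance is a generous 1.6 m still leaves roughly a third of people inside 1.5 m, which is why the abstract's call for compensating measures follows from variability, not bias.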
Affiliation(s)
- Naohide Yamamoto
- School of Psychology and Counselling, Queensland University of Technology (QUT), Brisbane, QLD, Australia
- Centre for Vision and Eye Research, Queensland University of Technology (QUT), Brisbane, QLD, Australia
- Mia Nightingale
- School of Psychology and Counselling, Queensland University of Technology (QUT), Brisbane, QLD, Australia
2. Kopiske K, Heinrich EM, Jahn G, Bendixen A, Einhäuser W. Multisensory cues for walking in virtual reality: humans combine conflicting visual and self-motion information to reproduce distances. J Neurophysiol 2023; 130:1028-1040. PMID: 37701952; DOI: 10.1152/jn.00011.2023.
Abstract
When humans walk, it is important for them to have some measure of the distance they have traveled. Typically, many cues from different modalities are available, as humans perceive both the environment around them (for example, through vision and haptics) and their own walking. Here, we investigate the contribution of visual cues and nonvisual self-motion cues to distance reproduction when walking on a treadmill through a virtual environment, by separately manipulating the speed of the treadmill belt and of the virtual environment. Using mobile eye tracking, we also investigate how our participants sampled the visual information through gaze. We show that, as predicted, both modalities affected how participants (N = 28) reproduced a distance. Participants weighed nonvisual self-motion cues more strongly than visual cues, corresponding to the cues' respective reliabilities, but with some interindividual variability. Those who looked more toward the parts of the visual scene that contained cues to speed and distance also tended to weigh visual information more strongly, although this correlation was nonsignificant, and participants generally directed their gaze toward visually informative areas of the scene less than expected. As measured by motion capture, participants adjusted their gait patterns to the treadmill speed but not to walked distance. In sum, we show in a naturalistic virtual environment how humans use different sensory modalities when reproducing distances and how the use of these cues differs between participants and depends on information sampling.

NEW & NOTEWORTHY Combining virtual reality with treadmill walking, we measured the relative importance of visual cues and nonvisual self-motion cues for distance reproduction. Participants used both cues but put more weight on self-motion; the weight on visual cues tended to correlate with looking at visually informative areas. Participants overshot distances, especially when self-motion was slow; they adjusted their steps to self-motion cues but not to visual cues. Our work thus quantifies the multimodal contributions to distance reproduction.
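The reliability-based weighting this abstract describes is, in its textbook form, inverse-variance (maximum-likelihood) cue combination. The sketch below is that standard model with made-up variances, not the paper's actual analysis code, offered only to make the weighting concrete:

```python
def combine_cues(d_visual, var_visual, d_self, var_self):
    """Reliability-weighted (inverse-variance) average of two distance estimates."""
    r_v, r_s = 1.0 / var_visual, 1.0 / var_self   # reliability = 1 / variance
    w_v = r_v / (r_v + r_s)                       # weight on the visual cue
    d_hat = w_v * d_visual + (1.0 - w_v) * d_self
    var_hat = 1.0 / (r_v + r_s)                   # combined estimate beats either cue alone
    return d_hat, var_hat

# A visual cue four times noisier than self-motion gets a quarter of its weight:
d_hat, var_hat = combine_cues(10.0, 4.0, 8.0, 1.0)
```

With these hypothetical numbers the combined estimate lands near 8.4 m, much closer to the more reliable self-motion cue, mirroring the heavier self-motion weighting the authors report.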
Affiliation(s)
- Karl Kopiske
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Elisa-Maria Heinrich
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Georg Jahn
- Applied Geropsychology and Cognition, Faculty of Behavioural and Social Sciences, Chemnitz University of Technology, Chemnitz, Germany
- Alexandra Bendixen
- Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
- Wolfgang Einhäuser
- Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
3. Lanfranchi JB, Lemonnier S. The Estimation of Physical Distances Between Oneself and a Social Robot: Am I as Far From the Robot as It Is From Me? Eur J Psychol 2023; 19:299-307. PMID: 37731753; PMCID: PMC10508198; DOI: 10.5964/ejop.9519.
Abstract
Research on the perception of interpersonal distance has shown the existence of an asymmetry effect that depends on the reference point of the estimation: the distance from oneself to others can be perceived as longer or shorter than the distance from others to oneself. The mechanism underlying this asymmetry effect is related to an object's cognitive salience. The self often functions as a habitual reference point, and therefore one's own salience may be higher than that of other objects. In this case, an egocentric asymmetry effect appears, with a perceived shorter distance from others to oneself. However, if others are more salient than oneself, the reverse can happen (an allocentric asymmetry effect). The present work investigates whether the asymmetry in self-other(s) distance perception changes when the other is a social robot. An experiment was conducted with 174 participants who were asked to estimate the distance between themselves and both robotic and human assistants on a schematic map of a hospital emergency room (between-subjects design). Using robust ANOVA, the results showed that participants felt closer to the human assistant than to the robot, notably when the person served as the estimation reference point. Perceived distances to the social robot were not significantly distorted. While a rather allocentric effect with the human assistant might reflect an affiliation goal on the part of the participants, the absence of an effect with the social robot forces us to reconsider its humanization; it may instead reflect a purely mechanical and utilitarian conception of the robot.
4. Eudave L, Pastor MA. Cognition and driving in older adults: a complex relationship. Aging (Albany NY) 2023; 15:887-888. PMID: 36812474; PMCID: PMC10008492; DOI: 10.18632/aging.204551.
Affiliation(s)
- Luis Eudave
- Faculty of Education and Psychology, University of Navarra, Spain
- María A Pastor
- Faculty of Education and Psychology, University of Navarra, Spain
5. Soltani P, Morice AHP. A multi-scale analysis of basketball throw in virtual reality for tracking perceptual-motor expertise. Scand J Med Sci Sports 2023; 33:178-188. PMID: 36315055; PMCID: PMC10100508; DOI: 10.1111/sms.14250.
Abstract
To benefit from virtual reality (VR) as a complementary tool for training, coaches must determine the proper tools and variables for tracking sports performance. We explored basketball shooting at several scales (basket-ball, ball-player, and player systems) by monitoring success rate and ball and body kinematics. We measured how these scales of analysis allowed tracking players' expertise and perceptual sensitivity to basket distance. Experienced and novice players were instructed to naturally throw and swish an instrumented ball into a stereoscopically rendered virtual basket. We challenged their perceptual-motor systems by manipulating the distance of the virtual basket while keeping the surrounding environment unchanged. The success rate accounted for the players' shooting adjustments to the manipulation of basket distance and allowed tracking their expertise. Ball kinematics also reflected the manipulation of distance and allowed detecting gender differences, but did not reflect the players' expertise. Finally, body kinematics variables did not echo players' adjustments to the distance manipulation but reflected their expertise and gender. The results gained at each scale of analysis are discussed with regard to the simulator's construct, biomechanical, and psychological fidelity.
Affiliation(s)
- Pooya Soltani
- School of Digital, Technologies and Arts, Staffordshire University, Stoke-on-Trent, UK; Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA), Department of Computer Science, Department for Health, University of Bath, Bath, UK; Aix-Marseille University, CNRS, ISM, Marseille, France
6. Rzepka AM, Hussey KJ, Maltz MV, Babin K, Wilcox LM, Culham JC. Familiar size affects perception differently in virtual reality and the real world. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210464. PMID: 36511414; PMCID: PMC9745877; DOI: 10.1098/rstb.2021.0464.
Abstract
The promise of virtual reality (VR) as a tool for perceptual and cognitive research rests on the assumption that perception in virtual environments generalizes to the real world. Here, we conducted two experiments to compare size and distance perception between VR and physical reality (Maltz et al. 2021 J. Vis. 21, 1-18). In experiment 1, we used VR to present dice and Rubik's cubes at their typical sizes or reversed sizes at distances that maintained a constant visual angle. After viewing the stimuli binocularly (to provide vergence and disparity information) or monocularly, participants manually estimated perceived size and distance. Unlike physical reality, where participants relied less on familiar size and more on presented size during binocular versus monocular viewing, in VR participants relied heavily on familiar size regardless of the availability of binocular cues. In experiment 2, we demonstrated that the effects in VR generalized to other stimuli and to a higher quality VR headset. These results suggest that the use of binocular cues and familiar size differs substantially between virtual and physical reality. A deeper understanding of perceptual differences is necessary before assuming that research outcomes from VR will generalize to the real world. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Anna M. Rzepka
- Neuroscience Program, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Kieran J. Hussey
- Neuroscience Program, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Margaret V. Maltz
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Karsten Babin
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Laurie M. Wilcox
- Department of Psychology, York University, Toronto, ON, Canada M3J 1P3
- Jody C. Culham
- Neuroscience Program, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7; Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
7. Creem-Regehr SH, Stefanucci JK, Bodenheimer B. Perceiving distance in virtual reality: theoretical insights from contemporary technologies. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210456. PMID: 36511405; PMCID: PMC9745869; DOI: 10.1098/rstb.2021.0456.
Abstract
Decades of research have shown that absolute egocentric distance is underestimated in virtual environments (VEs) when compared with the real world. This finding has implications for the use of VEs in applications that require an accurate sense of absolute scale. Fortunately, this underperception of scale can be attenuated by several factors, making perception more similar to (but still not the same as) that of the real world. Here, we examine these factors in two categories: (i) experience inherent to the observer, and (ii) characteristics inherent to the display technology. We analyse how these factors influence the sources of information for absolute distance perception with the goal of understanding how the scale of virtual spaces is calibrated. We identify six types of cues that change with these approaches, contributing both to a theoretical understanding of depth perception in VEs and a call for future research that can benefit from changing technologies. This article is part of the theme issue 'New approaches to 3D vision'.
Affiliation(s)
- Bobby Bodenheimer
- Department of Computer Science, Vanderbilt University, Nashville, TN 37235, USA
8. Sedgwick HA. Ibn al-Haytham's ground theory of distance perception. Iperception 2022; 13:20416695221118388. PMID: 36082187; PMCID: PMC9445487; DOI: 10.1177/20416695221118388.
Abstract
The 11th-century Arab scholar, Ibn al-Haytham, in his Optics, offers a detailed, rigorous, empirically oriented explanation of distance perception that may be the first essentially modern, scientific theory of distance perception. Based on carefully described experiments, he argues that for distance to be perceived accurately: (1) the distance must lie along a continuous surface such as the ground; (2) the continuous surface must be visible; (3) the magnitudes of distances along the surface must be perceived and calibrated through bodily interaction (walking and reaching) with them; and finally (4) the distance must be moderate. Al-Haytham's work reached Europe early in the 13th century, and his was the dominant theory of distance perception there for about 400 years. It was superseded early in the 17th century by a theory, based on cues such as convergence and accommodation, of distance seen through empty, mathematized space. Around 1950, an explanation of distance perception strikingly like that of al-Haytham was independently developed by J. J. Gibson, who called his theory the “ground theory” of space perception.
Affiliation(s)
- H A Sedgwick
- State University of New York, State College of Optometry, New York, NY, USA
9. Kim JJJ, McManus ME, Harris LR. Body Orientation Affects the Perceived Size of Objects. Perception 2021; 51:25-36. PMID: 34913755; PMCID: PMC8771894; DOI: 10.1177/03010066211065673.
Abstract
Here, we investigate how body orientation relative to gravity affects the perceived size of visual targets. While in virtual reality, participants judged the size of a visual target projected at simulated distances of between 2 and 10 m and compared it to a physical reference length held in their hands while they were standing or lying prone or supine. Participants needed to make the visual size of the target 5.4% larger when supine and 10.1% larger when prone, compared to when they were upright, to perceive that it matched the physical reference length. Needing to make the target larger when lying down than when standing suggests several possibilities that are not mutually exclusive: while tilted, participants may have perceived the targets as smaller than when upright; they may have perceived the targets as being closer; or they may have perceived the physical reference length as longer. Misperceiving objects as larger and/or closer when lying down may provide a survival benefit while in such a vulnerable position.
Affiliation(s)
- John J-J Kim
- Centre for Vision Research, York University, Canada
10. Forsthofer M, Schutte M, Luksch H, Kohl T, Wiegrebe L, Chagnaud BP. Frequency modulation of rattlesnake acoustic display affects acoustic distance perception in humans. Curr Biol 2021; 31:4367-4372.e4. PMID: 34416177; DOI: 10.1016/j.cub.2021.07.018.
Abstract
The estimation of one's distance to a potential threat is essential for any animal's survival. Rattlesnakes inform about their presence by generating acoustic broadband rattling sounds.1 Rattlesnakes generate their acoustic signals by clashing a series of keratinous segments onto each other, which are located at the tip of their tails.1-3 Each tail shake results in a broadband sound pulse that merges into a continuous acoustic signal with fast-repeating tail shakes. This acoustic display is readily recognized by other animals4,5 and serves as an aposematic threat and warning display, likely to avoid being preyed upon.1,6 The spectral properties of the rattling sound1,3 and its dependence on the morphology and size of the rattle have been investigated for decades7-9 and carry relevant information for different receivers, including ground squirrels that encounter rattlesnakes regularly.10,11 Combining visual looming stimuli with acoustic measurements, we show that rattlesnakes increase their rattling rate (up to about 40 Hz) with decreasing distance of a potential threat, reminiscent of the acoustic signals of sensors while parking a car. Rattlesnakes then abruptly switch to a higher and less variable rate of 60-100 Hz. In a virtual reality experiment, we show that this behavior systematically affects distance judgments by humans: the abrupt switch in rattling rate generates a sudden, strong percept of decreased distance which, together with the low-frequency rattling, acts as a remarkable interspecies communication signal.
Affiliation(s)
- Michael Forsthofer
- Department Biology II, Ludwig-Maximilians-University Munich, Großhaderner Str. 2, Planegg 82152, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-University Munich, Großhaderner Str. 2, Planegg 82152, Germany
- Michael Schutte
- Department Biology II, Ludwig-Maximilians-University Munich, Großhaderner Str. 2, Planegg 82152, Germany; Graduate School of Systemic Neurosciences, Ludwig-Maximilians-University Munich, Großhaderner Str. 2, Planegg 82152, Germany
- Harald Luksch
- Chair of Zoology, Technical University of Munich, School of Life Sciences, Liesel-Beckmann-Str. 4, Freising 85354, Germany
- Tobias Kohl
- Chair of Zoology, Technical University of Munich, School of Life Sciences, Liesel-Beckmann-Str. 4, Freising 85354, Germany
- Lutz Wiegrebe
- Department Biology II, Ludwig-Maximilians-University Munich, Großhaderner Str. 2, Planegg 82152, Germany
- Boris P Chagnaud
- Department Biology II, Ludwig-Maximilians-University Munich, Großhaderner Str. 2, Planegg 82152, Germany; Institute for Biology, Karl-Franzens-University Graz, Universitätsplatz 2, Graz 8010, Austria
11. Feldstein IT, Kölsch FM, Konrad R. Egocentric Distance Perception: A Comparative Study Investigating Differences Between Real and Virtual Environments. Perception 2020; 49:940-967. PMID: 33002392; DOI: 10.1177/0301006620951997.
Abstract
Virtual reality systems are a popular tool in behavioral sciences. The participants' behavior is, however, a response to cognitively processed stimuli. Consequently, researchers must ensure that virtually perceived stimuli resemble those present in the real world to ensure the ecological validity of collected findings. Our article provides a literature review relating to distance perception in virtual reality. Furthermore, we present a new study that compares verbal distance estimates within real and virtual environments. The virtual space, a replica of a real outdoor area, was displayed using a state-of-the-art head-mounted display. Investigated distances ranged from 8 to 13 m. Overall, the results show no significant difference between egocentric distance estimates in real and virtual environments. However, a more in-depth analysis suggests that the order in which participants were exposed to the two environments may affect the outcome. Furthermore, the study suggests that a rising experience of immersion leads to an alignment of the estimated virtual distances with the real ones. The results also show that the discrepancy between estimates of real and virtual distances increases with the incongruity between virtual and actual eye heights, demonstrating the importance of an accurately set virtual eye height.
Affiliation(s)
- Ilja T Feldstein
- Harvard Medical School, Department of Ophthalmology, United States
- Felix M Kölsch
- Technical University of Munich, Department of Mechanical Engineering, Germany
- Robert Konrad
- Stanford University, Department of Electrical Engineering, United States
12. Hahnel-Peeters RK, Idoine JL, Jackson RE, Goetz AT. Is the Vertical-Horizontal Illusion a Byproduct of the Environmental Vertical Illusion? Evol Psychol 2020; 18:1474704920961953. PMID: 33161781; PMCID: PMC10303484; DOI: 10.1177/1474704920961953.
Abstract
The vertical-horizontal illusion is the overestimation of a vertical line compared to a horizontal line of the same length. Jackson and Cormack (2007) proposed that the vertical-horizontal illusion might be a byproduct of the mechanisms that generate the environmental vertical illusion, the tendency to overestimate vertical distances (i.e., heights) relative to horizontal distances of the same length. In our study, 326 undergraduate participants stood atop an 18.6-meter parking structure and estimated both the height of the structure and the horizontal distance of a target placed 18.6 meters away, using a moveable horizontal target across the length of the structure. Participants also completed a vertical-horizontal illusion task by drawing a horizontal line below a 9.1 cm vertical line. We correlated vertical distance estimates with vertical line estimates to test Jackson and Cormack's byproduct hypothesis. This hypothesis was very weakly, if at all, supported by the data: participants' overestimations in the vertical-horizontal illusion task explained 1% of the variance associated with their overestimations in the environmental vertical illusion task. Additionally, to test whether the environmental vertical illusion is impervious to explicit awareness, a random half of our participants were advised to be mindful that people tend to overestimate heights. The results supported our second hypothesis: even when participants were made aware of the environmental vertical illusion, they still reliably overestimated heights. The discussion addresses implications for the robustness of the environmental vertical illusion (e.g., the treatment of those with acrophobia).
Affiliation(s)
- Jessica L. Idoine
- Chapman University, Orange, CA, USA
- These authors share first authorship
13.
Abstract
The ability of 32 younger (aged 19 to 32 years) and older (aged 65 to 83 years) adults to visually perceive outdoor distances was evaluated using the method of equal-appearing intervals. On any given trial, the observers adjusted five distance intervals in depth so that they all appeared equivalent in magnitude (and equal to a standard initial egocentric distance of 6 m). The judgments of approximately two thirds of the younger and older observers exhibited varying degrees of perceptual compression, while those of the remaining one third were essentially accurate. Unlike a number of previous studies that evaluated the perception of shorter distances, no significant effects of age were obtained in the current experiment. In particular, there were no significant effects of age upon either accuracy or precision. The ability of human observers to evaluate large-scale distances outdoors is thus well maintained with increasing age.
Affiliation(s)
- J Farley Norman
- Ogden College of Science and Engineering, Western Kentucky University, United States; Center for Applied Science in Health and Aging, Western Kentucky University, United States
- Jessica M Dukes
- Ogden College of Science and Engineering, Western Kentucky University, United States
- Hannah K Shapiro
- Ogden College of Science and Engineering, Western Kentucky University, United States
- Ashley E Peterson
- Ogden College of Science and Engineering, Western Kentucky University, United States
14. Molto L, Nalborczyk L, Palluel-Germain R, Morgado N. Action Effects on Visual Perception of Distances: A Multilevel Bayesian Meta-Analysis. Psychol Sci 2020; 31:488-504. PMID: 32271656; DOI: 10.1177/0956797619900336.
Abstract
Previous studies have suggested that action constraints influence visual perception of distances. For instance, the greater the effort to cover a distance, the longer people perceive this distance to be. The present multilevel Bayesian meta-analysis (37 studies with 1,035 total participants) supported the existence of a small action-constraint effect on distance estimation, Hedges's g = 0.29, 95% credible interval = [0.16, 0.47]. This effect varied slightly according to the action-constraint category (effort, weight, tool use) but not according to participants' motor intention. Some authors have argued that such effects reflect experimental demand biases rather than genuine perceptual effects. Our meta-analysis did not allow us to dismiss this possibility, but it also did not support it. We provide field-specific conventions for interpreting action-constraint effect sizes and the minimum sample sizes required to detect them with various levels of power. We encourage researchers to help us update this meta-analysis by directly uploading their published or unpublished data to our online repository (https://osf.io/bc3wn/).
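For readers unfamiliar with the effect-size metric reported in this abstract, Hedges's g is Cohen's d with a small-sample bias correction. A minimal sketch of the standard formula follows; this is not the meta-analytic model itself, and the group summary statistics are hypothetical:

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) with Hedges's small-sample correction."""
    # pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    correction = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # approximate bias-correction factor
    return correction * d

# Hypothetical constrained vs. unconstrained distance estimates, in metres:
g = hedges_g(10.5, 1.0, 20, 10.0, 1.0, 20)
print(round(g, 2))   # 0.49
```

The correction factor shrinks d slightly (more so for small samples), which is why reported g values such as the 0.29 above are marginally smaller than the corresponding uncorrected d.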
Affiliation(s)
- Lisa Molto
- Laboratoire de Psychologie et NeuroCognition (LPNC), Centre National de la Recherche Scientifique (CNRS), Université Grenoble Alpes
- Ladislas Nalborczyk
- Laboratoire de Psychologie et NeuroCognition (LPNC), Centre National de la Recherche Scientifique (CNRS), Université Grenoble Alpes; Department of Experimental Clinical and Health Psychology, Ghent University
- Richard Palluel-Germain
- Laboratoire de Psychologie et NeuroCognition (LPNC), Centre National de la Recherche Scientifique (CNRS), Université Grenoble Alpes
- Nicolas Morgado
- Laboratoire sur les Interactions Cognition-Action-Émotion (LICAÉ), Université Paris Nanterre
15. Kelly SA. Blind-Walking Behavior in the Dark Affected by Previewing the Testing Space. Perception 2019; 48:1058-1078. PMID: 31554477; DOI: 10.1177/0301006619876446.
Abstract
Visual environments affect egocentric distance perception under full-cue conditions. In this study, the effect of three spatial layouts was tested on the perceived location of a self-illuminated single target viewed in the dark. Blind-walking (BW) estimates of target distance were foreshortened in all testing spaces, as expected, but significantly more so in the shortest of the three testing rooms. Additional experiments revealed that neither changes in the perceived angle of declination nor perceived eye height were responsible for this effect. The possibility that subjects made cognitive adjustments to BW behavior to reduce physical risk was assessed by remeasuring target locations in the three rooms with magnitude estimation and by comparing the BW results of subjects who had no preview of the testing space with those of subjects who had. The results support the conclusion that the effect of spatial layout is likely due to cognitive adjustments to BW behavior. The results also indicate that the perceived angle of declination is always overestimated by at least a factor of 1.5. These results can be interpreted within the context of a theory of space perception called the angular expansion theory (AET).
Affiliation(s)
- Susan A Kelly
- Department of Vision Sciences, Illinois College of Optometry, Chicago, IL, USA
16
Aseeri S, Paraiso K, Interrante V. Investigating the Influence of Virtual Human Entourage Elements on Distance Judgments in Virtual Architectural Interiors. Front Robot AI 2019; 6:44. [PMID: 33501060 PMCID: PMC7805819 DOI: 10.3389/frobt.2019.00044] [Received: 06/08/2018] [Accepted: 05/29/2019] [Indexed: 12/02/2022]
Abstract
Architectural design drawings commonly include entourage elements: accessory objects, such as people, plants, furniture, etc., that can help to provide a sense of the scale of the depicted structure and “bring the drawings to life” by illustrating typical usage scenarios. In this paper, we describe two experiments that explore the extent to which adding a photo-realistic, three-dimensional model of a familiar person as an entourage element in a virtual architectural model might help to address the classical problem of distance underestimation in these environments. In our first experiment, we found no significant differences in participants' distance perception accuracy in a semi-realistic virtual hallway model in the presence of a static or animated figure of a familiar virtual human, compared to their perception of distances in a hallway model in which no virtual human appeared. In our second experiment, we found no significant differences in distance estimation accuracy when a moderately larger-than-life or smaller-than-life virtual human entourage model was present than when a right-sized virtual human model was used. The results of these two experiments suggest that virtual human entourage has limited potential to influence people's sense of the scale of an indoor space, and that simply adding entourage, even including an exact-scale model of a familiar person, will not, on its own, directly evoke more accurate egocentric distance judgments in VR.
Affiliation(s)
- Sahar Aseeri
- Department of Computer Science and Engineering, University of Minnesota Twin Cities, Minneapolis, MN, United States
- Karla Paraiso
- Department of Computer Science and Engineering, Arizona State University, Tempe, AZ, United States
- Victoria Interrante
- Department of Computer Science and Engineering, University of Minnesota Twin Cities, Minneapolis, MN, United States
17
Abstract
Detection of the state of self-motion, such as the instantaneous heading direction, the traveled trajectory, and traveled distance or time, is critical for efficient spatial navigation. Numerous psychophysical studies have indicated that the vestibular system, originating from the otolith and semicircular canals in our inner ears, provides robust signals for different aspects of self-motion perception. In addition, vestibular signals interact with other sensory signals such as visual optic flow to facilitate natural navigation. These behavioral results are consistent with recent findings in neurophysiological studies. In particular, vestibular activity in response to the translation or rotation of the head/body in darkness is revealed in a growing number of cortical regions, many of which are also sensitive to visual motion stimuli. The temporal dynamics of the vestibular activity in the central nervous system can vary widely, ranging from acceleration-dominant to velocity-dominant. Different temporal dynamic signals may be decoded by higher level areas for different functions. For example, the acceleration signals during translation of the body in the horizontal plane may be used by the brain to estimate heading directions. Although translation and rotation signals arise from independent peripheral organs, that is, the otolith and canals, respectively, they frequently converge onto single neurons in the central nervous system, including both the brainstem and the cerebral cortex. The convergent neurons typically exhibit stronger responses during a combined curved motion trajectory, which may serve as the neural correlate for complex path perception. During spatial navigation, traveled distance or time may be encoded by different populations of neurons in multiple regions, including the hippocampal-entorhinal system, posterior parietal cortex, and frontal cortex.
Affiliation(s)
- Zhixian Cheng
- Department of Neuroscience, Yale School of Medicine, New Haven, CT, United States
- Yong Gu
- Key Laboratory of Primate Neurobiology, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Neuroscience, Chinese Academy of Sciences, Shanghai, China
18
Abstract
In the current study of haptic distance perception, 20 younger (median age: 22 years) and 20 older adults (median age: 72 years) used active touch to estimate distance ratios (one length relative to another). Nine tactile stimuli were created from wooden dowels; each consisted of two perpendicular dowels. The stimulus distance ratios ranged from 1.0 to 5.0. Each participant used both hands (without vision) to actively explore (for 30 s) a single stimulus object on every trial. The task was to numerically estimate the distance ratio. Overall, the participants' judgments were precise; the overall magnitude of the Pearson r correlation coefficient was 0.943 and did not differ for younger and older adults. While the participants' judgments were precise, they were not completely accurate: the average slope (of the relationship between actual and judged distance ratios) across all participants was 1.15, significantly greater than 1.0. Surprisingly, differences in manual dexterity had no apparent effect on distance ratio estimates. Older adults apparently retain an excellent ability to perceive distances using their sense of touch. Our results also demonstrate that the geometry of haptic space (at the scale of the hand) is approximately Euclidean in nature (and certainly not merely topological, projective, or affine).
Affiliation(s)
- J Farley Norman
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, Kentucky, USA
- Sydney P Wheeler
- Carol Martin Gatton Academy of Mathematics and Science, Bowling Green, KY, USA
- Lauren E Pedersen
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, KY, USA
- Catherine J Dowell
- Department of Psychological Sciences, Ogden College of Science and Engineering, Western Kentucky University, Bowling Green, KY, USA
19
Keezing U, Durgin FH. Do Explicit Estimates of Angular Declination Become Ungrounded in the Presence of a Ground Plane? Iperception 2018; 9:2041669518808536. [PMID: 30397429 PMCID: PMC6207978 DOI: 10.1177/2041669518808536] [Received: 07/05/2018] [Accepted: 09/17/2018] [Indexed: 11/23/2022]
Abstract
In a series of seven experiments (total N = 220), it is shown that explicit angular declination judgments are influenced by the presence of a ground plane in the background. This is of theoretical importance because it bears on the interpretation of the relationship between angular declination and perceived distance on a ground plane. Explicit estimates of ground distance are consistent with a simple 1.5 gain in the underlying perceived angular declination function. The experiments show that, in general, functions of estimates of perceived angular declination have a slope of 1.5, but that an additional intercept can often be observed as a result of incorporating changes in ground distance into reports of changes in angular declination. By varying the background context, a variety of functions were observed that are consistent with this contamination hypothesis.
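The 1.5 gain reported here has a simple geometric consequence for ground-plane distance judgments. The sketch below is illustrative only (the function name, eye height, and test distances are assumptions for the example, not values from the paper): exaggerating the angle of declination by a factor of 1.5 before triangulating from eye height predicts systematic distance underestimation.

```python
import math

def aet_perceived_distance(actual_distance_m, eye_height_m=1.6, gain=1.5):
    """Predicted perceived egocentric distance on a flat ground plane if the
    angular declination below the horizon is perceptually exaggerated by a
    constant gain (the ~1.5 factor discussed in these studies)."""
    declination = math.atan2(eye_height_m, actual_distance_m)  # true angle below horizon
    perceived_declination = gain * declination                 # exaggerated angle
    return eye_height_m / math.tan(perceived_declination)      # triangulate from eye height

# Absolute underestimation grows with distance under this model:
for d in (2.0, 5.0, 10.0):
    print(d, round(aet_perceived_distance(d), 2))
```

In the small-angle range this works out to roughly two-thirds of the true distance, broadly consistent with the compressed ground-distance estimates that the angular expansion theory is meant to explain.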
Affiliation(s)
- Umi Keezing
- Department of Psychology, Swarthmore College, PA, USA
20
Abstract
For too long, the size distance invariance hypothesis (SDIH) has been the prevalent explanation for size perception. Despite inconclusive evidence, the SDIH has endured, primarily due to lack of suitable information sources for size perception. Because it was derived using the geometry of monocular viewing, another issue is whether the SDIH can encompass binocular vision. A possible alternative to SDIH now exists. The binocular source of size information proposed by Kim (2017) provides metric information about an object's size. Comprised of four angular measures and the interpupillary distance (IPD), with the explicit exclusion of egocentric distance information, Kim's binocular variable demands independence of perceived size and perceived distance, whereas the SDIH assumes interdependence of the two percepts. The validity of Kim's proposed information source was tested in three experiments in which participants viewed a virtual object stereoscopically then judged its size and distance. In Experiments 1 and 2, participants' size judgments were more accurate and less biased than their distance judgments, a finding further reinforced by the results of partial correlation analyses, demonstrating that perceived (stereoscopic) size and distance are independent, rather than interdependent as the SDIH assumes. Experiment 3 manipulated participants' IPDs, one component of Kim's proposed variable. Size and distance judgments were overestimated under a diminished IPD, but underestimated under an enlarged IPD, a result consistent with predictions based on participants' utilization of the proposed information source. Results provide unequivocal evidence against the SDIH as an account of size perception and corroborate the utility of Kim's proposed variable as a viable alternative for the binocular visual system.
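For readers less familiar with the hypothesis under attack, the SDIH couples the two percepts through the visual angle: perceived size S′ is proportional to perceived distance D′, S′ = D′ · tan(θ). A minimal sketch (function name and numbers are illustrative, not from the study):

```python
import math

def sdih_perceived_size(perceived_distance_m, visual_angle_deg):
    """Size-distance invariance hypothesis: for a fixed visual angle theta,
    perceived size scales linearly with perceived distance, S' = D' * tan(theta)."""
    return perceived_distance_m * math.tan(math.radians(visual_angle_deg))

# Under the SDIH, halving perceived distance halves perceived size:
s_far = sdih_perceived_size(4.0, 5.0)   # object seen at 4 m under a 5-degree angle
s_near = sdih_perceived_size(2.0, 5.0)  # same visual angle, perceived at 2 m
```

Kim's binocular variable, by contrast, is claimed to specify size from four angular measures plus the IPD with no distance term at all, which is why the observed independence of size and distance judgments counts as evidence against the SDIH.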
Affiliation(s)
- Nam-Gyoon Kim
- Department of Psychology, Keimyung University, Daegu, South Korea
21
Lundbeck M, Hartog L, Grimm G, Hohmann V, Bramsløw L, Neher T. Influence of Multi-microphone Signal Enhancement Algorithms on the Acoustics and Detectability of Angular and Radial Source Movements. Trends Hear 2018; 22:2331216518779719. [PMID: 29900799 PMCID: PMC6024528 DOI: 10.1177/2331216518779719] [Indexed: 11/26/2022]
Abstract
Hearing-impaired listeners are known to have difficulties not only with understanding speech in noise but also with judging source distance and movement, and these deficits are related to perceived handicap. It is possible that the perception of spatially dynamic sounds can be improved with hearing aids (HAs), but so far this has not been investigated. In a previous study, older hearing-impaired listeners showed poorer detectability for virtual left-right (angular) and near-far (radial) source movements due to lateral interfering sounds and reverberation, respectively. In the current study, potential ways of improving these deficits with HAs were explored. Using stimuli very similar to before, detailed acoustic analyses were carried out to examine the influence of different HA algorithms for suppressing noise and reverberation on the acoustic cues previously shown to be associated with source movement detectability. For an algorithm that combined unilateral directional microphones with binaural coherence-based noise reduction and for a bilateral beamformer with binaural cue preservation, movement-induced changes in spectral coloration, signal-to-noise ratio, and direct-to-reverberant energy ratio were greater compared with no HA processing. To evaluate these two algorithms perceptually, aided measurements of angular and radial source movement detectability were performed with 20 older hearing-impaired listeners. The analyses showed that, in the presence of concurrent interfering sounds and reverberation, the bilateral beamformer could restore source movement detectability in both spatial dimensions, whereas the other algorithm only improved detectability in the near-far dimension. Together, these results provide a basis for improving the detectability of spatially dynamic sounds with HAs.
Affiliation(s)
- Micha Lundbeck
- 1 Medizinische Physik and Cluster of Excellence "Hearing4all", Oldenburg University, Germany.,2 HörTech gGmbH, Oldenburg, Germany
| | - Laura Hartog
- 1 Medizinische Physik and Cluster of Excellence "Hearing4all", Oldenburg University, Germany.,2 HörTech gGmbH, Oldenburg, Germany
| | - Giso Grimm
- 1 Medizinische Physik and Cluster of Excellence "Hearing4all", Oldenburg University, Germany.,2 HörTech gGmbH, Oldenburg, Germany
| | - Volker Hohmann
- 1 Medizinische Physik and Cluster of Excellence "Hearing4all", Oldenburg University, Germany.,2 HörTech gGmbH, Oldenburg, Germany
| | - Lars Bramsløw
- 3 Eriksholm Research Centre, Oticon A/S, Snekkersten, Denmark
| | - Tobias Neher
- 1 Medizinische Physik and Cluster of Excellence "Hearing4all", Oldenburg University, Germany.,4 Institute of Clinical Research, University of Southern Denmark, Odense, Denmark
| |
Collapse
|
22
|
Abstract
BACKGROUND/OBJECTIVE: Patients with schizophrenia not only have psychiatric symptoms but also movement problems, which might also be associated with their reduced quality of life. Little is known about how to improve movement performance for these patients. Manipulating object size and distance is common in occupational therapy practice to evaluate and optimize reaching performance in patients with physical disabilities, but the effects of such manipulation in patients with schizophrenia remain unclear. The purpose of this study was to examine whether object size and distance could change reaching kinematics in patients with mild schizophrenia. METHODS: Twenty-nine patients with mild schizophrenia and 15 age- and gender-matched healthy controls were required to reach, as quickly as possible, for a small or large object that was placed at a near or far distance. We measured movement time, peak velocity, path length ratio, percentage of time to peak velocity, and movement units to infer movement speed, forcefulness, spatial efficiency (directness), control strategies, and smoothness. RESULTS: Patients' reaching movements were slower (p = .017) and less direct (p = .007) than those of controls. A larger object induced faster (p = .016), more preprogrammed (p = .018), and more forceful (p = .010) movements in patients. A farther object induced slower, more feedback-dependent, but more forceful and more direct movements (all p < .001). CONCLUSION: The observed kinematic deficiencies suggest a need for movement training in patients with mild schizophrenia. Occupational therapists may grade or adapt reaching activities by changing object size and distance to enhance movement performance in patients with schizophrenia.
23
Abstract
A crucial step in forming spatial representations of the environment involves the estimation of relative distance. Active sampling through specific movements is considered essential for optimizing the sensory flow that enables the extraction of distance cues. However, in electric sensing, direct evidence for the generation and exploitation of sensory flow is lacking. Weakly electric fish rely on a self-generated electric field to navigate and capture prey in the dark. This electric sense provides a blurred representation of the environment, making the exquisite sensory abilities of electric fish enigmatic. Stereotyped back-and-forth swimming patterns reminiscent of visual peering movements are suggestive of the active generation of sensory flow, but how motion contributes to the disambiguation of the electrosensory world remains unclear. Here, we show that a dipole-like electric field geometry coupled to motion provides the physical basis for a nonvisual parallax. We then show in a behavioral assay that this cue is used for electrosensory distance perception across phylogenetically distant taxa of weakly electric fish. Notably, these species electrically sample the environment in temporally distinct ways (using discrete pulses or quasisinusoidal waves), suggesting a ubiquitous role for parallax in electric sensing. Our results demonstrate that electrosensory information is extracted from sensory flow and used in a behaviorally relevant context. A better understanding of motion-based electric sensing will provide insight into the sensorimotor coordination required for active sensing in general and may lead to improved electric field-based imaging applications in a variety of contexts.
24
Spiousas I, Etchemendy PE, Eguia MC, Calcagno ER, Abregú E, Vergara RO. Sound Spectrum Influences Auditory Distance Perception of Sound Sources Located in a Room Environment. Front Psychol 2017; 8:969. [PMID: 28690556 PMCID: PMC5479918 DOI: 10.3389/fpsyg.2017.00969] [Received: 12/30/2016] [Accepted: 05/26/2017] [Indexed: 12/03/2022]
Abstract
Previous studies on the effect of spectral content on auditory distance perception (ADP) focused on the physically measurable cues occurring either in the near field (low-pass filtering due to head diffraction) or when the sound travels distances >15 m (high-frequency energy losses due to air absorption). Here, we study how the spectrum of a sound arriving from a source located in a reverberant room at intermediate distances (1–6 m) influences the perception of the distance to the source. First, we conducted an ADP experiment using pure tones (the simplest possible spectrum) of frequencies 0.5, 1, 2, and 4 kHz. Then, we performed a second ADP experiment with stimuli consisting of continuous broadband and bandpass-filtered (with center frequencies of 0.5, 1.5, and 4 kHz and bandwidths of 1/12, 1/3, and 1.5 octave) pink-noise clips. Our results showed an effect of the stimulus frequency on the perceived distance both for pure tones and filtered noise bands: ADP was less accurate for stimuli containing energy only in the low-frequency range. Analysis of the frequency response of the room showed that the low accuracy observed for low-frequency stimuli can be explained by the presence of sparse modal resonances in the low-frequency region of the spectrum, which induced a non-monotonic relationship between binaural intensity and source distance. The results obtained in the second experiment suggest that ADP can also be affected by stimulus bandwidth but in a less straightforward way (i.e., depending on the center frequency, increasing stimulus bandwidth could have different effects). Finally, the analysis of the acoustical cues suggests that listeners judged source distance using mainly changes in the overall intensity of the auditory stimulus with distance rather than the direct-to-reverberant energy ratio, even for low-frequency noise bands (which typically induce a high amount of reverberation). The results obtained in this study show that, depending on the spectrum of the auditory stimulus, reverberation can degrade ADP rather than improve it.
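The overall-intensity cue referred to above follows the inverse-square law in the free field: each doubling of source distance lowers the received level by about 6 dB. A rough sketch of that baseline relationship (illustrative only; as the study shows, sparse room modes can make received intensity non-monotonic with distance at low frequencies, degrading this cue):

```python
import math

def level_change_db(distance_m, reference_m=1.0):
    """Free-field (inverse-square) sound level change relative to a reference
    distance: each doubling of distance costs about 6 dB."""
    return -20.0 * math.log10(distance_m / reference_m)

# Monotonic level-vs-distance relationship in anechoic conditions:
print(level_change_db(2.0))  # about -6.02 dB per doubling
```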
Affiliation(s)
- Ignacio Spiousas
- Laboratorio de Dinámica Sensomotora, Departamento de Ciencia y Tecnología, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
- Pablo E Etchemendy
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
- Manuel C Eguia
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
- Esteban R Calcagno
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
- Ezequiel Abregú
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
- Ramiro O Vergara
- Laboratorio de Acústica y Percepción Sonora, Escuela Universitaria de Artes, CONICET, Universidad Nacional de Quilmes, Bernal, Argentina
25
van Lent LG, Sungur H, Kunneman FA, van de Velde B, Das E. Too Far to Care? Measuring Public Attention and Fear for Ebola Using Twitter. J Med Internet Res 2017; 19:e193. [PMID: 28611015 PMCID: PMC5487741 DOI: 10.2196/jmir.7219] [Received: 12/22/2016] [Revised: 03/09/2017] [Accepted: 03/30/2017] [Indexed: 01/19/2023]
Abstract
Background: In 2014, the world was startled by a sudden outbreak of Ebola. Although Ebola infections and deaths occurred almost exclusively in Guinea, Sierra Leone, and Liberia, a few potential Western cases in particular caused a great stir among the public in Western countries. Objective: This study builds on construal level theory to examine the relationship between psychological distance to an epidemic and public attention and sentiment expressed on Twitter. Whereas previous research has shown the potential of social media to assess real-time public opinion and sentiment, generalizable insights that further theory development are lacking. Methods: Epidemiological data (number of Ebola infections and fatalities) and media data (tweet volume and key events reported in the media) were collected for the 2014 Ebola outbreak, and Twitter content from the Netherlands was coded for (1) expressions of fear for self or fear for others and (2) psychological distance of the outbreak to the tweet source. Longitudinal relations were compared using vector error correction model (VECM) methodology. Results: Analyses based on 4500 tweets revealed that increases in public attention to Ebola co-occurred with severe world events related to the epidemic, but not all severe events evoked fear. As hypothesized, Web-based public attention and expressions of fear responded mainly to the psychological distance of the epidemic. A chi-square test showed a significant positive relation between proximity and fear: χ2(2) = 103.2 (P<.001). Public attention and fear for self in the Netherlands peaked when Ebola became spatially closer by crossing the Mediterranean Sea and Atlantic Ocean. Fear for others was mostly predicted by the social distance to the affected parties. Conclusions: Spatial and social distance are important predictors of public attention to worldwide crises such as epidemics. These factors need to be taken into account when communicating about human tragedies.
Affiliation(s)
- Liza Gg van Lent
- Centre for Language Studies, Radboud University, Nijmegen, Netherlands
- Hande Sungur
- Communication Sciences, University of Amsterdam, Amsterdam, Netherlands
- Bob van de Velde
- Communication Sciences, University of Amsterdam, Amsterdam, Netherlands
- Enny Das
- Centre for Language Studies, Radboud University, Nijmegen, Netherlands
26
Lundbeck M, Grimm G, Hohmann V, Laugesen S, Neher T. Sensitivity to Angular and Radial Source Movements as a Function of Acoustic Complexity in Normal and Impaired Hearing. Trends Hear 2017; 21:2331216517717152. [PMID: 28675088 PMCID: PMC5548306 DOI: 10.1177/2331216517717152] [Received: 10/13/2016] [Revised: 05/16/2017] [Accepted: 05/23/2017] [Indexed: 11/15/2022]
Abstract
In contrast to static sounds, spatially dynamic sounds have received little attention in psychoacoustic research so far. This holds true especially for acoustically complex (reverberant, multisource) conditions and impaired hearing. The current study therefore investigated the influence of reverberation and the number of concurrent sound sources on source movement detection in young normal-hearing (YNH) and elderly hearing-impaired (EHI) listeners. A listening environment based on natural environmental sounds was simulated using virtual acoustics and rendered over headphones. Both near-far ('radial') and left-right ('angular') movements of a frontal target source were considered. The acoustic complexity was varied by adding static lateral distractor sound sources as well as reverberation. Acoustic analyses confirmed the expected changes in stimulus features that are thought to underlie radial and angular source movements under anechoic conditions and suggested a special role of monaural spectral changes under reverberant conditions. Analyses of the detection thresholds showed that, with the exception of the single-source scenarios, the EHI group was less sensitive to source movements than the YNH group, despite adequate stimulus audibility. Adding static sound sources clearly impaired the detectability of angular source movements for the EHI (but not the YNH) group. Reverberation, on the other hand, clearly impaired radial source movement detection for the EHI (but not the YNH) listeners. These results illustrate the feasibility of studying factors related to auditory movement perception with the help of the developed test setup.
Affiliation(s)
- Micha Lundbeck
- Medizinische Physik and Cluster of Excellence ‘Hearing4all’, Department of Medical Physics and Acoustics, Oldenburg University, Germany
- HörTech gGmbH, Oldenburg, Germany
- Giso Grimm
- Medizinische Physik and Cluster of Excellence ‘Hearing4all’, Department of Medical Physics and Acoustics, Oldenburg University, Germany
- HörTech gGmbH, Oldenburg, Germany
- Volker Hohmann
- Medizinische Physik and Cluster of Excellence ‘Hearing4all’, Department of Medical Physics and Acoustics, Oldenburg University, Germany
- HörTech gGmbH, Oldenburg, Germany
- Tobias Neher
- Medizinische Physik and Cluster of Excellence ‘Hearing4all’, Department of Medical Physics and Acoustics, Oldenburg University, Germany
27
Vienne C, Plantier J, Neveu P, Priot AE. The Role of Vertical Disparity in Distance and Depth Perception as Revealed by Different Stereo-Camera Configurations. Iperception 2016; 7:2041669516681308. [PMID: 27994843 PMCID: PMC5154397 DOI: 10.1177/2041669516681308] [Indexed: 11/20/2022]
Abstract
Vertical binocular disparity is a source of distance information allowing the portrayal of the layout and 3D metrics of the visual space. The role of vertical disparity in the perception of depth, size, curvature, or slant of surfaces was revealed in several previous studies using cue conflict paradigms. In this study, we varied the configuration of stereo-cameras to investigate how changes in the horizontal and vertical disparity fields, conflicting with the vergence cue, affect perceived distance and depth. In four experiments, observers judged the distance of a cylinder displayed in front of a large fronto-parallel surface. Experiment 1 revealed that the presence of a background surface decreases the uncertainty in judgments of distance, suggesting that observers use the relative horizontal disparity between the target and the background as a cue to distance. Two other experiments showed that manipulating the pattern of vertical disparity affected both distance and depth perception. When vertical disparity specified a nearer distance than vergence (convergent cameras), perceived distance and depth were underestimated as compared with the condition where vertical disparity was congruent with vergence cues (parallel cameras). When vertical disparity specified a further distance than vergence, namely an infinite distance, distance and depth were overestimated. The removal of the vertical distortion lessened the effect on perceived distance. Overall, the results suggest that the vertical disparity introduced by the specific camera configuration is mainly responsible for the effect. These findings outline the role of vertical disparity in distance and depth perception and support the use of parallel cameras for designing stereograms.
Affiliation(s)
- Cyril Vienne
- Institut de recherche biomédicale des armées, Brétigny-sur-Orge, France
- Justin Plantier
- Institut de recherche biomédicale des armées, Brétigny-sur-Orge, France
- Pascaline Neveu
- Institut de recherche biomédicale des armées, Brétigny-sur-Orge, France
- Anne-Emmanuelle Priot
- Institut de recherche biomédicale des armées, Brétigny-sur-Orge, France; INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center, Bron, France
28
Abstract
Cañal-Bruland and van der Kamp present an argument about the incommensurate relationship between affordance perception and spatial perception in a criticism of Proffitt and Linkenauger’s phenotypic approach to perception. Many of their criticisms are based on a difference in the interpretation of the core ideas underlying the phenotypic approach. The most important of these differences in interpretations concern fundamental assumptions about the nature of the perceptions of size and distance themselves. Extent perception must be relative to the organism; therefore, there can be no veridical perception of space. Also, we argue in the phenotypic approach that space perception is an emergent property of affordance perception; they are not different types of perceptions as Cañal-Bruland and van der Kamp presume. Third, affordance perception need not be perfectly accurate, just good enough. Additionally, affordance perception need not be dichotomous; this presumption likely originates in the methodology typically employed to study affordance perception. Finally, I agree with Cañal-Bruland and van der Kamp that joint research efforts will clarify and improve our understanding of these issues.
29
Pigarev IN, Levichkina EV. Absolute Depth Sensitivity in Cat Primary Visual Cortex under Natural Viewing Conditions. Front Syst Neurosci 2016; 10:66. [PMID: 27547179 PMCID: PMC4974279 DOI: 10.3389/fnsys.2016.00066] [Received: 08/26/2015] [Accepted: 07/21/2016] [Indexed: 11/13/2022]
Abstract
Mechanisms of 3D perception, investigated in many laboratories, have defined depth either relative to the fixation plane or to other objects in the visual scene. It is obvious that for efficient perception of the 3D world, additional mechanisms of depth constancy could operate in the visual system to provide information about absolute distance. Neurons with properties reflecting some features of depth constancy have been described in the parietal and extrastriate occipital cortical areas. It has also been shown that, for some neurons in the visual area V1, responses to stimuli of constant angular size differ at close and remote distances. The present study was designed to investigate whether, in natural free gaze viewing conditions, neurons tuned to absolute depths can be found in the primary visual cortex (area V1). Single-unit extracellular activity was recorded from the visual cortex of waking cats sitting on a trolley in front of a large screen. The trolley was slowly approaching the visual scene, which consisted of stationary sinusoidal gratings of optimal orientation rear-projected over the whole surface of the screen. Each neuron was tested with two gratings, with spatial frequency of one grating being twice as high as that of the other. Assuming that a cell is tuned to a spatial frequency, its maximum response to the grating with a spatial frequency twice as high should be shifted to a distance half way closer to the screen in order to attain the same size of retinal projection. For hypothetical neurons selective to absolute depth, location of the maximum response should remain at the same distance irrespective of the type of stimulus. It was found that about 20% of neurons in our experimental paradigm demonstrated sensitivity to particular distances independently of the spatial frequencies of the gratings. We interpret these findings as an indication of the use of absolute depth information in the primary visual cortex.
Affiliation(s)
- Ivan N Pigarev
- Institute for Information Transmission Problems (Kharkevich Institute), Russian Academy of Sciences, Moscow, Russia
- Ekaterina V Levichkina
- Institute for Information Transmission Problems (Kharkevich Institute), Russian Academy of Sciences, Moscow, Russia; Department of Optometry and Vision Sciences, The University of Melbourne, Parkville, VIC, Australia
30
Jung E, Takahashi K, Watanabe K, de la Rosa S, Butz MV, Bülthoff HH, Meilinger T. The Influence of Human Body Orientation on Distance Judgments. Front Psychol 2016; 7:217. [PMID: 27014108 PMCID: PMC4784476 DOI: 10.3389/fpsyg.2016.00217]
Abstract
People maintain larger distances from other people's fronts than from their backs. We investigated whether humans also judge another person as closer when viewing their front rather than their back. Participants watched animated virtual characters (avatars) and, after each avatar was removed, moved a virtual plane to its remembered location. In Experiment 1, participants judged avatars facing them as closer, and made quicker estimates, than avatars looking away. In Experiment 2, avatars were rotated in 30-degree steps around the vertical axis. Observers judged avatars roughly facing them (i.e., looking at most 60 degrees away) as closer than avatars roughly looking away. No particular effect was observed for avatars directly facing and also gazing at the observer. We conclude that body orientation was sufficient to generate the asymmetry. Sensitivity of the orientation effect to gaze or to interpersonal distance would have suggested the involvement of social processing, but neither was observed. We discuss social and lower-level processing as potential explanations for the effect.
Affiliation(s)
- Edgard Jung
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany; University of Innsbruck, Innsbruck, Austria
- Kohske Takahashi
- Research Center for Advanced Science and Technology, The University of Tokyo, Tokyo, Japan
- Katsumi Watanabe
- Research Center for Advanced Science and Technology, The University of Tokyo, Tokyo, Japan; Waseda University, Tokyo, Japan
- Heinrich H Bülthoff
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany; Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
- Tobias Meilinger
- Max Planck Institute for Biological Cybernetics, Tübingen, Germany
31
Yang H, Cai BN, Wang XS, Cong XH, Xu W, Wang JY, Yang J, Xu SP, Ju ZJ, Ma L. Dose Evaluation of Fractionated Schema and Distance From Tumor to Spinal Cord for Spinal SBRT with Simultaneous Integrated Boost: A Preliminary Study. Med Sci Monit 2016; 22:598-607. [PMID: 26902177 PMCID: PMC4767138 DOI: 10.12659/msm.897146]
Abstract
BACKGROUND This study investigated and quantified the dosimetric impact of the distance from the tumor to the spinal cord and of the fractionation scheme for patients who received stereotactic body radiation therapy (SBRT) with a hypofractionated simultaneous integrated boost (HF-SIB). MATERIAL AND METHODS Six modified planning target volumes (PTVs) for 5 patients with spinal metastases were created by artificial uniform extension of the region of the PTV adjacent to the spinal cord, with a specified minimum tumor-to-cord distance (0-5 mm). The prescription dose (biologically equivalent dose, BED) was 70 Gy delivered in different fractionation schemes (1, 3, 5, and 10 fractions). PTV V100, Dmin, D98, D95, and D1, spinal cord dose, conformity index (CI), and V30 were measured and compared. RESULTS PTV-to-cord distance significantly influenced PTV V100, Dmin, D98, and D95, and the fractionation scheme significantly influenced Dmin and D98. Distances of ≥2 mm, ≥1 mm, ≥1 mm, and ≥0 mm from the PTV to the spinal cord met dose requirements in 1, 3, 5, and 10 fractions, respectively. Spinal cord dose, CI, and V30 were not affected by PTV-to-cord distance or fractionation scheme. CONCLUSIONS Target volume coverage, Dmin, D98, and D95 were directly correlated with distance from the spinal cord for spinal SBRT with HF-SIB. Based on our study, ≥2 mm, ≥1 mm, ≥1 mm, and ≥0 mm distance from the PTV to the spinal cord meets dose requirements in 1, 3, 5, and 10 fractions, respectively.
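Comparing 1-, 3-, 5-, and 10-fraction schemes at a fixed BED of 70 Gy implies a different physical dose per fraction for each scheme. A hedged sketch using the standard linear-quadratic model (the α/β ratio of 10 Gy is a common tumor assumption and is not stated in the abstract):

```python
import math

def bed(n: int, d: float, alpha_beta: float = 10.0) -> float:
    """Biologically effective dose (Gy) for n fractions of d Gy (LQ model)."""
    return n * d * (1 + d / alpha_beta)

def dose_per_fraction(n: int, target_bed: float = 70.0,
                      alpha_beta: float = 10.0) -> float:
    """Positive root of n*d*(1 + d/alpha_beta) = target_bed, solved for d."""
    # Quadratic in d: (n/ab)*d^2 + n*d - target_bed = 0
    return (alpha_beta / 2) * (
        math.sqrt(1 + 4 * target_bed / (n * alpha_beta)) - 1)

for n in (1, 3, 5, 10):
    d = dose_per_fraction(n)
    print(f"{n:2d} fractions: {d:.2f} Gy/fraction, BED = {bed(n, d):.1f} Gy")
```

As expected, the single-fraction scheme requires a much larger dose per fraction than the 10-fraction scheme to reach the same BED.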
Affiliation(s)
- Hao Yang
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Department of Radiation Oncology, Inner Mongolia Cancer Hospital and The Affiliated People’s Hospital of Inner Mongolia Medical University, Hohhot, Inner Mongolia, P.R. China
- Bo-ning Cai
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Xiao-shen Wang
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Xiao-hu Cong
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Wei Xu
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Jin-yuan Wang
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Jun Yang
- Department of Oncology, First Affiliated Hospital of Xinxiang Medical University, Weihui, Henan, P.R. China
- Shou-ping Xu
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Zhong-jian Ju
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Lin Ma
- Department of Radiation Oncology, Chinese PLA General Hospital, Beijing, P.R. China
- Department of Radiation Oncology, Hainan Branch of Chinese PLA General Hospital, Haitang Bay, Sanya, Hainan, P.R. China
32
Abstract
Facial expressions of emotion are thought to convey expressers' behavioral intentions, thus appropriately priming observers' approach and avoidance tendencies. The present study examined whether detecting expressions of behavioral intent influences perceivers' estimates of the expresser's distance from them. Eighteen undergraduates (nine male and nine female) participated in the study. Six facial expressions were chosen on the basis of degree of threat: anger and hate (threatening expressions), shame and surprise (neutral expressions), and pleasure and joy (safe expressions). Each facial expression was presented on a tablet PC held by an assistant covered by a black drape who stood 1, 2, or 3 m from participants. Participants performed a visual matching task to report the perceived distance. Results showed that facial expression influenced distance estimation: faces exhibiting threatening or safe expressions were judged closer than those showing neutral expressions. Females' judgments were more likely to be influenced, but these influences largely disappeared beyond 2 m. These results suggest that facial expressions of emotion (particularly threatening or safe emotions) influence others' (especially females') distance estimates, but only within close proximity.
Affiliation(s)
- Nam-Gyoon Kim
- Department of Psychology, Keimyung University, Daegu, Korea
- Heejung Son
- Department of Psychology, Keimyung University, Daegu, Korea
33
Abstract
PURPOSE Space perception beyond the near distance range (>2 m) is important for target localization and for directing and guiding a variety of daily activities, including driving and walking. However, it is unclear whether the absolute (egocentric) localization of a single target in the intermediate distance range requires binocular vision, and if so, whether the subnormal stereopsis found in strabismus impairs one's ability to localize the target. METHODS We investigated this by measuring the perceived absolute location of a target in observers with normal binocular vision (n = 8; mean age, 24.5 years) and observers with strabismus (n = 8; mean age, 24.9 years) under monocular and binocular conditions. The observers used a blind walking-gesturing task to indicate the judged location of a target placed at various viewing distances (2.73-6.93 m) and heights (0, 30, and 90 cm) above the floor. Near stereopsis was assessed with the Randot Stereotest. RESULTS Both groups of observers accurately judged the absolute distance of a target on the ground (height = 0 cm) with either monocular or binocular viewing. However, when the target was suspended in midair, the normal observers accurately judged target location with binocular viewing but not with monocular viewing (mean slant angle, 0.8° ± 0.5° vs. 7.4° ± 1.4°; P < 0.001, with a slant angle of 0° representing accurate localization). In contrast, the strabismic observers, who had poorer stereoacuity, made larger localization errors in both viewing conditions, though fewer errors during binocular viewing (mean slant angle, 2.7° ± 0.4° vs. 9.2° ± 1.3°; P < 0.0025). Further analysis revealed that the localization error (slant angle) correlated positively with stereo threshold during binocular viewing (r(2) = 0.479, P < 0.005) but not during monocular viewing (r(2) = 0.0002, P = 0.963).
CONCLUSIONS Monocular depth information is sufficient for locating a single target on the ground, but binocular depth information is required when the target is suspended in midair. Since the absolute binocular disparity information for a single target is weak beyond 2 m, we suggest that the visual system localizes the target using the relative binocular disparity between the midair target and the visible ground surface. Consequently, strabismic observers with residual stereopsis localize a target more accurately than their counterparts without stereo ability.
Affiliation(s)
- Teng Leng Ooi
- The Ohio State University, Columbus, Ohio, United States
- Zijiang J. He
- University of Louisville, Louisville, Kentucky, United States
34
Abstract
The auditory system derives locations of sound sources from spatial cues provided by the interaction of sound with the head and external ears. Those cues are analyzed in specific brainstem pathways and then integrated as cortical representation of locations. The principal cues for horizontal localization are interaural time differences (ITDs) and interaural differences in sound level (ILDs). Vertical and front/back localization rely on spectral-shape cues derived from direction-dependent filtering properties of the external ears. The likely first sites of analysis of these cues are the medial superior olive (MSO) for ITDs, lateral superior olive (LSO) for ILDs, and dorsal cochlear nucleus (DCN) for spectral-shape cues. Localization in distance is much less accurate than that in horizontal and vertical dimensions, and interpretation of the basic cues is influenced by additional factors, including acoustics of the surroundings and familiarity of source spectra and levels. Listeners are quite sensitive to sound motion, but it remains unclear whether that reflects specific motion detection mechanisms or simply detection of changes in static location. Intact auditory cortex is essential for normal sound localization. Cortical representation of sound locations is highly distributed, with no evidence for point-to-point topography. Spatial representation is strictly contralateral in laboratory animals that have been studied, whereas humans show a prominent right-hemisphere dominance.
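The principal horizontal cue described above, the interaural time difference, can be put in rough quantitative terms with the classic Woodworth spherical-head approximation, ITD ≈ (r/c)(θ + sin θ). This is an illustration added here, not a formula from the chapter; the head radius and speed of sound are assumed typical values:

```python
import math

def itd_woodworth(azimuth_deg: float, head_radius_m: float = 0.0875,
                  speed_of_sound: float = 343.0) -> float:
    """Interaural time difference (seconds) for a distant source at the
    given azimuth, per Woodworth's spherical-head approximation."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:2d} deg azimuth -> ITD {itd_woodworth(az) * 1e6:.0f} us")
```

The model gives an ITD of zero straight ahead, growing to roughly 650 µs at 90° azimuth for an average adult head, which is the order of magnitude the MSO must resolve.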
Affiliation(s)
- John C Middlebrooks
- Departments of Otolaryngology, Neurobiology and Behavior, Cognitive Sciences, and Biomedical Engineering, University of California at Irvine, Irvine, CA, USA
35
Abstract
Past research has shown that auditory distance estimation improves when listeners are given the opportunity to see all possible sound sources when compared to no visual input. It has also been established that distance estimation is more accurate in vision than in audition. The present study investigates the degree to which auditory distance estimation is improved when matched with a congruent visual stimulus. Virtual sound sources based on binaural room impulse response (BRIR) measurements made from distances ranging from approximately 0.3 to 9.8 m in a concert hall were used as auditory stimuli. Visual stimuli were photographs taken from the participant's perspective at each distance in the impulse response measurement setup presented on a large HDTV monitor. Participants were asked to estimate egocentric distance to the sound source in each of three conditions: auditory only (A), visual only (V), and congruent auditory/visual stimuli (A+V). Each condition was presented within its own block. Sixty-two participants were tested in order to quantify the response variability inherent in auditory distance perception. Distance estimates from both the V and A+V conditions were found to be considerably more accurate and less variable than estimates from the A condition.
Affiliation(s)
- Paul W Anderson
- Department of Psychological and Brain Sciences, University of Louisville, Louisville, KY, USA
- Pavel Zahorik
- Department of Psychological and Brain Sciences, University of Louisville, Louisville, KY, USA; Division of Communicative Disorders, Department of Surgery, School of Medicine, University of Louisville, Louisville, KY, USA
36
Soliman TM, Glenberg AM. How intent to interact can affect action scaling of distance: reply to Wilson. Front Psychol 2014; 5:513. [PMID: 24926272 PMCID: PMC4046488 DOI: 10.3389/fpsyg.2014.00513]
Affiliation(s)
- Tamer M Soliman
- Department of Psychology, Arizona State University, Tempe, AZ, USA
37
Abstract
Being objectively close to or far from a place changes how people perceive the location of that place in a subjective, psychological sense. In the six studies reported here, we investigated whether people's spatial orientation (defined as moving toward or away from a place) produces similar effects by specifically influencing psychological closeness in each of its forms (i.e., spatial, temporal, probabilistic, and social distance). Orientation influenced subjective spatial distance at various levels of objective distance (Study 1), regardless of the direction people were facing (Study 2). In addition, when spatially oriented toward, rather than away from, a particular place, participants felt that events there had occurred more recently (Studies 3a and 3b) and that events there would be more likely to occur (Study 4). Finally, participants felt more similarity to people who were spatially oriented toward them than to people who were spatially oriented away from them (Study 5). Our investigation broadens the study of psychological distance from static spatial locations to dynamically moving points in space.
Affiliation(s)
- Sam J Maglio
- Department of Marketing, University of Toronto Scarborough
- Evan Polman
- Department of Marketing, University of Wisconsin-Madison
38
Abstract
Visual perception is an important component of environmental navigation. Previous research has revealed large individual differences in navigational strategies (i.e., the body's kinesthetic and embodied approach to movement) and the perception of environmental surfaces (via distance estimations), but little research has investigated the potential relationship between these sources of individual variation. An important navigational strategy is the interaction between reliance on visual cues and vestibular or proprioceptive cues. We investigated the role of this navigational strategy in the perception of environmental surfaces. The results supported three embodied evolutionary predictions: Individuals who were most reliant on visual context (1) overestimated vertical surfaces significantly more, and (2) feared falling significantly more, than did those who were least reliant on visual context; and (3) all individuals had roughly accurate horizontal distance estimates, regardless of their navigational strategy. These are among the first data to suggest that individual differences in perception are closely related to the individual differences in navigation that derive from navigational risks. Variable navigational strategies may reflect variable capacities to perceive and navigate the environment.
Affiliation(s)
- Chéla R Willey
- Department of Psychology, University of California Los Angeles, 1285 Franz Hall, Los Angeles, CA 90095, USA
39
Abstract
The ground plane is thought to be an important reference for localizing objects, particularly when angular declination is informative, as it is for objects seen resting at floor level. A potential role for eye movements has been implicated by the idea that information about the nearby ground is required to localize objects more distant, and by the fact that the time course for the extraction of distance extends beyond the duration of a typical eye fixation. To test this potential role, eye movements were monitored when participants previewed targets. Distance estimates were provided by walking without vision to the remembered target location (blind walking) or by verbal report. We found that a strategy of holding the gaze steady on the object was as frequent as one where the region between the observer and object was fixated. There was no performance advantage associated with making eye movements in an observational study (Experiment 1) or when an eye-movement strategy was manipulated experimentally (Experiment 2). Observers were extracting useful information covertly, however. In Experiments 3 through 5, obscuring the nearby ground plane had a modest impact on performance; obscuring the walls and ceiling was more detrimental. The results suggest that these alternate surfaces provide useful information when judging the distance to objects within indoor environments. Critically, they constrain the role for the nearby ground plane in theories of egocentric distance perception.
Affiliation(s)
- Daniel A. Gajewski
- Department of Psychology, George Washington University, Washington, DC, USA
- Courtney P. Wallin
- Department of Psychology, George Washington University, Washington, DC, USA
- John W. Philbeck
- Department of Psychology, George Washington University, Washington, DC, USA
40
Clément G, Skinner A, Lathan C. Distance and Size Perception in Astronauts during Long-Duration Spaceflight. Life (Basel) 2013; 3:524-37. [PMID: 25369884 PMCID: PMC4187133 DOI: 10.3390/life3040524]
Abstract
Exposure to microgravity during spaceflight is known to elicit orientation illusions, errors in sensory localization, postural imbalance, changes in vestibulo-spinal and vestibulo-ocular reflexes, and space motion sickness. The objective of this experiment was to investigate whether an alteration in cognitive visual-spatial processing, such as the perception of distance and size of objects, is also taking place during prolonged exposure to microgravity. Our results show that astronauts on board the International Space Station exhibit biases in the perception of their environment. Objects' heights and depths were perceived as taller and shallower, respectively, and distances were generally underestimated in orbit compared to Earth. These changes may occur because the perspective cues for depth are less salient in microgravity or the eye-height scaling of size is different when an observer is not standing on the ground. This finding has operational implications for human space exploration missions.
Affiliation(s)
- Gilles Clément
- International Space University, Parc d'Innovation, 1 rue Jean-Dominique Cassini, Illkirch-Graffenstaden F-67400, France
- Anna Skinner
- AnthroTronix, Inc., 8737 Colesville Road, Suite L203, Silver Spring, MD 20910, USA
- Corinna Lathan
- AnthroTronix, Inc., 8737 Colesville Road, Suite L203, Silver Spring, MD 20910, USA
41
Abstract
Sensorimotor mechanisms can unify explanations at cognitive, social, and cultural levels. As an example, we review how anticipated motor effort is used by individuals and groups to judge distance: the greater the anticipated effort, the greater the perceived distance. Anticipated motor effort can also be used to understand cultural differences. People with interdependent self-construals interact almost exclusively with in-group members, and hence there is little opportunity to tune their sensorimotor systems for interaction with out-group members. The result is that interactions with out-group members are expected to be difficult and out-group members are perceived as literally more distant. In two experiments we show (a) interdependent Americans, compared to independent Americans, see American confederates (in-group) as closer; (b) interdependent Arabs, compared to independent Arabs, perceive Arab confederates (in-group) as closer, whereas interdependent Americans perceive Arab confederates (out-group) as farther. These results demonstrate how the same embodied mechanism can seamlessly contribute to explanations at the cognitive, social, and cultural levels.
Affiliation(s)
- Tamer Soliman
- Department of Psychology, Arizona State University, Tempe, AZ, USA
- Alison Gibson
- Department of Psychology, Arizona State University, Tempe, AZ, USA
42
Takahashi K, Meilinger T, Watanabe K, Bülthoff HH. Psychological influences on distance estimation in a virtual reality environment. Front Hum Neurosci 2013; 7:580. [PMID: 24065905 PMCID: PMC3776303 DOI: 10.3389/fnhum.2013.00580]
Abstract
Studies of embodied perception have revealed that social, psychological, and physiological factors influence space perception. While many of these influences were observed with real or highly realistic stimuli, the present work showed that even the orientation of abstract geometric objects in a non-realistic virtual environment could influence distance perception. Observers wore a head mounted display and watched virtual cones moving within an invisible cube for 5 s with their head movement recorded. Subsequently, the observers estimated the distance to the cones or evaluated their friendliness. The cones either faced the observer, a target behind the cones, or were oriented randomly. The average viewing distance to the cones varied between 1.2 and 2.0 m. At a viewing distance of 1.6 m, the observers perceived the cones facing them as closer than the cones facing a target in the opposite direction, or those oriented randomly. Furthermore, irrespective of the viewing distance, observers moved their head away from the cones more strongly and evaluated the cones as less friendly when the cones faced the observers. Similar distance estimation results were obtained with a 3-dimensional projection onto a large screen, although the effective viewing distances were farther away. These results suggest that factors other than physical distance influenced distance perception even with non-realistic geometric objects in a virtual environment. Furthermore, the distance perception modulation was accompanied by changes in subjective impression and avoidance movement. We propose that cones facing an observer are perceived as socially discomforting or threatening, and potentially violate an observer's personal space, which might influence the perceived distance of cones.
Affiliation(s)
- Kohske Takahashi
- Research Center for Advanced Science and Technology, The University of Tokyo, Tokyo, Japan
43
Abstract
Humans can perceive depth when viewing with one eye, and even when viewing a two-dimensional picture of a three-dimensional scene. However, viewing a real scene with both eyes produces a more compelling three-dimensional experience of immersive space and tangible solid objects. A widely held belief is that this qualitative visual phenomenon (stereopsis) is a by-product of binocular vision. In the research reported here, we empirically established, for the first time, the qualitative characteristics associated with stereopsis to show that they can occur for static two-dimensional pictures without binocular vision. Critically, we show that stereopsis is a measurable qualitative attribute and that its induction while viewing pictures is not consistent with standard explanations based on depth-cue conflict or the perception of greater depth magnitude. These results challenge the conventional understanding of the underlying cause, variation, and functional role of stereopsis.
44
Abstract
The question of whether defocus blur is a quantitative cue for depth perception is a topic of renewed interest. A recent study suggests that relative defocus blur can be used in computing depth throughout the visual field, particularly in regions where disparity loses precision. However, elements of the study's experimental design and theoretical analysis appear to undermine this claim. First, the study did not provide evidence that blur can be used as a quantitative depth cue: it measured only blur discrimination thresholds, not perceived depth from blur. Second, the study's conceptualization of the complementary use of blur and disparity, and related conjectures, are based on the specific viewing geometry and fixation distance tested; they do not appear to generalize to natural viewing situations and tasks. I suggest a different way in which defocus blur might affect depth perception. Because depth-of-focus blur is a cue to egocentric distance, it could contribute to quantitative depth perception by scaling the depth relations specified by other relative depth cues.
Affiliation(s)
- Dhanraj Vishwanath
- School of Psychology, University of St Andrews, Fife KY16 9JP, UK
45
Abstract
There is controversy over the existence, nature, and cause of error in egocentric distance judgments. One proposal is that the systematic biases often found in explicit judgments of egocentric distance along the ground may be related to recently observed biases in the perceived declination of gaze (Durgin & Li, Attention, Perception, & Psychophysics, in press). To measure perceived egocentric distance nonverbally, observers in a field were asked to position themselves so that their distance from one of two experimenters was equal to the frontal distance between the experimenters. Observers placed themselves too far away, consistent with egocentric distance underestimation. A similar experiment was conducted with vertical frontal extents. Both experiments were replicated in panoramic virtual reality. Perceived egocentric distance was quantitatively consistent with the angular bias in perceived gaze declination (a gain of 1.5). Finally, an exocentric distance-matching task was contrasted with a variant of the egocentric matching task. The egocentric matching data approximate a constant compression of perceived egocentric distance, with a power-function exponent of nearly 1; exocentric matches had an exponent of about 0.67. The divergent pattern between egocentric and exocentric matches suggests that they depend on different visual cues.
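The 1.5 angular gain translates directly into compressed ground distance: if the gaze declination toward a target at eye height h and distance d is β = atan(h/d) but is perceived as 1.5β, the distance implied by the perceived angle is h/tan(1.5β). A minimal sketch of this account (the eye height of 1.6 m is an assumed value, not from the article):

```python
import math

def perceived_distance(actual_m: float, eye_height_m: float = 1.6,
                       gain: float = 1.5) -> float:
    """Perceived egocentric ground distance if gaze declination is
    exaggerated by `gain` (the angular-bias account sketched above)."""
    declination = math.atan2(eye_height_m, actual_m)    # actual gaze angle
    return eye_height_m / math.tan(gain * declination)  # distance implied by biased angle

for d in (3, 5, 10, 20):
    print(f"{d:2d} m -> perceived {perceived_distance(d):.1f} m")
```

For distances well beyond eye height the small-angle approximation gives a perceived distance of roughly d/1.5, i.e., a near-constant compression, consistent with the exponent of nearly 1 reported above.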
Affiliation(s)
- Zhi Li
- Swarthmore College, Department of Psychology, 500 College Ave, Swarthmore, PA 19081, USA
- John Phillips
- Swarthmore College, Department of Psychology, 500 College Ave, Swarthmore, PA 19081, USA
- Frank H. Durgin
- Swarthmore College, Department of Psychology, 500 College Ave, Swarthmore, PA 19081, USA
46
Affiliation(s)
- Bevil R. Conway
- Department of Neurobiology, Harvard Medical School
- Neuroscience Program, Wellesley College
47
Abstract
We propose a novel method to probe the depth structure of the pictorial space evoked by paintings. The method involves an exocentric pointing paradigm that allows one to find the slope of the geodesic connection between any pair of points in pictorial space. Since the locations of the points in the picture plane are known, this immediately yields the depth difference between the points. A set of depth differences between all pairs of points from an N-point (N > 2) configuration then yields the configuration in depth up to an arbitrary depth offset. Since an N-point configuration implies N(N−1) (ordered) pairs, the number of observations typically far exceeds the number of inferred depths. This yields a powerful check on the geometrical consistency of the results. We report that the remaining inconsistencies are fully accounted for by the spread encountered in repeated observations. This implies that the concept of ‘pictorial space’ indeed has an empirical significance. The method is analyzed and empirically verified in considerable detail. We report large quantitative interobserver differences, though the results of all observers agree modulo a certain affine transformation that describes the basic cue ambiguities. This is expected on the basis of a formal analysis of monocular optical structure. The method will prove useful in a variety of potential applications.
Affiliation(s)
- Johan Wagemans
- University of Leuven, Laboratory of Experimental Psychology, Tiensestraat 102-box 3711, 3000 Leuven, Belgium
48
Berryhill ME, Olson IR. The representation of object distance: evidence from neuroimaging and neuropsychology. Front Hum Neurosci 2009; 3:43. [PMID: 19949468 PMCID: PMC2784298 DOI: 10.3389/neuro.09.043.2009]
Abstract
Perceived distance in two-dimensional (2D) images relies on monocular distance cues. Here, we examined the representation of perceived object distance using a continuous carry-over adaptation design for fMRI. The task was to look at photographs of objects and make a judgment as to whether or not the item belonged in the kitchen. Importantly, this task was orthogonal to the variable of interest: the object's perceived distance from the viewer. In Experiment 1, whole brain group analyses identified bilateral clusters in the superior occipital gyrus (approximately area V3/V3A) that showed parametric adaptation to relative changes in perceived distance. In Experiment 2, retinotopic analyses confirmed that area V3A/B reflected the greatest magnitude of response to monocular changes in perceived distance. In Experiment 3, we report that the functional activations overlap with the occipito-parietal lesions in a patient with impaired distance perception, showing that the same regions monitor implied (2D) and actual (three-dimensional) distance. These data suggest that distance information is automatically processed even when it is task-irrelevant and that this process relies on superior occipital areas in and around area V3A.
Affiliation(s)
- Marian E Berryhill
- Department of Psychology, Temple University, Philadelphia, PA 19104-6196, USA
49
Abstract
Previous research on perceiving spatial layout has found that people often exhibit normative biases in their perception of the environment. For instance, slant is typically overestimated and distance is usually underestimated. Surprisingly, however, the perception of height has rarely been studied. The present experiments examined the perception of height when viewed from the top (e.g., looking down) or from the bottom (e.g., looking up). Multiple measures were adapted from previous studies of horizontal extents to assess the perception of height. Across all of the measures, a large, consistent bias was found: Vertical distances were greatly overestimated, especially from the top. Secondary findings suggest that the overestimation of distance and size that occurs when looking down from a high place correlates with reports of trait- and state-level fear of heights, suggesting that height overestimation may be due, in part, to fear.