1. Castet E, Termoz-Masson J, Vizcay S, Delachambre J, Myrodia V, Aguilar C, Matonti F, Kornprobst P. PTVR - A software in Python to make virtual reality experiments easier to build and more reproducible. J Vis 2024;24:19. [PMID: 38652657] [PMCID: PMC11044846] [DOI: 10.1167/jov.24.4.19]
Abstract
Researchers increasingly use virtual reality (VR) to perform behavioral experiments, especially in vision science. These experiments are usually programmed directly in game engines, which are extremely powerful but tricky and time-consuming to master. The anticipated effort therefore discourages many researchers who would otherwise engage in VR. This paper introduces the Perception Toolbox for Virtual Reality (PTVR) library, which allows visual perception studies in VR to be created with high-level Python scripting. A crucial consequence of using a script is that an experiment can be described by a single, easy-to-read piece of code, improving the transparency, reproducibility, and reusability of VR studies. We built our library upon a seminal open-source library released in 2018 and have developed it considerably since then. This paper provides the first comprehensive overview of the PTVR software. We introduce the main objects and features of PTVR, along with some general concepts related to the three-dimensional (3D) world. This new library should dramatically reduce the difficulty of programming experiments in VR and enable a whole new set of visual perception studies with high ecological validity.
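To make the single-script idea concrete, a minimal sketch in plain Python is shown below: the whole experimental design lives in one short, readable file that regenerates the same specification on every run. The class and field names here are hypothetical illustrations, not PTVR's actual API (see the paper and the library documentation for the real interface).

```python
# Hypothetical sketch of the single-script idea; these names are
# illustrative only and are NOT PTVR's actual API.
import json
import random
from dataclasses import dataclass, asdict

@dataclass
class Trial:
    eccentricity_deg: float  # target distance from fixation (visual angle)
    size_deg: float          # target size in degrees of visual angle

random.seed(42)  # fixed seed so the trial order is exactly reproducible
trials = [Trial(ecc, size)
          for ecc in (2.0, 4.0, 8.0)
          for size in (0.5, 1.0)]
random.shuffle(trials)

# Serialize the full design; re-running the script yields the same file,
# which is what makes the experiment transparent and reusable.
with open("experiment_spec.json", "w") as f:
    json.dump([asdict(t) for t in trials], f, indent=2)
```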
Affiliation(s)
- Eric Castet
- Aix Marseille Univ, CNRS, CRPN, Marseille, France
2. Beyond motion extrapolation: vestibular contribution to head-rotation-induced flash-lag effects. Psychol Res 2022;86:2083-2098. [DOI: 10.1007/s00426-021-01638-8]
3. Dong X, Bao M. The growing sensory suppression on visual perception during head-rotation preparation. Psych J 2021;10:499-507. [PMID: 33665982] [DOI: 10.1002/pchj.438]
Abstract
Sensory perception is often impaired by self-generated movements. This sensory suppression has commonly been observed for tactile sensations induced by voluntary hand movements, during both motor preparation and execution. However, it remains unclear whether such suppression also occurs in the visual domain and whether it can be induced by the preparation of other body movements. To extend our knowledge of sensory suppression, the present study investigated visual sensitivity during the preparation of head rotation. Participants wore virtual reality goggles and rotated their heads horizontally according to a visual cue presented on the goggles' screens. Before the start of head rotation, a Landolt C target was displayed either at a peripheral location indicated by the head-rotation cue or at a symmetric location in the opposite visual field. After each head rotation, participants reported the target's orientation, allowing the discrimination threshold to be measured. Discrimination sensitivity was also measured in two head-still conditions, with or without the presentation of a visual cue. The results showed that discrimination performance was substantially impaired by the preparation of head rotation, and this sensory attenuation increased as head-motion onset approached. However, no attenuation was found for the discrimination of auditory stimuli during the preparation of head rotation, ruling out an account based on general dual-task demands. In contrast to previous findings of improved perception during saccade or reach preparation, our findings indicate that sensory suppression, rather than an attention shift, plays the major role during the preparation of head movement.
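As a generic illustration of how such a discrimination threshold can be estimated, the sketch below simulates a standard 1-up/2-down adaptive staircase, which converges on the ~70.7%-correct point. This is a textbook psychophysics technique, not the authors' reported procedure, and all names and parameter values are hypothetical.

```python
# Minimal 1-up/2-down staircase for estimating a discrimination threshold.
# Generic psychophysics sketch; NOT the procedure reported in the paper.
import random

def run_staircase(true_threshold=1.0, start_size=2.0, step=0.1, n_trials=60):
    """Simulate trials where smaller Landolt-C gaps are harder to judge."""
    size = start_size
    correct_streak = 0
    reversals = []
    last_direction = None
    for _ in range(n_trials):
        # Simulated observer: responds correctly more often for larger gaps.
        p_correct = 0.5 + 0.5 * min(1.0, max(0.0, size / (2 * true_threshold)))
        if random.random() < p_correct:
            correct_streak += 1
            if correct_streak == 2:          # 2 correct -> harder (smaller gap)
                correct_streak = 0
                if last_direction == "up":
                    reversals.append(size)   # direction change = reversal
                size, last_direction = max(step, size - step), "down"
        else:
            correct_streak = 0
            if last_direction == "down":     # 1 wrong -> easier (larger gap)
                reversals.append(size)
            size, last_direction = size + step, "up"
    # Threshold estimate: mean gap size over the last few reversals.
    tail = reversals[-6:]
    return sum(tail) / max(1, len(tail))

print(f"Estimated threshold: {run_staircase():.2f} deg")
```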
Affiliation(s)
- Xue Dong
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Min Bao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
4. Bai J, He X, Jiang Y, Zhang T, Bao M. Rotating One's Head Modulates the Perceived Velocity of Motion Aftereffect. Multisens Res 2020;33:189-212. [PMID: 31648199] [DOI: 10.1163/22134808-20191477]
Abstract
As a prominent illusion, the motion aftereffect (MAE) has traditionally been considered a purely visual phenomenon. Recent neuroimaging work has revealed increased activity in MT+ and decreased activity in vestibular regions during the MAE, supporting the notion of visual-vestibular interaction in the MAE. Because the head had to remain stationary in fMRI experiments, however, vestibular self-motion signals were absent in those studies, so more direct evidence of whether and how vestibular signals modulate the MAE is still lacking. Using a virtual reality approach, the present study demonstrates for the first time that horizontal head rotation affects the perceived velocity of the MAE. We found that the MAE was predominantly perceived as moving faster when its direction was opposite to that of the head rotation than when the two directions were the same, and the magnitude of this effect was positively correlated with the velocity of head rotation. Similar patterns were not observed for real motion stimuli. Our findings support a 'cross-modal bias' hypothesis: after long-term exposure to a multisensory environment, the brain develops a strong association between signals from the visual and vestibular pathways. Consequently, weak biasing visual signals in the associated direction can emerge spontaneously when vestibular signals reach multisensory brain areas, substantially modulating the illusory visual motion represented in those areas. This hypothesis may also explain other multisensory integration phenomena.
Affiliation(s)
- Jianying Bai
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China
- Xinjiang Astronomical Observatory, Chinese Academy of Sciences, Urumqi 830011, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Xin He
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Yi Jiang
- State Key Laboratory of Brain and Cognitive Science, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- CAS Center for Excellence in Brain Science and Intelligence Technology, Shanghai, China
- Tao Zhang
- State Key Laboratory of Brain and Cognitive Science, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
- Min Bao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing 100101, China
- State Key Laboratory of Brain and Cognitive Science, Beijing 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing 100049, China
5. Bao M, Engel SA. Augmented Reality as a Tool for Studying Visual Plasticity: 2009 to 2018. Curr Dir Psychol Sci 2019. [DOI: 10.1177/0963721419862290]
Abstract
Augmented reality (AR) has developed rapidly since its conception less than 30 years ago and is now a hot topic for both consumers and scientists. Although much attention has been paid to its applications in industry, medicine, education, and entertainment, the use of AR in psychological research has been less widely noted. In this article, we survey recent progress in basic research that uses AR to explore the plasticity of the adult visual system. We focus on a particular application of AR called altered reality, which has been used to shed new light on the mechanisms of long-term contrast adaptation and ocular-dominance plasticity. The results suggest that AR could also be a useful tool for the treatment of visual disorders.
Affiliation(s)
- Min Bao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- State Key Laboratory of Brain and Cognitive Science, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China