1. Attentional modulations of audiovisual interactions in apparent motion: Temporal ventriloquism effects on perceived visual speed. Atten Percept Psychophys 2022; 84:2167-2185. PMID: 35996056; DOI: 10.3758/s13414-022-02555-7.
Abstract
The timing of brief stationary sounds has been shown to alter different aspects of visual motion, such as speed estimation. These effects of auditory timing have been explained by temporal ventriloquism and by auditory dominance over visual information in the temporal domain. Although previous studies provide compelling evidence for the multisensory nature of speed estimation, how attention is involved in these audiovisual interactions remains unclear. Here, we aimed to understand the effects of spatial attention on these audiovisual interactions in time. We used a set of audiovisual stimuli that elicit temporal ventriloquism in visual apparent motion and asked participants to perform a speed comparison task. We manipulated attention in either the visual or the auditory domain and systematically varied the number of moving objects in the visual field. When attention was diverted to a stationary object in the visual field via a secondary task, the temporal ventriloquism effects on perceived speed decreased. On the other hand, focusing attention on the auditory stimuli facilitated these effects consistently across different difficulty levels of the secondary auditory task. Moreover, the effects of auditory timing on perceived speed did not change with the number of moving objects and were present in all experimental conditions. Taken together, our findings reveal differential effects of allocating attentional resources in the visual and auditory domains. These behavioral results also demonstrate that reliable temporal ventriloquism effects on visual motion can be induced even in the presence of multiple moving objects in the visual field and under different perceptual load conditions.
2. Makowski D, Lau ZJ, Pham T, Boyce WP, Chen SHA. A Parametric Framework to Generate Visual Illusions Using Python. Perception 2021; 50:950-965. PMID: 34841973; DOI: 10.1177/03010066211057347.
Abstract
Visual illusions are fascinating phenomena that have been used and studied by artists and scientists for centuries, leading to important discoveries about the neurocognitive underpinnings of perception, consciousness, and neuropsychiatric disorders such as schizophrenia and autism. Surprisingly, despite their historical and theoretical importance as psychological stimuli, there is no dedicated software, nor any consistent approach, for generating illusions in a systematic fashion. Instead, scientists have to craft them by hand in idiosyncratic ways, or use pre-made images not tailored to the specific needs of their studies. This, in turn, hinders the reproducibility of illusion-based research, narrowing the possibilities for scientific breakthroughs and their applications. To address this gap, we present Pyllusion, a Python-based open-source package (freely available at https://github.com/RealityBending/Pyllusion) that offers a framework for manipulating and generating illusions in a systematic way, compatible with different output formats such as image files (.png, .jpg, .tiff, etc.) and experimental software such as PsychoPy.
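The parametric idea the abstract describes can be illustrated with a minimal, self-contained sketch: a few high-level parameters (illusion strength, physical difference) are mapped deterministically to the low-level geometry of a stimulus, so the same display can be regenerated exactly from its parameters. The example below is a hypothetical re-implementation for a Müller-Lyer display using only the standard library; it is not Pyllusion's actual API, and all function and parameter names are assumptions.

```python
import math

def mueller_lyer_parameters(illusion_strength=30.0, difference=0.0,
                            size=0.5, distance=1.0):
    """Compute line segments for a Mueller-Lyer display from high-level
    parameters (a hypothetical sketch of a parametric framework, not
    Pyllusion's real API).

    illusion_strength: fin angle in degrees relative to the horizontal line.
    difference: relative physical length difference between the two lines
        (positive lengthens the top line, negative lengthens the bottom one).
    size, distance: half-length of the lines and vertical gap, in normalized
        screen coordinates.

    Returns a dict mapping segment names to (x, y) endpoint pairs, ready to
    be drawn by any graphics backend.
    """
    top_len = size * (1 + max(difference, 0.0))
    bottom_len = size * (1 - min(difference, 0.0))
    angle = math.radians(illusion_strength)
    fin = 0.15 * size  # fin length, fixed fraction of line size

    def fins(x, y, direction):
        # direction = +1 draws fins pointing outward, -1 pointing inward
        dx = direction * fin * math.cos(angle)
        dy = fin * math.sin(angle)
        return [((x, y), (x + dx, y + dy)), ((x, y), (x + dx, y - dy))]

    y_top, y_bot = distance / 2, -distance / 2
    segments = {
        "top_line": ((-top_len, y_top), (top_len, y_top)),
        "bottom_line": ((-bottom_len, y_bot), (bottom_len, y_bot)),
    }
    # Inward fins on the top line, outward fins on the bottom line: the
    # classic configuration makes the top line look shorter than it is.
    segments["top_fins"] = fins(-top_len, y_top, -1) + fins(top_len, y_top, 1)
    segments["bottom_fins"] = (fins(-bottom_len, y_bot, 1)
                               + fins(bottom_len, y_bot, -1))
    return segments

# Usage: difference=0.2 makes the top line physically 20% longer, while the
# fin configuration biases perception in the opposite direction.
display = mueller_lyer_parameters(illusion_strength=30, difference=0.2)
```

Because the stimulus is fully determined by its parameters, logging those parameters alongside behavioral data is enough to reproduce every trial, which is the reproducibility benefit the abstract emphasizes.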
Affiliation(s)
- Dominique Makowski
- School of Social Sciences; Centre for Research and Development in Learning; Lee Kong Chian School of Medicine; National Institute of Education, Nanyang Technological University, Singapore
- Zen J Lau
- School of Social Sciences; Centre for Research and Development in Learning; Lee Kong Chian School of Medicine; National Institute of Education, Nanyang Technological University, Singapore
- Tam Pham
- School of Social Sciences; Centre for Research and Development in Learning; Lee Kong Chian School of Medicine; National Institute of Education, Nanyang Technological University, Singapore
- W Paul Boyce
- School of Psychology, University of New South Wales, Australia; School of Social Sciences; Centre for Research and Development in Learning; Lee Kong Chian School of Medicine; National Institute of Education, Nanyang Technological University, Singapore
- S H Annabel Chen
- School of Social Sciences; Centre for Research and Development in Learning; Lee Kong Chian School of Medicine; National Institute of Education, Nanyang Technological University, Singapore