1. Gao Y, Xue K, Odegaard B, Rahnev D. Common computations in automatic cue combination and metacognitive confidence reports. bioRxiv 2023:2023.06.07.544029. [PMID: 37333352; PMCID: PMC10274803; DOI: 10.1101/2023.06.07.544029]
Abstract
Appropriate perceptual decision making necessitates the accurate estimation and use of sensory uncertainty. Such estimation has been studied in the context of both low-level multisensory cue combination and metacognitive estimation of confidence, but it remains unclear whether the same computations underlie both sets of uncertainty estimation. We created visual stimuli with low vs. high overall motion energy, such that the high-energy stimuli led to higher confidence but lower accuracy in a visual-only task. Importantly, we tested the impact of the low- and high-energy visual stimuli on auditory motion perception in a separate task. Despite being irrelevant to the auditory task, both visual stimuli impacted auditory judgments presumably via automatic low-level mechanisms. Critically, we found that the high-energy visual stimuli influenced the auditory judgments more strongly than the low-energy visual stimuli. This effect was in line with the confidence but contrary to the accuracy differences between the high- and low-energy stimuli in the visual-only task. These effects were captured by a simple computational model that assumes common computational principles underlying both confidence reports and multisensory cue combination. Our results reveal a deep link between automatic sensory processing and metacognitive confidence reports, and suggest that vastly different stages of perceptual decision making rely on common computational principles.
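The abstract above builds on the standard ideal-observer account of cue combination, in which each cue is weighted by its reliability (inverse variance). As background only, here is a minimal sketch of that textbook computation; it is not the authors' model, and the variable names and values are illustrative:

```python
def combine_cues(mu_a, var_a, mu_v, var_v):
    """Reliability-weighted (inverse-variance) combination of two cues.

    Each cue contributes in proportion to its reliability (1/variance);
    the combined estimate has lower variance than either cue alone.
    """
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)  # weight on the auditory cue
    mu = w_a * mu_a + (1 - w_a) * mu_v           # combined estimate
    var = 1 / (1 / var_a + 1 / var_v)            # combined (reduced) variance
    return mu, var

# A reliable visual cue (small variance) dominates the combined estimate:
mu, var = combine_cues(mu_a=10.0, var_a=4.0, mu_v=0.0, var_v=1.0)
# mu = 2.0, var = 0.8
```

On this account, a stronger influence of one modality on another reflects the relative reliabilities assigned to the two signals, which is the kind of shared computation the study probes.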
2. Judging Relative Onsets and Offsets of Audiovisual Events. Vision (Basel) 2020; 4:17. [PMID: 32138261; PMCID: PMC7157228; DOI: 10.3390/vision4010017]
Abstract
This study assesses the fidelity with which people can make temporal order judgments (TOJ) between auditory and visual onsets and offsets. Using an adaptive staircase task administered to a large sample of young adults, we find that the ability to judge temporal order varies widely among people, with notable difficulty created when auditory events closely follow visual events. Those findings are interpretable within the context of an independent channels model. Visual onsets and offsets can be difficult to localize in time when they occur within the temporal neighborhood of sound onsets or offsets.
3. Chaplin TA, Rosa MGP, Lui LL. Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex. Front Neural Circuits 2018; 12:93. [PMID: 30416431; PMCID: PMC6212655; DOI: 10.3389/fncir.2018.00093]
Abstract
The ability of animals to detect motion is critical for survival, and errors or even delays in motion perception may prove costly. In the natural world, moving objects in the visual field often produce concurrent sounds. Thus, it can be highly advantageous to detect motion from sensory signals of either modality, and to integrate them to produce more reliable motion perception. A great deal of progress has been made in understanding how visual motion perception is governed by the activity of single neurons in the primate cerebral cortex, but far less progress has been made in understanding both auditory motion and audiovisual motion integration. Here, we review the key cortical regions for motion processing, focusing on translational motion. We compare the representations of space and motion in the visual and auditory systems, and examine how single neurons in these two sensory systems encode the direction of motion. We also discuss the way in which humans integrate auditory and visual motion cues, and the regions of the cortex that may mediate this process.
Affiliation(s)
- Tristan A Chaplin, Marcello G P Rosa, Leo L Lui: Neuroscience Program, Biomedicine Discovery Institute and Department of Physiology, Monash University, Clayton, VIC, Australia; Australian Research Council (ARC) Centre of Excellence for Integrative Brain Function, Monash University Node, Clayton, VIC, Australia
4. Yamasaki D, Miyoshi K, Altmann CF, Ashida H. Front-Presented Looming Sound Selectively Alters the Perceived Size of a Visual Looming Object. Perception 2018; 47:751-771. [PMID: 29783921; DOI: 10.1177/0301006618777708]
Abstract
Despite accumulating evidence for a spatial rule governing cross-modal interaction according to the spatial consistency of stimuli, it remains unclear whether the 3D spatial consistency (i.e., front/rear of the body) of stimuli also regulates audiovisual interaction. We investigated how sounds with increasing or decreasing intensity (looming or receding sounds), presented from the front or rear space of the body, impact the perceived size of a dynamic visual object. Participants performed a size-matching task (Experiments 1 and 2) and a size-adjustment task (Experiment 3) on visual stimuli with increasing or decreasing diameter, while being exposed to a front- or rear-presented sound with increasing or decreasing intensity. Across these experiments, we demonstrated that only the front-presented looming sound caused overestimation of the size of the spatially consistent looming visual stimulus, but not of the spatially inconsistent or receding visual stimuli. The receding sound had no significant effect on vision. Our results reveal that a looming sound alters dynamic visual size perception depending on the consistency of the approaching quality and the front-rear spatial location of the audiovisual stimuli, suggesting that the human brain processes audiovisual inputs differently based on their 3D spatial consistency. This selective interaction between looming signals should contribute to faster detection of approaching threats. Our findings extend the spatial rule governing audiovisual interaction into 3D space.
Affiliation(s)
- Christian F Altmann: Human Brain Research Center, Graduate School of Medicine, Kyoto University, Japan
5. Dittrich S, Noesselt T. Temporal Audiovisual Motion Prediction in 2D- vs. 3D-Environments. Front Psychol 2018; 9:368. [PMID: 29618999; PMCID: PMC5871701; DOI: 10.3389/fpsyg.2018.00368]
Abstract
Predicting motion is essential for many everyday activities, e.g., in road traffic. Previous studies on motion prediction have failed to find consistent results, which might be due to the use of very different stimulus material and behavioural tasks. Here, we directly tested the influence of task (detection, extrapolation) and stimulus features (visual vs. audiovisual and three-dimensional vs. non-three-dimensional) on temporal motion prediction in two psychophysical experiments. In both experiments a ball followed a trajectory toward the observer and temporarily disappeared behind an occluder. In audiovisual conditions, moving white noise (congruent or incongruent with the visual motion direction) was presented concurrently. In Experiment 1 the ball reappeared on a predictable or a non-predictable trajectory and participants detected when the ball reappeared. In Experiment 2 the ball did not reappear after occlusion and participants judged when it would reach a specified position at one of two possible distances from the occluder (extrapolation task). Both experiments were conducted in three-dimensional space (using a stereoscopic screen and polarised glasses) and without stereoscopic presentation. Participants benefitted from visually predictable trajectories and concurrent sounds during detection. Additionally, visual facilitation was more pronounced for non-3D stimulation in the detection task. In contrast, for the more complex extrapolation task, group mean results indicated that auditory information impaired motion prediction. However, a post hoc cross-validation procedure (split-half) revealed that participants varied in their ability to use sounds during motion extrapolation: most participants selectively profited from either near or far extrapolation distances but were impaired at the other. We propose that interindividual differences in extrapolation efficiency might be the mechanism governing this effect. Together, our results indicate that both a realistic experimental environment and subject-specific differences modulate audiovisual motion prediction and need to be considered in future research.
Affiliation(s)
- Sandra Dittrich: Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Tömme Noesselt: Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany
6. Berger CC, Ehrsson HH. Auditory Motion Elicits a Visual Motion Aftereffect. Front Neurosci 2016; 10:559. [PMID: 27994538; PMCID: PMC5136551; DOI: 10.3389/fnins.2016.00559]
Abstract
The visual motion aftereffect is a visual illusion in which exposure to continuous motion in one direction leads to a subsequent illusion of visual motion in the opposite direction. Previous findings have been mixed with regard to whether this visual illusion can be induced cross-modally by auditory stimuli. Based on research on multisensory perception demonstrating the profound influence auditory perception can have on the interpretation and perceived motion of visual stimuli, we hypothesized that exposure to auditory stimuli with strong directional motion cues should induce a visual motion aftereffect. Here, we demonstrate that horizontally moving auditory stimuli induced a significant visual motion aftereffect—an effect that was driven primarily by a change in visual motion perception following exposure to leftward moving auditory stimuli. This finding is consistent with the notion that visual and auditory motion perception rely on at least partially overlapping neural substrates.
Affiliation(s)
- H Henrik Ehrsson: Department of Neuroscience, Karolinska Institutet, Stockholm, Sweden
7.
Abstract
We asked whether the perceived direction of visual motion and contrast thresholds for motion discrimination are influenced by the concurrent motion of an auditory sound source. Visual motion stimuli were counterphasing Gabor patches, whose net motion energy was manipulated by adjusting the contrast of the leftward-moving and rightward-moving components. The presentation of these visual stimuli was paired with the simultaneous presentation of auditory stimuli, whose apparent motion in 3D auditory space (rightward, leftward, static, no sound) was manipulated using interaural time and intensity differences, and Doppler cues. In experiment 1, observers judged whether the Gabor visual stimulus appeared to move rightward or leftward. In experiment 2, contrast discrimination thresholds for detecting the interval containing unequal (rightward or leftward) visual motion energy were obtained under the same auditory conditions. Experiment 1 showed that the perceived direction of ambiguous visual motion is powerfully influenced by concurrent auditory motion, such that auditory motion 'captured' ambiguous visual motion. Experiment 2 showed that this interaction occurs at a sensory stage of processing as visual contrast discrimination thresholds (a criterion-free measure of sensitivity) were significantly elevated when paired with congruent auditory motion. These results suggest that auditory and visual motion signals are integrated and combined into a supramodal (audiovisual) representation of motion.
8. Sankaran N, Leung J, Carlile S. Effects of virtual speaker density and room reverberation on spatiotemporal thresholds of audio-visual motion coherence. PLoS One 2014; 9:e108437. [PMID: 25269061; DOI: 10.1371/journal.pone.0108437]
Abstract
The present study examined the effects of spatial sound-source density and reverberation on the spatiotemporal window for audio-visual motion coherence. Three different acoustic stimuli were generated in Virtual Auditory Space: two acoustically "dry" stimuli via the measurement of anechoic head-related impulse responses recorded at either 1° or 5° spatial intervals (Experiment 1), and a reverberant stimulus rendered from binaural room impulse responses recorded at 5° intervals in situ in order to capture reverberant acoustics in addition to head-related cues (Experiment 2). A moving visual stimulus with invariant localization cues was generated by sequentially activating LEDs along the same radial path as the virtual auditory motion. Stimuli were presented at 25°/s, 50°/s and 100°/s with a random spatial offset between audition and vision. In a 2AFC task, subjects made a judgment of the leading modality (auditory or visual). No significant differences were observed in the spatial threshold based on the point of subjective equivalence (PSE) or the slope of the psychometric functions (β) across the three acoustic conditions. Additionally, neither the PSE nor β differed significantly across velocity, suggesting a fixed spatial window of audio-visual separation. The findings suggest that there was no loss in spatial information accompanying the reduction in spatial cues and reverberation levels tested, and establish a perceptual measure for assessing the veracity of motion generated from discrete locations and in echoic environments.
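In 2AFC designs like the one above, the PSE and slope are obtained by fitting a psychometric function to the proportion of responses as a function of audio-visual offset. As a rough illustration of that standard procedure only (not the authors' analysis code; the offsets, parameter ranges, and crude grid search are all hypothetical stand-ins for a proper maximum-likelihood fit):

```python
import math

def cum_gauss(x, pse, sigma):
    """Cumulative-Gaussian psychometric function: P(response) vs. offset x."""
    return 0.5 * (1 + math.erf((x - pse) / (sigma * math.sqrt(2))))

def fit_pse(offsets, p_resp):
    """Least-squares grid search for the PSE and the slope parameter sigma.

    A deliberately simple stand-in for dedicated psychometric-fitting tools.
    """
    best = (float("inf"), None, None)
    for i in range(-100, 101):          # candidate PSEs: -10..10 deg
        pse = i * 0.1
        for j in range(1, 101):         # candidate sigmas: 0.1..10 deg
            sigma = j * 0.1
            err = sum((cum_gauss(x, pse, sigma) - p) ** 2
                      for x, p in zip(offsets, p_resp))
            if err < best[0]:
                best = (err, pse, sigma)
    return best[1], best[2]

# Synthetic response proportions generated from PSE = 2 deg, sigma = 3 deg:
offsets = [-8, -4, -2, 0, 2, 4, 8]
p_resp = [cum_gauss(x, 2.0, 3.0) for x in offsets]
pse, sigma = fit_pse(offsets, p_resp)
# recovers pse ~ 2.0, sigma ~ 3.0
```

The PSE is the offset at which the fitted curve crosses 0.5, and a shallower slope (larger sigma) corresponds to a wider spatiotemporal window of audio-visual coherence.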
Affiliation(s)
- Narayan Sankaran, Johahn Leung, Simon Carlile: Auditory Neuroscience Laboratory, School of Medical Sciences, The University of Sydney, Sydney, NSW, Australia
9. Kafaligonul H, Stoner GR. Static sound timing alters sensitivity to low-level visual motion. J Vis 2012; 12(11):2. [PMID: 23035130; DOI: 10.1167/12.11.2]
Abstract
Visual motion processing is essential to survival in a dynamic world and is probably the best-studied facet of visual perception. It has been recently discovered that the timing of brief static sounds can bias visual motion perception, an effect attributed to "temporal ventriloquism" whereby the timing of the sounds "captures" the timing of the visual events. To determine whether this cross-modal interaction is dependent on the involvement of higher-order attentive tracking mechanisms, we used near-threshold motion stimuli that isolated low-level pre-attentive visual motion processing. We found that the timing of brief sounds altered sensitivity to these visual motion stimuli in a manner that paralleled changes in the timing of the visual stimuli. Our findings indicate that auditory timing impacts visual motion processing very early in the processing hierarchy and without the involvement of higher-order attentional and/or position tracking mechanisms.
Affiliation(s)
- Hulusi Kafaligonul: Vision Center Laboratory, The Salk Institute for Biological Studies, La Jolla, CA, USA
10. Crossmodal interactions and multisensory integration in the perception of audio-visual motion — A free-field study. Brain Res 2012; 1466:99-111. [DOI: 10.1016/j.brainres.2012.05.015]