1
Kelber P, Ulrich R. Independent-channels models of temporal-order judgment revisited: A model comparison. Atten Percept Psychophys 2024; 86:2187-2209. PMID: 39107652; PMCID: PMC11410913; DOI: 10.3758/s13414-024-02915-5.
Abstract
The perception of temporal order or simultaneity of stimuli is almost always explained in terms of independent-channels models, such as perceptual-moment, triggered-moment, and attention-switching models. Independent-channels models generally posit that stimuli are processed in separate peripheral channels and that their arrival-time difference at a central location is translated into an internal state of order (simultaneity) if it reaches (misses) a certain threshold. Non-monotonic and non-parallel psychometric functions in a ternary-response task provided critical evidence against a wide range of independent-channels models. However, two independent-channels models have been introduced in the last decades that can account for such shapes by considering misreports of internal states (response-error model) or by assuming that simultaneity and order judgments rely on distinct sensory and decisional processes (two-stage model). Based on previous ideas, we also consider a two-threshold model, according to which the same arrival-time difference may need to reach a higher threshold for order detection than for successiveness detection. All three models were fitted to various data sets collected over a period of more than a century. The two-threshold model provided the best balance between goodness of fit and parsimony. This preference for the two-threshold model over the two-stage model and the response-error model aligns well with several lines of evidence from cognitive modeling, psychophysics, mental chronometry, and psychophysiology. We conclude that the seemingly deviant shapes of psychometric functions can be explained within the framework of independent-channels models in a simpler way than previously assumed.
Affiliation(s)
- Paul Kelber
- Department of Psychology, University of Tübingen, Schleichstraße 4, Tübingen, 72076, Germany.
- Rolf Ulrich
- Department of Psychology, University of Tübingen, Schleichstraße 4, Tübingen, 72076, Germany.
2
Uno K, Yokosawa K. Does cross-modal correspondence modulate modality-specific perceptual processing? Study using timing judgment tasks. Atten Percept Psychophys 2024; 86:273-284. PMID: 37932495; DOI: 10.3758/s13414-023-02812-3.
Abstract
Cross-modal correspondences refer to associations between stimulus features across sensory modalities. Previous studies have shown that cross-modal correspondences modulate reaction times for detecting and identifying stimuli in one modality when uninformative stimuli from another modality are present. However, it is unclear whether such modulation reflects changes in modality-specific perceptual processing. We used two psychophysical timing judgment tasks to examine the effects of audiovisual correspondences on visual perceptual processing. In Experiment 1, we conducted a temporal order judgment (TOJ) task that asked participants to judge which of two visual stimuli, presented at various stimulus onset asynchronies (SOAs), appeared first. In Experiment 2, we conducted a simultaneity judgment (SJ) task that asked participants to report whether the two visual stimuli were simultaneous or successive. We also presented an unrelated auditory stimulus, either simultaneous with or preceding the first visual stimulus, and manipulated the congruency between the audiovisual stimuli. Experiment 1 indicated that the points of subjective simultaneity (PSSs) between the two visual stimuli estimated in the TOJ task shifted according to the audiovisual correspondence between auditory pitch and the visual features of vertical location and size. However, these audiovisual correspondences did not affect the PSSs estimated in the SJ task in Experiment 2. The discrepant results of the two tasks can be explained by a response bias, triggered by audiovisual correspondence, that was present only in the TOJ task. We conclude that audiovisual correspondence does not modulate visual perceptual timing and that changes in modality-specific perceptual processing might not underlie the congruency effects reported in previous studies.
Affiliation(s)
- Kyuto Uno
- Department of Psychology, Graduate School of Humanities and Sociology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan.
- Japan Society for the Promotion of Science, 5-3-1 Kojimachi, Chiyoda-ku, Tokyo, 102-0083, Japan.
- Department of Psychology, Faculty of Human Sciences, Sophia University, 7-1 Kioi-cho, Chiyoda-ku, Tokyo, 102-8554, Japan.
- Kazuhiko Yokosawa
- Department of Psychology, Graduate School of Humanities and Sociology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan.
- Tsukuba Gakuin University, 3-1 Azuma, Tsukuba-shi, Ibaraki, 305-0031, Japan.
3
Lowe BG, Robinson JE, Yamamoto N, Hogendoorn H, Johnston P. Same but different: The latency of a shared expectation signal interacts with stimulus attributes. Cortex 2023; 168:143-156. PMID: 37716110; DOI: 10.1016/j.cortex.2023.08.004.
Abstract
Predictive coding theories assert that perceptual inference is a hierarchical process of belief updating, wherein the onset of unexpected sensory data causes so-called prediction error responses that calibrate erroneous inferences. Given the functionally specialised organisation of visual cortex, it is assumed that prediction error propagation interacts with the specific visual attribute violating an expectation. We sought to test this within the temporal domain by applying time-resolved decoding methods to electroencephalography (EEG) data evoked by contextual trajectory violations of either brightness, size, or orientation within a bound stimulus. We found that from ∼170 ms after stimulus onset, responses to both size violations and orientation violations were decodable from physically identical control trials in which no attributes were violated. These two violation types were then directly compared, with attribute-specific signalling being decodable from 265 ms. Temporal generalisation suggested that this dissociation was driven by latency shifts in shared expectation signalling between the two conditions. Using a novel temporal bias method, we then found that this shared signalling occurred earlier for size violations than for orientation violations. To our knowledge, we are among the first to decode expectation violations in humans using EEG, and we have demonstrated a temporal dissociation in attribute-specific expectancy violations.
Affiliation(s)
- Benjamin G Lowe
- School of Psychology and Counselling, Queensland University of Technology (QUT), Kelvin Grove, QLD, Australia; Perception in Action Research Centre & School of Psychological Sciences, Macquarie University, Macquarie Park, NSW, Australia.
- Jonathan E Robinson
- Monash Centre for Consciousness & Contemplative Studies, Monash University, Clayton, VIC, Australia.
- Naohide Yamamoto
- School of Psychology and Counselling, Queensland University of Technology (QUT), Kelvin Grove, QLD, Australia; Centre for Vision and Eye Research, Queensland University of Technology (QUT), Kelvin Grove, QLD, Australia.
- Hinze Hogendoorn
- School of Psychology and Counselling, Queensland University of Technology (QUT), Kelvin Grove, QLD, Australia; Melbourne School of Psychological Science, University of Melbourne, Parkville, VIC, Australia.
- Patrick Johnston
- School of Exercise Science and Nutrition Sciences, Queensland University of Technology (QUT), Kelvin Grove, QLD, Australia.
4
Al-youzbaki MU, Schormans AL, Allman BL. Past and present experience shifts audiovisual temporal perception in rats. Front Behav Neurosci 2023; 17:1287587. PMID: 37908200; PMCID: PMC10613659; DOI: 10.3389/fnbeh.2023.1287587.
Abstract
Our brains have a propensity to integrate closely timed auditory and visual stimuli into a unified percept, a phenomenon that is highly malleable based on prior sensory experiences and is known to be altered in clinical populations. While the neural correlates of audiovisual temporal perception have been investigated using neuroimaging and electroencephalography techniques in humans, animal research will be required to uncover the underlying cellular and molecular mechanisms. Prior to conducting such mechanistic studies, it is important to first confirm the translational potential of any prospective animal model. Thus, in the present study, we conducted a series of experiments to determine whether rats show the hallmarks of audiovisual temporal perception observed in neurotypical humans, and whether the rat behavioral paradigms could reveal when they experienced perceptual disruptions akin to those observed in neurodevelopmental disorders. After training rats to perform a temporal order judgment (TOJ) or synchrony judgment (SJ) task, we found that the rats' perception was malleable based on their past and present sensory experiences. More specifically, passive exposure to asynchronous audiovisual stimulation in the minutes prior to behavioral testing caused the rats' perception to shift predictably in the direction of the leading stimulus; these findings represent the first report of this form of audiovisual perceptual malleability in non-human subjects. Furthermore, rats performing the TOJ task also showed evidence of rapid recalibration, in which their audiovisual temporal perception on the current trial was predictably influenced by the timing lag between the auditory and visual stimuli in the preceding trial. Finally, by manipulating either the experimental testing parameters or the rats' neurochemistry with a systemic injection of MK-801, we showed that the TOJ and SJ tasks could identify when the rats had difficulty judging the timing of audiovisual stimuli. These findings confirm that the behavioral paradigms are indeed suitable for future testing of rats with perceptual disruptions in audiovisual processing. Overall, our collective results highlight that rats represent an excellent animal model for studying the cellular and molecular mechanisms underlying the acuity and malleability of audiovisual temporal perception, as they showcase the perceptual hallmarks commonly observed in humans.
5
Cai M, Bao Y. Spatial attention modulates auditory dominance in audiovisual order judgment. Psych J 2023; 12:537-539. PMID: 37394228; DOI: 10.1002/pchj.661.
Abstract
Auditory dominance in audiovisual temporal order judgment is shown here to be modulated by exogenous orienting of attention to a spatial cue, independent of the cue's modality. For the two stimuli to be perceived as simultaneous, the visual stimulus must lead the auditory one by a larger interval at cued than at uncued locations, possibly suggesting an inhibitory effect of spatial attention on temporal processing.
Affiliation(s)
- Mengtong Cai
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China.
- Yan Bao
- School of Psychological and Cognitive Sciences, Peking University, Beijing, China.
- Institute of Medical Psychology, Ludwig Maximilian University, Munich, Germany.
- Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China.
6
Wang X, Wu Y, Xing Z, Cui X, Gao M, Tang X. Modal-based attention modulates the redundant-signals effect: Role of unimodal target probability. Perception 2023; 52:97-115. PMID: 36415087; DOI: 10.1177/03010066221136675.
Abstract
Multisensory integration has two behavioral manifestations: the modality dominance effect and the redundant-signals effect (RSE). The RSE is a multisensory improvement effect in which individuals respond more quickly and accurately to bimodal audiovisual (AV) targets than to unimodal auditory (A) or visual (V) targets. Previous studies have confirmed that the RSE is the product of interactions between different modalities. The goal of this study was to systematically investigate how the RSE is affected by modality dominance, manipulated via modal-based attention, and by unimodal target probability. The results showed that when attending to both the A and V modalities (Exp. 1), the RSE did not differ significantly between unimodal target probabilities. When selectively attending to the A modality (Exp. 2A), the RSE likewise did not differ significantly between unimodal target probabilities. However, when selectively attending to the V modality (Exp. 2B), the magnitude of the RSE decreased significantly as the probability of V targets increased. Our study is the first to reveal that unimodal target probability significantly modulates the RSE under visual selective attention, and that this modulatory effect on the RSE is opposite to its modulatory effect on the modality dominance effect.
Affiliation(s)
- Min Gao
- Liaoning Normal University, China.
7
Mafi F, Tang MF, Afarinesh MR, Ghasemian S, Sheibani V, Arabzadeh E. Temporal order judgment of multisensory stimuli in rat and human. Front Behav Neurosci 2023; 16:1070452. PMID: 36710957; PMCID: PMC9879721; DOI: 10.3389/fnbeh.2022.1070452.
Abstract
We do not fully understand the resolution at which temporal information is processed by different species. Here we employed a temporal order judgment (TOJ) task in rats and humans to test the temporal precision with which these species can detect the order of presentation of simple stimuli across the two modalities of vision and audition. Both species reported the order of audiovisual stimuli presented from a central location at a range of stimulus onset asynchronies (SOAs). While both species could reliably distinguish the temporal order of stimuli based on their sensory content (i.e., the modality label), rats outperformed humans at short SOAs (less than 100 ms), whereas humans outperformed rats at long SOAs (greater than 100 ms). Moreover, rats produced faster responses than humans. The reaction time data further revealed key differences in the decision process across the two species: at longer SOAs, reaction times increased in rats but decreased in humans. Finally, drift-diffusion modeling allowed us to isolate the contributions of various parameters, including evidence accumulation rate, lapse rate, and bias, to the sensory decision. Consistent with the psychophysical findings, the model revealed higher temporal sensitivity and a higher lapse rate in rats compared to humans. These findings suggest that the two species applied different strategies for making perceptual decisions in the context of a multimodal TOJ task.
Affiliation(s)
- Fatemeh Mafi
- Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Cognitive Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Matthew F. Tang
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, The Australian National University, Canberra, ACT, Australia.
- Mohammad Reza Afarinesh
- Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Cognitive Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Sadegh Ghasemian
- Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Cognitive Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Vahid Sheibani
- Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Cognitive Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Ehsan Arabzadeh
- Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Cognitive Neuroscience Research Center, Institute of Neuropharmacology, Kerman University of Medical Sciences, Kerman, Iran.
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, The Australian National University, Canberra, ACT, Australia.
8
He Y, Yang T, He C, Sun K, Guo Y, Wang X, Bai L, Xue T, Xu T, Guo Q, Liao Y, Liu X, Wu S. Effects of audiovisual interactions on working memory: Use of the combined N-back + Go/NoGo paradigm. Front Psychol 2023; 14:1080788. PMID: 36874804; PMCID: PMC9982107; DOI: 10.3389/fpsyg.2023.1080788.
Abstract
Background: Approximately 94% of sensory information acquired by humans originates from the visual and auditory channels. Such information can be temporarily stored and processed in working memory, but this system has limited capacity. Working memory plays an important role in higher cognitive functions and is controlled by central executive function. Therefore, elucidating the influence of the central executive function on information processing in working memory, such as in audiovisual integration, is of great scientific and practical importance.
Purpose: This study used a paradigm that combined N-back and Go/NoGo tasks, using simple Arabic numerals as stimuli, to investigate the effects of cognitive load (modulated by varying the magnitude of N) and audiovisual integration on the central executive function of working memory, as well as their interaction.
Methods: Sixty college students aged 17-21 years were enrolled and performed both unimodal and bimodal tasks to evaluate the central executive function of working memory. The order of the three cognitive tasks was pseudorandomized, and a Latin square design was used to account for order effects. Finally, working memory performance, i.e., reaction time and accuracy, was compared between unimodal and bimodal tasks with repeated-measures analysis of variance (ANOVA).
Results: As cognitive load increased, the presence of auditory stimuli interfered with visual working memory to a moderate to large extent; similarly, as cognitive load increased, the presence of visual stimuli interfered with auditory working memory with a moderate to large effect size.
Conclusion: Our study supports the theory of competing resources, i.e., that visual and auditory information interfere with each other and that the magnitude of this interference is primarily related to cognitive load.
Affiliation(s)
- Yang He
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
- Tianqi Yang
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
- Chunyan He
- Department of Nursing, Fourth Military Medical University, Xi'an, China.
- Kewei Sun
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
- Yaning Guo
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
- Xiuchao Wang
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
- Lifeng Bai
- Faculty of Humanities and Social Sciences, Aviation University of Air Force, Changchun, China.
- Ting Xue
- Faculty of Humanities and Social Sciences, Aviation University of Air Force, Changchun, China.
- Tao Xu
- Psychology Section, Secondary Sanatorium of Air Force Healthcare Center for Special Services, Hangzhou, China.
- Qingjun Guo
- Psychology Section, Secondary Sanatorium of Air Force Healthcare Center for Special Services, Hangzhou, China.
- Yang Liao
- Air Force Medical Center, Air Force Medical University, Beijing, China.
- Xufeng Liu
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
- Shengjun Wu
- Department of Military Medical Psychology, Fourth Military Medical University, Xi'an, China.
9
Fisher P, Schenk T. Temporal order judgments and presaccadic shifts of attention: What can prior entry teach us about the premotor theory? J Vis 2022; 22:6. PMID: 36326744; PMCID: PMC9645358; DOI: 10.1167/jov.22.12.6.
Abstract
A temporal order judgment (TOJ) two-alternative forced choice design was used to examine presaccadic shifts of attention. Prior work on the premotor theory of attention (PTA) has predominantly focused on single-target discrimination tasks as a tool to measure accuracy and shifts of attention. It is important to demonstrate that the PTA is effective across attentional tasks that have been shown to be reliable in other contexts. We therefore used a perceptual task that probes multiple locations simultaneously and can equally be used to examine the spatial spread of attention in more detail. In typical TOJ studies, prior entry is the metric used to measure an attentional effect. Prior entry is the biasing of temporal perception toward an attentionally cued location; it generally manifests as observers processing events at the cued location more rapidly, altering their perception of temporal order. Participants were required to prepare saccades toward one of four targets, two of which would light up either synchronously or sequentially after a GO signal but before saccadic execution. Results demonstrated that in conditions with critical stimulus onset asynchronies, saccade preparation had a significant effect on performance. Prior entry effects were observed at saccade-congruent locations, with probes at these locations typically perceived earlier than probes presented at a neutral location. These effects were not observed in control trials without a saccade. A further spatial effect was demonstrated for the attentional modulation, suggesting that this effect is restricted predominantly to horizontal configurations. Overall, the results demonstrated that presaccadic attention is effective at eliciting a prior entry effect in TOJ designs and that such effects are more pronounced when the probes are distributed across the two lateral hemifields.
Affiliation(s)
- Paul Fisher
- Lehrstuhl für Klinische Neuropsychologie, Ludwig Maximilians Universität, Munich, Germany.
- Thomas Schenk
- Lehrstuhl für Klinische Neuropsychologie, Ludwig Maximilians Universität, Munich, Germany.
10
Musical training refines audiovisual integration but does not influence temporal recalibration. Sci Rep 2022; 12:15292. PMID: 36097277; PMCID: PMC9468170; DOI: 10.1038/s41598-022-19665-9.
Abstract
When the brain is exposed to a temporal asynchrony between the senses, it will shift its perception of simultaneity towards the previously experienced asynchrony (temporal recalibration). It is unknown whether recalibration depends on how accurately an individual integrates multisensory cues or on experiences they have had over their lifespan. Hence, we assessed whether musical training modulated audiovisual temporal recalibration. Musicians (n = 20) and non-musicians (n = 18) made simultaneity judgements to flash-tone stimuli before and after adaptation to asynchronous (± 200 ms) flash-tone stimuli. We analysed these judgements via an observer model that described the left and right boundaries of the temporal integration window (decisional criteria) and the amount of sensory noise that affected these judgements. Musicians’ boundaries were narrower (closer to true simultaneity) than non-musicians’, indicating stricter criteria for temporal integration, and they also exhibited enhanced sensory precision. However, while both musicians and non-musicians experienced cumulative and rapid recalibration, these recalibration effects did not differ between the groups. Unexpectedly, cumulative recalibration was caused by auditory-leading but not visual-leading adaptation. Overall, these findings suggest that the precision with which observers perceptually integrate audiovisual temporal cues does not predict their susceptibility to recalibration.
11
He Y, Guo Z, Wang X, Sun K, Lin X, Wang X, Li F, Guo Y, Feng T, Zhang J, Li C, Tian W, Liu X, Wu S. Effects of Audiovisual Interactions on Working Memory Task Performance—Interference or Facilitation. Brain Sci 2022; 12:886. PMID: 35884692; PMCID: PMC9313432; DOI: 10.3390/brainsci12070886.
Abstract
(1) Background: The combined n-back + Go/NoGo paradigm was used to investigate whether audiovisual interactions interfere with or facilitate working memory (WM). (2) Methods: College students were randomly assigned to perform the working memory task based on either a single (visual or auditory) or dual (audiovisual) stimulus. Reaction times, accuracy, and WM performance were compared across the two groups to investigate the effects of audiovisual interactions. (3) Results: With low cognitive load (2-back), auditory stimuli had no effect on visual working memory, whereas visual stimuli had a small effect on auditory working memory. With high cognitive load (3-back), auditory stimuli interfered (large effect size) with visual WM, and visual stimuli interfered (medium effect size) with auditory WM. (4) Conclusions: Audiovisual effects on WM follow resource competition theory, with competition dominated by the cognitive load of the visual stimulus; vision always interferes with audition, and audition conditionally interferes with vision. With increased visual cognitive load, the competitive effects of audiovisual interactions were more obvious than those with auditory stimuli. Compared with visual stimuli alone, audiovisual stimuli showed significant interference only when visual cognitive load was high. With low visual cognitive load, the two stimulus components neither facilitated nor interfered with each other, in accordance with a speed–accuracy trade-off.
Affiliation(s)
- Yang He
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Zhihua Guo
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Xinlu Wang
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Kewei Sun
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Xinxin Lin
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Xiuchao Wang
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Fengzhan Li
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Yaning Guo
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Tingwei Feng
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Junpeng Zhang
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Congchong Li
- School of Public Health, Shaanxi University of Chinese Medicine, Xianyang 712046, China.
- Wenqing Tian
- School of Public Health, Shaanxi University of Chinese Medicine, Xianyang 712046, China.
- Xufeng Liu
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
- Shengjun Wu
- Department of Military Medical Psychology, Air Force Medical University, Xi'an 710032, China.
12
Kwon J, Park S, Sakamoto M, Mito K. The Effects of Vibratory Frequency and Temporal Interval on Tactile Apparent Motion. IEEE Trans Haptics 2021; 14:675-679. PMID: 33439848; DOI: 10.1109/toh.2021.3051388.
Abstract
Vibrotactile stimuli can be used to generate the haptic sensation of a static object or the motion of a dynamic object. In this article, we investigated the effects of vibratory frequency and temporal interval on tactile apparent motion. In the experiment, we examined the effect of vibratory frequency at different temporal intervals on the tactile apparent motion that results from two successive tactile stimuli on the index fingerpad. Results indicated that tactile apparent motion was perceived not only when both stimuli were either "flutter" or "vibration" stimuli, but also when one of each type was used. Specifically, when the first stimulus was presented at 40 Hz, "continuous motion" was perceived for all combinations of stimulus frequency, and "continuous motion" was reported more clearly for high-frequency combinations than for low-frequency combinations. Tactile apparent motion was predominantly perceived in the SOA range of 105 ms to 125 ms. We anticipate that our findings and further research will be essential resources for the design of tactile devices that represent the motion of dynamic objects.
13
Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review. J Assoc Res Otolaryngol 2021; 22:365-386. PMID: 34014416; PMCID: PMC8329114; DOI: 10.1007/s10162-021-00789-0.
Abstract
In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions that arise from this combination of information and shape auditory function are seen across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the current state of our understanding of this topic. Following a general introduction, the review is divided into five sections. In the first section, we review the psychophysical evidence in humans regarding vision's influence on audition, making the distinction between vision's ability to enhance versus alter auditory performance and perception. Three examples are then described that serve to highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models that have been built on the available psychophysical data and that seek to provide greater mechanistic insight into how vision can impact audition. The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches to understanding how auditory representations are shaped by vision. The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception: scene analysis and communication. The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions. The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
|
14
|
Opoku-Baah C, Wallace MT. Binocular Enhancement of Multisensory Temporal Perception. Invest Ophthalmol Vis Sci 2021; 62:7. [PMID: 33661284 PMCID: PMC7938005 DOI: 10.1167/iovs.62.3.7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023] Open
Abstract
Purpose The goal of this study was to examine the behavioral effects, and to suggest possible underlying mechanisms, of binocularity on audiovisual temporal perception in normally sighted individuals. Methods Participants performed two audiovisual simultaneity judgment tasks, one using simple flashes and beeps and the other using audiovisual speech stimuli, with the left eye, right eye, and both eyes. Two measures, the point of subjective simultaneity (PSS) and the temporal binding window (TBW), an index of audiovisual temporal acuity, were derived for each viewing condition, stimulus type, and participant. The data were then modeled using causal inference, allowing us to determine whether binocularity affected low-level unisensory mechanisms (i.e., sensory noise level) or high-level multisensory mechanisms (i.e., the prior probability of inferring a common cause, pC=1). Results Whereas for the PSS there was no significant effect of viewing condition, for the TBW a significant interaction between stimulus type and viewing condition was found. Post hoc analyses revealed a significantly narrower TBW during binocular than monocular viewing (average of left and right eyes) for the flash-beep condition but no difference between the viewing conditions for the speech stimuli. Modeling results showed no significant difference in pC=1 but a significant reduction in sensory noise during binocular performance on flash-beep trials. Conclusions Binocular viewing was found to enhance audiovisual temporal acuity as indexed by the TBW for simple low-level audiovisual stimuli. Furthermore, modeling results suggest that this effect may stem from enhanced sensory representations, evidenced as a reduction in the sensory noise affecting the measurement of physical asynchrony during audiovisual temporal perception.
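The PSS and TBW reported here are standard derived measures: one common approach (not necessarily this study's exact pipeline) fits a Gaussian to the proportion of "simultaneous" responses across SOAs, reading the PSS off the peak location and the TBW off the curve's width. A minimal sketch with hypothetical data:

```python
import numpy as np
from scipy.optimize import curve_fit

def sj_gaussian(soa, pss, sigma, amp):
    """Proportion of 'simultaneous' responses modeled as a Gaussian over SOA (ms)."""
    return amp * np.exp(-((soa - pss) ** 2) / (2.0 * sigma ** 2))

# Hypothetical SJ data: negative SOA = auditory leading, positive = visual leading
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_simult = np.array([0.05, 0.20, 0.60, 0.85, 0.95, 0.90, 0.70, 0.30, 0.10])

(pss, sigma, amp), _ = curve_fit(sj_gaussian, soas, p_simult, p0=[0.0, 100.0, 1.0])
sigma = abs(sigma)  # the sign of sigma is unidentifiable; take the positive root

# One common convention defines the TBW as the full width at half maximum of the fit
tbw = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma
print(f"PSS = {pss:.1f} ms, TBW (FWHM) = {tbw:.1f} ms")
```

A narrower TBW under binocular viewing, as reported in the abstract, would show up in such a fit as a smaller sigma for the binocular condition.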
Affiliation(s)
- Collins Opoku-Baah
- Neuroscience Graduate Program, Vanderbilt University, Nashville, Tennessee, United States; Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee, United States
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, Tennessee, United States; Department of Psychology, Vanderbilt University, Nashville, Tennessee, United States; Department of Hearing and Speech, Vanderbilt University Medical Center, Nashville, Tennessee, United States; Vanderbilt Vision Research Center, Nashville, Tennessee, United States; Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, Tennessee, United States; Department of Pharmacology, Vanderbilt University, Nashville, Tennessee, United States
|
15
|
Tanaka T, Ogata T, Miyake Y. The Effect of Rhythmic Tactile Stimuli Under the Voluntary Movement on Audio-Tactile Temporal Order Judgement. Front Psychol 2021; 11:600263. [PMID: 33633626 PMCID: PMC7900129 DOI: 10.3389/fpsyg.2020.600263] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2020] [Accepted: 12/28/2020] [Indexed: 11/13/2022] Open
Abstract
The simultaneous perception of multimodal sensory information is important for effective reactions to the external environment. Regarding effects on time perception, previous studies have identified both voluntary movement and rhythmic stimuli as being associated with improved accuracy of temporal order judgments (TOJs). Here, we examined whether the combination of voluntary movement and rhythmic stimuli improves the just noticeable difference (JND) in audio-tactile TOJ tasks. Four experimental conditions were studied, crossing two types of movement (voluntary, involuntary) with two types of stimulus presentation (rhythmic, one-time only). In the voluntary movement condition (VM), after the auditory stimulus (cue sound), participants moved their right index finger voluntarily and naturally, while in the involuntary movement condition (IM), their right index finger was moved by the tactile device. The stimuli were delivered rhythmically or one time only by hitting the inside of the first joint of the participants' right index finger with a tactile device. In the rhythmic tactile (RT) conditions, tactile stimuli were presented rhythmically to the right index finger five times consecutively, whereas in the one-time tactile (1T) conditions, a single tactile stimulus was presented to the right index finger. Participants made an order judgment between the final tactile stimulus (the fifth in the rhythmic conditions) and the single auditory stimulus. In our TOJ tasks, auditory-tactile stimulus pairs were presented with varying stimulus-onset asynchronies (SOAs; intervals between the within-pair onsets of the auditory and tactile stimuli), and participants judged which of the two stimuli was presented first in a two-alternative response. Using non-parametric tests, our results showed that voluntary movement and rhythmic tactile stimuli were each effective in improving JNDs in the TOJ tasks. However, for the combination of voluntary movement and rhythmic tactile stimuli, we found no significant further difference in JNDs in our experiments.
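The JND in a two-alternative TOJ task is conventionally read off a cumulative-Gaussian psychometric fit; the abstract does not specify the authors' exact estimator, so the following is only an illustrative sketch with made-up response proportions:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def toj_curve(soa, pss, sigma):
    """P('tactile reported first') as a cumulative Gaussian over SOA (ms)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Hypothetical proportions; positive SOA = tactile stimulus physically first
soas = np.array([-240, -120, -60, -30, 0, 30, 60, 120, 240], dtype=float)
p_first = np.array([0.02, 0.10, 0.25, 0.40, 0.55, 0.65, 0.80, 0.92, 0.98])

# Bound sigma away from zero so the cdf's scale parameter stays valid
(pss, sigma), _ = curve_fit(toj_curve, soas, p_first, p0=[0.0, 80.0],
                            bounds=([-200.0, 1.0], [200.0, 1000.0]))

# JND as half the 25%-75% span, which for a cumulative Gaussian is sigma * Phi^-1(0.75)
jnd = sigma * norm.ppf(0.75)
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

Under this convention, the reported improvements would appear as a shallower-to-steeper change in the fitted slope, i.e., a smaller JND.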
Affiliation(s)
- Taeko Tanaka
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Taiki Ogata
- Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
- Yoshihiro Miyake
- Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
|
16
|
Opoku-Baah C, Wallace MT. Brief period of monocular deprivation drives changes in audiovisual temporal perception. J Vis 2020; 20:8. [PMID: 32761108 PMCID: PMC7438662 DOI: 10.1167/jov.20.8.8] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/11/2023] Open
Abstract
The human brain retains a striking degree of plasticity into adulthood. Recent studies have demonstrated that a short period of altered visual experience (via monocular deprivation) can change the dynamics of binocular rivalry in favor of the deprived eye, a compensatory action thought to be mediated by an upregulation of cortical gain control mechanisms. Here, we sought to better understand the impact of monocular deprivation on multisensory abilities, specifically examining audiovisual temporal perception. Using an audiovisual simultaneity judgment task, we discovered that 90 minutes of monocular deprivation produced opposing effects on the temporal binding window depending on the eye used in the task. Thus, in those who performed the task with their deprived eye there was a narrowing of the temporal binding window, whereas in those performing the task with their nondeprived eye there was a widening of the temporal binding window. The effect was short lived, being observed only in the first 10 minutes of postdeprivation testing. These findings indicate that changes in visual experience in the adult can rapidly impact multisensory perceptual processes, a finding that has important clinical implications for those patients with adult-onset visual deprivation and for therapies founded on monocular deprivation.
|
17
|
The simultaneous oddball: Oddball presentation does not affect simultaneity judgments. Atten Percept Psychophys 2020; 82:1654-1668. [PMID: 31942702 DOI: 10.3758/s13414-019-01866-6] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
The oddball duration illusion describes how a rare or nonrepeated stimulus is perceived as lasting longer than a common or repeated stimulus. It has been argued that the oddball duration illusion could emerge because of an earlier perceived onset of the oddball stimulus. However, most methods used to assess the perceived duration of an oddball stimulus are ill suited to detecting onset effects. Therefore, in the current article, I tested the perceived onset of oddball and standard stimuli using a simultaneity judgment task. In Experiments 1 and 2, the repetition and rarity of the target stimulus were varied, and participants were required to judge whether the target stimulus and another stimulus were concurrent. In Experiment 3, I tested whether a brief initial stimulus could act as a conditioning stimulus in the oddball duration illusion; this was to ensure that an oddball duration illusion could occur given the short duration of stimuli in the first two experiments. In both of the first two experiments, I found moderate support for no onset-based difference between oddball and nonoddball stimuli. In Experiment 3, I found that a short conditioning stimulus could still produce the oddball duration illusion, removing this possible explanation for the null result. Experiment 4 showed that an oddball duration illusion could emerge given the rarity of the stimulus and a concurrent sound. In sum, the current article found evidence against an onset-based explanation of the oddball duration illusion.
|
18
|
Boyce WP, Lindsay A, Zgonnikov A, Rañó I, Wong-Lin K. Optimality and Limitations of Audio-Visual Integration for Cognitive Systems. Front Robot AI 2020; 7:94. [PMID: 33501261 PMCID: PMC7805627 DOI: 10.3389/frobt.2020.00094] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2019] [Accepted: 06/09/2020] [Indexed: 11/13/2022] Open
Abstract
Multimodal integration is an important process in perceptual decision-making. In humans, this process has often been shown to be statistically optimal, or near optimal: sensory information is combined in a fashion that minimizes the average error in the perceptual representation of stimuli. However, the optimization sometimes comes with costs, manifesting as illusory percepts. We review audio-visual facilitations and illusions that are products of multisensory integration, and the computational models that account for these phenomena. In particular, the same optimal computational model can lead to illusory percepts, and we suggest that further studies are needed to detect and mitigate these illusions as artifacts in artificial cognitive systems. We provide cautionary considerations for designing artificial cognitive systems with a view to avoiding such artifacts. Finally, we suggest avenues of research toward solutions to potential pitfalls in system design. We conclude that a detailed understanding of multisensory integration and the mechanisms behind audio-visual illusions can benefit the design of artificial cognitive systems.
Affiliation(s)
- William Paul Boyce
- Intelligent Systems Research Centre, Ulster University, Magee Campus, Derry Londonderry, Northern Ireland, United Kingdom
- Anthony Lindsay
- Intelligent Systems Research Centre, Ulster University, Magee Campus, Derry Londonderry, Northern Ireland, United Kingdom
- Arkady Zgonnikov
- AiTech, Delft University of Technology, Delft, Netherlands; Department of Cognitive Robotics, Faculty of Mechanical, Maritime, and Materials Engineering, Delft University of Technology, Delft, Netherlands
- Iñaki Rañó
- Intelligent Systems Research Centre, Ulster University, Magee Campus, Derry Londonderry, Northern Ireland, United Kingdom
- KongFatt Wong-Lin
- Intelligent Systems Research Centre, Ulster University, Magee Campus, Derry Londonderry, Northern Ireland, United Kingdom
|
19
|
Zhou HY, Cheung EFC, Chan RCK. Audiovisual temporal integration: Cognitive processing, neural mechanisms, developmental trajectory and potential interventions. Neuropsychologia 2020; 140:107396. [PMID: 32087206 DOI: 10.1016/j.neuropsychologia.2020.107396] [Citation(s) in RCA: 38] [Impact Index Per Article: 9.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2019] [Revised: 02/14/2020] [Accepted: 02/15/2020] [Indexed: 12/21/2022]
Abstract
To integrate auditory and visual signals into a unified percept, the paired stimuli must co-occur within a limited time window known as the Temporal Binding Window (TBW). The width of the TBW, a proxy of audiovisual temporal integration ability, has been found to be correlated with higher-order cognitive and social functions. A comprehensive review of studies investigating audiovisual TBW reveals several findings: (1) a wide range of top-down processes and bottom-up features can modulate the width of the TBW, facilitating adaptation to the changing and multisensory external environment; (2) a large-scale brain network works in coordination to ensure successful detection of audiovisual (a)synchrony; (3) developmentally, audiovisual TBW follows a U-shaped pattern across the lifespan, with a protracted developmental course into late adolescence and rebounding in size again in late life; (4) an enlarged TBW is characteristic of a number of neurodevelopmental disorders; and (5) the TBW is highly flexible via perceptual and musical training. Interventions targeting the TBW may be able to improve multisensory function and ameliorate social communicative symptoms in clinical populations.
Affiliation(s)
- Han-Yu Zhou
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Raymond C K Chan
- Neuropsychology and Applied Cognitive Neuroscience Laboratory, CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China; Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
|
20
|
Fang Y, Li Y, Xu X, Tao H, Chen Q. Top-down attention modulates the direction and magnitude of sensory dominance. Exp Brain Res 2020; 238:587-600. [PMID: 31996936 DOI: 10.1007/s00221-020-05737-7] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2019] [Accepted: 01/18/2020] [Indexed: 11/29/2022]
Abstract
Bottom-up inputs from multiple sensory modalities compete to reach perceptual consciousness. The sensory dominance effect refers to the phenomenon that stimuli from one sensory modality are preferentially selected over the other modalities. Top-down attention helps us to select task-relevant information while filtering out task-irrelevant distracting information. To investigate how top-down attention towards one specific modality modulates the sensory dominance effect, we incorporated the endogenous cue-target paradigm and an adapted version of the Colavita paradigm in the present study. The visual responses could either precede or fall behind the auditory responses, i.e., the visual vs. auditory precedence trials. The direction of the sensory dominance was defined as the proportion of the visual vs. auditory precedence bimodal trials, and the magnitude of the sensory dominance was calculated as the difference in reaction times between the first and the second responses in the bimodal trials. Results from the present three experiments consistently showed that when attention was voluntarily directed to the visual modality, the visual dominance occurred more frequently than the auditory dominance, and the magnitude of the visual dominance was significantly larger than the auditory dominance. This pattern of results was independent of the delivery modality of the cue. The present results thus provide direct empirical evidence showing that endogenous attention towards one specific sensory modality modulates both the direction and the size of sensory dominance.
Affiliation(s)
- Ying Fang
- Center for Studies of Psychological Application, School of Psychology, South China Normal University, Guangzhou, 510631, People's Republic of China
- You Li
- Center for Studies of Psychological Application, School of Psychology, South China Normal University, Guangzhou, 510631, People's Republic of China
- Xiaoting Xu
- Center for Studies of Psychological Application, School of Psychology, South China Normal University, Guangzhou, 510631, People's Republic of China
- Hong Tao
- Center for Studies of Psychological Application, School of Psychology, South China Normal University, Guangzhou, 510631, People's Republic of China
- Qi Chen
- Center for Studies of Psychological Application, School of Psychology, South China Normal University, Guangzhou, 510631, People's Republic of China; Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, Guangzhou, 510631, People's Republic of China
|
21
|
Christie BP, Graczyk EL, Charkhkar H, Tyler DJ, Triolo RJ. Visuotactile synchrony of stimulation-induced sensation and natural somatosensation. J Neural Eng 2019; 16:036025. [PMID: 30939464 DOI: 10.1088/1741-2552/ab154c] [Citation(s) in RCA: 16] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
OBJECTIVE Previous studies suggest that somatosensory feedback has the potential to improve the functional performance of prostheses, reduce phantom pain, and enhance embodiment of sensory-enabled prosthetic devices. To maximize such benefits for amputees, the temporal properties of the sensory feedback must resemble those of natural somatosensation in an intact limb. APPROACH To better understand temporal perception of artificial sensation, we characterized the perception of visuotactile synchrony for tactile perception restored via peripheral nerve stimulation. We electrically activated nerves in the residual limbs of two trans-tibial amputees and two trans-radial amputees via non-penetrating nerve cuff electrodes, which elicited sensations referred to the missing limbs. MAIN RESULTS Our findings suggest that with respect to vision, stimulation-induced sensation has a point of subjective simultaneity (PSS; processing time) and just noticeable difference (JND; temporal sensitivity) that are similar to natural touch. The JND was not significantly different between the participants with upper- and lower-limb amputations. However, the PSS indicated that sensations evoked in the missing leg must occur significantly earlier than those in the hand to be perceived as maximally synchronous with vision. Furthermore, we examined visuotactile synchrony in the context of a functional task during which stimulation was triggered by pressure applied to the prosthesis. Stimulation-induced sensation could be delayed up to 111 ± 62 ms without the delay being reliably detected. SIGNIFICANCE The quantitative temporal properties of stimulation-induced perception were previously unknown and will contribute to design specifications for future sensory neuroprostheses.
Affiliation(s)
- Breanne P Christie
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH, United States of America; Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, OH, United States of America
|
22
|
Richards MD, Goltz HC, Wong AM. Audiovisual perception in amblyopia: A review and synthesis. Exp Eye Res 2019; 183:68-75. [DOI: 10.1016/j.exer.2018.04.017] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/22/2018] [Revised: 04/27/2018] [Accepted: 04/28/2018] [Indexed: 11/15/2022]
|
23
|
Störmer VS. Orienting spatial attention to sounds enhances visual processing. Curr Opin Psychol 2019; 29:193-198. [PMID: 31022562 DOI: 10.1016/j.copsyc.2019.03.010] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/13/2018] [Revised: 03/12/2019] [Accepted: 03/14/2019] [Indexed: 11/20/2022]
Abstract
Attention, the mechanism by which information is selected for further processing, has mostly been studied within the visual system. While this research has been exceptionally successful, it is important to understand how attention operates across the sensory modalities. This review focuses on recent studies showing that orienting to a peripheral, salient sound affects visual processing: it enhances visual perception, boosts visual-cortical responses, and modulates visual cortex activity before the appearance of a visual object. Critically, all of these effects are spatially selective, indicating that spatial attention facilitates perceptual processing at an attended location across sensory modalities. The neural changes in visual cortex triggered by the sounds not only resemble some of the neural modulations reported in uni-modal visual attention studies, but also reveal some important differences.
Affiliation(s)
- Viola S Störmer
- Department of Psychology, University of California, San Diego, United States
|
24
|
Sanders P, Thompson B, Corballis P, Searchfield G. On the Timing of Signals in Multisensory Integration and Crossmodal Interactions: a Scoping Review. Multisens Res 2019; 32:533-573. [PMID: 31137004 DOI: 10.1163/22134808-20191331] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2018] [Accepted: 04/24/2019] [Indexed: 11/19/2022]
Abstract
A scoping review was undertaken to explore research investigating early interactions and integration of auditory and visual stimuli in the human brain. The focus was on methods used to study low-level multisensory temporal processing using simple stimuli in humans, and how this research has informed our understanding of multisensory perception. The study of multisensory temporal processing probes how the relative timing between signals affects perception. Several tasks, illusions, computational models, and neuroimaging techniques were identified in the literature search. Research into early audiovisual temporal processing in special populations was also reviewed. Recent research has continued to provide support for early integration of crossmodal information. These early interactions can influence higher-level factors, and vice versa. Temporal relationships between auditory and visual stimuli influence multisensory perception, and likely play a substantial role in solving the 'correspondence problem' (how the brain determines which sensory signals belong together, and which should be segregated).
Affiliation(s)
- Philip Sanders
- Section of Audiology, University of Auckland, Auckland, New Zealand; Centre for Brain Research, University of Auckland, New Zealand; Brain Research New Zealand - Rangahau Roro Aotearoa, New Zealand
- Benjamin Thompson
- Centre for Brain Research, University of Auckland, New Zealand; School of Optometry and Vision Science, University of Auckland, Auckland, New Zealand; School of Optometry and Vision Science, University of Waterloo, Waterloo, Canada
- Paul Corballis
- Centre for Brain Research, University of Auckland, New Zealand; Department of Psychology, University of Auckland, Auckland, New Zealand
- Grant Searchfield
- Section of Audiology, University of Auckland, Auckland, New Zealand; Centre for Brain Research, University of Auckland, New Zealand; Brain Research New Zealand - Rangahau Roro Aotearoa, New Zealand
|
25
|
Brady TF, Störmer VS, Shafer-Skelton A, Williams JR, Chapman AF, Schill HM. Scaling up visual attention and visual working memory to the real world. PSYCHOLOGY OF LEARNING AND MOTIVATION 2019. [DOI: 10.1016/bs.plm.2019.03.001] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/16/2023]
|
26
|
Li Q, Liu P, Huang S, Huang X. The influence of phasic alerting on multisensory temporal precision. Exp Brain Res 2018; 236:3279-3296. [DOI: 10.1007/s00221-018-5372-6] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2018] [Accepted: 08/29/2018] [Indexed: 11/29/2022]
|
27
|
Chen YC, Lewis TL, Shore DI, Spence C, Maurer D. Developmental changes in the perception of visuotactile simultaneity. J Exp Child Psychol 2018; 173:304-317. [DOI: 10.1016/j.jecp.2018.04.014] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2017] [Revised: 04/25/2018] [Accepted: 04/25/2018] [Indexed: 10/16/2022]
|
28
|
Butera IM, Stevenson RA, Mangus BD, Woynaroski TG, Gifford RH, Wallace MT. Audiovisual Temporal Processing in Postlingually Deafened Adults with Cochlear Implants. Sci Rep 2018; 8:11345. [PMID: 30054512 PMCID: PMC6063927 DOI: 10.1038/s41598-018-29598-x] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/14/2018] [Accepted: 07/09/2018] [Indexed: 11/17/2022] Open
Abstract
For many cochlear implant (CI) users, visual cues are vitally important for interpreting the impoverished auditory speech information that an implant conveys. Although the temporal relationship between auditory and visual stimuli is crucial for how this information is integrated, audiovisual temporal processing in CI users is poorly understood. In this study, we tested unisensory (auditory alone, visual alone) and multisensory (audiovisual) temporal processing in postlingually deafened CI users (n = 48) and normal-hearing controls (n = 54) using simultaneity judgment (SJ) and temporal order judgment (TOJ) tasks. We varied the relative onset timing of the auditory and visual components of either a syllable/viseme or a simple flash/beep pairing, and participants indicated either which stimulus appeared first (TOJ) or whether the pair occurred simultaneously (SJ). Results indicate that temporal binding windows, the interval within which stimuli are likely to be perceptually 'bound', are not significantly different between groups for either speech or non-speech stimuli. However, the point of subjective simultaneity for speech was less visually leading in CI users, who, interestingly, also had improved visual-only TOJ thresholds. Further signal detection analysis suggests that this SJ shift may be due to greater visual bias within the CI group, perhaps reflecting heightened attentional allocation to visual cues.
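The "further signal detection analysis" mentioned in this abstract typically reduces to computing sensitivity (d') and response criterion (c) from hit and false-alarm rates; a visual bias would appear as a shift in c rather than d'. The counts below are invented for illustration, and the log-linear correction is one common choice, not necessarily the one used in the study:

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, fas, crs):
    """Sensitivity d' and criterion c from raw counts.

    Applies a log-linear correction (add 0.5 to each cell) so that
    hit or false-alarm rates of exactly 0 or 1 never produce infinite z-scores.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (fas + 0.5) / (fas + crs + 1.0)
    d = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))  # negative c = liberal ("yes"-biased)
    return d, c

d, c = dprime_criterion(hits=40, misses=10, fas=12, crs=38)
print(f"d' = {d:.2f}, c = {c:.2f}")
```

With these hypothetical counts, d' comes out near 1.5 and c slightly negative, i.e., a mildly liberal observer.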
Affiliation(s)
- Iliza M Butera
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Ryan A Stevenson
- Department of Psychology, University of Western Ontario, London, ON, Canada; Brain and Mind Institute, University of Western Ontario, London, ON, Canada
- Brannon D Mangus
- Murfreesboro Medical Clinic and Surgicenter, Murfreesboro, TN, USA
- Tiffany G Woynaroski
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA
- René H Gifford
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, USA
|
29
|
Tünnermann J, Scharlau I. Poking Left To Be Right? A Model-Based Analysis of Temporal Order Judged by Mice. Adv Cogn Psychol 2018; 14:39-50. [PMID: 32676131 PMCID: PMC7354420 DOI: 10.5709/acp-0237-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/20/2023] Open
Abstract
The theory of visual attention (TVA) provides a formal framework for the assessment of visual attention and related processes. At its center is a mathematical model of visual encoding processes and discretely defined components of attention. Building on this model, TVA offers quantitative and process-related explanations for a variety of phenomena in the domain of visual attention. Because the theory relies on very general assumptions which might hold true for other domains of sensory processing, we tested its possible explanatory value for tactile processing in mice. Reanalyzing published data of temporal-order judgments by mice, we show how a TVA-based analysis identifies the processes which drive observable behavior and that it comes to conclusions quite different from those of conventional analyses of temporal-order judgments. According to this analysis, despite the same overall capacity dedicated to the task, some mice adopt attentional biases toward one side, possibly to optimize their overall performance. We suggest that TVA's concepts provide a powerful vantage point for finding explanations for observable behavior where conventional analysis easily leads to dead ends.
|
30
|
No perceptual prioritization of non-nociceptive vibrotactile and visual stimuli presented on a sensitized body part. Sci Rep 2018; 8:5359. [PMID: 29599492 PMCID: PMC5876401 DOI: 10.1038/s41598-018-23135-6] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2017] [Accepted: 03/05/2018] [Indexed: 12/13/2022] Open
Abstract
High frequency electrical conditioning stimulation (HFS) is an experimental method to induce increased mechanical pinprick sensitivity in the unconditioned surrounding skin (secondary hyperalgesia). Secondary hyperalgesia is thought to be the result of central sensitization, i.e., increased responsiveness of nociceptive neurons in the central nervous system. Vibrotactile and visual stimuli presented in the area of secondary hyperalgesia also elicit enhanced brain responses, a finding that cannot be explained by central sensitization as it is currently defined. HFS may recruit attentional processes, which in turn affect the processing of all stimuli. In this study we investigated whether HFS induces perceptual biases towards stimuli presented onto the sensitized arm by using temporal order judgment (TOJ) tasks. In TOJ tasks, stimuli are presented in rapid succession on either arm, and participants have to indicate their perceived order. In the case of a perceptual bias, stimuli presented on the attended side are systematically reported as occurring first. Participants performed a tactile and a visual TOJ task before and after HFS. Analyses of participants' performance did not reveal any prioritization of the visual and tactile stimuli presented onto the sensitized arm. Our results therefore provide no evidence for a perceptual bias towards tactile and visual stimuli presented onto the sensitized arm.
|
31
|
Hemispheric asymmetry: Looking for a novel signature of the modulation of spatial attention in multisensory processing. Psychon Bull Rev 2018; 24:690-707. [PMID: 27586002 PMCID: PMC5486865 DOI: 10.3758/s13423-016-1154-y] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
The extent to which attention modulates multisensory processing in a top-down fashion is still a subject of debate among researchers. Typically, cognitive psychologists interested in this question have manipulated the participants’ attention in terms of single/dual tasking or focal/divided attention between sensory modalities. We suggest an alternative approach, one that builds on the extensive older literature highlighting hemispheric asymmetries in the distribution of spatial attention. Specifically, spatial attention in vision, audition, and touch is typically biased preferentially toward the right hemispace, especially under conditions of high perceptual load. We review the evidence demonstrating such an attentional bias toward the right in extinction patients and healthy adults, along with the evidence of such rightward-biased attention in multisensory experimental settings. We then evaluate those studies that have demonstrated either a more pronounced multisensory effect in right than in left hemispace, or else similar effects in the two hemispaces. The results suggest that the influence of rightward-biased attention is more likely to be observed when the crossmodal signals interact at later stages of information processing and under conditions of higher perceptual load—that is, conditions under which attention is perhaps a compulsory enhancer of information processing. We therefore suggest that the spatial asymmetry in attention may provide a useful signature of top-down attentional modulation in multisensory processing.
|
32
|
|
33
|
Grabot L, Kösem A, Azizi L, van Wassenhove V. Prestimulus Alpha Oscillations and the Temporal Sequencing of Audiovisual Events. J Cogn Neurosci 2017; 29:1566-1582. [DOI: 10.1162/jocn_a_01145] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Perceiving the temporal order of sensory events typically depends on participants' attentional state, and thus likely on endogenous fluctuations of brain activity. Using magnetoencephalography, we sought to determine whether spontaneous brain oscillations could disambiguate the perceived order of auditory and visual events presented in close temporal proximity, that is, at the individual's perceptual order threshold (Point of Subjective Simultaneity [PSS]). Two neural responses were found to index an individual's temporal order perception when contrasting brain activity as a function of perceived order (i.e., perceiving the sound first vs. perceiving the visual event first) given the same physical audiovisual sequence. First, average differences in prestimulus auditory alpha power indicated perceiving the correct ordering of audiovisual events irrespective of which sensory modality came first: a relatively low alpha power indicated perceiving auditory or visual first as a function of the actual sequence order. Additionally, the relative changes in the amplitude of the auditory (but not visual) evoked responses were correlated with participants' correct performance. Crucially, the sign of the magnitude difference in prestimulus alpha power and evoked responses between perceived audiovisual orders correlated with an individual's PSS. Taken together, our results suggest that spontaneous oscillatory activity cannot disambiguate subjective temporal order without prior knowledge of the individual's bias toward perceiving one or the other sensory modality first. Under high perceptual uncertainty, the magnitude of prestimulus alpha (de)synchronization thus indicates the amount of compensation needed to overcome an individual's prior in the serial ordering and temporal sequencing of information.
Affiliation(s)
- Laetitia Grabot: Cognitive Neuroimaging Unit, CEA DRF/Joliot, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, 91191 Gif/Yvette, France
- Anne Kösem: Radboud University, Nijmegen, The Netherlands; Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Leila Azizi: Cognitive Neuroimaging Unit, CEA DRF/Joliot, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, 91191 Gif/Yvette, France
- Virginie van Wassenhove: Cognitive Neuroimaging Unit, CEA DRF/Joliot, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin center, 91191 Gif/Yvette, France
|
34
|
Dean CL, Eggleston BA, Gibney KD, Aligbe E, Blackwell M, Kwakye LD. Auditory and visual distractors disrupt multisensory temporal acuity in the crossmodal temporal order judgment task. PLoS One 2017; 12:e0179564. [PMID: 28723907 PMCID: PMC5516972 DOI: 10.1371/journal.pone.0179564] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2017] [Accepted: 05/30/2017] [Indexed: 12/15/2022] Open
Abstract
The ability to synthesize information across multiple senses is known as multisensory integration and is essential to our understanding of the world around us. Sensory stimuli that occur close in time are likely to be integrated, and the accuracy of this integration is dependent on our ability to precisely discriminate the relative timing of unisensory stimuli (crossmodal temporal acuity). Previous research has shown that multisensory integration is modulated by both bottom-up stimulus features, such as the temporal structure of unisensory stimuli, and top-down processes such as attention. However, it is currently uncertain how attention alters crossmodal temporal acuity. The present study investigated whether increasing attentional load would decrease crossmodal temporal acuity by utilizing a dual-task paradigm. Participants were asked to judge the temporal order of a flash and beep presented at various temporal offsets (crossmodal temporal order judgment (CTOJ) task) while also directing their attention to a secondary distractor task in which they detected a target stimulus within a stream of visual or auditory distractors. We found decreased performance on the CTOJ task as well as increases in both the positive and negative just noticeable difference with increasing load for both the auditory and visual distractor tasks. This strongly suggests that attention promotes greater crossmodal temporal acuity and that reducing the attentional capacity to process multisensory stimuli results in detriments to multisensory temporal processing. Our study is the first to demonstrate changes in multisensory temporal processing with decreased attentional capacity using a dual-task paradigm and has strong implications for developmental disorders such as autism spectrum disorders and developmental dyslexia, which are associated with alterations in both multisensory temporal processing and attention.
Affiliation(s)
- Cassandra L. Dean: Department of Neuroscience, Oberlin College, Oberlin, Ohio, United States of America
- Brady A. Eggleston: Department of Neuroscience, Oberlin College, Oberlin, Ohio, United States of America
- Kyla David Gibney: Department of Neuroscience, Oberlin College, Oberlin, Ohio, United States of America
- Enimielen Aligbe: Department of Neuroscience, Oberlin College, Oberlin, Ohio, United States of America
- Marissa Blackwell: Department of Neuroscience, Oberlin College, Oberlin, Ohio, United States of America
- Leslie Dowell Kwakye: Department of Neuroscience, Oberlin College, Oberlin, Ohio, United States of America
|
35
|
Vibell J, Klinge C, Zampini M, Nobre AC, Spence C. Differences between endogenous attention to spatial locations and sensory modalities. Exp Brain Res 2017; 235:2983-2996. [PMID: 28717820 PMCID: PMC5603640 DOI: 10.1007/s00221-017-5030-4] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2016] [Accepted: 07/09/2017] [Indexed: 11/02/2022]
Abstract
Vibell et al. (J Cogn Neurosci 19:109-120, 2007) reported that endogenously attending to a sensory modality (vision or touch) modulated perceptual processing, in part, by the relative speeding-up of neural activation (i.e., as a result of prior entry). However, it was unclear whether it was the fine temporal discrimination required by the temporal-order judgment task that was used, or rather the type of attentional modulation (spatial locations or sensory modalities), that was responsible for the shift in latencies that they observed. The present study used a similar experimental design to evaluate whether spatial attention would also yield latency effects suggestive of prior entry in the early visual P1 potentials. Intriguingly, while the results demonstrated similar neural latency shifts attributable to spatial attention, these shifts started at a somewhat later stage than in Vibell et al.'s study. These differences are consistent with different neural mechanisms underlying attention to a specific sensory modality versus to a spatial location.
Affiliation(s)
- J Vibell: Department of Experimental Psychology, University of Oxford, Oxford, UK; Department of Psychology, University of Hawaii, 2530 Dole St, Honolulu, HI, 96822, USA
- C Klinge: Department of Experimental Psychology, University of Oxford, Oxford, UK
- M Zampini: Department of Experimental Psychology, University of Oxford, Oxford, UK
- A C Nobre: Department of Experimental Psychology, University of Oxford, Oxford, UK
- C Spence: Department of Experimental Psychology, University of Oxford, Oxford, UK
|
36
|
Alterations in audiovisual simultaneity perception in amblyopia. PLoS One 2017; 12:e0179516. [PMID: 28598996 PMCID: PMC5466335 DOI: 10.1371/journal.pone.0179516] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2017] [Accepted: 05/30/2017] [Indexed: 11/19/2022] Open
Abstract
Amblyopia is a developmental visual impairment that is increasingly recognized to affect higher-level perceptual and multisensory processes. To further investigate the audiovisual (AV) perceptual impairments associated with this condition, we characterized the temporal interval in which asynchronous auditory and visual stimuli are perceived as simultaneous 50% of the time (i.e., the AV simultaneity window). Adults with unilateral amblyopia (n = 17) and visually normal controls (n = 17) judged the simultaneity of a flash and a click presented with both eyes viewing. The stimulus onset asynchrony (SOA) varied from 0 ms to 450 ms for auditory-lead and visual-lead conditions. A subset of participants with amblyopia (n = 6) was tested monocularly. Compared to the control group, the auditory-lead side of the AV simultaneity window was widened by 48 ms (36%; p = 0.002), whereas the visual-lead side was widened by 86 ms (37%; p = 0.02). The overall mean window width was 500 ms, compared to 366 ms among controls (37% wider; p = 0.002). Among participants with amblyopia, the simultaneity window parameters were unchanged by viewing condition, but subgroup analysis revealed differential effects on the parameters by amblyopia severity, etiology, and foveal suppression status. Possible mechanisms to explain these findings include visual temporal uncertainty, interocular perceptual latency asymmetry, and disruption of normal developmental tuning of sensitivity to audiovisual asynchrony.
|
37
|
Simultaneity judgment using olfactory-visual, visual-gustatory, and olfactory-gustatory combinations. PLoS One 2017; 12:e0174958. [PMID: 28376116 PMCID: PMC5380340 DOI: 10.1371/journal.pone.0174958] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2016] [Accepted: 03/17/2017] [Indexed: 11/19/2022] Open
Abstract
Vision is a physical sense, whereas olfaction and gustation are chemical senses. Active sensing might function in vision, olfaction, and gustation, whereas passive sensing might function in vision and olfaction but not gustation. To investigate whether each sensory property affected synchrony perception, participants in this study performed simultaneity judgment (SJ) for three cross-modal combinations using visual (red LED light), olfactory (coumarin), and gustatory (NaCl solution) stimuli. We calculated the half-width at half-height (HWHH) and point of subjective simultaneity (PSS) on the basis of the temporal distributions of simultaneous response rates in each combination. Although HWHH did not differ significantly among the three cross-modal combinations, it was higher in combinations involving one or two chemical stimuli than in the combination of two physical stimuli reported in a previous study. The PSS of the olfactory–visual combination was approximately equal to the point of objective simultaneity (POS), whereas the PSSs of the visual–gustatory and olfactory–gustatory combinations deviated significantly from the POS. To establish that these phenomena are specific to the chemical senses in synchrony perception, it remains to be determined whether the same results are reproduced when SJ is performed for various cross-modal combinations using visual, olfactory, and gustatory stimuli other than red LED light, coumarin, and NaCl solution.
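As a rough illustration of how a PSS and HWHH can be derived from simultaneous-response rates across SOAs, here is a moment-based sketch. The function name and the assumption that the response-rate profile is approximately Gaussian are illustrative only; the authors' actual fitting procedure is not reproduced here.

```python
import math

def pss_and_hwhh(soas_ms, p_simultaneous):
    """Estimate the point of subjective simultaneity (PSS) and the
    half-width at half-height (HWHH) from simultaneity-response rates.

    Treats the rate profile as an unnormalized Gaussian:
    PSS = rate-weighted mean SOA, sigma from the weighted variance,
    and HWHH = sigma * sqrt(2 * ln 2).
    """
    total = sum(p_simultaneous)
    pss = sum(s * p for s, p in zip(soas_ms, p_simultaneous)) / total
    var = sum(p * (s - pss) ** 2 for s, p in zip(soas_ms, p_simultaneous)) / total
    return pss, math.sqrt(2 * math.log(2) * var)
```

A positive PSS on this convention means the second-listed modality must lead for the pair to be perceived as simultaneous most often; a larger HWHH corresponds to a wider simultaneity window.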
|
38
|
Early Binocular Input Is Critical for Development of Audiovisual but Not Visuotactile Simultaneity Perception. Curr Biol 2017; 27:583-589. [PMID: 28190731 DOI: 10.1016/j.cub.2017.01.009] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2016] [Revised: 12/23/2016] [Accepted: 01/05/2017] [Indexed: 11/20/2022]
Abstract
Temporal simultaneity provides an essential cue for integrating multisensory signals into a unified perception. Early visual deprivation, in both animals and humans, leads to abnormal neural responses to audiovisual signals in subcortical and cortical areas [1-5]. Behavioral deficits in integrating complex audiovisual stimuli in humans are also observed [6, 7]. It remains unclear whether early visual deprivation affects visuotactile perception similarly to audiovisual perception and whether the consequences for either pairing differ after monocular versus binocular deprivation [8-11]. Here, we evaluated the impact of early visual deprivation on the perception of simultaneity for audiovisual and visuotactile stimuli in humans. We tested patients born with dense cataracts in one or both eyes that blocked all patterned visual input until the cataractous lenses were removed and the affected eyes fitted with compensatory contact lenses (mean duration of deprivation = 4.4 months; range = 0.3-28.8 months). Both monocularly and binocularly deprived patients demonstrated lower precision in judging audiovisual simultaneity. However, qualitatively different outcomes were observed for the two patient groups: the performance of monocularly deprived patients matched that of young children at immature stages, whereas that of binocularly deprived patients did not match any stage in typical development. Surprisingly, patients performed normally in judging visuotactile simultaneity after either monocular or binocular deprivation. Therefore, early binocular input is necessary to develop normal neural substrates for simultaneity perception of visual and auditory events but not visual and tactile events.
|
39
|
Schormans AL, Scott KE, Vo AMQ, Tyker A, Typlt M, Stolzberg D, Allman BL. Audiovisual Temporal Processing and Synchrony Perception in the Rat. Front Behav Neurosci 2017; 10:246. [PMID: 28119580 PMCID: PMC5222817 DOI: 10.3389/fnbeh.2016.00246] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2016] [Accepted: 12/16/2016] [Indexed: 11/13/2022] Open
Abstract
Extensive research on humans has improved our understanding of how the brain integrates information from our different senses, and has begun to uncover the brain regions and large-scale neural activity that contribute to an observer's ability to perceive the relative timing of auditory and visual stimuli. In the present study, we developed the first behavioral tasks to assess the perception of audiovisual temporal synchrony in rats. Modeled after the parameters used in human studies, separate groups of rats were trained to perform: (1) a simultaneity judgment task in which they reported whether audiovisual stimuli at various stimulus onset asynchronies (SOAs) were presented simultaneously or not; and (2) a temporal order judgment task in which they reported whether they perceived the auditory or visual stimulus to have been presented first. Furthermore, using in vivo electrophysiological recordings in the lateral extrastriate visual (V2L) cortex of anesthetized rats, we performed the first investigation of how neurons in the rat multisensory cortex integrate audiovisual stimuli presented at different SOAs. As predicted, rats (n = 7) trained to perform the simultaneity judgment task could accurately (~80%) identify synchronous vs. asynchronous (200 ms SOA) trials. Moreover, the rats judged trials at 10 ms SOA to be synchronous, whereas the majority (~70%) of trials at 100 ms SOA were perceived to be asynchronous. During the temporal order judgment task, rats (n = 7) perceived the synchronous audiovisual stimuli to be "visual first" on ~52% of the trials, and the smallest timing interval between the auditory and visual stimuli that each rat could detect (i.e., the just noticeable difference, JND) ranged from 77 ms to 122 ms. Neurons in the rat V2L cortex were sensitive to the timing of audiovisual stimuli, such that spiking activity was greatest during trials in which the visual stimulus preceded the auditory by 20–40 ms. Ultimately, given that our behavioral and electrophysiological results were consistent with studies conducted on human participants and previous recordings made in multisensory brain regions of different species, we suggest that the rat represents an effective model for studying audiovisual temporal synchrony at both the neuronal and perceptual level.
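JND estimates of the kind reported in this study are conventionally obtained by fitting a psychometric function to the proportion of "visual first" responses across SOAs. A minimal sketch using probit linearization follows; the function name, the 50%-to-84% JND convention, and the clamping of extreme proportions are assumptions for illustration, not the authors' exact method.

```python
from statistics import NormalDist

def toj_pss_jnd(soas_ms, p_visual_first):
    """Fit a cumulative Gaussian to 'visual first' proportions by
    least squares on z-transformed data (probit linearization).

    Returns (PSS, JND), with the JND taken as the 50%-to-84% distance,
    i.e. the fitted standard deviation.
    """
    z = NormalDist().inv_cdf
    zs = [z(min(max(p, 0.01), 0.99)) for p in p_visual_first]  # avoid z(0), z(1)
    n = len(soas_ms)
    mean_x = sum(soas_ms) / n
    mean_y = sum(zs) / n
    slope = (sum(x * y for x, y in zip(soas_ms, zs)) - n * mean_x * mean_y) / \
            (sum(x * x for x in soas_ms) - n * mean_x * mean_x)
    intercept = mean_y - slope * mean_x
    return -intercept / slope, 1.0 / slope
```

With clean data generated from a cumulative Gaussian, the fit recovers the generating PSS and slope exactly; real response counts would add binomial noise around that curve.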
Affiliation(s)
- Ashley L Schormans: Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Kaela E Scott: Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Albert M Q Vo: Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Anna Tyker: Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Marei Typlt: Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Daniel Stolzberg: Department of Physiology and Pharmacology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
- Brian L Allman: Department of Anatomy and Cell Biology, Schulich School of Medicine and Dentistry, University of Western Ontario, London, ON, Canada
|
40
|
Hao Q, Ora H, Ogawa KI, Ogata T, Miyake Y. Voluntary movement affects simultaneous perception of auditory and tactile stimuli presented to a non-moving body part. Sci Rep 2016; 6:33336. [PMID: 27622584 PMCID: PMC5020736 DOI: 10.1038/srep33336] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2016] [Accepted: 08/24/2016] [Indexed: 11/10/2022] Open
Abstract
The simultaneous perception of multimodal sensory information has a crucial role for effective reactions to the external environment. Voluntary movements are known to occasionally affect simultaneous perception of auditory and tactile stimuli presented to the moving body part. However, little is known about spatial limits on the effect of voluntary movements on simultaneous perception, especially when tactile stimuli are presented to a non-moving body part. We examined the effect of voluntary movement on the simultaneous perception of auditory and tactile stimuli presented to the non-moving body part. We considered the possible mechanism using a temporal order judgement task under three experimental conditions: voluntary movement, where participants voluntarily moved their right index finger and judged the temporal order of auditory and tactile stimuli presented to their non-moving left index finger; passive movement; and no movement. During voluntary movement, the auditory stimulus needed to be presented before the tactile stimulus so that they were perceived as occurring simultaneously. This subjective simultaneity differed significantly from the passive movement and no movement conditions. This finding indicates that the effect of voluntary movement on simultaneous perception of auditory and tactile stimuli extends to the non-moving body part.
Affiliation(s)
- Qiao Hao: Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Hiroki Ora: Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
- Ken-Ichiro Ogawa: Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
- Taiki Ogata: Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan; Research into Artifacts, Center for Engineering (RACE), The University of Tokyo, Kashiwa, Japan
- Yoshihiro Miyake: Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
|
41
|
Li MS, Rhodes D, Di Luca M. For the Last Time: Temporal Sensitivity and Perceived Timing of the Final Stimulus in an Isochronous Sequence. TIMING & TIME PERCEPTION 2016. [DOI: 10.1163/22134468-00002057] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
An isochronous sequence is a series of repeating events with the same inter-onset interval. A common finding is that as the length of a sequence increases, so does temporal sensitivity to irregularities; that is, the detection of deviations from isochrony is better with a longer sequence. Several theoretical accounts exist in the literature as to how the brain processes sequences for the detection of irregularities, yet there has been no systematic comparison of the predictions that such accounts make. To compare these predictions, we asked participants to report whether the last stimulus of a regularly timed sequence appeared 'earlier' or 'later' than expected. This task allowed us to analyse bias and performance separately. Sequence lengths (3, 4, 5, or 6 beeps) were either randomly interleaved or presented in separate blocks. We replicate previous findings showing that temporal sensitivity increases with longer sequences in the interleaved condition but not in the blocked condition (where performance is higher overall). Results also indicate a consistent bias in reporting whether the last stimulus is isochronous, irrespective of how many stimuli the sequence comprises. This result is consistent with a perceptual acceleration of stimuli embedded in isochronous sequences. From the comparison of the models' predictions we determine that the improvement in sensitivity is best captured by an averaging of successive estimates, but with an element that limits performance improvement below statistical optimality. None of the models considered, however, provides an exhaustive explanation for the pattern of results found.
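The averaging account favored in this abstract can be made concrete with a small sketch: if n independent interval estimates with a common standard deviation are averaged into one internal standard, the standard's noise, and hence the predicted threshold, falls as 1/sqrt(n). The efficiency parameter below is a hypothetical device (not the authors' model) for capturing the sub-optimal improvement they report.

```python
import math

def predicted_threshold(base_sd_ms, n_intervals, efficiency=1.0):
    """Predicted threshold for detecting a deviation of the final interval
    of an isochronous sequence under an averaging account.

    With n independent interval estimates (SD = base_sd_ms) averaged into
    one internal standard, the standard's SD shrinks by sqrt(n). An
    efficiency < 1 (hypothetical) limits the effective number of samples,
    keeping improvement below statistical optimality.
    """
    effective_n = 1 + efficiency * (n_intervals - 1)
    return base_sd_ms / math.sqrt(effective_n)
```

For example, an ideal averager (efficiency = 1) predicts that thresholds halve from a 1-interval to a 4-interval sequence, while efficiency = 0.5 predicts a shallower improvement, in the spirit of the limiting element described above.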
|
42
|
Durnez W, Van Damme S. No Evidence for Threat-Induced Spatial Prioritization of Somatosensory Stimulation during Pain Control Using a Synchrony Judgment Paradigm. PLoS One 2016; 11:e0156648. [PMID: 27270456 PMCID: PMC4896434 DOI: 10.1371/journal.pone.0156648] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/16/2015] [Accepted: 05/17/2016] [Indexed: 11/19/2022] Open
Abstract
Topical research efforts on attention to pain often take a critical look at the modulatory role of top-down factors. For instance, it has been shown that the fearful expectation of pain at a location of the body directs attention towards that body part. In addition, motivated attempts to control this pain were found to modulate this prioritization effect. Such studies have often used a temporal order judgment task, requiring participants to judge the order in which two stimuli are presented by indicating which one they perceived first. As this constitutes a forced-choice response format, such studies may be subject to response bias. The aim of the current study was to address this concern. We used a ternary synchrony judgment paradigm, in which participants judged the order in which two somatosensory stimuli occurred. Critically, participants now also had the option to give a 'simultaneous' response when they did not perceive a difference. This way we eliminated the need for guessing, and thus reduced the risk of response bias. One location was threatened with the possibility of pain in half of the trials, as predicted by an auditory cue. Additionally, half of the participants (pain control group) were encouraged to avoid pain stimuli by executing a quick button press. The other half (comparison group) performed a similar action, albeit unrelated to the occurrence of pain. Our data did not support threat-induced spatial prioritization, nor did we find evidence that pain control attempts influenced attention in any way.
Affiliation(s)
- Wouter Durnez: Department of Experimental-Clinical and Health Psychology, Ghent University, Ghent, Belgium
- Stefaan Van Damme: Department of Experimental-Clinical and Health Psychology, Ghent University, Ghent, Belgium
|
43
|
Cecere R, Gross J, Thut G. Behavioural evidence for separate mechanisms of audiovisual temporal binding as a function of leading sensory modality. Eur J Neurosci 2016; 43:1561-8. [PMID: 27003546 PMCID: PMC4915493 DOI: 10.1111/ejn.13242] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/13/2015] [Revised: 02/09/2016] [Accepted: 03/17/2016] [Indexed: 11/30/2022]
Abstract
The ability to integrate auditory and visual information is critical for effective perception and interaction with the environment, and is thought to be abnormal in some clinical populations. Several studies have investigated the time window over which audiovisual events are integrated, also called the temporal binding window, and revealed asymmetries depending on the order of audiovisual input (i.e. the leading sense). When judging audiovisual simultaneity, the binding window appears narrower and non-malleable for auditory-leading stimulus pairs and wider and trainable for visual-leading pairs. Here we specifically examined the level of independence of binding mechanisms when auditory-before-visual vs. visual-before-auditory input is bound. Three groups of healthy participants practiced audiovisual simultaneity detection with feedback, selectively training on auditory-leading stimulus pairs (group 1), visual-leading stimulus pairs (group 2) or both (group 3). Subsequently, we tested for learning transfer (crossover) from trained stimulus pairs to non-trained pairs with opposite audiovisual input. Our data confirmed the known asymmetry in size and trainability for auditory-visual vs. visual-auditory binding windows. More importantly, practicing one type of audiovisual integration (e.g. auditory-visual) did not affect the other type (e.g. visual-auditory), even if trainable by within-condition practice. Together, these results provide crucial evidence that the temporal binding mechanisms for auditory-leading and visual-leading stimulus pairs are independent, possibly tapping into different circuits for audiovisual integration due to the engagement of different multisensory sampling mechanisms depending on the leading sense. Our results have implications for informing the study of multisensory interactions in healthy participants and clinical populations with dysfunctional multisensory integration.
Affiliation(s)
- Roberto Cecere, Centre for Cognitive Neuroimaging (CCNi), Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, G12 8QB, Glasgow, UK
- Joachim Gross, Centre for Cognitive Neuroimaging (CCNi), Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, G12 8QB, Glasgow, UK
- Gregor Thut, Centre for Cognitive Neuroimaging (CCNi), Institute of Neuroscience and Psychology, University of Glasgow, 58 Hillhead Street, G12 8QB, Glasgow, UK
44
Filbrich L, Torta DM, Vanderclausen C, Azañón E, Legrain V. Using temporal order judgments to investigate attention bias toward pain and threat-related information. Methodological and theoretical issues. Conscious Cogn 2016; 41:135-8. [DOI: 10.1016/j.concog.2016.02.008]
45
Rayner LH, Lee KH, Woodruff PWR. Reduced attention-driven auditory sensitivity in hallucination-prone individuals. Br J Psychiatry 2015; 207:414-9. [PMID: 26382950] [DOI: 10.1192/bjp.bp.114.149799]
Abstract
BACKGROUND: Evidence suggests that auditory hallucinations may result from abnormally enhanced auditory sensitivity.
AIMS: To investigate whether there is an auditory processing bias in healthy individuals who are prone to experiencing auditory hallucinations.
METHOD: Two hundred healthy volunteers performed a temporal order judgement task in which they determined whether an auditory or a visual stimulus came first under conditions of directed attention ('attend-auditory' and 'attend-visual' conditions). The Launay-Slade Hallucination Scale was used to divide the sample into high and low hallucination-proneness groups.
RESULTS: The high hallucination-proneness group exhibited reduced sensitivity to auditory stimuli under the attend-auditory condition. By contrast, attention-directed visual sensitivity did not differ significantly between groups.
CONCLUSIONS: Healthy individuals prone to hallucinatory experiences may possess an attentional bias towards internal auditory stimuli at the expense of external sounds. Interventions involving the redistribution of attentional resources may have therapeutic benefit for patients experiencing auditory hallucinations.
Affiliation(s)
- Louise H Rayner, MBChB, BMedSci, Department of Neuroscience, University of Sheffield, Sheffield, UK
- Kwang-Hyuk Lee, PhD, Department of Neuroscience, University of Sheffield, Sheffield, UK
- Peter W R Woodruff, MRCPsych, PhD, Department of Neuroscience, University of Sheffield, Sheffield, UK
46
Hao Q, Ogata T, Ogawa KI, Kwon J, Miyake Y. The simultaneous perception of auditory-tactile stimuli in voluntary movement. Front Psychol 2015; 6:1429. [PMID: 26441799] [PMCID: PMC4585164] [DOI: 10.3389/fpsyg.2015.01429]
Abstract
The simultaneous perception of multimodal information in the environment during voluntary movement is very important for effective reactions to the environment. Previous studies have found that voluntary movement affects the simultaneous perception of auditory and tactile stimuli. However, the results of these experiments are not completely consistent, and the differences may be attributable to methodological differences between the studies. Here, we investigated the effect of voluntary movement on the simultaneous perception of auditory and tactile stimuli using a temporal order judgment task with voluntary movement, involuntary movement, and no movement. To eliminate the potential effect of stimulus predictability and the effect of spatial information associated with large-scale movement in the previous studies, we randomized the interval between the start of movement and the first stimulus, and used small-scale movement. The point of subjective simultaneity (PSS) shifted with movement type: during involuntary movement and no movement the PSS corresponded to the tactile stimulus coming first, whereas during voluntary movement it corresponded to the auditory stimulus coming first. The just noticeable difference (JND), an indicator of temporal resolution, did not differ across the three conditions. These results indicate that voluntary movement itself affects the PSS in auditory-tactile simultaneous perception but does not influence the JND. We suggest that simultaneous perception may be affected by the efference copy.
Affiliation(s)
- Qiao Hao, Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Taiki Ogata, Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan; Research into Artifacts, Center for Engineering (RACE), The University of Tokyo, Kashiwa, Japan
- Ken-Ichiro Ogawa, Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Jinhwan Kwon, Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Yoshihiro Miyake, Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
47
Stevenson RA, Segers M, Ferber S, Barense MD, Camarata S, Wallace MT. Keeping time in the brain: Autism spectrum disorder and audiovisual temporal processing. Autism Res 2015; 9:720-38. [PMID: 26402725] [DOI: 10.1002/aur.1566]
Abstract
A growing area of interest and relevance in the study of autism spectrum disorder (ASD) focuses on the relationship between multisensory temporal function and the behavioral, perceptual, and cognitive impairments observed in ASD. Atypical sensory processing is increasingly recognized as a core component of autism, with evidence of atypical processing across a number of sensory modalities. These deviations from typical processing underscore the value of interpreting ASD within a multisensory framework. Furthermore, converging evidence illustrates that these differences in audiovisual processing may be specifically related to temporal processing. This review seeks to bridge the connection between temporal processing and audiovisual perception, and to elaborate on emerging data showing differences in audiovisual temporal function in autism. We also discuss the consequences of such changes, the specific impact on the processing of different classes of audiovisual stimuli (e.g. speech vs. nonspeech), and the presumptive brain processes and networks underlying audiovisual temporal integration. Finally, possible downstream behavioral implications and remediation strategies are outlined.
Affiliation(s)
- Ryan A Stevenson, Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- Magali Segers, Department of Psychology, York University, Toronto, Ontario, Canada
- Susanne Ferber, Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Morgan D Barense, Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Stephen Camarata, Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee
- Mark T Wallace, Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Brain Institute, Vanderbilt University Medical Center, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Psychiatry, Vanderbilt University Medical Center, Nashville, Tennessee
48
Are the spatial features of bodily threat limited to the exact location where pain is expected? Acta Psychol (Amst) 2014; 153:113-9. [PMID: 25463551] [DOI: 10.1016/j.actpsy.2014.09.014]
Abstract
Previous research has revealed that anticipating pain at a particular location of the body prioritizes somatosensory input presented there. The present study tested whether the spatial features of bodily threat are limited to the exact location of nociception. Participants judged which of two tactile stimuli, presented to either hand, had been presented first, while occasionally experiencing a painful stimulus. The distance between the pain and tactile locations was manipulated. In Experiment 1, participants expected pain either proximal to one of the tactile stimuli (on the hand; near condition) or more distant on the same body part (arm; far condition). In Experiment 2, the painful stimulus was expected either proximal to one of the tactile stimuli (hand; near) or on a different body part on the same side of the body (leg; far). In the near condition of both experiments, participants became aware of tactile stimuli presented to the "threatened" hand more quickly than of those presented to the "neutral" hand. Of particular interest, the far conditions showed a similar prioritization effect both when pain was expected at a different location of the same body part and when pain was expected at a different body part on the same side of the body. In this study, the encoding of spatial features of bodily threat was not limited to the exact location where pain was anticipated but rather generalized to the entire body part and even to different body parts on the same side of the body.
49
Selective attention modulates the direction of audio-visual temporal recalibration. PLoS One 2014; 9:e99311. [PMID: 25004132] [PMCID: PMC4086723] [DOI: 10.1371/journal.pone.0099311]
Abstract
Temporal recalibration of cross-modal synchrony has been proposed as a mechanism to compensate for timing differences between sensory modalities. However, far from the rich complexity of everyday sensory environments, most studies to date have examined recalibration with isolated cross-modal pairings. Here, we hypothesize that selective attention might provide an effective filter to help resolve which stimuli are selected when multiple events compete for recalibration. We addressed this question by testing audio-visual recalibration following an adaptation phase in which two opposing audio-visual asynchronies were present. The direction of voluntary visual attention, and therefore which of the two possible asynchronies (flash leading or flash lagging) was attended, was manipulated using colour as a selection criterion. We found a shift in the point of subjective audio-visual simultaneity as a function of whether the observer had focused attention on audio-then-flash or on flash-then-audio groupings during the adaptation phase. A baseline adaptation condition revealed that this effect of endogenous attention was only effective toward the lagging flash. This hints at the role of exogenous capture and/or additional endogenous effects producing an asymmetry toward the leading flash. We conclude that selective attention helps promote selected audio-visual pairings to be combined and subsequently adjusted in time, but stimulus organization exerts a strong impact on recalibration. We tentatively hypothesize that the resolution of recalibration in complex scenarios involves the orchestration of top-down selection mechanisms and stimulus-driven processes.
50
Gotow N, Kobayakawa T. Construction of a measurement system for simultaneity judgment using odor and taste stimuli. J Neurosci Methods 2014; 221:132-8. [DOI: 10.1016/j.jneumeth.2013.09.020]