1. Fletcher MD, Akis E, Verschuur CA, Perry SW. Improved tactile speech perception and noise robustness using audio-to-tactile sensory substitution with amplitude envelope expansion. Sci Rep 2024; 14:15029. PMID: 38951556; PMCID: PMC11217272; DOI: 10.1038/s41598-024-65510-6.
Abstract
Recent advances in haptic technology could allow haptic hearing aids, which convert audio to tactile stimulation, to become viable for supporting people with hearing loss. A tactile vocoder strategy for audio-to-tactile conversion, which exploits these advances, has recently shown significant promise. In this strategy, the amplitude envelope is extracted from several audio frequency bands and used to modulate the amplitude of a set of vibro-tactile tones. The vocoder strategy allows good consonant discrimination, but vowel discrimination is poor and the strategy is susceptible to background noise. In the current study, we assessed whether multi-band amplitude envelope expansion can effectively enhance critical vowel features, such as formants, and improve speech extraction from noise. In 32 participants with normal touch perception, tactile-only phoneme discrimination with and without envelope expansion was assessed both in quiet and in background noise. Envelope expansion improved performance in quiet by 10.3% for vowels and by 5.9% for consonants. In noise, envelope expansion improved overall phoneme discrimination by 9.6%, with no difference in benefit between consonants and vowels. The tactile vocoder with envelope expansion can be deployed in real-time on a compact device and could substantially improve clinical outcomes for a new generation of haptic hearing aids.
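The tactile vocoder pipeline described in this abstract (band-pass filtering, envelope extraction, expansion, tone modulation) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the sample rate, band edges, tactile tone frequencies, filter orders, and expansion exponent below are all assumed values.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 16000  # assumed audio sample rate (Hz)

def band_envelope(audio, lo, hi, fs=FS, env_cutoff=30.0):
    """Band-pass filter one audio frequency band, then extract its amplitude
    envelope by full-wave rectification and low-pass smoothing."""
    band = sosfilt(butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos"), audio)
    env = sosfilt(butter(2, env_cutoff, btype="lowpass", fs=fs, output="sos"),
                  np.abs(band))
    return np.clip(env, 0.0, None)  # remove small negative filter ripple

def expand(env, exponent=2.0):
    """Amplitude envelope expansion: raising the normalised envelope to a
    power > 1 exaggerates envelope peaks (e.g. formant energy) relative to
    the noise floor."""
    peak = max(float(env.max()), 1e-12)
    return peak * (env / peak) ** exponent

def tactile_vocoder(audio, band_edges, tone_freqs, fs=FS, exponent=2.0):
    """Modulate one vibro-tactile tone per audio band with that band's
    (expanded) amplitude envelope and sum the result."""
    t = np.arange(len(audio)) / fs
    out = sum(expand(band_envelope(audio, lo, hi, fs), exponent)
              * np.sin(2 * np.pi * f * t)
              for (lo, hi), f in zip(band_edges, tone_freqs))
    return out / max(float(np.abs(out).max()), 1e-12)  # normalise to +/-1

# example: a 1 s, 500 Hz tone mapped onto two tactile tones (assumed bands)
audio = np.sin(2 * np.pi * 500 * np.arange(FS) / FS)
out = tactile_vocoder(audio, [(100, 800), (800, 3000)], [50, 150])
```

With `exponent=1.0` this reduces to the plain tactile vocoder; exponents above 1 sharpen envelope contrasts, which is the manipulation the study tested.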
Affiliation(s)
- Mark D Fletcher
- University of Southampton Auditory Implant Service, University of Southampton, University Road, Southampton, SO17 1BJ, UK
- Institute of Sound and Vibration Research, University of Southampton, University Road, Southampton, SO17 1BJ, UK
- Esma Akis
- University of Southampton Auditory Implant Service, University of Southampton, University Road, Southampton, SO17 1BJ, UK
- Institute of Sound and Vibration Research, University of Southampton, University Road, Southampton, SO17 1BJ, UK
- Carl A Verschuur
- University of Southampton Auditory Implant Service, University of Southampton, University Road, Southampton, SO17 1BJ, UK
- Samuel W Perry
- University of Southampton Auditory Implant Service, University of Southampton, University Road, Southampton, SO17 1BJ, UK
- Institute of Sound and Vibration Research, University of Southampton, University Road, Southampton, SO17 1BJ, UK
2. Kufer K, Schmitter CV, Kircher T, Straube B. Temporal recalibration in response to delayed visual feedback of active versus passive actions: an fMRI study. Sci Rep 2024; 14:4632. PMID: 38409306; PMCID: PMC10897428; DOI: 10.1038/s41598-024-54660-2.
Abstract
The brain can adapt its expectations about the relative timing of actions and their sensory outcomes in a process known as temporal recalibration. This might occur as the recalibration of timing between the sensory (e.g. visual) outcome and (1) the motor act (sensorimotor) or (2) tactile/proprioceptive information (inter-sensory). This fMRI study investigated sensorimotor contributions to temporal recalibration by comparing active and passive conditions. Subjects were repeatedly exposed to delayed (150 ms) or undelayed visual stimuli, triggered by active or passive button presses. Recalibration effects were tested in delay detection tasks with visual and auditory outcomes. We showed that both modalities were affected by visual recalibration; however, an active advantage was observed only in visual conditions. Recalibration was generally associated with the left cerebellum (lobules IV, V, and vermis), while action-related activation (active > passive) occurred in the right middle/superior frontal gyri during adaptation and test phases. Recalibration transfer from vision to audition was related to action-specific activations in the cingulate cortex, the angular gyrus, and the left inferior frontal gyrus. Our data provide new insights into sensorimotor contributions to temporal recalibration via the middle/superior frontal gyri and inter-sensory contributions mediated by the cerebellum.
Affiliation(s)
- Konstantin Kufer
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Strasse 8, 35039, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032, Marburg, Germany
- Christina V Schmitter
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Strasse 8, 35039, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032, Marburg, Germany
- Tilo Kircher
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Strasse 8, 35039, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032, Marburg, Germany
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Strasse 8, 35039, Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032, Marburg, Germany
3. Ten Oever S, Martin AE. Interdependence of "What" and "When" in the Brain. J Cogn Neurosci 2024; 36:167-186. PMID: 37847823; DOI: 10.1162/jocn_a_02067.
Abstract
From a brain's-eye view, when a stimulus occurs and what it is are interrelated aspects of interpreting the perceptual world. Yet in practice, the putative perceptual inferences about sensory content and timing are often dichotomized and not investigated as an integrated process. We here argue that neural temporal dynamics can influence what is perceived and, in turn, that stimulus content can influence the time at which perception is achieved. This computational principle results from the highly interdependent relationship of what and when in the environment. Both brain processes and perceptual events display strong temporal variability that is not always modeled; we argue that understanding, and at a minimum modeling, this temporal variability is key for theories of how the brain generates unified and consistent neural representations, and that we ignore temporal variability in our analysis practice at the peril of both data interpretation and theory-building. Here, we review what-and-when interactions in the brain, demonstrate via simulations how temporal variability can result in misguided interpretations and conclusions, and outline how to integrate and synthesize what and when in theories and models of brain computation.
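The simulation argument can be illustrated in a few lines: when single-trial responses are identical except for trial-to-trial latency jitter, the trial average is lower and broader than any single trial, inviting the misguided conclusion that the response itself is weak or sluggish. All numbers below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)  # time axis (s)

def single_trial(latency, width=0.05):
    """One simulated evoked response: a unit-amplitude Gaussian pulse."""
    return np.exp(-0.5 * ((t - latency) / width) ** 2)

# 200 trials with an identical response shape but jittered latency ('when')
latencies = 0.4 + rng.normal(0.0, 0.08, size=200)
avg = np.mean([single_trial(l) for l in latencies], axis=0)

# Every single trial peaks at 1.0, yet the average peaks far lower and
# spreads wider: unmodeled 'when' variability distorts the inferred 'what'.
print(float(avg.max()))
```

Analytically, averaging Gaussian pulses of width w over Gaussian latency jitter of SD s scales the peak by w / sqrt(w^2 + s^2), about 0.53 for the assumed values here.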
Affiliation(s)
- Sanne Ten Oever
- Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Donders Centre for Cognitive Neuroimaging, Nijmegen, The Netherlands
- Maastricht University, The Netherlands
- Andrea E Martin
- Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands
- Donders Centre for Cognitive Neuroimaging, Nijmegen, The Netherlands
4. Zhu H, Tang X, Chen T, Yang J, Wang A, Zhang M. Audiovisual illusion training improves multisensory temporal integration. Conscious Cogn 2023; 109:103478. PMID: 36753896; DOI: 10.1016/j.concog.2023.103478.
Abstract
When we perceive external physical stimuli from the environment, the brain must remain somewhat flexible to temporally misaligned stimuli within a specific range, as multisensory signals are subject to different transmission and processing delays. Recent studies have shown that the width of the temporal binding window (TBW) can be reduced by perceptual learning. To date, however, the vast majority of studies examining the mechanisms of perceptual learning have focused on experience-dependent effects and have not reached a consensus on their relationship with perception of audiovisual illusions. Training on the sound-induced flash illusion (SiFI) is a reliable means of improving perceptual sensitivity. The present study used the classic auditory-dominated SiFI paradigm with feedback training to investigate the effect of five days of SiFI training on multisensory temporal integration, as evaluated by a simultaneity judgment (SJ) task and a temporal order judgment (TOJ) task. We demonstrate that audiovisual illusion training improves the precision of multisensory temporal integration in two ways: (i) the point of subjective simultaneity (PSS) shifts toward physical simultaneity (0 ms), and (ii) the TBW narrows. The results are consistent with a Bayesian model of causal inference, suggesting that perceptual learning reduces susceptibility to the SiFI while improving the precision of audiovisual temporal estimation.
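The two outcome measures named above, PSS and TBW, are commonly estimated by fitting a Gaussian to the proportion of "simultaneous" responses across stimulus onset asynchronies (SOAs). A sketch with hypothetical pre-/post-training data (invented for illustration, not the study's data):

```python
import numpy as np
from scipy.optimize import curve_fit

def sj_curve(soa, pss, sigma, amp):
    """Proportion of 'simultaneous' responses as a Gaussian of SOA:
    the peak location is the PSS; sigma indexes the TBW width."""
    return amp * np.exp(-0.5 * ((soa - pss) / sigma) ** 2)

# hypothetical simultaneity-judgement data (SOA in ms, visual lead negative)
soas   = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], float)
p_pre  = np.array([0.10, 0.30, 0.70, 0.85, 0.90, 0.95, 0.80, 0.40, 0.15])
p_post = np.array([0.05, 0.15, 0.55, 0.85, 0.95, 0.90, 0.60, 0.20, 0.05])

(pss_pre,  tbw_pre,  _), _ = curve_fit(sj_curve, soas, p_pre,  p0=(20, 120, 0.9))
(pss_post, tbw_post, _), _ = curve_fit(sj_curve, soas, p_post, p0=(0, 100, 0.9))

# Training shows up as a smaller fitted sigma (narrower temporal binding
# window); a PSS nearer 0 ms indicates less bias toward one modality.
```

The TBW is then often reported as a criterion width of this fitted curve (e.g. the SOA range where the curve exceeds some proportion), which is monotonic in `sigma`.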
Affiliation(s)
- Haocheng Zhu
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- Xiaoyu Tang
- School of Psychology, Liaoning Collaborative Innovation Center of Children and Adolescents Healthy Personality Assessment and Cultivation, Liaoning Normal University, Dalian, China
- Tingji Chen
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- Jiajia Yang
- Applied Brain Science Lab, Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Aijun Wang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- Ming Zhang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
5. The development of audio-visual temporal precision precedes its rapid recalibration. Sci Rep 2022; 12:21591. PMID: 36517503; PMCID: PMC9751280; DOI: 10.1038/s41598-022-25392-y.
Abstract
Through development, multisensory systems reach a balance between stability and flexibility: they optimally integrate cross-modal signals from the same events while remaining adaptive to environmental changes. Is continuous intersensory recalibration required to shape optimal integration mechanisms, or does multisensory integration develop prior to recalibration? Here, we examined the development of multisensory integration and rapid recalibration in the temporal domain by re-analyzing published datasets for audio-visual, audio-tactile, and visual-tactile combinations. Results showed that children reach an adult level of precision in audio-visual simultaneity perception, and show the first signs of rapid recalibration, at 9 years of age. In contrast, rapid recalibration was very weak for the other cross-modal combinations at all ages, even when adult levels of temporal precision had developed. Thus, the development of audio-visual rapid recalibration appears to require the maturation of temporal precision. It may serve to accommodate distance-dependent differences in the travel times of light and sound.
6. Schmitter CV, Straube B. The impact of cerebellar transcranial direct current stimulation (tDCS) on sensorimotor and inter-sensory temporal recalibration. Front Hum Neurosci 2022; 16:998843. PMID: 36111210; PMCID: PMC9468227; DOI: 10.3389/fnhum.2022.998843.
Abstract
The characteristic temporal relationship between actions and their sensory outcomes allows us to distinguish self-generated from externally generated sensory events. However, the complex sensory environment can cause transient delays between action and outcome, calling for flexible recalibration of predicted sensorimotor timing. Since the neural underpinnings of this process are largely unknown, this study investigated the involvement of the cerebellum by means of cerebellar transcranial direct current stimulation (ctDCS). While receiving anodal, cathodal, dual-hemisphere, or sham ctDCS, participants were exposed in an adaptation phase to constant delays of 150 ms between actively or passively generated button presses and visual sensory outcomes. Recalibration in the same sensory modality (visual outcome) and in another sensory modality (auditory outcome) was assessed in a subsequent test phase, during which variable delays between button press and visual or auditory outcome had to be detected. Results indicated that temporal recalibration occurred in audition after anodal ctDCS, while it was absent in vision. As the adaptation modality was visual, effects in audition suggest that recalibration occurred at a supra-modal level. In active conditions, anodal ctDCS improved sensorimotor recalibration at the delay level closest to the adaptation delay, suggesting a precise cerebellar-dependent temporal recalibration mechanism. In passive conditions, the facilitation of inter-sensory recalibration by anodal ctDCS was stronger overall and tuned to larger delays. These findings point to a role of the cerebellum in supra-modal temporal recalibration across sensorimotor and perceptual domains, while the differential manifestation of the effect across delay levels in active and passive conditions points to differences in the underlying mechanisms depending on the availability of action-based predictions. Furthermore, these results suggest that anodal ctDCS can be a promising tool for facilitating temporal recalibration in sensorimotor and inter-sensory contexts.
Affiliation(s)
- Christina V. Schmitter
- Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Center for Mind, Brain and Behavior, University of Marburg and Justus Liebig University Giessen, Marburg, Germany
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, University of Marburg, Marburg, Germany
- Center for Mind, Brain and Behavior, University of Marburg and Justus Liebig University Giessen, Marburg, Germany
7. Botan V, Salisbury A, Critchley HD, Ward J. Vicarious pain is an outcome of atypical body ownership: Evidence from the rubber hand illusion and enfacement illusion. Q J Exp Psychol (Hove) 2021; 74:1888-1899. PMID: 34049467; PMCID: PMC8450990; DOI: 10.1177/17470218211024822.
Abstract
Some people report localised pain on their body when seeing other people in pain (sensory-localised vicarious pain responders). In this study, we assess whether this is related to atypical computations of body ownership which, in paradigms such as the rubber hand illusion (RHI), can be conceptualised as a Bayesian inference as to whether multiple sources of sensory information (visual, somatosensory) belong together on a single body (one's own) or are distributed across several bodies (vision = other, somatosensory = self). According to this model, computations of body ownership depend on the degree (and precision) of sensory evidence, rather than synchrony per se. Sensory-localised vicarious pain responders exhibit the RHI following synchronous stroking and, unusually, also after asynchronous stroking. Importantly, this occurs only in asynchronous conditions in which the stroking is predictable (alternating) rather than unpredictable (random). There was no evidence that their bottom-up proprioceptive signals are less precise, suggesting individual differences in the top-down weighting of sensory evidence. Finally, the enfacement illusion (EI) was also employed as a conceptually related bodily illusion paradigm that involves a completely different response judgement (based on vision rather than proprioception). Sensory-localised responders show a comparable pattern on this task after synchronous and asynchronous stroking. This is consistent with the idea that they have top-down (prior) differences in the way body ownership is inferred that transcend the exact judgement being made (visual or proprioceptive).
Affiliation(s)
- Vanessa Botan
- School of Psychology, University of Sussex, Brighton, UK
- Sackler Centre for Consciousness Science, Brighton, UK
- Hugo D Critchley
- School of Psychology, University of Sussex, Brighton, UK
- Sackler Centre for Consciousness Science, Brighton, UK
- Brighton and Sussex Medical School, Brighton, UK
- Jamie Ward
- School of Psychology, University of Sussex, Brighton, UK
- Sackler Centre for Consciousness Science, Brighton, UK
8. Fletcher MD. Can Haptic Stimulation Enhance Music Perception in Hearing-Impaired Listeners? Front Neurosci 2021; 15:723877. PMID: 34531717; PMCID: PMC8439542; DOI: 10.3389/fnins.2021.723877.
Abstract
Cochlear implants (CIs) have been remarkably successful at restoring hearing in severely-to-profoundly hearing-impaired individuals. However, users often struggle to deconstruct complex auditory scenes with multiple simultaneous sounds, which can result in reduced music enjoyment and impaired speech understanding in background noise. Hearing aid users often have similar issues, though these are typically less acute. Several recent studies have shown that haptic stimulation can enhance CI listening by giving access to sound features that are poorly transmitted through the electrical CI signal. This "electro-haptic stimulation" improves melody recognition and pitch discrimination, as well as speech-in-noise performance and sound localization. The success of this approach suggests it could also enhance auditory perception in hearing-aid users and other hearing-impaired listeners. This review focuses on the use of haptic stimulation to enhance music perception in hearing-impaired listeners. Music is prevalent throughout everyday life, being critical to media such as film and video games, and often being central to events such as weddings and funerals. It represents the biggest challenge for signal processing, as it is typically an extremely complex acoustic signal, containing multiple simultaneous harmonic and inharmonic sounds. Signal-processing approaches developed for enhancing music perception could therefore have significant utility for other key issues faced by hearing-impaired listeners, such as understanding speech in noisy environments. This review first discusses the limits of music perception in hearing-impaired listeners and the limits of the tactile system. It then discusses the evidence around integration of audio and haptic stimulation in the brain. Next, the features, suitability, and success of current haptic devices for enhancing music perception are reviewed, as well as the signal-processing approaches that could be deployed in future haptic devices. Finally, the cutting-edge technologies that could be exploited for enhancing music perception with haptics are discussed. These include the latest micro motor and driver technology, low-power wireless technology, machine learning, big data, and cloud computing. New approaches for enhancing music perception in hearing-impaired listeners could substantially improve quality of life. Furthermore, effective haptic techniques for providing complex sound information could offer a non-invasive, affordable means for enhancing listening more broadly in hearing-impaired individuals.
Affiliation(s)
- Mark D Fletcher
- University of Southampton Auditory Implant Service, Faculty of Engineering and Physical Sciences, University of Southampton, Southampton, United Kingdom
- Institute of Sound and Vibration Research, Faculty of Engineering and Physical Sciences, University of Southampton, Southampton, United Kingdom
9. Fletcher MD, Verschuur CA. Electro-Haptic Stimulation: A New Approach for Improving Cochlear-Implant Listening. Front Neurosci 2021; 15:581414. PMID: 34177440; PMCID: PMC8219940; DOI: 10.3389/fnins.2021.581414.
Abstract
Cochlear implants (CIs) have been remarkably successful at restoring speech perception for severely to profoundly deaf individuals. Despite their success, several limitations remain, particularly in CI users' ability to understand speech in noisy environments, locate sound sources, and enjoy music. A new multimodal approach has been proposed that uses haptic stimulation to provide sound information that is poorly transmitted by the implant. This augmenting of the electrical CI signal with haptic stimulation (electro-haptic stimulation; EHS) has been shown to improve speech-in-noise performance and sound localization in CI users. There is also evidence that it could enhance music perception. We review the evidence of EHS enhancement of CI listening and discuss key areas where further research is required. These include understanding the neural basis of EHS enhancement, understanding the effectiveness of EHS across different clinical populations, and the optimization of signal-processing strategies. We also discuss the significant potential for a new generation of haptic neuroprosthetic devices to aid those who cannot access hearing-assistive technology, either because of biomedical or healthcare-access issues. While significant further research and development is required, we conclude that EHS represents a promising new approach that could, in the near future, offer a non-invasive, inexpensive means of substantially improving clinical outcomes for hearing-impaired individuals.
Affiliation(s)
- Mark D. Fletcher
- Faculty of Engineering and Physical Sciences, University of Southampton Auditory Implant Service, University of Southampton, Southampton, United Kingdom
- Faculty of Engineering and Physical Sciences, Institute of Sound and Vibration Research, University of Southampton, Southampton, United Kingdom
- Carl A. Verschuur
- Faculty of Engineering and Physical Sciences, University of Southampton Auditory Implant Service, University of Southampton, Southampton, United Kingdom
10. Fletcher MD. Using haptic stimulation to enhance auditory perception in hearing-impaired listeners. Expert Rev Med Devices 2020; 18:63-74. PMID: 33372550; DOI: 10.1080/17434440.2021.1863782.
Abstract
INTRODUCTION: Hearing-assistive devices, such as hearing aids and cochlear implants, transform the lives of hearing-impaired people. However, users often struggle to locate and segregate sounds. This leads to impaired threat detection and an inability to understand speech in noisy environments. Recent evidence suggests that segregation and localization can be improved by providing missing sound information through haptic stimulation.
AREAS COVERED: This article reviews the evidence that haptic stimulation can effectively provide sound information. It then discusses the research and development required for this approach to be implemented in a clinically viable device, including what sound information should be provided and how that information can be extracted and delivered.
EXPERT OPINION: Although this research area has only recently emerged, it builds on a significant body of work showing that sound information can be effectively transferred through haptic stimulation. Current evidence suggests that haptic stimulation is highly effective at providing missing sound information to cochlear implant users. However, a great deal of work remains to implement this approach in an effective wearable device. If successful, such a device could offer an inexpensive, noninvasive means of improving educational, work, and social experiences for hearing-impaired individuals, including those without access to hearing-assistive devices.
Affiliation(s)
- Mark D Fletcher
- University of Southampton Auditory Implant Service, Southampton, UK
- Institute of Sound and Vibration Research, University of Southampton, Southampton, UK
11. Gu L, Mei X, Wu Q, Huang Y, Wu X. Temporal recalibration in vision requires location-based binding. Cognition 2020; 207:104510. PMID: 33187640; DOI: 10.1016/j.cognition.2020.104510.
Abstract
Occupying the same location and occurring at the same time are the essential spatial and temporal conditions for different features of a natural event or object to be integrated. Audio-visual temporal recalibration, as a temporal integration mechanism, refers to the brain's capacity to perceive simultaneity by adjusting for differential delays in the transmission of auditory and visual signals. Co-localization of auditory and visual information, however, has been found not to be necessary for audio-visual temporal recalibration to occur. Here, we show that after exposure to a time lag between a visual flash and a visual collision, simultaneity responses were shifted toward the adaptation lag in a bound condition, where the flash and collision belonged to the same object, but not in a separate condition, where the flash and collision belonged to spatially separated objects. The results demonstrate that location-based binding is a requisite for temporal recalibration within the visual modality. Our finding suggests that the brain takes the modality difference in object localization into consideration when integrating temporally asynchronous signals.
Affiliation(s)
- Li Gu
- State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University, Guangzhou 510060, China
- Department of Psychology, Sun Yat-Sen University, Guangzhou, China
- Xiaolin Mei
- Department of Psychology, Sun Yat-Sen University, Guangzhou, China
- Qian Wu
- Department of Psychology, Sun Yat-Sen University, Guangzhou, China
- Yingyu Huang
- Department of Psychology, Sun Yat-Sen University, Guangzhou, China
- Xiang Wu
- Department of Psychology, Sun Yat-Sen University, Guangzhou, China
12. Badde S, Ley P, Rajendran SS, Shareef I, Kekunnaya R, Röder B. Sensory experience during early sensitive periods shapes cross-modal temporal biases. eLife 2020; 9:e61238. PMID: 32840213; PMCID: PMC7476755; DOI: 10.7554/eLife.61238.
Abstract
Typical human perception features stable biases such as perceiving visual events as later than synchronous auditory events. The origin of such perceptual biases is unknown. To investigate the role of early sensory experience, we tested whether a congenital, transient loss of pattern vision, caused by bilateral dense cataracts, has sustained effects on audio-visual and tactile-visual temporal biases and resolution. Participants judged the temporal order of successively presented, spatially separated events within and across modalities. Individuals with reversed congenital cataracts showed a bias towards perceiving visual stimuli as occurring earlier than auditory (Expt. 1) and tactile (Expt. 2) stimuli. This finding stood in stark contrast to normally sighted controls and sight-recovery individuals who had developed cataracts later in childhood: both groups exhibited the typical bias of perceiving vision as delayed compared to audition. These findings provide strong evidence that cross-modal temporal biases depend on sensory experience during an early sensitive period.
Affiliation(s)
- Stephanie Badde
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Department of Psychology and Center of Neural Science, New York University, New York, United States
- Pia Ley
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Siddhart S Rajendran
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Child Sight Institute, Jasti V Ramanamma Children's Eye Care Center, LV Prasad Eye Institute, Hyderabad, India
- Idris Shareef
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Child Sight Institute, Jasti V Ramanamma Children's Eye Care Center, LV Prasad Eye Institute, Hyderabad, India
- Ramesh Kekunnaya
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
- Child Sight Institute, Jasti V Ramanamma Children's Eye Care Center, LV Prasad Eye Institute, Hyderabad, India
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
13. Smit S, Rich AN, Zopf R. Visual body form and orientation cues do not modulate visuo-tactile temporal integration. PLoS One 2019; 14:e0224174. PMID: 31841510; PMCID: PMC6913941; DOI: 10.1371/journal.pone.0224174.
Abstract
Body ownership relies on spatiotemporal correlations between multisensory signals and visual cues specifying oneself such as body form and orientation. The mechanism for the integration of bodily signals remains unclear. One approach to model multisensory integration that has been influential in the multisensory literature is Bayesian causal inference. This specifies that the brain integrates spatial and temporal signals coming from different modalities when it infers a common cause for inputs. As an example, the rubber hand illusion shows that visual form and orientation cues can promote the inference of a common cause (one's body) leading to spatial integration shown by a proprioceptive drift of the perceived location of the real hand towards the rubber hand. Recent studies investigating the effect of visual cues on temporal integration, however, have led to conflicting findings. These could be due to task differences, variation in ecological validity of stimuli and/or small samples. In this pre-registered study, we investigated the influence of visual information on temporal integration using a visuo-tactile temporal order judgement task with realistic stimuli and a sufficiently large sample determined by Bayesian analysis. Participants viewed videos of a touch being applied to plausible or implausible visual stimuli for one's hand (hand oriented plausibly, hand rotated 180 degrees, or a sponge) while also being touched at varying stimulus onset asynchronies. Participants judged which stimulus came first: viewed or felt touch. Results show that visual cues do not modulate visuo-tactile temporal order judgements. This is not in line with the idea that bodily signals indicating oneself influence the integration of multisensory signals in the temporal domain. The current study emphasises the importance of rigour in our methodologies and analyses to advance the understanding of how properties of multisensory events affect the encoding of temporal information in the brain.
Affiliation(s)
- Sophie Smit
- Perception in Action Research Centre & Department of Cognitive Science, Faculty of Human Sciences, Macquarie University, Sydney, Australia
- Anina N. Rich
- Perception in Action Research Centre & Department of Cognitive Science, Faculty of Human Sciences, Macquarie University, Sydney, Australia
- Centre for Elite Performance, Expertise & Training, Macquarie University, Sydney, Australia
- Regine Zopf
- Perception in Action Research Centre & Department of Cognitive Science, Faculty of Human Sciences, Macquarie University, Sydney, Australia
- Body Image and Ingestion Group, Macquarie University, Sydney, Australia

14
Recio RS, Cravo AM, de Camargo RY, van Wassenhove V. Dissociating the sequential dependency of subjective temporal order from subjective simultaneity. PLoS One 2019; 14:e0223184. [PMID: 31596862 PMCID: PMC6785056 DOI: 10.1371/journal.pone.0223184] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2018] [Accepted: 09/16/2019] [Indexed: 11/20/2022] Open
Abstract
The physical simultaneity between two events can differ from our point of subjective simultaneity (PSS). Studies using simultaneity judgments (SJ) and temporal order judgments (TOJ) tasks have shown that whether two events are reported as simultaneous is highly context-dependent. It has been recently suggested that the interval between the two events in the previous trial can modulate judgments both in SJ and TOJ tasks, an effect named rapid recalibration. In this work, we investigated rapid recalibration in SJ and TOJ tasks and tested whether centering the range of presented intervals on perceived simultaneity modulated this effect. We found a rapid recalibration effect in TOJ, but not in SJ. Moreover, we found that centering the intervals on objective or subjective simultaneity did not change the pattern of results. Interestingly, we also found no correlations between an individual’s PSS in TOJ and in SJ tasks, which corroborates other studies in suggesting that these two psychophysical measures may capture different processes.
Affiliation(s)
- Renan Schiavolin Recio
- Centro de Matemática, Computação e Cognição (CMCC), Universidade Federal do ABC (UFABC), São Bernardo do Campo, SP, Brazil
- André Mascioli Cravo
- Centro de Matemática, Computação e Cognição (CMCC), Universidade Federal do ABC (UFABC), São Bernardo do Campo, SP, Brazil
- Raphael Yokoingawa de Camargo
- Centro de Matemática, Computação e Cognição (CMCC), Universidade Federal do ABC (UFABC), São Bernardo do Campo, SP, Brazil
- Virginie van Wassenhove
- Cognitive Neuroimaging Unit CEA DRF/Joliot, INSERM, Université Paris-Sud, Université Paris-Saclay, NeuroSpin Center, Gif-sur-Yvette, Paris, France

15
Rapid recalibration to audiovisual asynchrony follows the physical-not the perceived-temporal order. Atten Percept Psychophys 2019; 80:2060-2068. [PMID: 29968078 DOI: 10.3758/s13414-018-1540-9] [Citation(s) in RCA: 19] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
In natural scenes, audiovisual events deriving from the same source are synchronized at their origin. However, from the perspective of the observer, there are likely to be significant multisensory delays due to physical and neural latencies. Fortunately, our brain appears to compensate for the resulting latency differences by rapidly adapting to asynchronous audiovisual events by shifting the point of subjective synchrony (PSS) in the direction of the leading modality of the most recent event. Here we examined whether it is the perceived modality order of this prior lag or its physical order that determines the direction of the subsequent rapid recalibration. On each experimental trial, a brief tone pip and flash were presented across a range of stimulus onset asynchronies (SOAs). The participants' task alternated over trials: On adaptor trials, audition either led or lagged vision with fixed SOAs, and participants judged the order of the audiovisual event; on test trials, the SOA as well as the modality order varied randomly, and participants judged whether or not the event was synchronized. For test trials, we showed that the PSS shifted in the direction of the physical rather than the perceived (reported) modality order of the preceding adaptor trial. These results suggest that rapid temporal recalibration is determined by the physical timing of the preceding events, not by one's prior perceptual decisions.
16
Cai C, Ogawa K, Kochiyama T, Tanaka H, Imamizu H. Temporal recalibration of motor and visual potentials in lag adaptation in voluntary movement. Neuroimage 2018; 172:654-662. [PMID: 29428581 DOI: 10.1016/j.neuroimage.2018.02.015] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2017] [Revised: 12/11/2017] [Accepted: 02/07/2018] [Indexed: 11/29/2022] Open
Abstract
Adaptively recalibrating motor-sensory asynchrony is critical for animals to perceive self-produced action consequences. It is controversial whether motor- or sensory-related neural circuits recalibrate this asynchrony. By combining magnetoencephalography (MEG) and functional MRI (fMRI), we investigate the temporal changes in brain activities caused by repeated exposure to a 150-ms delay inserted between a button-press action and a subsequent flash. We found that readiness potentials significantly shift later in the motor system, especially in parietal regions (average: 219.9 ms), while visually evoked potentials significantly shift earlier in occipital regions (average: 49.7 ms) in the delay condition compared to the no-delay condition. Moreover, the shift in readiness potentials, but not in visually evoked potentials, was significantly correlated with the psychophysical measure of motor-sensory adaptation. These results suggest that although both motor and sensory processes contribute to the recalibration, the motor process plays the major role, given the magnitudes of shift and the correlation with the psychophysical measure.
Affiliation(s)
- Chang Cai
- Cognitive Mechanisms Laboratories, Advanced Telecommunications Research Institute International, Keihanna Science City, Kyoto 619-0288, Japan; Center for Information and Neural Networks, National Institute of Information and Communications Technology and Osaka University, Suita, Osaka 565-0871, Japan
- Kenji Ogawa
- Cognitive Mechanisms Laboratories, Advanced Telecommunications Research Institute International, Keihanna Science City, Kyoto 619-0288, Japan; Department of Psychology, Graduate School of Letters, Hokkaido University, Sapporo, Hokkaido 060-0810, Japan
- Takanori Kochiyama
- Brain Activity Imaging Center, ATR-Promotions, Keihanna Science City, Kyoto 619-0288, Japan
- Hirokazu Tanaka
- School of Information Science, Japan Advanced Institute of Science and Technology, Nomi, Ishikawa 923-1211, Japan
- Hiroshi Imamizu
- Cognitive Mechanisms Laboratories, Advanced Telecommunications Research Institute International, Keihanna Science City, Kyoto 619-0288, Japan; Center for Information and Neural Networks, National Institute of Information and Communications Technology and Osaka University, Suita, Osaka 565-0871, Japan; Department of Psychology, Graduate School of Humanities and Sociology, The University of Tokyo, Tokyo 113-0033, Japan

17
Sugano Y, Keetels M, Vroomen J. Audio-motor but not visuo-motor temporal recalibration speeds up sensory processing. PLoS One 2017; 12:e0189242. [PMID: 29216307 PMCID: PMC5720774 DOI: 10.1371/journal.pone.0189242] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2017] [Accepted: 11/24/2017] [Indexed: 11/18/2022] Open
Abstract
Perception of synchrony between one's own action (a finger tap) and the sensory feedback thereof (a visual flash or an auditory pip) can be recalibrated after exposure to an artificially inserted delay between them (temporal recalibration effect: TRE). TRE might be mediated by a compensatory shift of motor timing (when did I tap?) and/or the sensory timing of the feedback (when did I hear/see the feedback?). To examine this, we asked participants to voluntarily tap their index finger at a constant pace while receiving visual or auditory feedback (a flash or pip) that was either synced or somewhat delayed relative to the tap. Following this exposure phase, they then performed a simple reaction time (RT) task to measure the sensory timing of the exposure stimulus, and a sensorimotor synchronization (SMS) task (tapping in synchrony with a flash or pip as pacing stimulus) to measure the point of subjective synchrony between the tap and pacing stimulus. The results showed that after exposure to delayed auditory feedback, participants tapped earlier (~21.5 ms) relative to auditory pacing stimuli (= temporal recalibration) and reacted faster (~5.6 ms) to auditory stimuli. For visual exposure and test stimuli, there were no such compensatory effects. These results indicate that adjustments of audio-motor synchrony can to some extent be explained by a change in the speed of auditory sensory processing. We discuss this in terms of an attentional modulation of sensory processing.
Affiliation(s)
- Yoshimori Sugano
- Department of Industrial Management, Kyushu Sangyo University, Fukuoka, Japan
- Mirjam Keetels
- Department of Cognitive Neuropsychology, Tilburg University, Tilburg, the Netherlands
- Jean Vroomen
- Department of Cognitive Neuropsychology, Tilburg University, Tilburg, the Netherlands

18
Alais D, Ho T, Han S, Van der Burg E. A Matched Comparison Across Three Different Sensory Pairs of Cross-Modal Temporal Recalibration From Sustained and Transient Adaptation. Iperception 2017; 8:2041669517718697. [PMID: 28748067 PMCID: PMC5507391 DOI: 10.1177/2041669517718697] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Abstract
Sustained exposure to an asynchronous multisensory signal causes perceived simultaneity to shift in the direction of the leading component of the adapting stimulus. This is known as temporal recalibration, and recent evidence suggests that it can occur very rapidly, even after a single asynchronous audiovisual (AV) stimulus. However, this form of rapid recalibration appears to be unique to AV stimuli, in contrast to recalibration following sustained asynchronies which occurs with audiotactile (AT) and visuotactile (VT) stimuli. This study examines temporal recalibration to AV, VT and AT asynchrony with spatially collocated stimuli using a design that produces both sustained and inter-trial recalibration by combining the traditional sustained adaptation approach with an inter-trial analysis of sequential dependencies in an extended test period. Thus, we compare temporal recalibration to both sustained and transient asynchrony in three crossmodal combinations using the same design, stimuli and observers. The results reveal that prolonged exposure to asynchrony produced equivalent temporal recalibration for all combinations: AV, AT and VT. The pattern for rapid, inter-trial recalibration was very different. Rapid recalibration occurred strongly for AV stimuli, weakly for AT and did not occur at all for VT. For all sensory pairings, recalibration from sustained asynchrony decayed to baseline during the test phase while inter-trial recalibration was present and stable throughout testing, suggesting different mechanisms may underlie adaptation at long and short timescales.
Affiliation(s)
- David Alais
- School of Psychology, The University of Sydney, Australia
- Tam Ho
- School of Psychology, The University of Sydney, Australia
- Shui'er Han
- School of Psychology, The University of Sydney, Australia
- Erik Van der Burg
- School of Psychology, The University of Sydney, Australia; Department of Experimental and Applied Psychology, Vrije Universiteit Amsterdam, The Netherlands

19
Sartorato F, Przybylowski L, Sarko DK. Improving therapeutic outcomes in autism spectrum disorders: Enhancing social communication and sensory processing through the use of interactive robots. J Psychiatr Res 2017; 90:1-11. [PMID: 28213292 DOI: 10.1016/j.jpsychires.2017.02.004] [Citation(s) in RCA: 31] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/16/2016] [Accepted: 02/03/2017] [Indexed: 11/20/2022]
Abstract
For children with autism spectrum disorders (ASDs), social robots are increasingly utilized as therapeutic tools in order to enhance social skills and communication. Robots have been shown to generate a number of social and behavioral benefits in children with ASD including heightened engagement, increased attention, and decreased social anxiety. Although social robots appear to be effective social reinforcement tools in assistive therapies, the perceptual mechanism underlying these benefits remains unknown. To date, social robot studies have primarily relied on expertise in fields such as engineering and clinical psychology, with measures of social robot efficacy principally limited to qualitative observational assessments of children's interactions with robots. In this review, we examine a range of socially interactive robots that currently have the most widespread use as well as the utility of these robots and their therapeutic effects. In addition, given that social interactions rely on audiovisual communication, we discuss how enhanced sensory processing and integration of robotic social cues may underlie the perceptual and behavioral benefits that social robots confer. Although overall multisensory processing (including audiovisual integration) is impaired in individuals with ASD, social robot interactions may provide therapeutic benefits by allowing audiovisual social cues to be experienced through a simplified version of a human interaction. By applying systems neuroscience tools to identify, analyze, and extend the multisensory perceptual substrates that may underlie the therapeutic benefits of social robots, future studies have the potential to strengthen the clinical utility of social robots for individuals with ASD.
Affiliation(s)
- Felippe Sartorato
- Osteopathic Medical Student (OMS-IV), Edward Via College of Osteopathic Medicine (VCOM), Spartanburg, SC, USA
- Leon Przybylowski
- Osteopathic Medical Student (OMS-IV), Edward Via College of Osteopathic Medicine (VCOM), Spartanburg, SC, USA
- Diana K Sarko
- Department of Anatomy, Southern Illinois University School of Medicine, Carbondale, IL, USA; Department of Psychology, Southern Illinois University School of Medicine, Carbondale, IL, USA

20
Hachisu T, Kajimoto H. Vibration Feedback Latency Affects Material Perception During Rod Tapping Interactions. IEEE TRANSACTIONS ON HAPTICS 2017; 10:288-295. [PMID: 28113957 DOI: 10.1109/toh.2016.2628900] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
We investigated the effect of vibration feedback latency on material perception during a tapping interaction using a rod device. When a user taps a surface, the perception of the material can be modulated by providing a decaying sinusoidal vibration at the moment of contact. To achieve this haptic material augmentation on a touchscreen, a system is required that can measure the approach velocity and provide vibration with low latency. To this end, we developed a touchscreen system that measures the approach velocity and provides vibration feedback via a rod device with a latency of 0.1 ms. Using this system, we experimentally measured the human detection threshold of the vibration feedback latency using a psychophysical approach. We further investigated the effect of latency on material perception using a subjective questionnaire. Results show that the threshold was around 5.5 ms and that the latency made users perceive the surface as soft. In addition, users reported bouncing and denting sensations induced by the latency.
21
The Impact of Feedback on the Different Time Courses of Multisensory Temporal Recalibration. Neural Plast 2017; 2017:3478742. [PMID: 28316841 PMCID: PMC5339631 DOI: 10.1155/2017/3478742] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2016] [Revised: 01/14/2017] [Accepted: 01/26/2017] [Indexed: 11/18/2022] Open
Abstract
The capacity to rapidly adjust perceptual representations confers a fundamental advantage when confronted with a constantly changing world. Unexplored is how feedback regarding sensory judgments (top-down factors) interacts with sensory statistics (bottom-up factors) to drive long- and short-term recalibration of multisensory perceptual representations. Here, we examined the time course of both cumulative and rapid temporal perceptual recalibration for individuals completing an audiovisual simultaneity judgment task in which they were provided with varying degrees of feedback. We find that in the presence of feedback (as opposed to simple sensory exposure) temporal recalibration is more robust. Additionally, differential time courses are seen for cumulative and rapid recalibration dependent upon the nature of the feedback provided. Whereas cumulative recalibration effects relied more heavily on feedback that informs (i.e., negative feedback) rather than confirms (i.e., positive feedback) the judgment, rapid recalibration shows the opposite tendency. Furthermore, differential effects on rapid and cumulative recalibration were seen when the reliability of feedback was altered. Collectively, our findings illustrate that feedback signals promote and sustain audiovisual recalibration over the course of cumulative learning and enhance rapid trial-to-trial learning. Furthermore, given the differential effects seen for cumulative and rapid recalibration, these processes may function via distinct mechanisms.
22
Hao Q, Ora H, Ogawa KI, Ogata T, Miyake Y. Voluntary movement affects simultaneous perception of auditory and tactile stimuli presented to a non-moving body part. Sci Rep 2016; 6:33336. [PMID: 27622584 PMCID: PMC5020736 DOI: 10.1038/srep33336] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2016] [Accepted: 08/24/2016] [Indexed: 11/10/2022] Open
Abstract
The simultaneous perception of multimodal sensory information has a crucial role for effective reactions to the external environment. Voluntary movements are known to occasionally affect simultaneous perception of auditory and tactile stimuli presented to the moving body part. However, little is known about spatial limits on the effect of voluntary movements on simultaneous perception, especially when tactile stimuli are presented to a non-moving body part. We examined the effect of voluntary movement on the simultaneous perception of auditory and tactile stimuli presented to the non-moving body part. We considered the possible mechanism using a temporal order judgement task under three experimental conditions: voluntary movement, where participants voluntarily moved their right index finger and judged the temporal order of auditory and tactile stimuli presented to their non-moving left index finger; passive movement; and no movement. During voluntary movement, the auditory stimulus needed to be presented before the tactile stimulus so that they were perceived as occurring simultaneously. This subjective simultaneity differed significantly from the passive movement and no movement conditions. This finding indicates that the effect of voluntary movement on simultaneous perception of auditory and tactile stimuli extends to the non-moving body part.
Affiliation(s)
- Qiao Hao
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Hiroki Ora
- Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
- Ken-Ichiro Ogawa
- Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan
- Taiki Ogata
- Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan; Research into Artifacts, Center for Engineering (RACE), The University of Tokyo, Kashiwa, Japan
- Yoshihiro Miyake
- Department of Computer Science, Tokyo Institute of Technology, Yokohama, Japan

23
Krueger Fister J, Stevenson RA, Nidiffer AR, Barnett ZP, Wallace MT. Stimulus intensity modulates multisensory temporal processing. Neuropsychologia 2016; 88:92-100. [PMID: 26920937 DOI: 10.1016/j.neuropsychologia.2016.02.016] [Citation(s) in RCA: 38] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2015] [Revised: 01/20/2016] [Accepted: 02/22/2016] [Indexed: 12/18/2022]
Abstract
One of the more challenging feats that multisensory systems must perform is to determine which sensory signals originate from the same external event, and thus should be integrated or "bound" into a singular perceptual object or event, and which signals should be segregated. Two important stimulus properties impacting this process are the timing and effectiveness of the paired stimuli. It has been well established that the more temporally aligned two stimuli are, the greater the degree to which they influence one another's processing. In addition, the less effective the individual unisensory stimuli are in eliciting a response, the greater the benefit when they are combined. However, the interaction between stimulus timing and stimulus effectiveness in driving multisensory-mediated behaviors has never been explored - which was the purpose of the current study. Participants were presented with either high- or low-intensity audiovisual stimuli in which stimulus onset asynchronies (SOAs) were parametrically varied, and were asked to report on the perceived synchrony/asynchrony of the paired stimuli. Our results revealed an interaction between the temporal relationship (SOA) and intensity of the stimuli. Specifically, individuals were more tolerant of larger temporal offsets (i.e., more likely to call them synchronous) when the paired stimuli were less effective. This interaction was also seen in response time (RT) distributions. Behavioral gains in RTs were seen with synchronous relative to asynchronous presentations, but this effect was more pronounced with high-intensity stimuli. These data suggest that stimulus effectiveness plays an underappreciated role in the perception of the timing of multisensory events, and reinforces the interdependency of the principles of multisensory integration in determining behavior and shaping perception.
Affiliation(s)
- Juliane Krueger Fister
- Neuroscience Graduate Program, Vanderbilt University Medical Center, United States; Vanderbilt Brain Institute, United States
- Ryan A Stevenson
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States; Vanderbilt Brain Institute, United States; Vanderbilt University Kennedy Center, United States; Department of Psychology, University of Toronto, Canada
- Aaron R Nidiffer
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States
- Zachary P Barnett
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States
- Mark T Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, United States; Vanderbilt Brain Institute, United States; Vanderbilt University Kennedy Center, United States; Department of Psychology, Vanderbilt University, United States; Department of Psychiatry, Vanderbilt University, United States

24
Abstract
Temporal proximity is one of the key factors determining whether events in different modalities are integrated into a unified percept. Sensitivity to audiovisual temporal asynchrony has been studied in adults in great detail. However, how such sensitivity matures during childhood is poorly understood. We examined perception of audiovisual temporal asynchrony in 7- to 8-year-olds, 10- to 11-year-olds, and adults by using a simultaneity judgment task (SJT). Additionally, we evaluated whether nonverbal intelligence, verbal ability, attention skills, or age influenced children's performance. On each trial, participants saw an explosion-shaped figure and heard a 2-kHz pure tone. These occurred at the following stimulus onset asynchronies (SOAs): 0, 100, 200, 300, 400, and 500 ms. In half of all trials, the visual stimulus appeared first (VA condition), and in the other half, the auditory stimulus appeared first (AV condition). Both groups of children were significantly more likely than adults to perceive asynchronous events as synchronous at all SOAs exceeding 100 ms, in both VA and AV conditions. Furthermore, only adults exhibited a significant shortening of reaction time (RT) at long SOAs compared to medium SOAs. Sensitivities to the VA and AV temporal asynchronies showed different developmental trajectories, with 10- to 11-year-olds outperforming 7- to 8-year-olds at the 300- to 500-ms SOAs, but only in the AV condition. Lastly, age was the only predictor of children's performance on the SJT. These results provide an important baseline against which children with developmental disorders associated with impaired audiovisual temporal function-such as autism, specific language impairment, and dyslexia-may be compared.
Affiliation(s)
- Natalya Kaganovich
- Department of Speech, Language, and Hearing Sciences, Purdue University, 715 Clinic Drive, West Lafayette, IN 47907-2038
- Department of Psychological Sciences, Purdue University, 703 Third Street, West Lafayette, IN 47907-2038

25
Zmigrod L, Zmigrod S. On the Temporal Precision of Thought: Individual Differences in the Multisensory Temporal Binding Window Predict Performance on Verbal and Nonverbal Problem Solving Tasks. Multisens Res 2016. [DOI: 10.1163/22134808-00002532] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/06/2023]
Abstract
Although psychology is greatly preoccupied by the tight link between the way that individuals perceive the world and their intelligent, creative behavior, there is little experimental work on the relationship between individual differences in perception and cognitive ability in healthy populations. Here, individual differences in problem solving ability were examined in relation to multisensory perception as measured by tolerance for temporal asynchrony between auditory and visual inputs, i.e., the multisensory temporal binding window. The results demonstrated that enhanced performance in both verbal and nonverbal problem solving tasks (the Remote Associates Test and Raven’s Advanced Progressive Matrices Task) is predicted by a narrower audio-visual temporal binding window, which reflects greater sensitivity to subtle discrepancies in sensory inputs. This suggests that the precision of individuals’ temporal window of multisensory integration might mirror their capacities for complex reasoning and thus the precision of their thoughts.
Affiliation(s)
- Leor Zmigrod
- Department of Psychology, University of Cambridge, Cambridge, UK
- Sharon Zmigrod
- Institute for Psychological Research & Leiden Institute for Brain and Cognition, Leiden University, Leiden, The Netherlands

26
Van der Burg E, Alais D, Cass J. Audiovisual temporal recalibration occurs independently at two different time scales. Sci Rep 2015; 5:14526. [PMID: 26455577 PMCID: PMC4600976 DOI: 10.1038/srep14526] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/12/2015] [Accepted: 08/21/2015] [Indexed: 11/13/2022] Open
Abstract
Combining signals across the senses improves precision and speed of perception, although this multisensory benefit declines for asynchronous signals. Multisensory events may produce synchronized stimuli at source but asynchronies inevitably arise due to distance, intensity, attention and neural latencies. Temporal recalibration is an adaptive phenomenon that serves to perceptually realign physically asynchronous signals. Recently, it was discovered that temporal recalibration occurs far more rapidly than previously thought and does not require minutes of adaptation. Using a classical audiovisual simultaneity task and a series of brief flashes and tones varying in onset asynchrony, perceived simultaneity on a given trial was found to shift in the direction of the preceding trial’s asynchrony. Here we examine whether this inter-trial recalibration reflects the same process as prolonged adaptation by combining both paradigms: participants adapted to a fixed temporal lag for several minutes followed by a rapid series of test trials requiring a synchrony judgment. Interestingly, we find evidence of recalibration from prolonged adaptation and inter-trial recalibration within a single experiment. We show a dissociation in which sustained adaptation produces a large but decaying recalibration effect whilst inter-trial recalibration produces large transient effects whose sign matches that of the previous trial.
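The dissociation this abstract describes, a transient trial-to-trial shift of perceived simultaneity toward the previous trial's asynchrony alongside a decaying sustained effect, can be caricatured in a few lines of code. The sketch below is a hedged illustration only: the linear update rule, the gain and decay parameters, and the SOA set are invented for the example and are not the authors' model.

```python
import random

def simulate_intertrial_recalibration(n_trials, gain=0.1, decay=0.9, seed=1):
    """Toy model of rapid recalibration: the point of subjective
    simultaneity (PSS, ms) drifts by a fraction of the previous trial's
    SOA (positive = vision leads) and partially relaxes back toward
    baseline between trials. All parameter values are illustrative."""
    rng = random.Random(seed)
    pss, history = 0.0, []
    for _ in range(n_trials):
        soa = rng.choice([-300, -150, 0, 150, 300])   # ms
        history.append((soa, pss))                    # PSS *before* this trial
        pss = decay * pss + gain * soa                # shift toward this SOA
    return history

hist = simulate_intertrial_recalibration(2000)

# Average PSS on trials following an audio-leading vs. vision-leading trial:
after_audio = [p for (s0, _), (_, p) in zip(hist, hist[1:]) if s0 < 0]
after_vision = [p for (s0, _), (_, p) in zip(hist, hist[1:]) if s0 > 0]
mean_after_audio = sum(after_audio) / len(after_audio)
mean_after_vision = sum(after_vision) / len(after_vision)
```

With these made-up parameters the mean PSS tends to be negative after audio-leading trials and positive after vision-leading trials, i.e., the sign of the shift matches the preceding asynchrony, mirroring the inter-trial pattern the study reports.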
Affiliation(s)
- David Alais
- School of Psychology, University of Sydney, Australia
- John Cass
- School of Psychology, Western Sydney University, Australia

27
Hao Q, Ogata T, Ogawa KI, Kwon J, Miyake Y. The simultaneous perception of auditory-tactile stimuli in voluntary movement. Front Psychol 2015; 6:1429. [PMID: 26441799 PMCID: PMC4585164 DOI: 10.3389/fpsyg.2015.01429]
Abstract
The simultaneous perception of multimodal information in the environment during voluntary movement is very important for effective reactions to the environment. Previous studies have found that voluntary movement affects the simultaneous perception of auditory and tactile stimuli. However, the results of these experiments are not completely consistent, and the differences may be attributable to methodological differences between the previous studies. In this study, we investigated the effect of voluntary movement on the simultaneous perception of auditory and tactile stimuli using a temporal order judgment task with voluntary movement, involuntary movement, and no movement. To eliminate the potential effect of stimulus predictability and the effect of spatial information associated with large-scale movement in the previous studies, we randomized the interval between the start of movement and the first stimulus, and used small-scale movement. As a result, the point of subjective simultaneity (PSS) shifted with movement condition: whereas the tactile stimulus had to lead for perceived simultaneity under involuntary movement or no movement, the auditory stimulus had to lead under voluntary movement. The just noticeable difference (JND), an indicator of temporal resolution, did not differ across the three conditions. These results indicate that voluntary movement itself affects the PSS in auditory–tactile simultaneous perception, but it does not influence the JND. In the discussion of these results, we suggest that simultaneous perception may be affected by the efference copy.
Affiliation(s)
- Qiao Hao
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Taiki Ogata
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan; Research into Artifacts, Center for Engineering (RACE), The University of Tokyo, Kashiwa, Japan
- Ken-Ichiro Ogawa
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Jinhwan Kwon
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan
- Yoshihiro Miyake
- Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama, Japan

28
Stevenson RA, Segers M, Ferber S, Barense MD, Camarata S, Wallace MT. Keeping time in the brain: Autism spectrum disorder and audiovisual temporal processing. Autism Res 2015; 9:720-38. [PMID: 26402725 DOI: 10.1002/aur.1566]
Abstract
A growing area of interest and relevance in the study of autism spectrum disorder (ASD) focuses on the relationship between multisensory temporal function and the behavioral, perceptual, and cognitive impairments observed in ASD. Atypical sensory processing is becoming increasingly recognized as a core component of autism, with evidence of atypical processing across a number of sensory modalities. These deviations from typical processing underscore the value of interpreting ASD within a multisensory framework. Furthermore, converging evidence illustrates that these differences in audiovisual processing may be specifically related to temporal processing. This review seeks to bridge the connection between temporal processing and audiovisual perception, and to elaborate on emerging data showing differences in audiovisual temporal function in autism. We also discuss the consequence of such changes, the specific impact on the processing of different classes of audiovisual stimuli (e.g. speech vs. nonspeech, etc.), and the presumptive brain processes and networks underlying audiovisual temporal integration. Finally, possible downstream behavioral implications, and possible remediation strategies are outlined. Autism Res 2016, 9: 720-738. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
Affiliation(s)
- Ryan A Stevenson
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- Magali Segers
- Department of Psychology, York University, Toronto, Ontario, Canada
- Susanne Ferber
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Morgan D Barense
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Stephen Camarata
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee
- Mark T Wallace
- Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Brain Institute, Vanderbilt University Medical Center, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Psychiatry, Vanderbilt University Medical Center, Nashville, Tennessee

29
Wallace MT, Stevenson RA. The construct of the multisensory temporal binding window and its dysregulation in developmental disabilities. Neuropsychologia 2014; 64:105-23. [PMID: 25128432 PMCID: PMC4326640 DOI: 10.1016/j.neuropsychologia.2014.08.005]
Abstract
Behavior, perception and cognition are strongly shaped by the synthesis of information across the different sensory modalities. Such multisensory integration often results in performance and perceptual benefits that reflect the additional information conferred by having cues from multiple senses providing redundant or complementary information. The spatial and temporal relationships of these cues provide powerful statistical information about how these cues should be integrated or "bound" in order to create a unified perceptual representation. Much recent work has examined the temporal factors that are integral in multisensory processing, with many focused on the construct of the multisensory temporal binding window - the epoch of time within which stimuli from different modalities are likely to be integrated and perceptually bound. Emerging evidence suggests that this temporal window is altered in a series of neurodevelopmental disorders, including autism, dyslexia and schizophrenia. In addition to their role in sensory processing, these deficits in multisensory temporal function may play an important role in the perceptual and cognitive weaknesses that characterize these clinical disorders. Within this context, focus on improving the acuity of multisensory temporal function may have important implications for the amelioration of the "higher-order" deficits that serve as the defining features of these disorders.
Affiliation(s)
- Mark T Wallace
- Vanderbilt Brain Institute, Vanderbilt University, 465 21st Avenue South, Nashville, TN 37232, USA; Department of Hearing & Speech Sciences, Vanderbilt University, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN, USA.
- Ryan A Stevenson
- Department of Psychology, University of Toronto, Toronto, ON, Canada

30
Harvey C, Van der Burg E, Alais D. Rapid temporal recalibration occurs crossmodally without stimulus specificity but is absent unimodally. Brain Res 2014; 1585:120-30. [DOI: 10.1016/j.brainres.2014.08.028]
31
Van der Burg E, Orchard-Mills E, Alais D. Rapid temporal recalibration is unique to audiovisual stimuli. Exp Brain Res 2014; 233:53-9. [PMID: 25200176 DOI: 10.1007/s00221-014-4085-8]
Abstract
Following prolonged exposure to asynchronous multisensory signals, the brain adapts to reduce the perceived asynchrony. Here, in three separate experiments, participants performed a synchrony judgment task on audiovisual, audiotactile or visuotactile stimuli and we used inter-trial analyses to examine whether temporal recalibration occurs rapidly on the basis of a single asynchronous trial. Even though all combinations used the same subjects, task and design, temporal recalibration occurred for audiovisual stimuli (i.e., the point of subjective simultaneity depended on the preceding trial's modality order), but none occurred when the same auditory or visual event was combined with a tactile event. Contrary to findings from prolonged adaptation studies showing recalibration for all three combinations, we show that rapid, inter-trial recalibration is unique to audiovisual stimuli. We conclude that recalibration occurs at two different timescales for audiovisual stimuli (fast and slow), but only on a slow timescale for audiotactile and visuotactile stimuli.
Affiliation(s)
- Erik Van der Burg
- School of Psychology, University of Sydney, A19 Griffith Taylor, Sydney, NSW, 2006, Australia

32
Yamamoto K, Kawabata H. Adaptation to delayed auditory feedback induces the temporal recalibration effect in both speech perception and production. Exp Brain Res 2014; 232:3707-18. [PMID: 25106757 DOI: 10.1007/s00221-014-4055-1]
Abstract
We ordinarily speak fluently, even though our perceptions of our own voices are disrupted by various environmental acoustic properties. The underlying mechanism of speech is supposed to monitor the temporal relationship between speech production and the perception of auditory feedback, as suggested by a reduction in speech fluency when the speaker is exposed to delayed auditory feedback (DAF). While many studies have reported that DAF influences speech motor processing, its relationship to the temporal tuning effect on multimodal integration, or temporal recalibration, remains unclear. We investigated whether the temporal aspects of both speech perception and production change due to adaptation to the delay between the motor sensation and the auditory feedback. This is a well-used method of inducing temporal recalibration. Participants continually read texts with specific DAF times in order to adapt to the delay. Then, they judged the simultaneity between the motor sensation and the vocal feedback. We measured the rates of speech with which participants read the texts in both the exposure and re-exposure phases. We found that exposure to DAF changed both the rate of speech and the simultaneity judgment, that is, participants' speech gained fluency. Although we also found that a delay of 200 ms appeared to be most effective in decreasing the rates of speech and shifting the distribution on the simultaneity judgment, there was no correlation between these measurements. These findings suggest that both speech motor production and multimodal perception are adaptive to temporal lag but are processed in distinct ways.
Affiliation(s)
- Kosuke Yamamoto
- Department of Psychology, Keio University, 2-15-45 Mita, Minato-ku, Tokyo, 108-8345, Japan

33
Abstract
Spatial ventriloquism refers to the phenomenon that a visual stimulus such as a flash can attract the perceived location of a spatially discordant but temporally synchronous sound. An analogous example of mutual attraction between audition and vision has been found in the temporal domain, where temporal aspects of a visual event, such as its onset, frequency, or duration, can be biased by a slightly asynchronous sound. In this review, we examine various manifestations of spatial and temporal attraction between the senses (both direct effects and aftereffects), and we discuss important constraints on the occurrence of these effects. Factors that potentially modulate ventriloquism-such as attention, synesthetic correspondence, and other cognitive factors-are described. We trace theories and models of spatial and temporal ventriloquism, from the traditional unity assumption and modality appropriateness hypothesis to more recent Bayesian and neural network approaches. Finally, we summarize recent evidence probing the underlying neural mechanisms of spatial and temporal ventriloquism.
34
Rohde M, Greiner L, Ernst MO. Asymmetries in visuomotor recalibration of time perception: does causal binding distort the window of integration? Acta Psychol (Amst) 2014; 147:127-35. [PMID: 23928564 DOI: 10.1016/j.actpsy.2013.07.011]
Abstract
The recalibration of perceived visuomotor simultaneity to vision-lead and movement-lead temporal discrepancies is marked by an underlying causal asymmetry, if the movement (button press) is voluntary and self-initiated; a visual stimulus lagging the button press may be interpreted as causally linked sensory feedback (intentional or causal binding), a leading visual stimulus not. Here, we test whether this underlying causal asymmetry leads to directional asymmetries in the temporal recalibration of visuomotor time perception, using an interval estimation paradigm. Participants were trained to the presence of one of three temporal discrepancies between a motor action (button press) and a visual stimulus (flashed disk): 100 ms vision-lead, simultaneity, and 100 ms movement-lead. By adjusting a point on a visual scale, participants then estimated the interval between the visual stimulus and the button press over a range of discrepancies. Comparing the results across conditions, we found that temporal recalibration appears to be implemented nearly exclusively on the movement-lead side of the range of discrepancies by a uni-lateral lengthening or shortening of the window of temporal integration. Interestingly, this marked asymmetry does not lead to a significantly asymmetrical recalibration of the point of subjective simultaneity or to significant differences in discriminability. This seeming contradiction (symmetrical recalibration of subjective simultaneity and asymmetrical recalibration of interval estimation) poses a challenge to common models of temporal order perception that assume an underlying time measurement process with Gaussian noise. Using a two-criterion model of the window of temporal integration, we illustrate that a compressive bias around perceived simultaneity (temporal integration) even prior to perceptual decisions about temporal order would be very hard to detect given the sensitivity of the psychophysical procedures commonly used.
Affiliation(s)
- Marieke Rohde
- Department of Cognitive Neuroscience, University of Bielefeld, Universitätsstr. 25, W3-240, 33615 Bielefeld, Germany; Cognitive Interaction Technology (CITEC) Centre of Excellence, University of Bielefeld, Universitätsstr. 25, W3-240, 33615 Bielefeld, Germany
- Leonie Greiner
- Department of Cognitive Neuroscience, University of Bielefeld, Universitätsstr. 25, W3-240, 33615 Bielefeld, Germany
- Marc O Ernst
- Department of Cognitive Neuroscience, University of Bielefeld, Universitätsstr. 25, W3-240, 33615 Bielefeld, Germany; Cognitive Interaction Technology (CITEC) Centre of Excellence, University of Bielefeld, Universitätsstr. 25, W3-240, 33615 Bielefeld, Germany

35
Cravo AM, Haddad H, Claessens PME, Baldo MVC. Bias and learning in temporal binding: Intervals between actions and outcomes are compressed by prior bias. Conscious Cogn 2013; 22:1174-80. [PMID: 24016785 DOI: 10.1016/j.concog.2013.08.001]
Affiliation(s)
- Andre M Cravo
- Center for Mathematics, Computation and Cognition, Federal University of ABC (UFABC), Santo André, Brazil

36
Omrani M, Lak A, Diamond ME. Learning not to feel: reshaping the resolution of tactile perception. Front Syst Neurosci 2013; 7:29. [PMID: 23847478 PMCID: PMC3701118 DOI: 10.3389/fnsys.2013.00029]
Abstract
We asked whether biased feedback during training could cause human subjects to lose perceptual acuity in a vibrotactile frequency discrimination task. Prior to training, we determined each subject's vibration frequency discrimination capacity on one fingertip, the Just Noticeable Difference (JND). Subjects then received 850 trials in which they performed a same/different judgment on two vibrations presented to that fingertip. They gained points whenever their judgment matched the computer-generated feedback on that trial. Feedback, however, was biased: the probability per trial of “same” feedback was drawn from a normal distribution with standard deviation twice as wide as the subject's JND. After training, the JND was significantly widened: stimulus pairs previously perceived as different were now perceived as the same. The widening of the JND extended to the untrained hand, indicating that the decrease in resolution originated in non-topographic brain regions. In sum, the acuity of subjects' sensory-perceptual systems shifted in order to match the feedback received during training.
Affiliation(s)
- Mohsen Omrani
- Tactile Perception and Learning Lab, International School for Advanced Studies (SISSA), Trieste, Italy; School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran; Centre for Neuroscience Studies, Queen's University, Kingston, ON, Canada

37
Fitting model-based psychometric functions to simultaneity and temporal-order judgment data: MATLAB and R routines. Behav Res Methods 2013; 45:972-98. [DOI: 10.3758/s13428-013-0325-2]
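Many of the entries in this list report PSS and JND values obtained by fitting a psychometric function to synchrony or temporal-order judgment data, as the routines above do. A minimal generic sketch of that procedure (not the paper's MATLAB/R routines; the data and parameter choices here are invented for illustration) fits a cumulative Gaussian to the proportion of "visual first" responses across stimulus onset asynchronies:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cumulative_gaussian(soa, mu, sigma):
    """Psychometric function: P("visual first") as a function of SOA."""
    return norm.cdf(soa, loc=mu, scale=sigma)

# Hypothetical data: SOAs in ms (negative = audio leads) and the
# observed proportion of "visual first" responses at each SOA.
soas = np.array([-240.0, -120.0, -60.0, 0.0, 60.0, 120.0, 240.0])
p_visual_first = np.array([0.05, 0.15, 0.30, 0.55, 0.80, 0.92, 0.98])

# Least-squares fit of the two free parameters (mu, sigma).
(mu, sigma), _ = curve_fit(cumulative_gaussian, soas, p_visual_first,
                           p0=[0.0, 80.0])

pss = mu              # point of subjective simultaneity (50% point)
jnd = 0.6745 * sigma  # half the 25%-75% interquartile range

print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

The model-based approach in this reference goes further (separate sensory latency and decision components rather than a single Gaussian), but the fitted quantities it yields play the same role as the PSS and JND extracted here.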
38
Parsons BD, Novich SD, Eagleman DM. Motor-sensory recalibration modulates perceived simultaneity of cross-modal events at different distances. Front Psychol 2013; 4:46. [PMID: 23549660 PMCID: PMC3582016 DOI: 10.3389/fpsyg.2013.00046]
Abstract
A popular model for the representation of time in the brain posits the existence of a single, central-clock. In that framework, temporal distortions in perception are explained by contracting or expanding time over a given interval. We here present evidence for an alternative account, one which proposes multiple independent timelines coexisting within the brain and stresses the importance of motor predictions and causal inferences in constructing our temporal representation of the world. Participants judged the simultaneity of a beep and flash coming from a single source at different distances. The beep was always presented at a constant delay after a motor action, while the flash occurred at a variable delay. Independent shifts in the implied timing of the auditory stimulus toward the motor action (but not the visual stimulus) provided evidence against a central-clock model. Additionally, the hypothesis that the time between action and delayed effect is compressed (known as intentional binding) seems unable to explain our results: firstly, because actions and effects can perceptually reverse, and secondly because the recalibration of simultaneity remains even after the participant's intentional actions are no longer present. Contrary to previous reports, we also find that participants are unable to use distance cues to compensate for the relatively slower speed of sound when audio-visual events are presented in depth. When a motor act is used to control the distal event, however, adaptation to the delayed auditory signal occurs and subjective cross-sensory synchrony is maintained. These results support the hypothesis that perceptual timing derives from and is calibrated by our motor interactions with the world.
Affiliation(s)
- Brent D Parsons
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA

39
Rohde M, Ernst MO. To lead and to lag - forward and backward recalibration of perceived visuo-motor simultaneity. Front Psychol 2013; 3:599. [PMID: 23346063 PMCID: PMC3551234 DOI: 10.3389/fpsyg.2012.00599]
Abstract
Studies on human recalibration of perceived visuo-motor simultaneity so far have been limited to the study of recalibration to movement-lead temporal discrepancies (visual lags). We studied adaptation to both vision-lead and movement-lead discrepancies, to test for differences between these conditions, as a leading visual stimulus violates the underlying cause-effect structure. To this end, we manipulated the temporal relationship between a motor action (button press) and a visual event (flashed disk) in a training phase. Participants were tested in a temporal order judgment task and the point of subjective simultaneity (PSS) was compared before and after recalibration. A PHANToM© force-feedback device that tracks the finger position in real time was used to display a virtual button. We predicted the timing of full compression of the button from early movement onset in order to time visual stimuli even before the movement event of the full button press. The results show that recalibration of perceived visuo-motor simultaneity is evident in both directions and does not differ in magnitude between the conditions. The strength of recalibration decreases with perceptual accuracy, suggesting the possibility that some participants recalibrate less because they detect the discrepancy. We conclude that the mechanisms of temporal recalibration work in both directions and that there is no evidence that they are asymmetrical around the point of actual simultaneity, despite the underlying asymmetry in the cause-effect relation.
Affiliation(s)
- Marieke Rohde
- Department of Cognitive Neurosciences, University of Bielefeld, Bielefeld, Germany; Cognitive Interaction Technology Centre of Excellence, University of Bielefeld, Bielefeld, Germany

40
Asai T, Kanayama N. "Cutaneous rabbit" hops toward a light: unimodal and cross-modal causality on the skin. Front Psychol 2012; 3:427. [PMID: 23133432 PMCID: PMC3490328 DOI: 10.3389/fpsyg.2012.00427]
Abstract
Our somatosensory system deals with not only spatial but also temporal imprecision, resulting in characteristic spatiotemporal illusions. Repeated rapid stimulation at the wrist, then near the elbow, can create the illusion of touch at intervening locations along the arm (as if a rabbit is hopping along the arm). This is known as the “cutaneous rabbit effect” (CRE). Previous studies have suggested that the CRE involves not only an intrinsic somatotopic representation but also the representation of an extended body schema that includes causality or animacy perception upon the skin. On the other hand, unlike other multi-modal causality couplings, it is possible that the CRE is not affected by concurrent auditory temporal information. The present study examined the effect of a simple visual flash on the CRE, which has both temporal and spatial information. Here, stronger cross-modal causality or correspondence could be provided. We presented three successive tactile stimuli on the inside of a participant’s left arm. Stimuli were presented on the wrist, elbow, and midway between the two. Results from our five experimental manipulations suggest that a one-shot flash enhances or attenuates the CRE depending on its congruency with cutaneous rabbit saltation. Our results reflect that (1) our brain interprets successive stimuli on the skin as motion in terms of time and space (unimodal causality) and that (2) the concurrent signals from other modalities provide clues for creating unified representations of this external motion (multi-modal causality) as to the extent that “spatiotemporal” synchronicity among modalities is provided.
Affiliation(s)
- Tomohisa Asai
- Department of Psychology, Chiba University, Chiba, Japan

41
Yamamoto S, Miyazaki M, Iwano T, Kitazawa S. Bayesian calibration of simultaneity in audiovisual temporal order judgments. PLoS One 2012; 7:e40379. [PMID: 22792297 PMCID: PMC3392227 DOI: 10.1371/journal.pone.0040379]
Abstract
After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to “sound-first” for the pitch associated with sound-first stimuli, and to “light-first” for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to “light-first” for the pitch associated with sound-first stimuli, and to “sound-first” for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.
Affiliation(s)
- Shinya Yamamoto
- National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan

42
Keetels M, Vroomen J. Exposure to delayed visual feedback of the hand changes motor-sensory synchrony perception. Exp Brain Res 2012; 219:431-40. [PMID: 22623088 PMCID: PMC3366181 DOI: 10.1007/s00221-012-3081-0]
Abstract
We examined whether the brain can adapt to temporal delays between a self-initiated action and the naturalistic visual feedback of that action. During an exposure phase, participants tapped with their index finger while seeing their own hand in real time (~0 ms delay) or delayed at 40, 80, or 120 ms. Following exposure, participants were tested with a simultaneity judgment (SJ) task in which they judged whether the video of their hand was synchronous or asynchronous with respect to their finger taps. The locations of the seen and the real hand were either different (Experiment 1) or aligned (Experiment 2). In both cases, the point of subjective simultaneity (PSS) was uniformly shifted in the direction of the exposure lags while sensitivity to visual-motor asynchrony decreased with longer exposure delays. These findings demonstrate that the brain is quite flexible in adjusting the timing relation between a motor action and the otherwise naturalistic visual feedback that this action engenders.
Affiliation(s)
- Mirjam Keetels
- Department of Medical Psychology and Neuropsychology, Tilburg University, Tilburg, The Netherlands

43
Fujisaki W. Effects of delayed visual feedback on grooved pegboard test performance. Front Psychol 2012; 3:61. [PMID: 22408631 PMCID: PMC3297075 DOI: 10.3389/fpsyg.2012.00061]
Abstract
Using four experiments, this study investigates what amount of delay brings about maximal impairment under delayed visual feedback and whether a critical interval, such as that in audition, also exists in vision. The first experiment measured Grooved Pegboard test performance as a function of visual feedback delays from 120 to 2120 ms in 16 steps. Performance sharply decreased until about 490 ms, then more gradually until 2120 ms, suggesting that two mechanisms were operating under delayed visual feedback. Since delayed visual feedback differs from delayed auditory feedback in that the former induces not only temporal but also spatial displacements between motor and sensory feedback, this difference could also exist in the mechanism responsible for spatial displacement. The second experiment was hence conducted to provide simultaneous haptic feedback together with delayed visual feedback to inform correct spatial position. The disruption was significantly ameliorated when information about spatial position was provided from a haptic source. The sharp decrease in performance up to delays of approximately 300 ms was followed by an almost flat performance. This is similar to the critical interval found in audition. Accordingly, the mechanism that caused the sharp decrease in performance in experiments 1 and 2 was probably mainly responsible for temporal disparity and is common across different modality–motor combinations, while the other mechanism that caused a rather gradual decrease in performance in experiment 1 was mainly responsible for spatial displacement. In experiments 3 and 4, the reliability of spatial information from the haptic source was reduced by wearing a glove or using a tool. When the reliability of spatial information was reduced, the data lay between those of experiments 1 and 2, and a gradual decrease in performance partially reappeared. These results further support the notion that two mechanisms operate under delayed visual feedback.
Affiliation(s)
- Waka Fujisaki
- Human Technology Research Institute, National Institute of Advanced Industrial Science and Technology, Tsukuba, Ibaraki, Japan
44
Heron J, Roach NW, Hanson JVM, McGraw PV, Whitaker D. Audiovisual time perception is spatially specific. Exp Brain Res 2012; 218:477-85. [PMID: 22367399 PMCID: PMC3324684 DOI: 10.1007/s00221-012-3038-3] [Citation(s) in RCA: 44] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2011] [Accepted: 02/09/2012] [Indexed: 11/19/2022]
Abstract
Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.
Affiliation(s)
- James Heron
- Bradford School of Optometry and Vision Science, University of Bradford, Bradford, UK.
45
Multisensory simultaneity recalibration: storage of the aftereffect in the absence of counterevidence. Exp Brain Res 2011; 217:89-97. [PMID: 22207361 DOI: 10.1007/s00221-011-2976-5] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2011] [Accepted: 11/30/2011] [Indexed: 10/14/2022]
Abstract
Recent studies show that repeated exposure to an asynchrony between auditory and visual stimuli shifts the point of subjective simultaneity. Usually, the measurement stimuli used to assess this aftereffect are interleaved with short re-exposures to the asynchrony. In a first experiment, we show that the aftereffect declines during measurement in spite of the use of re-exposures. In a second experiment, we investigate whether the observed decline is either due to a dissipation of the aftereffect with the passage of time, or the result of using measurement stimuli with a distribution of asynchronies different from the exposure stimulus. To this end, we introduced a delay before measuring the aftereffects and we compared the magnitude of the aftereffect with and without delay. We find that the aftereffect does not dissipate during the delay but instead is stored until new sensory information in the form of measurement stimuli is presented as counterevidence (i.e., stimuli with an asynchrony that differs from the one used during exposure).
46
Stekelenburg JJ, Sugano Y, Vroomen J. Neural correlates of motor-sensory temporal recalibration. Brain Res 2011; 1397:46-54. [PMID: 21600564 DOI: 10.1016/j.brainres.2011.04.045] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2010] [Revised: 03/03/2011] [Accepted: 04/24/2011] [Indexed: 11/29/2022]
Abstract
The relative timing of a motor-sensory event can be recalibrated after exposure to delayed visual feedback. Here we examined the neural consequences of lag adaptation using event-related potentials (ERPs). Participants tapped their finger on a pad, which triggered a flash either after a short delay (0 ms/50 ms) or a long delay (100 ms/150 ms). Following the exposure phase, they judged the temporal order of a synchronous tap-flash test stimulus. The synchronous flash was more often perceived to occur before the tap after exposure to long than short delays, indicating that the temporal relation between the tap and the flash was realigned. ERPs evoked by the synchronous tap-flash test stimulus showed that adaptation to delayed flashes caused an early attenuation of the visual P1 (85 ms-150 ms), and a later negativity at central electrodes (N450). The P1-attenuation may reflect the unexpected earliness of the test flash, or a violation of "cause-before-consequence". The N450 may be due to realignment of the adapted and the actual timing of the tap-flash interval. We conclude that motor-visual temporal recalibration has consequences at early perceptual levels of visual processing and involves a high-level recalibration mechanism.
47
Abstract
Our sense of relative timing is malleable. For instance, visual signals can be made to seem synchronous with earlier sounds following prolonged exposure to an environment wherein auditory signals precede visual ones. Similarly, actions can be made to seem to precede their own consequences if an artificial delay is imposed for a period, and then removed. Here, we show that our sense of relative timing for combinations of visual changes is similarly pliant. We find that direction reversals can be made to seem synchronous with unusually early colour changes after prolonged exposure to a stimulus wherein colour changes precede direction changes. The opposite effect is induced by prolonged exposure to colour changes that lag direction changes. Our data are consistent with the proposal that our sense of timing for changes encoded by distinct sensory mechanisms can adjust, at least to some degree, to the prevailing environment. Moreover, they reveal that visual analyses of colour and motion are sufficiently independent for this to occur.
Affiliation(s)
- Derek H. Arnold
- School of Psychology, The University of Queensland, Brisbane, QLD 4055, Australia
- Kielan Yarrow
- Department of Psychology, City University London, London, UK
48
Cravo AM, Claessens PM, Baldo MV. The relation between action, predictability and temporal contiguity in temporal binding. Acta Psychol (Amst) 2011; 136:157-66. [PMID: 21185547 DOI: 10.1016/j.actpsy.2010.11.005] [Citation(s) in RCA: 48] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/21/2010] [Revised: 11/20/2010] [Accepted: 11/22/2010] [Indexed: 11/26/2022] Open
Abstract
Previous studies have documented a subjective temporal attraction between actions and their effects. This finding, named intentional binding, is thought to be the result of a cognitive function that links actions to their consequences. Although several studies have tried to outline the necessary and sufficient conditions for intentional binding, a quantitative comparison between the roles of temporal contiguity, predictability and voluntary action, and the evaluation of their interactions, is difficult due to the high variability of temporal binding measurements. In the present study, we used a novel methodology to investigate the properties of intentional binding. Subjects judged whether an auditory stimulus, which could either be triggered by a voluntary finger lift or be presented after a visual temporal marker unrelated to any action, was presented synchronously with a reference stimulus. In three experiments, the predictability, the interval between action and consequence and the presence of action itself were manipulated. The results indicate that (1) action is a necessary condition for temporal binding; (2) a fixed interval between the two events is not sufficient to cause the effect; and (3) only in the presence of voluntary action do temporal predictability and contiguity play a significant role in modulating the effect. These findings are discussed in the context of the relationship between intentional binding and temporal expectation.
49
Tanaka H, Homma K, Imamizu H. Physical delay but not subjective delay determines learning rate in prism adaptation. Exp Brain Res 2010; 208:257-68. [PMID: 21076819 DOI: 10.1007/s00221-010-2476-z] [Citation(s) in RCA: 27] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/29/2010] [Accepted: 10/25/2010] [Indexed: 11/26/2022]
Abstract
Timing is critical in determining the causal relationship between two events. Motor adaptation relies on the timing of actions and their results for determining which preceding control signals were responsible for subsequent error in the resulting movements. An artificially induced temporal delay in error feedback as short as 50 ms has been found to slow the learning rate of prism adaptation. Recent studies have demonstrated that our sense of simultaneity is flexibly adaptive when a persistent delay is inserted into visual feedback timing of one's own action. Therefore, judgments of "subjective simultaneity" (i.e. whether two events are simultaneous on a subjective basis) do not necessarily correspond to the actual simultaneity of physical events. We evaluated the effects of adaptation to a temporal shift of subjective simultaneity on prism adaptation by examining whether prism adaptation depends on physical timing or subjective timing. We found that after persistently experiencing an additional 100-ms delay in a pointing experiment, psychometric curves of the timing of judgments about the temporal order of touching and visual feedback were shifted by 40 ms, indicating that subjective simultaneity adapted. Next, while maintaining temporal adaptation, participants adapted to spatial displacement caused by a prism with and without an additional temporal delay in feedback. Learning speed was reliably predicted by physical timing but not by subjective timing. These results indicate that prism adaptation occurs independently of awareness of subjective timing and may be processed in primary motor areas that are thought to have fidelity with temporal relations.
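The 40-ms shift of subjective simultaneity reported above can be illustrated with a simple psychometric function for the temporal-order judgment. The logistic form and the slope parameter below are assumptions for illustration, not the paper's fitted model; only the 40-ms shift magnitude comes from the abstract:

```python
import math

def psychometric(soa_ms, pss_ms=0.0, slope_ms=50.0):
    """Logistic psychometric function for a temporal-order judgment:
    probability of reporting that visual feedback came after the touch,
    as a function of stimulus onset asynchrony (SOA). The slope value
    is hypothetical.
    """
    return 1.0 / (1.0 + math.exp(-(soa_ms - pss_ms) / slope_ms))

# Adaptation to a persistent 100-ms feedback delay is modelled here as a
# 40-ms shift of the point of subjective simultaneity (PSS), the shift
# size reported in the abstract.
before = psychometric(0.0)              # physically synchronous: 0.5
after = psychometric(0.0, pss_ms=40.0)  # below 0.5: now feels early
```

The paper's key point is that this perceptual shift leaves the prism-adaptation learning rate unchanged, which depends on the physical, not the subjective, delay.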
Affiliation(s)
- Hirokazu Tanaka
- National Institute of Information and Communications Technology (NiCT), Hikaridai 2-2-2, Keihanna Science City, Kyoto 619-0288, Japan
50
Roach NW, Heron J, Whitaker D, McGraw PV. Asynchrony adaptation reveals neural population code for audio-visual timing. Proc Biol Sci 2010; 278:1314-22. [PMID: 20961905 PMCID: PMC3061136 DOI: 10.1098/rspb.2010.1737] [Citation(s) in RCA: 49] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible--adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with the previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
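A minimal sketch of the kind of model the abstract describes (a small number of channels tuned to different audio-visual delays, a simple centroid read-out, and adaptation as a change in channel gain) might look like the following. Channel spacing, tuning width, and gain values are illustrative assumptions, not the fitted model:

```python
import math

def population_response(delay_ms, preferred, sigma=80.0, gains=None):
    """Responses of Gaussian-tuned delay channels to an audio-visual
    delay. Tuning width and channel preferences are hypothetical."""
    if gains is None:
        gains = [1.0] * len(preferred)
    return [g * math.exp(-0.5 * ((delay_ms - p) / sigma) ** 2)
            for g, p in zip(gains, preferred)]

def decode(delay_ms, preferred, gains=None):
    """Read out perceived delay as the response-weighted mean of the
    channels' preferred delays (a simple centroid decoder)."""
    r = population_response(delay_ms, preferred, gains=gains)
    return sum(ri * p for ri, p in zip(r, preferred)) / sum(r)

# Channels tuned from -400 ms (audio leads) to +400 ms (vision leads).
preferred = [-400.0, -200.0, 0.0, 200.0, 400.0]

# Adaptation as gain reduction in channels tuned near the adapted
# audio-leading delay, per the model's point (iii).
adapted_gains = [0.8, 0.5, 1.0, 1.0, 1.0]

baseline = decode(0.0, preferred)              # symmetric code: ~0 ms
after = decode(0.0, preferred, adapted_gains)  # read-out biased away
```

Under this sketch, a physically synchronous pair is decoded as visual-leading after adaptation, the repulsive aftereffect direction, without any change in perceptual latency.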
Affiliation(s)
- Neil W Roach
- Visual Neuroscience Group, School of Psychology, The University of Nottingham, Nottingham, UK.