1. Jiang P, Kent C, Rossiter J. Towards sensory substitution and augmentation: Mapping visual distance to audio and tactile frequency. PLoS One 2024; 19:e0299213. PMID: 38530828. DOI: 10.1371/journal.pone.0299213.
Abstract
Multimodal perception is the predominant means by which individuals experience and interact with the world. However, sensory dysfunction or loss can significantly impede this process. In such cases, cross-modality research offers valuable insight into how we can compensate for these sensory deficits through sensory substitution. Although sight and hearing are both used to estimate the distance to an object (e.g., by visual size and sound volume), and the perception of distance is an important element in navigation and guidance, it is not widely studied in cross-modal research. We investigate the relationship between audio and vibrotactile frequencies (in the ranges 47-2,764 Hz and 10-99 Hz, respectively) and distances uniformly distributed in the range 1-12 m. In our experiments, participants mapped distance (represented by an image of a model at that distance) to a frequency by adjusting a virtual tuning knob. The results revealed that the majority (more than 76%) of participants demonstrated a strong negative monotonic relationship between frequency and distance, across both the vibrotactile domain (represented by a natural log function) and the auditory domain (represented by an exponential function). However, a subgroup of participants showed the opposite, positive linear relationship between frequency and distance. This strong cross-modal sensory correlation could contribute to the development of assistive robotic technologies and devices that augment human perception. This work provides a foundation for future assisted HRI applications where a mapping between distance and frequency is needed, for example for people with vision or hearing loss, drivers with loss of focus or response delay, doctors undertaking teleoperation surgery, and users in augmented reality (AR) or virtual reality (VR) environments.
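The reported mappings lend themselves to a short computational sketch. The functional forms below (natural log for the vibrotactile channel, exponential for the audio channel) follow the abstract, but the coefficients are invented for illustration: they are pinned only so that each mapping spans the stated frequency range over 1-12 m, and are not the paper's fitted parameters.

```python
import math

# Illustrative sketch of the negative monotonic distance-to-frequency
# mappings described in the abstract. Coefficients are NOT from the paper;
# they are chosen so each mapping spans the stated range over 1-12 m.

D_MIN, D_MAX = 1.0, 12.0

def tactile_freq(d_m: float) -> float:
    """Distance (1-12 m) -> vibrotactile frequency (99 Hz down to 10 Hz)."""
    b = (10.0 - 99.0) / math.log(D_MAX)   # negative slope: f falls with d
    return 99.0 + b * math.log(d_m)

def audio_freq(d_m: float) -> float:
    """Distance (1-12 m) -> audio frequency (2,764 Hz down to 47 Hz)."""
    k = math.log(2764.0 / 47.0) / (D_MAX - D_MIN)
    return 2764.0 * math.exp(-k * (d_m - D_MIN))

# tactile_freq(1.0) == 99.0, tactile_freq(12.0) == 10.0
# audio_freq(1.0) == 2764.0, audio_freq(12.0) == 47.0
```

Both mappings decrease monotonically with distance, matching the majority pattern the study reports; the minority positive linear mapping would simply invert the slope.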
Affiliation(s)
- Pingping Jiang
- Department of Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
- Christopher Kent
- School of Psychological Science, University of Bristol, Bristol, United Kingdom
- Jonathan Rossiter
- Department of Engineering Mathematics, University of Bristol, Bristol, United Kingdom
- SoftLab, Bristol Robotics Laboratory, Bristol, United Kingdom
2. Kohler I, Perrotta MV, Ferreira T, Eagleman DM. Cross-Modal Sensory Boosting to Improve High-Frequency Hearing Loss: Device Development and Validation. JMIRx Med 2024; 5:e49969. PMID: 38345294. PMCID: PMC11008433. DOI: 10.2196/49969.
Abstract
Background: High-frequency hearing loss is one of the most common problems in the aging population and in those with a history of exposure to loud noises. This type of hearing loss can be frustrating and disabling, making it difficult to understand speech and interact effectively with the world.
Objective: This study aimed to examine the impact of spatially unique haptic vibrations representing high-frequency phonemes on the self-perceived ability to understand conversations in everyday situations.
Methods: To address high-frequency hearing loss, a multi-motor wristband was developed that uses machine learning to listen for specific high-frequency phonemes. The wristband vibrates in spatially unique locations to represent which phoneme was present in real time. A total of 16 participants with high-frequency hearing loss were recruited and asked to wear the wristband for 6 weeks. The degree of disability associated with hearing loss was measured weekly using the Abbreviated Profile of Hearing Aid Benefit (APHAB).
Results: By the end of the 6-week study, the average APHAB benefit score across all participants reached 12.39 points, from a baseline of 40.32 to a final score of 27.93 (SD 13.11; N=16; P=.002, 2-tailed dependent t test). Those without hearing aids showed a 10.78-point larger improvement in average APHAB benefit score at 6 weeks than those with hearing aids (t14=2.14; P=.10, 2-tailed independent t test). The average benefit score across all participants was 15.44 for ease of communication (SD 13.88; N=16; P<.001), 10.88 for background noise (SD 17.54; N=16; P=.03), and 10.84 for reverberation (SD 16.95; N=16; P=.02; all 2-tailed dependent t tests).
Conclusions: These findings show that vibrotactile sensory substitution delivered by a wristband that produces spatially distinguishable vibrations in correspondence with high-frequency phonemes helps individuals with high-frequency hearing loss improve their perceived understanding of verbal communication. Vibrotactile feedback provides benefits whether or not a person wears hearing aids, albeit in slightly different ways. Finally, individuals with the greatest perceived difficulty understanding speech experienced the greatest amount of perceived benefit from vibrotactile feedback.
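The headline statistic can be reproduced from the reported summary values alone. A minimal sketch of the 2-tailed dependent (paired) t test, using only the numbers in the abstract (the per-participant data are not available here):

```python
import math

# Paired t statistic from summary statistics: mean benefit 12.39 points,
# SD of the differences 13.11, N = 16, as reported in the abstract.
# The p value (.002) would come from the t distribution with N-1 = 15
# degrees of freedom (e.g. scipy.stats.t.sf in practice).

def paired_t_from_summary(mean_diff: float, sd_diff: float, n: int) -> float:
    """t = mean difference / standard error of the differences."""
    return mean_diff / (sd_diff / math.sqrt(n))

t_stat = paired_t_from_summary(12.39, 13.11, 16)
print(f"t(15) = {t_stat:.2f}")  # about 3.78, consistent with P = .002
```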
Affiliation(s)
- David M Eagleman
- Neosensory, Los Altos, CA, United States
- Department of Psychiatry, Stanford University, Stanford, CA, United States
3. Răutu IS, De Tiège X, Jousmäki V, Bourguignon M, Bertels J. Speech-derived haptic stimulation enhances speech recognition in a multi-talker background. Sci Rep 2023; 13:16621. PMID: 37789043. PMCID: PMC10547762. DOI: 10.1038/s41598-023-43644-3.
Abstract
Speech understanding, while effortless in quiet conditions, is challenging in noisy environments. Previous studies have revealed that a feasible approach to supplement speech-in-noise (SiN) perception consists in presenting speech-derived signals as haptic input. In the current study, we investigated whether the presentation of a vibrotactile signal derived from the speech temporal envelope can improve SiN intelligibility in a multi-talker background for untrained, normal-hearing listeners. We also determined if vibrotactile sensitivity, evaluated using vibrotactile detection thresholds, modulates the extent of audio-tactile SiN improvement. In practice, we measured participants' speech recognition in a multi-talker noise without (audio-only) and with (audio-tactile) concurrent vibrotactile stimulation delivered in three schemes: to the left or right palm, or to both. Averaged across the three stimulation delivery schemes, the vibrotactile stimulation led to a significant improvement of 0.41 dB in SiN recognition when compared to the audio-only condition. Notably, there were no significant differences observed between the improvements in these delivery schemes. In addition, audio-tactile SiN benefit was significantly predicted by participants' vibrotactile threshold levels and unimodal (audio-only) SiN performance. The extent of the improvement afforded by speech-envelope-derived vibrotactile stimulation was in line with previously uncovered vibrotactile enhancements of SiN perception in untrained listeners with no known hearing impairment. Overall, these results highlight the potential of concurrent vibrotactile stimulation to improve SiN recognition, especially in individuals with poor SiN perception abilities, and tentatively more so with increasing tactile sensitivity. Moreover, they lend support to the multimodal accounts of speech perception and research on tactile speech aid devices.
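A generic sketch of how a speech temporal envelope can be derived for haptic presentation: rectify the waveform and low-pass it so only the slow amplitude fluctuations remain. This is a standard envelope follower, not necessarily the exact signal chain used in the study; the cutoff frequency and toy input below are illustrative assumptions.

```python
import math

# Envelope follower: rectify + one-pole low-pass. The resulting slow
# amplitude contour is what would modulate a vibrotactile actuator.

def envelope(samples, fs, cutoff_hz=30.0):
    """Return the temporal envelope of a sampled signal."""
    # One-pole IIR low-pass coefficient for the given cutoff frequency.
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs)
    env, state = [], 0.0
    for x in samples:
        state += alpha * (abs(x) - state)  # track rectified amplitude
        env.append(state)
    return env

# Toy input: a 200 Hz tone whose amplitude ramps up over 1 s at 8 kHz.
fs = 8000
sig = [(i / fs) * math.sin(2 * math.pi * 200 * i / fs) for i in range(fs)]
env = envelope(sig, fs)
# env grows with the amplitude ramp while smoothing out the 200 Hz carrier.
```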
Affiliation(s)
- I Sabina Răutu
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium.
- Xavier De Tiège
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Service de Neuroimagerie Translationnelle, Hôpital Universitaire de Bruxelles (H.U.B.), CUB Hôpital Erasme, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Mathieu Bourguignon
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- BCBL, Basque Center on Cognition, Brain and Language, 20009, San Sebastián, Spain
- Laboratory of Neurophysiology and Movement Biomechanics, UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium
- Julie Bertels
- Laboratoire de Neuroanatomie et de Neuroimagerie Translationnelles (LN2T), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium.
- ULBabylab, Center for Research in Cognition and Neurosciences (CRCN), UNI - ULB Neuroscience Institute, Université Libre de Bruxelles (ULB), Brussels, Belgium.
4. Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022; 16:1010211. PMID: 36330342. PMCID: PMC9622781. DOI: 10.3389/fnins.2022.1010211.
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. 
Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
5. Enhancement of speech-in-noise comprehension through vibrotactile stimulation at the syllabic rate. Proc Natl Acad Sci U S A 2022; 119:e2117000119. PMID: 35312362. PMCID: PMC9060510. DOI: 10.1073/pnas.2117000119.
Abstract
Syllables are important building blocks of speech. They occur at a rate between 4 and 8 Hz, corresponding to the theta frequency range of neural activity in the cerebral cortex. When listening to speech, the theta activity becomes aligned to the syllabic rhythm, presumably aiding in parsing a speech signal into distinct syllables. However, this neural activity cannot only be influenced by sound, but also by somatosensory information. Here, we show that the presentation of vibrotactile signals at the syllabic rate can enhance the comprehension of speech in background noise. We further provide evidence that this multisensory enhancement of speech comprehension reflects the multisensory integration of auditory and tactile information in the auditory cortex. Speech unfolds over distinct temporal scales, in particular, those related to the rhythm of phonemes, syllables, and words. When a person listens to continuous speech, the syllabic rhythm is tracked by neural activity in the theta frequency range. The tracking plays a functional role in speech processing: Influencing the theta activity through transcranial current stimulation, for instance, can impact speech perception. The theta-band activity in the auditory cortex can also be modulated through the somatosensory system, but the effect on speech processing has remained unclear. Here, we show that vibrotactile feedback presented at the rate of syllables can modulate and, in fact, enhance the comprehension of a speech signal in background noise. The enhancement occurs when vibrotactile pulses occur at the perceptual center of the syllables, whereas a temporal delay between the vibrotactile signals and the speech stream can lead to a lower level of speech comprehension. We further investigate the neural mechanisms underlying the audiotactile integration through electroencephalographic (EEG) recordings. 
We find that the audiotactile stimulation modulates the neural response to the speech rhythm, as well as the neural response to the vibrotactile pulses. The modulations of these neural activities reflect the behavioral effects on speech comprehension. Moreover, we demonstrate that speech comprehension can be predicted by particular aspects of the neural responses. Our results evidence a role of vibrotactile information for speech processing and may have applications in future auditory prosthesis.
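The timing scheme described above can be sketched numerically: a 4-8 Hz syllabic rhythm implies inter-syllable intervals of 125-250 ms, with one vibrotactile pulse per syllable center and an optional delay (the manipulation that lowered comprehension). The syllable-center times below are invented for illustration.

```python
# One vibrotactile pulse per syllable "perceptual center", optionally
# shifted by a delay. Times are in seconds; centers here are hypothetical.

def pulse_times(syllable_centers_s, delay_s=0.0):
    """Vibrotactile pulse onsets: one pulse per syllable center + delay."""
    return [t + delay_s for t in syllable_centers_s]

# A 6.5 Hz syllabic rhythm (within the 4-8 Hz theta range): ~154 ms apart.
centers = [round(i / 6.5, 3) for i in range(1, 6)]
aligned = pulse_times(centers)           # matches the syllabic rhythm
delayed = pulse_times(centers, 0.100)    # 100 ms late: a mismatch condition
```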
6. Cieśla K, Wolak T, Lorens A, Mentzel M, Skarżyński H, Amedi A. Effects of training and using an audio-tactile sensory substitution device on speech-in-noise understanding. Sci Rep 2022; 12:3206. PMID: 35217676. PMCID: PMC8881456. DOI: 10.1038/s41598-022-06855-8.
Abstract
Understanding speech in background noise is challenging, and wearing face masks, as imposed during the COVID-19 pandemic, makes it even harder. We developed a multi-sensory setup, including a sensory substitution device (SSD) that can deliver speech simultaneously through audition and as vibrations on the fingertips. The vibrations correspond to low frequencies extracted from the speech input. We trained two groups of non-native English speakers in understanding distorted speech in noise. After a short session (30-45 min) of repeating sentences, with or without concurrent matching vibrations, we showed a comparable mean group improvement of 14-16 dB in Speech Reception Threshold (SRT) in two test conditions, i.e., when the participants were asked to repeat sentences from hearing alone and when matching vibrations on the fingertips were also present. This is a very strong effect, if one considers that a 10 dB difference corresponds to a doubling of the perceived loudness. The number of sentence repetitions needed to complete both types of training was comparable. Meanwhile, the mean group SNR for the audio-tactile training (14.7 ± 8.7) was significantly lower (harder) than for the auditory training (23.9 ± 11.8), which indicates a potential facilitating effect of the added vibrations. In addition, both before and after training most of the participants (70-80%) showed better performance (by a mean of 4-6 dB) in speech-in-noise understanding when the audio sentences were accompanied by matching vibrations. This is the same magnitude of multisensory benefit that we reported, with no training at all, in our previous study using the same experimental procedures. After training, performance in this test condition was also the best in both groups (SRT ~ 2 dB). The least significant effect of both training types was found in the third test condition, i.e., when participants repeated sentences accompanied by non-matching tactile vibrations; performance in this condition was also the poorest after training. The results indicate that both types of training may remove some level of difficulty in sound perception, which might enable a more proper use of speech inputs delivered via vibrotactile stimulation. We discuss the implications of these novel findings with respect to basic science. In particular, we show that even in adulthood, i.e., long after the classical "critical periods" of development have passed, a new pairing between a certain computation (here, speech processing) and an atypical sensory modality (here, touch) can be established and trained, and that this process can be rapid and intuitive. We further present possible applications of our training program and the SSD for auditory rehabilitation in patients with hearing (and sight) deficits, as well as for healthy individuals in suboptimal acoustic situations.
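The loudness comparison quoted in the abstract ("a 10 dB difference corresponds to doubling of the perceived loudness") implies a perceived-loudness ratio of roughly 2^(ΔdB/10). As a back-of-the-envelope check, acknowledging this rule of thumb is a simplification of loudness perception and is used here only to put the SRT gains in perspective:

```python
# Rule-of-thumb conversion from a level change in dB to an approximate
# perceived-loudness ratio: +10 dB is perceived as about twice as loud.

def loudness_ratio(delta_db: float) -> float:
    """Approximate perceived-loudness ratio for a level change in dB."""
    return 2.0 ** (delta_db / 10.0)

print(loudness_ratio(10.0))               # 2.0: doubling, as stated above
print(round(loudness_ratio(14.0), 2))     # 2.64: low end of the 14-16 dB gain
print(round(loudness_ratio(5.0), 2))      # 1.41: midpoint of the 4-6 dB benefit
```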
Affiliation(s)
- K Cieśla
- The Baruch Ivcher Institute for Brain, Cognition & Technology, The Baruch Ivcher School of Psychology and the Ruth and Meir Rosental Brain Imaging Center, Reichman University, Herzliya, Israel
- World Hearing Centre, Institute of Physiology and Pathology of Hearing, Warsaw, Poland
- T Wolak
- World Hearing Centre, Institute of Physiology and Pathology of Hearing, Warsaw, Poland
- A Lorens
- World Hearing Centre, Institute of Physiology and Pathology of Hearing, Warsaw, Poland
- M Mentzel
- The Baruch Ivcher Institute for Brain, Cognition & Technology, The Baruch Ivcher School of Psychology and the Ruth and Meir Rosental Brain Imaging Center, Reichman University, Herzliya, Israel
- H Skarżyński
- World Hearing Centre, Institute of Physiology and Pathology of Hearing, Warsaw, Poland
- A Amedi
- The Baruch Ivcher Institute for Brain, Cognition & Technology, The Baruch Ivcher School of Psychology and the Ruth and Meir Rosental Brain Imaging Center, Reichman University, Herzliya, Israel
7. Ansorge J, Wu C, Shore SE, Krieger P. Audiotactile interactions in the mouse cochlear nucleus. Sci Rep 2021; 11:6887. PMID: 33767295. PMCID: PMC7994829. DOI: 10.1038/s41598-021-86236-9.
Abstract
Multisensory integration of auditory and tactile information occurs as early as the cochlear nucleus. Rodents use their whiskers for tactile perception to guide their exploration of the world; as nocturnal animals with relatively poor vision, audiotactile interactions are of great importance for these animals. Here, the influence of whisker deflections on sound-evoked spiking in the cochlear nucleus was investigated in vivo in anesthetized mice. Multichannel, silicon-probe electrophysiological recordings were obtained from both the dorsal and ventral cochlear nucleus. Whisker deflections evoked increased spiking activity in fusiform cells of the dorsal cochlear nucleus and t-stellate cells of the ventral cochlear nucleus, whereas bushy cells in the ventral cochlear nucleus showed a more variable response. The response to broadband noise stimulation increased in fusiform cells and primary-like bushy cells when the sound stimulation was preceded (~ 20 ms) by whisker stimulation. Multisensory integration of auditory and whisker input can thus occur as early as this brainstem nucleus, emphasizing the importance of early integration of auditory and somatosensory information.
Affiliation(s)
- Josephine Ansorge
- Department of Systems Neuroscience, Faculty of Medicine, Ruhr University Bochum, Universitätsstraße 150, 44780, Bochum, Germany
- Calvin Wu
- Department of Otolaryngology, Kresge Hearing Research Institute, University of Michigan, Ann Arbor, MI, USA
- Susan E Shore
- Department of Otolaryngology, Kresge Hearing Research Institute, University of Michigan, Ann Arbor, MI, USA
- Biomedical Engineering, University of Michigan, Ann Arbor, MI, USA
- Molecular and Integrative Physiology, University of Michigan, Ann Arbor, MI, USA
- Patrik Krieger
- Department of Systems Neuroscience, Faculty of Medicine, Ruhr University Bochum, Universitätsstraße 150, 44780, Bochum, Germany.
8. Coffman BA, Candelaria-Cook FT, Stephen JM. Unisensory and Multisensory Responses in Fetal Alcohol Spectrum Disorders (FASD): Effects of Spatial Congruence. Neuroscience 2020; 430:34-46. PMID: 31982473. DOI: 10.1016/j.neuroscience.2020.01.013.
Abstract
While it is generally accepted that structural and functional brain deficits underlie the behavioral deficits associated with Fetal Alcohol Spectrum Disorders (FASD), the degree to which these problems are expressed in sensory pathology is unknown. Electrophysiological measures indicate that neural processing is delayed in visual and auditory domains. Furthermore, multiple reports of white matter deficits due to prenatal alcohol exposure indicate altered cortical connectivity in individuals with FASD. Multisensory integration requires close coordination between disparate cortical areas leading us to hypothesize that individuals with FASD will have impaired multisensory integration relative to healthy control (HC) participants. Participants' neurophysiological responses were recorded using magnetoencephalography (MEG) during passive unisensory or simultaneous, spatially congruent or incongruent multisensory auditory and somatosensory stimuli. Source timecourses from evoked responses were estimated using multi-dipole spatiotemporal modeling. Auditory M100 response latency was faster for the multisensory relative to the unisensory condition but no group differences were observed. M200 auditory latency to congruent stimuli was earlier and congruent amplitude was larger in participants with FASD relative to controls. Somatosensory M100 response latency was faster in right hemisphere for multisensory relative to unisensory stimulation in both groups. FASD participants' somatosensory M200 responses were delayed by 13 ms, but only for the unisensory presentation of the somatosensory stimulus. M200 results indicate that unisensory and multisensory processing is altered in FASD; it remains to be seen if the multisensory response represents a normalization of the unisensory deficits.
Affiliation(s)
- Brian A Coffman
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA; Department of Psychology, University of New Mexico, MSC03 2220, 1 University of New Mexico, Albuquerque, NM 87131, USA; Department of Psychiatry, University of Pittsburgh School of Medicine, 3501 Forbes Avenue, Pittsburgh, PA 15213, USA
- Felicha T Candelaria-Cook
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA
- Julia M Stephen
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA.
9. Riecke L, Snipes S, van Bree S, Kaas A, Hausfeld L. Audio-tactile enhancement of cortical speech-envelope tracking. Neuroimage 2019; 202:116134. DOI: 10.1016/j.neuroimage.2019.116134.
10. Developmental changes in the perception of audiotactile simultaneity. J Exp Child Psychol 2019; 183:208-221. DOI: 10.1016/j.jecp.2019.02.006.
11. Cardon G, Sharma A. Somatosensory Cross-Modal Reorganization in Children With Cochlear Implants. Front Neurosci 2019; 13:469. PMID: 31312115. PMCID: PMC6613479. DOI: 10.3389/fnins.2019.00469.
Abstract
Deprived of sensory input, as in deafness, the brain tends to reorganize. Cross-modal reorganization occurs when cortices associated with deficient sensory modalities are recruited by other, intact senses for processing of the latter's sensory input. Studies have shown that this type of reorganization may affect outcomes when sensory stimulation is later introduced via intervention devices. One such device is the cochlear implant (CI). Hundreds of thousands of CIs have been fitted in people with hearing impairment worldwide, many of them children. Factors such as age at implantation have proven useful in predicting speech perception outcomes with these devices in children. However, a portion of the variance in speech understanding ability remains unexplained. It is possible that the degree of cross-modal reorganization may explain additional variability in listening outcomes. Thus, the current study aimed to examine possible somatosensory cross-modal reorganization of the auditory cortices. To this end, we used high-density EEG to record cortical responses to vibrotactile stimuli in children with normal hearing (NH) and those with CIs. We first investigated cortical somatosensory evoked potentials (CSEP) in NH children, in order to establish normal patterns of CSEP waveform morphology and sources of cortical activity. We then compared CSEP waveforms and estimations of cortical sources between NH children and those with CIs to assess the degree of somatosensory cross-modal reorganization. Results showed that NH children exhibited the expected patterns of CSEP and current density reconstructions, such that postcentral cortices were activated contralaterally to the side of stimulation. Participants with CIs also showed this pattern of activity. However, in addition, they showed activation of auditory cortical areas in response to somatosensory stimulation. Additionally, certain CSEP waveform components were significantly earlier in the CI group than in the children with NH.
These results are taken as evidence of cross-modal reorganization by the somatosensory modality in children with CIs. Speech perception in noise scores were negatively associated with CSEP waveform components latencies in the CI group, suggesting that the degree of cross-modal reorganization is related to speech perception outcomes. These findings may have implications for clinical rehabilitation in children with cochlear implants.
Affiliation(s)
- Garrett Cardon
- Department of Psychology, Colorado State University, Fort Collins, CO, United States
- Anu Sharma
- Department of Speech, Language, and Hearing Sciences, University of Colorado Boulder, Boulder, CO, United States
12. Cieśla K, Wolak T, Lorens A, Heimler B, Skarżyński H, Amedi A. Immediate improvement of speech-in-noise perception through multisensory stimulation via an auditory to tactile sensory substitution. Restor Neurol Neurosci 2019; 37:155-166. PMID: 31006700. PMCID: PMC6598101. DOI: 10.3233/rnn-190898.
Abstract
Background: Hearing loss is becoming a real social and health problem. Its prevalence in the elderly is an epidemic, and the risk of developing hearing loss is also growing among younger people. If left untreated, hearing loss can perpetuate the development of neurodegenerative diseases, including dementia. Despite recent advancements in hearing aid (HA) and cochlear implant (CI) technologies, hearing-impaired users still encounter significant practical and social challenges, with or without aids. In particular, they all struggle with understanding speech in challenging acoustic environments, especially in the presence of a competing speaker.
Objectives: In the current proof-of-concept study we tested whether multisensory stimulation, pairing audition and a minimal-size touch device, would improve intelligibility of speech in noise.
Methods: To this aim we developed an audio-to-tactile sensory substitution device (SSD) transforming low-frequency speech signals into tactile vibrations delivered on two fingertips. Based on the inverse effectiveness law, i.e., that multisensory enhancement is strongest when the signal-to-noise ratio between senses is lowest, we embedded non-native language stimuli in speech-like noise and paired them with a low-frequency input conveyed through touch.
Results: We found an immediate and robust improvement in speech recognition (i.e., in the signal-to-noise ratio) in the multisensory condition without any training, at the group level as well as in every participant. The reported group-level improvement of 6 dB was indeed major, considering that an increase of 10 dB represents a doubling of the perceived loudness.
Conclusions: These results are especially relevant when compared to previous SSD studies showing effects on behavior only after demanding cognitive training. We discuss the implications of our results for the development of SSDs and of specific rehabilitation programs for the hearing impaired, whether or not they use HAs or CIs. We also discuss the potential application of such a set-up for sense augmentation, such as when learning a new language.
Affiliation(s)
- Katarzyna Cieśla
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Tomasz Wolak
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Artur Lorens
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Benedetta Heimler
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Henryk Skarżyński
- Institute of Physiology and Pathology of Hearing, World Hearing Center, Warsaw, Poland
- Amir Amedi
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- The Cognitive Science Program, The Hebrew University of Jerusalem, Jerusalem, Israel
13
Rizza A, Terekhov AV, Montone G, Olivetti-Belardinelli M, O'Regan JK. Why Early Tactile Speech Aids May Have Failed: No Perceptual Integration of Tactile and Auditory Signals. Front Psychol 2018; 9:767. PMID: 29875719; PMCID: PMC5974558; DOI: 10.3389/fpsyg.2018.00767.
Abstract
Tactile speech aids, though extensively studied in the 1980s and 1990s, never became a commercial success. One hypothesis for this failure is that it is difficult to obtain true perceptual integration of a tactile signal with information from auditory speech: exploiting cues from a tactile aid may require cognitive effort and so prevent speech understanding at the high rates typical of everyday speech. To test this hypothesis, we attempted to create true perceptual integration of tactile and auditory information in what might be considered the simplest situation encountered by a hearing-impaired listener. We created an auditory continuum between the syllables /BA/ and /VA/, and trained participants to associate /BA/ with one tactile stimulus and /VA/ with another. After training, we tested whether auditory discrimination along the continuum between the two syllables could be biased by incongruent tactile stimulation. We found that such a bias occurred only when the tactile stimulus was above, but not below, its previously measured tactile discrimination threshold. This pattern is compatible with the idea that the effect is due to a cognitive or decisional strategy rather than to truly perceptual integration. We therefore ran a further study (Experiment 2), in which we created a tactile version of the McGurk effect. We extensively trained two subjects over six days to associate four recorded auditory syllables with four corresponding apparent-motion tactile patterns. In a subsequent test, we presented stimulation that was either congruent or incongruent with the learned association and asked subjects to report the syllable they perceived. We found no analog of the McGurk effect, suggesting that the tactile stimulation was not being perceptually integrated with the auditory syllable. These findings strengthen our hypothesis that tactile aids failed because integration of tactile cues with auditory speech occurred at a cognitive or decisional level rather than at a truly perceptual level.
Affiliation(s)
- Aurora Rizza
- Department of Psychology, Faculty of Medicine and Psychology, Sapienza University of Rome, Rome, Italy
- Alexander V Terekhov
- Laboratoire Psychologie de la Perception, Université Paris Descartes, Paris, France
- Guglielmo Montone
- Laboratoire Psychologie de la Perception, Université Paris Descartes, Paris, France
- Marta Olivetti-Belardinelli
- Department of Psychology, Faculty of Medicine and Psychology, Sapienza University of Rome, Rome, Italy
- ECONA Interuniversity Centre for Research on Cognitive Processing in Natural and Artificial Systems, Rome, Italy
- J Kevin O'Regan
- Laboratoire Psychologie de la Perception, Université Paris Descartes, Paris, France
14
Cardon G, Sharma A. Somatosensory Cross-Modal Reorganization in Adults With Age-Related, Early-Stage Hearing Loss. Front Hum Neurosci 2018; 12:172. PMID: 29773983; PMCID: PMC5943502; DOI: 10.3389/fnhum.2018.00172.
Abstract
Under conditions of profound sensory deprivation, the brain has the propensity to reorganize. For example, intact sensory modalities often recruit the cortices of deficient modalities for neural processing. This process, known as cross-modal reorganization, has been demonstrated in congenitally and profoundly deaf patients. However, much less is known about cross-modal cortical reorganization in persons with less severe, age-related hearing loss (ARHL), even though such cases are far more common. We therefore investigated cross-modal reorganization between the auditory and somatosensory modalities in older adults with normal hearing (NH) and with mild-moderate ARHL, in response to vibrotactile stimulation, using high-density electroencephalography (EEG). Results showed activation of the somatosensory cortices in adults with NH as well as in those with hearing loss (HL). However, adults with mild-moderate ARHL also showed robust activation of auditory cortical regions in response to somatosensory stimulation. The neurophysiologic data correlated significantly with speech-perception-in-noise outcomes, suggesting that the degree of cross-modal reorganization may be associated with functional performance. Our study presents the first evidence of somatosensory cross-modal reorganization of the auditory cortex in adults with early-stage, mild-moderate ARHL. Our findings suggest that even mild levels of ARHL associated with communication difficulty result in fundamental cortical changes.
Affiliation(s)
- Garrett Cardon
- Department of Psychiatry, University of Colorado Denver Anschutz Medical Campus, Aurora, CO, United States
- Anu Sharma
- Department of Speech, Language, and Hearing Sciences, University of Colorado Boulder, Boulder, CO, United States
15
Convento S, Rahman MS, Yau JM. Selective Attention Gates the Interactive Crossmodal Coupling between Perceptual Systems. Curr Biol 2018; 28:746-752.e5. PMID: 29456139; DOI: 10.1016/j.cub.2018.01.021.
Abstract
Sensory cortical systems often activate in parallel, even when stimulation is experienced through a single sensory modality [1-3]. Co-activations may reflect the interactive coupling between information-linked cortical systems or merely parallel but independent sensory processing. We report causal evidence consistent with the hypothesis that human somatosensory cortex (S1), which co-activates with auditory cortex during the processing of vibrations and textures [4-9], interactively couples to cortical systems that support auditory perception. In a series of behavioral experiments, we used transcranial magnetic stimulation (TMS) to probe interactions between the somatosensory and auditory perceptual systems as we manipulated attention state. Acute TMS over S1 impairs auditory frequency perception when subjects simultaneously attend to auditory and tactile frequency, but not when attention is directed to audition alone. Auditory frequency perception is unaffected by TMS over visual cortex, thus confirming the privileged interactions between the somatosensory and auditory systems in temporal frequency processing [10-13]. Our results provide a key demonstration that selective attention can modulate the functional properties of cortical systems thought to support specific sensory modalities. The gating of crossmodal coupling by selective attention may critically support multisensory interactions and feature-specific perception.
Affiliation(s)
- Silvia Convento
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Md Shoaibur Rahman
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Jeffrey M Yau
- Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
16
Villanueva L, Zampini M. Reciprocal Interference Between Audition and Touch in the Perception of Duration. Multisens Res 2018; 31:351-371. DOI: 10.1163/22134808-00002583.
Abstract
Audition and touch interact with one another and share a number of similarities; however, little is known about their interplay in the perception of temporal duration. The present study investigated whether the duration of an irrelevant auditory or tactile stimulus could modulate the perceived duration of a target stimulus presented in the other modality (i.e., tactile or auditory), adopting both a between-participants (Experiment 1) and a within-participants (Experiment 2) design. In a two-alternative forced-choice task, participants decided which of two events in a target modality was longer. Distractor stimuli were presented simultaneously, with a duration either congruent or incongruent with the target's. Results showed that both the auditory and tactile modalities affected duration judgments in the incongruent condition, decreasing performance in both experiments. Moreover, in Experiment 1, the tactile modality enhanced the perception of auditory stimuli in the congruent condition, but audition did not facilitate performance in the congruent condition for the tactile modality; this tactile enhancement of audition was not found in Experiment 2. To the best of our knowledge, this is the first study documenting audiotactile interactions in the perception of duration; it suggests that audition and touch may modulate one another in a more balanced manner than audiovisual pairings. The findings support previous evidence of shared links and reciprocal influences when audition and touch interact with one another.
Affiliation(s)
- Lia Villanueva
- CIMeC Center for Mind/Brain Sciences, University of Trento, Corso Bettini 31, Rovereto (Trento), Italy
- Massimiliano Zampini
- CIMeC Center for Mind/Brain Sciences, University of Trento, Corso Bettini 31, Rovereto (Trento), Italy
- Department of Psychology and Cognitive Science, University of Trento, Corso Bettini 31, Rovereto (Trento), Italy
17
Korman M, Herling Z, Levy I, Egbarieh N, Engel-Yeger B, Karni A. Background matters: Minor vibratory stimulation during motor skill acquisition selectively reduces off-line memory consolidation. Neurobiol Learn Mem 2017; 140:27-32. PMID: 28189551; DOI: 10.1016/j.nlm.2017.02.002.
Abstract
Although the situation is ubiquitous, it is not clear how effective a learning experience is when task-irrelevant sensory noise occurs in the background. Here, young adults were trained on the finger opposition sequence task in a well-established training and testing protocol affording measures of online as well as off-line learning. During the training session, one group experienced minor background vibratory stimulation to the trunk by means of a vibrating cushion, while a second group experienced recorded sound vibrations. A control group was trained with no extra sensory stimulation. Sensory stimulation during training had no effect on the online within-session gains, but dampened the expression of the off-line, consolidation-phase gains in the two sensory stimulation groups. These results suggest that background sensory stimulation can selectively modify off-line procedural memory consolidation processes, despite well-preserved on-line learning. Classical studies have shown that neural plasticity in sensory systems is modulated by motor input. The current results extend this notion and suggest that some types of task-irrelevant sensory stimulation, concurrent with motor training, may constitute a 'gating' factor, modulating the triggering of long-term procedural memory consolidation processes. Thus, vibratory stimulation may be considered a behavioral counterpart of pharmacological interventions that do not interfere with short-term neural plasticity but block long-term plasticity.
Affiliation(s)
- Maria Korman
- Department of Occupational Therapy, Faculty of Social Welfare & Health Sciences, University of Haifa, Israel.
- Zohar Herling
- Department of Human Biology, Faculty of Natural Sciences, University of Haifa, Israel
- Ishay Levy
- Department of Occupational Therapy, Faculty of Social Welfare & Health Sciences, University of Haifa, Israel
- Nebal Egbarieh
- Department of Occupational Therapy, Faculty of Social Welfare & Health Sciences, University of Haifa, Israel
- Batya Engel-Yeger
- Department of Occupational Therapy, Faculty of Social Welfare & Health Sciences, University of Haifa, Israel
- Avi Karni
- Department of Human Biology, Faculty of Natural Sciences, University of Haifa, Israel
18
Young GW, Murphy D, Weeter J. Haptics in Music: The Effects of Vibrotactile Stimulus in Low Frequency Auditory Difference Detection Tasks. IEEE Trans Haptics 2017; 10:135-139. PMID: 28055906; DOI: 10.1109/toh.2016.2646370.
Abstract
We present an experiment that investigated the effect of vibrotactile stimulation in auditory pitch discrimination tasks. Extra-auditory information was expected to have some influence upon frequency discrimination at auditory Just Noticeable Difference (JND) detection levels at 160 Hz. To measure this, the ability of two randomly divided groups to correctly identify positive and negative frequency changes was measured and compared. The first group was given an audio-only JND test; the second group was given the same test, but with an additional vibrotactile stimulus delivered via a vibrating glove device. The results of the experiment suggest that in musical interactions involving the selection of specific pitches, or the detection of pitch variation, vibrotactile feedback may have some advantageous effect upon a musician's ability to perceive changes when presented in synchrony with the auditory stimulus.
19
Roy C, Dalla Bella S, Lagarde J. To bridge or not to bridge the multisensory time gap: bimanual coordination to sound and touch with temporal lags. Exp Brain Res 2016; 235:135-151. DOI: 10.1007/s00221-016-4776-4.
20
Kanaya S, Kariya K, Fujisaki W. Cross-Modal Correspondence Among Vision, Audition, and Touch in Natural Objects: An Investigation of the Perceptual Properties of Wood. Perception 2016; 45:1099-114. DOI: 10.1177/0301006616652018.
Abstract
Certain systematic relationships are often assumed between information conveyed from multiple sensory modalities; for instance, a small figure and a high pitch may be perceived as more harmonious. This phenomenon, termed cross-modal correspondence, may result from correlations between multi-sensory signals learned in daily experience of the natural environment. If so, we would observe cross-modal correspondences not only in the perception of artificial stimuli but also in the perception of natural objects. To test this hypothesis, we reanalyzed data collected previously in our laboratory examining perceptions of the material properties of wood using vision, audition, and touch. We compared participant evaluations of three perceptual properties (surface brightness, sharpness of sound, and smoothness) of the wood blocks obtained separately via vision, audition, and touch. Significant positive correlations were identified for all properties in the audition–touch comparison, and for two of the three properties in the vision–touch comparison. By contrast, no properties exhibited significant positive correlations in the vision–audition comparison. These results suggest that we learn correlations between multi-sensory signals through experience; however, the strength of this statistical learning is apparently dependent on the particular combination of sensory modalities involved.
Affiliation(s)
- Shoko Kanaya
- Human Information Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan
- Kenji Kariya
- Tsukuba Research Institute, Sumitomo Forestry Company, Tsukuba, Japan
- Waka Fujisaki
- Human Information Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan
21
Heimler B, Striem-Amit E, Amedi A. Origins of task-specific sensory-independent organization in the visual and auditory brain: neuroscience evidence, open questions and clinical implications. Curr Opin Neurobiol 2015; 35:169-77. DOI: 10.1016/j.conb.2015.09.001.
22
Blanke O, Slater M, Serino A. Behavioral, Neural, and Computational Principles of Bodily Self-Consciousness. Neuron 2015; 88:145-66. PMID: 26447578; DOI: 10.1016/j.neuron.2015.09.029.
Affiliation(s)
- Olaf Blanke
- Laboratory of Cognitive Neuroscience, Center for Neuroprosthetics and Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), 9 Chemin des Mines, 1202 Geneva, Switzerland; Department of Neurology, University of Geneva, 24 rue Micheli-du-Crest, 1211 Geneva, Switzerland.
- Mel Slater
- ICREA-University of Barcelona, Campus de Mundet, 08035 Barcelona, Spain; Department of Computer Science, University College London, Malet Place Engineering Building, Gower Street, London, WC1E 6BT, UK
- Andrea Serino
- Laboratory of Cognitive Neuroscience, Center for Neuroprosthetics and Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne (EPFL), 9 Chemin des Mines, 1202 Geneva, Switzerland
23
Pannunzi M, Pérez-Bellido A, Pereda-Baños A, López-Moliner J, Deco G, Soto-Faraco S. Deconstructing multisensory enhancement in detection. J Neurophysiol 2014; 113:1800-18. PMID: 25520431; DOI: 10.1152/jn.00341.2014.
Abstract
The mechanisms responsible for the integration of sensory information from different modalities have become a topic of intense interest in psychophysics and neuroscience. Many authors now claim that early, sensory-based cross-modal convergence improves performance in detection tasks. An important strand of supporting evidence for this claim is based on statistical models such as the Pythagorean model or the probabilistic summation model. These models establish statistical benchmarks representing the best predicted performance under the assumption that there are no interactions between the two sensory paths. Following this logic, when observed detection performance surpasses the predictions of these models, it is often inferred that such improvement indicates cross-modal convergence. We present a theoretical analysis scrutinizing some of these models and the statistical criteria most frequently used to infer early cross-modal interactions during detection tasks. Our analysis shows how some common misinterpretations of these models lead to their inadequate use and, in turn, to contradictory results and misleading conclusions. To further illustrate the latter point, we introduce a model that accounts for detection performance in multimodal detection tasks but for which surpassing the Pythagorean or probabilistic summation benchmark can be explained without resorting to early cross-modal interactions. Finally, we report three experiments that put our theoretical interpretation to the test and further propose how to adequately measure multimodal interactions in audiotactile detection tasks.
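The two benchmarks named in this abstract are standard in the multisensory literature and can be sketched as follows; this is an illustrative sketch of the textbook formulas, not the authors' own model. Probability summation predicts the bimodal hit rate from two independent channels, and the "Pythagorean" model combines the unimodal sensitivities d' in quadrature.

```python
import math

def probability_summation(p_aud: float, p_tac: float) -> float:
    """Predicted bimodal detection rate if the auditory and tactile
    channels are independent: the target is missed only when both miss."""
    return 1.0 - (1.0 - p_aud) * (1.0 - p_tac)

def pythagorean_dprime(d_aud: float, d_tac: float) -> float:
    """Predicted bimodal sensitivity under independent-noise combination
    of the two unimodal d' values (quadratic summation)."""
    return math.sqrt(d_aud ** 2 + d_tac ** 2)

# Performance above these benchmarks is often taken (per the abstract,
# sometimes wrongly) as evidence of early cross-modal interaction.
p_bimodal = probability_summation(0.5, 0.5)  # 0.75
d_bimodal = pythagorean_dprime(1.0, 1.0)     # ~1.414
```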
Affiliation(s)
- Joan López-Moliner
- Universitat de Barcelona, Barcelona, Spain; Institute for Brain, Cognition and Behaviour (IR3C), Barcelona, Spain
- Gustavo Deco
- Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
- Salvador Soto-Faraco
- Universitat Pompeu Fabra, Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain
24
Heimler B, Weisz N, Collignon O. Revisiting the adaptive and maladaptive effects of crossmodal plasticity. Neuroscience 2014; 283:44-63. PMID: 25139761; DOI: 10.1016/j.neuroscience.2014.08.003.
Abstract
One of the most striking demonstrations of experience-dependent plasticity comes from studies of sensory-deprived individuals (e.g., blind or deaf), showing that brain regions deprived of their natural inputs change their sensory tuning to support the processing of inputs coming from the spared senses. These mechanisms of crossmodal plasticity have traditionally been conceptualized as having a double-edged-sword effect on behavior. On one side, crossmodal plasticity is conceived as adaptive for the development of enhanced behavioral skills in the remaining senses of early-deaf or blind individuals. On the other side, crossmodal plasticity raises crucial challenges for sensory restoration and is typically conceived as maladaptive, since its presence may prevent optimal recovery in sensory-re-afferented individuals. In the present review we stress that this dichotomous view is oversimplified, and we emphasize that the notions of unavoidable adaptive/maladaptive effects of crossmodal reorganization for sensory compensation/restoration may actually be misleading. For this purpose we critically review the findings from the blind and deaf literatures, highlighting the complementary nature of these two fields of research. The integrated framework we propose here has the potential to affect the way rehabilitation programs for sensory recovery are carried out, with the promising prospect of eventually improving their final outcomes.
Affiliation(s)
- B Heimler
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy.
- N Weisz
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy
- O Collignon
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy
25
Petrini K, Remark A, Smith L, Nardini M. When vision is not an option: children's integration of auditory and haptic information is suboptimal. Dev Sci 2014; 17:376-87. PMID: 24612244; PMCID: PMC4240463; DOI: 10.1111/desc.12127.
Abstract
When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g. extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory information from childhood. In the first experiment, adults and children aged 5 to 11 years judged the relative sizes of two objects in auditory, haptic, and non-conflicting bimodal conditions. In Experiment 2, different groups of adults and children were tested in non-conflicting and conflicting bimodal conditions. In Experiment 1, adults reduced sensory uncertainty by integrating the cues optimally, while children did not. In Experiment 2, adults and children used similar weighting strategies to solve audio–haptic conflict. These results suggest that, in the absence of visual information, optimal integration of cues for discrimination of object size develops late in childhood.
Affiliation(s)
- Karin Petrini
- Institute of Ophthalmology, University College London, UK
26
Budd TW, Timora JR. Steady state responses to temporally congruent and incongruent auditory and vibrotactile amplitude modulated stimulation. Int J Psychophysiol 2013; 89:419-32. DOI: 10.1016/j.ijpsycho.2013.06.001.
27
Landry SP, Guillemot JP, Champoux F. Temporary deafness can impair multisensory integration: a study of cochlear-implant users. Psychol Sci 2013; 24:1260-8. PMID: 23722977; DOI: 10.1177/0956797612471142.
Abstract
Previous investigations suggest that temporary deafness can have a dramatic impact on audiovisual speech processing. The aim of this study was to test whether temporary deafness disturbs other multisensory processes in adults. A nonspeech task involving an audiotactile illusion was administered to a group of normally hearing individuals and a group of individuals who had been temporarily auditorily deprived. Members of this latter group had their auditory detection thresholds restored to normal levels through the use of a cochlear implant. Control conditions revealed that auditory and tactile discrimination capabilities were identical in the two groups. However, whereas normally hearing individuals integrated auditory and tactile information, so that they experienced the audiotactile illusion, individuals who had been temporarily deprived did not. Given the basic nature of the task, failure to integrate multisensory information could not be explained by the use of the cochlear implant. Thus, the results suggest that normally anticipated audiotactile interactions are disturbed following temporary deafness.
Affiliation(s)
- Simon P Landry
- Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Montréal, Québec, Canada
28
McMahan W, Gomez ED, Chen L, Bark K, Nappo JC, Koch EI, Lee DI, Dumon KR, Williams NN, Kuchenbecker KJ. A practical system for recording instrument interactions during live robotic surgery. J Robot Surg 2013; 7:351-8. DOI: 10.1007/s11701-013-0399-y.
29
Yau JM, Hollins M, Bensmaia SJ. Textural timbre: The perception of surface microtexture depends in part on multimodal spectral cues. Commun Integr Biol 2013; 2:344-6. PMID: 19721886; DOI: 10.4161/cib.2.4.8551.
Abstract
During haptic exploration of surfaces, complex mechanical oscillations (of surface displacement and air pressure) are generated, which are then transduced by receptors in the skin and in the inner ear. Tactile and auditory signals thus convey redundant information about texture, partially carried in the spectral content of these signals. It is no surprise, then, that the representation of temporal frequency is linked in the auditory and somatosensory systems. An emergent hypothesis is that there exists a supramodal representation of temporal frequency, and by extension of texture.
30
Huang J, Gamble D, Sarnlertsophon K, Wang X, Hsiao S. Integration of auditory and tactile inputs in musical meter perception. Adv Exp Med Biol 2013; 787:453-61. PMID: 23716252; PMCID: PMC4324720; DOI: 10.1007/978-1-4614-1590-9_50.
Abstract
Musicians often say that they not only hear but also "feel" music. To explore the contribution of tactile information to "feeling" music, we investigated the degree that auditory and tactile inputs are integrated in humans performing a musical meter-recognition task. Subjects discriminated between two types of sequences, "duple" (march-like rhythms) and "triple" (waltz-like rhythms), presented in three conditions: (1) unimodal inputs (auditory or tactile alone); (2) various combinations of bimodal inputs, where sequences were distributed between the auditory and tactile channels such that a single channel did not produce coherent meter percepts; and (3) bimodal inputs where the two channels contained congruent or incongruent meter cues. We first show that meter is perceived similarly well (70-85 %) when tactile or auditory cues are presented alone. We next show in the bimodal experiments that auditory and tactile cues are integrated to produce coherent meter percepts. Performance is high (70-90 %) when all of the metrically important notes are assigned to one channel and is reduced to 60 % when half of these notes are assigned to one channel. When the important notes are presented simultaneously to both channels, congruent cues enhance meter recognition (90 %). Performance dropped dramatically when subjects were presented with incongruent auditory cues (10 %), as opposed to incongruent tactile cues (60 %), demonstrating that auditory input dominates meter perception. These observations support the notion that meter perception is a cross-modal percept with tactile inputs underlying the perception of "feeling" music.
Affiliation(s)
- Juan Huang
- The Solomon H. Snyder Department of Neuroscience, The Johns Hopkins University, Baltimore, MD 21205, USA.
31
Canzoneri E, Magosso E, Serino A. Dynamic sounds capture the boundaries of peripersonal space representation in humans. PLoS One 2012; 7:e44306. [PMID: 23028516] [PMCID: PMC3460958] [DOI: 10.1371/journal.pone.0044306]
Abstract
Background: We physically interact with external stimuli when they occur within a limited space immediately surrounding the body, i.e., Peripersonal Space (PPS). In the primate brain, specific fronto-parietal areas are responsible for the multisensory representation of PPS, by integrating tactile, visual and auditory information occurring on and near the body. Dynamic stimuli are particularly relevant for PPS representation, as they might refer to potential harms approaching the body. However, behavioural tasks for studying PPS representation with moving stimuli are lacking. Here we propose a new dynamic audio-tactile interaction task in order to assess the extension of PPS in a more functionally and ecologically valid condition.
Methodology/Principal Findings: Participants vocally responded to a tactile stimulus administered at the hand at different delays from the onset of task-irrelevant dynamic sounds which gave the impression of a sound source either approaching or receding from the subject's hand. Results showed that a moving auditory stimulus speeded up the processing of a tactile stimulus at the hand as long as it was perceived at a limited distance from the hand, that is within the boundaries of PPS representation. The audio-tactile interaction effect was stronger when sounds were approaching compared to when sounds were receding.
Conclusion/Significance: This study provides a new method to dynamically assess PPS representation: The function describing the relationship between tactile processing and the position of sounds in space can be used to estimate the location of PPS boundaries, along a spatial continuum between far and near space, in a valuable and ecologically significant way.
Affiliation(s)
- Elisa Canzoneri
- Dipartimento di Psicologia, ALMA MATER STUDIORUM - Università di Bologna, Bologna, Italy
- Centro studi e ricerche in Neuroscienze Cognitive, Polo Scientifico-Didattico di Cesena, Cesena, Italy
- Elisa Magosso
- Dipartimento di Elettronica, Informatica e Sistemistica, ALMA MATER STUDIORUM - Università di Bologna, Bologna, Italy
- Andrea Serino
- Dipartimento di Psicologia, ALMA MATER STUDIORUM - Università di Bologna, Bologna, Italy
- Centro studi e ricerche in Neuroscienze Cognitive, Polo Scientifico-Didattico di Cesena, Cesena, Italy
32
Nordmark PF, Pruszynski JA, Johansson RS. BOLD responses to tactile stimuli in visual and auditory cortex depend on the frequency content of stimulation. J Cogn Neurosci 2012; 24:2120-34. [PMID: 22721377] [DOI: 10.1162/jocn_a_00261]
Abstract
Although some brain areas preferentially process information from a particular sensory modality, these areas can also respond to other modalities. Here we used fMRI to show that such responsiveness to tactile stimuli depends on the temporal frequency of stimulation. Participants performed a tactile threshold-tracking task where the tip of either their left or right middle finger was stimulated at 3, 20, or 100 Hz. Whole-brain analysis revealed an effect of stimulus frequency in two regions: the auditory cortex and the visual cortex. The BOLD response in the auditory cortex was stronger during stimulation at hearable frequencies (20 and 100 Hz) whereas the response in the visual cortex was suppressed at infrasonic frequencies (3 Hz). Regardless of which hand was stimulated, the frequency-dependent effects were lateralized to the left auditory cortex and the right visual cortex. Furthermore, the frequency-dependent effects in both areas were abolished when the participants performed a visual task while receiving identical tactile stimulation as in the tactile threshold-tracking task. We interpret these findings in the context of the metamodal theory of brain function, which posits that brain areas contribute to sensory processing by performing specific computations regardless of input modality.
Affiliation(s)
- Per F Nordmark
- Department of Integrative Medical Biology, Physiology Section, Umeå University, SE 90187 Umeå, Sweden.
33
Segregated audio–tactile events destabilize the bimanual coordination of distinct rhythms. Exp Brain Res 2012; 219:409-19. [DOI: 10.1007/s00221-012-3103-y]
34
Bolognini N, Cecchetto C, Geraci C, Maravita A, Pascual-Leone A, Papagno C. Hearing Shapes Our Perception of Time: Temporal Discrimination of Tactile Stimuli in Deaf People. J Cogn Neurosci 2012; 24:276-86. [DOI: 10.1162/jocn_a_00135]
Abstract
Confronted with the loss of one type of sensory input, we compensate using information conveyed by other senses. However, losing one type of sensory information at specific developmental times may lead to deficits across all sensory modalities. We addressed the effect of auditory deprivation on the development of tactile abilities, taking into account changes occurring at the behavioral and cortical level. Congenitally deaf and hearing individuals performed two tactile tasks, the first requiring the discrimination of the temporal duration of touches and the second requiring the discrimination of their spatial length. Compared with hearing individuals, deaf individuals were impaired only in tactile temporal processing. To explore the neural substrate of this difference, we ran a TMS experiment. In deaf individuals, the auditory association cortex was involved in temporal and spatial tactile processing, with the same chronometry as the primary somatosensory cortex. In hearing participants, the involvement of auditory association cortex occurred at a later stage and selectively for temporal discrimination. The different chronometry in the recruitment of the auditory cortex in deaf individuals correlated with the tactile temporal impairment. Thus, early hearing experience seems to be crucial to develop an efficient temporal processing across modalities, suggesting that plasticity does not necessarily result in behavioral compensation.
35
Evaluating User Response to In-Car Haptic Feedback Touchscreens Using the Lane Change Test. Adv Hum Comput Interact 2012. [DOI: 10.1155/2012/598739]
Abstract
Touchscreen interfaces are widely used in modern technology, from mobile devices to in-car infotainment systems. However, touchscreens impose significant visual workload demands on the user which have safety implications for use in cars. Previous studies indicate that the application of haptic feedback can improve both performance of and affective response to user interfaces. This paper reports on and extends the findings of a 2009 study conducted to evaluate the effects of different combinations of touchscreen visual, audible, and haptic feedback on driving and task performance, affective response, and subjective workload; the initial findings of which were originally published in (M. J. Pitts et al., 2009). A total of 48 non-expert users completed the study. A dual-task approach was applied, using the Lane Change Test as the driving task and realistic automotive use case touchscreen tasks. Results indicated that, while feedback type had no effect on driving or task performance, preference was expressed for multimodal feedback over visual alone. Issues relating to workload and cross-modal interaction were also identified.
36
Abstract
In the present review, we focus on how commonalities in the ontogenetic development of the auditory and tactile sensory systems may inform the interplay between these signals in the temporal domain. In particular, we describe the results of behavioral studies that have investigated temporal resolution (in temporal order, synchrony/asynchrony, and simultaneity judgment tasks), as well as temporal numerosity perception, and similarities in the perception of frequency across touch and hearing. The evidence reviewed here highlights features of audiotactile temporal perception that are distinctive from those seen for other pairings of sensory modalities. For instance, audiotactile interactions are characterized in certain tasks (e.g., temporal numerosity judgments) by a more balanced reciprocal influence than are other modality pairings. Moreover, relative spatial position plays a different role in the temporal order and temporal recalibration processes for audiotactile stimulus pairings than for other modality pairings. The effect exerted by both the spatial arrangement of stimuli and attention on temporal order judgments is described. Moreover, a number of audiotactile interactions occurring during sensory-motor synchronization are highlighted. We also look at the audiotactile perception of rhythm and how it may be affected by musical training. The differences emerging from this body of research highlight the need for more extensive investigation into audiotactile temporal interactions. We conclude with a brief overview of some of the key issues deserving of further research in this area.
37
Champoux F, Shiller DM, Zatorre RJ. Feel what you say: an auditory effect on somatosensory perception. PLoS One 2011; 6:e22829. [PMID: 21857955] [PMCID: PMC3152559] [DOI: 10.1371/journal.pone.0022829]
Abstract
In the present study, we demonstrate an audiotactile effect in which amplitude modulation of auditory feedback during voiced speech induces a throbbing sensation over the lip and laryngeal regions. Control tasks coupled with the examination of speech acoustic parameters allow us to rule out the possibility that the effect may have been due to cognitive factors or motor compensatory effects. We interpret the effect as reflecting the tight interplay between auditory and tactile modalities during vocal production.
Affiliation(s)
- François Champoux
- Centre de recherche interdisciplinaire en réadaptation du Montréal métropolitain/Institut Raymond-Dewar, Montreal, Quebec, Canada.
38
Overvliet KE, Soto-Faraco S. I can't believe this isn't wood! An investigation in the perception of naturalness. Acta Psychol (Amst) 2011; 136:95-111. [PMID: 21092921] [DOI: 10.1016/j.actpsy.2010.10.007]
Abstract
For most people "naturalness" is a highly appreciated material characteristic. For instance, a natural wooden floor is seen as more valuable than a fake replica, though they may be comparable in quality and durability. In the present study we investigated how sensory input (vision and touch) contributes to the perception of naturalness in wood. Participants rated samples of wood or imitations thereof, such as vinyl and veneers. We first attempted to provide a validation of the measurement of perceived naturalness by comparing four psychophysical measurement methods (labelled scaling, magnitude estimation, binary decision, and ranked ordering). Second, we investigated the contribution of vision and touch by measuring the perception of naturalness in three exploration modalities (vision only, touch only, and visuo-tactile). The results show a high degree of consistency across measurement methods, suggesting that we measured a common underlying construct that relates to naturalness. It also suggests that this construct is represented on a metathetic (categorical) continuum. Moreover, we found that both vision and touch are highly correlated predictors of visuo-tactile perception of naturalness.
39
Early- and Late-Onset Blindness Both Curb Audiotactile Integration on the Parchment-Skin Illusion. Psychol Sci 2010; 22:19-25. [DOI: 10.1177/0956797610391099]
Abstract
It has been shown that congenital blindness can lead to anomalies in the integration of auditory and tactile information, at least under certain conditions. In the present study, we used the parchment-skin illusion, a robust illustration of sound-biased perception of touch based on changes in frequency, to investigate the specificities of audiotactile interactions in early- and late-onset blind individuals. Blind individuals in both groups did not experience any illusory change in tactile perception when the frequency of the auditory signal was modified, whereas sighted individuals consistently experienced the illusion. This demonstration that blind individuals had reduced susceptibility to an auditory-tactile illusion suggests either that vision is necessary for the establishment of audiotactile interactions or that auditory and tactile information can be processed more independently in blind individuals than in sighted individuals. In addition, the results obtained in late-onset blind participants suggest that visual input may play a role in the maintenance of audiotactile integration.
40
Yau JM, Weber AI, Bensmaia SJ. Separate mechanisms for audio-tactile pitch and loudness interactions. Front Psychol 2010; 1:160. [PMID: 21887147] [PMCID: PMC3157934] [DOI: 10.3389/fpsyg.2010.00160]
Abstract
A major goal in perceptual neuroscience is to understand how signals from different sensory modalities are combined to produce stable and coherent representations. We previously investigated interactions between audition and touch, motivated by the fact that both modalities are sensitive to environmental oscillations. In our earlier study, we characterized the effect of auditory distractors on tactile frequency and intensity perception. Here, we describe the converse experiments examining the effect of tactile distractors on auditory processing. Because the two studies employ the same psychophysical paradigm, we combined their results for a comprehensive view of how auditory and tactile signals interact and how these interactions depend on the perceptual task. Together, our results show that temporal frequency representations are perceptually linked regardless of the attended modality. In contrast, audio-tactile loudness interactions depend on the attended modality: Tactile distractors influence judgments of auditory intensity, but judgments of tactile intensity are impervious to auditory distraction. Lastly, we show that audio-tactile loudness interactions depend critically on stimulus timing, while pitch interactions do not. These results reveal that auditory and tactile inputs are combined differently depending on the perceptual task. That distinct rules govern the integration of auditory and tactile signals in pitch and loudness perception implies that the two are mediated by separate neural mechanisms. These findings underscore the complexity and specificity of multisensory interactions.
Affiliation(s)
- Jeffrey M Yau
- Department of Neurology, Division of Cognitive Neuroscience, Johns Hopkins University School of Medicine Baltimore, MD, USA
41
Abstract
In six experiments, subjects judged the sizes of squares that were presented visually and/or haptically, in unimodal or bimodal conditions. We were interested in which mode most affected size judgments in the bimodal condition when the squares presented to each mode actually differed in size. Three factors varied: whether haptic exploration was passive or active, whether the choice set from which the subjects selected their responses was visual or haptic, and whether cutaneous information was provided in addition to kinesthetic information. To match the task for each mode, visual presentations consisted of a cursor that moved along a square pathway to correspond to the haptic experience of successive segments revealed during exploration. We found that the visual influence on size judgments was greater than the influence of haptics when the haptic experience involved only kinesthesis, passive movement, and a visual choice set. However, when cutaneous input was added to kinesthetic information, size judgments were most influenced by the haptic mode. The results support hypotheses of sensory integration, rather than capture of one sense by the other.
42
Aspell JE, Lavanchy T, Lenggenhager B, Blanke O. Seeing the body modulates audiotactile integration. Eur J Neurosci 2010; 31:1868-73. [DOI: 10.1111/j.1460-9568.2010.07210.x]
43
Audio-tactile superiority over visuo-tactile and audio-visual combinations in the temporal resolution of synchrony perception. Exp Brain Res 2009; 198:245-59. [PMID: 19499212] [DOI: 10.1007/s00221-009-1870-x]
Abstract
To see whether there is a difference in temporal resolution of synchrony perception between audio-visual (AV), visuo-tactile (VT), and audio-tactile (AT) combinations, we compared synchrony-asynchrony discrimination thresholds of human participants. Visual and auditory stimuli were, respectively, a luminance-modulated Gaussian blob and an amplitude-modulated white noise. Tactile stimuli were mechanical vibrations presented to the index finger. All the stimuli were temporally modulated by either single pulses or repetitive-pulse trains. The results show that the temporal resolution of synchrony perception was similar for AV and VT (e.g., approximately 4 Hz for repetitive-pulse stimuli), but significantly higher for AT (approximately 10 Hz). Apart from having a higher temporal resolution, however, AT synchrony perception was similar to AV synchrony perception in that participants could select matching features through attention, and a change in the matching-feature attribute had little effect on temporal resolution. The AT superiority in temporal resolution was indicated not only by synchrony-asynchrony discrimination but also by simultaneity judgments. Temporal order judgments were less affected by modality combination than the other two tasks.