1
Bulusu V, Lazar L. Crossmodal associations between naturally occurring tactile and sound textures. Perception 2024;53:219-239. PMID: 38304994. DOI: 10.1177/03010066231224557.
Abstract
This study investigates the crossmodal associations between naturally occurring sound textures and tactile textures. Previous research has demonstrated associations between low-level sensory features of sound and touch, as well as higher-level, cognitively mediated associations involving language, emotions, and metaphors. However, stimuli such as textures, which occur in both modalities, have received less attention. In this study, we conducted two experiments, a free-association task and a two-alternative forced-choice task, using everyday tactile textures and sound textures selected from natural sound categories. The results revealed consistent crossmodal associations reported by participants between the textures of the two modalities. Participants tended to associate a greater number of sound textures (e.g., wood shavings and sandpaper) with tactile surfaces that were rated as harder, rougher, and intermediate on the sticky-slippery scale. While some participants based the auditory-tactile association on sensory features, others made the associations based on semantic relationships, co-occurrence in nature, and emotional mediation. Interestingly, the statistical features of the sound textures (mean, variance, kurtosis, power, autocorrelation, and correlation) did not show significant correlations with the crossmodal associations, indicating a higher-level association. This study provides insight into auditory-tactile associations by highlighting the role of sensory and emotional (or cognitive) factors in prompting these associations.
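The statistical features listed in the abstract are summary statistics of a sound texture's amplitude envelope. As a hedged illustration of what computing them involves (the rectify-and-smooth envelope extraction, window length, and autocorrelation lag below are assumptions made for this sketch, not the authors' pipeline):

```python
import numpy as np

def texture_stats(signal, sr, smooth_ms=20.0, ac_lag_ms=10.0):
    """Summary statistics of a sound texture's amplitude envelope.

    Envelope extraction here (rectify + moving average) is an
    illustrative assumption, not the paper's exact method.
    """
    win = max(1, int(sr * smooth_ms / 1000.0))
    env = np.convolve(np.abs(signal), np.ones(win) / win, mode="same")
    lag = max(1, int(sr * ac_lag_ms / 1000.0))
    e = env - env.mean()
    denom = np.sum(e * e)
    autocorr = np.sum(e[:-lag] * e[lag:]) / denom if denom > 0 else 0.0
    return {
        "mean": env.mean(),
        "variance": env.var(),
        # kurtosis: fourth standardized moment (3.0 for a Gaussian)
        "kurtosis": np.mean(e**4) / (env.var() ** 2),
        "power": np.mean(signal**2),
        "autocorr": autocorr,
    }

# Example: flat white noise vs. 4 Hz amplitude-modulated noise --
# modulation inflates the envelope variance, one axis on which
# textures differ.
rng = np.random.default_rng(0)
sr = 8000
t = np.arange(sr) / sr
noise = rng.standard_normal(sr)
am = noise * (1.0 + 0.9 * np.sin(2 * np.pi * 4 * t))
flat, mod = texture_stats(noise, sr), texture_stats(am, sr)
```

Correlating such feature vectors with the behavioral association matrix is, in outline, the analysis that the abstract reports came out null.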
Affiliation(s)
- Leslee Lazar
- Indian Institute of Technology Gandhinagar, India
2
Landelle C, Caron-Guyon J, Nazarian B, Anton J, Sein J, Pruvost L, Amberg M, Giraud F, Félician O, Danna J, Kavounoudias A. Beyond sense-specific processing: decoding texture in the brain from touch and sonified movement. iScience 2023;26:107965. PMID: 37810223. PMCID: PMC10551894. DOI: 10.1016/j.isci.2023.107965.
Abstract
Texture, a fundamental object attribute, is perceived through multisensory information, including touch and auditory cues. Coherent perception may rely on texture representations shared across the senses in the brain. To test this hypothesis, we delivered haptic textures coupled with a sound synthesizer that generated textural sounds in real time. Participants completed roughness-estimation tasks with haptic, auditory, or bimodal cues in an MRI scanner. Somatosensory, auditory, and visual cortices were all activated during haptic and auditory exploration, challenging the traditional view that primary sensory cortices are sense-specific. Furthermore, audio-tactile integration was found in the secondary somatosensory (S2) and primary auditory cortices. Multivariate analyses revealed shared spatial activity patterns in primary motor and somatosensory cortices for discriminating texture across both modalities. This study indicates that primary areas and S2 carry a versatile representation of multisensory textures, which has significant implications for how the brain processes multisensory cues to interact more efficiently with our environment.
Affiliation(s)
- C. Landelle
- McGill University, McConnell Brain Imaging Centre, Department of Neurology and Neurosurgery, Montreal Neurological Institute, Montreal, QC, Canada
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- J. Caron-Guyon
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- University of Louvain, Institute for Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, Crossmodal Perception and Plasticity Laboratory, Louvain-la-Neuve, Belgium
- B. Nazarian
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J.L. Anton
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- J. Sein
- Aix-Marseille Université, CNRS, Centre IRM-INT@CERIMED, Institut de Neurosciences de la Timone, INT UMR 7289, Marseille, France
- L. Pruvost
- Aix-Marseille Université, CNRS, Perception, Représentations, Image, Son, Musique, PRISM UMR 7061, Marseille, France
- M. Amberg
- Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- F. Giraud
- Université Lille, Laboratoire d'Electrotechnique et d'Electronique de Puissance, EA 2697-L2EP, Lille, France
- O. Félician
- Aix Marseille Université, INSERM, Institut des Neurosciences des Systèmes, INS UMR 1106, Marseille, France
- J. Danna
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
- Université de Toulouse, CNRS, Laboratoire Cognition, Langues, Langage, Ergonomie, CLLE UMR5263, Toulouse, France
- A. Kavounoudias
- Aix-Marseille Université, CNRS, Laboratoire de Neurosciences Cognitives, LNC UMR 7291, Marseille, France
3
Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022;16:1010211. PMID: 36330342. PMCID: PMC9622781. DOI: 10.3389/fnins.2022.1010211.
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. 
Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
4
Sathian K, Lacey S. Cross-modal interactions of the tactile system. Current Directions in Psychological Science 2022;31:411-418. PMID: 36408466. PMCID: PMC9674209. DOI: 10.1177/09637214221101877.
Abstract
The sensory systems responsible for perceptions of touch, vision, hearing, etc. have traditionally been regarded as mostly separate, only converging at late stages of processing. Contrary to this dogma, recent work has shown that interactions between the senses are robust and abundant. Touch and vision are both commonly used to obtain information about a number of object properties, and share perceptual and neural representations in many domains. Additionally, visuotactile interactions are implicated in the sense of body ownership, as revealed by powerful illusions that can be evoked by manipulating these interactions. Touch and hearing both rely in part on temporal frequency information, leading to a number of audiotactile interactions reflecting a good deal of perceptual and neural overlap. The focus in sensory neuroscience and psychophysics is now on characterizing the multisensory interactions that lead to our panoply of perceptual experiences.
Affiliation(s)
- K. Sathian
- Department of Neurology, Penn State Health Milton S. Hershey Medical Center
- Department of Neural & Behavioral Sciences, Penn State College of Medicine
- Department of Psychology, Penn State College of Liberal Arts
- Simon Lacey
- Department of Neurology, Penn State Health Milton S. Hershey Medical Center
- Department of Neural & Behavioral Sciences, Penn State College of Medicine
5
Sharma D, Ng KKW, Birznieks I, Vickery RM. Auditory clicks elicit equivalent temporal frequency perception to tactile pulses: A cross-modal psychophysical study. Front Neurosci 2022;16:1006185. PMID: 36161171. PMCID: PMC9500524. DOI: 10.3389/fnins.2022.1006185.
Abstract
Both hearing and touch are sensitive to the frequency of mechanical oscillations—sound waves and tactile vibrations, respectively. The mounting evidence of parallels in temporal frequency processing between the two sensory systems led us to directly address the question of perceptual frequency equivalence between touch and hearing using stimuli of simple and more complex temporal features. In a cross-modal psychophysical paradigm, subjects compared the perceived frequency of pulsatile mechanical vibrations to that elicited by pulsatile acoustic (click) trains, and vice versa. Non-invasive pulsatile stimulation designed to excite a fixed population of afferents was used to induce desired temporal spike trains at frequencies spanning flutter up to vibratory hum (>50 Hz). The cross-modal perceived frequency for regular test pulse trains of either modality was a close match to the presented stimulus physical frequency up to 100 Hz. We then tested whether the recently discovered “burst gap” temporal code for frequency, that is shared by the two senses, renders an equivalent cross-modal frequency perception. When subjects compared trains comprising pairs of pulses (bursts) in one modality against regular trains in the other, the cross-sensory equivalent perceptual frequency best corresponded to the silent interval between the successive bursts in both auditory and tactile test stimuli. These findings suggest that identical acoustic and vibrotactile pulse trains, regardless of pattern, elicit equivalent frequencies, and imply analogous temporal frequency computation strategies in both modalities. This perceptual correspondence raises the possibility of employing a cross-modal comparison as a robust standard to overcome the prevailing methodological limitations in psychophysical investigations and strongly encourages cross-modal approaches for transmitting sensory information such as translating pitch into a similar pattern of vibration on the skin.
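The "burst gap" code referred to above predicts perceived frequency from the silent interval between successive bursts rather than from the mean pulse rate. A minimal sketch of that prediction (the grouping threshold and the timing values are hypothetical, chosen only to illustrate the arithmetic):

```python
def burst_gap_frequency(pulse_times_ms, burst_gap_threshold_ms=10.0):
    """Predicted perceived frequency (Hz) under a burst-gap code.

    Pulses closer together than `burst_gap_threshold_ms` are grouped
    into one burst; the perceived period is taken as the mean silent
    gap from the end of one burst to the start of the next. The
    threshold value is an illustrative assumption.
    """
    bursts = [[pulse_times_ms[0]]]
    for t in pulse_times_ms[1:]:
        if t - bursts[-1][-1] < burst_gap_threshold_ms:
            bursts[-1].append(t)   # continue the current burst
        else:
            bursts.append([t])     # start a new burst
    # silent gaps: end of burst i -> start of burst i+1
    gaps = [bursts[i + 1][0] - bursts[i][-1] for i in range(len(bursts) - 1)]
    return 1000.0 / (sum(gaps) / len(gaps))

# Paired pulses 5 ms apart, with a burst starting every 50 ms: the mean
# pulse rate is 40 Hz, but the burst-gap code predicts 1000/45 ~ 22 Hz.
train = [t for start in range(0, 500, 50) for t in (start, start + 5)]
predicted = burst_gap_frequency(train)
```

For a regular single-pulse train, each "burst" is one pulse and the gap equals the inter-pulse interval, so the prediction collapses to the physical frequency, consistent with the cross-modal matches reported up to 100 Hz.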
Affiliation(s)
- Deepak Sharma
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Kevin K. W. Ng
- Center for Social and Affective Neuroscience, Department of Biomedical and Clinical Sciences, Linköping University, Linköping, Sweden
- Ingvars Birznieks
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Bionics and Bio-Robotics, Tyree Foundation Institute of Health Engineering, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Richard M. Vickery
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Bionics and Bio-Robotics, Tyree Foundation Institute of Health Engineering, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
6
He J, Ren H, Li J, Dong M, Dai L, Li Z, Miao Y, Li Y, Tan P, Gu L, Chen X, Tang J. Deficits in Sense of Body Ownership, Sensory Processing, and Temporal Perception in Schizophrenia Patients With/Without Auditory Verbal Hallucinations. Front Neurosci 2022;16:831714. PMID: 35495040. PMCID: PMC9046910. DOI: 10.3389/fnins.2022.831714.
Abstract
It has been claimed that individuals with schizophrenia have difficulty in self-recognition and, consequently, are unable to identify the sources of their sensory perceptions or thoughts, resulting in delusions, hallucinations, and unusual experiences of body ownership. These deficits also contribute to an enhanced rubber hand illusion (RHI; a body-perception illusion induced by synchronous visual and tactile stimulation). Evidence based on RHI paradigms is emerging that auditory information can influence the sense of body ownership, which relies on the processing and integration of multisensory inputs. Hence, we assumed that auditory verbal hallucinations (AVHs), as an abnormal auditory perception, could be linked with body ownership, and that the RHI paradigm could be used in patients with AVHs to explore the underlying mechanisms. In this study, we investigated the performance of patients with and without AVHs in the RHI. We administered the RHI paradigm to 80 patients with schizophrenia (47 with AVHs and 33 without AVHs) and 36 healthy controls (HCs). We conducted the experiment under two conditions (synchronous and asynchronous) and evaluated the RHI effects with both objective and subjective measures. Both patient groups experienced the RHI more quickly and strongly than HCs. The RHI effects of patients with AVHs were significantly smaller than those of patients without AVHs. Another important finding was that patients with AVHs did not show a reduction in RHI under the asynchronous condition. These results emphasize the disturbances of the sense of body ownership in schizophrenia patients with and without AVHs and the associations with AVHs. Furthermore, they suggest that patients with AVHs may have multisensory processing dysfunctions and internal timing deficits.
Affiliation(s)
- Jingqi He
- Department of Psychiatry, National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, China
- Honghong Ren
- Department of Psychiatry, National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, China
- Jinguang Li
- Department of Psychiatry, National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, China
- Affiliated Wuhan Mental Health Center, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Min Dong
- Guangdong Mental Health Center, Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangzhou, China
- Lulin Dai
- Department of Neurosurgery, Center for Functional Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Zhijun Li
- Department of Psychiatry, National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, China
- Yating Miao
- Department of Psychiatry, National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, China
- Yunjin Li
- Department of Pathology, School of Basic Medical Sciences, Central South University, Changsha, China
- Peixuan Tan
- Department of Medical Psychology and Behavioral Medicine, School of Public Health, Guangxi Medical University, Nanning, China
- Lin Gu
- RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
- Research Center for Advanced Science and Technology (RCAST), University of Tokyo, Tokyo, Japan
- Xiaogang Chen
- Department of Psychiatry, National Clinical Research Center for Mental Disorders, The Second Xiangya Hospital of Central South University, Changsha, China
- Jinsong Tang
- Department of Psychiatry, Sir Run-Run Shaw Hospital, School of Medicine, Zhejiang University, Hangzhou, China
- Zigong Mental Health Center, Zigong, China
7
Bernard C, Monnoyer J, Wiertlewski M, Ystad S. Rhythm perception is shared between audio and haptics. Sci Rep 2022;12:4188. PMID: 35264703. PMCID: PMC8907191. DOI: 10.1038/s41598-022-08152-w.
Abstract
A surface texture is perceived through both the sound and the vibrations produced while it is explored by our fingers. Because of their common origin, the two modalities strongly influence each other, particularly above 60 Hz, where vibrotactile perception and pitch perception share common neural processes. However, whether the sensation of rhythm is shared between audio and haptic perception is still an open question. In this study, we show striking similarities between the audio and haptic perception of rhythmic changes, and demonstrate the interaction of both modalities below 60 Hz. Using a new surface-haptic device to synthesize arbitrary audio-haptic textures, our psychophysical experiments demonstrate that the perception threshold curves of audio and haptic rhythmic gradients are the same. Moreover, multimodal integration occurs when audio and haptic rhythmic gradients are congruent. We propose a multimodal model of rhythm perception to explain these observations. These findings suggest that audio and haptic signals are likely to be processed by common neural mechanisms for the perception of rhythm as well. They provide a framework for audio-haptic stimulus generation that is beneficial for nonverbal communication and modern human-machine interfaces.
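Comparing audio and haptic threshold curves presupposes a way of reading a threshold off proportion-detected data in each modality. A generic sketch of that step (the logistic shape, the coarse grid search, the 75% criterion, and the data values are all assumptions for illustration, not the paper's procedure):

```python
import numpy as np

def fit_threshold(levels, p_detected, criterion=0.75):
    """Fit a logistic psychometric function by coarse grid search and
    return the stimulus level at `criterion` performance."""
    levels = np.asarray(levels, float)
    p = np.asarray(p_detected, float)
    best = None
    for mu in np.linspace(levels.min(), levels.max(), 201):
        for s in np.linspace(0.05, 5.0, 100):
            pred = 1.0 / (1.0 + np.exp(-(levels - mu) / s))
            err = np.sum((pred - p) ** 2)
            if best is None or err < best[0]:
                best = (err, mu, s)
    _, mu, s = best
    # invert the fitted logistic at the criterion level
    return mu + s * np.log(criterion / (1.0 - criterion))

# Hypothetical proportion-detected data for one modality
levels = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
p = [0.05, 0.20, 0.55, 0.80, 0.95, 1.00]
thr = fit_threshold(levels, p)
```

Estimating such a threshold per stimulus condition in each modality, then overlaying the two curves, is the comparison the abstract summarizes.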
Affiliation(s)
- Corentin Bernard
- CNRS, PRISM, Aix-Marseille Univ, Marseille, France
- Centre Technique de Vélizy, Stellantis, Paris, France
- CNRS, ISM, Aix-Marseille Univ, Marseille, France
- Jocelyn Monnoyer
- Centre Technique de Vélizy, Stellantis, Paris, France
- CNRS, ISM, Aix-Marseille Univ, Marseille, France
- Sølvi Ystad
- CNRS, PRISM, Aix-Marseille Univ, Marseille, France
8
Whitton S, Kim JM, Scurry AN, Otto S, Zhuang X, Cordes D, Jiang F. Multisensory temporal processing in early deaf. Neuropsychologia 2021;163:108069. PMID: 34715119. PMCID: PMC8653765. DOI: 10.1016/j.neuropsychologia.2021.108069.
Abstract
Navigating the world relies on understanding progressive sequences of multisensory events across time. Early deaf (ED) individuals are more precise in visual detection of space and motion than their normal hearing (NH) counterparts. However, whether ED individuals show altered multisensory temporal processing abilities is less clear. According to the connectome model, brain development depends on experience, and therefore the lack of audition may affect how the brain responds to remaining senses and how they are functionally connected. We used a temporal order judgment (TOJ) task to examine multisensory (visuotactile) temporal processing in ED and NH groups. We quantified BOLD responses and functional connectivity (FC) in both groups. ED and NH groups performed similarly for the visuotactile TOJ task. Bilateral posterior superior temporal sulcus (pSTS) BOLD responses during the TOJ task were significantly larger in the ED group than in NH. Using anatomically defined pSTS seeds, our FC analysis revealed stronger somatomotor and weaker visual regional connections in the ED group than in NH during the TOJ task. These results suggest that a lack of auditory input might alter the balance of tactile and visual area FC with pSTS when a multisensory temporal task is involved.
Affiliation(s)
- Simon Whitton
- Department of Psychology, University of Nevada, Reno, USA
- Jung Min Kim
- Department of Psychology, University of Nevada, Reno, USA
- Stephanie Otto
- Department of Psychology, University of Nevada, Reno, USA
- Xiaowei Zhuang
- Cleveland Clinic Lou Ruvo Center for Brain Health, Las Vegas, USA
- Dietmar Cordes
- Cleveland Clinic Lou Ruvo Center for Brain Health, Las Vegas, USA
- Fang Jiang
- Department of Psychology, University of Nevada, Reno, USA
9
Electro-Tactile Stimulation Enhances Cochlear-Implant Melody Recognition: Effects of Rhythm and Musical Training. Ear Hear 2021;41:106-113. PMID: 31884501. DOI: 10.1097/aud.0000000000000749.
Abstract
OBJECTIVES Electro-acoustic stimulation (EAS) enhances speech and music perception in cochlear-implant (CI) users who have residual low-frequency acoustic hearing. For CI users who do not have low-frequency acoustic hearing, tactile stimulation may be used in a similar fashion as residual low-frequency acoustic hearing to enhance CI performance. Previous studies showed that electro-tactile stimulation (ETS) enhanced speech recognition in noise and tonal language perception for CI listeners. Here, we examined the effect of ETS on melody recognition in both musician and nonmusician CI users. DESIGN Nine musician and eight nonmusician CI users were tested in a melody recognition task with or without rhythmic cues in three testing conditions: CI only (E), tactile only (T), and combined CI and tactile stimulation (ETS). RESULTS Overall, the combined electrical and tactile stimulation enhanced melody recognition performance in CI users by 9 percentage points. Two additional findings were observed. First, musician CI users outperformed nonmusician CI users in melody recognition, but the size of the enhancement effect was similar between the two groups. Second, the ETS enhancement was significantly greater with nonrhythmic melodies than with rhythmic melodies in both groups. CONCLUSIONS These findings suggest that, independent of musical experience, the size of the ETS enhancement depends on integration efficiency between tactile and auditory stimulation, and that the mechanism of the ETS enhancement is improved electric pitch perception. The present study supports the hypothesis that tactile stimulation can be used to improve pitch perception in CI users.
10
Taffou M, Suied C, Viaud-Delmon I. Auditory roughness elicits defense reactions. Sci Rep 2021;11:956. PMID: 33441758. PMCID: PMC7806762. DOI: 10.1038/s41598-020-79767-0.
Abstract
Auditory roughness elicits aversion and higher activation in cerebral areas involved in threat processing, but its link with defensive behavior is unknown. Defensive behaviors are triggered by intrusions into the space immediately surrounding the body, called peripersonal space (PPS). Integrating multisensory information in PPS is crucial to ensure the protection of the body. Here, we assessed the behavioral effects of roughness on auditory-tactile integration, which reflects the monitoring of this multisensory region of space. Healthy human participants had to detect as fast as possible a tactile stimulation delivered on their hand while an irrelevant sound approached them from the rear hemifield. The sound was either a simple harmonic sound or a rough sound, processed through binaural rendering so that the virtual sound source loomed towards the participants. The rough sound speeded tactile reaction times at a farther distance from the body than the non-rough sound did. This indicates that PPS, as estimated here via auditory-tactile integration, is sensitive to auditory roughness. Auditory roughness modifies the behavioral relevance of simple auditory events in relation to the body. Even without emotional or social contextual information, auditory roughness constitutes an innate threat cue that elicits defensive responses.
Affiliation(s)
- Marine Taffou
- Institut de Recherche Biomédicale des Armées, 91220, Brétigny-sur-Orge, France
- Clara Suied
- Institut de Recherche Biomédicale des Armées, 91220, Brétigny-sur-Orge, France
- Isabelle Viaud-Delmon
- CNRS, Ircam, Sorbonne Université, Ministère de la Culture, Sciences et Technologies de la Musique et du son, STMS, 75004, Paris, France
11
Scurry AN, Huber E, Matera C, Jiang F. Increased Right Posterior STS Recruitment Without Enhanced Directional-Tuning During Tactile Motion Processing in Early Deaf Individuals. Front Neurosci 2020;14:864. PMID: 32982667. PMCID: PMC7477335. DOI: 10.3389/fnins.2020.00864.
Abstract
Upon early sensory deprivation, the remaining modalities often exhibit cross-modal reorganization, such as primary auditory cortex (PAC) recruitment for visual motion processing in early deafness (ED). Previous studies of compensatory plasticity in ED individuals have given less attention to tactile motion processing. In the current study, we aimed to examine the effects of early auditory deprivation on tactile motion processing. We simulated four directions of tactile motion on each participant's right index finger and characterized their tactile motion responses and directional-tuning profiles using population receptive field analysis. Similar tactile motion responses were found within primary (SI) and secondary (SII) somatosensory cortices between ED and hearing control groups, whereas ED individuals showed a reduced proportion of voxels with directionally tuned responses in SI contralateral to stimulation. There were also significant but minimal responses to tactile motion within PAC for both groups. While early deaf individuals show significantly larger recruitment of right posterior superior temporal sulcus (pSTS) region upon tactile motion stimulation, there was no evidence of enhanced directional tuning. Greater recruitment of right pSTS region is consistent with prior studies reporting reorganization of multimodal areas due to sensory deprivation. The absence of increased directional tuning within the right pSTS region may suggest a more distributed population of neurons dedicated to processing tactile spatial information as a consequence of early auditory deprivation.
Affiliation(s)
- Alexandra N Scurry
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Elizabeth Huber
- Department of Speech and Hearing Sciences, Institute for Learning & Brain Sciences, University of Washington, Seattle, WA, United States
- Courtney Matera
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
- Fang Jiang
- Department of Psychology, University of Nevada, Reno, Reno, NV, United States
12
Rahman MS, Barnes KA, Crommett LE, Tommerdahl M, Yau JM. Auditory and tactile frequency representations are co-embedded in modality-defined cortical sensory systems. Neuroimage 2020;215:116837. PMID: 32289461. PMCID: PMC7292761. DOI: 10.1016/j.neuroimage.2020.116837.
Abstract
Sensory information is represented and elaborated in hierarchical cortical systems that are thought to be dedicated to individual sensory modalities. This traditional view of sensory cortex organization has been challenged by recent evidence of multimodal responses in primary and association sensory areas. Although it is indisputable that sensory areas respond to multiple modalities, it remains unclear whether these multimodal responses reflect selective information processing for particular stimulus features. Here, we used fMRI adaptation to identify brain regions that are sensitive to the temporal frequency information contained in auditory, tactile, and audiotactile stimulus sequences. A number of brain regions distributed over the parietal and temporal lobes exhibited frequency-selective temporal response modulation for both auditory and tactile stimulus events, as indexed by repetition suppression effects. A smaller set of regions responded to crossmodal adaptation sequences in a frequency-dependent manner. Despite an extensive overlap of multimodal frequency-selective responses across the parietal and temporal lobes, representational similarity analysis revealed a cortical "regional landscape" that clearly reflected distinct somatosensory and auditory processing systems that converged on modality-invariant areas. These structured relationships between brain regions were also evident in spontaneous signal fluctuation patterns measured at rest. Our results reveal that multimodal processing in human cortex can be feature-specific and that multimodal frequency representations are embedded in the intrinsically hierarchical organization of cortical sensory systems.
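Representational similarity analysis, used above to map the cortical "regional landscape", compares regions by correlating their representational dissimilarity matrices (RDMs): regions cluster when their responses induce the same dissimilarity structure over conditions. A toy sketch with synthetic condition-by-voxel patterns (all data here are made up for illustration, not the study's):

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the response patterns of every pair of conditions.
    `patterns` is a (conditions x voxels) array."""
    return 1.0 - np.corrcoef(patterns)

def rdm_similarity(rdm_a, rdm_b):
    """Correlate the upper triangles of two RDMs (second-order similarity)."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

rng = np.random.default_rng(1)
base = rng.standard_normal((5, 40))                    # 5 conditions x 40 voxels
region_a = base + 0.05 * rng.standard_normal((5, 40))  # shares geometry with base
region_b = rng.standard_normal((5, 40))                # unrelated region
shared = rdm_similarity(rdm(base), rdm(region_a))
unrelated = rdm_similarity(rdm(base), rdm(region_b))
```

Hierarchically clustering regions by such second-order similarities is one common way of recovering the modality-defined processing streams the abstract describes.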
Affiliation(s)
- Md Shoaibur Rahman: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Kelly Anne Barnes: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA; Department of Behavioral and Social Sciences, San Jacinto College - South, 13735 Beamer Rd, S13.269, Houston, TX 77089, USA
- Lexi E Crommett: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Mark Tommerdahl: Department of Biomedical Engineering, University of North Carolina at Chapel Hill, CB No. 7575, Chapel Hill, NC 27599, USA
- Jeffrey M Yau: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
13
Villalonga MB, Sussman RF, Sekuler R. Feeling the Beat (and Seeing It, Too): Vibrotactile, Visual, and Bimodal Rate Discrimination. Multisens Res 2020; 33:31-59. [PMID: 31648198] [DOI: 10.1163/22134808-20191413]
Abstract
Beats are among the basic units of perceptual experience. Produced by regular, intermittent stimulation, beats are most commonly associated with audition, but the experience of a beat can result from stimulation in other modalities as well. We studied the robustness of visual, vibrotactile, and bimodal signals as sources of beat perception. Subjects attempted to discriminate between pulse trains delivered at 3 Hz or at 6 Hz. To investigate signal robustness, we intentionally degraded signals on two-thirds of the trials using temporal-domain noise. On these trials, inter-pulse intervals (IPIs) were stochastic, perturbed independently from the nominal IPI by random samples from zero-mean Gaussian distributions with different variances. These perturbations produced directional changes in the IPIs, which either increased or decreased the likelihood of confusing the two pulse rates. In addition to affording an assay of signal robustness, this paradigm made it possible to gauge how subjects' judgments were influenced by successive IPIs. Logistic regression revealed a strong primacy effect: subjects' decisions were disproportionately influenced by a trial's initial IPIs. Response times and parameter estimates from drift-diffusion modeling showed that information accumulates more rapidly with bimodal stimulation than with either unimodal stimulus alone. Analysis of error rates within each condition suggested consistently optimal decision making, even with increased IPI variability. Finally, beat information delivered by vibrotactile signals proved just as robust as information conveyed by visual signals, confirming vibrotactile stimulation's potential as a communication channel.
Affiliation(s)
- Rachel F Sussman: Volen National Center for Complex Systems, Brandeis University, Waltham, MA, USA
- Robert Sekuler: Volen National Center for Complex Systems, Brandeis University, Waltham, MA, USA
14
Cacciamani L, Sheparovich L, Gibbons M, Crowley B, Carpenter KE, Wack C. Task-Irrelevant Sound Corrects Leftward Spatial Bias in Blindfolded Haptic Placement Task. Multisens Res 2020; 33:521-548. [PMID: 32083560] [DOI: 10.1163/22134808-20191387]
Abstract
We often rely on our sense of vision for understanding the spatial location of objects around us. If vision cannot be used, one must rely on other senses, such as hearing and touch, in order to build spatial representations. Previous work has found evidence of a leftward spatial bias in visual and tactile tasks. In this study, we sought evidence of this leftward bias in a non-visual haptic object location memory task and assessed the influence of a task-irrelevant sound. In Experiment 1, blindfolded right-handed sighted participants used their non-dominant hand to haptically locate an object on the table, then used their dominant hand to place the object back in its original location. During placement, participants either heard nothing (no-sound condition) or a task-irrelevant repeating tone to the left, right, or front of the room. The results showed that participants exhibited a leftward placement bias on no-sound trials. On sound trials, this leftward bias was corrected; placements were faster and more accurate (regardless of the direction of the sound). One explanation for the leftward bias could be that participants were overcompensating their reach with the right hand during placement. Experiment 2 tested this explanation by switching the hands used for exploration and placement, but found results similar to those of Experiment 1. Experiment 3 found evidence supporting the explanation that sound corrects the leftward bias by heightening attention. Together, these findings show that sound, even if task-irrelevant and semantically unrelated, can correct one's tendency to place objects too far to the left.
Affiliation(s)
- Laura Cacciamani: California Polytechnic State University, San Luis Obispo, CA, USA
- Molly Gibbons: California Polytechnic State University, San Luis Obispo, CA, USA
- Brooke Crowley: California Polytechnic State University, San Luis Obispo, CA, USA
- Carson Wack: California Polytechnic State University, San Luis Obispo, CA, USA
15
Zhang M, Kwon SE, Ben-Johny M, O'Connor DH, Issa JB. Spectral hallmark of auditory-tactile interactions in the mouse somatosensory cortex. Commun Biol 2020; 3:64. [PMID: 32047263] [PMCID: PMC7012892] [DOI: 10.1038/s42003-020-0788-5]
Abstract
To synthesize a coherent representation of the external world, the brain must integrate inputs across different types of stimuli. Yet the mechanistic basis of this computation at the level of neuronal populations remains obscure. Here, we investigate tactile-auditory integration using two-photon Ca2+ imaging in the mouse primary (S1) and secondary (S2) somatosensory cortices. Pairing sound with whisker stimulation modulates tactile responses in both S1 and S2, with the most prominent modulation being robust inhibition in S2. The degree of inhibition depends on tactile stimulation frequency, with lower frequency responses the most severely attenuated. Alongside these neurons, we identify sound-selective neurons in S2 whose responses are inhibited by high tactile frequencies. These results are consistent with a hypothesized local mutually-inhibitory S2 circuit that spectrally selects tactile versus auditory inputs. Our findings enrich mechanistic understanding of multisensory integration and suggest a key role for S2 in combining auditory and tactile information.
Affiliation(s)
- Manning Zhang: Department of Biomedical Engineering, The Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA; Department of Biomedical Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
- Sung Eun Kwon: Solomon H. Snyder Department of Neuroscience, The Johns Hopkins University School of Medicine, Kavli Neuroscience Discovery Institute, and Brain Science Institute, Baltimore, MD 21205, USA; Department of Molecular, Cellular and Developmental Biology, University of Michigan, Ann Arbor, MI 48109, USA
- Manu Ben-Johny: Department of Biomedical Engineering, The Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA; Department of Physiology and Cellular Biophysics, Columbia University, New York, NY 10032, USA
- Daniel H O'Connor: Solomon H. Snyder Department of Neuroscience, The Johns Hopkins University School of Medicine, Kavli Neuroscience Discovery Institute, and Brain Science Institute, Baltimore, MD 21205, USA
- John B Issa: Department of Biomedical Engineering, The Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA; Department of Neurobiology, Northwestern University, Evanston, IL 60201, USA
16
Electro-Haptic Enhancement of Spatial Hearing in Cochlear Implant Users. Sci Rep 2020; 10:1621. [PMID: 32005889] [PMCID: PMC6994470] [DOI: 10.1038/s41598-020-58503-8]
Abstract
Cochlear implants (CIs) have enabled hundreds of thousands of profoundly hearing-impaired people to perceive sounds by electrically stimulating the auditory nerve. However, CI users are often very poor at locating sounds, which leads to impaired sound segregation and threat detection. We provided missing spatial hearing cues through haptic stimulation to augment the electrical CI signal. We found that this "electro-haptic" stimulation dramatically improved sound localisation. Furthermore, participants were able to effectively integrate spatial information transmitted through these two senses, performing better with combined audio and haptic stimulation than with either alone. Our haptic signal was presented to the wrists and could readily be delivered by a low-cost wearable device. This approach could provide a non-invasive means of improving outcomes for the vast majority of CI users who have only one implant, without the expense and risk of a second implantation.
17
Pérez-Bellido A, Barnes KA, Crommett LE, Yau JM. Auditory Frequency Representations in Human Somatosensory Cortex. Cereb Cortex 2019; 28:3908-3921. [PMID: 29045579] [DOI: 10.1093/cercor/bhx255]
Abstract
Recent studies have challenged the traditional notion of modality-dedicated cortical systems by showing that audition and touch evoke responses in the same sensory brain regions. While much of this work has focused on somatosensory responses in auditory regions, fewer studies have investigated sound responses and representations in somatosensory regions. In this functional magnetic resonance imaging (fMRI) study, we measured BOLD signal changes in participants performing an auditory frequency discrimination task and characterized activation patterns related to stimulus frequency using both univariate and multivariate analysis approaches. Outside of bilateral temporal lobe regions, we observed robust and frequency-specific responses to auditory stimulation in classically defined somatosensory areas. Moreover, using representational similarity analysis to define the relationships between multi-voxel activation patterns for all sound pairs, we found clear similarity patterns for auditory responses in the parietal lobe that correlated significantly with perceptual similarity judgments. Our results demonstrate that auditory frequency representations can be distributed over brain regions traditionally considered to be dedicated to somatosensation. The broad distribution of auditory and tactile responses over parietal and temporal regions reveals a number of candidate brain areas that could support general temporal frequency processing and mediate the extensive and robust perceptual interactions between audition and touch.
Affiliation(s)
- Alexis Pérez-Bellido: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
- Kelly Anne Barnes: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
- Lexi E Crommett: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
- Jeffrey M Yau: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX, USA
18
Ohashi H, Ito T. Recalibration of auditory perception of speech due to orofacial somatosensory inputs during speech motor adaptation. J Neurophysiol 2019; 122:2076-2084. [PMID: 31509469] [DOI: 10.1152/jn.00028.2019]
Abstract
Speech motor control and learning rely on both somatosensory and auditory inputs. Somatosensory inputs associated with speech production can also affect the process of auditory perception of speech, and this somatosensory-auditory interaction may play a fundamental role in the auditory perception of speech. In this report, we show that the somatosensory system contributes to perceptual recalibration, separate from its role in motor function. Subjects participated in speech motor adaptation to altered auditory feedback. Auditory perception of speech was assessed in phonemic identification tests before and after speech adaptation. To investigate the role of the somatosensory system in motor adaptation and the subsequent perceptual change, we applied orofacial skin stretch in either a backward or forward direction during the auditory feedback alteration as a somatosensory modulation. We found that the somatosensory modulation did not affect the amount of adaptation at the end of training, although it changed the rate of adaptation. However, perception following speech adaptation was altered depending on the direction of the somatosensory modulation. Somatosensory inflow rather than motor outflow thus drives changes to auditory perception of speech following speech adaptation, suggesting that somatosensory inputs play an important role in the tuning of the perceptual system. NEW & NOTEWORTHY This article reports that the somatosensory system acts not equally with the motor system but predominantly in the calibration of auditory perception of speech by speech production.
Affiliation(s)
- Hiroki Ohashi: Department of Psychology, McGill University, Montreal, Quebec, Canada; Haskins Laboratories, New Haven, Connecticut
- Takayuki Ito: Haskins Laboratories, New Haven, Connecticut; Centre National de la Recherche Scientifique, GIPSA-Lab, Grenoble Institute of Technology, University of Grenoble-Alpes, Saint Martin d'Heres, France
19
Li B, Chen L, Fang F. Somatotopic representation of tactile duration: evidence from tactile duration aftereffect. Behav Brain Res 2019; 371:111954. [DOI: 10.1016/j.bbr.2019.111954]
20
Rahman MS, Yau JM. Somatosensory interactions reveal feature-dependent computations. J Neurophysiol 2019; 122:5-21. [DOI: 10.1152/jn.00168.2019]
Abstract
Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it scans across a surface. Although much is known about the processing of vibration amplitude and frequency information when cutaneous stimulation is experienced at a single location on the body, how these stimulus features are processed when touch occurs at multiple locations is poorly understood. We evaluated participants' ability to discriminate tactile cues (100–300 Hz) on one hand while they ignored distractor cues experienced on their other hand. We manipulated the relative positions of the hands to characterize how limb position influenced cutaneous touch interactions. In separate experiments, participants judged either the frequency or intensity of mechanical vibrations. We found that vibrations experienced on one hand always systematically modulated the perception of vibrations on the other hand. Notably, bimanual interaction patterns and their sensitivity to hand locations differed according to stimulus feature. Somatosensory interactions in intensity perception were only marked by attenuation that was invariant to hand position manipulations. In contrast, interactions in frequency perception consisted of both bias and sensitivity changes that were more pronounced when the hands were held in close proximity. We implemented models to infer the neural computations that mediate somatosensory interactions in the intensity and frequency dimensions. Our findings reveal obligatory and feature-dependent somatosensory interactions that may be supported by both feature-specific and feature-general operations. NEW & NOTEWORTHY Little is known about the neural computations mediating feature-specific sensory interactions between the hands. We show that vibrations experienced on one hand systematically modulate the perception of vibrations felt on the other hand. Critically, interaction patterns and their dependence on the relative positions of the hands differed depending on whether participants judged vibration intensity or frequency. These results, which we recapitulate with models, imply that somatosensory interactions are mediated by feature-dependent neural computations.
Affiliation(s)
- Jeffrey M. Yau: Department of Neuroscience, Baylor College of Medicine, Houston, Texas
21
Crommett LE, Madala D, Yau JM. Multisensory perceptual interactions between higher-order temporal frequency signals. J Exp Psychol Gen 2019; 148:1124-1137. [PMID: 30335446] [PMCID: PMC6472995] [DOI: 10.1037/xge0000513]
Abstract
Naturally occurring signals in audition and touch can be complex and marked by temporal variations in frequency and amplitude. Auditory frequency sweep processing has been studied extensively; however, much less is known about sweep processing in touch because studies have primarily focused on the perception of simple sinusoidal vibrations. Given the extensive interactions between audition and touch in the frequency processing of pure tone signals, we reasoned that these senses might also interact in the processing of higher-order frequency representations like sweeps. In a series of psychophysical experiments, we characterized the influence of auditory distractors on the ability of participants to discriminate tactile frequency sweeps. Auditory frequency sweeps systematically biased the tactile perception of sweep direction. Importantly, auditory cues exerted little influence on tactile sweep direction perception when the sounds and vibrations occupied different absolute frequency ranges or when the sounds consisted of intensity sweeps. Thus, audition and touch interact in frequency sweep perception in a frequency- and feature-specific manner. Our results demonstrate that audio-tactile interactions are not constrained to the processing of simple sinusoids. Because higher-order frequency representations may be synthesized from simpler representations, our findings imply that multisensory interactions in the temporal frequency domain span multiple hierarchical levels in sensory processing.
Affiliation(s)
- Lexi E. Crommett: Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA
- Jeffrey M. Yau: Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA
22
Convento S, Wegner-Clemens KA, Yau JM. Reciprocal Interactions Between Audition and Touch in Flutter Frequency Perception. Multisens Res 2019; 32:67-85. [PMID: 31059492] [DOI: 10.1163/22134808-20181334]
Abstract
In both audition and touch, sensory cues comprising repeating events are perceived either as a continuous signal or as a stream of temporally discrete events (flutter), depending on the events' repetition rate. At high repetition rates (>100 Hz), auditory and tactile cues interact reciprocally in pitch processing. The frequency of a cue experienced in one modality systematically biases the perceived frequency of a cue experienced in the other modality. Here, we tested whether audition and touch also interact in the processing of low-frequency stimulation. We also tested whether multisensory interactions occurred if the stimulation in one modality comprised click trains and the stimulation in the other modality comprised amplitude-modulated signals. We found that auditory cues bias touch and tactile cues bias audition on a flutter discrimination task. Even though participants were instructed to attend to a single sensory modality and ignore the other cue, the flutter rate in the attended modality is perceived to be similar to that of the distractor modality. Moreover, we observed similar interaction patterns regardless of stimulus type and whether the same stimulus types were experienced by both senses. Combined with earlier studies, our results suggest that the nervous system extracts and combines temporal rate information from multisensory environmental signals, regardless of stimulus type, in both the low and high temporal frequency domains. This function likely reflects the importance of temporal frequency as a fundamental feature of our multisensory experience.
Affiliation(s)
- Silvia Convento: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Kira A Wegner-Clemens: Department of Neurosurgery, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Jeffrey M Yau: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
23
Convento S, Rahman MS, Yau JM. Selective Attention Gates the Interactive Crossmodal Coupling between Perceptual Systems. Curr Biol 2018; 28:746-752.e5. [PMID: 29456139] [DOI: 10.1016/j.cub.2018.01.021]
Abstract
Sensory cortical systems often activate in parallel, even when stimulation is experienced through a single sensory modality [1-3]. Co-activations may reflect the interactive coupling between information-linked cortical systems or merely parallel but independent sensory processing. We report causal evidence consistent with the hypothesis that human somatosensory cortex (S1), which co-activates with auditory cortex during the processing of vibrations and textures [4-9], interactively couples to cortical systems that support auditory perception. In a series of behavioral experiments, we used transcranial magnetic stimulation (TMS) to probe interactions between the somatosensory and auditory perceptual systems as we manipulated attention state. Acute TMS over S1 impairs auditory frequency perception when subjects simultaneously attend to auditory and tactile frequency, but not when attention is directed to audition alone. Auditory frequency perception is unaffected by TMS over visual cortex, thus confirming the privileged interactions between the somatosensory and auditory systems in temporal frequency processing [10-13]. Our results provide a key demonstration that selective attention can modulate the functional properties of cortical systems thought to support specific sensory modalities. The gating of crossmodal coupling by selective attention may critically support multisensory interactions and feature-specific perception.
Affiliation(s)
- Silvia Convento: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Md Shoaibur Rahman: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
- Jeffrey M Yau: Department of Neuroscience, Baylor College of Medicine, One Baylor Plaza, Houston, TX 77030, USA
24
Li L, Chan A, Iqbal SM, Goldreich D. An Adaptation-Induced Repulsion Illusion in Tactile Spatial Perception. Front Hum Neurosci 2017; 11:331. [PMID: 28701936] [PMCID: PMC5487416] [DOI: 10.3389/fnhum.2017.00331]
Abstract
Following focal sensory adaptation, the perceived separation between visual stimuli that straddle the adapted region is often exaggerated. For instance, in the tilt aftereffect illusion, adaptation to tilted lines causes subsequently viewed lines with nearby orientations to be perceptually repelled from the adapted orientation. Repulsion illusions in the nonvisual senses have been less studied. Here, we investigated whether adaptation induces a repulsion illusion in tactile spatial perception. In a two-interval forced-choice task, participants compared the perceived separation between two point-stimuli applied on the forearms successively. Separation distance was constant on one arm (the reference) and varied on the other arm (the comparison). In Experiment 1, we took three consecutive baseline measurements, verifying that in the absence of manipulation, participants’ distance perception was unbiased across arms and stable across experimental blocks. In Experiment 2, we vibrated a region of skin on the reference arm, verifying that this focally reduced tactile sensitivity, as indicated by elevated monofilament detection thresholds. In Experiment 3, we applied vibration between the two reference points in our distance perception protocol and discovered that this caused an illusory increase in the separation between the points. We conclude that focal adaptation induces a repulsion aftereffect illusion in tactile spatial perception. The illusion provides clues as to how the tactile system represents spatial information. The analogous repulsion aftereffects caused by adaptation in different stimulus domains and sensory systems may point to fundamentally similar strategies for dynamic sensory coding.
Affiliation(s)
- Lux Li: Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Arielle Chan: Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Shah M Iqbal: Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Daniel Goldreich: Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Integrative Neuroscience Discovery and Study, McMaster University, Hamilton, ON, Canada