1
Franken MK, Liu BC, Ostry DJ. Towards a somatosensory theory of speech perception. J Neurophysiol 2022; 128:1683-1695. PMID: 36416451; PMCID: PMC9762980; DOI: 10.1152/jn.00381.2022.
Abstract
Speech perception is known to be a multimodal process, relying not only on auditory input but also on the visual system and possibly on the motor system as well. To date there has been little work on the potential involvement of the somatosensory system in speech perception. In the present review, we identify the somatosensory system as another contributor to speech perception. First, we argue that evidence in favor of a motor contribution to speech perception can just as easily be interpreted as showing somatosensory involvement. Second, physiological and neuroanatomical evidence for auditory-somatosensory interactions across the auditory hierarchy indicates the availability of a neural infrastructure that supports somatosensory involvement in auditory processing in general. Third, there is accumulating evidence for somatosensory involvement in the context of speech specifically. In particular, tactile stimulation modifies speech perception, and auditory speech input elicits activity in somatosensory cortical areas. Moreover, speech sounds can be decoded from activity in somatosensory cortex; lesions to this region affect perception, and vowels can be identified based on somatic input alone. We suggest that the somatosensory involvement in speech perception derives from the somatosensory-auditory pairing that occurs during speech production and learning. By bringing together findings from a set of studies that have not been previously linked, the present article identifies the somatosensory system as a presently unrecognized contributor to speech perception.
Affiliation(s)
- David J Ostry
- McGill University, Montreal, Quebec, Canada
- Haskins Laboratories, New Haven, Connecticut
2
Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022; 16:1010211. PMID: 36330342; PMCID: PMC9622781; DOI: 10.3389/fnins.2022.1010211.
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. 
Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
3
Shiroshita Y, Kirimoto H, Watanabe T, Yunoki K, Sobue I. Event-related potentials evoked by skin puncture reflect activation of Aβ fibers: comparison with intraepidermal and transcutaneous electrical stimulations. PeerJ 2021; 9:e12250. PMID: 34707936; PMCID: PMC8504465; DOI: 10.7717/peerj.12250.
Abstract
Background: Recently, event-related potentials (ERPs) evoked by skin puncture, commonly used for blood sampling, have received attention as a pain assessment tool in neonates. However, their latency appears to be far shorter than that of ERPs evoked by intraepidermal electrical stimulation (IES), which selectively activates nociceptive Aδ and C fibers. To clarify this important issue, we examined whether ERPs evoked by skin puncture appropriately reflect central nociceptive processing, as is the case with IES.
Methods: In Experiment 1, we recorded evoked potentials to the click sound produced by a lance device (click-only), lance stimulation with the click sound (click+lance), or lance stimulation with white noise (WN+lance) in eight healthy adults to investigate the effect of the click sound on the ERP evoked by skin puncture. In Experiment 2, we tested 18 healthy adults and recorded evoked potentials to shallow lance stimulation (SL) with a blade that did not reach the dermis (0.1 mm insertion depth); normal lance stimulation (CL) (1 mm depth); transcutaneous electrical stimulation (ES), which mainly activates Aβ fibers; and IES, which selectively activates Aδ fibers when low stimulation current intensities are applied. White noise was continuously presented during the experiments. The stimulations were applied to the hand dorsum. In the SL, the lance device did not touch the skin and the blade was inserted to a depth of 0.1 mm into the epidermis, where the free nerve endings of Aδ fibers are located, which minimized the tactile sensation caused by the device touching the skin and the activation of Aβ fibers by the blade reaching the dermis. In the CL, as in clinical use, the lance device touched the skin and the blade reached a depth of 1 mm from the skin surface, i.e., the depth of the dermis at which the Aβ fibers are located.
Results: The ERP N2 latencies for click-only (122 ± 2.9 ms) and click+lance (121 ± 6.5 ms) were significantly shorter than that for WN+lance (154 ± 7.1 ms). The ERP P2 latency for click-only (191 ± 11.3 ms) was significantly shorter than those for click+lance (249 ± 18.6 ms) and WN+lance (253 ± 11.2 ms). This suggests that the click sound shortens the N2 latency of the ERP evoked by skin puncture. The ERP N2 latencies for SL, CL, ES, and IES were 146 ± 8.3, 149 ± 9.9, 148 ± 13.1, and 197 ± 21.2 ms, respectively. The ERP P2 latencies were 250 ± 18.2, 251 ± 14.1, 237 ± 26.3, and 294 ± 30.0 ms, respectively. The ERP latency for SL was significantly shorter than that for IES and was similar to that for ES. This suggests that the penetration force generated by the blade of the lance device activates the Aβ fibers, consequently shortening the ERP latency.
Conclusions: Lance ERP may reflect the activation of Aβ fibers rather than Aδ fibers. A pain index that correctly and reliably reflects nociceptive processing must be developed to improve pain assessment and management in neonates.
Affiliation(s)
- Yui Shiroshita
- Department of Nursing Science, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Hikari Kirimoto
- Department of Sensorimotor Neuroscience, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Tatsunori Watanabe
- Department of Sensorimotor Neuroscience, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Keisuke Yunoki
- Department of Sensorimotor Neuroscience, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Ikuko Sobue
- Department of Nursing Science, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
4
Zumer JM, White TP, Noppeney U. The neural mechanisms of audiotactile binding depend on asynchrony. Eur J Neurosci 2020; 52:4709-4731. PMID: 32725895; DOI: 10.1111/ejn.14928.
Abstract
Asynchrony is a critical cue informing the brain whether sensory signals are caused by a common source and should be integrated or segregated. This psychophysics-electroencephalography (EEG) study investigated the influence of asynchrony on how the brain binds audiotactile (AT) signals to enable faster responses in a redundant target paradigm. Human participants actively responded (psychophysics) or passively attended (EEG) to noise bursts, "taps-to-the-face" and their AT combinations at seven AT asynchronies: 0, ±20, ±70 and ±500 ms. Behaviourally, observers were faster at detecting AT than unisensory stimuli within a temporal integration window: the redundant target effect was maximal for synchronous stimuli and declined within a ≤70 ms AT asynchrony. EEG revealed a cascade of AT interactions that relied on different neural mechanisms depending on AT asynchrony. At small (≤20 ms) asynchronies, AT interactions arose for evoked response potentials (ERPs) at 110 ms and ~400 ms post-stimulus. Selectively at ±70 ms asynchronies, AT interactions were observed for the P200 ERP, theta-band inter-trial coherence (ITC) and power at ~200 ms post-stimulus. In conclusion, AT binding was mediated by distinct neural mechanisms depending on the asynchrony of the AT signals. Early AT interactions in ERPs and theta-band ITC and power were critical for the behavioural response facilitation within a ≤±70 ms temporal integration window.
Affiliation(s)
- Johanna M Zumer
- School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Centre for Human Brain Health, University of Birmingham, Birmingham, UK; School of Life and Health Sciences, Aston University, Birmingham, UK
- Thomas P White
- School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney
- School of Psychology, University of Birmingham, Birmingham, UK; Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK; Centre for Human Brain Health, University of Birmingham, Birmingham, UK; Donders Institute for Brain, Cognition, and Behaviour, Nijmegen, The Netherlands
5
Coffman BA, Candelaria-Cook FT, Stephen JM. Unisensory and Multisensory Responses in Fetal Alcohol Spectrum Disorders (FASD): Effects of Spatial Congruence. Neuroscience 2020; 430:34-46. PMID: 31982473; DOI: 10.1016/j.neuroscience.2020.01.013.
Abstract
While it is generally accepted that structural and functional brain deficits underlie the behavioral deficits associated with Fetal Alcohol Spectrum Disorders (FASD), the degree to which these problems are expressed in sensory pathology is unknown. Electrophysiological measures indicate that neural processing is delayed in visual and auditory domains. Furthermore, multiple reports of white matter deficits due to prenatal alcohol exposure indicate altered cortical connectivity in individuals with FASD. Multisensory integration requires close coordination between disparate cortical areas, leading us to hypothesize that individuals with FASD will have impaired multisensory integration relative to healthy control (HC) participants. Participants' neurophysiological responses were recorded using magnetoencephalography (MEG) during passive unisensory or simultaneous, spatially congruent or incongruent, multisensory auditory and somatosensory stimulation. Source timecourses from evoked responses were estimated using multi-dipole spatiotemporal modeling. Auditory M100 response latency was faster for the multisensory relative to the unisensory condition, but no group differences were observed. M200 auditory latency to congruent stimuli was earlier and congruent amplitude was larger in participants with FASD relative to controls. Somatosensory M100 response latency was faster in right hemisphere for multisensory relative to unisensory stimulation in both groups. FASD participants' somatosensory M200 responses were delayed by 13 ms, but only for the unisensory presentation of the somatosensory stimulus. M200 results indicate that unisensory and multisensory processing is altered in FASD; it remains to be seen if the multisensory response represents a normalization of the unisensory deficits.
Affiliation(s)
- Brian A Coffman
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA; Department of Psychology, University of New Mexico, MSC03 2220, 1 University of New Mexico, Albuquerque, NM 87131, USA; Department of Psychiatry, University of Pittsburgh School of Medicine, 3501 Forbes Avenue, Pittsburgh, PA 15213, USA
- Felicha T Candelaria-Cook
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA
- Julia M Stephen
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, 1101 Yale NE, Albuquerque, NM 87106, USA
6
Sugiyama S, Kinukawa T, Takeuchi N, Nishihara M, Shioiri T, Inui K. Tactile Cross-Modal Acceleration Effects on Auditory Steady-State Response. Front Integr Neurosci 2019; 13:72. PMID: 31920574; PMCID: PMC6927992; DOI: 10.3389/fnint.2019.00072.
Abstract
In the sensory cortex, cross-modal interaction occurs during the early cortical stages of processing; however, its effect on the speed of neuronal activity remains unclear. In this study, we used magnetoencephalography (MEG) to investigate whether tactile stimulation influences auditory steady-state responses (ASSRs). To this end, a 0.5-ms electrical pulse was randomly presented to the dorsum of the left or right hand of 12 healthy volunteers at 700 ms while a train of 25-ms pure tones was applied to the left or right side at 75 dB for 1,200 ms. Peak latencies of 40-Hz ASSR were measured. Our results indicated that tactile stimulation significantly shortened subsequent ASSR latency. This cross-modal effect was observed from approximately 50 ms to 125 ms after the onset of tactile stimulation. The somatosensory information that appeared to converge on the auditory system may have arisen during the early processing stages, with the reduced ASSR latency indicating that a new sensory event from the cross-modal inputs served to increase the speed of ongoing sensory processing. Collectively, our findings indicate that ASSR latency changes are a sensitive index of accelerated processing.
Affiliation(s)
- Shunsuke Sugiyama
- Department of Psychiatry and Psychotherapy, Gifu University Graduate School of Medicine, Gifu, Japan
- Tomoaki Kinukawa
- Department of Anesthesiology, Nagoya University Graduate School of Medicine, Nagoya, Japan
- Makoto Nishihara
- Multidisciplinary Pain Center, Aichi Medical University, Nagakute, Japan
- Toshiki Shioiri
- Department of Psychiatry and Psychotherapy, Gifu University Graduate School of Medicine, Gifu, Japan
- Koji Inui
- Department of Functioning and Disability, Institute for Developmental Research, Kasugai, Japan
7
Bernasconi F, Noel JP, Park HD, Faivre N, Seeck M, Spinelli L, Schaller K, Blanke O, Serino A. Audio-Tactile and Peripersonal Space Processing Around the Trunk in Human Parietal and Temporal Cortex: An Intracranial EEG Study. Cereb Cortex 2019; 28:3385-3397. PMID: 30010843; PMCID: PMC6095214; DOI: 10.1093/cercor/bhy156.
Abstract
Interactions with the environment happen within one’s peripersonal space (PPS)—the space surrounding the body. Studies in monkeys and humans have highlighted a multisensory distributed cortical network representing the PPS. However, knowledge about the temporal dynamics of PPS processing around the trunk is lacking. Here, we recorded intracranial electroencephalography (iEEG) in humans while administering tactile stimulation (T), approaching auditory stimuli (A), and the 2 combined (AT). To map PPS, tactile stimulation was delivered when the sound was far, intermediate, or close to the body. Nineteen percent of the electrodes showed AT multisensory integration. Among those, 30% showed a PPS effect, a modulation of the response as a function of the distance between the sound and body. AT multisensory integration and PPS effects had similar spatiotemporal characteristics, with an early response (~50 ms) in the insular cortex, and later responses (~200 ms) in precentral and postcentral gyri. Superior temporal cortex showed a different response pattern with AT multisensory integration at ~100 ms without a PPS effect. These results represent the first iEEG delineation of PPS processing in humans and show that PPS and multisensory integration happen at similar neural sites and time periods, suggesting that PPS representation is based on a spatial modulation of multisensory integration.
Affiliation(s)
- Fosco Bernasconi
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology (Ecole Polytechnique Fédérale de Lausanne), Geneva, Switzerland; Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Jean-Paul Noel
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology (Ecole Polytechnique Fédérale de Lausanne), Geneva, Switzerland; Neuroscience Graduate Program, Vanderbilt University, Nashville, USA; Vanderbilt Brain Institute, Vanderbilt University, Nashville, USA
- Hyeong Dong Park
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology (Ecole Polytechnique Fédérale de Lausanne), Geneva, Switzerland; Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland
- Nathan Faivre
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology (Ecole Polytechnique Fédérale de Lausanne), Geneva, Switzerland; Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Centre d'Economie de la Sorbonne, CNRS UMR 8174, Paris, France
- Margitta Seeck
- Presurgical Epilepsy Evaluation Unit, Neurology Department, University Hospital of Geneva, Geneva, Switzerland
- Laurent Spinelli
- Presurgical Epilepsy Evaluation Unit, Neurology Department, University Hospital of Geneva, Geneva, Switzerland
- Karl Schaller
- Department of Neurosurgery, Geneva University Hospital (HUG), 4 Rue Gabrielle-Perret-Gentil, Geneva, Switzerland
- Olaf Blanke
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology (Ecole Polytechnique Fédérale de Lausanne), Geneva, Switzerland; Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland; Centre d'Economie de la Sorbonne, CNRS UMR 8174, Paris, France
- Andrea Serino
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology (Ecole Polytechnique Fédérale de Lausanne), Geneva, Switzerland; Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Geneva, Switzerland; MySpace Lab, Department of Clinical Neuroscience, Centre Hospitalier Universitaire Vaudois (CHUV), University of Lausanne, Lausanne, Switzerland
8
Riecke L, Snipes S, van Bree S, Kaas A, Hausfeld L. Audio-tactile enhancement of cortical speech-envelope tracking. Neuroimage 2019; 202:116134. DOI: 10.1016/j.neuroimage.2019.116134.
9
Xu W, Kolozsvári OB, Oostenveld R, Leppänen PHT, Hämäläinen JA. Audiovisual Processing of Chinese Characters Elicits Suppression and Congruency Effects in MEG. Front Hum Neurosci 2019; 13:18. PMID: 30787872; PMCID: PMC6372538; DOI: 10.3389/fnhum.2019.00018.
Abstract
Learning to associate written letters/characters with speech sounds is crucial for reading acquisition. Most previous studies have focused on audiovisual integration in alphabetic languages. Less is known about logographic languages such as Chinese characters, which map onto mostly syllable-based morphemes in the spoken language. Here we investigated how long-term exposure to a native language affects the underlying neural mechanisms of audiovisual integration in a logographic language using magnetoencephalography (MEG). MEG sensor and source data from 12 adult native Chinese speakers and a control group of 13 adult Finnish speakers were analyzed for audiovisual suppression (bimodal responses vs. sum of unimodal responses) and congruency (bimodal incongruent responses vs. bimodal congruent responses) effects. The suppressive integration effect was found in the left angular and supramarginal gyri (205-365 ms), left inferior frontal and left temporal cortices (575-800 ms) in the Chinese group. The Finnish group showed a distinct suppression effect only in the right parietal and occipital cortices at a relatively early time window (285-460 ms). The congruency effect was only observed in the Chinese group in left inferior frontal and superior temporal cortex in a late time window (about 500-800 ms), probably related to modulatory feedback from multi-sensory regions and semantic processing. The audiovisual integration in a logographic language showed a clear resemblance to that in alphabetic languages in the left superior temporal cortex, but with activation specific to the logographic stimuli observed in the left inferior frontal cortex. The current MEG study indicated that learning of logographic languages has a large impact on the audiovisual integration of written characters, with some distinct features compared to previous results on alphabetic languages.
Affiliation(s)
- Weiyong Xu
- Department of Psychology, University of Jyväskylä, Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, Department of Psychology, University of Jyväskylä, Jyväskylä, Finland
- Orsolya Beatrix Kolozsvári
- Department of Psychology, University of Jyväskylä, Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, Department of Psychology, University of Jyväskylä, Jyväskylä, Finland
- Robert Oostenveld
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands; NatMEG, Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Paavo Herman Tapio Leppänen
- Department of Psychology, University of Jyväskylä, Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, Department of Psychology, University of Jyväskylä, Jyväskylä, Finland
- Jarmo Arvid Hämäläinen
- Department of Psychology, University of Jyväskylä, Jyväskylä, Finland; Jyväskylä Centre for Interdisciplinary Brain Research, Department of Psychology, University of Jyväskylä, Jyväskylä, Finland
10
Effect of acceleration of auditory inputs on the primary somatosensory cortex in humans. Sci Rep 2018; 8:12883. PMID: 30150686; PMCID: PMC6110726; DOI: 10.1038/s41598-018-31319-3.
Abstract
Cross-modal interaction occurs during the early stages of processing in the sensory cortex; however, its effect on neuronal activity speed remains unclear. We used magnetoencephalography to investigate whether auditory stimulation influences the initial cortical activity in the primary somatosensory cortex. A 25-ms pure tone was randomly presented to the left or right side of healthy volunteers at 1000 ms while electrical pulses were applied to the left or right median nerve at 20 Hz for 1500 ms; a pulse train was used because no cross-modal effect was elicited by a single pulse. The latency of N20m originating from Brodmann's area 3b was measured for each pulse. The auditory stimulation significantly shortened the N20m latency at 1050 and 1100 ms. This reduction in N20m latency was identical for the ipsilateral and contralateral sounds at both latency points. Therefore, somatosensory-auditory interaction, such as input to area 3b from the thalamus, occurred during the early stages of synaptic transmission. Auditory information that converged on the somatosensory system was considered to have arisen from the early stages of the feedforward pathway. Acceleration of information processing through the cross-modal interaction seemed to be partly due to faster processing in the sensory cortex.
11
Bolaños AD, Coffman BA, Candelaria-Cook FT, Kodituwakku P, Stephen JM. Altered Neural Oscillations During Multisensory Integration in Adolescents with Fetal Alcohol Spectrum Disorder. Alcohol Clin Exp Res 2017; 41:2173-2184. PMID: 28944474; DOI: 10.1111/acer.13510.
Abstract
Background: Children with fetal alcohol spectrum disorder (FASD), who were exposed to alcohol in utero, display a broad range of sensory, cognitive, and behavioral deficits, which are broadly theorized to be rooted in altered brain function and structure. Based on the role of neural oscillations in multisensory integration from past studies, we hypothesized that adolescents with FASD would show a decrease in oscillatory power during event-related gamma oscillatory activity (30 to 100 Hz) compared with typically developing healthy controls (HC), and that this decrease in oscillatory power would predict behavioral performance.
Methods: We measured sensory neurophysiology using magnetoencephalography (MEG) during passive unisensory or simultaneous, spatially congruent or incongruent, multisensory auditory and somatosensory stimulation in 19 adolescents (12 to 21 years) with FASD and 23 age- and gender-matched HC. We employed a cross-hemisphere multisensory paradigm to assess interhemispheric connectivity deficits in children with FASD.
Results: Time-frequency analysis of MEG data revealed a significant decrease in gamma oscillatory power for both unisensory and multisensory conditions in the FASD group relative to HC, based on permutation testing of significant group differences. Greater beta oscillatory power (15 to 30 Hz) was also noted in the FASD group compared to HC in both unisensory and multisensory conditions. Regression analysis revealed greater predictive power of multisensory oscillations from unisensory oscillations in the FASD group compared to the HC group. Furthermore, multisensory oscillatory power, for both groups, predicted performance on the Intra-Extradimensional Set Shift Task and the Cambridge Gambling Task.
Conclusions: Altered oscillatory power in the FASD group may reflect a restricted ability to process somatosensory and multisensory stimuli during day-to-day interactions. These alterations in neural oscillations may be associated with the neurobehavioral deficits experienced by adolescents with FASD and may carry over to adulthood.
Affiliation(s)
- Alfredo D Bolaños
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico
- Brian A Coffman
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico; Department of Psychiatry, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania
- Felicha T Candelaria-Cook
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico; Biomedical Informatics Unit, Health Sciences Library and Informatics Center, University of New Mexico Health Sciences Center, Albuquerque, New Mexico
- Piyadasa Kodituwakku
- Department of Pediatrics, University of New Mexico Health Sciences Center, Albuquerque, New Mexico
- Julia M Stephen
- The Mind Research Network and Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico
12
Landry SP, Champoux F. Musicians react faster and are better multisensory integrators. Brain Cogn 2016; 111:156-162. PMID: 27978450; DOI: 10.1016/j.bandc.2016.12.001.
Abstract
The results from numerous investigations suggest that musical training might enhance how the senses interact. Despite repeated confirmation of anatomical and structural changes in visual, tactile, and auditory regions, significant changes have only been reported in the audiovisual domain and for the detection of audio-tactile incongruencies. In the present study, we aimed to test whether long-term musical training might also enhance other multisensory processes at a behavioural level. An audio-tactile reaction time task was administered to a group of musicians and non-musicians. We found significantly faster reaction times in musicians for auditory, tactile, and audio-tactile stimulations. Statistical analyses of the combined uni- and multisensory reaction times revealed that musicians possess a statistical advantage when responding to multisensory stimuli compared to non-musicians. These results suggest for the first time that long-term musical training reduces simple non-musical auditory, tactile, and multisensory reaction times. Taken together with previous results from other sensory modalities, these results strongly point towards musicians being better at integrating the inputs from various senses.
Affiliation(s)
- Simon P Landry: Université de Montréal, Faculté de Médecine, École d'orthophonie et d'audiologie, C.P. 6128, Succursale Centre-Ville, Montréal, Québec H3C 3J7, Canada
- François Champoux: Université de Montréal, Faculté de Médecine, École d'orthophonie et d'audiologie, C.P. 6128, Succursale Centre-Ville, Montréal, Québec H3C 3J7, Canada
13. Wu C, Stefanescu RA, Martel DT, Shore SE. Listening to another sense: somatosensory integration in the auditory system. Cell Tissue Res 2015; 361:233-250. [PMID: 25526698] [PMCID: PMC4475675] [DOI: 10.1007/s00441-014-2074-7]
Abstract
Conventionally, sensory systems are viewed as separate entities, each with its own physiological process serving a different purpose. However, many functions require integrative inputs from multiple sensory systems, and sensory intersection and convergence occur throughout the central nervous system. The neural processes for auditory perception undergo significant modulation by the two other major sensory systems, vision and somatosensation. This synthesis occurs at every level of the ascending auditory pathway: the cochlear nucleus, inferior colliculus, medial geniculate body, and auditory cortex. In this review, we explore the process of multisensory integration from (1) anatomical (inputs and connections), (2) physiological (cellular responses), (3) functional, and (4) pathological aspects. We focus on the convergence between auditory and somatosensory inputs in each ascending auditory station. This review highlights the intricacy of sensory processing and offers a multisensory perspective on the understanding of sensory disorders.
Affiliation(s)
- Calvin Wu: Department of Otolaryngology, Kresge Hearing Research Institute, University of Michigan, Ann Arbor, MI 48109, USA
14. Leonardelli E, Braun C, Weisz N, Lithari C, Occelli V, Zampini M. Prestimulus oscillatory alpha power and connectivity patterns predispose perceptual integration of an audio and a tactile stimulus. Hum Brain Mapp 2015; 36:3486-3498. [PMID: 26109518] [DOI: 10.1002/hbm.22857]
Abstract
To efficiently perceive and respond to the external environment, our brain has to perceptually integrate or segregate stimuli of different modalities. The temporal relationship between the different sensory modalities is therefore essential for the formation of different multisensory percepts. In this magnetoencephalography study, we created a paradigm where an audio and a tactile stimulus were presented with an ambiguous temporal relationship so that perception of physically identical audiotactile stimuli could vary between integrated (emanating from the same source) and segregated. This bistable paradigm allowed us to compare identical bimodal stimuli that elicited different percepts, making it possible to directly infer multisensory interaction effects. Local differences in alpha power over bilateral inferior parietal lobules (IPLs) and superior parietal lobules (SPLs) preceded integrated versus segregated percepts of the two stimuli (audio and tactile). Furthermore, differences in long-range cortical functional connectivity seeded in rIPL (region of maximum difference) revealed differential patterns that predisposed integrated or segregated percepts, encompassing secondary areas of all different modalities and prefrontal cortex. We showed that prestimulus brain states predispose the perception of the audiotactile stimulus in both a global and a local manner. Our findings are in line with a consistent body of recent findings on the importance of prestimulus brain states for perception of an upcoming stimulus. This new perspective on how stimuli originating from different modalities are integrated suggests a non-modality-specific network predisposing multisensory perception.
Affiliation(s)
- Christoph Braun: Center for Mind/Brain Sciences, University of Trento, Trento, Italy; MEG Center, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience (CIN), University of Tübingen, Tübingen, Germany
- Nathan Weisz: Center for Mind/Brain Sciences, University of Trento, Trento, Italy
- Chrysa Lithari: Center for Mind/Brain Sciences, University of Trento, Trento, Italy
15. Ito T, Gracco VL, Ostry DJ. Temporal factors affecting somatosensory-auditory interactions in speech processing. Front Psychol 2014; 5:1198. [PMID: 25452733] [PMCID: PMC4233986] [DOI: 10.3389/fpsyg.2014.01198]
Abstract
Speech perception is known to rely on both auditory and visual information. However, sound-specific somatosensory input has also been shown to influence speech perceptual processing (Ito et al., 2009). In the present study, we further examined the relationship between somatosensory information and speech perceptual processing by testing the hypothesis that the temporal relationship between orofacial movement and sound processing contributes to somatosensory–auditory interaction in speech perception. We examined the changes in event-related potentials (ERPs) in response to multisensory synchronous (simultaneous) and asynchronous (90 ms lag and lead) somatosensory and auditory stimulation compared to individual unisensory auditory and somatosensory stimulation alone. We used a robotic device to apply facial skin somatosensory deformations that were similar in timing and duration to those experienced in speech production. Following synchronous multisensory stimulation, the amplitude of the ERP was reliably different from the two unisensory potentials. More importantly, the magnitude of the ERP difference varied as a function of the relative timing of the somatosensory–auditory stimulation. Event-related activity change due to stimulus timing was seen between 160 and 220 ms following somatosensory onset, mostly around the parietal area. The results demonstrate a dynamic modulation of somatosensory–auditory convergence and suggest that the contribution of somatosensory information to speech processing depends on the specific temporal ordering of sensory inputs in speech production.
Affiliation(s)
- Vincent L Gracco: Haskins Laboratories, New Haven, CT, USA; McGill University, Montréal, QC, Canada
- David J Ostry: Haskins Laboratories, New Haven, CT, USA; McGill University, Montréal, QC, Canada
16. Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions. Neuropsychologia 2014; 57:71-77. [DOI: 10.1016/j.neuropsychologia.2014.02.004]
17. Reynolds GD, Bahrick LE, Lickliter R, Guy MW. Neural correlates of intersensory processing in 5-month-old infants. Dev Psychobiol 2014; 56:355-372. [PMID: 23423948] [PMCID: PMC3954462] [DOI: 10.1002/dev.21104]
Abstract
Two experiments assessing event-related potentials in 5-month-old infants were conducted to examine neural correlates of attentional salience and efficiency of processing of a visual event (woman speaking) paired with redundant (synchronous) speech, nonredundant (asynchronous) speech, or no speech. In Experiment 1, the Nc component associated with attentional salience was greater in amplitude following synchronous audiovisual as compared with asynchronous audiovisual and unimodal visual presentations. A block design was utilized in Experiment 2 to examine efficiency of processing of a visual event. Only infants exposed to synchronous audiovisual speech demonstrated a significant reduction in amplitude of the late slow wave associated with successful stimulus processing and recognition memory from early to late blocks of trials. These findings indicate that, in 5-month-old infants, events that provide intersensory redundancy are associated with enhanced neural responsiveness indicative of greater attentional salience and more efficient stimulus processing as compared with the same events when they provide no intersensory redundancy.
Affiliation(s)
- Greg D Reynolds: Department of Psychology, University of Tennessee, Knoxville, TN 37996
18. Ito T, Johns AR, Ostry DJ. Left lateralized enhancement of orofacial somatosensory processing due to speech sounds. J Speech Lang Hear Res 2013; 56:S1875-S1881. [PMID: 24687443] [PMCID: PMC4228692] [DOI: 10.1044/1092-4388(2013/12-0226)]
Abstract
PURPOSE: Somatosensory information associated with speech articulatory movements affects the perception of speech sounds and vice versa, suggesting an intimate linkage between speech production and perception systems. However, it is unclear which cortical processes are involved in the interaction between speech sounds and orofacial somatosensory inputs. The authors examined whether speech sounds modify orofacial somatosensory cortical potentials that were elicited using facial skin perturbations.
METHOD: Somatosensory event-related potentials in EEG were recorded in 3 background sound conditions (pink noise, speech sounds, and nonspeech sounds) and also in a silent condition. Facial skin deformations that are similar in timing and duration to those experienced in speech production were used for somatosensory stimulation.
RESULTS: The authors found that speech sounds reliably enhanced the first negative peak of the somatosensory event-related potential when compared with the other 3 sound conditions. The enhancement was evident at electrode locations above the left motor and premotor area of the orofacial system. The result indicates that speech sounds interact with somatosensory cortical processes that are produced by speech-production-like patterns of facial skin stretch.
CONCLUSION: Neural circuits in the left hemisphere, presumably in left motor and premotor cortex, may play a prominent role in the interaction between auditory inputs and speech-relevant somatosensory processing.
Affiliation(s)
- Alexis R. Johns: Haskins Laboratories, New Haven, CT; University of Connecticut, Storrs
- David J. Ostry: Haskins Laboratories, New Haven, CT; McGill University, Montreal, Quebec, Canada
19. Nozaradan S, Zerouali Y, Peretz I, Mouraux A. Capturing with EEG the neural entrainment and coupling underlying sensorimotor synchronization to the beat. Cereb Cortex 2013; 25:736-747. [PMID: 24108804] [DOI: 10.1093/cercor/bht261]
Abstract
Synchronizing movements with rhythmic inputs requires tight coupling of sensory and motor neural processes. Here, using a novel approach based on the recording of steady-state-evoked potentials (SS-EPs), we examine how distant brain areas supporting these processes coordinate their dynamics. The electroencephalogram was recorded while subjects listened to a 2.4-Hz auditory beat and tapped their hand on every second beat. When subjects tapped to the beat, the EEG was characterized by a 2.4-Hz SS-EP compatible with beat-related entrainment and a 1.2-Hz SS-EP compatible with movement-related entrainment, based on the results of source analysis. Most importantly, when compared with passive listening to the beat, we found evidence suggesting an interaction between sensory- and motor-related activities when subjects tapped to the beat, in the form of (1) an additional SS-EP appearing at 3.6 Hz, compatible with a nonlinear product of sensorimotor integration; (2) phase coupling of beat- and movement-related activities; and (3) selective enhancement of beat-related activities over the hemisphere contralateral to the tapping, suggesting a top-down effect of movement-related activities on auditory beat processing. Taken together, our results are compatible with the view that rhythmic sensorimotor synchronization is supported by a dynamic coupling of sensory- and motor-related activities.
Affiliation(s)
- Sylvie Nozaradan: Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Belgium; International Laboratory for Brain, Music and Sound Research (BRAMS), Université de Montréal, Canada
- Younes Zerouali: École de Technologie Supérieure, Université de Montréal, Canada
- Isabelle Peretz: International Laboratory for Brain, Music and Sound Research (BRAMS), Université de Montréal, Canada
- André Mouraux: Institute of Neuroscience (IONS), Université catholique de Louvain (UCL), Belgium
20. Mendes RM, Barbosa RI, Salmón CEG, Rondinoni C, Escorsi-Rosset S, Delsim JC, Barbieri CH, Mazzer N. Auditory stimuli from a sensor glove model modulate cortical audiotactile integration. Neurosci Lett 2013; 548:33-37. [DOI: 10.1016/j.neulet.2013.04.019]
21.
Abstract
There is a strong interaction between multisensory processing and the neuroplasticity of the human brain. On the one hand, recent research demonstrates that experience and training in various domains modify how information from the different senses is integrated; on the other hand, multisensory training paradigms seem to be particularly effective in driving functional and structural plasticity. Multisensory training affects early sensory processing within separate sensory domains, as well as the functional and structural connectivity between uni- and multisensory brain regions. In this review, we discuss the evidence for interactions of multisensory processes and brain plasticity and give an outlook on promising clinical applications and open questions.
22. Nordmark PF, Pruszynski JA, Johansson RS. BOLD responses to tactile stimuli in visual and auditory cortex depend on the frequency content of stimulation. J Cogn Neurosci 2012; 24:2120-2134. [PMID: 22721377] [DOI: 10.1162/jocn_a_00261]
Abstract
Although some brain areas preferentially process information from a particular sensory modality, these areas can also respond to other modalities. Here we used fMRI to show that such responsiveness to tactile stimuli depends on the temporal frequency of stimulation. Participants performed a tactile threshold-tracking task where the tip of either their left or right middle finger was stimulated at 3, 20, or 100 Hz. Whole-brain analysis revealed an effect of stimulus frequency in two regions: the auditory cortex and the visual cortex. The BOLD response in the auditory cortex was stronger during stimulation at hearable frequencies (20 and 100 Hz) whereas the response in the visual cortex was suppressed at infrasonic frequencies (3 Hz). Regardless of which hand was stimulated, the frequency-dependent effects were lateralized to the left auditory cortex and the right visual cortex. Furthermore, the frequency-dependent effects in both areas were abolished when the participants performed a visual task while receiving identical tactile stimulation as in the tactile threshold-tracking task. We interpret these findings in the context of the metamodal theory of brain function, which posits that brain areas contribute to sensory processing by performing specific computations regardless of input modality.
Affiliation(s)
- Per F Nordmark: Department of Integrative Medical Biology, Physiology Section, Umeå University, SE-90187 Umeå, Sweden
23.
Abstract
Playing a musical instrument requires a complex skill set that depends on the brain's ability to quickly integrate information from multiple senses. It has been well documented that intensive musical training alters brain structure and function within and across multisensory brain regions, supporting the experience-dependent plasticity model. Here, we argue that this experience-dependent plasticity occurs because of the multisensory nature of the brain and may be an important contributing factor to musical learning. This review highlights key multisensory regions within the brain and discusses their role in the context of music learning and rehabilitation.
Affiliation(s)
- Emily Zimmerman: Department of Newborn Medicine, Brigham and Women's Hospital, Boston, Massachusetts, USA
24. Cross-modal recruitment of primary visual cortex by auditory stimuli in the nonhuman primate brain: a molecular mapping study. Neural Plast 2012; 2012:197264. [PMID: 22792489] [PMCID: PMC3388421] [DOI: 10.1155/2012/197264]
Abstract
Recent studies suggest that exposure to only one component of audiovisual events can lead to cross-modal cortical activation. However, it is not certain whether such cross-modal recruitment can occur in the absence of explicit conditioning, semantic factors, or long-term associations. A recent study demonstrated that cross-modal cortical recruitment can occur even after a brief exposure to bimodal stimuli without semantic association. In addition, the authors showed that the primary visual cortex is under such cross-modal influence. In the present study, we used molecular activity mapping of the immediate early gene zif268. We found that animals that had previously been exposed to a combination of auditory and visual stimuli showed an increased number of active neurons in the primary visual cortex when presented with sounds alone. As previously implied, this cross-modal activation appears to be the result of implicit associations of the two stimuli, likely driven by their spatiotemporal characteristics; it was observed after a relatively short period of exposure (~45 min) and lasted for a relatively long period after the initial exposure (~1 day). These results suggest that the previously reported findings may be directly rooted in the increased activity of the neurons occupying the primary visual cortex.
25. Hsieh PJ, Colas JT, Kanwisher N. Spatial pattern of BOLD fMRI activation reveals cross-modal information in auditory cortex. J Neurophysiol 2012; 107:3428-3432. [PMID: 22514287] [DOI: 10.1152/jn.01094.2010]
Abstract
Recent findings suggest that neural representations in early auditory cortex reflect not only the physical properties of a stimulus, but also high-level, top-down, and even cross-modal information. However, the nature of cross-modal information in auditory cortex remains poorly understood. Here, we used pattern analyses of fMRI data to ask whether early auditory cortex contains information about the visual environment. Our data show that 1) early auditory cortex contained information about a visual stimulus when there was no bottom-up auditory signal, and that 2) no influence of visual stimulation was observed in auditory cortex when visual stimuli did not provide a context relevant to audition. Our findings attest to the capacity of auditory cortex to reflect high-level, top-down, and cross-modal information and indicate that the spatial patterns of activation in auditory cortex reflect contextual/implied auditory information but not visual information per se.
Affiliation(s)
- P-J Hsieh: Neuroscience and Behavioral Disorders Program, Duke-NUS Graduate Medical School, Singapore
26.
Abstract
Interactions between auditory and somatosensory information are relevant to the neural processing of speech, since speech processing, and certainly speech production, involves both auditory information and inputs that arise from the muscles and tissues of the vocal tract. We previously demonstrated that somatosensory inputs associated with facial skin deformation alter the perceptual processing of speech sounds. We show here that the reverse is also true: speech sounds alter the perception of facial somatosensory inputs. As a somatosensory task, we used a robotic device to create patterns of facial skin deformation that would normally accompany speech production. We found that the perception of the facial skin deformation was altered by speech sounds in a manner that reflects the way in which auditory and somatosensory effects are linked in speech production. The modulation of orofacial somatosensory processing by auditory inputs was specific to speech and likewise to facial skin deformation. Somatosensory judgments were not affected when the skin deformation was delivered to the forearm or palm or when the facial skin deformation accompanied nonspeech sounds. The perceptual modulation that we observed in conjunction with speech sounds shows that speech sounds specifically affect neural processing in the facial somatosensory system and suggests the involvement of the somatosensory system in both the production and perceptual processing of speech.
Affiliation(s)
- Takayuki Ito: Department of Psychology, McGill University, 1205 Dr. Penfield Ave., Montréal, QC, Canada H3A 1B1
27.
Abstract
Certain features of objects or events can be represented by more than a single sensory system, such as the roughness of a surface (sight, sound, and touch), the location of a speaker (audition and sight), and the rhythm or duration of an event (all three major sensory systems). Thus, these properties can be said to be sensory-independent or amodal. A key question is whether common multisensory cortical regions process these amodal features, or whether each sensory system contains its own specialized region(s) for processing common features. We tackled this issue by investigating simple duration-detection mechanisms across audition and touch; these systems were chosen because fine duration discriminations are possible in both. The mismatch negativity (MMN) component of the human event-related potential provides a sensitive metric of duration processing and has been elicited independently during both auditory and somatosensory investigations. Employing high-density electroencephalographic recordings in conjunction with intracranial subdural recordings, we asked whether fine duration discriminations, represented by the MMN, were generated in the same cortical regions regardless of the sensory modality being probed. Scalp recordings pointed to statistically distinct MMN topographies across senses, implying differential underlying cortical generator configurations. Intracranial recordings confirmed these noninvasive findings, showing generators of the auditory MMN along the superior temporal gyrus with no evidence of a somatosensory MMN in this region, whereas a robust somatosensory MMN was recorded from postcentral gyrus in the absence of an auditory MMN. The current data clearly argue against a common circuitry account of amodal duration processing.
28. Russo N, Foxe JJ, Brandwein AB, Altschuler T, Gomes H, Molholm S. Multisensory processing in children with autism: high-density electrical mapping of auditory-somatosensory integration. Autism Res 2011; 3:253-267. [PMID: 20730775] [DOI: 10.1002/aur.152]
Abstract
Successful integration of signals from the various sensory systems is crucial for normal sensory-perceptual functioning, allowing for the perception of coherent objects rather than a disconnected cluster of fragmented features. Several prominent theories of autism suggest that automatic integration is impaired in this population, but there have been few empirical tests of this thesis. A standard electrophysiological metric of multisensory integration (MSI) was used to test the integrity of auditory-somatosensory integration in children with autism (N=17, aged 6-16 years), compared to age- and IQ-matched typically developing (TD) children. High-density electrophysiology was recorded while participants were presented with either auditory or somatosensory stimuli alone (unisensory conditions), or as a combined auditory-somatosensory stimulus (multisensory condition), in randomized order. Participants watched a silent movie during testing, ignoring concurrent stimulation. Significant differences between neural responses to the multisensory auditory-somatosensory stimulus and the unisensory stimuli (the sum of the responses to the auditory and somatosensory stimuli when presented alone) served as the dependent measure. The data revealed group differences in the integration of auditory and somatosensory information that appeared at around 175 ms, and were characterized by the presence of MSI for the TD but not the autism spectrum disorder (ASD) children. Overall, MSI was less extensive in the ASD group. These findings are discussed within the framework of current knowledge of MSI in typical development as well as in relation to theories of ASD.
Affiliation(s)
- Natalie Russo: City College of New York, The Children's Research Unit, Program in Cognitive Neuroscience, Departments of Psychology & Biology, New York, USA
29. Pihko E, Lauronen L, Kivistö K, Nevalainen P. Increasing the efficiency of neonatal MEG measurements by alternating auditory and tactile stimulation. Clin Neurophysiol 2010; 122:808-814. [PMID: 20951084] [DOI: 10.1016/j.clinph.2010.09.017]
Abstract
OBJECTIVE: To evaluate the possible effect of intervening auditory stimulation on somatosensory evoked magnetic fields in newborns.
METHODS: We recorded auditory and tactile evoked responses with magnetoencephalography (MEG) from two groups of healthy newborns. One group (n=11) received only tactile stimuli to the index finger; the other (n=11) received alternating tactile and auditory (vowel [a:] with 300-ms duration) stimuli. The interval between subsequent tactile stimuli was always 2 s. We analyzed the equivalent current dipoles (ECDs) of the main auditory and somatosensory responses.
RESULTS: The ECDs of the tactile responses agreed with activation of the primary somatosensory cortex at ∼60 ms and the secondary somatosensory region at ∼200 ms. The source of the auditory response (∼250 ms) was clearly distinct from those to tactile stimulation and in line with auditory cortex activation. The intervening auditory stimulation did not affect the strength, latency, or location of the ECDs of the tactile responses.
CONCLUSIONS: Auditory and tactile MEG responses from newborns can be obtained in one measurement session.
SIGNIFICANCE: The alternating stimulation can be used to shorten the total measurement time and/or to improve the signal-to-noise ratio by collecting more data.
Affiliation(s)
- Elina Pihko: Brain Research Unit, Low Temperature Laboratory, Aalto University School of Science and Technology, Espoo, Finland
30. Yau JM, Weber AI, Bensmaia SJ. Separate mechanisms for audio-tactile pitch and loudness interactions. Front Psychol 2010; 1:160. [PMID: 21887147] [PMCID: PMC3157934] [DOI: 10.3389/fpsyg.2010.00160]
Abstract
A major goal in perceptual neuroscience is to understand how signals from different sensory modalities are combined to produce stable and coherent representations. We previously investigated interactions between audition and touch, motivated by the fact that both modalities are sensitive to environmental oscillations. In our earlier study, we characterized the effect of auditory distractors on tactile frequency and intensity perception. Here, we describe the converse experiments examining the effect of tactile distractors on auditory processing. Because the two studies employ the same psychophysical paradigm, we combined their results for a comprehensive view of how auditory and tactile signals interact and how these interactions depend on the perceptual task. Together, our results show that temporal frequency representations are perceptually linked regardless of the attended modality. In contrast, audio-tactile loudness interactions depend on the attended modality: Tactile distractors influence judgments of auditory intensity, but judgments of tactile intensity are impervious to auditory distraction. Lastly, we show that audio-tactile loudness interactions depend critically on stimulus timing, while pitch interactions do not. These results reveal that auditory and tactile inputs are combined differently depending on the perceptual task. That distinct rules govern the integration of auditory and tactile signals in pitch and loudness perception implies that the two are mediated by separate neural mechanisms. These findings underscore the complexity and specificity of multisensory interactions.
Affiliation(s)
- Jeffrey M Yau: Department of Neurology, Division of Cognitive Neuroscience, Johns Hopkins University School of Medicine, Baltimore, MD, USA
31. Bolognini N, Papagno C, Moroni D, Maravita A. Tactile temporal processing in the auditory cortex. J Cogn Neurosci 2010; 22:1201-1211. [PMID: 19413471] [DOI: 10.1162/jocn.2009.21267]
Abstract
Perception of the outside world results from integration of information simultaneously derived via multiple senses. Increasing evidence suggests that the neural underpinnings of multisensory integration extend into the early stages of sensory processing. In the present study, we investigated whether the superior temporal gyrus (STG), an auditory modality-specific area, is critical for processing tactile events. Transcranial magnetic stimulation (TMS) was applied over the left STG and the left primary somatosensory cortex (SI) at different time intervals (60, 120, and 180 msec) during a tactile temporal discrimination task (Experiment 1) and a tactile spatial discrimination task (Experiment 2). Tactile temporal processing was disrupted when TMS was applied to SI at 60 msec after tactile presentation, confirming the modality specificity of this region. Crucially, TMS over STG also affected tactile temporal processing, but at a 180-msec delay. In both cases, the impairment was limited to contralateral touches and was due to reduced perceptual sensitivity. In contrast, tactile spatial processing was impaired only by TMS over SI at 60-120 msec. These findings demonstrate the causal involvement of auditory areas in processing the duration of somatosensory events, suggesting that STG might play a supramodal role in temporal perception. Furthermore, the involvement of auditory cortex in somatosensory processing supports the view that multisensory integration occurs at an early stage of cortical processing.

32. Aspell JE, Lavanchy T, Lenggenhager B, Blanke O. Seeing the body modulates audiotactile integration. Eur J Neurosci 2010; 31:1868-73. DOI: 10.1111/j.1460-9568.2010.07210.x.

33. An exploratory event-related potential study of multisensory integration in sensory over-responsive children. Brain Res 2010; 1321:67-77. PMID: 20097181. DOI: 10.1016/j.brainres.2010.01.043.
Abstract
Children who are over-responsive to sensation have defensive and "fight or flight" reactions to ordinary levels of sensory stimulation in the environment. Based on clinical observations, sensory over-responsivity is hypothesized to reflect atypical neural integration of sensory input. To examine a possible underlying neural mechanism of the disorder, integration of simultaneous multisensory auditory and somatosensory stimulation was studied in twenty children with sensory over-responsivity (SOR) using event-related potentials (ERPs). Three types of sensory stimuli were presented and ERPs were recorded from thirty-two scalp electrodes while participants watched a silent cartoon: bilateral auditory clicks, right somatosensory median nerve electrical pulses, or both simultaneously. The paradigm was passive; no behavioral responses were required. To examine integration, responses to simultaneous multisensory auditory-somatosensory stimulation were compared to the sum of unisensory auditory plus unisensory somatosensory responses in four time-windows: (60-80 ms, 80-110 ms, 110-150 ms, and 180-220 ms). Specific midline and lateral electrode sites were examined over scalp regions where auditory-somatosensory integration was expected based on previous studies. Midline electrode sites (Fz, Cz, and Pz) showed significant integration during two time-windows: 60-80 ms and 180-220 ms. Significant integration was also found at contralateral electrode site (C3) for the time-window between 180 and 220 ms. At ipsilateral electrode sites (C4 and CP6), no significant integration was found during any of the time-windows (i.e. the multisensory ERP was not significantly different from the summed unisensory ERP). These results demonstrate that MSI can be reliably measured in children with SOR and provide evidence that multisensory auditory-somatosensory input is integrated during both early and later stages of sensory information processing, mainly over fronto-central scalp regions.
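The additive-model contrast this study relies on (comparing the multisensory ERP against the sum of the unisensory ERPs within fixed time-windows) can be sketched as follows. The waveforms here are synthetic toys, not the study's data; only the window choices follow the abstract:

```python
import numpy as np

def summed_unisensory(erp_a, erp_s):
    """Sum of the unisensory auditory and somatosensory ERPs (same time base)."""
    return erp_a + erp_s

def mean_window(erp, times, lo, hi):
    """Mean amplitude within the time-window [lo, hi] in ms."""
    mask = (times >= lo) & (times <= hi)
    return float(erp[mask].mean())

times = np.arange(-100, 400)                     # 1 ms sampling, illustrative
erp_a = np.exp(-((times - 100.0) ** 2) / 800.0)  # toy auditory ERP
erp_s = np.exp(-((times - 70.0) ** 2) / 600.0)   # toy somatosensory ERP
erp_as = 1.3 * (erp_a + erp_s)                   # toy multisensory ERP (superadditive)

# A nonzero difference in a window indicates integration beyond the additive prediction.
for lo, hi in [(60, 80), (80, 110), (110, 150), (180, 220)]:
    diff = mean_window(erp_as - summed_unisensory(erp_a, erp_s), times, lo, hi)
    print(f"{lo}-{hi} ms: multisensory minus summed = {diff:+.3f}")
```

In practice the comparison is run per electrode site and tested statistically across participants; the sketch only shows the arithmetic of the contrast.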

34. Murray MM, Spierer L. Auditory spatio-temporal brain dynamics and their consequences for multisensory interactions in humans. Hear Res 2009; 258:121-33. DOI: 10.1016/j.heares.2009.04.022.

35. Zangenehpour S, Zatorre RJ. Crossmodal recruitment of primary visual cortex following brief exposure to bimodal audiovisual stimuli. Neuropsychologia 2009; 48:591-600. PMID: 19883668. DOI: 10.1016/j.neuropsychologia.2009.10.022.
Abstract
Several lines of evidence suggest that exposure to only one component of typically audiovisual events can lead to crossmodal cortical activation. These effects are likely explained by long-term associations formed between the auditory and visual components of such events. It is not certain whether such crossmodal recruitment can occur in the absence of explicit conditioning, semantic factors, or long-term association; nor is it clear whether primary sensory cortices can be recruited in such paradigms. In the present study we tested the hypothesis that crossmodal cortical recruitment would occur even after a brief exposure to bimodal stimuli without semantic association. We used positron emission tomography, and an apparatus allowing presentation of spatially and temporally congruous audiovisual stimuli (noise bursts and light flashes). When presented with only the auditory or visual components of the bimodal stimuli, naïve subjects showed only modality-specific cortical activation, as expected. However, subjects who had previously been exposed to the audiovisual stimuli showed increased cerebral blood flow in the primary visual cortex when presented with sounds alone. Functional connectivity analysis suggested that the auditory cortex was the source of visual cortex activity. This crossmodal activation appears to be the result of implicit associations of the two stimuli, likely driven by their spatiotemporal characteristics; it was observed after a relatively short period of exposure (approximately 45 min), and lasted for a relatively long period after the initial exposure (approximately 1 day). The findings indicate that auditory and visual cortices interact with one another to a larger degree than typically assumed.

36. Polley DB, Hillock AR, Spankovich C, Popescu MV, Royal DW, Wallace MT. Development and plasticity of intra- and intersensory information processing. J Am Acad Audiol 2009; 19:780-98. PMID: 19358458. DOI: 10.3766/jaaa.19.10.6.
Abstract
The functional architecture of sensory brain regions reflects an ingenious biological solution to the competing demands of a continually changing sensory environment. While they are malleable, they have the constancy necessary to support a stable sensory percept. How does the functional organization of sensory brain regions contend with these antithetical demands? Here we describe the functional organization of auditory and multisensory (i.e., auditory-visual) information processing in three sensory brain structures: (1) a low-level unisensory cortical region, the primary auditory cortex (A1); (2) a higher-order multisensory cortical region, the anterior ectosylvian sulcus (AES); and (3) a multisensory subcortical structure, the superior colliculus (SC). We then present a body of work that characterizes the ontogenic expression of experience-dependent influences on the operations performed by the functional circuits contained within these regions. We will present data to support the hypothesis that the competing demands for plasticity and stability are addressed through a developmental transition in operational properties of functional circuits from an initially labile mode in the early stages of postnatal development to a more stable mode in the mature brain that retains the capacity for plasticity under specific experiential conditions. Finally, we discuss parallels between the central tenets of functional organization and plasticity of sensory brain structures drawn from animal studies and a growing literature on human brain plasticity and the potential applicability of these principles to the audiology clinic.
Affiliation(s)
- Daniel B Polley
- Vanderbilt Bill Wilkerson Center for Otolaryngology and Communication Sciences, Department of Hearing and Speech Sciences, Vanderbilt Kennedy Center for Human Development, Vanderbilt University Medical School, USA.

37. Kayser C, Petkov CI, Logothetis NK. Multisensory interactions in primate auditory cortex: fMRI and electrophysiology. Hear Res 2009; 258:80-8. PMID: 19269312. DOI: 10.1016/j.heares.2009.02.011.
Abstract
Recent studies suggest that multisensory integration occurs not only in higher association cortices but also at early stages of auditory processing, possibly in primary or secondary auditory cortex. Support for such early multisensory influences comes from functional magnetic resonance imaging experiments in humans and monkeys. However, we argue that the current understanding of neurovascular coupling and of the neuronal basis underlying the imaging signal does not permit direct extrapolation from imaging data to properties of neurons in the same region. While imaging can guide subsequent electrophysiological studies, only these can determine whether and how neurons in auditory cortices combine information from multiple modalities. Indeed, electrophysiological studies only partly confirm the findings from imaging studies. While recordings of field potentials reveal strong influences of visual or somatosensory stimulation on synaptic activity even in primary auditory cortex, single-unit studies find that only a small minority of neurons are influenced by non-acoustic stimuli. We propose the analysis of the information-coding properties of individual neurons as one way to quantitatively determine whether the representation of our acoustic environment in (primary) auditory cortex indeed benefits from multisensory input.
Affiliation(s)
- Christoph Kayser
- Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tübingen, Germany.

38. Tanaka E, Inui K, Kida T, Miyazaki T, Takeshima Y, Kakigi R. A transition from unimodal to multimodal activations in four sensory modalities in humans: an electrophysiological study. BMC Neurosci 2008; 9:116. PMID: 19061523. PMCID: PMC2607283. DOI: 10.1186/1471-2202-9-116.
Abstract
Background: To investigate the long-latency activities common to all sensory modalities, electroencephalographic responses to auditory (1000 Hz pure tone), tactile (electrical stimulation to the index finger), visual (simple figure of a star), and noxious (intra-epidermal electrical stimulation to the dorsum of the hand) stimuli were recorded from 27 scalp electrodes in 14 healthy volunteers.
Results: Source modeling showed multimodal activations in the anterior part of the cingulate cortex (ACC) and hippocampal region (Hip). The activity in the ACC was biphasic. In all sensory modalities, the first component of ACC activity peaked 30-56 ms later than the peak of the major modality-specific activity, the second component of ACC activity peaked 117-145 ms later than the peak of the first component, and the activity in Hip peaked 43-77 ms later than the second component of ACC activity.
Conclusion: The temporal sequence of activations through modality-specific and multimodal pathways was similar among all sensory modalities.
Affiliation(s)
- Emi Tanaka
- Department of Integrative Physiology, National Institute for Physiological Sciences, Okazaki 444-8585, Japan.

39. Rodgers KM, Benison AM, Klein A, Barth DS. Auditory, somatosensory, and multisensory insular cortex in the rat. Cereb Cortex 2008; 18:2941-51. PMID: 18424777. PMCID: PMC2583160. DOI: 10.1093/cercor/bhn054.
Abstract
Compared with other areas of the forebrain, the function of insular cortex is poorly understood. This study examined the unisensory and multisensory function of the rat insula using high-resolution, whole-hemisphere, epipial evoked potential mapping. We found the posterior insula to contain distinct auditory and somatotopically organized somatosensory fields with an interposed and overlapping region capable of integrating these sensory modalities. Unisensory and multisensory responses were uninfluenced by complete lesioning of primary and secondary auditory and somatosensory cortices, suggesting a high degree of parallel afferent input from the thalamus. In light of the established connections of the posterior insula with the amygdala, we propose that integration of auditory and somatosensory modalities reported here may play a role in auditory fear conditioning.
Affiliation(s)
- Krista M Rodgers
- Department of Psychology, University of Colorado, Boulder, CO 80309-0345, USA

40. Soto-Faraco S, Deco G. Multisensory contributions to the perception of vibrotactile events. Behav Brain Res 2008; 196:145-54. PMID: 18930769. DOI: 10.1016/j.bbr.2008.09.018.
Abstract
We argue that audio-tactile interactions during vibrotactile processing provide a promising, albeit largely neglected, benchmark for the systematic study of multisensory integration. This article reviews and discusses current evidence for multisensory contributions to the perception of vibratory events, and proposes a framework to address a number of relevant questions. First, we highlight some of the features that characterize the senses of hearing and touch in terms of vibratory information processing, and which allow for potential cross-modal interactions at multiple levels along the functional architecture of the sensory systems. Second, we briefly review empirical evidence for interactions between hearing and touch in the domain of vibrotactile perception and related stimulus properties, covering behavioural, electrophysiological and neuroimaging studies in humans and animals. Third, we discuss the vibrotactile discrimination task, which has been successfully applied in the study of perception and decision processes in psychophysical and physiological research. We argue that this approach, complemented with computational modeling using biophysically realistic neural networks, may be a convenient framework to address auditory contributions to vibrotactile processing in the somatosensory system. Finally, we comment on a series of particular issues which are relevant in multisensory research and potentially addressable within the proposed framework.

41. Brett-Green BA, Miller LJ, Gavin WJ, Davies PL. Multisensory integration in children: a preliminary ERP study. Brain Res 2008; 1242:283-90. PMID: 18495092. DOI: 10.1016/j.brainres.2008.03.090.
Abstract
The spatio-temporal scalp distribution of multisensory auditory-somatosensory integration was investigated in typically developing children ages 6-13. Event-related potentials were recorded from 32 scalp electrodes while participants watched a silent cartoon. Three types of sensory stimulation were presented pseudo-randomly: auditory clicks, somatosensory median nerve electrical pulses, or simultaneous auditory and somatosensory stimuli. No behavioral responses were required of the participant. To examine integration, responses to simultaneous auditory and somatosensory stimulation were compared to the sum of unisensory auditory plus unisensory somatosensory responses for four time-windows: (60-80 ms, 80-110 ms, 110-150 ms and 180-220 ms). Results indicated significant multisensory integration occurred in central/post-central scalp regions between 60-80 ms in the hemisphere contralateral to the side of somatosensory stimulation and between 110-150 ms in the hemisphere ipsilateral to the side of somatosensory stimulation. Between 180-220 ms, significant multisensory integration was evident in central/post-central regions in both hemispheres as well as midline scalp regions. This study suggests that children exhibit differential processing of multisensory compared to unisensory stimuli, as has previously been reported in adults.

42. The Interaction Between Somatosensory and Auditory Cognitive Processing Assessed With Event-Related Potentials. J Clin Neurophysiol 2008; 25:90-7. DOI: 10.1097/wnp.0b013e31816a8ffa.

43. Kayser C, Petkov CI, Logothetis NK. Visual modulation of neurons in auditory cortex. Cereb Cortex 2008; 18:1560-74. PMID: 18180245. DOI: 10.1093/cercor/bhm187.
Abstract
Our brain integrates the information provided by the different sensory modalities into a coherent percept, and recent studies suggest that this process is not restricted to higher association areas. Here we evaluate the hypothesis that auditory cortical fields are involved in cross-modal processing by probing individual neurons for audiovisual interactions. We find that visual stimuli modulate auditory processing both at the level of field potentials and single-unit activity and already in primary and secondary auditory fields. These interactions strongly depend on a stimulus' efficacy in driving the neurons but occur independently of stimulus category and for naturalistic as well as artificial stimuli. In addition, interactions are sensitive to the relative timing of audiovisual stimuli and are strongest when visual stimuli lead by 20-80 msec. Exploring the underlying mechanisms, we find that enhancement correlates with the resetting of slow (approximately 10 Hz) oscillations to a phase angle of optimal excitability. These results demonstrate that visual stimuli can modulate the firing of neurons in auditory cortex in a manner that depends on stimulus efficacy and timing. These neurons thus meet the criteria for sensory integration and provide the auditory modality with multisensory contextual information about co-occurring environmental events.
Affiliation(s)
- Christoph Kayser
- Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tübingen, Germany.

44. Mouraux A, Plaghki L. Cortical interactions and integration of nociceptive and non-nociceptive somatosensory inputs in humans. Neuroscience 2007; 150:72-81. DOI: 10.1016/j.neuroscience.2007.08.035.

45. Bresciani JP, Ernst MO. Signal reliability modulates auditory-tactile integration for event counting. Neuroreport 2007; 18:1157-61. PMID: 17589318. DOI: 10.1097/wnr.0b013e3281ace0ca.
Abstract
Sequences of auditory beeps and tactile taps were simultaneously presented and participants were instructed to focus on one of these modalities and to ignore the other. We tested whether (i) the two sensory channels bias one another and (ii) the interaction depends on the relative reliability of the channels. Audition biased tactile perception and touch biased auditory perception. Lowering the reliability of the auditory channel (i.e. the intensity of the beeps) decreased the effect of audition on touch and increased the effect of touch on audition. These results show that simultaneous auditory and tactile stimuli tend to be automatically integrated in a reliability-dependent manner.
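The reliability-dependent biases reported here are consistent with the standard inverse-variance (maximum-likelihood) cue-combination scheme. A minimal sketch, in which the counts and variances are illustrative assumptions rather than values fitted to this study:

```python
def combine(est_a, var_a, est_t, var_t):
    """Inverse-variance weighted fusion of auditory and tactile estimates."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_t)
    fused = w_a * est_a + (1.0 - w_a) * est_t
    # The fused estimate is more reliable (lower variance) than either cue alone.
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_t)
    return fused, fused_var, w_a

# Reliable beeps (low auditory variance) dominate the perceived event count...
fused_hi, var_hi, w_hi = combine(est_a=4.0, var_a=0.25, est_t=5.0, var_t=1.0)
# ...and lowering beep intensity (raising auditory variance) shifts weight to touch.
fused_lo, var_lo, w_lo = combine(est_a=4.0, var_a=2.0, est_t=5.0, var_t=1.0)
print(f"auditory weight: reliable = {w_hi:.2f}, degraded = {w_lo:.2f}")
```

The same arithmetic captures both directions of bias in the study: whichever channel is attended, the ignored channel pulls the estimate in proportion to its relative reliability.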

46. Ku Y, Ohara S, Wang L, Lenz FA, Hsiao SS, Bodner M, Hong B, Zhou YD. Prefrontal cortex and somatosensory cortex in tactile crossmodal association: an independent component analysis of ERP recordings. PLoS One 2007; 2:e771. PMID: 17712419. PMCID: PMC1942117. DOI: 10.1371/journal.pone.0000771.
Abstract
Our previous studies on scalp-recorded event-related potentials (ERPs) showed that somatosensory N140 evoked by a tactile vibration in working memory tasks was enhanced when human subjects expected a coming visual stimulus that had been paired with the tactile stimulus. The results suggested that such enhancement represented the cortical activities involved in tactile-visual crossmodal association. In the present study, we further hypothesized that the enhancement represented the neural activities in somatosensory and frontal cortices in the crossmodal association. By applying independent component analysis (ICA) to the ERP data, we found independent components (ICs) located in the medial prefrontal cortex (around the anterior cingulate cortex, ACC) and the primary somatosensory cortex (SI). The activity represented by the IC in SI cortex showed enhancement in expectation of the visual stimulus. Such differential activity thus suggested the participation of SI cortex in the task-related crossmodal association. Further, the coherence analysis and the Granger causality spectral analysis of the ICs showed that SI cortex appeared to cooperate with ACC in attention and perception of the tactile stimulus in crossmodal association. The results of our study support with new evidence an important idea in cortical neurophysiology: higher cognitive operations develop from the modality-specific sensory cortices (in the present study, SI cortex) that are involved in sensation and perception of various stimuli.
Affiliation(s)
- Yixuan Ku
- Department of Neurosurgery, Johns Hopkins University, Baltimore, Maryland, United States of America
- Tsinghua University, Beijing, People's Republic of China
- Shinji Ohara
- Department of Neurosurgery, Johns Hopkins University, Baltimore, Maryland, United States of America
- Liping Wang
- Department of Neurosurgery, Johns Hopkins University, Baltimore, Maryland, United States of America
- The Institute of Cognitive Neuroscience, East China Normal University, Shanghai, People's Republic of China
- Fred A. Lenz
- Department of Neurosurgery, Johns Hopkins University, Baltimore, Maryland, United States of America
- Steven S. Hsiao
- Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America
- Department of Neuroscience, Johns Hopkins University, Baltimore, Maryland, United States of America
- Mark Bodner
- Department of Neurosurgery, Johns Hopkins University, Baltimore, Maryland, United States of America
- The Music Intelligence Neural Development Institute, Costa Mesa, California, United States of America
- Bo Hong
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, United States of America
- Tsinghua University, Beijing, People's Republic of China
- Yong-Di Zhou
- Department of Neurosurgery, Johns Hopkins University, Baltimore, Maryland, United States of America
- Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, United States of America

47. Baumann S, Koeneke S, Schmidt CF, Meyer M, Lutz K, Jancke L. A network for audio-motor coordination in skilled pianists and non-musicians. Brain Res 2007; 1161:65-78. PMID: 17603027. DOI: 10.1016/j.brainres.2007.05.045.
Abstract
Playing a musical instrument requires efficient auditory and motor processing. Fast feed-forward and feedback connections that link the acoustic target to the corresponding motor programs need to be established during years of practice. The aim of our study is to provide a detailed description of the cortical structures that participate in this audio-motor coordination network in professional pianists and non-musicians. In order to map these interacting areas using functional magnetic resonance imaging (fMRI), we considered cortical areas that are concurrently activated during silent piano performance and motionless listening to piano sound. Furthermore, we investigated to what extent interactions between the auditory and the motor modality happen involuntarily. We observed a network of predominantly secondary and higher-order areas belonging to the auditory and motor modalities. The extent of activity was clearly increased by imagination of the absent modality. However, this network comprised neither primary auditory nor primary motor areas in any condition. Activity in the lateral dorsal premotor cortex (PMd) and the pre-supplementary motor area (preSMA) was significantly increased for pianists. Our data imply an intermodal transformation network of auditory and motor areas that is subject to a certain degree of plasticity through intensive training.
Affiliation(s)
- Simon Baumann
- Department of Neuropsychology, Institute for Psychology, University of Zurich, Switzerland.

48. Iguchi Y, Hoshi Y, Nemoto M, Taira M, Hashimoto I. Co-activation of the secondary somatosensory and auditory cortices facilitates frequency discrimination of vibrotactile stimuli. Neuroscience 2007; 148:461-72. PMID: 17640818. DOI: 10.1016/j.neuroscience.2007.06.004.
Abstract
The contribution of the auditory cortex to tactile information processing was studied by measuring somatosensory evoked magnetic fields (SEFs). Three kinds of vibrotactile stimuli with frequencies of 180, 280 and 380 Hz were randomly delivered on the right index finger with a probability of 40, 20 and 40%, respectively. Twenty normal subjects participated in four kinds of tasks: a control condition to ignore these stimuli, a simple task to discriminate the 280-Hz stimulus from the other two stimuli (discrimination task for the vibrotactile stimuli, Ts task), a feedback task modified from the Ts task by adding acoustic feedback of the vibratory frequency at 1300 ms poststimulus (tactile discrimination with auditory clues, TA), and an easy version of the TA task (TA-easy) to discriminate the 280-Hz stimulus (20% target) from the 180- or 380-Hz stimuli (80% nontarget). The Ts and TA tasks required accurate perception of the vibrotactile frequencies to discriminate among the three kinds of stimuli. Under such a task demand, the post hoc auditory feedback in the TA task was expected to induce acoustic imagery for the tactile sensation. The SEFs for the nontarget stimuli were analyzed. A middle-latency component (M150/200) was specifically evoked by the three discrimination tasks. In the Ts and TA-easy tasks, the M150/200 source indicated inferior parietal cortical activities (SII area). In the TA task, 11 subjects showed activity in both the SII area and the superior temporal auditory region and increased accuracy of discrimination compared with the Ts task, in contrast with other subjects who showed activity only in the SII area and small changes in task accuracy between the Ts and TA tasks. Asynchronous auditory feedback for the vibrotactile sensation induced the auditory cortex activity in the SEFs in relation to the progress in tactile discrimination, which suggested an induction of acoustic imagery to complement the tactile information processing.
Affiliation(s)
- Yoshinobu Iguchi
- Integrated Neuroscience Research Team, Tokyo Institute of Psychiatry, 2-1-8 Kamikitazawa, Tokyo 156-8585, Japan.

49. Gondan M, Vorberg D, Greenlee MW. Modality shift effects mimic multisensory interactions: an event-related potential study. Exp Brain Res 2007; 182:199-214. PMID: 17562033. DOI: 10.1007/s00221-007-0982-4.
Abstract
A frequent approach to study interactions of the auditory and the visual system is to measure event-related potentials (ERPs) to auditory, visual, and auditory-visual stimuli (A, V, AV). A nonzero result of the AV - (A + V) comparison indicates that the sensory systems interact at a specific processing stage. Two possible biases weaken the conclusions drawn by this approach: first, subtracting two ERPs from one requires that A, V, and AV do not share any common activity. We have shown before (Gondan and Röder in Brain Res 1073-1074:389-397, 2006) that the problem of common activity can be avoided using an additional tactile stimulus (T) and evaluating the ERP difference (T + TAV) - (TA + TV). A second possible confound is the modality shift effect (MSE): for example, the auditory N1 is increased if an auditory stimulus follows a visual stimulus, whereas it is smaller if the modality is unchanged (ipsimodal stimulus). Bimodal stimuli might be affected less by MSEs because at least one component always matches the preceding trial. Consequently, an apparent amplitude modulation of the N1 would be observed in AV. We tested the influence of MSEs on auditory-visual interactions by comparing the results of AV - (A + V) using (a) all stimuli and using (b) only ipsimodal stimuli. (a) and (b) differed around 150 ms, this indicates that AV - (A + V) is indeed affected by the MSE. We then formally and empirically demonstrate that (T + TAV) - (TA + TV) is robust against possible biases due to the MSE.
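The algebra behind the (T + TAV) - (TA + TV) contrast can be illustrated with scalar stand-ins for ERP components. Assume each measured response is the sum of its modality-specific parts, an activity C common to every stimulus (e.g., anticipatory or task-related), and an audio-visual interaction I; higher-order tactile interactions are ignored for simplicity, and all values are hypothetical:

```python
# Hypothetical scalar components of the ERPs (arbitrary units).
C, A, V, T, I_av = 0.7, 1.0, 0.8, 0.9, 0.3   # common, unisensory, AV interaction

erp = {
    "A": C + A,
    "V": C + V,
    "T": C + T,
    "AV": C + A + V + I_av,
    "TA": C + T + A,
    "TV": C + T + V,
    "TAV": C + T + A + V + I_av,
}

# Classic contrast: the common activity C survives and biases the estimate of I_av.
naive = erp["AV"] - (erp["A"] + erp["V"])                      # equals I_av - C
# Tactile-augmented contrast: C and the tactile terms cancel exactly.
corrected = (erp["T"] + erp["TAV"]) - (erp["TA"] + erp["TV"])  # equals I_av
print(f"naive = {naive:+.1f}, corrected = {corrected:+.1f}")
```

Under these assumptions, the naive contrast mis-estimates the interaction by exactly -C, while the tactile-augmented contrast recovers I_av, which is the point of the design.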
Affiliation(s)
- Matthias Gondan
- Department of Psychology, University of Regensburg, 93050 Regensburg, Germany.

50. Zampini M, Torresan D, Spence C, Murray MM. Auditory-somatosensory multisensory interactions in front and rear space. Neuropsychologia 2007; 45:1869-77. PMID: 17291546. DOI: 10.1016/j.neuropsychologia.2006.12.004.
Abstract
The information conveyed by our senses can be combined to facilitate perception and behaviour. One focus of recent research has been on the factors governing such facilitatory multisensory interactions. The spatial register of neuronal receptive fields (RFs) appears to be a prerequisite for multisensory enhancement. In terms of auditory-somatosensory (AS) interactions, facilitatory effects on simple reaction times and on brain responses have been demonstrated in caudo-medial auditory cortices, both when auditory and somatosensory stimuli are presented to the same spatial location and also when they are separated by 100 degrees in frontal space. One implication is that these brain regions contain large spatial RFs. The present study further investigated this possibility and, in particular, the question of whether AS interactions are restricted to frontal space, since recent research has revealed some fundamental differences between the sensory processing of stimuli in front and rear space. Twelve participants performed a simple reaction time task to auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. The participants placed one of their arms in front of them and the other behind their backs. Loudspeakers were placed close to each hand. Thus, there were a total of eight stimulus conditions - four unisensory and four multisensory - including all possible combinations of posture and loudspeaker location. A significant facilitation of reaction times (RTs), exceeding that predicted by probability summation, was obtained following multisensory stimulation, irrespective of whether the stimuli were in spatial register or not. These results are interpreted in terms of the likely RF organization of previously identified auditory-somatosensory brain regions.
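The probability-summation benchmark invoked here is commonly tested with Miller's race-model inequality, P(RT_AS ≤ t) ≤ P(RT_A ≤ t) + P(RT_S ≤ t). A sketch with simulated reaction times; the distributions are illustrative assumptions, not the study's data:

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times evaluated at times t."""
    return np.searchsorted(np.sort(rts), t, side="right") / len(rts)

rng = np.random.default_rng(0)
rt_a = rng.normal(300, 40, 1000)    # auditory-alone RTs in ms (illustrative)
rt_s = rng.normal(320, 40, 1000)    # somatosensory-alone RTs
rt_as = rng.normal(250, 35, 1000)   # redundant-target RTs, faster than either

t = np.linspace(150, 450, 61)
# Race-model (probability summation) bound on the redundant-target CDF.
bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_s, t), 1.0)
violation = ecdf(rt_as, t) - bound
# A positive violation means the facilitation exceeds what probability summation allows.
print(f"max race-model violation: {violation.max():.3f}")
```

In the study's design this test would be run separately for each posture/loudspeaker combination to ask whether facilitation beyond probability summation depends on spatial register.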
Affiliation(s)
- Massimiliano Zampini
- Department of Cognitive Sciences and Education, University of Trento, Rovereto (TN), Italy.