51
Avery JA, Ingeholm JE, Wohltjen S, Collins M, Riddell CD, Gotts SJ, Kenworthy L, Wallace GL, Simmons WK, Martin A. Neural correlates of taste reactivity in autism spectrum disorder. Neuroimage Clin 2018;19:38-46. PMID: 30035000; PMCID: PMC6051474; DOI: 10.1016/j.nicl.2018.04.008.
Abstract
Selective or 'picky' eating habits are common among those with autism spectrum disorder (ASD). These behaviors are often related to aberrant sensory experience in individuals with ASD, including heightened reactivity to food taste and texture. However, very little is known about the neural mechanisms that underlie taste reactivity in ASD. In the present study, food-related neural responses were evaluated in 21 young adult and adolescent males diagnosed with ASD without intellectual disability, and 21 typically-developing (TD) controls. Taste reactivity was assessed using the Adolescent/Adult Sensory Profile, a clinical self-report measure. Functional magnetic resonance imaging was used to evaluate hemodynamic responses to sweet (vs. neutral) tastants and food pictures. Subjects also underwent resting-state functional connectivity scans. The ASD and TD individuals did not differ in their hemodynamic response to gustatory stimuli. However, the ASD subjects, but not the controls, exhibited a positive association between self-reported taste reactivity and the response to sweet tastants within the insular cortex and multiple brain regions associated with gustatory perception and reward. There was a strong interaction between diagnostic group and taste reactivity on tastant response in brain regions associated with ASD pathophysiology, including the bilateral anterior superior temporal sulcus (STS). This interaction of diagnosis and taste reactivity was also observed in the resting-state functional connectivity between the anterior STS and dorsal mid-insula (i.e., gustatory cortex). These results suggest that self-reported heightened taste reactivity in ASD is associated with heightened brain responses to food-related stimuli and atypical functional connectivity of primary gustatory cortex, which may predispose these individuals to maladaptive and unhealthy patterns of selective eating behavior. Trial registration (clinicaltrials.gov identifier): NCT01031407. Registered: December 14, 2009.
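The key analysis here is a diagnosis-by-reactivity interaction on extracted ROI responses. The sketch below shows that kind of moderation test on synthetic data; the variable names, group sizes, and simulated effect are illustrative assumptions, not the authors' actual data or pipeline.

```python
# Hypothetical sketch of a group-by-reactivity interaction test on ROI contrast
# estimates; all data here are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 21  # per-group sample size, matching the abstract's 21 vs. 21 design
df = pd.DataFrame({
    "group": ["ASD"] * n + ["TD"] * n,
    "reactivity": rng.normal(50, 10, 2 * n),   # stand-in Sensory Profile scores
})
# Simulate an anterior-STS sweet-vs-neutral beta whose association with
# reactivity exists only in the ASD group (the pattern described above).
df["sts_beta"] = np.where(df["group"] == "ASD", 0.04 * df["reactivity"], 0.0) \
    + rng.normal(0, 0.5, 2 * n)

# The C(group):reactivity coefficient indexes the diagnosis-by-reactivity interaction.
model = smf.ols("sts_beta ~ C(group) * reactivity", data=df).fit()
print(model.summary())
```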
Affiliation(s)
- Jason A Avery: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
- John E Ingeholm: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
- Sophie Wohltjen: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
- Meghan Collins: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
- Cameron D Riddell: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
- Stephen J Gotts: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
- Lauren Kenworthy: Center for Autism Spectrum Disorders, Children's National Health System, Washington, DC, United States
- Gregory L Wallace: Department of Speech, Language, and Hearing Sciences, The George Washington University, Washington, DC, United States
- W Kyle Simmons: Laureate Institute for Brain Research, Tulsa, OK, United States; School of Community Medicine, The University of Tulsa, Tulsa, OK, United States
- Alex Martin: Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, United States
52
Chu SH, Parhi KK, Lenglet C. Function-specific and Enhanced Brain Structural Connectivity Mapping via Joint Modeling of Diffusion and Functional MRI. Sci Rep 2018;8:4741. PMID: 29549287; PMCID: PMC5856752; DOI: 10.1038/s41598-018-23051-9.
Abstract
A joint structural-functional brain network model is presented, which enables the discovery of function-specific brain circuits, and recovers structural connections that are under-estimated by diffusion MRI (dMRI). Incorporating information from functional MRI (fMRI) into diffusion MRI to estimate brain circuits is a challenging task. Usually, seed regions for tractography are selected from fMRI activation maps to extract the white matter pathways of interest. The proposed method jointly analyzes whole brain dMRI and fMRI data, allowing the estimation of complete function-specific structural networks instead of interactively investigating the connectivity of individual cortical/sub-cortical areas. Additionally, tractography techniques are prone to limitations, which can result in erroneous pathways. The proposed framework explicitly models the interactions between structural and functional connectivity measures thereby improving anatomical circuit estimation. Results on Human Connectome Project (HCP) data demonstrate the benefits of the approach by successfully identifying function-specific anatomical circuits, such as the language and resting-state networks. In contrast to correlation-based or independent component analysis (ICA) functional connectivity mapping, detailed anatomical connectivity patterns are revealed for each functional module. Results on a phantom (Fibercup) also indicate improvements in structural connectivity mapping by rejecting false-positive connections with insufficient support from fMRI, and enhancing under-estimated connectivity with strong functional correlation.
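The abstract's central idea, letting functional correlations reject spurious tracts and boost under-estimated ones, can be illustrated with a toy matrix operation. This is only a conceptual sketch under arbitrary assumptions, not the published joint model.

```python
# Toy illustration (not the authors' model): combine a streamline-count matrix S
# with a functional correlation matrix F so that edges without functional support
# are rejected and strongly correlated pairs are up-weighted. Matrix sizes and
# the 0.1 support threshold are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 90                                            # assumed number of parcels
S = rng.poisson(5, size=(n, n)).astype(float)     # stand-in structural counts
F = np.corrcoef(rng.standard_normal((n, 200)))    # stand-in functional correlations

S = (S + S.T) / 2                                 # symmetrize streamline counts
np.fill_diagonal(S, 0)

support = np.abs(F) > 0.1                         # edges with minimal functional support
joint = np.where(support, S * (1 + np.abs(F)), 0.0)
print(joint.shape, int((joint > 0).sum() // 2), "edges retained")
```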
Affiliation(s)
- Shu-Hsien Chu: Electrical and Computer Engineering Department, University of Minnesota, Minneapolis, 55455, USA
- Keshab K Parhi: Electrical and Computer Engineering Department, University of Minnesota, Minneapolis, 55455, USA
- Christophe Lenglet: Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, 55455, USA
53
Corporaal SHA, Bruijn SM, Hoogkamer W, Chalavi S, Boisgontier MP, Duysens J, Swinnen SP, Gooijers J. Different neural substrates for precision stepping and fast online step adjustments in youth. Brain Struct Funct 2018;223:2039-2053. PMID: 29368052; PMCID: PMC5884917; DOI: 10.1007/s00429-017-1586-9.
Abstract
Humans can navigate through challenging environments (e.g., cluttered or uneven terrains) by modifying their preferred gait pattern (e.g., step length, step width, or speed). Growing behavioral and neuroimaging evidence suggests that the ability to modify preferred step patterns requires the recruitment of cognitive resources. In children, it is argued that prolonged development of complex gait is related to the ongoing development of involved brain regions, but this has not been directly investigated yet. Here, we aimed to elucidate the relationship between structural brain properties and complex gait in youth aged 9–18 years. We used volumetric analyses of cortical grey matter (GM) and whole-brain voxelwise statistical analyses of white matter (WM), and utilized a treadmill-based precision stepping task to investigate complex gait. Moreover, precision stepping was performed on step targets which were either unperturbed or perturbed (i.e., unexpectedly shifting to a new location). Our main findings revealed that larger unperturbed precision step error was associated with decreased WM microstructural organization of tracts that are particularly associated with attentional and visual processing functions. These results strengthen the hypothesis that precision stepping on unperturbed step targets is driven by cortical processes. In contrast, no significant correlations were found between perturbed precision stepping and cortical structures, indicating that other (neural) mechanisms may be more important for this type of stepping.
Affiliation(s)
- Sharissa H A Corporaal: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium
- Sjoerd M Bruijn: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium; Department of Human Movement Sciences, MOVE Research Institute Amsterdam, VU University Amsterdam, Amsterdam, The Netherlands
- Wouter Hoogkamer: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium; Department of Integrative Physiology, University of Colorado, Boulder, USA
- Sima Chalavi: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium
- Matthieu P Boisgontier: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium
- Jacques Duysens: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium
- Stephan P Swinnen: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium; Leuven Research Institute for Neuroscience and Disease (LIND), KU Leuven, Leuven, Belgium
- Jolien Gooijers: Movement Control and Neuroplasticity Research Group, Department of Movement Sciences, KU Leuven, Tervuursevest 101, box 1501, 3001, Leuven, Belgium
54
Chauvigné LAS, Belyk M, Brown S. Taking two to tango: fMRI analysis of improvised joint action with physical contact. PLoS One 2018;13:e0191098. PMID: 29324862; PMCID: PMC5764359; DOI: 10.1371/journal.pone.0191098.
Abstract
Many forms of joint action involve physical coupling between the participants, such as when moving a sofa together or dancing a tango. We report the results of a novel two-person functional MRI study in which trained couple dancers engaged in bimanual contact with an experimenter standing next to the bore of the magnet, and in which the two alternated between being the leader and the follower of joint improvised movements. Leading showed a general pattern of self-orientation, being associated with brain areas involved in motor planning, navigation, sequencing, action monitoring, and error correction. In contrast, following showed a far more sensory, externally-oriented pattern, revealing areas involved in somatosensation, proprioception, motion tracking, social cognition, and outcome monitoring. We also had participants perform a "mutual" condition in which the movement patterns were pre-learned and the roles were symmetric, thereby minimizing any tendency toward either leading or following. The mutual condition showed greater activity in brain areas involved in mentalizing and social reward than did leading or following. Finally, the analysis of improvisation revealed the dual importance of motor-planning and working-memory areas. We discuss these results in terms of theories of both joint action and improvisation.
Affiliation(s)
- Léa A. S. Chauvigné: Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada
- Michel Belyk: Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada
- Steven Brown: Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada
55
Cottereau BR, Smith AT, Rima S, Fize D, Héjja-Brichard Y, Renaud L, Lejards C, Vayssière N, Trotter Y, Durand JB. Processing of Egomotion-Consistent Optic Flow in the Rhesus Macaque Cortex. Cereb Cortex 2018;27:330-343. PMID: 28108489; PMCID: PMC5939222; DOI: 10.1093/cercor/bhw412.
Abstract
The cortical network that processes visual cues to self-motion was characterized with functional magnetic resonance imaging in 3 awake behaving macaques. The experimental protocol was similar to previous human studies in which the responses to a single large optic flow patch were contrasted with responses to an array of 9 similar flow patches. This distinguishes cortical regions where neurons respond to flow in their receptive fields regardless of surrounding motion from those that are sensitive to whether the overall image arises from self-motion. In all 3 animals, significant selectivity for egomotion-consistent flow was found in several areas previously associated with optic flow processing, and notably dorsal middle superior temporal area, ventral intra-parietal area, and VPS. It was also seen in areas 7a (Opt), STPm, FEFsem, FEFsac and in a region of the cingulate sulcus that may be homologous with human area CSv. Selectivity for egomotion-compatible flow was never total but was particularly strong in VPS and putative macaque CSv. Direct comparison of results with the equivalent human studies reveals several commonalities but also some differences.
Affiliation(s)
- Benoit R Cottereau: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
- Andrew T Smith: Department of Psychology, Royal Holloway, University of London, Egham, UK
- Samy Rima: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
- Denis Fize: Laboratoire d'Anthropologie Moléculaire et Imagerie de Synthèse, CNRS-Université de Toulouse, Toulouse, France
- Yseult Héjja-Brichard: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
- Luc Renaud: CNRS, CE2F PRIM UMS3537, Marseille, France; Aix Marseille Université, Centre d'Exploration Fonctionnelle et de Formation, Marseille, France
- Camille Lejards: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
- Nathalie Vayssière: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
- Yves Trotter: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
- Jean-Baptiste Durand: Université de Toulouse, Centre de Recherche Cerveau et Cognition, Toulouse, France; Centre National de la Recherche Scientifique, Toulouse, France
56
Functional anatomy of the macaque temporo-parieto-frontal connectivity. Cortex 2017;97:306-326. DOI: 10.1016/j.cortex.2016.12.007.
57
Miguel HO, Lisboa IC, Gonçalves ÓF, Sampaio A. Brain mechanisms for processing discriminative and affective touch in 7-month-old infants. Dev Cogn Neurosci 2017;35:20-27. PMID: 29108882; PMCID: PMC6968955; DOI: 10.1016/j.dcn.2017.10.008.
Abstract
Affective touch has been associated with affiliative behavior during early stages of infant development; however, its underlying brain mechanisms are still poorly understood. This study used fNIRS (functional near-infrared spectroscopy) to examine both affective and discriminative touch in 7-month-old infants (n=35). Infants were provided affective stimuli on the forearm for 10 sec followed by a 20 sec rest period. The protocol was repeated for discriminative touch, and both affective and discriminative stimuli were given in a counterbalanced order. Brain activation (oxy-hemoglobin and deoxy-hemoglobin levels) in the somatosensory and temporal regions was registered during administration of the stimuli. There was an increase in oxy-hemoglobin and decrease in deoxy-hemoglobin only in the somatosensory region in response to both affective and discriminative touch. No other activations were found. Seven-month-old infants' brain activation in the somatosensory cortex was similar for both discriminative and affective touch, but the stimuli did not elicit any activation in the temporal region/pSTS. Our study is the first to suggest that 7-month-old infants do not yet recruit socio-emotional brain areas in response to affective touch.
Affiliation(s)
- Helga O Miguel: Neuropsychophysiology Lab, CiPsi, School of Psychology, University of Minho, Campus de Gualtar, 4710-057 Braga, Portugal
- Isabel C Lisboa: Human Cognition Lab, CiPsi, School of Psychology, University of Minho, Campus de Gualtar, 4710-057 Braga, Portugal
- Óscar F Gonçalves: Neuropsychophysiology Lab, CiPsi, School of Psychology, University of Minho, Campus de Gualtar, 4710-057 Braga, Portugal; Spaulding Neuromodulation Center, Spaulding Rehabilitation Hospital, Harvard Medical School, Charlestown campus: 79/96 13th Street, Charlestown, MA, 02129, USA
- Adriana Sampaio: Neuropsychophysiology Lab, CiPsi, School of Psychology, University of Minho, Campus de Gualtar, 4710-057 Braga, Portugal
58
Sours C, Raghavan P, Foxworthy WA, Meredith MA, El Metwally D, Zhuo J, Gilmore JH, Medina AE, Gullapalli RP. Cortical multisensory connectivity is present near birth in humans. Brain Imaging Behav 2017;11:1207-1213. PMID: 27581715; PMCID: PMC5332431; DOI: 10.1007/s11682-016-9586-6.
Abstract
How the newborn brain adapts to its new multisensory environment has been a subject of debate. Although an early theory proposed that the brain acquires multisensory features as a result of postnatal experience, recent studies have demonstrated that the neonatal brain is already capable of processing multisensory information. For multisensory processing to be functional, it is a prerequisite that multisensory convergence among neural connections occur. However, multisensory connectivity has not been examined in human neonates, nor are its location(s) or afferent sources understood. We used resting state functional MRI (fMRI) in two independent cohorts of infants to examine the functional connectivity of two cortical areas known to be multisensory in adults: the intraparietal sulcus (IPS) and the superior temporal sulcus (STS). In the neonate, the IPS was found to demonstrate significant functional connectivity with visual association and somatosensory association areas, while the STS showed significant functional connectivity with the visual association areas, primary auditory cortex, and somatosensory association areas. Our findings establish that each of these areas displays functional communication with cortical regions representing various sensory modalities. This demonstrates the presence of cortical areas with converging sensory inputs, indicating that the functional architecture needed for multisensory processing is already present within the first weeks of life.
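Seed-based functional connectivity of the kind described here reduces to correlating a seed region's mean time series with every other voxel. Below is a minimal sketch with toy arrays; the dimensions, the seed mask, and the threshold are assumptions for illustration, not the study's actual analysis.

```python
# Minimal seed-based functional connectivity sketch (e.g., an STS seed).
# All arrays are simulated stand-ins for preprocessed resting-state data.
import numpy as np

t, v = 150, 5000                        # timepoints, voxels (toy dimensions)
rng = np.random.default_rng(1)
data = rng.standard_normal((t, v))      # stand-in voxel time series
seed_mask = np.zeros(v, dtype=bool)
seed_mask[:50] = True                   # assumed voxels belonging to the seed

seed_ts = data[:, seed_mask].mean(axis=1)          # average seed time series

# Correlate the seed with every voxel, then Fisher z-transform for group stats.
z = (data - data.mean(0)) / data.std(0)
zs = (seed_ts - seed_ts.mean()) / seed_ts.std()
r_map = z.T @ zs / t
fc_map = np.arctanh(np.clip(r_map, -0.999, 0.999))
print("voxels with |z| > 0.3:", int((np.abs(fc_map) > 0.3).sum()))
```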
Affiliation(s)
- Chandler Sours: Magnetic Resonance Research Center, University of Maryland School of Medicine, Baltimore, MD, 21201, USA; Department of Diagnostic Radiology & Nuclear Medicine, University of Maryland School of Medicine, Baltimore, MD, 21201, USA
- Prashant Raghavan: Department of Diagnostic Radiology & Nuclear Medicine, University of Maryland School of Medicine, Baltimore, MD, 21201, USA
- W Alex Foxworthy: Department of Pediatrics, University of Maryland School of Medicine, 655 W. Baltimore Street, Baltimore, MD, 21201, USA
- M Alex Meredith: Department of Anatomy and Neurobiology, Virginia Commonwealth University, Richmond, VA, 23298, USA
- Dina El Metwally: Department of Pediatrics, University of Maryland School of Medicine, 655 W. Baltimore Street, Baltimore, MD, 21201, USA
- Jiachen Zhuo: Magnetic Resonance Research Center, University of Maryland School of Medicine, Baltimore, MD, 21201, USA; Department of Diagnostic Radiology & Nuclear Medicine, University of Maryland School of Medicine, Baltimore, MD, 21201, USA
- John H Gilmore: Department of Psychiatry, University of North Carolina at Chapel Hill, Chapel Hill, NC, 27516, USA
- Alexandre E Medina: Department of Pediatrics, University of Maryland School of Medicine, 655 W. Baltimore Street, Baltimore, MD, 21201, USA
- Rao P Gullapalli: Magnetic Resonance Research Center, University of Maryland School of Medicine, Baltimore, MD, 21201, USA; Department of Diagnostic Radiology & Nuclear Medicine, University of Maryland School of Medicine, Baltimore, MD, 21201, USA
59
60
Aparicio M, Peigneux P, Charlier B, Balériaux D, Kavec M, Leybaert J. The Neural Basis of Speech Perception through Lipreading and Manual Cues: Evidence from Deaf Native Users of Cued Speech. Front Psychol 2017;8:426. PMID: 28424636; PMCID: PMC5371603; DOI: 10.3389/fpsyg.2017.00426.
Abstract
We present here the first neuroimaging data for perception of Cued Speech (CS) by deaf adults who are native users of CS. CS is a visual mode of communicating a spoken language through a set of manual cues which accompany lipreading and disambiguate it. With CS, sublexical units of the oral language are conveyed clearly and completely through the visual modality without requiring hearing. The comparison of neural processing of CS in deaf individuals with processing of audiovisual (AV) speech in normally hearing individuals represents a unique opportunity to explore the similarities and differences in neural processing of an oral language delivered in a visuo-manual vs. an AV modality. The study included deaf adult participants who were early CS users and native hearing users of French who process speech audiovisually. Words were presented in an event-related fMRI design. Three conditions were presented to each group of participants. The deaf participants saw CS words (manual + lipread), words presented as manual cues alone, and words presented to be lipread without manual cues. The hearing group saw AV spoken words, audio-alone and lipread-alone. Three findings are highlighted. First, the middle and superior temporal gyrus (excluding Heschl's gyrus) and left inferior frontal gyrus pars triangularis constituted a common, amodal neural basis for AV and CS perception. Second, integration was inferred in posterior parts of superior temporal sulcus for audio and lipread information in AV speech, but in the occipito-temporal junction, including MT/V5, for the manual cues and lipreading in CS. Third, the perception of manual cues showed a much greater overlap with the regions activated by CS (manual + lipreading) than lipreading alone did. This supports the notion that manual cues play a larger role than lipreading for CS processing. The present study contributes to a better understanding of the role of manual cues as support of visual speech perception in the framework of the multimodal nature of human communication.
Affiliation(s)
- Mario Aparicio: Laboratory of Cognition, Language and Development, Centre de Recherches Neurosciences et Cognition, Université Libre de Bruxelles, Brussels, Belgium
- Philippe Peigneux: Neuropsychology and Functional Neuroimaging Research Unit (UR2NF), Centre de Recherches Cognition et Neurosciences, Université Libre de Bruxelles, Brussels, Belgium
- Brigitte Charlier: Laboratory of Cognition, Language and Development, Centre de Recherches Neurosciences et Cognition, Université Libre de Bruxelles, Brussels, Belgium
- Danielle Balériaux: Department of Radiology, Clinics of Magnetic Resonance, Erasme Hospital, Brussels, Belgium
- Martin Kavec: Department of Radiology, Clinics of Magnetic Resonance, Erasme Hospital, Brussels, Belgium
- Jacqueline Leybaert: Laboratory of Cognition, Language and Development, Centre de Recherches Neurosciences et Cognition, Université Libre de Bruxelles, Brussels, Belgium
61
D’Imperio D, Scandola M, Gobbetto V, Bulgarelli C, Salgarello M, Avesani R, Moro V. Visual and cross-modal cues increase the identification of overlapping visual stimuli in Balint’s syndrome. J Clin Exp Neuropsychol 2017;39:786-802. DOI: 10.1080/13803395.2016.1266307.
Affiliation(s)
- Daniela D’Imperio: Department of Psychology, AgliotiLab, University of Rome, Faculty of Medicine and Psychology, Rome, Italy; Department of Human Sciences, Npsy.Lab-Vr, University of Verona, Verona, Italy
- Michele Scandola: Department of Human Sciences, Npsy.Lab-Vr, University of Verona, Verona, Italy; Department of Rehabilitation, Sacro Cuore Don Calabria Hospital, Negrar, Italy
- Valeria Gobbetto: Department of Rehabilitation, Sacro Cuore Don Calabria Hospital, Negrar, Italy
- Cristina Bulgarelli: Department of Rehabilitation, Sacro Cuore Don Calabria Hospital, Negrar, Italy
- Matteo Salgarello: Nuclear Medicine Unit, Ospedale Sacro Cuore Don Calabria, Negrar, Italy
- Renato Avesani: Department of Rehabilitation, Sacro Cuore Don Calabria Hospital, Negrar, Italy
- Valentina Moro: Department of Human Sciences, Npsy.Lab-Vr, University of Verona, Verona, Italy
62
Smith AT, Greenlee MW, DeAngelis GC, Angelaki D. Distributed Visual–Vestibular Processing in the Cerebral Cortex of Man and Macaque. Multisens Res 2017. DOI: 10.1163/22134808-00002568.
Abstract
Recent advances in understanding the neurobiological underpinnings of visual–vestibular interactions underlying self-motion perception are reviewed with an emphasis on comparisons between the macaque and human brains. In both species, several distinct cortical regions have been identified that are active during both visual and vestibular stimulation and in some of these there is clear evidence for sensory integration. Several possible cross-species homologies between cortical regions are identified. A key feature of cortical organization is that the same information is apparently represented in multiple, anatomically diverse cortical regions, suggesting that information about self-motion is used for different purposes in different brain regions.
Affiliation(s)
- Andrew T. Smith: Department of Psychology, Royal Holloway, University of London, Egham TW20 0EX, UK
- Mark W. Greenlee: Institute of Experimental Psychology, University of Regensburg, 93053 Regensburg, Germany
- Gregory C. DeAngelis: Department of Brain and Cognitive Sciences, University of Rochester, Rochester, New York 14627, USA
- Dora E. Angelaki: Department of Neuroscience, Baylor College of Medicine, Houston, Texas 77030, USA
63
Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration. Sci Rep 2016;6:33693. PMID: 27647158; PMCID: PMC5028762; DOI: 10.1038/srep33693.
Abstract
Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity between the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias linearly increases with stimulus disparity and is more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal as compared to unimodal stimulation. However, while the observed ERP differences are similar for all stimulus combinations, the extent of oscillatory desynchronization varies with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that in the EEG the level of perceptual integration is mainly reflected by changes in ongoing oscillatory activity.
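The alpha and beta power changes discussed above are band-limited spectral power estimates. The sketch below shows one common way to compute such band power for a single channel; the sampling rate, band limits, and simulated signal are assumptions, not this study's recording parameters.

```python
# Toy band-power computation for one EEG channel via Welch's method.
import numpy as np
from scipy.signal import welch

fs = 500                                           # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # 10 Hz + noise

f, pxx = welch(eeg, fs=fs, nperseg=2 * fs)         # power spectral density

def band_power(freqs, psd, lo, hi):
    """Integrate the PSD between lo and hi Hz (rectangle rule)."""
    sel = (freqs >= lo) & (freqs <= hi)
    return psd[sel].sum() * (freqs[1] - freqs[0])

print("alpha (8-12 Hz):", band_power(f, pxx, 8, 12))
print("beta (13-30 Hz):", band_power(f, pxx, 13, 30))
```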
64
Davidovic M, Jönsson EH, Olausson H, Björnsdotter M. Posterior Superior Temporal Sulcus Responses Predict Perceived Pleasantness of Skin Stroking. Front Hum Neurosci 2016;10:432. PMID: 27679564; PMCID: PMC5020046; DOI: 10.3389/fnhum.2016.00432.
Abstract
Love and affection are expressed through a range of physically intimate gestures, including caresses. Recent studies suggest that posterior temporal lobe areas typically associated with visual processing of social cues also respond to interpersonal touch. Here, we asked whether these areas are selective to caress-like skin stroking. We collected functional magnetic resonance imaging data from 23 healthy participants and compared brain responses to skin stroking and vibration. We did not find any significant differences between stroking and vibration in the posterior temporal lobe; however, right posterior superior temporal sulcus (pSTS) responses predicted healthy participants' perceived pleasantness of skin stroking, but not vibration. These findings link right pSTS responses to individual variability in perceived pleasantness of caress-like tactile stimuli. We speculate that the right pSTS may play a role in the translation of tactile stimuli into positively valenced, socially relevant interpersonal touch and that this system may be affected in disorders associated with impaired attachment.
Affiliation(s)
- Monika Davidovic: Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Emma H. Jönsson: Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
- Håkan Olausson: Center for Social and Affective Neuroscience, Linköping University, Linköping, Sweden
- Malin Björnsdotter: Center for Social and Affective Neuroscience, Linköping University, Linköping, Sweden; Center for Ethics, Law and Mental Health, University of Gothenburg, Gothenburg, Sweden
65
Abstract
The hypothesis that highly overlapping networks underlie brain functions (neural reuse) is decisively supported by three decades of multisensory research. Multisensory areas process information from more than one sensory modality and therefore represent the best examples of neural reuse. Recent evidence of multisensory processing in primary visual cortices further indicates that neural reuse is a basic feature of the brain.
66
Morrison I. ALE meta-analysis reveals dissociable networks for affective and discriminative aspects of touch. Hum Brain Mapp 2016;37:1308-20. PMID: 26873519; PMCID: PMC5066805; DOI: 10.1002/hbm.23103.
Abstract
Emotionally laden tactile stimulation, such as a caress on the skin or the feel of velvet, may represent a functionally distinct domain of touch, underpinned by specific cortical pathways. In order to determine whether, and to what extent, cortical functional neuroanatomy supports a distinction between affective and discriminative touch, an activation likelihood estimate (ALE) meta-analysis was performed. This meta-analysis statistically mapped reported functional magnetic resonance imaging (fMRI) activations from 17 published affective touch studies in which tactile stimulation was associated with positive subjective evaluation (n = 291, 34 experimental contrasts). A separate ALE meta-analysis mapped regions most likely to be activated by tactile stimulation during detection and discrimination tasks (n = 1,075, 91 experimental contrasts). These meta-analyses revealed dissociable regions for affective and discriminative touch, with posterior insula (PI) more likely to be activated for affective touch, and primary somatosensory cortices (SI) more likely to be activated for discriminative touch. Secondary somatosensory cortex had a high likelihood of engagement by both affective and discriminative touch. Further, meta-analytic connectivity modeling (MACM) analyses investigated network-level co-activation likelihoods independent of task or stimulus, across a range of domains and paradigms. Affective-related PI and discriminative-related SI regions co-activated with different networks, implicated in dissociable functions, but sharing somatosensory co-activations. Taken together, these meta-analytic findings suggest that affective and discriminative touch are dissociable both on the regional and network levels. However, their degree of shared activation likelihood in somatosensory cortices indicates that this dissociation reflects functional biases within tactile processing networks, rather than functionally and anatomically distinct pathways.
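For readers unfamiliar with ALE, the core computation blurs each reported focus with a Gaussian kernel and combines the resulting maps voxelwise. The sketch below is purely conceptual: the grid, kernel width, and coordinates are made up, and real ALE uses sample-size-dependent kernels plus permutation-based inference.

```python
# Conceptual ALE sketch: Gaussian-blurred foci combined as a probabilistic union.
import numpy as np

shape = (40, 48, 40)                       # toy grid, not MNI space
sigma = 2.5                                # assumed kernel width in voxels
foci = [(20, 24, 20), (22, 25, 19), (10, 30, 22)]   # made-up coordinates

grid = np.indices(shape)
ale = np.zeros(shape)
for focus in foci:
    d2 = sum((g - c) ** 2 for g, c in zip(grid, focus))
    ma = np.exp(-d2 / (2 * sigma ** 2))    # modeled activation map for one focus
    ale = 1 - (1 - ale) * (1 - ma)         # voxelwise probabilistic union across foci
print("peak ALE value:", float(ale.max()))
```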
Affiliation(s)
- India Morrison: Department of Clinical and Experimental Medicine, Center for Social and Affective Neuroscience (CSAN), Linköping University, Linköping, Sweden
67
Pishnamazi M, Nojaba Y, Ganjgahi H, Amousoltani A, Oghabian MA. Neural correlates of audiotactile phonetic processing in early-blind readers: an fMRI study. Exp Brain Res 2015;234:1263-77. DOI: 10.1007/s00221-015-4515-2.
68
Baum SH, Stevenson RA, Wallace MT. Behavioral, perceptual, and neural alterations in sensory and multisensory function in autism spectrum disorder. Prog Neurobiol 2015;134:140-60. PMID: 26455789; PMCID: PMC4730891; DOI: 10.1016/j.pneurobio.2015.09.007.
Abstract
Although sensory processing challenges have been noted since the first clinical descriptions of autism, it has taken until the release of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) in 2013 for sensory problems to be included as part of the core symptoms of autism spectrum disorder (ASD) in the diagnostic profile. Because sensory information forms the building blocks for higher-order social and cognitive functions, we argue that sensory processing is not only an additional piece of the puzzle, but rather a critical cornerstone for characterizing and understanding ASD. In this review we discuss what is currently known about sensory processing in ASD, how sensory function fits within contemporary models of ASD, and what is understood about the differences in the underlying neural processing of sensory and social communication observed between individuals with and without ASD. In addition to highlighting the sensory features associated with ASD, we also emphasize the importance of multisensory processing in building perceptual and cognitive representations, and how deficits in multisensory integration may also be a core characteristic of ASD.
Affiliation(s)
- Sarah H Baum: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA
- Ryan A Stevenson: Department of Psychology, University of Toronto, Toronto, ON, Canada
- Mark T Wallace: Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Psychiatry, Vanderbilt University, Nashville, TN, USA
69
Multisensory Stimulation to Improve Low- and Higher-Level Sensory Deficits after Stroke: A Systematic Review. Neuropsychol Rev 2015;26:73-91. PMID: 26490254; PMCID: PMC4762927; DOI: 10.1007/s11065-015-9301-1.
Abstract
The aim of this systematic review was to integrate and assess evidence for the effectiveness of multisensory stimulation (i.e., stimulating at least two of the following sensory systems: visual, auditory, and somatosensory) as a possible rehabilitation method after stroke. Evidence was considered with a focus on low-level, perceptual (visual, auditory and somatosensory deficits), as well as higher-level, cognitive, sensory deficits. We referred to the electronic databases Scopus and PubMed to search for articles that were published before May 2015. Studies were included which evaluated the effects of multisensory stimulation on patients with low- or higher-level sensory deficits caused by stroke. Twenty-one studies were included in this review and the quality of these studies was assessed (based on eight elements: randomization, inclusion of control patient group, blinding of participants, blinding of researchers, follow-up, group size, reporting effect sizes, and reporting time post-stroke). Twenty of the twenty-one included studies demonstrate beneficial effects on low- and/or higher-level sensory deficits after stroke. Notwithstanding these beneficial effects, the quality of the studies is insufficient for a valid conclusion that multisensory stimulation can be successfully applied as an effective intervention. A valuable and necessary next step would be to set up well-designed randomized controlled trials to examine the effectiveness of multisensory stimulation as an intervention for low- and/or higher-level sensory deficits after stroke. Finally, we consider the potential mechanisms of multisensory stimulation for rehabilitation to guide this future research.
70
Wardak C, Guipponi O, Pinède S, Ben Hamed S. Tactile representation of the head and shoulders assessed by fMRI in the nonhuman primate. J Neurophysiol 2015;115:80-91. PMID: 26467517; DOI: 10.1152/jn.00633.2015.
Abstract
In nonhuman primates, tactile representation at the cortical level has mostly been studied using single-cell recordings targeted to specific cortical areas. In this study, we explored the representation of tactile information delivered to the face or the shoulders at the whole brain level, using functional magnetic resonance imaging (fMRI) in the nonhuman primate. We used air puffs delivered to the center of the face, the periphery of the face, or the shoulders. These stimulations elicited activations in numerous cortical areas, encompassing the primary and secondary somatosensory areas, prefrontal and premotor areas, and parietal, temporal, and cingulate areas as well as low-level visual cortex. Importantly, a specific parieto-temporo-prefrontal network responded to the three stimulations but presented a marked preference for air puffs directed to the center of the face. This network corresponds to areas that are also involved in near-space representation, as well as in the multisensory integration of information at the interface between this near space and the skin of the face, and is probably involved in the construction of a peripersonal space representation around the head.
Affiliation(s)
- Claire Wardak: Centre de Neuroscience Cognitive, UMR 5229, Centre National de la Recherche Scientifique, Université Claude Bernard Lyon 1, Bron, France
- Olivier Guipponi: Centre de Neuroscience Cognitive, UMR 5229, Centre National de la Recherche Scientifique, Université Claude Bernard Lyon 1, Bron, France
- Serge Pinède: Centre de Neuroscience Cognitive, UMR 5229, Centre National de la Recherche Scientifique, Université Claude Bernard Lyon 1, Bron, France
- Suliann Ben Hamed: Centre de Neuroscience Cognitive, UMR 5229, Centre National de la Recherche Scientifique, Université Claude Bernard Lyon 1, Bron, France
71
Stevenson RA, Segers M, Ferber S, Barense MD, Camarata S, Wallace MT. Keeping time in the brain: Autism spectrum disorder and audiovisual temporal processing. Autism Res 2015;9:720-38. PMID: 26402725; DOI: 10.1002/aur.1566.
Abstract
A growing area of interest and relevance in the study of autism spectrum disorder (ASD) focuses on the relationship between multisensory temporal function and the behavioral, perceptual, and cognitive impairments observed in ASD. Atypical sensory processing is becoming increasingly recognized as a core component of autism, with evidence of atypical processing across a number of sensory modalities. These deviations from typical processing underscore the value of interpreting ASD within a multisensory framework. Furthermore, converging evidence illustrates that these differences in audiovisual processing may be specifically related to temporal processing. This review seeks to bridge the connection between temporal processing and audiovisual perception, and to elaborate on emerging data showing differences in audiovisual temporal function in autism. We also discuss the consequence of such changes, the specific impact on the processing of different classes of audiovisual stimuli (e.g. speech vs. nonspeech, etc.), and the presumptive brain processes and networks underlying audiovisual temporal integration. Finally, possible downstream behavioral implications, and possible remediation strategies are outlined. Autism Res 2016, 9: 720-738. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
Affiliation(s)
- Ryan A Stevenson: Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- Magali Segers: Department of Psychology, York University, Toronto, Ontario, Canada
- Susanne Ferber: Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Morgan D Barense: Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Rotman Research Institute, Toronto, Ontario, Canada
- Stephen Camarata: Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee
- Mark T Wallace: Department of Hearing and Speech Sciences, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, Tennessee; Vanderbilt Brain Institute, Vanderbilt University Medical Center, Nashville, Tennessee; Department of Psychology, Vanderbilt University, Nashville, Tennessee; Department of Psychiatry, Vanderbilt University Medical Center, Nashville, Tennessee
72
Abstract
The superior temporal sulcus (STS) is implicated in a variety of social processes, ranging from language perception to simulating the mental processes of others (theory of mind). In a new study, Deen and colleagues use functional magnetic resonance imaging (fMRI) to show a regular anterior-posterior organization in the STS for different social tasks.
Affiliation(s)
- Michael S Beauchamp: Department of Neurosurgery and Core for Advanced MRI, Baylor College of Medicine, Houston, TX, USA
73
Jiang F, Beauchamp MS, Fine I. Re-examining overlap between tactile and visual motion responses within hMT+ and STS. Neuroimage 2015;119:187-96. PMID: 26123373; DOI: 10.1016/j.neuroimage.2015.06.056.
Abstract
Here, we examine overlap between tactile and visual motion BOLD responses within the human MT+ complex. Although several studies have reported tactile responses overlapping with hMT+, many used group average analyses, leaving it unclear whether these responses were restricted to subregions of hMT+. Moreover, previous studies either employed a tactile task or passive stimulation, leaving it unclear whether or not tactile responses in hMT+ are simply the consequence of visual imagery. Here, we carried out a replication of one of the classic papers finding tactile responses in hMT+. We mapped MT and MST in individual subjects using visual field localizers. We then examined responses to tactile motion on the arm, either presented passively or in the presence of a visual task performed at fixation designed to minimize visualization of the concurrent tactile stimulation. To our surprise, without a visual task, we found only weak tactile motion responses in MT (6% of voxels showing tactile responses) and MST (2% of voxels). With an unrelated visual task designed to withdraw attention from the tactile modality, responses in MST were reduced to almost nothing (<1% of voxels). Consistent with previous results, we did observe tactile responses in STS regions superior and anterior to hMT+. Despite the lack of individual overlap, group-averaged responses produced strong spurious overlap between tactile and visual motion responses within hMT+ that resembled those observed in previous studies. The weak nature of tactile responses in hMT+ (and their abolition by withdrawal of attention) suggests that hMT+ may not serve as a supramodal motion processing module.
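The percentages quoted above amount to the fraction of an individually localized ROI that survives the tactile contrast. A toy sketch of that overlap computation is given below; both masks are random stand-ins, not real localizer output.

```python
# Fraction of ROI voxels (e.g., MT defined by a visual localizer) that also
# pass a tactile-motion contrast. Masks are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(3)
mt_roi = rng.random(10000) < 0.02          # assumed visual-localizer ROI mask
tactile_active = rng.random(10000) < 0.05  # assumed tactile suprathreshold mask

overlap = mt_roi & tactile_active
pct = 100 * overlap.sum() / mt_roi.sum()
print(f"{pct:.1f}% of ROI voxels show tactile responses")
```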
Affiliation(s)
- Fang Jiang: Department of Psychology, University of Washington, Seattle, WA 98195, USA; Department of Psychology, University of Nevada Reno, Reno, NV 89557, USA
- Michael S Beauchamp: Department of Neurosurgery, Baylor College of Medicine, Houston, TX 77030, USA
- Ione Fine: Department of Psychology, University of Washington, Seattle, WA 98195, USA
74
Leonardelli E, Braun C, Weisz N, Lithari C, Occelli V, Zampini M. Prestimulus oscillatory alpha power and connectivity patterns predispose perceptual integration of an audio and a tactile stimulus. Hum Brain Mapp 2015;36:3486-98. PMID: 26109518; DOI: 10.1002/hbm.22857.
Abstract
To efficiently perceive and respond to the external environment, our brain has to perceptually integrate or segregate stimuli of different modalities. The temporal relationship between the different sensory modalities is therefore essential for the formation of different multisensory percepts. In this magnetoencephalography study, we created a paradigm where an audio and a tactile stimulus were presented with an ambiguous temporal relationship so that perception of physically identical audiotactile stimuli could vary between integrated (emanating from the same source) and segregated. This bistable paradigm allowed us to compare identical bimodal stimuli that elicited different percepts, providing a possibility to directly infer multisensory interaction effects. Local differences in alpha power over bilateral inferior parietal lobules (IPLs) and superior parietal lobules (SPLs) preceded integrated versus segregated percepts of the two stimuli (audio and tactile). Furthermore, differences in long-range cortical functional connectivity seeded in rIPL (region of maximum difference) revealed differential patterns that predisposed integrated or segregated percepts encompassing secondary areas of all different modalities and prefrontal cortex. We showed that the prestimulus brain states predispose the perception of the audiotactile stimulus both in a global and a local manner. Our findings are in line with a recent consistent body of findings on the importance of prestimulus brain states for perception of an upcoming stimulus. This new perspective on how stimuli originating from different modalities are integrated suggests a non-modality specific network predisposing multisensory perception.
Affiliation(s)
- Christoph Braun: Center for Mind/Brain Sciences, University of Trento, Trento, Italy; MEG Center, University of Tübingen, Tübingen, Germany; Werner Reichardt Centre for Integrative Neuroscience (CIN), University of Tübingen, Tübingen, Germany
- Nathan Weisz: Center for Mind/Brain Sciences, University of Trento, Trento, Italy
- Chrysa Lithari: Center for Mind/Brain Sciences, University of Trento, Trento, Italy
75
Barger N, Sheley MF, Schumann CM. Stereological study of pyramidal neurons in the human superior temporal gyrus from childhood to adulthood. J Comp Neurol 2015;523:1054-72. PMID: 25556320; DOI: 10.1002/cne.23707.
Abstract
The association cortex of the superior temporal gyrus (STG) is implicated in complex social and linguistic functions. Thus, reliable methods for quantifying cellular variation in this region could greatly benefit researchers interested in addressing the cellular correlates of typical and atypical function associated with these critical cognitive abilities. To facilitate this task, we first present a general set of cytoarchitectonic criteria targeted specifically toward stereological analyses of thick, Nissl-stained sections for the homotypical cortex of the STG, referred to here as BA22/TA. Second, we use the optical fractionator to estimate pyramidal neuron number and the nucleator for pyramidal somal and nuclear volume. We also investigated the influence of age and sex on these parameters, as well as set a typically developing baseline for future comparisons. In 11 typically developing cases aged 4-48 years, the most distinguishing features of BA22/TA were the presence of distinct granular layers, a prominent, jagged layer IIIc, and a distinctly staining VIa. The average number of neurons was 91 ± 15 million, the volume of pyramidal soma 1,512 µm³, and the nuclear volume 348 µm³. We found no correlation between age and neuron number. In contrast, pyramidal somal and nuclear volume were both negatively correlated and linearly associated with age in regression analyses. We found no significant sex differences. Overall, the data support the idea that postnatal neuron numbers are relatively stable through development but also suggest that neuronal volume may be subject to important developmental variation. Both measures are critical variables in the study of developmental neuropathology.
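For context, the optical fractionator mentioned above estimates total neuron number from the objects counted in disector probes and the reciprocals of the sampling fractions; in the conventional notation of standard stereology (symbols are generic, not values from this study):

```latex
N = \sum Q^{-} \cdot \frac{1}{ssf} \cdot \frac{1}{asf} \cdot \frac{1}{tsf}
```

where ΣQ⁻ is the number of neurons counted in the disectors, and ssf, asf, and tsf are the section, area, and thickness sampling fractions, respectively.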
Affiliation(s)
- Nicole Barger: Department of Psychiatry and Behavioral Sciences, MIND Institute, University of California, Davis, Sacramento, California, 95817
76
Yang DYJ, Rosenblau G, Keifer C, Pelphrey KA. An integrative neural model of social perception, action observation, and theory of mind. Neurosci Biobehav Rev 2015;51:263-75. PMID: 25660957; DOI: 10.1016/j.neubiorev.2015.01.020.
Abstract
In the field of social neuroscience, major branches of research have been instrumental in describing independent components of typical and aberrant social information processing, but the field as a whole lacks a comprehensive model that integrates different branches. We review existing research related to the neural basis of three key neural systems underlying social information processing: social perception, action observation, and theory of mind. We propose an integrative model that unites these three processes and highlights the posterior superior temporal sulcus (pSTS), which plays a central role in all three systems. Furthermore, we integrate these neural systems with the dual system account of implicit and explicit social information processing. Large-scale meta-analyses based on Neurosynth confirmed that the pSTS is at the intersection of the three neural systems. Resting-state functional connectivity analysis with 1000 subjects confirmed that the pSTS is connected to all other regions in these systems. The findings presented in this review are specifically relevant for psychiatric research, especially for disorders characterized by social deficits such as autism spectrum disorder.
Affiliation(s)
- Daniel Y-J Yang: Center for Translational Developmental Neuroscience, Child Study Center, Yale University, New Haven, CT, USA
- Gabriela Rosenblau: Center for Translational Developmental Neuroscience, Child Study Center, Yale University, New Haven, CT, USA
- Cara Keifer: Center for Translational Developmental Neuroscience, Child Study Center, Yale University, New Haven, CT, USA
- Kevin A Pelphrey: Center for Translational Developmental Neuroscience, Child Study Center, Yale University, New Haven, CT, USA
77
|
Bonino D, Ricciardi E, Bernardi G, Sani L, Gentili C, Vecchi T, Pietrini P. Spatial imagery relies on a sensory independent, though sensory sensitive, functional organization within the parietal cortex: a fMRI study of angle discrimination in sighted and congenitally blind individuals. Neuropsychologia 2015; 68:59-70. [PMID: 25575449 DOI: 10.1016/j.neuropsychologia.2015.01.004] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/23/2014] [Revised: 12/31/2014] [Accepted: 01/05/2015] [Indexed: 10/24/2022]
Abstract
Although vision contributes distinctive information to spatial representation, individuals who have lacked vision since birth often show perceptual and representational skills comparable to those found in sighted individuals. However, congenitally blind individuals may show impaired spatial analysis when engaging with 'visual' spatial features (e.g., perspective or angle representation) or complex spatial mental operations. In the present study, we measured behavioral and brain responses using functional magnetic resonance imaging in sighted and congenitally blind individuals during spatial imagery based on a modified version of the mental clock task (i.e., angle discrimination) and a simple recognition control condition, as conveyed across distinct sensory modalities: visual (sighted individuals only), tactile, and auditory. Blind individuals were significantly less accurate during the auditory task, but comparable to sighted individuals during the tactile task. As expected, both groups showed common neural activations in intraparietal and superior parietal regions across visual and non-visual spatial perception and imagery conditions, indicating a more abstract, sensory-independent functional organization of these cortical areas, a property that we named supramodality. At the same time, however, comparisons of brain responses and functional connectivity patterns across experimental conditions also demonstrated a functional lateralization that correlated with the distinct behavioral performance of blind and sighted individuals. Specifically, blind individuals relied more on right parietal regions, mainly during tactile and less during auditory spatial processing, whereas in sighted individuals spatial representation across modalities relied more on left parietal regions. In conclusion, intraparietal and superior parietal regions subserve supramodal spatial representations in sighted and congenitally blind individuals. Differences in their recruitment across non-visual spatial processing in sighted and blind individuals may be related to distinctive behavioral performance and/or to the mental strategies adopted when dealing with the same spatial representation conveyed through different sensory modalities.
Collapse
Affiliation(s)
- Daniela Bonino
- Laboratory of Clinical Biochemistry and Molecular Biology, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy
| | - Emiliano Ricciardi
- Laboratory of Clinical Biochemistry and Molecular Biology, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy; MRI Lab, Fondazione "G. Monasterio" Regione Toscana/C.N.R., Pisa, Italy.
| | - Giulio Bernardi
- Laboratory of Clinical Biochemistry and Molecular Biology, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy
| | - Lorenzo Sani
- Laboratory of Clinical Biochemistry and Molecular Biology, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy
| | - Claudio Gentili
- Clinical Psychology Branch, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy
| | - Tomaso Vecchi
- Department of Brain and Behavioral Sciences, University of Pavia, Pavia, Italy; Brain Connectivity Center, National Neurological Institute C. Mondino, Pavia, Italy
| | - Pietro Pietrini
- Laboratory of Clinical Biochemistry and Molecular Biology, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy; Clinical Psychology Branch, Department of Surgery, Medical, Molecular Pathology, and Critical Care, University of Pisa, Pisa, Italy
| |
Collapse
|
78
|
Brauns I, Teixeira S, Velasques B, Bittencourt J, Machado S, Cagy M, Gongora M, Bastos VH, Machado D, Sandoval-Carrillo A, Salas-Pacheco J, Piedade R, Ribeiro P, Arias-Carrión O. Changes in the theta band coherence during motor task after hand immobilization. Int Arch Med 2014; 7:51. [PMID: 25838843 PMCID: PMC4363202 DOI: 10.1186/1755-7682-7-51] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2014] [Accepted: 12/02/2014] [Indexed: 11/10/2022] Open
Abstract
Many different factors, such as hand immobilization, can temporarily or permanently impair movement and alter cortical organization. Such changes have been widely studied using electroencephalography. Within this context, we investigated the effects of immobilization through theta-band coherence analysis, in order to determine whether the immobilization period causes any changes in inter- and intra-hemispheric coherence within the cerebral cortex, and whether the theta band provides any information about the neural mechanisms involved in the motor act. We analyzed the cortical changes that occurred after 48 hours of hand immobilization. Theta-band coherence was studied using electroencephalography in 30 healthy subjects, divided into two groups (control and experimental). In both groups, subjects executed a task involving flexion and extension of the index finger, before and after 48 hours; only the experimental group underwent hand immobilization. We observed an increase in coupling within the experimental group in the frontal, parietal and temporal regions, and a decrease in the motor area. To execute manual tasks after a period of movement restriction, greater coherence is present in areas related to attention, movement preparation and sensorimotor integration processes. These results may contribute to a detailed assessment of the neurophysiological mechanisms involved in motor act execution.
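Theta-band coherence of the kind analyzed here can be computed as the magnitude-squared coherence between two electrode signals, averaged over roughly 4-8 Hz. The sketch below uses scipy.signal.coherence on synthetic signals; the sampling rate, epoch length, and channel names are assumptions for illustration, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import coherence

def theta_coherence(x, y, fs=250.0, band=(4.0, 8.0)):
    """Mean magnitude-squared coherence between two EEG channels in the theta band."""
    f, cxy = coherence(x, y, fs=fs, nperseg=int(2 * fs))
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[mask].mean()

rng = np.random.default_rng(1)
n = 10 * 250                                   # hypothetical 10 s of EEG at 250 Hz
common = rng.standard_normal(n)                # shared signal driving both channels
ch_frontal = common + 0.5 * rng.standard_normal(n)
ch_parietal = common + 0.5 * rng.standard_normal(n)
print(theta_coherence(ch_frontal, ch_parietal))
```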
Collapse
Affiliation(s)
- Igor Brauns
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil
| | - Silmar Teixeira
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil ; Unidad de Trastornos del Movimiento y Sueño (TMS), Hospital General Dr. Manuel Gea González/IFC-UNAM, Mexico City, Mexico ; Unidad de Trastornos del Movimiento y Sueño (TMS), Hospital General Ajusco Medio, Secretaria de Salud Mexico City, Mexico
| | - Bruna Velasques
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil ; Institute of Applied Neuroscience (INA), Rio de Janeiro, Brazil ; National Institute of Traumatology and Orthopaedics (INTO), Neuromuscular Research Laboratory, Rio de Janeiro, Brazil
| | - Juliana Bittencourt
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil
| | - Sergio Machado
- Institute of Psychiatry of Federal University of Rio de Janeiro, Panic and Respiration, Rio de Janeiro, Brazil ; National Institute for Translational Medicine (INCT-TM), Rio de Janeiro, Brazil ; Physical Activity Neuroscience, Physical Activity Sciences Postgraduate Program, Salgado de Oliveira University, Niterói, Brazil
| | - Mauricio Cagy
- Biomedical Engineering Program, COPPE, Federal University of Rio de Janeiro, Rio de Janeiro, Brazil
| | - Mariana Gongora
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil
| | - Victor Hugo Bastos
- Brain Mapping and Functionality Laboratory, Federal University of Piauí, UFPI, Parnaiba, Brazil ; Brain Mapping and Plasticity Laboratory, Federal University of Piauí, UFPI, Parnaiba, Brazil
| | - Dionis Machado
- Brain Mapping and Functionality Laboratory, Federal University of Piauí, UFPI, Parnaiba, Brazil ; Brain Mapping and Plasticity Laboratory, Federal University of Piauí, UFPI, Parnaiba, Brazil
| | - Ada Sandoval-Carrillo
- Instituto de Investigación Científica, Universidad Juárez del Estado de Durango, Durango, Durango, México
| | - Jose Salas-Pacheco
- Instituto de Investigación Científica, Universidad Juárez del Estado de Durango, Durango, Durango, México
| | - Roberto Piedade
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil
| | - Pedro Ribeiro
- Brain Mapping and Sensory Motor Integration, Institute of Psychiatry of Federal University of Rio de Janeiro (IPUB/UFRJ), Rio de Janeiro, Brazil ; School of Physical Education, Bioscience Department (EEFD/UFRJ), Rio de Janeiro, Brazil ; Institute of Applied Neuroscience (INA), Rio de Janeiro, Brazil
| | - Oscar Arias-Carrión
- Unidad de Trastornos del Movimiento y Sueño (TMS), Hospital General Dr. Manuel Gea González/IFC-UNAM, Mexico City, Mexico ; Unidad de Trastornos del Movimiento y Sueño (TMS), Hospital General Ajusco Medio, Secretaria de Salud Mexico City, Mexico
| |
Collapse
|
79
|
Geranmayeh F, Leech R, Wise RJS. Semantic retrieval during overt picture description: Left anterior temporal or the parietal lobe? Neuropsychologia 2014; 76:125-35. [PMID: 25497693 PMCID: PMC4582804 DOI: 10.1016/j.neuropsychologia.2014.12.012] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2014] [Revised: 12/07/2014] [Accepted: 12/08/2014] [Indexed: 11/15/2022]
Abstract
Retrieval of semantic representations is a central process during overt speech production. There is an increasing consensus that an amodal semantic 'hub' must exist that draws together modality-specific representations of concepts. Based on the distribution of atrophy and the behavioral deficit of patients with the semantic variant of fronto-temporal lobar degeneration, it has been proposed that this hub is localized within both anterior temporal lobes (ATL), and is functionally connected with verbal 'output' systems via the left ATL. An alternative view, dating from Geschwind's proposal in 1965, is that the angular gyrus (AG) is central to object-based semantic representations. In this fMRI study we examined the connectivity of the left ATL and parietal lobe (PL) with whole-brain networks known to be activated during overt picture description. We decomposed each of these two brain volumes into 15 regions of interest (ROIs), using independent component analysis. A dual regression analysis was used to establish the connectivity of each ROI with whole-brain networks. An ROI within the left anterior superior temporal sulcus (antSTS) was functionally connected to other parts of the left ATL, including anterior ventromedial left temporal cortex (partially attenuated by signal loss due to susceptibility artifact), a large left dorsolateral prefrontal region (including 'classic' Broca's area), extensive bilateral sensory-motor cortices, and the length of both superior temporal gyri. The time-course of this functionally connected network was associated with picture description but not with non-semantic baseline tasks. This system has the distribution expected for the production of overt speech with appropriate semantic content, and the auditory monitoring of the overt speech output. In contrast, the only left PL ROI that showed connectivity with brain systems most strongly activated by the picture-description task was in the superior parietal lobe (supPL). This region showed connectivity with predominantly posterior cortical regions required for the visual processing of the pictorial stimuli, with additional connectivity to the dorsal left AG and a small component of the left inferior frontal gyrus. None of the other PL ROIs that included part of the left AG were activated by speech alone. The best interpretation of these results is that the left antSTS connects the proposed semantic hub (specifically localized to ventral anterior temporal cortex based on clinical neuropsychological studies) to posterior frontal regions and sensory-motor cortices responsible for the overt production of speech.
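Dual regression, as named in this abstract, proceeds in two least-squares stages: group-level spatial maps are regressed against each subject's data to obtain subject-specific component time courses, and those time courses are then regressed against the same data to obtain subject-specific spatial maps. The following is a minimal numpy sketch under assumed array shapes; it is an illustration of the two stages, not the authors' actual pipeline.

```python
import numpy as np

def dual_regression(subject_data, group_maps):
    """Two-stage dual regression against group ICA maps.

    subject_data : (n_voxels, n_timepoints) subject 4D data reshaped to 2D
    group_maps   : (n_voxels, n_components) group-level spatial ICA maps
    returns      : subject time courses (n_timepoints, n_components)
                   and subject spatial maps (n_components, n_voxels)
    """
    # Stage 1: spatial regression -> subject-specific component time courses
    tcs, *_ = np.linalg.lstsq(group_maps, subject_data, rcond=None)   # (k, t)
    tcs = tcs.T                                                       # (t, k)
    # Stage 2: temporal regression -> subject-specific spatial maps
    maps, *_ = np.linalg.lstsq(tcs, subject_data.T, rcond=None)       # (k, v)
    return tcs, maps

rng = np.random.default_rng(2)
data = rng.standard_normal((1000, 180))      # hypothetical 1000 voxels x 180 TRs
ica_maps = rng.standard_normal((1000, 15))   # 15 components, as in the study
subject_tcs, subject_maps = dual_regression(data, ica_maps)
print(subject_tcs.shape, subject_maps.shape)
```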
Collapse
Affiliation(s)
- Fatemeh Geranmayeh
- Computational Cognitive and Clinical Neuroimaging Laboratory, Imperial College, Hammersmith Hospital, London W12 0NN, UK.
| | - Robert Leech
- Computational Cognitive and Clinical Neuroimaging Laboratory, Imperial College, Hammersmith Hospital, London W12 0NN, UK
| | - Richard J S Wise
- Computational Cognitive and Clinical Neuroimaging Laboratory, Imperial College, Hammersmith Hospital, London W12 0NN, UK
| |
Collapse
|
80
|
Lee SM, McCarthy G. Functional Heterogeneity and Convergence in the Right Temporoparietal Junction. Cereb Cortex 2014; 26:1108-1116. [PMID: 25477367 DOI: 10.1093/cercor/bhu292] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
The right temporoparietal junction (rTPJ) is engaged by tasks that manipulate biological motion processing, Theory of Mind attributions, and attention reorienting. The proximity of activations elicited by these tasks raises the question of whether these tasks share common cognitive component processes that are subserved by common neural substrates. Here, we used high-resolution whole-brain functional magnetic resonance imaging in a within-subjects design to determine whether these tasks activate common regions of the rTPJ. Each participant was presented with the 3 tasks in the same imaging session. In a whole-brain analysis, we found that only the right and left TPJs were activated by all 3 tasks. Multivoxel pattern analysis revealed that the regions of overlap could still discriminate the 3 tasks. Notably, we found significant cross-task classification in the right TPJ, which suggests a shared neural process between the 3 tasks. Taken together, these results support prior studies that have indicated functional heterogeneity within the rTPJ but also suggest a convergence of function within a region of overlap. These results also call for further investigation into the nature of the function subserved in this overlap region.
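Multivoxel pattern analysis of the sort reported here typically trains a linear classifier on trial-wise ROI patterns and tests it with leave-one-run-out cross-validation. The sketch below shows a generic three-way decoding setup in scikit-learn with simulated data; it illustrates the general approach rather than the authors' exact within- and cross-task classification scheme, and all design parameters are hypothetical.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

rng = np.random.default_rng(3)
n_runs, n_trials, n_voxels = 6, 30, 120                    # hypothetical ROI and design
X = rng.standard_normal((n_runs * n_trials, n_voxels))     # trial patterns in an rTPJ ROI
y = np.tile(np.repeat([0, 1, 2], n_trials // 3), n_runs)   # 3 tasks per run
runs = np.repeat(np.arange(n_runs), n_trials)

# Leave-one-run-out decoding of the three tasks from the overlap ROI
clf = LinearSVC(max_iter=10000)
acc = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print("mean decoding accuracy:", acc.mean())               # chance level = 1/3
```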
Collapse
Affiliation(s)
- Su Mei Lee
- Department of Psychology, Yale University, New Haven, CT, USA
| | | |
Collapse
|
81
|
Filippetti ML, Lloyd-Fox S, Longo MR, Farroni T, Johnson MH. Neural Mechanisms of Body Awareness in Infants. Cereb Cortex 2014; 25:3779-87. [PMID: 25404469 PMCID: PMC4585515 DOI: 10.1093/cercor/bhu261] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/18/2023] Open
Abstract
The ability to differentiate one's body from others is a fundamental aspect of social perception and has been shown to involve the integration of sense modalities attributable to the self. Though behavioral studies have investigated infants' discrimination of body-related multisensory stimuli, whether infants perceive this information as belonging to the self is still unknown. In human adults, neuroimaging studies have demonstrated the recruitment of a specific set of brain regions in response to body-related multisensory integration. To test whether the infant brain integrates this information similarly to adults, in a first functional near-infrared spectroscopy study we investigated the role of visual-proprioceptive feedback when temporal cues are manipulated, by showing 5-month-old infants a live video of their own face while they performed movements. To explore the role of body-related contingency further, in a second study we investigated whether cortical activation in response to self-initiated movements and external tactile stimulation was similar to that found in the first study. Our results indicate that infants' specialized cortical activation in response to body-related contingencies is similar to the brain activation seen in response to body awareness in adults.
Collapse
Affiliation(s)
| | | | - M R Longo
- Department of Psychological Sciences, Birkbeck, University of London, UK
| | - T Farroni
- Dipartimento di Psicologia dello Sviluppo e della Socializzazione, University of Padua, Padua, Italy
| | | |
Collapse
|
82
|
Ito T, Gracco VL, Ostry DJ. Temporal factors affecting somatosensory-auditory interactions in speech processing. Front Psychol 2014; 5:1198. [PMID: 25452733 PMCID: PMC4233986 DOI: 10.3389/fpsyg.2014.01198] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2014] [Accepted: 10/04/2014] [Indexed: 12/03/2022] Open
Abstract
Speech perception is known to rely on both auditory and visual information. However, sound-specific somatosensory input has also been shown to influence speech perceptual processing (Ito et al., 2009). In the present study, we further addressed the relationship between somatosensory information and speech perceptual processing by testing the hypothesis that the temporal relationship between orofacial movement and sound processing contributes to somatosensory-auditory interaction in speech perception. We examined changes in event-related potentials (ERPs) in response to multisensory synchronous (simultaneous) and asynchronous (90 ms lag and lead) somatosensory and auditory stimulation, compared with unisensory auditory and somatosensory stimulation alone. We used a robotic device to apply facial skin somatosensory deformations that were similar in timing and duration to those experienced in speech production. Following synchronous multisensory stimulation, the amplitude of the ERP was reliably different from the two unisensory potentials. More importantly, the magnitude of the ERP difference varied as a function of the relative timing of the somatosensory-auditory stimulation. Event-related activity change due to stimulus timing was seen between 160 and 220 ms following somatosensory onset, mostly around the parietal area. The results demonstrate a dynamic modulation of somatosensory-auditory convergence and suggest that the contribution of somatosensory information to speech processing depends on the specific temporal order of sensory inputs in speech production.
Collapse
Affiliation(s)
| | - Vincent L Gracco
- Haskins Laboratories, New Haven, CT, USA; McGill University, Montréal, QC, Canada
| | - David J Ostry
- Haskins Laboratories, New Haven, CT, USA; McGill University, Montréal, QC, Canada
| |
Collapse
|
83
|
Horiguchi H, Wandell BA, Winawer J. A Predominantly Visual Subdivision of The Right Temporo-Parietal Junction (vTPJ). Cereb Cortex 2014; 26:639-646. [PMID: 25267856 DOI: 10.1093/cercor/bhu226] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
A multiplicity of sensory and cognitive functions has been attributed to the large cortical region at the temporo-parietal junction (TPJ). Using functional MRI, we report that a small region lateralized within the right TPJ responds robustly to certain simple visual stimuli ("vTPJ"). The vTPJ was found in all right hemispheres (n = 7), posterior to the auditory cortex. To manipulate stimuli and attention, subjects were presented with a mixture of visual and auditory stimuli in a concurrent block design in 2 experiments: (1) a simple visual stimulus (a grating pattern modulating in mean luminance) elicited robust responses in the vTPJ, whether or not the subject attended to vision, and (2) a drifting low-contrast dartboard pattern of constant mean luminance evoked robust responses in the vTPJ when it was task-relevant (visual task), and smaller responses when it was not (auditory task). The results suggest a focal, visually responsive region within the right TPJ that is powerfully driven by certain visual stimuli (luminance fluctuations), and that can be driven by other visual stimuli when the subject is attending. The precise localization of this visually responsive region is helpful for segmenting the TPJ and for better understanding its role in visual awareness and related disorders such as extinction and neglect.
Collapse
Affiliation(s)
- Hiroshi Horiguchi
- Department of Psychology; Department of Ophthalmology, Jikei University School of Medicine, Minato, Tokyo, Japan
| | - Brian A Wandell
- Department of Psychology; Center for Cognitive and Neurobiological Imaging, Stanford University, Stanford, CA, USA
| | - Jonathan Winawer
- Department of Psychology and Center for Neural Science, New York University, New York, NY, USA
| |
Collapse
|
84
|
Liebenthal E, Desai RH, Humphries C, Sabri M, Desai A. The functional organization of the left STS: a large scale meta-analysis of PET and fMRI studies of healthy adults. Front Neurosci 2014; 8:289. [PMID: 25309312 PMCID: PMC4160993 DOI: 10.3389/fnins.2014.00289] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/03/2014] [Accepted: 08/26/2014] [Indexed: 11/13/2022] Open
Abstract
The superior temporal sulcus (STS) in the left hemisphere is functionally diverse, with sub-areas implicated in both linguistic and non-linguistic functions. However, the number and boundaries of distinct functional regions remain to be determined. Here, we present new evidence, from meta-analysis of a large number of positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies, of different functional specificity in the left STS supporting a division of its middle to terminal extent into at least three functional areas. The middle portion of the left STS stem (fmSTS) is highly specialized for speech perception and the processing of language material. The posterior portion of the left STS stem (fpSTS) is highly versatile and involved in multiple functions supporting semantic memory and associative thinking. The fpSTS responds to both language and non-language stimuli but the sensitivity to non-language material is greater. The horizontal portion of the left STS stem and terminal ascending branches (ftSTS) display intermediate functional specificity, with the anterior-dorsal ascending branch (fatSTS) supporting executive functions and motor planning and showing greater sensitivity to language material, and the horizontal stem and posterior-ventral ascending branch (fptSTS) supporting primarily semantic processing and displaying greater sensitivity to non-language material. We suggest that the high functional specificity of the left fmSTS for speech is an important means by which the human brain achieves exquisite affinity and efficiency for native speech perception. In contrast, the extreme multi-functionality of the left fpSTS reflects the role of this area as a cortical hub for semantic processing and the extraction of meaning from multiple sources of information. Finally, in the left ftSTS, further functional differentiation between the dorsal and ventral aspect is warranted.
Collapse
Affiliation(s)
- Einat Liebenthal
- Department of Neurology, Medical College of Wisconsin Milwaukee, WI, USA ; Department of Psychiatry, Brigham and Women's Hospital Boston, MA, USA
| | - Rutvik H Desai
- Department of Psychology, University of South Carolina Columbia, SC, USA
| | - Colin Humphries
- Department of Neurology, Medical College of Wisconsin Milwaukee, WI, USA
| | - Merav Sabri
- Department of Neurology, Medical College of Wisconsin Milwaukee, WI, USA
| | - Anjali Desai
- Department of Neurology, Medical College of Wisconsin Milwaukee, WI, USA
| |
Collapse
|
85
|
Abstract
Neuropsychological studies have described patients with a selective impairment of finger identification in association with posterior parietal lesions. However, evidence of the role of these areas in finger gnosis from studies of the healthy human brain is still scarce. Here we used functional magnetic resonance imaging to identify the brain network engaged in a novel finger gnosis task, the intermanual in-between task (IIBT), in healthy participants. Several brain regions exhibited a stronger blood oxygenation level-dependent (BOLD) response in IIBT than in a control task that did not explicitly rely on finger gnosis but used identical stimuli and motor responses as the IIBT. The IIBT involved stronger signal in the left inferior parietal lobule (IPL), bilateral precuneus (PCN), bilateral premotor cortex, and left inferior frontal gyrus. In all regions, stimulation of nonhomologous fingers of the two hands elicited higher BOLD signal than stimulation of homologous fingers. Only in the left anteromedial IPL (a-mIPL) and left PCN did signal strength decrease parametrically from nonhomology, through partial homology, to total homology with stimulation delivered synchronously to the two hands. With asynchronous stimulation, the signal was stronger in the left a-mIPL than in any other region, possibly indicating retention of task-relevant information. We suggest that the left PCN may contribute a supporting visuospatial representation via its functional connection to the right PCN. The a-mIPL may instead provide the core substrate of an explicit bilateral body structure representation for the fingers that when disrupted can produce the typical symptoms of finger agnosia.
Collapse
|
86
|
Vogt K, Schnaitmann C, Dylla KV, Knapek S, Aso Y, Rubin GM, Tanimoto H. Shared mushroom body circuits underlie visual and olfactory memories in Drosophila. eLife 2014; 3:e02395. [PMID: 25139953 PMCID: PMC4135349 DOI: 10.7554/elife.02395] [Citation(s) in RCA: 129] [Impact Index Per Article: 11.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/04/2023] Open
Abstract
In nature, animals form memories associating reward or punishment with stimuli from different sensory modalities, such as smells and colors. It is unclear, however, how distinct sensory memories are processed in the brain. We established appetitive and aversive visual learning assays for Drosophila that are comparable to the widely used olfactory learning assays. These assays share critical features, such as reinforcing stimuli (sugar reward and electric shock punishment), and allow direct comparison of the cellular requirements for visual and olfactory memories. We found that the same subsets of dopamine neurons drive formation of both sensory memories. Furthermore, distinct yet partially overlapping subsets of mushroom body intrinsic neurons are required for visual and olfactory memories. Thus, our results suggest that distinct sensory memories are processed in a common brain center. Such centralization of related brain functions is an economical design that avoids the repetition of similar circuit motifs. DOI:http://dx.doi.org/10.7554/eLife.02395.001 Animals tend to associate good and bad things with certain visual scenes, smells and other kinds of sensory information. If we get food poisoning after eating a new food, for example, we tend to associate the taste and smell of the new food with feelings of illness. This is an example of a negative ‘associative memory’, and it can persist for months, even when we know that our sickness was not caused by the new food itself but by some foreign body that should not have been in the food. The same is true for positive associative memories. It is known that many associative memories contain information from more than one of the senses. Our memory of a favorite food, for instance, includes its scent, color and texture, as well as its taste. However, little is known about the ways in which information from the different senses is processed in the brain. Does each sense have its own dedicated memory circuit, or do multiple senses converge to the same memory circuit? A number of studies have used olfactory (smell) and visual stimuli to study the basic neuroscience that underpins associative memories in fruit flies. The olfactory experiments traditionally use sugar and electric shocks to induce positive and negative associations with various scents. However, the visual experiments use other methods to induce associations with colors. This means that it is difficult to combine and compare the results of olfactory and visual experiments. Now, Vogt, Schnaitmann et al. have developed a transparent grid that can be used to administer electric shocks in visual experiments. This allows direct comparisons to be made between the neuronal processing of visual associative memories and the neural processing of olfactory associative memories. Vogt, Schnaitmann et al. showed that both visual and olfactory stimuli are modulated in the same subset of dopamine neurons for positive associative memories. Similarly, another subset of dopamine neurons was found to drive negative memories of both the visual and olfactory stimuli. The work of Vogt, Schnaitmann et al. shows that associative memories are processed by a centralized circuit that receives both visual and olfactory inputs, thus reducing the number of memory circuits needed for such memories. DOI:http://dx.doi.org/10.7554/eLife.02395.002
Collapse
Affiliation(s)
- Katrin Vogt
- Max-Planck-Institute of Neurobiology, Martinsried, Germany
| | | | | | - Stephan Knapek
- Max-Planck-Institute of Neurobiology, Martinsried, Germany
| | - Yoshinori Aso
- Janelia Farm Research Campus, Howard Hughes Medical Institute, Ashburn, United States
| | - Gerald M Rubin
- Janelia Farm Research Campus, Howard Hughes Medical Institute, Ashburn, United States
| | - Hiromu Tanimoto
- Max-Planck-Institute of Neurobiology, Martinsried, Germany; Graduate School of Life Sciences, Tohoku University, Sendai, Japan
| |
Collapse
|
87
|
Erickson LC, Heeg E, Rauschecker JP, Turkeltaub PE. An ALE meta-analysis on the audiovisual integration of speech signals. Hum Brain Mapp 2014; 35:5587-605. [PMID: 24996043 DOI: 10.1002/hbm.22572] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2013] [Revised: 05/28/2014] [Accepted: 06/24/2014] [Indexed: 11/09/2022] Open
Abstract
The brain improves speech processing through the integration of audiovisual (AV) signals. Situations involving AV speech integration may be crudely dichotomized into those where auditory and visual inputs contain (1) equivalent, complementary signals (validating AV speech) or (2) inconsistent, different signals (conflicting AV speech). This simple framework may allow the systematic examination of broad commonalities and differences between AV neural processes engaged by various experimental paradigms frequently used to study AV speech integration. We conducted an activation likelihood estimation meta-analysis of 22 functional imaging studies comprising 33 experiments, 311 subjects, and 347 foci examining "conflicting" versus "validating" AV speech. Experimental paradigms included content congruency, timing synchrony, and perceptual measures, such as the McGurk effect or synchrony judgments, across AV speech stimulus types (sublexical to sentence). Colocalization of conflicting AV speech experiments revealed consistency across at least two contrast types (e.g., synchrony and congruency) in a network of dorsal stream regions in the frontal, parietal, and temporal lobes. There was consistency across all contrast types (synchrony, congruency, and percept) in the bilateral posterior superior/middle temporal cortex. Although fewer studies were available, validating AV speech experiments were localized to other regions, such as ventral stream visual areas in the occipital and inferior temporal cortex. These results suggest that while equivalent, complementary AV speech signals may evoke activity in regions related to the corroboration of sensory input, conflicting AV speech signals recruit widespread dorsal stream areas likely involved in the resolution of conflicting sensory signals.
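Activation likelihood estimation combines per-experiment "modeled activation" maps (reported foci smoothed with Gaussians whose width reflects sample size) into a single map by treating them as independent probabilities and taking their union. A minimal sketch of that combination step follows, with hypothetical map values; the permutation-based significance testing that ALE also requires is omitted.

```python
import numpy as np

def ale_map(ma_maps):
    """Combine per-experiment modeled activation (MA) maps into an ALE map.

    ma_maps : (n_experiments, n_voxels) array; each row holds the voxelwise
              probability that an experiment's foci activated that voxel.
    """
    ma = np.asarray(ma_maps)
    return 1.0 - np.prod(1.0 - ma, axis=0)   # union of independent probabilities

rng = np.random.default_rng(4)
ma = rng.uniform(0, 0.2, size=(33, 5000))    # 33 experiments, hypothetical 5000 voxels
print(ale_map(ma).max())
```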
Collapse
Affiliation(s)
- Laura C Erickson
- Department of Neurology, Georgetown University Medical Center, Washington, District of Columbia; Department of Neuroscience, Georgetown University Medical Center, Washington, District of Columbia
| | | | | | | |
Collapse
|
88
|
Erickson LC, Zielinski BA, Zielinski JEV, Liu G, Turkeltaub PE, Leaver AM, Rauschecker JP. Distinct cortical locations for integration of audiovisual speech and the McGurk effect. Front Psychol 2014; 5:534. [PMID: 24917840 PMCID: PMC4040936 DOI: 10.3389/fpsyg.2014.00534] [Citation(s) in RCA: 42] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2014] [Accepted: 05/14/2014] [Indexed: 11/13/2022] Open
Abstract
Audiovisual (AV) speech integration is often studied using the McGurk effect, where the combination of specific incongruent auditory and visual speech cues produces the perception of a third illusory speech percept. Recently, several studies have implicated the posterior superior temporal sulcus (pSTS) in the McGurk effect; however, the exact roles of the pSTS and other brain areas in "correcting" differing AV sensory inputs remain unclear. Using functional magnetic resonance imaging (fMRI) in ten participants, we aimed to isolate brain areas specifically involved in processing congruent AV speech and the McGurk effect. Speech stimuli were composed of sounds and/or videos of consonant-vowel tokens resulting in four stimulus classes: congruent AV speech (AVCong), incongruent AV speech resulting in the McGurk effect (AVMcGurk), acoustic-only speech (AO), and visual-only speech (VO). In group- and single-subject analyses, left pSTS exhibited significantly greater fMRI signal for congruent AV speech (i.e., AVCong trials) than for both AO and VO trials. Right superior temporal gyrus, medial prefrontal cortex, and cerebellum were also identified. For McGurk speech (i.e., AVMcGurk trials), two clusters in the left posterior superior temporal gyrus (pSTG), just posterior to Heschl's gyrus or on its border, exhibited greater fMRI signal than both AO and VO trials. We propose that while some brain areas, such as left pSTS, may be more critical for the integration of AV speech, other areas, such as left pSTG, may generate the "corrected" or merged percept arising from conflicting auditory and visual cues (i.e., as in the McGurk effect). These findings are consistent with the concept that posterior superior temporal areas represent part of a "dorsal auditory stream," which is involved in multisensory integration, sensorimotor control, and optimal state estimation (Rauschecker and Scott, 2009).
Collapse
Affiliation(s)
- Laura C Erickson
- Department of Neuroscience, Georgetown University Medical Center, Washington DC, USA ; Department of Neurology, Georgetown University Medical Center, Washington DC, USA
| | - Brandon A Zielinski
- Department of Physiology and Biophysics, Georgetown University Medical Center, Washington DC, USA ; Departments of Pediatrics and Neurology, Division of Child Neurology, University of Utah, Salt Lake City UT, USA
| | - Jennifer E V Zielinski
- Department of Physiology and Biophysics, Georgetown University Medical Center, Washington DC, USA
| | - Guoying Liu
- Department of Physiology and Biophysics, Georgetown University Medical Center, Washington DC, USA ; National Institutes of Health, Bethesda MD, USA
| | - Peter E Turkeltaub
- Department of Neurology, Georgetown University Medical Center, Washington DC, USA ; MedStar National Rehabilitation Hospital, Washington DC, USA
| | - Amber M Leaver
- Department of Neuroscience, Georgetown University Medical Center, Washington DC, USA ; Department of Neurology, University of California Los Angeles, Los Angeles CA, USA
| | - Josef P Rauschecker
- Department of Neuroscience, Georgetown University Medical Center, Washington DC, USA ; Department of Physiology and Biophysics, Georgetown University Medical Center, Washington DC, USA
| |
Collapse
|
89
|
Kassuba T, Klinge C, Hölig C, Röder B, Siebner HR. Short-term plasticity of visuo-haptic object recognition. Front Psychol 2014; 5:274. [PMID: 24765082 PMCID: PMC3980106 DOI: 10.3389/fpsyg.2014.00274] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/16/2014] [Accepted: 03/14/2014] [Indexed: 11/13/2022] Open
Abstract
Functional magnetic resonance imaging (fMRI) studies have provided ample evidence for the involvement of the lateral occipital cortex (LO), fusiform gyrus (FG), and intraparietal sulcus (IPS) in visuo-haptic object integration. Here we applied 30 min of sham (non-effective) or real offline 1 Hz repetitive transcranial magnetic stimulation (rTMS) to perturb neural processing in left LO immediately before subjects performed a visuo-haptic delayed-match-to-sample task during fMRI. In this task, subjects had to match sample (S1) and target (S2) objects presented sequentially within or across vision and/or haptics in both directions (visual-haptic or haptic-visual) and decide whether or not S1 and S2 were the same objects. Real rTMS transiently decreased activity at the site of stimulation and remote regions such as the right LO and bilateral FG during haptic S1 processing. Without affecting behavior, the same stimulation gave rise to relative increases in activation during S2 processing in the right LO, left FG, bilateral IPS, and other regions previously associated with object recognition. Critically, the modality of S2 determined which regions were recruited after rTMS. Relative to sham rTMS, real rTMS induced increased activations during crossmodal congruent matching in the left FG for haptic S2 and the temporal pole for visual S2. In addition, we found stronger activations for incongruent than congruent matching in the right anterior parahippocampus and middle frontal gyrus for crossmodal matching of haptic S2 and in the left FG and bilateral IPS for unimodal matching of visual S2, only after real but not sham rTMS. The results imply that a focal perturbation of the left LO triggers modality-specific interactions between the stimulated left LO and other key regions of object processing possibly to maintain unimpaired object recognition. This suggests that visual and haptic processing engage partially distinct brain networks during visuo-haptic object matching.
Collapse
Affiliation(s)
- Tanja Kassuba
- Danish Research Centre for Magnetic Resonance, Copenhagen University Hospital Hvidovre, Hvidovre, Denmark; NeuroImageNord/Department of Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Department of Neurology, Christian-Albrechts-University, Kiel, Germany
| | - Corinna Klinge
- NeuroImageNord/Department of Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Department of Psychiatry, Warneford Hospital, Oxford, UK
| | - Cordula Hölig
- NeuroImageNord/Department of Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
| | - Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
| | - Hartwig R Siebner
- Danish Research Centre for Magnetic Resonance, Copenhagen University Hospital Hvidovre, Hvidovre, Denmark; NeuroImageNord/Department of Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Department of Neurology, Christian-Albrechts-University, Kiel, Germany
| |
Collapse
|
90
|
Petrini K, Remark A, Smith L, Nardini M. When vision is not an option: children's integration of auditory and haptic information is suboptimal. Dev Sci 2014; 17:376-87. [PMID: 24612244 PMCID: PMC4240463 DOI: 10.1111/desc.12127] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/05/2013] [Accepted: 08/19/2013] [Indexed: 11/29/2022]
Abstract
When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g. extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory information from childhood. In the first experiment, adults and children aged 5 to 11 years judged the relative sizes of two objects in auditory, haptic, and non-conflicting bimodal conditions. In Experiment 2, different groups of adults and children were tested in non-conflicting and conflicting bimodal conditions. In Experiment 1, adults reduced sensory uncertainty by integrating the cues optimally, while children did not. In Experiment 2, adults and children used similar weighting strategies to solve audio–haptic conflict. These results suggest that, in the absence of visual information, optimal integration of cues for discrimination of object size develops late in childhood.
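The "weighted average of sensory cues" benchmark tested in this study is the standard maximum-likelihood integration model: each cue is weighted by its relative reliability, and the combined estimate has lower variance than either cue alone. The equations below restate that standard model (they are not taken from the paper); A and H denote the auditory and haptic cues, \hat{S} a size estimate, and \sigma^2 its variance as derived from unimodal discrimination thresholds.

```latex
\hat{S}_{AH} = w_A \hat{S}_A + w_H \hat{S}_H, \qquad
w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_H^2}, \qquad
w_H = 1 - w_A,
\qquad
\sigma_{AH}^2 = \frac{\sigma_A^2\,\sigma_H^2}{\sigma_A^2 + \sigma_H^2} \le \min(\sigma_A^2, \sigma_H^2)
```

Optimal integration is diagnosed by comparing the empirically measured bimodal variance with the predicted \(\sigma_{AH}^2\); the developmental result here is that children's bimodal variance does not show this predicted reduction.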
Collapse
Affiliation(s)
- Karin Petrini
- Institute of Ophthalmology, University College London, UK
| | | | | | | |
Collapse
|
91
|
Hertz U, Amedi A. Flexibility and Stability in Sensory Processing Revealed Using Visual-to-Auditory Sensory Substitution. Cereb Cortex 2014; 25:2049-64. [PMID: 24518756 PMCID: PMC4494022 DOI: 10.1093/cercor/bhu010] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022] Open
Abstract
The classical view of sensory processing involves independent processing in sensory cortices and multisensory integration in associative areas. This hierarchical structure has been challenged by evidence of multisensory responses in sensory areas and dynamic weighting of sensory inputs in associative areas, thus far reported independently. Here, we used a visual-to-auditory sensory substitution algorithm (SSA) to manipulate the information conveyed by sensory inputs while keeping the stimuli intact. During scan sessions before and after SSA learning, subjects were presented with visual images and auditory soundscapes. The findings reveal 2 dynamic processes. First, crossmodal attenuation of sensory cortices changed direction after SSA learning, from visual attenuation of the auditory cortex to auditory attenuation of the visual cortex. Second, associative areas changed their sensory response profile from responding most strongly to visual input to responding most strongly to auditory input. The interaction between these phenomena may play an important role in multisensory processing. Consistent features were also found in the sensory dominance of sensory areas and in audiovisual convergence in the associative middle temporal gyrus. These 2 factors allow for both stability and fast, dynamic tuning of the system when required.
Collapse
Affiliation(s)
- Uri Hertz
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada (IMRIC), Hadassah Medical School, Hebrew University of Jerusalem, Jerusalem 91220, Israel; Interdisciplinary Center for Neural Computation, The Edmond & Lily Safra Center for Brain Sciences (ELSC), Hebrew University of Jerusalem, Jerusalem 91905, Israel
| | - Amir Amedi
- Department of Medical Neurobiology, Institute for Medical Research Israel-Canada (IMRIC), Hadassah Medical School, Hebrew University of Jerusalem, Jerusalem 91220, Israel; Interdisciplinary Center for Neural Computation, The Edmond & Lily Safra Center for Brain Sciences (ELSC), Hebrew University of Jerusalem, Jerusalem 91905, Israel
| |
Collapse
|
92
|
Pollonini L, Olds C, Abaya H, Bortfeld H, Beauchamp MS, Oghalai JS. Auditory cortex activation to natural speech and simulated cochlear implant speech measured with functional near-infrared spectroscopy. Hear Res 2013; 309:84-93. [PMID: 24342740 DOI: 10.1016/j.heares.2013.11.007] [Citation(s) in RCA: 113] [Impact Index Per Article: 9.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/30/2013] [Revised: 11/22/2013] [Accepted: 11/25/2013] [Indexed: 11/29/2022]
Abstract
The primary goal of most cochlear implant procedures is to improve a patient's ability to discriminate speech. To accomplish this, cochlear implants are programmed so as to maximize speech understanding. However, programming a cochlear implant can be an iterative, labor-intensive process that takes place over months. In this study, we sought to determine whether functional near-infrared spectroscopy (fNIRS), a non-invasive neuroimaging method which is safe to use repeatedly and for extended periods of time, can provide an objective measure of whether a subject is hearing normal speech or distorted speech. We used a 140-channel fNIRS system to measure activation within the auditory cortex in 19 normal-hearing subjects while they listened to speech with different levels of intelligibility. Custom software was developed to analyze the data and compute topographic maps from the measured changes in oxyhemoglobin and deoxyhemoglobin concentration. Normal speech reliably evoked the strongest responses within the auditory cortex. Distorted speech produced less region-specific cortical activation. Environmental sounds were used as a control, and they produced the least cortical activation. These data collected using fNIRS are consistent with the fMRI literature and thus demonstrate the feasibility of using this technique to objectively detect differences in cortical responses to speech of different intelligibility.
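fNIRS analyses of this kind convert optical density changes measured at two wavelengths into oxy- and deoxyhemoglobin concentration changes via the modified Beer-Lambert law, a 2x2 linear system per channel. The sketch below solves that system; the extinction coefficients, pathlength factors, and optical density values are illustrative placeholders, not the calibration constants used by the authors' custom software.

```python
import numpy as np

def mbll(delta_od, ext_coeffs, distance_cm, dpf):
    """Modified Beer-Lambert law: optical density changes -> [dHbO, dHbR].

    delta_od    : (2,) OD changes at the two wavelengths
    ext_coeffs  : (2, 2) extinction coefficients [[eHbO(l1), eHbR(l1)],
                                                  [eHbO(l2), eHbR(l2)]]
    distance_cm : source-detector separation
    dpf         : (2,) differential pathlength factors per wavelength
    """
    A = ext_coeffs * (distance_cm * np.asarray(dpf))[:, None]
    return np.linalg.solve(A, delta_od)

# purely illustrative numbers (not the study's calibration values)
d_od = np.array([0.012, 0.007])
eps = np.array([[1.49, 3.84],     # roughly wavelength 1
                [2.53, 1.80]])    # roughly wavelength 2
print(mbll(d_od, eps, distance_cm=3.0, dpf=[6.0, 6.0]))
```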
Collapse
Affiliation(s)
- Luca Pollonini
- Abramson Center for the Future of Health and Department of Engineering Technology, University of Houston, 300 Technology Building, Suite 123, Houston, TX 77204, USA.
| | - Cristen Olds
- Department of Otolaryngology - Head and Neck Surgery, Stanford University, 801 Welch Road, Stanford, CA 94305-5739, USA.
| | - Homer Abaya
- Department of Otolaryngology - Head and Neck Surgery, Stanford University, 801 Welch Road, Stanford, CA 94305-5739, USA.
| | - Heather Bortfeld
- Department of Psychology, University of Connecticut, 406 Babbidge Road, Unit 1020, Storrs, CT 06269-1020, USA.
| | - Michael S Beauchamp
- Department of Neurobiology and Anatomy, University of Texas Health Science Center at Houston, 6431 Fannin St., Suite MSB 7.046, Houston, TX 77030, USA.
| | - John S Oghalai
- Department of Otolaryngology - Head and Neck Surgery, Stanford University, 801 Welch Road, Stanford, CA 94305-5739, USA.
| |
Collapse
|
93
|
Freiherr J, Lundström JN, Habel U, Reetz K. Multisensory integration mechanisms during aging. Front Hum Neurosci 2013; 7:863. [PMID: 24379773 PMCID: PMC3861780 DOI: 10.3389/fnhum.2013.00863] [Citation(s) in RCA: 113] [Impact Index Per Article: 9.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2013] [Accepted: 11/26/2013] [Indexed: 11/25/2022] Open
Abstract
The rapid demographic shift occurring in our society implies that understanding healthy aging and age-related diseases is one of our major future challenges. Sensory impairments have an enormous impact on our lives and are closely linked to cognitive functioning. Because sensory perception is inherently complex, we are commonly presented with complex multisensory stimulation, and the brain integrates the information from the individual sensory channels into a unified, holistic percept. The cerebral processes involved are essential for our perception of sensory stimuli and become especially important during the perception of emotional content. Despite ongoing deterioration of the individual sensory systems during aging, there is evidence for an increase in, or maintenance of, multisensory integration processing in aging individuals. Within this comprehensive literature review on multisensory integration, we aim to highlight basic mechanisms and potential compensatory strategies the human brain utilizes to help maintain multisensory integration capabilities during healthy aging, in order to facilitate a broader understanding of age-related pathological conditions. A further goal is to identify where additional research is needed.
Collapse
Affiliation(s)
- Jessica Freiherr
- Diagnostic and Interventional Neuroradiology, RWTH Aachen University Aachen, Germany
| | - Johan N Lundström
- Department of Clinical Neuroscience, Karolinska Institute Stockholm, Sweden ; Monell Chemical Senses Center, Philadelphia PA, USA ; Department of Psychology, University of Pennsylvania Philadelphia, PA, USA
| | - Ute Habel
- Department of Psychiatry, Psychotherapy, and Psychosomatics, RWTH Aachen University Aachen, Germany ; JARA BRAIN - Translational Brain Medicine, RWTH Aachen University Aachen, Germany
| | - Kathrin Reetz
- JARA BRAIN - Translational Brain Medicine, RWTH Aachen University Aachen, Germany ; Department of Neurology, RWTH Aachen University Aachen, Germany ; Institute of Neuroscience and Medicine (INM-4), Research Center Jülich, Jülich Germany
| |
Collapse
|
94
|
Sella I, Reiner M, Pratt H. Natural stimuli from three coherent modalities enhance behavioral responses and electrophysiological cortical activity in humans. Int J Psychophysiol 2013; 93:45-55. [PMID: 24315926 DOI: 10.1016/j.ijpsycho.2013.11.003] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2012] [Revised: 10/23/2013] [Accepted: 11/26/2013] [Indexed: 11/15/2022]
Abstract
Cues that involve a number of sensory modalities are processed in the brain in an interactive multimodal manner rather than independently for each modality. We studied multimodal integration in a natural, yet fully controlled scene, implemented as an interactive game in an auditory-haptic-visual virtual environment. In this imitation of a natural scene, the targets of perception were ecologically valid uni-, bi- and tri-modal manifestations of a simple event: a ball hitting a wall. Subjects were engaged in the game while their behavioral and early cortical electrophysiological responses were measured. Behavioral results confirmed that tri-modal cues were detected faster and more accurately than bi-modal cues, which, likewise, showed advantages over unimodal responses. Event-Related Potentials (ERPs) were recorded, and the first 200 ms following stimulus onset was analyzed to reveal the latencies of cortical multimodal interactions as estimated by sLORETA. These electrophysiological findings indicated bi-modal as well as tri-modal interactions beginning very early (~30 ms), uniquely for each multimodal combination. The results suggest that early cortical multimodal integration accelerates cortical activity and, in turn, enhances performance measures. This acceleration registers on the scalp as sub-additive cortical activation.
Collapse
Affiliation(s)
- Irit Sella
- The Virtual Reality and NeuroCognition Laboratory, Technion - Israel Institute of Technology, Israel; Evoked Potentials Laboratory, Technion - Israel Institute of Technology, Israel
| | - Miriam Reiner
- The Virtual Reality and NeuroCognition Laboratory, Technion - Israel Institute of Technology, Israel.
| | - Hillel Pratt
- Evoked Potentials Laboratory, Technion - Israel Institute of Technology, Israel
| |
Collapse
|
95
|
Ito T, Johns AR, Ostry DJ. Left lateralized enhancement of orofacial somatosensory processing due to speech sounds. JOURNAL OF SPEECH, LANGUAGE, AND HEARING RESEARCH : JSLHR 2013; 56:S1875-S1881. [PMID: 24687443 PMCID: PMC4228692 DOI: 10.1044/1092-4388(2013/12-0226)] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Purpose: Somatosensory information associated with speech articulatory movements affects the perception of speech sounds and vice versa, suggesting an intimate linkage between speech production and perception systems. However, it is unclear which cortical processes are involved in the interaction between speech sounds and orofacial somatosensory inputs. The authors examined whether speech sounds modify orofacial somatosensory cortical potentials that were elicited using facial skin perturbations. Method: Somatosensory event-related potentials in EEG were recorded in 3 background sound conditions (pink noise, speech sounds, and nonspeech sounds) and also in a silent condition. Facial skin deformations that are similar in timing and duration to those experienced in speech production were used for somatosensory stimulation. Results: The authors found that speech sounds reliably enhanced the first negative peak of the somatosensory event-related potential when compared with the other 3 sound conditions. The enhancement was evident at electrode locations above the left motor and premotor area of the orofacial system. The result indicates that speech sounds interact with somatosensory cortical processes that are produced by speech-production-like patterns of facial skin stretch. Conclusion: Neural circuits in the left hemisphere, presumably in left motor and premotor cortex, may play a prominent role in the interaction between auditory inputs and speech-relevant somatosensory processing.
Collapse
Affiliation(s)
| | - Alexis R. Johns
- Haskins Laboratories, New Haven, CT
- University of Connecticut, Storrs
| | - David J. Ostry
- Haskins Laboratories, New Haven, CT
- McGill University, Montreal, Quebec, Canada
| |
Collapse
|
96
|
Straube B, Green A, Sass K, Kirner-Veselinovic A, Kircher T. Neural integration of speech and gesture in schizophrenia: evidence for differential processing of metaphoric gestures. Hum Brain Mapp 2013; 34:1696-712. [PMID: 22378493 PMCID: PMC6870001 DOI: 10.1002/hbm.22015] [Citation(s) in RCA: 44] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/22/2011] [Revised: 11/21/2011] [Accepted: 11/22/2011] [Indexed: 11/11/2022] Open
Abstract
Gestures are an important component of interpersonal communication. In particular, complex multimodal communication is assumed to be disrupted in patients with schizophrenia. In healthy subjects, differential neural integration processes for gestures in the context of concrete [iconic (IC) gestures] and abstract sentence contents [metaphoric (MP) gestures] have previously been demonstrated. With this study, we investigated neural integration processes for both gesture types in patients with schizophrenia. During functional magnetic resonance imaging data acquisition, 16 patients with schizophrenia (P) and a healthy control group (C) were shown videos of an actor performing IC and MP gestures and associated sentences. An isolated gesture (G) and an isolated sentence condition (S) were included to separate unimodal from bimodal effects at the neural level. During the IC conditions (IC > G ∩ IC > S), we found increased activity in the left posterior middle temporal gyrus (pMTG) in both groups. Whereas in the control group the left pMTG and the inferior frontal gyrus (IFG) were activated for the MP conditions (MP > G ∩ MP > S), no significant activation was found for the identical contrast in patients. The interaction of group (P/C) and gesture condition (MP/IC) revealed activation in the bilateral hippocampus, the left middle/superior temporal cortex, and the IFG. Activation of the pMTG for the IC condition in both groups indicates intact neural integration of IC gestures in schizophrenia. However, failure to activate the left pMTG and IFG for MP co-verbal gestures suggests a disturbed integration of gestures embedded in an abstract sentence context. This study provides new insight into the neural integration of co-verbal gestures in patients with schizophrenia.
Affiliation(s)
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Rudolf-Bultmann-Strasse 8, D-35039 Marburg, Germany.
97
98
Hoefer M, Tyll S, Kanowski M, Brosch M, Schoenfeld MA, Heinze HJ, Noesselt T. Tactile stimulation and hemispheric asymmetries modulate auditory perception and neural responses in primary auditory cortex. Neuroimage 2013; 79:371-82. [PMID: 23664954 DOI: 10.1016/j.neuroimage.2013.04.119] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2013] [Revised: 04/12/2013] [Accepted: 04/27/2013] [Indexed: 10/26/2022] Open
Abstract
Although multisensory integration has been an important area of recent research, most studies have focused on audiovisual integration. Importantly, however, the combination of audition and touch can guide our behavior just as effectively, which we studied here using psychophysics and functional magnetic resonance imaging (fMRI). We tested whether task-irrelevant tactile stimuli would enhance auditory detection, and whether hemispheric asymmetries would modulate these audiotactile benefits, using lateralized sounds. Spatially aligned, task-irrelevant tactile stimuli could occur either synchronously or asynchronously with the sounds. Auditory detection was enhanced by non-informative synchronous and asynchronous tactile stimuli if these were presented on the left side. Elevated fMRI signals to left-sided synchronous bimodal stimulation were found in primary auditory cortex (A1). Adjacent regions (planum temporale, PT) expressed enhanced BOLD responses for the synchronous and asynchronous left-sided bimodal conditions. Additional connectivity analyses seeded in right-hemispheric A1 and PT for both bimodal conditions showed enhanced connectivity with right-hemispheric thalamic, somatosensory, and multisensory areas that scaled with subjects' performance. Our results indicate that functional asymmetries interact with audiotactile interplay, which can be observed for left-lateralized stimulation in the right hemisphere. There, audiotactile interplay recruits a functional network of unisensory cortices, and the strength of these functional network connections is directly related to subjects' perceptual sensitivity.
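The connectivity analyses described here seed a region (right-hemispheric A1 or PT) and ask whether its coupling with other areas scales with behavioral performance across subjects. A toy Python sketch of that two-step logic follows, using entirely synthetic time courses and an assumed behavioral measure ("detection gain"); it is meant only to illustrate the seed-correlation-then-brain-behavior idea, not the authors' model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subj, n_vox, n_tr = 16, 5000, 200     # illustrative sizes

# Hypothetical per-subject data: seed (e.g., right A1) and voxel time courses.
seed_ts = rng.normal(size=(n_subj, n_tr))
voxel_ts = rng.normal(size=(n_subj, n_vox, n_tr))
detection_gain = rng.normal(size=n_subj)   # assumed behavioral benefit of tactile co-stimulation

def seed_connectivity(seed, voxels):
    """Pearson correlation of a seed time course with every voxel time course."""
    seed_z = (seed - seed.mean()) / seed.std()
    vox_z = (voxels - voxels.mean(axis=1, keepdims=True)) / voxels.std(axis=1, keepdims=True)
    return vox_z @ seed_z / len(seed)

# Fisher-z connectivity maps, one per subject.
conn = np.arctanh(np.array([seed_connectivity(seed_ts[s], voxel_ts[s])
                            for s in range(n_subj)]))

# Relate connectivity strength to behavior across subjects (shown for one example voxel).
r, p = stats.pearsonr(conn[:, 0], detection_gain)
print(f"connectivity-behavior correlation at voxel 0: r={r:.2f}, p={p:.3f}")
```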
Affiliation(s)
- M Hoefer
- Department of Biological Psychology, Otto-von-Guericke-University Magdeburg, Postfach 4120, 39106 Magdeburg, Germany.
99
Brang D, Taich ZJ, Hillyard SA, Grabowecky M, Ramachandran VS. Parietal connectivity mediates multisensory facilitation. Neuroimage 2013; 78:396-401. [PMID: 23611862 DOI: 10.1016/j.neuroimage.2013.04.047] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2013] [Revised: 04/10/2013] [Accepted: 04/11/2013] [Indexed: 11/27/2022] Open
Abstract
Our senses interact in daily life through multisensory integration, facilitating perceptual processes and behavioral responses. The neural mechanisms proposed to underlie this multisensory facilitation include anatomical connections directly linking early sensory areas, indirect connections to higher-order multisensory regions, as well as thalamic connections. Here we examine the relationship between white matter connectivity, as assessed with diffusion tensor imaging, and individual differences in multisensory facilitation, and provide the first demonstration of a relationship between anatomical connectivity and multisensory processing in typically developed individuals. Using a whole-brain analysis and contrasting anatomical models of multisensory processing, we found that increased connectivity between parietal regions and early sensory areas was associated with the facilitation of reaction times to multisensory (auditory-visual) stimuli. Furthermore, building on prior animal work suggesting the involvement of the superior colliculus in this process, we used probabilistic tractography to determine that the strongest cortical projection area connected with the superior colliculus includes the region of connectivity implicated in our independent whole-brain analysis.
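The central brain-behavior result here relates per-subject white matter connectivity to multisensory reaction-time facilitation. A schematic Python sketch of that across-subject correlation is given below; the facilitation definition (bimodal speed-up relative to the faster unimodal condition) and the tract-strength values are placeholders for illustration, not the authors' exact measures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_subj = 25                             # illustrative sample size

# Hypothetical median reaction times (seconds) per subject for each stimulus type.
rt_aud = rng.normal(0.42, 0.04, n_subj)
rt_vis = rng.normal(0.40, 0.04, n_subj)
rt_av  = rng.normal(0.36, 0.04, n_subj)

# Multisensory facilitation: speed-up of bimodal responses relative to the
# faster unimodal condition (one common, simple definition).
facilitation = np.minimum(rt_aud, rt_vis) - rt_av

# Hypothetical per-subject strength (e.g., streamline count or FA) of a
# parietal-to-sensory connection from tractography.
tract_strength = rng.normal(0.5, 0.1, n_subj)

r, p = stats.pearsonr(tract_strength, facilitation)
print(f"tract strength vs. multisensory facilitation: r={r:.2f}, p={p:.3f}")
```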
Affiliation(s)
- David Brang
- Department of Psychology, Northwestern University, 2029 Sheridan Road, Evanston, IL 60208-2710, USA.
100
Beer AL, Plank T, Meyer G, Greenlee MW. Combined diffusion-weighted and functional magnetic resonance imaging reveals a temporal-occipital network involved in auditory-visual object processing. Front Integr Neurosci 2013; 7:5. [PMID: 23407860 PMCID: PMC3570774 DOI: 10.3389/fnint.2013.00005] [Citation(s) in RCA: 37] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/31/2012] [Accepted: 01/25/2013] [Indexed: 11/22/2022] Open
Abstract
Functional magnetic resonance imaging (fMRI) has shown that the superior temporal and occipital cortices are involved in multisensory integration. Probabilistic fiber tracking based on diffusion-weighted MRI suggests that multisensory processing is supported by white matter connections between auditory cortex and the temporal and occipital lobes. Here, we present a combined functional MRI and probabilistic fiber tracking study that reveals multisensory processing mechanisms that remained undetected by either technique alone. Ten healthy participants passively observed visually presented lip or body movements, heard speech or body-action sounds, or were exposed to a combination of both. Bimodal stimulation engaged a temporal-occipital brain network including the multisensory superior temporal sulcus (msSTS), the lateral superior temporal gyrus (lSTG), and the extrastriate body area (EBA). A region-of-interest (ROI) analysis showed multisensory interactions (e.g., subadditive responses to bimodal compared with unimodal stimuli) in the msSTS, the lSTG, and the EBA. Moreover, sounds elicited responses in the medial occipital cortex. Probabilistic tracking revealed white matter tracts between the auditory cortex and the medial occipital cortex, the inferior occipital cortex (IOC), and the superior temporal sulcus (STS). However, the STS terminations of auditory cortex tracts showed limited overlap with the msSTS region. Instead, the msSTS was connected to primary sensory regions via intermediate nodes in the temporal and occipital cortex. Similarly, the lSTG and EBA regions showed few direct white matter connections but instead were connected via intermediate nodes. Our results suggest that multisensory processing in the STS is mediated by separate brain areas that form a distinct network in the lateral temporal and inferior occipital cortex.
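The ROI test for multisensory interaction described above uses a subadditivity criterion: the bimodal response should be smaller than the sum of the unimodal responses. A minimal sketch with synthetic ROI betas follows; the condition means and the one-sided paired test are assumptions made for the example, not details taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_subj = 10                               # the study reports ten participants

# Hypothetical mean beta estimates (percent signal change) in one ROI (e.g., msSTS)
# per subject and condition.
beta_a  = rng.normal(0.6, 0.2, n_subj)    # auditory only
beta_v  = rng.normal(0.5, 0.2, n_subj)    # visual only
beta_av = rng.normal(0.9, 0.2, n_subj)    # bimodal (audio-visual)

# Subadditivity criterion: bimodal response smaller than the sum of the unimodal
# responses (AV < A + V), tested across subjects with a one-sided test on the difference.
diff = beta_av - (beta_a + beta_v)
t, p = stats.ttest_1samp(diff, 0.0, alternative="less")
print(f"subadditivity test (AV < A+V): t={t:.2f}, one-sided p={p:.3f}")
```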
Affiliation(s)
- Anton L. Beer
- Institut für Psychologie, Universität Regensburg, Regensburg, Germany
- Experimental and Clinical Neurosciences Programme, Universität Regensburg, Regensburg, Germany
- Tina Plank
- Institut für Psychologie, Universität Regensburg, Regensburg, Germany
- Georg Meyer
- Department of Experimental Psychology, University of Liverpool, Liverpool, UK
- Mark W. Greenlee
- Institut für Psychologie, Universität Regensburg, Regensburg, Germany
- Experimental and Clinical Neurosciences Programme, Universität Regensburg, Regensburg, Germany