1.
Banaszkiewicz A, Costello B, Marchewka A. Early language experience and modality affect parietal cortex activation in different hemispheres: Insights from hearing bimodal bilinguals. Neuropsychologia 2024; 204:108973. PMID: 39151687. DOI: 10.1016/j.neuropsychologia.2024.108973.
Abstract
The goal of this study was to investigate the impact of the age of acquisition (AoA) on functional brain representations of sign language in two exceptional groups of hearing bimodal bilinguals: native signers (simultaneous bilinguals since early childhood) and late signers (proficient sequential bilinguals who learnt a sign language after puberty). We asked whether effects of AoA would be present across languages - signed and audiovisual spoken - and thus observed only in late signers, as they acquired each language at a different life stage, and whether effects of AoA would be present during sign language processing across groups. Moreover, we carefully controlled participants' level of sign language proficiency with a battery of language tests developed for the project, which confirmed that participants were highly proficient in sign language. Between-group analyses revealed the hypothesized modulatory effect of AoA in the right inferior parietal lobule (IPL) in native signers compared to late signers. With respect to within-group differences across languages, we observed greater involvement of the left IPL in response to sign language than to spoken language in both native and late signers, indicating language modality effects. Overall, our results suggest that the neural underpinnings of language are molded both by the linguistic characteristics of the language and by when in life the language is learnt.
Affiliation(s)
- A Banaszkiewicz
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland; Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- B Costello
- Basque Center on Cognition, Brain and Language, Donostia-San Sebastián, Spain; Ikerbasque, Basque Foundation for Science, Bilbao, Spain
- A Marchewka
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
2.
Yang T, Fan X, Hou B, Wang J, Chen X. Linguistic network in early deaf individuals: A neuroimaging meta-analysis. Neuroimage 2024:120720. PMID: 38971484. DOI: 10.1016/j.neuroimage.2024.120720.
Abstract
This meta-analysis summarizes evidence from 44 neuroimaging experiments and characterizes the general linguistic network in early deaf individuals. Meta-analytic comparisons with hearing individuals found that a specific set of regions (in particular the left inferior frontal gyrus and posterior middle temporal gyrus) participates in supramodal language processing. Beyond previously described modality-specific differences, the left calcarine gyrus and the right caudate were additionally recruited in deaf compared with hearing individuals. The study further showed that the bilateral posterior superior temporal gyrus is shaped by cross-modal plasticity, whereas the left frontotemporal areas are shaped by early language experience. Although an overall left-lateralized pattern for language processing was observed in early deaf individuals, regional lateralization was altered in the inferior frontal gyrus and anterior temporal lobe. These findings indicate that the core language network functions in a modality-independent manner, and they provide a foundation for determining the contributions of sensory and linguistic experience in shaping the neural bases of language processing.
Affiliation(s)
- Tengyu Yang
- Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Xinmiao Fan
- Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Bo Hou
- Department of Radiology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Jian Wang
- Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
- Xiaowei Chen
- Department of Otolaryngology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Peking Union Medical College, Beijing, PR China
3.
Hernández D, Puupponen A, Keränen J, Ortega G, Jantunen T. Between bodily action and conventionalized structure: The neural mechanisms of constructed action in sign language comprehension. Brain Lang 2024; 252:105413. PMID: 38608511. DOI: 10.1016/j.bandl.2024.105413.
Abstract
Sign languages (SLs) are expressed through different bodily actions, ranging from re-enactment of physical events (constructed action, CA) to sequences of lexical signs with internal structure (plain telling, PT). Despite the prevalence of CA in signed interactions and its significance for SL comprehension, its neural dynamics remain unexplored. We examined the processing of different types of CA (subtle, reduced, and overt) and of PT in 35 adult deaf or hearing native signers, recording EEG while they processed signed sentences with incongruent targets. Attenuated N300 and early N400 responses were observed for CA in deaf but not in hearing signers. No differences were found among the CA types in either group, suggesting a continuum from PT to overt CA. Deaf signers focused more on body movements; hearing signers, on faces. We conclude that CA is processed with less effort than PT, arguably because of its strong focus on bodily actions.
Affiliation(s)
- Doris Hernández
- Sign Language Centre, Department of Language and Communication, University of Jyväskylä, Finland; Center for Interdisciplinary Brain Research (CIBR), Department of Psychology, University of Jyväskylä, Finland
- Anna Puupponen
- Sign Language Centre, Department of Language and Communication, University of Jyväskylä, Finland
- Jarkko Keränen
- Sign Language Centre, Department of Language and Communication, University of Jyväskylä, Finland
- Gerardo Ortega
- Department of English Language and Applied Linguistics, University of Birmingham, UK
- Tommi Jantunen
- Sign Language Centre, Department of Language and Communication, University of Jyväskylä, Finland
4.
Arioli M, Segatta C, Papagno C, Tettamanti M, Cattaneo Z. Social perception in deaf individuals: A meta-analysis of neuroimaging studies. Hum Brain Mapp 2023; 44:5402-5415. PMID: 37609693. PMCID: PMC10543108. DOI: 10.1002/hbm.26444.
Abstract
Deaf individuals may report difficulties in social interactions. However, whether these difficulties stem from deafness affecting social brain circuits is controversial. Here, we report the first meta-analysis comparing brain activations of hearing and (prelingually) deaf individuals during social perception. Our findings showed that deafness does not impact the functional mechanisms supporting social perception. Indeed, both deaf and hearing control participants recruited regions of the action observation network during different social tasks employing visual stimuli, including biological motion perception, face identification, action observation, viewing, identification and memory of signs, and lip reading. Moreover, we found increased recruitment of the superior-middle temporal cortex in deaf individuals compared with hearing participants, suggesting a preserved and augmented function during social communication based on signs and lip movements. Overall, our meta-analysis suggests that the social difficulties experienced by deaf individuals are unlikely to be associated with brain alterations and may rather depend on non-supportive environments.
Affiliation(s)
- Maria Arioli
- Department of Human and Social Sciences, University of Bergamo, Bergamo, Italy
- Cecilia Segatta
- Department of Human and Social Sciences, University of Bergamo, Bergamo, Italy
- Costanza Papagno
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy
- Zaira Cattaneo
- Department of Human and Social Sciences, University of Bergamo, Bergamo, Italy
- IRCCS Mondino Foundation, Pavia, Italy
5.
Rosenzopf H, Wiesen D, Basilakos A, Yourganov G, Bonilha L, Rorden C, Fridriksson J, Karnath HO, Sperber C. Mapping the human praxis network: an investigation of white matter disconnection in limb apraxia of gesture production. Brain Commun 2022; 4:fcac004. PMID: 35169709. PMCID: PMC8833454. DOI: 10.1093/braincomms/fcac004.
Abstract
Left hemispheric cerebral stroke can cause apraxia, a motor-cognitive disorder characterised by deficits of higher-order motor skills such as the failure to accurately produce meaningful gestures. This disorder provides unique insights into the anatomical and cognitive architecture of the human praxis system. The present study aimed to map the structural brain network that is damaged in apraxia. We assessed the ability to perform meaningful gestures with the hand in 101 patients with chronic left hemisphere stroke. Structural white matter fibre damage was directly assessed by diffusion tensor imaging and fractional anisotropy mapping. We used multivariate topographical inference on tract-based fractional anisotropy topographies to identify white matter disconnection associated with apraxia. We found relevant pathological white matter alterations in a densely connected fronto-temporo-parietal network of short and long association fibres. Hence, the findings suggest that heterogeneous topographical results in previous lesion mapping studies might not only result from differences in study design, but also from the general methodological limitations of univariate topographical mapping in uncovering the structural praxis network. A striking role of middle and superior temporal lobe disconnection, including temporo-temporal short association fibres, was found, suggesting strong involvement of the temporal lobe in the praxis network. Further, the results stressed the importance of subcortical disconnections for the emergence of apractic symptoms. Our study provides a fine-grained view into the structural connectivity of the human praxis network and suggests a potential value of disconnection measures in the clinical prediction of behavioural post-stroke outcome.
Affiliation(s)
- Hannah Rosenzopf
- Centre of Neurology, Division of Neuropsychology, Hertie-Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Daniel Wiesen
- Centre of Neurology, Division of Neuropsychology, Hertie-Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Alexandra Basilakos
- Department of Communication Sciences and Disorders, University of South Carolina, Columbia, SC, USA
- Grigori Yourganov
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Leonardo Bonilha
- Department of Neurology, Medical University of South Carolina, Charleston, SC, USA
- Christopher Rorden
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Julius Fridriksson
- Department of Communication Sciences and Disorders, University of South Carolina, Columbia, SC, USA
- Hans-Otto Karnath
- Centre of Neurology, Division of Neuropsychology, Hertie-Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Department of Psychology, University of South Carolina, Columbia, SC, USA
- Christoph Sperber
- Centre of Neurology, Division of Neuropsychology, Hertie-Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
6.
Quandt LC, Kubicek E, Willis A, Lamberton J. Enhanced biological motion perception in deaf native signers. Neuropsychologia 2021; 161:107996. PMID: 34425145. DOI: 10.1016/j.neuropsychologia.2021.107996.
Abstract
We conducted two studies to test how deaf signed language users perceive biological motion. We created 18 Biological Motion point-light displays (PLDs) depicting everyday human actions and 18 Scrambled control PLDs. First, in an online behavioral rating survey, deaf and hearing raters identified the Biological Motion PLDs and rated how easy the actions were to identify. Then, in an EEG study, Deaf Signers and Hearing Non-Signers watched both the Biological Motion and Scrambled PLDs while we computed the time-frequency responses within the theta, alpha, and beta EEG rhythms. In the rating task, deaf raters reported significantly less effort in identifying the Biological Motion PLDs across all stimuli. In the EEG results, Deaf Signers showed theta, mu, and beta differentiation between Scrambled and Biological Motion PLDs earlier and more consistently than Hearing Non-Signers. We conclude that native ASL users exhibit experience-dependent neuroplasticity in the domain of biological human motion perception.
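The band-limited EEG analysis this abstract describes can be sketched in outline. The following is a minimal illustration, not the authors' pipeline: it estimates mean spectral power in the theta, alpha, and beta bands with a short-time Fourier spectrogram, whereas the study used event-related time-frequency decomposition with baseline normalization. The sampling rate, window length, and synthetic signal are all assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def band_power(eeg, fs, band):
    """Mean power of a 1-D EEG signal within a (low, high) Hz band,
    estimated from a short-time Fourier spectrogram with 1-s windows."""
    freqs, _, Sxx = spectrogram(eeg, fs=fs, nperseg=fs)
    low, high = band
    mask = (freqs >= low) & (freqs <= high)
    return Sxx[mask].mean()

# Synthetic 4-s trial: a 10 Hz (alpha-band) oscillation plus noise.
fs = 250  # assumed sampling rate, Hz
rng = np.random.default_rng(1)
t = np.arange(fs * 4) / fs
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
powers = {name: band_power(sig, fs, b) for name, b in bands.items()}
# The alpha estimate dominates, since the signal oscillates at 10 Hz.
```

An event-related analysis would repeat this per trial and time window, expressing power as change relative to a pre-stimulus baseline.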
Affiliation(s)
- Lorna C Quandt
- Ph.D. in Educational Neuroscience Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Emily Kubicek
- Ph.D. in Educational Neuroscience Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Athena Willis
- Ph.D. in Educational Neuroscience Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Jason Lamberton
- VL2 Center, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
7.
Banaszkiewicz A, Bola Ł, Matuszewski J, Szczepanik M, Kossowski B, Mostowski P, Rutkowski P, Śliwińska M, Jednoróg K, Emmorey K, Marchewka A. The role of the superior parietal lobule in lexical processing of sign language: Insights from fMRI and TMS. Cortex 2020; 135:240-254. PMID: 33401098. DOI: 10.1016/j.cortex.2020.10.025.
Abstract
There is strong evidence that neuronal bases for language processing are remarkably similar for sign and spoken languages. However, as meanings and linguistic structures of sign languages are coded in movement and space and decoded through vision, differences are also present, predominantly in occipitotemporal and parietal areas, such as superior parietal lobule (SPL). Whether the involvement of SPL reflects domain-general visuospatial attention or processes specific to sign language comprehension remains an open question. Here we conducted two experiments to investigate the role of SPL and the laterality of its engagement in sign language lexical processing. First, using unique longitudinal and between-group designs we mapped brain responses to sign language in hearing late learners and deaf signers. Second, using transcranial magnetic stimulation (TMS) in both groups we tested the behavioural relevance of SPL's engagement and its lateralisation during sign language comprehension. SPL activation in hearing participants was observed in the right hemisphere before and bilaterally after the sign language course. Additionally, after the course hearing learners exhibited greater activation in the occipital cortex and left SPL than deaf signers. TMS applied to the right SPL decreased accuracy in both hearing learners and deaf signers. Stimulation of the left SPL decreased accuracy only in hearing learners. Our results suggest that right SPL might be involved in visuospatial attention while left SPL might support phonological decoding of signs in non-proficient signers.
Affiliation(s)
- A Banaszkiewicz
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- Ł Bola
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- J Matuszewski
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- M Szczepanik
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- B Kossowski
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- P Mostowski
- Section for Sign Linguistics, Faculty of Polish Studies, University of Warsaw, Warsaw, Poland
- P Rutkowski
- Section for Sign Linguistics, Faculty of Polish Studies, University of Warsaw, Warsaw, Poland
- M Śliwińska
- Department of Psychology, University of York, Heslington, UK
- K Jednoróg
- Laboratory of Language Neurobiology, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
- K Emmorey
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, San Diego, USA
- A Marchewka
- Laboratory of Brain Imaging, Nencki Institute of Experimental Biology, Polish Academy of Sciences, Warsaw, Poland
8.
Simon M, Lazzouni L, Campbell E, Delcenserie A, Muise-Hennessey A, Newman AJ, Champoux F, Lepore F. Enhancement of visual biological motion recognition in early-deaf adults: Functional and behavioral correlates. PLoS One 2020; 15:e0236800. PMID: 32776962. PMCID: PMC7416928. DOI: 10.1371/journal.pone.0236800.
Abstract
Deafness leads to brain modifications that are generally associated with cross-modal activity of the auditory cortex, particularly in response to visual stimulation. In the present study, we explored the cortical processing of biological motion conveying either non-communicative (pantomimes) or communicative (emblems) information in early-deaf and hearing individuals, using fMRI. Behaviorally, deaf individuals showed an advantage in detecting communicative gestures relative to hearing individuals. Deaf individuals also showed significantly greater activation in the superior temporal cortex (including the planum temporale and primary auditory cortex) than hearing individuals, and the activation levels in this region were correlated with deaf individuals' response times. This study provides neural and behavioral evidence that cross-modal plasticity leads to functional advantages in the processing of biological motion following lifelong auditory deprivation.
Affiliation(s)
- Marie Simon
- Département de Psychologie, Centre de recherche en neuropsychologie et cognition, Université de Montréal, Québec, Canada
- Latifa Lazzouni
- Département de Psychologie, Centre de recherche en neuropsychologie et cognition, Université de Montréal, Québec, Canada
- Emma Campbell
- Département de Psychologie, Centre de recherche en neuropsychologie et cognition, Université de Montréal, Québec, Canada
- Audrey Delcenserie
- Département de Psychologie, Centre de recherche en neuropsychologie et cognition, Université de Montréal, Québec, Canada
- École d’orthophonie et d’audiologie, Université de Montréal, Montréal, Québec, Canada
- Alexandria Muise-Hennessey
- Department of Psychology and Neuroscience, NeuroCognitive Imaging Lab, Dalhousie University, Halifax, Nova Scotia, Canada
- Aaron J. Newman
- Department of Psychology and Neuroscience, NeuroCognitive Imaging Lab, Dalhousie University, Halifax, Nova Scotia, Canada
- François Champoux
- École d’orthophonie et d’audiologie, Université de Montréal, Montréal, Québec, Canada
- Centre de recherche de l’Institut Universitaire de Gériatrie de Montréal, Montréal, Québec, Canada
- Franco Lepore
- Département de Psychologie, Centre de recherche en neuropsychologie et cognition, Université de Montréal, Québec, Canada
9.
Kubicek E, Quandt LC. Sensorimotor system engagement during ASL sign perception: An EEG study in deaf signers and hearing non-signers. Cortex 2019; 119:457-469. PMID: 31505437. DOI: 10.1016/j.cortex.2019.07.016.
Abstract
When a person observes someone else performing an action, the observer's sensorimotor cortex activates as if the observer were performing the action, a phenomenon known as action simulation. While this process is well established for basic (e.g., grasping) and complex (e.g., dancing) actions, it remains unknown whether the framework of action simulation applies to visual languages such as American Sign Language (ASL). We conducted an EEG experiment with deaf signers and hearing non-signers to compare overall sensorimotor EEG between groups, and to test whether sensorimotor systems are differentially sensitive to signs produced with one hand ("1H") or two hands ("2H"). We predicted greater alpha and beta event-related desynchronization (previously correlated with action simulation) during the perception of 2H ASL signs compared to 1H ASL signs, due to the greater demands on sensorimotor processing systems required for producing two-handed actions. We recorded EEG from both groups as they observed videos of ASL signs, half 1H and half 2H. Event-related spectral perturbations (ERSPs) in the alpha and beta ranges were computed for the two conditions at central electrode sites overlying the sensorimotor cortex. Sensorimotor EEG responses in both Hearing and Deaf groups were sensitive to the gross motor characteristics of the observed signs. We show for the first time that, although hearing non-signers showed overall more sensorimotor cortex involvement during sign observation, mirroring-related processes are nonetheless involved when deaf signers observe signs.
Affiliation(s)
- Emily Kubicek
- Educational Neuroscience Program, Gallaudet University, Washington, DC, USA
- Lorna C Quandt
- Educational Neuroscience Program, Gallaudet University, Washington, DC, USA; Department of Psychology, Gallaudet University, Washington, DC, USA
10.
Arioli M, Canessa N. Neural processing of social interaction: Coordinate-based meta-analytic evidence from human neuroimaging studies. Hum Brain Mapp 2019; 40:3712-3737. PMID: 31077492. DOI: 10.1002/hbm.24627.
Abstract
While the action observation and mentalizing networks are considered to play complementary roles in understanding others' goals and intentions, they might be concurrently engaged when processing social interactions. We assessed this hypothesis via three activation-likelihood-estimation meta-analyses of neuroimaging studies on the neural processing of: (a) social interactions, (b) individual actions by the action observation network, and (c) mental states by the mentalizing network. Conjunction analyses and direct comparisons unveiled overlapping and specific regions among the resulting maps. We report quantitative meta-analytic evidence for a "social interaction network" including key nodes of the action observation and mentalizing networks. An action-social interaction-mentalizing gradient of activity along the posterior temporal cortex highlighted a hierarchical processing of interactions, from visuomotor analyses decoding individual and shared intentions to in-depth inferences on actors' intentional states. The medial prefrontal cortex, possibly in conjunction with the amygdala, might provide additional information concerning the affective valence of the interaction. This evidence suggests that the functional architecture underlying the neural processing of interactions reflects the joint engagement of the action observation and mentalizing networks. These data might inform the design of rehabilitative treatments for social cognition disorders in pathological conditions, and the assessment of their outcome in randomized controlled trials.
Affiliation(s)
- Maria Arioli
- Department of Humanities and Life Sciences, Scuola Universitaria Superiore IUSS, Pavia, Italy; Cognitive Neuroscience Laboratory, IRCCS ICS Maugeri, Pavia, Italy
- Nicola Canessa
- Department of Humanities and Life Sciences, Scuola Universitaria Superiore IUSS, Pavia, Italy; Cognitive Neuroscience Laboratory, IRCCS ICS Maugeri, Pavia, Italy
11.
Quandt LC, Kubicek E. Sensorimotor characteristics of sign translations modulate EEG when deaf signers read English. Brain Lang 2018; 187:9-17. PMID: 30399489. DOI: 10.1016/j.bandl.2018.10.001.
Abstract
Bilingual individuals automatically translate written words from one language to another. While this process is established in spoken-language bilinguals, less is known about its occurrence in deaf bilinguals who know signed and spoken languages. Since sign language uses motion and space to convey linguistic content, it is possible that action simulation in the brain's sensorimotor system plays a role in this process. We recorded EEG from deaf participants fluent in ASL as they read individual English words and found significant differences in alpha and beta EEG at central electrode sites during the reading of English words whose ASL translations use two hands, compared to English words whose ASL translations use one hand. Hearing non-signers did not show any differences between conditions. These results demonstrate the involvement of the sensorimotor system in cross-linguistic, cross-modal translation, and suggest that covert action simulation processes are involved when deaf signers read.
Affiliation(s)
- Lorna C Quandt
- Ph.D. in Educational Neuroscience (PEN) Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA; Department of Psychology, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
- Emily Kubicek
- Ph.D. in Educational Neuroscience (PEN) Program, Gallaudet University, 800 Florida Ave NE, Washington, D.C. 20002, USA
12.
Le HB, Zhang HH, Wu QL, Zhang J, Yin JJ, Ma SH. Neural activity during mental rotation in deaf signers: The influence of long-term sign language experience. Ear Hear 2018; 39:1015-1024. PMID: 29298164. DOI: 10.1097/AUD.0000000000000540.
Abstract
OBJECTIVES Mental rotation is the brain's visuospatial understanding of what objects are and where they belong. Previous research indicated that deaf signers show behavioral enhancement on nonlinguistic visual tasks, including mental rotation. In this study, we investigated the neural differences in mental rotation processing between deaf signers and hearing nonsigners using blood oxygen level-dependent (BOLD) functional magnetic resonance imaging (fMRI). DESIGN The participants performed a block-designed experiment consisting of alternating blocks of comparison and rotation periods, separated by a baseline or fixation period. Mental rotation tasks were performed using three-dimensional figures. fMRI images were acquired during the entire experiment, and the fMRI data were analyzed with Analysis of Functional NeuroImages. A factorial analysis of variance was designed for the fMRI analyses. Differences in activation were analyzed for the main effects of group and task, as well as for the group-by-task interaction. RESULTS The study showed differences in activated areas between deaf signers and hearing nonsigners on the mental rotation of three-dimensional figures. Subtracting activations of fixation from activations of rotation, both groups showed consistent activation in the bilateral occipital lobe, bilateral parietal lobe, and bilateral posterior temporal lobe. There were main effects of task (rotation versus comparison), with significant activation clusters in the bilateral precuneus, the right middle frontal gyrus, the bilateral medial frontal gyrus, the right inferior frontal gyrus, the right superior frontal gyrus, the right anterior cingulate, and the bilateral posterior cingulate. There were significant group-by-task interaction effects in the bilateral anterior cingulate, the right inferior frontal gyrus, the left superior frontal gyrus, the left posterior cingulate, the left middle temporal gyrus, and the right inferior parietal lobe. In the simple effects of the deaf and hearing groups with rotation minus comparison, deaf signers mainly showed activity in the right hemisphere, while hearing nonsigners showed bilateral activity. In the simple effects of the rotation task, deaf signers showed decreased activity compared with hearing nonsigners throughout several regions, including the bilateral parahippocampal gyrus, the left posterior cingulate cortex, the right anterior cingulate cortex, and the right inferior parietal lobe. CONCLUSION Decreased activations in several brain regions of deaf signers compared with hearing nonsigners reflect increased neural efficiency and a precise functional circuitry generated through long-term experience with sign language processing. In addition, we tentatively infer that there may be a right-hemisphere lateralization pattern for deaf signers when performing mental rotation tasks.
Affiliation(s)
- Hong-Bo Le
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Hui-Hong Zhang
- Department of Radiology, Shenzhen Hospital of Southern Medical University, Shenzhen, China
- MR Division, Shantou Central Hospital, Shantou, China
- Qiu-Lin Wu
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Jiong Zhang
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Jing-Jing Yin
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Shu-Hua Ma
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China

13
Liu L, Yan X, Liu J, Xia M, Lu C, Emmorey K, Chu M, Ding G. Graph theoretical analysis of functional network for comprehension of sign language. Brain Res 2017; 1671:55-66. [PMID: 28690129 DOI: 10.1016/j.brainres.2017.06.031] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2017] [Revised: 06/29/2017] [Accepted: 06/30/2017] [Indexed: 12/14/2022]
Abstract
Signed languages are natural human languages using the visual-motor modality. Previous neuroimaging studies based on univariate activation analysis show that a largely overlapping cortical network is recruited regardless of whether the sign language is comprehended (for signers) or not (for non-signers). Here we move beyond previous studies by examining whether the functional connectivity profiles and the underlying organizational structure of this overlapping neural network differ between signers and non-signers when watching sign language. Using graph theoretical analysis (GTA) and fMRI, we compared the large-scale functional network organization of hearing signers and non-signers during the observation of sentences in Chinese Sign Language. We found that signed sentences elicited highly similar cortical activations in the two groups of participants, with slightly larger responses within the left frontal and temporal gyri in signers than in non-signers. Crucially, further GTA revealed substantial group differences in the topologies of this activation network. Globally, the network engaged by signers showed higher local efficiency (t(24)=2.379, p=0.026), small-worldness (t(24)=2.604, p=0.016) and modularity (t(24)=3.513, p=0.002), and exhibited different modular structures, compared to the network engaged by non-signers. Locally, the left ventral pars opercularis served as a network hub in the signer group but not in the non-signer group. These findings suggest that, despite overlapping cortical activation, the neural substrates underlying sign language comprehension are distinguishable at the network level from those supporting the processing of gestural action.
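The graph theoretical measures reported here (local efficiency, modularity, hub identification) can be computed from any thresholded connectivity matrix. A minimal sketch with networkx, assuming random data as a stand-in for the real ROI time series and an arbitrary 15% edge density; both are placeholders, not the study's data or threshold:

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(1)

# Hypothetical ROI time series (30 regions x 200 volumes); in practice these
# would be mean BOLD signals extracted from the activation network.
ts = rng.normal(size=(30, 200))
corr = np.corrcoef(ts)
np.fill_diagonal(corr, 0.0)

# Threshold the correlation matrix to a fixed edge density (here, top 15%).
thresh = np.quantile(np.abs(corr), 0.85)
G = nx.from_numpy_array((np.abs(corr) >= thresh).astype(int))

# Global metrics of the kind compared between groups.
local_eff = nx.local_efficiency(G)
parts = community.greedy_modularity_communities(G)
Q = community.modularity(G, parts)

# Hubs: nodes with unusually high degree (one simple criterion).
deg = np.array([d for _, d in G.degree()])
hubs = [node for node, d in G.degree() if d > deg.mean() + deg.std()]

print(f"local efficiency = {local_eff:.3f}, modularity = {Q:.3f}, hubs = {hubs}")
```

Group differences would then be tested by computing these metrics per participant and comparing the distributions, as in the t-tests reported above.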
Affiliation(s)
- Lanfang Liu
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, PR China; IDG/McGovern Institute for Brain Research, Beijing Normal University, PR China
- Xin Yan
- Department of Communicative Sciences and Disorders, Michigan State University, East Lansing, Michigan 48823, United States
- Jin Liu
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, PR China; IDG/McGovern Institute for Brain Research, Beijing Normal University, PR China
- Mingrui Xia
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, PR China; IDG/McGovern Institute for Brain Research, Beijing Normal University, PR China
- Chunming Lu
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, PR China; IDG/McGovern Institute for Brain Research, Beijing Normal University, PR China
- Karen Emmorey
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, 6495 Alvarado Road, Suite 200, San Diego, CA 92120, United States
- Mingyuan Chu
- School of Psychology, University of Aberdeen, AB24 2UB, United Kingdom
- Guosheng Ding
- State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing 100875, PR China; IDG/McGovern Institute for Brain Research, Beijing Normal University, PR China

14
Weidinger N, Lindner K, Hogrefe K, Ziegler W, Goldenberg G. Getting a Grasp on Children’s Representational Capacities in Pantomime of Object Use. JOURNAL OF COGNITION AND DEVELOPMENT 2016. [DOI: 10.1080/15248372.2016.1255625] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
15
Li W, Li J, Wang Z, Li Y, Liu Z, Yan F, Xian J, He H. Grey matter connectivity within and between auditory, language and visual systems in prelingually deaf adolescents. Restor Neurol Neurosci 2016; 33:279-90. [PMID: 25698109 PMCID: PMC4923723 DOI: 10.3233/rnn-140437] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
PURPOSE Previous studies have shown brain reorganization after early auditory deprivation. However, changes in grey matter connectivity have not yet been investigated in prelingually deaf adolescents. In the present study, we aimed to investigate changes in grey matter connectivity within and between the auditory, language and visual systems in prelingually deaf adolescents. METHODS We recruited 16 prelingually deaf adolescents and 16 age- and gender-matched normal controls, and extracted grey matter volume as the structural characteristic from 14 regions of interest involved in auditory, language or visual processing. Sparse inverse covariance estimation (SICE) was used to construct grey matter connectivity between these brain regions. RESULTS The results show that prelingually deaf adolescents have weaker grey matter connectivity within the auditory and visual systems, and decreased connectivity between the language and visual systems. Notably, significantly increased connectivity was found between the auditory and visual systems in prelingually deaf adolescents. CONCLUSIONS Our results indicate "cross-modal" plasticity after deprivation of auditory input in prelingually deaf adolescents, especially between the auditory and visual systems. In addition, auditory deprivation and visual deficits might affect the connectivity pattern within the language and visual systems in prelingually deaf adolescents.
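Sparse inverse covariance estimation of the kind used here is available off the shelf, for example in scikit-learn's GraphicalLasso. A minimal sketch on simulated stand-in data: the subject and ROI counts match the abstract, but the values and the penalty (alpha=0.3) are invented for illustration, not the study's settings:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)

# Hypothetical grey-matter volumes for 16 subjects across 14 ROIs
# (auditory, language, and visual regions). Real values would come from
# segmentation; here we simulate weakly correlated volumes.
n_subjects, n_rois = 16, 14
shared = rng.normal(size=(n_subjects, 1))
X = 0.6 * shared + rng.normal(scale=0.8, size=(n_subjects, n_rois))
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each ROI

# The L1 penalty zeroes out weak conditional dependencies, leaving a
# sparse graph of direct ROI-to-ROI connections.
model = GraphicalLasso(alpha=0.3).fit(X)
precision = model.precision_

edges = np.abs(precision) > 1e-6
np.fill_diagonal(edges, False)
n_edges = int(edges.sum()) // 2
print(f"estimated direct connections: {n_edges} of {n_rois * (n_rois - 1) // 2} possible")
```

The surviving off-diagonal entries of the precision matrix define the connectivity graph that would then be compared between deaf and control groups.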
Affiliation(s)
- Wenjing Li
- College of Electronic and Control Engineering, Beijing University of Technology, Beijing, China; State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China; Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing, China
- Jianhong Li
- Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, China
- Zhenchang Wang
- Department of Radiology, Beijing Friendship Hospital, Capital Medical University, Beijing, China
- Yong Li
- Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, China
- Zhaohui Liu
- Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, China
- Fei Yan
- Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, China
- Junfang Xian
- Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, China
- Huiguang He
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China

16
Fang Y, Chen Q, Lingnau A, Han Z, Bi Y. Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience. Front Hum Neurosci 2016; 10:94. [PMID: 27014025 PMCID: PMC4781852 DOI: 10.3389/fnhum.2016.00094] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2015] [Accepted: 02/22/2016] [Indexed: 11/26/2022] Open
Abstract
The observation of other people's actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and the posterior middle temporal gyrus (pMTG). These regions have been shown to be activated by both visual and auditory inputs. Intriguingly, previous studies found no engagement of the IFG and IPL in deaf participants during non-linguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people's actions, we examined the effects of task (action understanding vs. passive viewing) and effector (arm actions vs. leg actions), as well as sign language experience, in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found stronger activation during an action recognition task than during a low-level visual control task in the IFG, IPL and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence that the responses in the IFG, IPL, and pMTG during action recognition and passive viewing are not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.
Affiliation(s)
- Yuxing Fang
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Quanjing Chen
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Angelika Lingnau
- Center for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy; Department of Psychology, Royal Holloway University of London, Egham, UK
- Zaizhu Han
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Yanchao Bi
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China

17
Möttönen R, Farmer H, Watkins KE. Neural basis of understanding communicative actions: Changes associated with knowing the actor's intention and the meanings of the actions. Neuropsychologia 2016; 81:230-237. [PMID: 26752450 PMCID: PMC4749541 DOI: 10.1016/j.neuropsychologia.2016.01.002] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/08/2015] [Revised: 12/14/2015] [Accepted: 01/01/2016] [Indexed: 11/16/2022]
Abstract
People can communicate by using hand actions, e.g., signs. Understanding communicative actions requires that the observer knows that the actor has an intention to communicate and the meanings of the actions. Here, we investigated how this prior knowledge affects processing of observed actions. We used functional MRI to determine changes in action processing when non-signers were told that the observed actions are communicative (i.e., signs) and learned the meanings of half of the actions. Processing of hand actions activated the left and right inferior frontal gyrus (IFG, BA 44 and 45) when the communicative intention of the actor was known, even when the meanings of the actions remained unknown. These regions were not active when the observers did not know about the communicative nature of the hand actions. These findings suggest that the left and right IFG play a role in understanding the intention of the actor, but do not process visuospatial features of the communicative actions. Knowing the meanings of the hand actions further enhanced activity in the anterior part of the IFG (BA 45), the inferior parietal lobule and posterior inferior and middle temporal gyri in the left hemisphere. These left-hemisphere language regions could provide a link between meanings and observed actions. In sum, the findings provide evidence for the segregation of the networks involved in the neural processing of visuospatial features of communicative hand actions and those involved in understanding the actor's intention and the meanings of the actions. Highlights:
- Participants observed hand actions before and after learning that they are signs.
- Learning-induced changes in brain activity were measured using fMRI.
- No activity in the mirror neuron system when actions were not known to be communicative.
- Knowing the actor's intention to communicate activated the IFG and IPL.
- Knowing the meanings of the actions increased activity in the left IFG (BA 45), IPL and MTG.
Affiliation(s)
- Riikka Möttönen
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, UK; Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), University of Oxford, John Radcliffe Hospital, Oxford OX3 9DU, UK
- Harry Farmer
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, UK
- Kate E Watkins
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, UK; Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), University of Oxford, John Radcliffe Hospital, Oxford OX3 9DU, UK

18
Almeida D, Poeppel D, Corina D. The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning. LANGUAGE, COGNITION AND NEUROSCIENCE 2015; 31:361-374. [PMID: 27135041 PMCID: PMC4849140 DOI: 10.1080/23273798.2015.1100315] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/18/2015] [Accepted: 09/16/2015] [Indexed: 05/29/2023]
Abstract
The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied by a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data demonstrate that the perceptual tuning that underlies the discrimination of language and non-language information is not limited to spoken languages but extends to languages expressed in the visual modality.
Affiliation(s)
- Diogo Almeida
- Division of Sciences, Psychology program, New York University – Abu Dhabi, Abu Dhabi, UAE
- David Poeppel
- Department of Psychology, New York University, New York, NY, USA
- Department of Neuroscience, Max-Planck-Institute (MPIEA), Frankfurt, Germany
- David Corina
- Department of Linguistics and the Center for Mind and Brain, University of California, Davis, CA, USA

19
Williams JT, Darcy I, Newman SD. Modality-independent neural mechanisms for novel phonetic processing. Brain Res 2015; 1620:107-15. [DOI: 10.1016/j.brainres.2015.05.014] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/13/2015] [Revised: 04/28/2015] [Accepted: 05/11/2015] [Indexed: 01/20/2023]
20
Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture. Proc Natl Acad Sci U S A 2015; 112:11684-9. [PMID: 26283352 DOI: 10.1073/pnas.1510527112] [Citation(s) in RCA: 51] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system, gesture, further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages, supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network, demonstrating an influence of experience on the perception of nonlinguistic stimuli.
21
Gallese V, Gernsbacher MA, Heyes C, Hickok G, Iacoboni M. Mirror Neuron Forum. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2015; 6:369-407. [PMID: 25520744 DOI: 10.1177/1745691611413392] [Citation(s) in RCA: 106] [Impact Index Per Article: 11.8] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Affiliation(s)
- Vittorio Gallese
- Department of Neuroscience, University of Parma, and Italian Institute of Technology Brain Center for Social and Motor Cognition, Parma, Italy
- Cecilia Heyes
- All Souls College and Department of Experimental Psychology, University of Oxford, United Kingdom
- Gregory Hickok
- Center for Cognitive Neuroscience, Department of Cognitive Sciences, University of California, Irvine
- Marco Iacoboni
- Ahmanson-Lovelace Brain Mapping Center, Department of Psychiatry and Biobehavioral Sciences, Semel Institute for Neuroscience and Social Behavior, Brain Research Institute, David Geffen School of Medicine, University of California, Los Angeles

22
Bilingualism alters brain functional connectivity between “control” regions and “language” regions: Evidence from bimodal bilinguals. Neuropsychologia 2015; 71:236-47. [DOI: 10.1016/j.neuropsychologia.2015.04.007] [Citation(s) in RCA: 47] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2014] [Revised: 03/21/2015] [Accepted: 04/04/2015] [Indexed: 01/12/2023]
23
Morett LM. Lending a hand to signed language acquisition: Enactment and iconicity enhance sign recall in hearing adult American Sign Language learners. JOURNAL OF COGNITIVE PSYCHOLOGY 2015. [DOI: 10.1080/20445911.2014.999684] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/24/2022]
24
Helmich I, Skomroch H, Lausberg H. Neuropsychological functions of hand movements and gestures change in the presence or absence of speech. JOURNAL OF COGNITIVE PSYCHOLOGY 2014. [DOI: 10.1080/20445911.2014.961925] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/24/2022]
25
Linking neocortical, cognitive, and genetic variability in autism with alterations of brain plasticity: the Trigger-Threshold-Target model. Neurosci Biobehav Rev 2014; 47:735-52. [PMID: 25155242 DOI: 10.1016/j.neubiorev.2014.07.012] [Citation(s) in RCA: 45] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2013] [Revised: 07/02/2014] [Accepted: 07/12/2014] [Indexed: 11/23/2022]
Abstract
The phenotype of autism involves heterogeneous adaptive traits (strengths vs. disabilities), different domains of alterations (social vs. non-social), and various associated genetic conditions (syndromic vs. nonsyndromic autism). Three observations suggest that alterations in experience-dependent plasticity are an etiological factor in autism: (1) the main cognitive domains enhanced in autism are controlled by the most plastic cortical brain regions, the multimodal association cortices; (2) autism and sensory deprivation share several features of cortical and functional reorganization; and (3) genetic mutations and/or environmental insults involved in autism all appear to affect developmental synaptic plasticity, and mostly lead to its upregulation. We present the Trigger-Threshold-Target (TTT) model of autism to organize these findings. In this model, genetic mutations trigger brain reorganization in individuals with a low plasticity threshold, mostly within regions sensitive to cortical reallocations. These changes account for the cognitive enhancements and reduced social expertise associated with autism. Enhanced but normal plasticity may underlie non-syndromic autism, whereas syndromic autism may occur when a triggering mutation or event produces an altered plastic reaction, also resulting in intellectual disability and dysmorphism in addition to autism. Differences in the target of brain reorganization (perceptual vs. language regions) account for the main autistic subgroups. In light of this model, future research should investigate how individual and sex-related differences in synaptic/regional brain plasticity influence the occurrence of autism.
26
García RR, Zamorano F, Aboitiz F. From imitation to meaning: circuit plasticity and the acquisition of a conventionalized semantics. Front Hum Neurosci 2014; 8:605. [PMID: 25152726 PMCID: PMC4126550 DOI: 10.3389/fnhum.2014.00605] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2013] [Accepted: 07/20/2014] [Indexed: 12/04/2022] Open
Abstract
The capacity for language is arguably the most remarkable innovation of the human brain. A relatively recent interpretation prescribes that part of the language-related circuits were co-opted from circuitry involved in hand control: the mirror neuron system (MNS), involved in both the perception and the execution of voluntary grasping actions. A less radical view is that in early humans, communication was opportunistic and multimodal, using signs, vocalizations or whatever means available to transmit social information. However, one point that is not yet clear under either perspective is how learned communication acquired a semantic property, thereby allowing us to name objects and eventually describe our surrounding environment. Here we suggest a scenario involving both manual gestures and learned vocalizations that led to the development of a primitive form of conventionalized reference. This proposal is based on comparative evidence gathered from other species and on neurolinguistic evidence in humans, which points to a crucial role for vocal learning in the early development of language. Firstly, the capacity to direct the attention of others to a common object may have been crucial for developing a consensual referential system. Pointing, which is a ritualized grasping gesture, may have been crucial to this end. Vocalizations also served to generate joint attention among conversants, especially when combined with gaze direction. Another contributing element was the development of pantomimic actions resembling events or animals. In conjunction with this mimicry, the development of plastic neural circuits that support complex, learned vocalizations was probably a significant factor in the evolution of conventionalized semantics in our species. Thus, vocal imitations of sounds, as in onomatopoeias (words whose sound resembles their meaning), are possibly supported by mirror system circuits, and may have been relevant in the acquisition of early meanings.
Affiliation(s)
- Ricardo R. García
- Centro de Estudios Cognitivos, Facultad de Filosofía y Humanidades, Universidad de Chile, Santiago, Chile
- Francisco Zamorano
- División de Neurociencia, Centro de Investigación en Complejidad Social, Facultad de Gobierno, Universidad del Desarrollo, Santiago, Chile
- Francisco Aboitiz
- Departamento de Psiquiatría, Escuela de Medicina, y Centro Interdisciplinario de Neurociencia, Pontificia Universidad Católica de Chile, Santiago, Chile

27
Malaia E, Talavage TM, Wilbur RB. Functional connectivity in task-negative network of the Deaf: effects of sign language experience. PeerJ 2014; 2:e446. [PMID: 25024915 PMCID: PMC4081178 DOI: 10.7717/peerj.446] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2014] [Accepted: 06/02/2014] [Indexed: 01/23/2023] Open
Abstract
Prior studies investigating cortical processing in Deaf signers suggest that life-long experience with sign language and/or auditory deprivation may alter the brain's anatomical structure and the function of brain regions typically recruited for auditory processing (Emmorey et al., 2010; Pénicaud et al., 2013 inter alia). We report the first investigation of the task-negative network in Deaf signers and its functional connectivity, that is, the temporal correlations among spatially remote neurophysiological events. We show that Deaf signers manifest increased functional connectivity between the posterior cingulate/precuneus and the left medial temporal gyrus (MTG), and also between the inferior parietal lobe and the medial temporal gyrus in the right hemisphere, areas that have been found to show functional recruitment specifically during sign language processing. These findings suggest that the organization of the brain at the level of inter-network connectivity is likely affected by experience with processing visual language, although sensory deprivation could be another source of the difference. We hypothesize that connectivity alterations in the task-negative network reflect predictive/automatized processing of the visual signal.
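Functional connectivity in this sense reduces to a correlation between regional time courses. A minimal sketch with simulated signals standing in for the two regions (the time series, their length, and the noise levels are all invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical resting-state time series (240 volumes) for two regions;
# real signals would be mean BOLD extracted from posterior
# cingulate/precuneus and MTG masks.
n_vols = 240
shared = rng.normal(size=n_vols)      # common fluctuation driving both regions
pcc = shared + rng.normal(size=n_vols)
mtg = shared + rng.normal(size=n_vols)

# Functional connectivity as the temporal correlation between regions,
# Fisher z-transformed so that values can be compared across groups.
r, p = stats.pearsonr(pcc, mtg)
z = float(np.arctanh(r))
print(f"r = {r:.3f}, Fisher z = {z:.3f}, p = {p:.2e}")
```

Group differences in connectivity are then typically tested on the Fisher z values, since these are approximately normally distributed.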
Affiliation(s)
- Evie Malaia
- Center for Mind, Brain, and Education, University of Texas at Arlington, TX, USA
- Thomas M Talavage
- Weldon School of Biomedical Engineering, Purdue University, IN, USA; School of Electrical and Computer Engineering, Purdue University, IN, USA
- Ronnie B Wilbur
- Speech, Language, and Hearing Sciences, and Linguistics Program, Purdue University, IN, USA

28
Van Overwalle F, Baetens K, Mariën P, Vandekerckhove M. Social cognition and the cerebellum: a meta-analysis of over 350 fMRI studies. Neuroimage 2013; 86:554-72. [PMID: 24076206 DOI: 10.1016/j.neuroimage.2013.09.033] [Citation(s) in RCA: 306] [Impact Index Per Article: 27.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2013] [Revised: 09/02/2013] [Accepted: 09/12/2013] [Indexed: 01/31/2023] Open
Abstract
This meta-analysis explores the role of the cerebellum in social cognition. Recent meta-analyses of neuroimaging studies since 2008 demonstrate that the cerebellum is only marginally involved in social cognition and emotionality, with a few meta-analyses pointing to an involvement of at most 54% of the individual studies. In this study, novel meta-analyses of over 350 fMRI studies, dividing the domain of social cognition into homogeneous subdomains, confirmed this low involvement of the cerebellum in conditions that trigger the mirror network (e.g., when familiar movements of body parts are observed) and the mentalizing network (when no moving body parts or unfamiliar movements are present). There is, however, one set of mentalizing conditions that strongly involves the cerebellum in 50-100% of the individual studies: namely, when the level of abstraction is high, such as when behaviors are described in terms of traits or permanent characteristics, in terms of groups rather than individuals, in terms of the past (episodic autobiographic memory) or the future rather than the present, or in terms of hypothetical events that may happen. An activation likelihood estimation (ALE) meta-analysis conducted in this study reveals that the cerebellum is critically implicated in social cognition and that the areas of the cerebellum that are consistently involved in social cognitive processes show extensive overlap with the areas involved in sensorimotor functioning (during mirror and self-judgment tasks) as well as in executive functioning (across all tasks). We discuss the role of the cerebellum in social cognition in general and in higher-abstraction mentalizing in particular. We also point out a number of methodological limitations of some available studies on the social brain that hamper the detection of cerebellar activity.
Affiliation(s)
- Frank Van Overwalle
- Faculty of Psychology and Educational Sciences, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium.
- Kris Baetens
- Faculty of Psychology and Educational Sciences, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
- Peter Mariën
- Faculty of Arts, Department of Clinical and Experimental Neurolinguistics, CLIN, Vrije Universiteit Brussel, Pleinlaan 2, B-1050 Brussels, Belgium; Department of Neurology and Memory Clinic, ZNA Middelheim Hospital, Lindendreef 1, B-2020 Antwerp, Belgium
- Marie Vandekerckhove
- Faculty of Psychology and Educational Sciences, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
29
Straube B, He Y, Steines M, Gebhardt H, Kircher T, Sammer G, Nagels A. Supramodal neural processing of abstract information conveyed by speech and gesture. Front Behav Neurosci 2013; 7:120. [PMID: 24062652] [PMCID: PMC3772311] [DOI: 10.3389/fnbeh.2013.00120]
Abstract
Abstractness and modality of interpersonal communication have a considerable impact on comprehension. They are relevant for determining thoughts and constituting internal models of the environment. Whereas concrete object-related information can be represented in mind irrespective of language, abstract concepts require a representation in speech. Consequently, modality-independent processing of abstract information can be expected. Here we investigated the neural correlates of abstractness (abstract vs. concrete) and modality (speech vs. gestures), to identify an abstractness-specific supramodal neural network. During fMRI data acquisition 20 participants were presented with videos of an actor either speaking sentences with an abstract-social [AS] or concrete-object-related content [CS], or performing meaningful abstract-social emblematic [AG] or concrete-object-related tool-use gestures [CG]. Gestures were accompanied by a foreign language to increase the comparability between conditions and to frame the communication context of the gesture videos. Participants performed a content judgment task referring to the person vs. object-relatedness of the utterances. The behavioral data suggest a comparable comprehension of contents communicated by speech or gesture. Furthermore, we found common neural processing for abstract information independent of modality (AS > CS ∩ AG > CG) in a left hemispheric network including the left inferior frontal gyrus (IFG), temporal pole, and medial frontal cortex. Modality-specific activations were found in bilateral occipital, parietal, and temporal as well as right inferior frontal brain regions for gesture (G > S), and in left anterior temporal regions and the left angular gyrus for the processing of speech semantics (S > G). These data support the idea that abstract concepts are represented in a supramodal manner. Consequently, gestures referring to abstract concepts are processed in a predominantly left hemispheric, language-related neural network.
Affiliation(s)
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Marburg, Germany
30
Emmorey K. The neurobiology of sign language and the mirror system hypothesis. Language and Cognition 2013; 5:205-210. [PMID: 24707322] [PMCID: PMC3972212] [DOI: 10.1515/langcog-2013-0014]
Abstract
I suggest two puzzles for the Mirror System Hypothesis. First, there is little evidence that mirror neuron populations for words or for signs exist in Broca's area, and a mirror system is not critical for either speech or sign perception. Damage to Broca's area (or to the mirror system for human action) does not result in deficits in sign or speech perception. Second, the gesticulations of speakers are highly integrated with speech, but pantomimes and modern protosigns (conventional gestures) are not co-expressive with speech, and they do not co-occur with speech. Further, signers also produce global, imagistic gesticulations with their mouths and bodies simultaneously while signing with their hands. The expanding spiral of protosign and protospeech does not predict the integrated and co-expressive nature of modern gestures produced by signers and speakers.
Affiliation(s)
- Karen Emmorey
- Lab for Language and Cognitive Neuroscience, 6495 Alvarado Road, Suite 200, San Diego, CA 92120, USA.
31
32
Straube B, Green A, Weis S, Kircher T. A supramodal neural network for speech and gesture semantics: an fMRI study. PLoS One 2012; 7:e51207. [PMID: 23226488] [PMCID: PMC3511386] [DOI: 10.1371/journal.pone.0051207]
Abstract
In a natural setting, speech is often accompanied by gestures. Like language, speech-accompanying iconic gestures convey semantic information to some extent. However, whether comprehension of the information contained in the auditory and visual modalities depends on the same or on different brain networks remains largely unknown. In this fMRI study, we aimed at identifying the cortical areas engaged in supramodal processing of semantic information. BOLD changes were recorded in 18 healthy right-handed male subjects watching video clips showing an actor who either performed speech (S, acoustic) or gestures (G, visual) in more (+) or less (−) meaningful varieties. In the experimental conditions, familiar speech or isolated iconic gestures were presented; during the visual control condition the volunteers watched meaningless gestures (G−), while during the acoustic control condition a foreign language was presented (S−). The conjunction of the visual and acoustic semantic processing revealed activations extending from the left inferior frontal gyrus to the precentral gyrus, and included bilateral posterior temporal regions. We conclude that proclaiming this frontotemporal network the brain's core language system is to take too narrow a view. Our results rather indicate that these regions constitute a supramodal semantic processing network.
Affiliation(s)
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, Marburg, Germany.
33
Quandt LC, Marshall PJ, Shipley TF, Beilock SL, Goldin-Meadow S. Sensitivity of alpha and beta oscillations to sensorimotor characteristics of action: an EEG study of action production and gesture observation. Neuropsychologia 2012; 50:2745-2751. [PMID: 22910276] [DOI: 10.1016/j.neuropsychologia.2012.08.005]
Abstract
The sensorimotor experiences we gain when performing an action have been found to influence how our own motor systems are activated when we observe others performing that same action. Here we asked whether this phenomenon applies to the observation of gesture. Would the sensorimotor experiences we gain when performing an action on an object influence activation in our own motor systems when we observe others performing a gesture for that object? Participants were given sensorimotor experience with objects that varied in weight, and then observed video clips of an actor producing gestures for those objects. Electroencephalography (EEG) was recorded while participants first observed either an iconic gesture (pantomiming lifting an object) or a deictic gesture (pointing to an object) for an object, and then grasped and lifted the object indicated by the gesture. We analyzed EEG during gesture observation to determine whether oscillatory activity was affected by the observer's sensorimotor experiences with the object represented in the gesture. Seeing a gesture for an object previously experienced as light was associated with a suppression of power in alpha and beta frequency bands, particularly at posterior electrodes. A similar pattern was found when participants lifted the light object, but over more diffuse electrodes. Moreover, alpha and beta bands at right parieto-occipital electrodes were sensitive to the type of gesture observed (iconic vs. deictic). These results demonstrate that sensorimotor experience with an object affects how a gesture for that object is processed, as measured by the gesture-observer's EEG, and suggest that different types of gestures recruit the observer's own motor system in different ways.
Affiliation(s)
- Lorna C Quandt
- Temple University, Department of Psychology, 1701 North 13th Street, Philadelphia, PA 19122, USA.
- Peter J Marshall
- Temple University, Department of Psychology, 1701 North 13th Street, Philadelphia, PA 19122, USA
- Thomas F Shipley
- Temple University, Department of Psychology, 1701 North 13th Street, Philadelphia, PA 19122, USA
- Sian L Beilock
- The University of Chicago, Department of Psychology, 5848 South University Avenue, Chicago, IL 60637, USA
- Susan Goldin-Meadow
- The University of Chicago, Department of Psychology, 5848 South University Avenue, Chicago, IL 60637, USA
34
Grosvald M, Gutierrez E, Hafer S, Corina D. Dissociating linguistic and non-linguistic gesture processing: electrophysiological evidence from American Sign Language. Brain Lang 2012; 121:12-24. [PMID: 22341555] [PMCID: PMC3337787] [DOI: 10.1016/j.bandl.2012.01.005]
Abstract
A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a "frame" (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a "last item" belonging to one of four categories: a high-close-probability sign (a "semantically reasonable" completion to the sentence; e.g. BED), a low-close-probability sign (a real sign that is nonetheless a "semantically odd" completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity.
Affiliation(s)
- Michael Grosvald
- Department of Neurology, University of California at Irvine, Orange, CA 92868-4280, United States.
35
Corina DP, Grosvald M. Exploring perceptual processing of ASL and human actions: effects of inversion and repetition priming. Cognition 2012; 122:330-45. [PMID: 22153323] [PMCID: PMC3259190] [DOI: 10.1016/j.cognition.2011.10.011]
Abstract
In this paper, we compare responses of deaf signers and hearing non-signers engaged in a categorization task of signs and non-linguistic human actions. We examine the time it takes to make such categorizations under conditions of 180° stimulus inversion and as a function of repetition priming, in an effort to understand whether the processing of sign language forms draws upon special processing mechanisms or makes use of mechanisms used in recognition of non-linguistic human actions. Our data show that deaf signers were much faster in the categorization of both linguistic and non-linguistic actions, and relative to hearing non-signers, show evidence that they were more sensitive to the configural properties of signs. Our study suggests that sign expertise may lead to modifications of a general-purpose human action recognition system rather than evoking a qualitatively different mode of processing, and supports the contention that signed languages make use of perceptual systems through which humans understand or parse human actions and gestures more generally.
Affiliation(s)
- David P Corina
- Department of Linguistics, Center for Mind and Brain, University of California, Davis, Davis, CA 95618, United States.
36
Molinari E, Baraldi P, Campanella M, Duzzi D, Nocetti L, Pagnoni G, Porro CA. Human parietofrontal networks related to action observation detected at rest. Cereb Cortex 2012; 23:178-86. [DOI: 10.1093/cercor/bhr393]
37
Higuchi S, Holle H, Roberts N, Eickhoff S, Vogt S. Imitation and observational learning of hand actions: Prefrontal involvement and connectivity. Neuroimage 2012; 59:1668-83. [DOI: 10.1016/j.neuroimage.2011.09.021]
38
Hétu S, Mercier C, Eugène F, Michon PE, Jackson PL. Modulation of brain activity during action observation: influence of perspective, transitivity and meaningfulness. PLoS One 2011; 6:e24728. [PMID: 21931832] [PMCID: PMC3171468] [DOI: 10.1371/journal.pone.0024728]
Abstract
The coupling process between observed and performed actions is thought to be performed by a fronto-parietal perception-action system including regions of the inferior frontal gyrus and the inferior parietal lobule. When investigating the influence of the movements' characteristics on this process, most research on action observation has focused on only one particular variable even though the type of movements we observe can vary on several levels. By manipulating the visual perspective, transitivity and meaningfulness of observed movements in a functional magnetic resonance imaging study we aimed at investigating how the type of movements and the visual perspective can modulate brain activity during action observation in healthy individuals. Importantly, we used an active observation task where participants had to subsequently execute or imagine the observed movements. Our results show that the fronto-parietal regions of the perception action system were mostly recruited during the observation of meaningless actions while visual perspective had little influence on the activity within the perception-action system. Simultaneous investigation of several sources of modulation during active action observation is probably an approach that could lead to a greater ecological comprehension of this important sensorimotor process.
Affiliation(s)
- Sébastien Hétu
- Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale, Québec City, Québec, Canada
- École de Psychologie, Faculté des Sciences Sociales, Université Laval, Québec City, Québec, Canada
- Catherine Mercier
- Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale, Québec City, Québec, Canada
- Département de Réadaptation, Faculté de Médecine, Université Laval, Québec City, Québec, Canada
- Fanny Eugène
- Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale, Québec City, Québec, Canada
- Pierre-Emmanuel Michon
- Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale, Québec City, Québec, Canada
- Philip L. Jackson
- Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale, Québec City, Québec, Canada
- École de Psychologie, Faculté des Sciences Sociales, Université Laval, Québec City, Québec, Canada
- Centre de Recherche Université Laval Robert-Giffard, Québec City, Québec, Canada
39
Emmorey K, McCullough S, Mehta S, Ponto LLB, Grabowski TJ. Sign language and pantomime production differentially engage frontal and parietal cortices. Lang Cogn Process 2011; 26:878-901. [PMID: 21909174] [DOI: 10.1080/01690965.2010.492643]
Abstract
We investigated the functional organization of neural systems supporting language production when the primary language articulators are also used for meaningful, but non-linguistic, expression such as pantomime. Fourteen hearing non-signers and 10 deaf native users of American Sign Language (ASL) participated in an H(2) (15)O-PET study in which they generated action pantomimes or ASL verbs in response to pictures of tools and manipulable objects. For pantomime generation, participants were instructed to "show how you would use the object." For verb generation, signers were asked to "generate a verb related to the object." The objects for this condition were selected to elicit handling verbs that resemble pantomime (e.g., TO-HAMMER (hand configuration and movement mimic the act of hammering) and non-handling verbs that do not (e.g., POUR-SYRUP, produced with a "Y" handshape). For the baseline task, participants viewed pictures of manipulable objects and an occasional non-manipulable object and decided whether the objects could be handled, gesturing "yes" (thumbs up) or "no" (hand wave). Relative to baseline, generation of ASL verbs engaged left inferior frontal cortex, but when non-signers produced pantomimes for the same objects, no frontal activation was observed. Both groups recruited left parietal cortex during pantomime production. However, for deaf signers the activation was more extensive and bilateral, which may reflect a more complex and integrated neural representation of hand actions. We conclude that the production of pantomime versus ASL verbs (even those that resemble pantomime) engage partially segregated neural systems that support praxic versus linguistic functions.
40
Behmer LP, Jantzen KJ. Reading sheet music facilitates sensorimotor mu-desynchronization in musicians. Clin Neurophysiol 2011; 122:1342-7. [DOI: 10.1016/j.clinph.2010.12.035]
41
Hickok G, Houde J, Rong F. Sensorimotor integration in speech processing: computational basis and neural organization. Neuron 2011; 69:407-22. [PMID: 21315253] [DOI: 10.1016/j.neuron.2011.01.019]
Abstract
Sensorimotor integration is an active domain of speech research and is characterized by two main ideas, that the auditory system is critically involved in speech production and that the motor system is critically involved in speech perception. Despite the complementarity of these ideas, there is little crosstalk between these literatures. We propose an integrative model of the speech-related "dorsal stream" in which sensorimotor interaction primarily supports speech production, in the form of a state feedback control architecture. A critical component of this control system is forward sensory prediction, which affords a natural mechanism for limited motor influence on perception, as recent perceptual research has suggested. Evidence shows that this influence is modulatory but not necessary for speech perception. The neuroanatomy of the proposed circuit is discussed as well as some probable clinical correlates including conduction aphasia, stuttering, and aspects of schizophrenia.
Affiliation(s)
- Gregory Hickok
- Center for Cognitive Neuroscience, Center for Hearing Research, Department of Cognitive Sciences, University of California, Irvine, CA 92697, USA.
42
Emmorey K, Xu J, Braun A. Neural responses to meaningless pseudosigns: evidence for sign-based phonetic processing in superior temporal cortex. Brain Lang 2011; 117:34-8. [PMID: 21094525] [PMCID: PMC3075318] [DOI: 10.1016/j.bandl.2010.10.003]
Abstract
To identify neural regions that automatically respond to linguistically structured, but meaningless manual gestures, 14 deaf native users of American Sign Language (ASL) and 14 hearing non-signers passively viewed pseudosigns (possible but non-existent ASL signs) and non-iconic ASL signs, in addition to a fixation baseline. For the contrast between pseudosigns and baseline, greater activation was observed in left posterior superior temporal sulcus (STS), but not in left inferior frontal gyrus (BA 44/45), for deaf signers compared to hearing non-signers, based on VOI analyses. We hypothesize that left STS is more engaged for signers because this region becomes tuned to human body movements that conform to the phonological constraints of sign language. For deaf signers, the contrast between pseudosigns and known ASL signs revealed increased activation for pseudosigns in left posterior superior temporal gyrus (STG) and in left inferior frontal cortex, but no regions were found to be more engaged for known signs than for pseudosigns. This contrast revealed no significant differences in activation for hearing non-signers. We hypothesize that left STG is involved in recognizing linguistic phonetic units within a dynamic visual or auditory signal, such that less familiar structural combinations produce increased neural activation in this region for both pseudosigns and pseudowords.
43
Noordzij ML, Newman-Norlund SE, de Ruiter JP, Hagoort P, Levinson SC, Toni I. Neural correlates of intentional communication. Front Neurosci 2010; 4:188. [PMID: 21151781] [PMCID: PMC2999989] [DOI: 10.3389/fnins.2010.00188]
Abstract
We know a great deal about the neurophysiological mechanisms supporting instrumental actions, i.e., actions designed to alter the physical state of the environment. In contrast, little is known about our ability to select communicative actions, i.e., actions directly designed to modify the mental state of another agent. We have recently provided novel empirical evidence for a mechanism in which a communicator selects his actions on the basis of a prediction of the communicative intentions that an addressee is most likely to attribute to those actions. The main novelty of those findings was that this prediction of intention recognition is cerebrally implemented within the intention recognition system of the communicator, is modulated by the ambiguity in meaning of the communicative acts, and not by their sensorimotor complexity. The characteristics of this predictive mechanism support the notion that human communicative abilities are distinct from both sensorimotor and linguistic processes.
Affiliation(s)
- Matthijs L Noordzij
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
44
Braskie MN, Medina LD, Rodriguez-Agudelo Y, Geschwind DH, Macias-Islas MA, Cummings JL, Bookheimer SY, Ringman JM. Increased fMRI signal with age in familial Alzheimer's disease mutation carriers. Neurobiol Aging 2010; 33:424.e11-21. [PMID: 21129823] [DOI: 10.1016/j.neurobiolaging.2010.09.028]
Abstract
Although many Alzheimer's disease (AD) patients have a family history of the disease, it is rarely inherited in a predictable way. Functional magnetic resonance imaging (fMRI) studies of nondemented adults carrying familial AD mutations provide an opportunity to prospectively identify brain differences associated with early AD-related changes. We compared fMRI activity of 18 nondemented autosomal dominant AD mutation carriers with fMRI activity in eight of their noncarrier relatives as they performed a novelty encoding task in which they viewed novel and repeated images. Because age of disease onset is relatively consistent within families, we also correlated fMRI activity with subjects' distance from the median age of diagnosis for their family. Mutation carriers did not show significantly different voxelwise fMRI activity from noncarriers as a group. However, as they approached their family age of disease diagnosis, only mutation carriers showed increased fMRI activity in the fusiform and middle temporal gyri. This suggests that during novelty encoding, increased fMRI activity in the temporal lobe may relate to incipient AD processes.
Affiliation(s)
- Meredith N Braskie
- Mary S. Easton Center for Alzheimer's Disease Research, Department of Neurology, David Geffen School of Medicine at UCLA, Los Angeles, CA, USA.
45
Alaerts K, Swinnen SP, Wenderoth N. Action perception in individuals with congenital blindness or deafness: how does the loss of a sensory modality from birth affect perception-induced motor facilitation? J Cogn Neurosci 2010; 23:1080-7. [PMID: 20521855] [DOI: 10.1162/jocn.2010.21517]
Abstract
Seeing or hearing manual actions activates the mirror neuron system, that is, specialized neurons within motor areas which fire when an action is performed but also when it is passively perceived. Using TMS, it was shown that motor cortex of typically developed subjects becomes facilitated not only from seeing others' actions, but also from merely hearing action-related sounds. In the present study, TMS was used for the first time to explore the "auditory" and "visual" responsiveness of motor cortex in individuals with congenital blindness or deafness. TMS was applied over left primary motor cortex (M1) to measure cortico-motor facilitation while subjects passively perceived manual actions (either visually or aurally). Although largely unexpected, congenitally blind or deaf subjects displayed substantially lower resonant motor facilitation upon action perception compared to seeing/hearing control subjects. Moreover, muscle-specific changes in cortico-motor excitability within M1 appeared to be absent in individuals with profound blindness or deafness. Overall, these findings strongly argue against the hypothesis that an increased reliance on the remaining sensory modality in blind or deaf subjects is accompanied by an increased responsiveness of the "auditory" or "visual" perceptual-motor "mirror" system, respectively. Moreover, the apparent lack of resonant motor facilitation for the blind and deaf subjects may challenge the hypothesis of a unitary mirror system underlying human action recognition and may suggest that action perception in blind and deaf subjects engages a mode of action processing that is different from the human action recognition system recruited in typically developed subjects.
Affiliation(s)
- Kaat Alaerts
- Research Centre of Movement Control and Neuroplasticity, Department of Biomedical Kinesiology, Katholieke Universiteit Leuven, Belgium.
46
Malaia E, Wilbur RB. Early acquisition of sign language: What neuroimaging data tell us. Sign Language & Linguistics 2010; 13:183-199. [PMID: 21847357] [PMCID: PMC3155772] [DOI: 10.1075/sll.13.2.03mal]
Abstract
Early acquisition of a natural language, signed or spoken, has been shown to fundamentally impact both one's ability to use the first language, and the ability to learn subsequent languages later in life (Mayberry 2007, 2009). This review summarizes a number of recent neuroimaging studies in order to detail the neural bases of sign language acquisition. The logic of this review is to present research reports that contribute to the bigger picture showing that people who acquire a natural language, spoken or signed, in the normal way possess specialized linguistic abilities and brain functions that are missing or deficient in people whose exposure to natural language is delayed or absent. Comparing the function of each brain region with regards to the processing of spoken and sign languages, we attempt to clarify the role each region plays in language processing in general, and to outline the challenges and remaining questions in understanding language processing in the brain.
47
Symbolic gestures and spoken language are processed by a common neural system. Proc Natl Acad Sci U S A 2009; 106:20664-9. [PMID: 19923436] [DOI: 10.1073/pnas.0909197106]
Abstract
Symbolic gestures, such as pantomimes that signify actions (e.g., threading a needle) or emblems that facilitate social transactions (e.g., finger to lips indicating "be quiet"), play an important role in human communication. They are autonomous, can fully take the place of words, and function as complete utterances in their own right. The relationship between these gestures and spoken language remains unclear. We used functional MRI to investigate whether these two forms of communication are processed by the same system in the human brain. Responses to symbolic gestures, to their spoken glosses (expressing the gestures' meaning in English), and to visually and acoustically matched control stimuli were compared in a randomized block design. General Linear Models (GLM) contrasts identified shared and unique activations and functional connectivity analyses delineated regional interactions associated with each condition. Results support a model in which bilateral modality-specific areas in superior and inferior temporal cortices extract salient features from vocal-auditory and gestural-visual stimuli respectively. However, both classes of stimuli activate a common, left-lateralized network of inferior frontal and posterior temporal regions in which symbolic gestures and spoken words may be mapped onto common, corresponding conceptual representations. We suggest that these anterior and posterior perisylvian areas, identified since the mid-19th century as the core of the brain's language system, are not in fact committed to language processing, but may function as a modality-independent semiotic system that plays a broader role in human communication, linking meaning with symbols whether these are words, gestures, images, sounds, or objects.