1
Trettenbrein PC, Zaccarella E, Friederici AD. Functional and structural brain asymmetries in sign language processing. Handb Clin Neurol 2025; 208:327-350. [PMID: 40074405] [DOI: 10.1016/b978-0-443-15646-5.00021-x]
Abstract
The capacity for language constitutes a cornerstone of human cognition and distinguishes our species from other animals. Research in the cognitive sciences has demonstrated that this capacity is not bound to speech but can also be externalized in the form of sign language. Sign languages are the naturally occurring languages of the deaf and rely on movements and configurations of hands, arms, face, and torso in space. This chapter reviews the functional and structural organization of the neural substrates of sign language, as identified by neuroimaging research over the past decades. Most aspects of sign language processing in adult deaf signers markedly mirror the well-known functional left-lateralization of spoken and written language. However, both hemispheres exhibit a certain equipotentiality for processing linguistic information, and the right hemisphere seems to specifically support the processing of some constructions unique to the signed modality. Crucially, the so-called "core language network" in the left hemisphere constitutes a functional and structural asymmetry in typically developed deaf and hearing populations alike: This network (i) is pivotal for processing complex syntax independent of the modality of language use, (ii) matures in accordance with a genetically determined biologic matrix, and (iii) may have constituted an evolutionary prerequisite for the emergence of the human capacity for language.
Affiliation(s)
- Patrick C Trettenbrein
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; International Max Planck Research School on Neuroscience of Communication: Structure, Function, and Plasticity (IMPRS NeuroCom), Leipzig, Germany; Experimental Sign Language Laboratory (SignLab), Department of German Philology, University of Göttingen, Göttingen, Germany
- Emiliano Zaccarella
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Angela D Friederici
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
2
Haluts N, Levy D, Friedmann N. Bimodal aphasia and dysgraphia: Phonological output buffer aphasia and orthographic output buffer dysgraphia in spoken and sign language. Cortex 2025; 182:147-180. [PMID: 39672692] [DOI: 10.1016/j.cortex.2024.10.013]
Abstract
We report a case of crossmodal bilingual aphasia (aphasia in two modalities, spoken and sign language) and dysgraphia in both writing and fingerspelling. The patient, Sunny, was a 42-year-old woman who had sustained a left temporo-parietal stroke; she was a speaker of Hebrew, Romanian, and English, and an adult learner and daily user of Israeli Sign Language (ISL). We assessed Sunny's spoken and sign languages using a comprehensive test battery of naming, reading, and repetition tasks, and also analysed her spontaneous speech and sign. Her writing and fingerspelling were assessed using tasks of dictation, naming, and delayed copying. In spoken language production, Sunny showed a classical phonological output buffer (POB) impairment in naming, reading, repetition, and spontaneous production, with phonological errors (transpositions, substitutions, insertions, and omissions) in words and pseudo-words, and whole-unit errors in morphological affixes, function words, and number words, with a length effect. Importantly, her error pattern in ISL was remarkably similar in the parallel tasks, with phonological errors in signs and pseudo-signs, affecting all the phonological parameters of the sign (movement, handshape, location, and orientation), and whole-unit errors in morphemes, function signs, and number signs. Sunny's impairment was selective to the POB, without phonological input, semantic-conceptual, or syntactic deficits. This shows for the first time how POB impairment, a kind of conduction aphasia, manifests itself in a sign language, and indicates that the POB for sign language has the same cognitive architecture as the one for spoken language. It may also indicate similar neural underpinnings for spoken and sign languages. In writing, Sunny presents the first case of a selective type of dysgraphia in fingerspelling: orthographic (graphemic) output buffer dysgraphia. In both writing and fingerspelling, she made letter errors (letter transpositions, substitutions, insertions, and omissions) as well as morphological errors and errors in function words, and showed a length effect. Sunny's impairment was selective to the orthographic output buffer, whereas her reading, including orthographic input processing, was intact. This suggests that the orthographic output buffer is shared for writing and fingerspelling, at least in a late learner of sign language. The results shed further light on the architecture of phonological and orthographic production.
Affiliation(s)
- Neta Haluts
- Language and Brain Lab, Sagol School of Neuroscience, and School of Education, Tel Aviv University, Tel Aviv, Israel
- Doron Levy
- Language and Brain Lab, Sagol School of Neuroscience, and School of Education, Tel Aviv University, Tel Aviv, Israel
- Naama Friedmann
- Language and Brain Lab, Sagol School of Neuroscience, and School of Education, Tel Aviv University, Tel Aviv, Israel
3
Nematova S, Zinszer B, Morlet T, Morini G, Petitto LA, Jasińska KK. Impact of ASL Exposure on Spoken Phonemic Discrimination in Adult CI Users: A Functional Near-Infrared Spectroscopy Study. Neurobiol Lang (Camb) 2024; 5:553-588. [PMID: 38939730] [PMCID: PMC11210937] [DOI: 10.1162/nol_a_00143]
Abstract
We examined the impact of exposure to a signed language (American Sign Language, or ASL) at different ages on the neural systems that support spoken language phonemic discrimination in deaf individuals with cochlear implants (CIs). Deaf CI users (N = 18, age = 18-24 yrs) who were exposed to a signed language at different ages and hearing individuals (N = 18, age = 18-21 yrs) completed a phonemic discrimination task in a spoken native (English) and non-native (Hindi) language while undergoing functional near-infrared spectroscopy neuroimaging. Behaviorally, deaf CI users who received a CI early versus later in life showed better English phonemic discrimination, although their discrimination was poor relative to that of hearing individuals. Importantly, the age of exposure to ASL was not related to phonemic discrimination. Neurally, early-life language exposure, irrespective of modality, was associated with greater neural activation of left-hemisphere language areas critically involved in phonological processing during the phonemic discrimination task in deaf CI users. In particular, early exposure to ASL was associated with increased activation in the left hemisphere's classic language regions for native versus non-native language phonemic contrasts for deaf CI users who received a CI later in life. For deaf CI users who received a CI early in life, the age of exposure to ASL was not related to neural activation during phonemic discrimination. Together, the findings suggest that early signed language exposure does not negatively impact spoken language processing in deaf CI users, but may instead potentially offset the negative effects of the language deprivation that deaf children without any signed language exposure experience prior to implantation. This empirical evidence aligns with and lends support to recent perspectives regarding the impact of ASL exposure in the context of CI usage.
Affiliation(s)
- Shakhlo Nematova
- Department of Linguistics and Cognitive Science, University of Delaware, Newark, DE, USA
- Benjamin Zinszer
- Department of Psychology, Swarthmore College, Swarthmore, PA, USA
- Thierry Morlet
- Nemours Children’s Hospital, Delaware, Wilmington, DE, USA
- Giovanna Morini
- Department of Communication Sciences and Disorders, University of Delaware, Newark, DE, USA
- Laura-Ann Petitto
- Brain and Language Center for Neuroimaging, Gallaudet University, Washington, DC, USA
- Kaja K. Jasińska
- Department of Applied Psychology and Human Development, University of Toronto, Toronto, Ontario, Canada
4
Goldberg EB, Hillis AE. Sign language aphasia. Handb Clin Neurol 2022; 185:297-315. [PMID: 35078607] [DOI: 10.1016/b978-0-12-823384-9.00019-0]
Abstract
Signed languages are naturally occurring, fully formed linguistic systems that rely on the movement of the hands, arms, torso, and face within a sign space for production, and are perceived predominantly using visual perception. Despite stark differences in modality and linguistic structure, their functional neural organization is strikingly similar to that of spoken language. Generally speaking, left frontal areas support sign production, and regions in the auditory cortex underlie sign comprehension, despite signers not relying on audition to process language. Given this, should a deaf or hearing signer suffer damage to the left cerebral hemisphere, language is vulnerable to impairment. Multiple cases of sign language aphasia have been documented following left hemisphere injury, and the general pattern of linguistic deficits mirrors that observed in spoken language. The right hemisphere likely plays a role in non-linguistic but critical visuospatial functions of sign language; therefore, individuals who are spared damage to the left hemisphere but suffer injury to the right are at risk for a different set of communication deficits. In this chapter, we review the neurobiology of sign language and the patterns of language deficits that follow brain injury in the deaf signing population.
Affiliation(s)
- Emily B Goldberg
- Department of Neurology, Johns Hopkins University School of Medicine, Baltimore, MD, United States
- Argye Elizabeth Hillis
- Department of Neurology, Johns Hopkins University School of Medicine, Baltimore, MD, United States; Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, MD, United States; Department of Cognitive Science, Johns Hopkins University, Baltimore, MD, United States
5
Andin J, Holmer E, Schönström K, Rudner M. Working Memory for Signs with Poor Visual Resolution: fMRI Evidence of Reorganization of Auditory Cortex in Deaf Signers. Cereb Cortex 2021; 31:3165-3176. [PMID: 33625498] [PMCID: PMC8196262] [DOI: 10.1093/cercor/bhaa400]
Abstract
Stimulus degradation adds to working memory load during speech processing. We investigated whether this applies to sign processing and, if so, whether the mechanism implicates the secondary auditory cortex. We conducted an fMRI experiment in which 16 deaf early signers (DES) and 22 hearing non-signers performed a sign-based n-back task with three load levels and stimuli presented at high and low resolution. We found decreased behavioral performance with increasing load and decreasing visual resolution, but the neurobiological mechanisms involved differed between the two manipulations, in both groups. Importantly, while the load manipulation was, as predicted, accompanied by activation in the frontoparietal working memory network, the resolution manipulation resulted in temporal and occipital activation. Furthermore, we found evidence of cross-modal reorganization in the secondary auditory cortex: DES had stronger activation in this region and stronger connectivity between it and several other regions. We conclude that load and stimulus resolution have different neural underpinnings in the visual-verbal domain, which has consequences for current working memory models, and that for DES the secondary auditory cortex is involved in the binding of representations when task demands are low.
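For readers unfamiliar with the paradigm named above, the sketch below illustrates how an n-back stimulus sequence is typically constructed: a trial is a target when its stimulus repeats the one shown n trials earlier. This is a toy illustration in Python under our own assumptions (placeholder item labels, a 30% target rate), not the authors' stimulus code.

```python
import random

def make_nback_sequence(items, length, n, target_rate=0.3, seed=0):
    """Build a list of (stimulus, is_target) trials for an n-back task."""
    rng = random.Random(seed)
    trials = []
    for i in range(length):
        if i >= n and rng.random() < target_rate:
            stim = trials[i - n][0]          # force a repeat -> target trial
        else:
            stim = rng.choice(items)         # may still repeat by chance
        is_target = i >= n and stim == trials[i - n][0]
        trials.append((stim, is_target))
    return trials

# Toy run with placeholder "signs" at the study's three load levels (n = 1, 2, 3);
# targets are marked with an asterisk
for n in (1, 2, 3):
    seq = make_nback_sequence(["SIGN_A", "SIGN_B", "SIGN_C", "SIGN_D"], 10, n)
    print(n, ["*" + s if hit else s for s, hit in seq])
```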
Affiliation(s)
- Josefine Andin
- Department of Behavioural Science and Learning, Linköping University, Linköping, Sweden; Swedish Institute for Disability Research, Linnaeus Centre HEAD, Sweden
- Emil Holmer
- Department of Behavioural Science and Learning, Linköping University, Linköping, Sweden; Swedish Institute for Disability Research, Linnaeus Centre HEAD, Sweden; Center for Medical Image Science and Visualization, Linköping, Sweden
- Mary Rudner
- Department of Behavioural Science and Learning, Linköping University, Linköping, Sweden; Swedish Institute for Disability Research, Linnaeus Centre HEAD, Sweden; Center for Medical Image Science and Visualization, Linköping, Sweden
6
Cheng Q, Silvano E, Bedny M. Sensitive periods in cortical specialization for language: insights from studies with Deaf and blind individuals. Curr Opin Behav Sci 2020; 36:169-176. [PMID: 33718533] [PMCID: PMC7945734] [DOI: 10.1016/j.cobeha.2020.10.011]
Abstract
Studies with Deaf and blind individuals demonstrate that linguistic and sensory experiences during sensitive periods have potent effects on the neurocognitive basis of language. Native users of sign and spoken languages recruit similar fronto-temporal systems during language processing. By contrast, delays in sign language access impact proficiency and the neural basis of language. Analogously, early-onset but not late-onset blindness modifies the neural basis of language. People born blind recruit 'visual' areas during language processing, show reduced left-lateralization of language, and exhibit enhanced performance on some language tasks. Sensitive-period plasticity in and outside fronto-temporal language systems shapes the neural basis of language.
Affiliation(s)
- Qi Cheng
- University of California San Diego
- University of Washington
- Emily Silvano
- Federal University of Rio de Janeiro
- Johns Hopkins University
7
Trettenbrein PC, Papitto G, Friederici AD, Zaccarella E. Functional neuroanatomy of language without speech: An ALE meta-analysis of sign language. Hum Brain Mapp 2020; 42:699-712. [PMID: 33118302] [PMCID: PMC7814757] [DOI: 10.1002/hbm.25254]
Abstract
Sign language (SL) conveys linguistic information using gestures instead of sounds. Here, we apply a meta-analytic estimation approach to neuroimaging studies (N = 23; subjects = 316) and ask whether SL comprehension in deaf signers relies on the same primarily left-hemispheric cortical network implicated in spoken and written language (SWL) comprehension in hearing speakers. We show that: (a) SL recruits bilateral fronto-temporo-occipital regions with strong left-lateralization in the posterior inferior frontal gyrus known as Broca's area, mirroring functional asymmetries observed for SWL. (b) Within this SL network, Broca's area constitutes a hub which attributes abstract linguistic information to gestures. (c) SL-specific voxels in Broca's area are also crucially involved in SWL, as confirmed by meta-analytic connectivity modeling using an independent large-scale neuroimaging database. This strongly suggests that the human brain evolved a lateralized language network with a supramodal hub in Broca's area which computes linguistic information independent of speech.
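As background for the method named in the title: activation likelihood estimation (ALE) combines modeled activation (MA) maps, one per experiment, into a voxelwise union. The snippet below is a minimal sketch of that combination step only, run on made-up toy values; real pipelines (e.g., GingerALE) first model each reported focus as a 3-D Gaussian and then perform permutation-based inference, none of which is shown here.

```python
import numpy as np

def ale_union(ma_maps: np.ndarray) -> np.ndarray:
    """Voxelwise ALE score from per-experiment modeled activation maps:
    ALE = 1 - prod_i (1 - MA_i)."""
    return 1.0 - np.prod(1.0 - ma_maps, axis=0)

# Two hypothetical MA maps over a toy one-dimensional "brain" of five voxels
ma = np.array([[0.0, 0.2, 0.5, 0.2, 0.0],
               [0.0, 0.0, 0.4, 0.3, 0.1]])
print(ale_union(ma))  # -> [0.  0.2  0.7  0.44  0.1]
```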
Affiliation(s)
- Patrick C Trettenbrein
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; International Max Planck Research School on Neuroscience of Communication: Structure, Function, and Plasticity (IMPRS NeuroCom), Leipzig, Germany
- Giorgio Papitto
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; International Max Planck Research School on Neuroscience of Communication: Structure, Function, and Plasticity (IMPRS NeuroCom), Leipzig, Germany
- Angela D Friederici
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Emiliano Zaccarella
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
8
Canjels LP, Backes WH, van Veenendaal TM, Vlooswijk MC, Hofman PA, Aldenkamp AP, Rouhl RP, Jansen JF. Volumetric and Functional Activity Lateralization in Healthy Subjects and Patients with Focal Epilepsy: Initial Findings in a 7T MRI Study. J Neuroimaging 2020; 30:666-673. [PMID: 32472965] [PMCID: PMC7586826] [DOI: 10.1111/jon.12739]
Abstract
BACKGROUND AND PURPOSE: In 30% of patients with focal epilepsy, an epileptogenic lesion cannot be visually detected with structural MRI. Ultra-high-field MRI may be able to identify subtle pathology related to the epileptic focus. We set out to assess 7T MRI-derived volumetric and functional activity lateralization of the hippocampus, hippocampal subfields, and temporal and frontal lobes in healthy subjects and MRI-negative patients with focal epilepsy. METHODS: Twenty controls and 10 patients with MRI-negative temporal or frontal lobe epilepsy (TLE and FLE, respectively) underwent a 7T MRI exam. T1-weighted imaging and resting-state fMRI were performed. T1-weighted images were segmented to yield volumes, while from the fMRI data the fractional amplitude of low-frequency fluctuations was calculated. Subsequently, volumetric and functional lateralization was calculated from left-right asymmetry. RESULTS: In controls, volumetric lateralization was symmetric, with a slight asymmetry of the hippocampus and subiculum, while functional lateralization consistently showed symmetry. By contrast, in epilepsy patients, regions were less symmetric. In TLE patients with a known focus, volumetric lateralization in the hippocampus and hippocampal subfields was indicative of smaller ipsilateral volumes. These patients also showed clear functional lateralization, though not consistently ipsilateral or contralateral to the epileptic focus. TLE patients with an unknown focus showed an obvious volumetric lateralization, facilitating the localization of the epileptic focus. Lateralization results in the FLE patients were less consistent with the epileptic focus. CONCLUSION: MRI-derived volume and fluctuation amplitude are highly symmetric in controls, whereas in TLE, volumetric and functional lateralization effects were observed. This highlights the potential of the technique.
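The abstract does not spell out its asymmetry formula; the conventional lateralization index used for such left-right comparisons can be sketched as follows (a minimal illustration with made-up volumes, not the authors' code):

```python
def lateralization_index(left: float, right: float) -> float:
    """Conventional asymmetry index: (L - R) / (L + R).
    Ranges from +1 (fully left-lateralized) to -1 (fully right-lateralized);
    values near 0 indicate symmetry."""
    return (left - right) / (left + right)

# Hypothetical left/right hippocampal volumes in mm^3
print(lateralization_index(3100.0, 2900.0))  # ~0.033, i.e., near-symmetric
```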
Affiliation(s)
- Lisanne P.W. Canjels
- Departments of Radiology and Nuclear Medicine, Maastricht University Medical Center, Maastricht, The Netherlands
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Walter H. Backes
- Departments of Radiology and Nuclear Medicine, Maastricht University Medical Center, Maastricht, The Netherlands
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- School for Cardiovascular Disorders, Maastricht University, Maastricht, The Netherlands
- Tamar M. van Veenendaal
- Departments of Radiology and Nuclear Medicine, Maastricht University Medical Center, Maastricht, The Netherlands
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Marielle C.G. Vlooswijk
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Department of Neurology, Maastricht University Medical Center, Maastricht, The Netherlands
- Academic Center for Epileptology Kempenhaeghe/Maastricht UMC+, Maastricht, The Netherlands
- Paul A.M. Hofman
- Departments of Radiology and Nuclear Medicine, Maastricht University Medical Center, Maastricht, The Netherlands
- Albert P. Aldenkamp
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Department of Neurology, Maastricht University Medical Center, Maastricht, The Netherlands
- Academic Center for Epileptology Kempenhaeghe/Maastricht UMC+, Maastricht, The Netherlands
- Rob P.W. Rouhl
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Department of Neurology, Maastricht University Medical Center, Maastricht, The Netherlands
- Academic Center for Epileptology Kempenhaeghe/Maastricht UMC+, Maastricht, The Netherlands
- Jacobus F.A. Jansen
- Departments of Radiology and Nuclear Medicine, Maastricht University Medical Center, Maastricht, The Netherlands
- School for Mental Health and Neuroscience, Maastricht University, Maastricht, The Netherlands
- Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
9
Abstract
Syntax, the structure of sentences, enables humans to express an infinite range of meanings through finite means. The neurobiology of syntax has been intensely studied but with little consensus. Two main candidate regions have been identified: the posterior inferior frontal gyrus (pIFG) and the posterior middle temporal gyrus (pMTG). Integrating research in linguistics, psycholinguistics, and neuroscience, we propose a neuroanatomical framework for syntax that attributes distinct syntactic computations to these regions in a unified model. The key theoretical advances are adopting a modern lexicalized view of syntax in which the lexicon and syntactic rules are intertwined, and recognizing a computational asymmetry in the role of syntax during comprehension and production. Our model ascribes a hierarchical lexical-syntactic function to the pMTG, which interconnects previously identified speech perception and conceptual-semantic systems in the temporal and inferior parietal lobes, crucial for both sentence production and comprehension. These relational hierarchies are transformed via the pIFG into morpho-syntactic sequences, primarily tied to production. We show how this architecture provides a better account of the full range of data and is consistent with recent proposals regarding the organization of phonological processes in the brain.
Affiliation(s)
- William Matchin
- Department of Communication Sciences and Disorders, University of South Carolina, Columbia, SC, 29208, USA
- Gregory Hickok
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA, 92697, USA
- Department of Language Science, University of California, Irvine, Irvine, CA, 92697, USA
10
Mercure E, Evans S, Pirazzoli L, Goldberg L, Bowden-Howl H, Coulson-Thaker K, Beedie I, Lloyd-Fox S, Johnson MH, MacSweeney M. Language Experience Impacts Brain Activation for Spoken and Signed Language in Infancy: Insights From Unimodal and Bimodal Bilinguals. Neurobiol Lang (Camb) 2020; 1:9-32. [PMID: 32274469] [PMCID: PMC7145445] [DOI: 10.1162/nol_a_00001]
Abstract
Recent neuroimaging studies suggest that monolingual infants activate a left-lateralized frontotemporal brain network in response to spoken language, which is similar to the network involved in processing spoken and signed language in adulthood. However, it is unclear how brain activation to language is influenced by early experience in infancy. To address this question, we present functional near-infrared spectroscopy (fNIRS) data from 60 hearing infants (4 to 8 months of age): 19 monolingual infants exposed to English, 20 unimodal bilingual infants exposed to two spoken languages, and 21 bimodal bilingual infants exposed to English and British Sign Language (BSL). Across all infants, spoken language elicited activation in a bilateral brain network including the inferior frontal and posterior temporal areas, whereas sign language elicited activation in the right temporoparietal area. A significant difference in brain lateralization was observed between groups. Activation in the posterior temporal region was not lateralized in monolinguals and bimodal bilinguals, but right lateralized in response to both language modalities in unimodal bilinguals. This suggests that the experience of two spoken languages influences brain activation for sign language when experienced for the first time. Multivariate pattern analyses (MVPAs) could classify distributed patterns of activation within the left hemisphere for spoken and signed language in monolinguals (proportion correct = 0.68; p = 0.039) but not in unimodal or bimodal bilinguals. These results suggest that bilingual experience in infancy influences brain activation for language and that unimodal bilingual experience has greater impact on early brain lateralization than bimodal bilingual experience.
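The abstract reports above-chance MVPA classification (proportion correct = 0.68) within left-hemisphere channels. The snippet below sketches the general shape of such an analysis on synthetic data; the classifier, the leave-one-out cross-validation scheme, and the trial and channel counts are our own assumptions for illustration, not details taken from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 38, 20                  # hypothetical sizes
X = rng.normal(size=(n_trials, n_channels))    # synthetic left-hemisphere activation patterns
X[19:, :5] += 0.8                              # inject a weak class difference
y = np.repeat([0, 1], 19)                      # 0 = spoken trials, 1 = signed trials

# Leave-one-out cross-validation with a linear classifier
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
print(f"proportion correct = {acc:.2f}")
```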
Affiliation(s)
- Samuel Evans
- University College London, London, UK
- University of Westminster, London, UK
- Laura Pirazzoli
- Birkbeck - University of London, London, UK
- Boston Children’s Hospital, Boston, Massachusetts, US
- Harriet Bowden-Howl
- University College London, London, UK
- University of Plymouth, Plymouth, Devon, UK
- Sarah Lloyd-Fox
- Birkbeck - University of London, London, UK
- University of Cambridge, Cambridge, Cambridgeshire, UK
- Mark H. Johnson
- Birkbeck - University of London, London, UK
- University of Cambridge, Cambridge, Cambridgeshire, UK
11
Stroh AL, Rösler F, Dormal G, Salden U, Skotara N, Hänel-Faulhaber B, Röder B. Neural correlates of semantic and syntactic processing in German Sign Language. Neuroimage 2019; 200:231-241. [PMID: 31220577] [DOI: 10.1016/j.neuroimage.2019.06.025]
Abstract
The study of deaf and hearing native users of signed languages can offer unique insights into how biological constraints and environmental input interact to shape the neural bases of language processing. Here, we use functional magnetic resonance imaging (fMRI) to address two questions: (1) Do semantic and syntactic processing in a signed language rely on anatomically and functionally distinct neural substrates, as has been shown for spoken languages? and (2) Does hearing status affect the neural correlates of these two types of linguistic processing? Deaf and hearing native signers performed a sentence judgement task on German Sign Language (Deutsche Gebärdensprache: DGS) sentences that were either correct or contained syntactic or semantic violations. We hypothesized that processing of semantic and syntactic violations in DGS relies on distinct neural substrates, as has been shown for spoken languages. Moreover, we hypothesized that effects of hearing status would be observed within auditory regions, as deaf native signers have been shown to activate auditory areas to a greater extent than hearing native signers when processing a signed language. Semantic processing activated low-level visual areas and the left inferior frontal gyrus (IFG), suggesting both modality-dependent and modality-independent processing mechanisms. Syntactic processing elicited increased activation in the right supramarginal gyrus (SMG). Moreover, psychophysiological interaction (PPI) analyses revealed a cluster in left middle occipital regions showing increased functional coupling with the right SMG during syntactic relative to semantic processing, possibly indicating spatial processing mechanisms that are specific to signed syntax. Effects of hearing status were observed in the right superior temporal cortex (STC): deaf but not hearing native signers showed greater activation for semantic violations than for syntactic violations in this region. Taken together, the present findings suggest that the neural correlates of language processing are partly determined by biological constraints, but that they may additionally be influenced by the unique processing demands of the language modality and different sensory experiences.
Affiliation(s)
- Anna-Lena Stroh
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
- Frank Rösler
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
- Giulia Dormal
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
- Uta Salden
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
- Nils Skotara
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
- Barbara Hänel-Faulhaber
- Biological Psychology and Neuropsychology, University of Hamburg, Germany; Special Education, University of Hamburg, Germany
- Brigitte Röder
- Biological Psychology and Neuropsychology, University of Hamburg, Germany
12
Shared neural correlates for building phrases in signed and spoken language. Sci Rep 2018; 8:5492. [PMID: 29615785] [PMCID: PMC5882945] [DOI: 10.1038/s41598-018-23915-0]
Abstract
Research on the mental representation of human language has convincingly shown that sign languages are structured similarly to spoken languages. However, whether the same neurobiology underlies the online construction of complex linguistic structures in sign and speech remains unknown. To investigate this question with maximally controlled stimuli, we studied the production of minimal two-word phrases in sign and speech. Signers and speakers viewed the same pictures during magnetoencephalography recording and named them with semantically identical expressions. For both signers and speakers, phrase building engaged left anterior temporal and ventromedial cortices with similar timing, despite different linguistic articulators. Thus the neurobiological similarity of sign and speech goes beyond gross measures such as lateralization: the same fronto-temporal network achieves the planning of structured linguistic expressions.
13
Le HB, Zhang HH, Wu QL, Zhang J, Yin JJ, Ma SH. Neural Activity During Mental Rotation in Deaf Signers: The Influence of Long-Term Sign Language Experience. Ear Hear 2018; 39:1015-1024. [PMID: 29298164] [DOI: 10.1097/aud.0000000000000540]
Abstract
OBJECTIVES: Mental rotation is the brain's visuospatial understanding of what objects are and where they belong. Previous research indicated that deaf signers show behavioral enhancement on nonlinguistic visual tasks, including mental rotation. In this study, we investigated the neural differences in mental rotation processing between deaf signers and hearing nonsigners using blood oxygen level-dependent (BOLD) functional magnetic resonance imaging (fMRI). DESIGN: The participants performed a block-designed experiment consisting of alternating blocks of comparison and rotation periods, separated by a baseline or fixation period. Mental rotation tasks were performed using three-dimensional figures. fMRI images were acquired during the entire experiment, and the fMRI data were analyzed with Analysis of Functional NeuroImages. A factorial analysis of variance was designed for the fMRI analyses. Differences in activation were analyzed for the main effects of group and task, as well as for the group-by-task interaction. RESULTS: The study showed differences in activated areas between deaf signers and hearing nonsigners on the mental rotation of three-dimensional figures. Subtracting activations of fixation from activations of rotation, both groups showed consistent activation in the bilateral occipital lobe, bilateral parietal lobe, and bilateral posterior temporal lobe. There were main effects of task (rotation versus comparison), with significant activation clusters in the bilateral precuneus, the right middle frontal gyrus, the bilateral medial frontal gyrus, the right inferior frontal gyrus, the right superior frontal gyrus, the right anterior cingulate, and the bilateral posterior cingulate. There were significant group-by-task interaction effects in the bilateral anterior cingulate, the right inferior frontal gyrus, the left superior frontal gyrus, the left posterior cingulate, the left middle temporal gyrus, and the right inferior parietal lobe. In the simple effects of rotation minus comparison for each group, deaf signers mainly showed activity in the right hemisphere, while hearing nonsigners showed bilateral activity. In the simple effects of the rotation task, deaf signers showed decreased activation relative to hearing nonsigners throughout several regions, including the bilateral parahippocampal gyrus, the left posterior cingulate cortex, the right anterior cingulate cortex, and the right inferior parietal lobe. CONCLUSION: Decreased activation in several brain regions of deaf signers compared with hearing nonsigners reflects increased neural efficiency and precise functional circuitry generated through long-term experience with sign language processing. In addition, we tentatively infer that there may be a lateralization pattern favoring the right hemisphere in deaf signers performing mental rotation tasks.
Affiliation(s)
- Hong-Bo Le
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Hui-Hong Zhang
- Department of Radiology, Shenzhen Hospital of Southern Medical University, Shenzhen, China
- MR Division, Shantou Central Hospital, Shantou, China
- Qiu-Lin Wu
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Jiong Zhang
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Jing-Jing Yin
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Shu-Hua Ma
- Department of Radiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
- Guangdong Key Laboratory of Medical Molecular Imaging, The First Affiliated Hospital of Shantou University Medical College, Shantou, China
14
Cross-Modal Recruitment of Auditory and Orofacial Areas During Sign Language in a Deaf Subject. World Neurosurg 2017; 105:1033.e1-1033.e5. [DOI: 10.1016/j.wneu.2017.05.170]
15
Komeilipoor N, Tiainen M, Tiippana K, Vainio M, Vainio L. Excitability of hand motor areas during articulation of syllables. Neurosci Lett 2016; 620:154-158. [PMID: 27057730] [DOI: 10.1016/j.neulet.2016.04.004]
Abstract
It is known that articulating different syllables is linked to different grasp actions; e.g., [ti] is linked to a precision grip and [kɑ] to a power grip. The aim of the present study was to test whether articulating or hearing these syllables would result in increased activity in the representation of the hand muscles involved in these two actions in a muscle-specific manner. To this end, we used transcranial magnetic stimulation (TMS) to investigate changes in the excitability of the left primary motor cortex (M1) innervating hand muscles while participants articulated or listened to meaningless syllables, listened to a metronome, or observed a fixation cross. The motor-evoked potentials of two hand muscles associated with either a precision or a power grip exhibited significantly greater amplitudes during articulation than in the passive listening, metronome, and fixation cross conditions. Moreover, these muscles exhibited similar patterns of excitability during articulation regardless of which syllable was articulated. The increased excitability of the left M1 hand area during articulation, but not during perception of the syllables, might be due to cortico-cortical interaction between the motor representations of the oral organs and the hand area.
Affiliation(s)
- Naeem Komeilipoor
- Division of Cognitive and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Siltavuorenpenger 1-5, 00014 University of Helsinki, Finland
- Mikko Tiainen
- Division of Cognitive and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Siltavuorenpenger 1-5, 00014 University of Helsinki, Finland
- Kaisa Tiippana
- Division of Cognitive and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Siltavuorenpenger 1-5, 00014 University of Helsinki, Finland
- Martti Vainio
- Phonetics and Speech Synthesis Research Group, Institute of Behavioural Sciences, University of Helsinki, Siltavuorenpenger 1-5, 00014 University of Helsinki, Finland
- Lari Vainio
- Division of Cognitive and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Siltavuorenpenger 1-5, 00014 University of Helsinki, Finland
16
Ferjan Ramirez N, Leonard MK, Davenport TS, Torres C, Halgren E, Mayberry RI. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language. Cereb Cortex 2016; 26:1015-1026. [PMID: 25410427] [PMCID: PMC4737603] [DOI: 10.1093/cercor/bhu273]
Abstract
One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language experience, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24(10):2772-2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience.
Affiliation(s)
- Naja Ferjan Ramirez
- Department of Linguistics
- Multimodal Imaging Laboratory
- Institute for Learning and Brain Sciences, University of Washington, Seattle, WA 98195, USA
- Matthew K. Leonard
- Multimodal Imaging Laboratory
- Department of Radiology
- Department of Neurological Surgery, University of California, San Francisco, CA 94158, USA
- Eric Halgren
- Multimodal Imaging Laboratory
- Department of Radiology
- Department of Neuroscience
- Kavli Institute for Brain and Mind, University of California, San Diego, La Jolla, CA 92093, USA
18
Okada K, Rogalsky C, O'Grady L, Hanaumi L, Bellugi U, Corina D, Hickok G. An fMRI study of perception and action in deaf signers. Neuropsychologia 2016; 82:179-188. [PMID: 26796716] [DOI: 10.1016/j.neuropsychologia.2016.01.015]
Abstract
Since the discovery of mirror neurons, there has been a great deal of interest in understanding the relationship between perception and action, and the role of the human mirror system in language comprehension and production. Two questions have dominated research. One concerns the role of Broca's area in speech perception. The other concerns the role of the motor system more broadly in understanding action-related language. The current study investigates both of these questions in a way that bridges research on language with research on manual actions. We studied the neural basis of observing and executing American Sign Language (ASL) object and action signs. In an fMRI experiment, deaf signers produced signs depicting actions and objects as well as observed/comprehended signs of actions and objects. Different patterns of activation were found for observation and execution, although with overlap in Broca's area, providing prima facie support for the claim that the motor system participates in language perception. In contrast, we found no evidence that action-related signs differentially involved the motor system compared to object-related signs. These findings are discussed in the context of lesion studies of sign language execution and observation. In this broader context, we conclude that the activation in Broca's area during ASL observation is not causally related to sign language understanding.
Affiliation(s)
- Kayoko Okada
- Department of Psychological Sciences, Whittier College, Whittier, CA, United States; Department of Cognitive Sciences, University of California, Irvine, CA, United States
- Corianne Rogalsky
- Department of Speech and Hearing Science, Arizona State University, Tempe, AZ, United States
- Lucinda O'Grady
- Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, San Diego, CA, United States
- Leila Hanaumi
- Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, San Diego, CA, United States
- Ursula Bellugi
- Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, San Diego, CA, United States
- David Corina
- Department of Linguistics, University of California, Davis, CA, United States
- Gregory Hickok
- Department of Cognitive Sciences, University of California, Irvine, CA, United States
19
Handedness prevalence in the deaf: Meta-analyses. Neurosci Biobehav Rev 2016; 60:98-114. [DOI: 10.1016/j.neubiorev.2015.11.013]
20
Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture. Proc Natl Acad Sci U S A 2015; 112:11684-11689. [PMID: 26283352] [DOI: 10.1073/pnas.1510527112]
Abstract
Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system, gesture, further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages, supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network, demonstrating an influence of experience on the perception of nonlinguistic stimuli.
21
Weisberg J, McCullough S, Emmorey K. Simultaneous perception of a spoken and a signed language: The brain basis of ASL-English code-blends. Brain Lang 2015; 147:96-106. [PMID: 26177161] [PMCID: PMC5769874] [DOI: 10.1016/j.bandl.2015.05.006]
Abstract
Code-blends (simultaneous words and signs) are a unique characteristic of bimodal bilingual communication. Using fMRI, we investigated code-blend comprehension in hearing native ASL-English bilinguals who made a semantic decision (edible?) about signs, audiovisual words, and semantically equivalent code-blends. English and ASL recruited a similar fronto-temporal network with expected modality differences: stronger activation for English in auditory regions of bilateral superior temporal cortex, and stronger activation for ASL in bilateral occipitotemporal visual regions and left parietal cortex. Code-blend comprehension elicited activity in a combination of these regions, and no cognitive control regions were additionally recruited. Furthermore, code-blends elicited reduced activation relative to ASL presented alone in bilateral prefrontal and visual extrastriate cortices, and relative to English alone in auditory association cortex. Consistent with behavioral facilitation observed during semantic decisions, the findings suggest that redundant semantic content induces more efficient neural processing in language and sensory regions during bimodal language integration.
Affiliation(s)
- Jill Weisberg
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, 6495 Alvarado Rd., Suite 200, San Diego, CA 92120, USA
- Stephen McCullough
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, 6495 Alvarado Rd., Suite 200, San Diego, CA 92120, USA
- Karen Emmorey
- Laboratory for Language and Cognitive Neuroscience, San Diego State University, 6495 Alvarado Rd., Suite 200, San Diego, CA 92120, USA
22
Proverbio AM, Gabaro V, Orlandi A, Zani A. Semantic brain areas are involved in gesture comprehension: An electrical neuroimaging study. Brain Lang 2015; 147:30-40. [PMID: 26011745] [DOI: 10.1016/j.bandl.2015.05.002]
Abstract
While the mechanism of sign language comprehension in deaf people has been widely investigated, little is known about the neural underpinnings of spontaneous gesture comprehension in healthy speakers. Bioelectrical responses to 800 pictures of actors showing common Italian gestures (e.g., emblems, deictic, or iconic gestures) were recorded in 14 participants. Stimuli were selected from a wider corpus of 1122 gestures. Half of the pictures were preceded by an incongruent description. ERPs were recorded from 128 sites while participants decided whether the stimulus was congruent. Congruent pictures elicited a posterior P300 followed by a late positivity, while incongruent gestures elicited an anterior N400 response. N400 generators were investigated with swLORETA source reconstruction. Processing of congruent gestures activated face- and body-related visual areas (e.g., BA19, BA37, BA22), the left angular gyrus, and fronto-parietal mirror areas. The incongruent-congruent contrast particularly stimulated linguistic and semantic brain areas, such as the left medial and superior temporal lobe.
Affiliation(s)
- Alice Mado Proverbio
- NeuroMI-Milan Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, 20126 Milan, Italy
- Veronica Gabaro
- NeuroMI-Milan Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, 20126 Milan, Italy
- Andrea Orlandi
- NeuroMI-Milan Center for Neuroscience, Dept. of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, 20126 Milan, Italy; Institute of Bioimaging and Molecular Physiology, IBFM-CNR, Milan, Italy
- Alberto Zani
- Institute of Bioimaging and Molecular Physiology, IBFM-CNR, Milan, Italy
23
Hall ML, Ferreira VS, Mayberry RI. Syntactic priming in American Sign Language. PLoS One 2015; 10:e0119611. [PMID: 25786230] [PMCID: PMC4364966] [DOI: 10.1371/journal.pone.0119611]
Abstract
Psycholinguistic studies of sign language processing provide valuable opportunities to assess whether language phenomena, which are primarily studied in spoken language, are fundamentally shaped by peripheral biology. For example, we know that when given a choice between two syntactically permissible ways to express the same proposition, speakers tend to choose structures that were recently used, a phenomenon known as syntactic priming. Here, we report two experiments testing syntactic priming of a noun phrase construction in American Sign Language (ASL). Experiment 1 shows that second language (L2) signers with normal hearing exhibit syntactic priming in ASL and that priming is stronger when the head noun is repeated between prime and target (the lexical boost effect). Experiment 2 shows that syntactic priming is equally strong among deaf native L1 signers, deaf late L1 learners, and hearing L2 signers. Experiment 2 also tested for, but did not find evidence of, phonological or semantic boosts to syntactic priming in ASL. These results show that despite the profound differences between spoken and signed languages in terms of how they are produced and perceived, the psychological representation of sentence structure (as assessed by syntactic priming) operates similarly in sign and speech.
Affiliation(s)
- Matthew L. Hall
- Linguistics, University of Connecticut, Storrs, Connecticut, United States of America
- Victor S. Ferreira
- Psychology, UC San Diego, San Diego, California, United States of America
- Rachel I. Mayberry
- Linguistics, UC San Diego, San Diego, California, United States of America
24
Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. Neural language processing in adolescent first-language learners. Cereb Cortex 2014; 24:2772-2783. [PMID: 23696277] [PMCID: PMC4153811] [DOI: 10.1093/cercor/bht137]
Abstract
The relation between the timing of language input and development of neural organization for language processing in adulthood has been difficult to tease apart because language is ubiquitous in the environment of nearly all infants. However, within the congenitally deaf population are individuals who do not experience language until after early childhood. Here, we investigated the neural underpinnings of American Sign Language (ASL) in 2 adolescents who had no sustained language input until they were approximately 14 years old. Using anatomically constrained magnetoencephalography, we found that recently learned signed words mainly activated right superior parietal, anterior occipital, and dorsolateral prefrontal areas in these 2 individuals. This spatiotemporal activity pattern was significantly different from the left fronto-temporal pattern observed in young deaf adults who acquired ASL from birth, and from that of hearing young adults learning ASL as a second language for a similar length of time as the cases. These results provide direct evidence that the timing of language experience over human development affects the organization of neural language processing.
Collapse
Affiliation(s)
| | | | | | | | - Eric Halgren
- Multimodal Imaging Laboratory
- Department of Radiology
- Department of Neurosciences
- Kavli Institute for Brain and Mind, University of California, San Diego, USA
25
Komeilipoor N, Vicario CM, Daffertshofer A, Cesari P. Talking hands: tongue motor excitability during observation of hand gestures associated with words. Front Hum Neurosci 2014; 8:767. [PMID: 25324761 PMCID: PMC4179693 DOI: 10.3389/fnhum.2014.00767]
Abstract
Perception of speech and gestures engages common brain areas. Neural regions involved in speech perception overlap with those involved in speech production in an articulator-specific manner. Yet, it is unclear whether motor cortex also plays a role in processing communicative actions such as gesture and sign language. We asked whether the mere observation of hand gestures, either paired or not paired with words, results in changes in the excitability of the hand and tongue areas of motor cortex. Using single-pulse transcranial magnetic stimulation (TMS), we measured motor excitability in the tongue and hand areas of left primary motor cortex while participants viewed video sequences of bimanual hand movements that were either associated or not associated with nouns. We found higher motor excitability in the tongue area during the presentation of meaningful (noun-associated) gestures as opposed to meaningless ones, whereas the excitability of the hand motor area was not differentially affected by gesture observation. Our results suggest that observing gestures associated with a word activates the articulatory motor network that accompanies speech production.
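As a rough illustration of the central comparison — invented motor-evoked potential (MEP) amplitudes, not the authors' data or pipeline — tongue-area excitability in the two gesture conditions can be contrasted with a paired test:

```python
# Sketch with hypothetical peak-to-peak MEP amplitudes (mV) from the tongue
# representation during observation of meaningful vs. meaningless gestures.
import numpy as np
from scipy import stats

meaningful = np.array([1.42, 1.58, 1.36, 1.71, 1.49, 1.63])   # hypothetical
meaningless = np.array([1.21, 1.34, 1.18, 1.40, 1.26, 1.31])  # hypothetical

# Within-subject design, so a paired comparison of condition means.
t, p = stats.ttest_rel(meaningful, meaningless)
print(f"tongue MEPs, meaningful vs meaningless gestures: t = {t:.2f}, p = {p:.3f}")
```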
Affiliation(s)
- Naeem Komeilipoor
- Department of Neurological and Movement Sciences, University of Verona, Verona, Italy; MOVE Research Institute Amsterdam, VU University Amsterdam, Amsterdam, Netherlands
- Paola Cesari
- Department of Neurological and Movement Sciences, University of Verona, Verona, Italy
26
Kovelman I, Shalinsky MH, Berens MS, Petitto LA. Words in the bilingual brain: an fNIRS brain imaging investigation of lexical processing in sign-speech bimodal bilinguals. Front Hum Neurosci 2014; 8:606. [PMID: 25191247 PMCID: PMC4139656 DOI: 10.3389/fnhum.2014.00606]
Abstract
Early bilingual exposure, especially exposure to two languages in different modalities such as speech and sign, can profoundly affect an individual's language, culture, and cognition. Here we explore the hypothesis that bimodal dual-language exposure can also affect the brain's organization for language, with changes expected both in brain regions universally important for language and in parietal regions especially critical for sign language (Newman et al., 2002). We investigated three groups of participants (N = 29) who completed a word repetition task in American Sign Language (ASL) during fNIRS brain imaging: (1) hearing ASL-English bimodal bilinguals (n = 5), (2) deaf ASL signers (n = 7), and (3) English monolinguals naïve to sign language (n = 17). The key finding of the present study is that bimodal bilinguals showed reduced activation in left parietal regions relative to deaf ASL signers when asked to use only ASL. In contrast, this group of bimodal signers showed greater activation in left temporo-parietal regions relative to English monolinguals when asked to switch between their two languages (Kovelman et al., 2009). Converging evidence now suggests that bimodal bilingual experience changes the brain bases of language, including the left temporo-parietal regions known to be critical for sign language processing (Emmorey et al., 2007). The results provide insight into the resilience and constraints of neural plasticity for language and bilingualism.
Affiliation(s)
- Ioulia Kovelman
- Department of Psychology and Center for Human Growth and Development, University of Michigan, Ann Arbor, MI, USA
- Mark H Shalinsky
- Department of Psychology and Center for Human Growth and Development, University of Michigan, Ann Arbor, MI, USA
- Laura-Ann Petitto
- Visual Language and Visual Learning (VL2), NSF Science of Learning Center, Gallaudet University, Washington, DC, USA
27
Heimler B, Weisz N, Collignon O. Revisiting the adaptive and maladaptive effects of crossmodal plasticity. Neuroscience 2014; 283:44-63. [PMID: 25139761 DOI: 10.1016/j.neuroscience.2014.08.003]
Abstract
One of the most striking demonstrations of experience-dependent plasticity comes from studies of sensory-deprived individuals (e.g., blind or deaf people), showing that brain regions deprived of their natural inputs change their sensory tuning to support the processing of inputs from the spared senses. These mechanisms of crossmodal plasticity have traditionally been conceptualized as a double-edged sword for behavior. On one side, crossmodal plasticity is considered adaptive, supporting enhanced behavioral skills in the remaining senses of early-deaf or blind individuals. On the other side, it raises crucial challenges for sensory restoration and is typically considered maladaptive, since its presence may prevent optimal recovery in sensory-re-afferented individuals. In the present review we argue that this dichotomous view is oversimplified, and that the notion of unavoidable adaptive/maladaptive effects of crossmodal reorganization for sensory compensation/restoration may be misleading. To this end we critically review findings from the blind and deaf literatures, highlighting the complementary nature of these two fields of research. The integrated framework we propose has the potential to change how rehabilitation programs for sensory recovery are carried out, with the promising prospect of improving their final outcomes.
Affiliation(s)
- B Heimler
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy.
- N Weisz
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy
- O Collignon
- Center for Mind/Brain Sciences (CIMeC), University of Trento, Italy
28
Emmorey K, McCullough S, Mehta S, Grabowski TJ. How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language. Front Psychol 2014; 5:484. [PMID: 24904497 PMCID: PMC4033845 DOI: 10.3389/fpsyg.2014.00484]
Abstract
To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O-PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American Sign Language (ASL) and English. Directly contrasting speech and sign production revealed greater activation in bilateral parietal cortex for signing, while speaking resulted in greater activation in bilateral superior temporal cortex (STC) and right frontal cortex, likely reflecting auditory feedback control. Surprisingly, the language production contrast revealed a relative increase in activation in bilateral occipital cortex for speaking. We speculate that greater activation in visual cortex for speaking may actually reflect cortical attenuation when signing, which functions to distinguish self-produced from externally generated visual input. Directly contrasting speech and sign comprehension revealed greater activation in bilateral STC for speech and greater activation in bilateral occipital-temporal cortex for sign. Sign comprehension, like sign production, engaged bilateral parietal cortex to a greater extent than spoken language. We hypothesize that posterior parietal activation in part reflects processing related to spatial classifier constructions in ASL and that anterior parietal activation may reflect covert imitation that functions as a predictive model during sign comprehension. The conjunction analysis for comprehension revealed that both speech and sign bilaterally engaged the inferior frontal gyrus (with more extensive activation on the left) and the superior temporal sulcus, suggesting an invariant bilateral perisylvian language system. We conclude that surface level differences between sign and spoken languages should not be dismissed and are critical for understanding the neurobiology of language.
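The conjunction logic can be sketched in a few lines; the z-maps and threshold below are placeholders, not the study's data (a minimum-statistic conjunction keeps voxels suprathreshold for both modalities):

```python
# Sketch of a minimum-statistic conjunction over two contrast maps,
# keeping voxels suprathreshold for BOTH speech and sign comprehension.
import numpy as np

rng = np.random.default_rng(2)
z_speech = rng.standard_normal((64, 64, 40))  # placeholder z-map: speech > baseline
z_sign = rng.standard_normal((64, 64, 40))    # placeholder z-map: sign > baseline
z_thresh = 3.1                                # e.g., p < .001 one-tailed, uncorrected

conjunction = np.minimum(z_speech, z_sign) > z_thresh
print(f"{int(conjunction.sum())} voxels suprathreshold for both speech and sign")
```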
Affiliation(s)
- Karen Emmorey
- Laboratory for Language and Cognitive Neuroscience, School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Stephen McCullough
- Laboratory for Language and Cognitive Neuroscience, School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Sonya Mehta
- Department of Psychology, University of Washington, Seattle, WA, USA; Department of Radiology, University of Washington, Seattle, WA, USA
29
Inubushi T, Sakai KL. Functional and anatomical correlates of word-, sentence-, and discourse-level integration in sign language. Front Hum Neurosci 2013; 7:681. [PMID: 24155706 PMCID: PMC3804906 DOI: 10.3389/fnhum.2013.00681]
Abstract
In both vocal and sign languages, we can distinguish word-, sentence-, and discourse-level integration in terms of hierarchical processes, which integrate various elements into another higher level of constructs. In the present study, we used magnetic resonance imaging and voxel-based morphometry (VBM) to test three language tasks in Japanese Sign Language (JSL): word-level (Word), sentence-level (Sent), and discourse-level (Disc) decision tasks. We analyzed cortical activity and gray matter (GM) volumes of Deaf signers, and clarified three major points. First, we found that the activated regions in the frontal language areas gradually expanded in the dorso-ventral axis, corresponding to a difference in linguistic units for the three tasks. Moreover, the activations in each region of the frontal language areas were incrementally modulated with the level of linguistic integration. These dual mechanisms of the frontal language areas may reflect a basic organization principle of hierarchically integrating linguistic information. Secondly, activations in the lateral premotor cortex and inferior frontal gyrus were left-lateralized. Direct comparisons among the language tasks exhibited more focal activation in these regions, suggesting their functional localization. Thirdly, we found significantly positive correlations between individual task performances and GM volumes in localized regions, even when the ages of acquisition (AOAs) of JSL and Japanese were factored out. More specifically, correlations with the performances of the Word and Sent tasks were found in the left precentral/postcentral gyrus and insula, respectively, while correlations with those of the Disc task were found in the left ventral inferior frontal gyrus and precuneus. The unification of functional and anatomical studies would thus be fruitful for understanding human language systems from the aspects of both universality and individuality.
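The volume-performance correlations with AOA factored out amount to partial correlations; a minimal sketch with synthetic numbers (no value here comes from the study):

```python
# Sketch: partial correlation of regional gray-matter (GM) volume with task
# accuracy, controlling for age of acquisition (AOA), via residualization.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
aoa = rng.uniform(0, 12, 20)                       # hypothetical AOA in years
gm = 0.5 - 0.01 * aoa + rng.normal(0, 0.05, 20)    # hypothetical GM volumes
acc = 0.9 - 0.02 * aoa + rng.normal(0, 0.05, 20)   # hypothetical accuracies

def residualize(y, x):
    """Remove the linear effect of x from y."""
    slope, intercept, *_ = stats.linregress(x, y)
    return y - (slope * x + intercept)

r, p = stats.pearsonr(residualize(gm, aoa), residualize(acc, aoa))
print(f"partial r (GM volume ~ accuracy | AOA) = {r:.2f}, p = {p:.3f}")
```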
Affiliation(s)
- Tomoo Inubushi
- Department of Basic Science, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan; Japan Society for the Promotion of Science, Tokyo, Japan
30
Dissociating cognitive and sensory neural plasticity in human superior temporal cortex. Nat Commun 2013; 4:1473. [PMID: 23403574 DOI: 10.1038/ncomms2463]
Abstract
Disentangling the effects of sensory and cognitive factors on neural reorganization is fundamental for establishing the relationship between plasticity and functional specialization. Auditory deprivation in humans provides a unique insight into this problem, because the origin of the anatomical and functional changes observed in deaf individuals is not only sensory, but also cognitive, owing to the implementation of visual communication strategies such as sign language and speechreading. Here, we describe a functional magnetic resonance imaging study of individuals with different auditory deprivation and sign language experience. We find that sensory and cognitive experience cause plasticity in anatomically and functionally distinguishable substrates. This suggests that after plastic reorganization, cortical regions adapt to process a different type of input signal, but preserve the nature of the computation they perform, both at a sensory and cognitive level.
31
Corina DP, Lawyer LA, Cates D. Cross-linguistic differences in the neural representation of human language: evidence from users of signed languages. Front Psychol 2013; 3:587. [PMID: 23293624 PMCID: PMC3534395 DOI: 10.3389/fpsyg.2012.00587]
Abstract
Studies of deaf individuals who are users of signed languages have provided profound insight into the neural representation of human language. Case studies of deaf signers who have incurred left- and right-hemisphere damage have shown that left-hemisphere resources are a necessary component of sign language processing. These data suggest that, despite frank differences in the input and output modality of language, core left perisylvian regions universally serve linguistic function. Neuroimaging studies of deaf signers have generally provided support for this claim. However, more fine-tuned studies of linguistic processing in deaf signers are beginning to show evidence of important differences in the representation of signed and spoken languages. In this paper, we provide a critical review of this literature and present compelling evidence for language-specific cortical representations in deaf signers. These data lend support to the claim that the neural representation of language may show substantive cross-linguistic differences. We discuss the theoretical implications of these findings with respect to an emerging understanding of the neurobiology of language.
Affiliation(s)
- David P Corina
- Cognitive Neurolinguistics Laboratory, Center for Mind and Brain, Department of Linguistics, University of California Davis, Davis, CA, USA
32
Signed words in the congenitally deaf evoke typical late lexicosemantic responses with no early visual responses in left superior temporal cortex. J Neurosci 2012; 32:9700-5. [PMID: 22787055 DOI: 10.1523/jneurosci.1002-12.2012]
Abstract
Congenitally deaf individuals receive little or no auditory input, and when raised by deaf parents, they acquire sign as their native and primary language. We asked two questions regarding how the deaf brain in humans adapts to sensory deprivation: (1) is meaning extracted and integrated from signs using the same classical left hemisphere frontotemporal network used for speech in hearing individuals, and (2) in deafness, is superior temporal cortex encompassing primary and secondary auditory regions reorganized to receive and process visual sensory information at short latencies? Using MEG constrained by individual cortical anatomy obtained with MRI, we examined an early time window associated with sensory processing and a late time window associated with lexicosemantic integration. We found that sign in deaf individuals and speech in hearing individuals activate a highly similar left frontotemporal network (including superior temporal regions surrounding auditory cortex) during lexicosemantic processing, but only speech in hearing individuals activates auditory regions during sensory processing. Thus, neural systems dedicated to processing high-level linguistic information are used for processing language regardless of modality or hearing status, and we do not find evidence for rewiring of afferent connections from visual systems to auditory cortex.
33
Hu Z, Wang W, Liu H, Peng D, Yang Y, Li K, Zhang JX, Ding G. Brain activations associated with sign production using word and picture inputs in deaf signers. Brain Lang 2011; 116:64-70. [PMID: 21215442 DOI: 10.1016/j.bandl.2010.11.006]
Abstract
Effective literacy education in deaf students calls for psycholinguistic research revealing the cognitive and neural mechanisms underlying their written language processing. When learning a written language, deaf students are often instructed to sign out printed text. The present fMRI study was intended to reveal the neural substrates associated with word signing by comparing it with picture signing. Native deaf signers were asked to overtly sign in Chinese Sign Language (CSL) common objects indicated with written words or presented as pictures. Except in left inferior frontal gyrus and inferior parietal lobule where word signing elicited greater activation than picture signing, the two tasks engaged a highly overlapping set of brain regions previously implicated in sign production. The results suggest that word signing in the deaf signers relies on meaning activation from printed visual forms, followed by similar production processes from meaning to signs as in picture signing. The present study also documents the basic brain activation pattern for sign production in CSL and supports the notion of a universal core neural network for sign production across different sign languages.
Affiliation(s)
- Zhiguo Hu
- Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
34
Pereira FRS, Alessio A, Sercheli MS, Pedro T, Bilevicius E, Rondina JM, Ozelo HFB, Castellano G, Covolan RJM, Damasceno BP, Cendes F. Asymmetrical hippocampal connectivity in mesial temporal lobe epilepsy: evidence from resting state fMRI. BMC Neurosci 2010; 11:66. [PMID: 20525202 PMCID: PMC2890013 DOI: 10.1186/1471-2202-11-66]
Abstract
BACKGROUND: Mesial temporal lobe epilepsy (MTLE), the most common type of focal epilepsy in adults, is often caused by hippocampal sclerosis (HS). Patients with HS usually present memory dysfunction, which is material-specific according to the hemisphere involved and has been correlated with the degree of HS as measured by postoperative histopathology as well as by the degree of hippocampal atrophy on magnetic resonance imaging (MRI). Verbal memory is mostly affected by left-sided HS, whereas visuo-spatial memory is more affected by right HS. Some of these impairments may be related to abnormalities of the network in which each hippocampus takes part. Functional connectivity can play an important role in understanding how the hippocampi interact with other brain areas. It can be estimated via resting-state functional MRI (fMRI) experiments by evaluating patterns of functional networks. In this study, we investigated the functional connectivity patterns of 9 control subjects, 9 patients with right MTLE, and 9 patients with left MTLE.
RESULTS: Using resting-state fMRI, we detected differences in functional connectivity within and between the hippocampi in patients with unilateral MTLE associated with ipsilateral HS. Functional connectivity was more impaired ipsilateral to the seizure focus in both patient groups compared with control subjects, and this effect was even more pronounced in the left MTLE group.
CONCLUSIONS: These findings suggest that left HS causes a greater reduction of functional connectivity than right HS in subjects with left-hemisphere dominance for language.
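A minimal sketch of the core computation in seed-based resting-state connectivity — synthetic time series, not the study's data or pipeline:

```python
# Sketch: correlate a hippocampal seed's mean time series with every voxel,
# then Fisher z-transform the map for group comparison (e.g., left vs right MTLE).
import numpy as np

rng = np.random.default_rng(1)
n_tr, n_vox = 180, 5000                  # hypothetical: 180 volumes, 5000 voxels
data = rng.standard_normal((n_tr, n_vox))
seed_idx = np.arange(50)                 # hypothetical voxel indices of the seed

seed_ts = data[:, seed_idx].mean(axis=1)              # seed mean time series
z_data = (data - data.mean(axis=0)) / data.std(axis=0)
z_seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
conn_map = z_data.T @ z_seed / n_tr                   # Pearson r per voxel

fisher_z = np.arctanh(conn_map)                       # variance-stabilized for stats
print(f"connectivity map: {fisher_z.shape[0]} voxels")
```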
Affiliation(s)
- Fabrício R S Pereira
- Neuroimaging Laboratory, Department of Neurology, University of Campinas - UNICAMP, Cidade Universitária, Campinas, SP, Brazil
35
Newman AJ, Supalla T, Hauser PC, Newport EL, Bavelier D. Prosodic and narrative processing in American Sign Language: an fMRI study. Neuroimage 2010; 52:669-76. [PMID: 20347996 DOI: 10.1016/j.neuroimage.2010.03.055]
Abstract
Signed languages such as American Sign Language (ASL) are natural human languages that share all of the core properties of spoken human languages but differ in the modality through which they are communicated. Neuroimaging and patient studies have suggested similar left hemisphere (LH)-dominant patterns of brain organization for signed and spoken languages, suggesting that the linguistic nature of the information, rather than modality, drives brain organization for language. However, the role of the right hemisphere (RH) in sign language has been less explored. In spoken languages, the RH supports the processing of numerous types of narrative-level information, including prosody, affect, facial expression, and discourse structure. In the present fMRI study, we contrasted the processing of ASL sentences that contained these types of narrative information with similar sentences without marked narrative cues. For all sentences, Deaf native signers showed robust bilateral activation of perisylvian language cortices as well as the basal ganglia, medial frontal, and medial temporal regions. However, RH activation in the inferior frontal gyrus and superior temporal sulcus was greater for sentences containing narrative devices, including areas involved in processing narrative content in spoken languages. These results provide additional support for the claim that all natural human languages rely on a core set of LH brain regions, and extend our knowledge to show that narrative linguistic functions typically associated with the RH in spoken languages are similarly organized in signed languages.
Affiliation(s)
- Aaron J Newman
- Departments of Psychology, Psychiatry, Surgery, and Pediatrics (Division of Neurology), and Neuroscience Institute, Dalhousie University, Halifax, NS, Canada.
36
Brain systems mediating semantic and syntactic processing in deaf native signers: biological invariance and modality specificity. Proc Natl Acad Sci U S A 2009; 106:8784-9. [PMID: 19433795 DOI: 10.1073/pnas.0809609106]
Abstract
Studies of written and spoken language suggest that nonidentical brain networks support semantic and syntactic processing. Event-related brain potential (ERP) studies of spoken and written languages show that semantic anomalies elicit a posterior bilateral N400, whereas syntactic anomalies elicit a left anterior negativity, followed by a broadly distributed late positivity. The present study assessed whether these ERP indicators index the activity of language systems specific to the processing of aural-oral language or whether they index neural systems underlying any natural language, including sign language. The syntax of a signed language is mediated through space; thus the question arises of whether the comprehension of a signed language requires neural systems specific to this kind of code. Deaf native users of American Sign Language (ASL) were presented with signed sentences that were either correct or contained either a semantic error or a syntactic error (one of two types of verb agreement error). ASL sentences were presented at the natural rate of signing while the electroencephalogram was recorded. As predicted on the basis of earlier studies, an N400 was elicited by semantic violations. In addition, signed syntactic violations elicited an early frontal negativity and a later posterior positivity. Crucially, the distribution of the anterior negativity varied as a function of the type of syntactic violation, suggesting a unique involvement of spatial processing in signed syntax. Together, these findings suggest that biological constraints and experience shape the development of neural systems important for language.
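To illustrate how such an effect is typically quantified — simulated epochs, not the study's recordings — an N400 can be summarized as the mean amplitude difference between anomalous and correct sentences in a 300-500 ms window:

```python
# Sketch with simulated epochs: an N400-like effect as the mean amplitude
# difference (anomalous minus correct) in a 300-500 ms post-stimulus window.
import numpy as np

fs = 250                                  # Hz, assumed sampling rate
times = np.arange(-0.1, 0.8, 1 / fs)      # epoch from -100 to 800 ms
n_trials = 40
rng = np.random.default_rng(0)

def noise():
    """Trial-by-time noise matrix standing in for single-trial EEG."""
    return rng.normal(0, 0.5, (n_trials, len(times)))

# Anomalous trials get a negative deflection peaking near 400 ms.
deflection = -2.0 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05**2))
erp_correct = noise().mean(axis=0)
erp_anomalous = (noise() + deflection).mean(axis=0)

window = (times >= 0.3) & (times < 0.5)   # classic N400 window
n400 = erp_anomalous[window].mean() - erp_correct[window].mean()
print(f"N400 effect: {n400:.2f} microvolts (mean amplitude difference)")
```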
37
MacSweeney M, Capek CM, Campbell R, Woll B. The signing brain: the neurobiology of sign language. Trends Cogn Sci 2008; 12:432-40. [PMID: 18805728 DOI: 10.1016/j.tics.2008.07.010]
38
Hickok G, Pickell H, Klima E, Bellugi U. Neural dissociation in the production of lexical versus classifier signs in ASL: distinct patterns of hemispheric asymmetry. Neuropsychologia 2008; 47:382-7. [PMID: 18929583 DOI: 10.1016/j.neuropsychologia.2008.09.009]
Abstract
We examine the hemispheric organization for the production of two classes of ASL signs: lexical signs and classifier signs. Previous work has found strong left-hemisphere dominance for the production of lexical signs, but several authors have speculated that classifier signs may involve the right hemisphere to a greater degree because they can represent spatial information in a topographic, non-categorical manner. Twenty-one unilaterally brain-damaged signers (13 left-hemisphere damaged, 8 right-hemisphere damaged) were given a story narration task designed to elicit both lexical and classifier signs, and the relative frequencies of the two types of errors were tabulated. Left-hemisphere damaged signers produced significantly more lexical errors than did right-hemisphere damaged signers, whereas the reverse pattern held for classifier signs. Our findings argue for different patterns of hemispheric asymmetry for these two classes of ASL signs. We suggest that the requirement to encode analogue spatial information in the production of classifier signs results in increased involvement of right-hemisphere systems.
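The reported double dissociation in tabulated error counts lends itself to a simple contingency test; the counts below are illustrative, not the paper's data:

```python
# Sketch: does error type (lexical vs. classifier) depend on lesion side?
from scipy.stats import fisher_exact

#                 lexical errors, classifier errors
counts = [[30, 10],   # left-hemisphere-damaged signers (hypothetical)
          [8, 25]]    # right-hemisphere-damaged signers (hypothetical)

odds_ratio, p = fisher_exact(counts)
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.4f}")
```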
Affiliation(s)
- Gregory Hickok
- Department of Cognitive Sciences, University of California, Irvine, CA 92697, USA.
40
Keehner M, Gathercole SE. Cognitive adaptations arising from nonnative experience of sign language in hearing adults. Mem Cognit 2007; 35:752-61. [PMID: 17848032 DOI: 10.3758/bf03193312]
Abstract
Three experiments examined spatial transformation abilities in hearing people who acquired sign language in early adulthood. The performance of the nonnative hearing signers was compared with that of hearing people with no knowledge of sign language. The two groups were matched for age and gender. Using an adapted Corsi blocks paradigm, the experimental task simulated spatial relations in sign discourse but offered no opportunity for linguistic coding. Experiment 1 showed that the hearing signers performed significantly better than the nonsigners on a task that entailed 180 degree rotation, which is the canonical spatial relationship in sign language discourse. Experiment 2 found that the signers did not show the typical costs associated with processing rotated stimuli, and Experiment 3 ruled out the possibility that their advantage relied on seen hand movements. We conclude that sign language experience, even when acquired in adulthood by hearing people, can give rise to adaptations in cognitive processes associated with the manipulation of visuospatial information.
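One way to express the reported absence of a rotation cost in signers — with invented response times, purely illustrative — is to compare mean latencies for rotated versus unrotated arrays per group:

```python
# Sketch: per-group 180-degree rotation cost from hypothetical mean RTs (ms).
import numpy as np

rt = {
    "signers":    {"rot0": np.array([810.0, 790, 845]),
                   "rot180": np.array([820.0, 805, 850])},
    "nonsigners": {"rot0": np.array([800.0, 815, 790]),
                   "rot180": np.array([985.0, 1010, 960])},
}

for group, cond in rt.items():
    cost = (cond["rot180"] - cond["rot0"]).mean()  # RT penalty for rotation
    print(f"{group}: mean 180-degree rotation cost = {cost:.0f} ms")
```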
41
Serrien DJ, Ivry RB, Swinnen SP. Dynamics of hemispheric specialization and integration in the context of motor control. Nat Rev Neurosci 2006; 7:160-6. [PMID: 16429125 DOI: 10.1038/nrn1849]
Abstract
Behavioural and neurophysiological evidence convincingly establishes that the left hemisphere is dominant for motor skills carried out with either hand and for those that require bimanual coordination. Alongside this left-hemisphere prioritization, we argue that specialized functions of the right hemisphere are also indispensable for the realization of goal-directed behaviour. As such, lateralization of motor function is a dynamic and multifaceted process that emerges across different timescales and is contingent on task- and performer-related determinants.
Affiliation(s)
- Deborah J Serrien
- School of Psychology, University of Nottingham, University Park, Nottingham NG7 2RD, UK.
42
Altschuler EL, Multari A, Hirstein W, Ramachandran V. Situational therapy for Wernicke's aphasia. Med Hypotheses 2006; 67:713-6. [DOI: 10.1016/j.mehy.2005.10.035]
43
Pickell H, Klima E, Love T, Kritchevsky M, Bellugi U, Hickok G. Sign language aphasia following right hemisphere damage in a left-hander: a case of reversed cerebral dominance in a deaf signer? Neurocase 2005; 11:194-203. [PMID: 16006340 DOI: 10.1080/13554790590944717]
Abstract
Recent lesion studies have shown that left-hemisphere lesions often give rise to frank sign language aphasias in deaf signers, whereas right-hemisphere lesions do not, suggesting similar patterns of hemispheric asymmetry for signed and spoken language. We present here the case of a left-handed, deaf, lifelong signer who became aphasic after a right-hemisphere lesion. The subject exhibits deficits in sign language comprehension and production typically associated with left-hemisphere damaged signers. He also shows a pattern of local versus global processing deficits similar to that of left-hemisphere lesioned hearing patients. This case represents reversed lateralization for sign language and may also represent reversed lateralization for visuo-spatial abilities in a deaf signer.
Affiliation(s)
- Herbert Pickell
- Laboratory for Cognitive Neuroscience, The Salk Institute, La Jolla, CA 92037, USA.
44
Sakai KL, Tatsuno Y, Suzuki K, Kimura H, Ichida Y. Sign and speech: amodal commonality in left hemisphere dominance for comprehension of sentences. Brain 2005; 128:1407-17. [PMID: 15728651 DOI: 10.1093/brain/awh465]
Abstract
The neural basis of functional lateralization in language processing is a fundamental issue in systems neuroscience. We used functional MRI (fMRI) to examine hemispheric dominance during the processing of signed and spoken sentences. By using tasks involving comprehension of sentences (Sc) and sentential non-word detection (Sn), we compared different groups and stimulus conditions. Under the sign condition with sentence stimuli in Japanese Sign Language (JSL), we tested two groups of subjects: Deaf signers (Deaf) of JSL, and hearing bilinguals (children of Deaf adults, CODA) of JSL and Japanese (JPN). Under the speech condition, we tested hearing monolinguals (Mono) of JPN with auditory JPN stimuli alone (AUD), or with an audiovisual presentation of JPN and JSL stimuli (A&V). We found that the overall bilateral activation patterns under the four experimental conditions of Deaf, CODA, AUD and A&V were almost identical, despite differences in stimuli (JSL and JPN) and groups (Deaf, CODA and Mono). Moreover, consistently left-dominant activations involving frontal and temporo-parietal regions were observed across all four conditions. Furthermore, irrespective of the modalities of sign and speech, the main effects of task (Sc-Sn) were found primarily in the left regions: the ventral part of the inferior frontal gyrus (F3t/F3O), the precentral sulcus, the superior frontal gyrus, the middle temporal gyrus, the angular gyrus and the inferior parietal gyrus. Among these regions, only the left F3t/F3O showed no main effects of modality condition. These results demonstrate amodal commonality in the functional dominance of the left cortical regions for comprehension of sentences, as well as the essential and universal role of the left F3t/F3O in processing linguistic information from both signed and spoken sentences.
Affiliation(s)
- Kuniyoshi L Sakai
- Department of Basic Science, Graduate School of Arts and Sciences, The University of Tokyo, Komaba, Tokyo, Japan.
45
MacSweeney M, Campbell R, Woll B, Giampietro V, David AS, McGuire PK, Calvert GA, Brammer MJ. Dissociating linguistic and nonlinguistic gestural communication in the brain. Neuroimage 2004; 22:1605-18. [PMID: 15275917 DOI: 10.1016/j.neuroimage.2004.03.015]
Abstract
Gestures of the face, arms, and hands are components of signed languages used by Deaf people. Signaling codes, such as the racecourse betting code known as Tic Tac, are also made up of such gestures. Tic Tac lacks the phonological structure of British Sign Language (BSL) but is similar in terms of its visual and articulatory components. Using fMRI, we compared the neural correlates of viewing a gestural language (BSL) and a manual-brachial code (Tic Tac) relative to a low-level baseline task. We compared three groups: Deaf native signers, hearing native signers, and hearing nonsigners. None of the participants had any knowledge of Tic Tac. All three groups activated an extensive frontal-posterior network in response to both types of stimuli. Superior temporal cortex, including the planum temporale, was activated bilaterally in response to both types of gesture in all groups, irrespective of hearing status. The engagement of these traditionally auditory processing regions was greater in Deaf than hearing participants. These data suggest that the planum temporale may be responsive to visual movement in both deaf and hearing people, yet when hearing is absent early in development, the visual processing role of this region is enhanced. Greater activation for BSL than Tic Tac was observed in signers, but not in nonsigners, in the left posterior superior temporal sulcus and gyrus, extending into the supramarginal gyrus. This suggests that the left posterior perisylvian cortex is of fundamental importance to language processing, regardless of the modality in which it is conveyed.
Affiliation(s)
- Mairéad MacSweeney
- Department of Human Communication Science, University College London, London, WC1N 1PG, UK.
46
Allen A, Barnes A, Singh RS, Patterson J, Hadley DM, Wyper D. Perfusion SPECT in cochlear implantation and promontory stimulation. Nucl Med Commun 2004; 25:521-5. [PMID: 15100513 DOI: 10.1097/00006231-200405000-00015]
Abstract
BACKGROUND: Recent studies of profoundly deaf patients with cochlear implants have demonstrated that these patients are able to process sound in the auditory cortex in a similar way to normal-hearing subjects. However, there are large variations in outcome. Various clinical criteria are used for subject selection, and the decision as to which ear is to be implanted involves electrical stimulation of the promontory, which is used to confirm the persistence of auditory neurones and fibres that can be utilized by the cochlear implant. In this study we used SPECT with (99m)Tc-HMPAO to investigate activation of the auditory cortex in cochlear implantees post-surgery. We also investigated whether electrical stimulation of the promontory produces a change in blood flow in the auditory cortex in pre-surgery candidates, which would indicate viable auditory networks that could be utilized by a cochlear implant device.
METHODS AND RESULTS: Image analysis was performed with SPM99. A simple subtraction paradigm indicated bilateral activation of auditory cortex and Wernicke's area in the post-implant group during an auditory stimulus (speech), and bilateral activation of the ventral lateral posterior thalamus and bilateral auditory association cortex (BA21/22/42) in the pre-implant group during electrical stimulation, but no activation of primary auditory cortex. A conjunction analysis of the common areas of activation across both groups during the stimulus condition showed common bilateral activation of the primary auditory cortex in both groups (BA22/41/42). In addition, analysis of a subset of seven post-implant subjects who did not comprehend the speech in our study showed an activation (Pu < 0.05, where Pu is the peak voxel threshold, uncorrected for multiple comparisons) in the left auditory cortex extending into area BA22, synonymous with Wernicke's area. This supports the theory that this region has a sensory role.
47
Ross DS, Bever TG. The time course for language acquisition in biologically distinct populations: evidence from deaf individuals. Brain Lang 2004; 89:115-21. [PMID: 15010243 DOI: 10.1016/s0093-934x(03)00308-0]
Abstract
The present study provides evidence that individuals who have different patterns of cerebral lateralization and who develop along different maturational time courses can attain comparable levels of language proficiency. Right-handed individuals with left-handed family members (left-handed familials, LHFs) showed a shorter sensitive period for language acquisition than did right-handed individuals with only right-handed family members (right-handed familials, RHFs). The shorter sensitive period for LHFs may be due to a focus on non-linguistic, word-based conceptual information during language acquisition. RHFs may instead focus on grammatical relations, a kind of knowledge that matures later than lexical knowledge. This suggests that there may be different patterns of cerebral lateralization for language across normal populations as a function of familial handedness.
Affiliation(s)
- Danielle S Ross
- Department of Brain and Cognitive Sciences, University of Rochester, Meliora Hall, Room 360, Rochester, NY 14627-0268, USA.
48
Hickok G, Love-Geffen T, Klima ES. Role of the left hemisphere in sign language comprehension. Brain Lang 2002; 82:167-78. [PMID: 12096874 DOI: 10.1016/s0093-934x(02)00013-5]
Abstract
We investigated the relative roles of the left versus right hemisphere in the comprehension of American Sign Language (ASL). Nineteen lifelong signers with unilateral brain lesions [11 left-hemisphere damaged (LHD) and 8 right-hemisphere damaged (RHD)] performed three tasks: an isolated single-sign comprehension task, a sentence-level comprehension task involving simple one-step commands, and a sentence-level comprehension task involving more complex multiclause/multistep commands. Eighteen of the participants were deaf; one RHD subject was hearing and bilingual (ASL and English). Performance was examined in relation to two factors: whether the lesion was in the right or left hemisphere, and whether the temporal lobe was involved. The LHD group performed significantly worse than the RHD group on all three tasks, confirming left-hemisphere dominance for sign language comprehension. The group with left temporal lobe involvement was significantly impaired on all tasks, whereas each of the other three groups performed at better than 95% correct on the single-sign and simple-sentence comprehension tasks, with performance falling off only on the complex sentence comprehension items. A comparison with previously published data suggests that the degree of difficulty exhibited by the deaf RHD group on the complex sentences is comparable to that observed in hearing RHD subjects. Based on these findings we hypothesize (i) that deaf and hearing individuals have a similar degree of lateralization of language comprehension processes and (ii) that language comprehension depends primarily on the integrity of the left temporal lobe.
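A toy version of the group comparison — invented proportion-correct scores for small lesion groups, not the study's data — using a nonparametric test suited to small samples:

```python
# Sketch: comparing LHD vs RHD accuracy on a complex-sentence task.
import numpy as np
from scipy.stats import mannwhitneyu

lhd = np.array([0.55, 0.62, 0.48, 0.70, 0.66, 0.58])  # hypothetical scores
rhd = np.array([0.82, 0.88, 0.79, 0.91, 0.85])        # hypothetical scores

u, p = mannwhitneyu(lhd, rhd, alternative="two-sided")
print(f"LHD vs RHD complex-sentence accuracy: U = {u}, p = {p:.3f}")
```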
Affiliation(s)
- Gregory Hickok
- Department of Cognitive Sciences, University of California, Irvine, CA 92697, USA.
49
Newman AJ, Bavelier D, Corina D, Jezzard P, Neville HJ. A critical period for right hemisphere recruitment in American Sign Language processing. Nat Neurosci 2002; 5:76-80. [PMID: 11753419 DOI: 10.1038/nn775]
Abstract
Signed languages such as American Sign Language (ASL) are natural languages that are formally similar to spoken languages, and thus present an opportunity to examine the effects of language structure and modality on the neural organization for language. Native learners of spoken languages show predominantly left hemisphere (LH)-lateralized patterns of neural activation for language processing, whereas native learners of ASL show extensive right hemisphere (RH) as well as LH activation. We demonstrate that the RH angular gyrus is active during ASL processing only in native signers (hearing ASL-English bilinguals) but not in those who acquired ASL after puberty (hearing native English speakers). This is the first demonstration of a 'sensitive' or 'critical' period for language in an RH structure, with implications for language acquisition and for understanding age-related changes in neuroplasticity more generally.
Affiliation(s)
- Aaron J Newman
- Department of Psychology, 1227 University of Oregon, Eugene, Oregon 97403-1227, USA.
50
Corina DP, McBurney SL. The neural representation of language in users of American Sign Language. J Commun Disord 2001; 34:455-71. [PMID: 11725858 DOI: 10.1016/s0021-9924(01)00063-6]
Abstract
Studies of American Sign Language (ASL) offer unique insights into the fundamental properties of human language. Neurolinguistic studies explore the effects of left- and right-hemisphere lesions on the production and comprehension of signed language. Following damage to left-hemisphere perisylvian regions, signers, like users of spoken languages, exhibit frank aphasic disturbances. Sign language paraphasia illustrates the linguistic specificity of impairment. A case study involving cortical stimulation mapping (CSM) in a deaf signer provides evidence for the specialization of Broca's area in sign language production. The effects of right-hemisphere damage highlight the specialized properties of sign language use. Data from functional magnetic resonance imaging (fMRI) of deaf signers confirm the importance of left-hemisphere language structures in the use of signed language, but also reveal the contributions of right-hemisphere regions to the processing of ASL. These studies provide new insights into the complementary roles of biology and environment in language representation in the human brain.
LEARNING OUTCOMES: As a result of this activity, the participant will read studies of aphasia in users of signed language and a discussion of neurolinguistic studies of paraphasia in ASL. The participant will examine the role of the right hemisphere in language use and findings from a functional imaging study of sentence processing in ASL and English.
Affiliation(s)
- D P Corina
- Department of Psychology, University of Washington, Seattle 98195, USA.