1
Holcomb PJ, Akers EM, Midgley KJ, Emmorey K. Orthographic and Phonological Code Activation in Deaf and Hearing Readers. J Cogn 2024; 7:19. PMID: 38312942; PMCID: PMC10836169; DOI: 10.5334/joc.326.
Abstract
Grainger et al. (2006) were the first to use ERP masked priming to explore the differing contributions of phonological and orthographic representations to visual word processing. Here we adapted their paradigm to examine word processing in deaf readers. We investigated whether reading-matched deaf and hearing readers (n = 36) exhibit different ERP effects associated with the activation of orthographic and phonological codes during word processing. In a visual masked priming paradigm, participants performed a go/no-go categorization task (detect an occasional animal word). Critical target words preceded by orthographically-related (transposed letter - TL) or phonologically-related (pseudohomophone - PH) masked non-word primes were contrasted with the same target words preceded by letter substitution (control) non-word primes. Hearing readers exhibited typical N250 and N400 priming effects (greater negativity for control compared to TL or PH primed targets), and the TL and PH priming effects did not differ. For deaf readers, the N250 PH priming effect was later (250-350 ms), and they showed a reversed N250 priming effect for TL primes in this time window. The N400 TL and PH priming effects did not differ between groups. For hearing readers, those with better phonological and spelling skills showed larger early N250 PH and TL priming effects (150-250 ms). For deaf readers, those with better phonological skills showed a larger reversed TL priming effect in the late N250 window. We speculate that phonological knowledge modulates how strongly deaf readers rely on whole-word orthographic representations and/or the mapping from sublexical to lexical representations.
Affiliation(s)
- Emily M. Akers
- Department of Psychology, San Diego State University, CA, USA
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, CA, USA
2
Li C, Midgley KJ, Ferreira VS, Holcomb PJ, Gollan TH. Different language control mechanisms in comprehension and production: Evidence from paragraph reading. Brain Lang 2024; 248:105367. PMID: 38113600; PMCID: PMC11081765; DOI: 10.1016/j.bandl.2023.105367.
Abstract
Chinese-English bilinguals read paragraphs with language switches using a rapid serial visual presentation paradigm, either silently while ERPs were measured (Experiment 1) or aloud (Experiment 2). Each paragraph was written in either Chinese or English with several function or content words switched to the other language. In Experiment 1, language switches elicited an early, long-lasting positivity when switching from the dominant language to the nondominant language, but when switching to the dominant language, the positivity started later, and was never larger than when switching to the nondominant language. In addition, switch effects on function words were not significantly larger than those on content words in any analyses. In Experiment 2, participants produced more cross-language intrusion errors when switching to the dominant than to the nondominant language, and more errors on function than content words. These results implicate different control mechanisms in bilingual language selection across comprehension and production.
Affiliation(s)
- Chuchu Li
- University of California, San Diego, United States
3
Sehyr ZS, Midgley KJ, Emmorey K, Holcomb PJ. Asymmetric Event-Related Potential Priming Effects Between English Letters and American Sign Language Fingerspelling Fonts. Neurobiol Lang (Camb) 2023; 4:361-381. PMID: 37546690; PMCID: PMC10403274; DOI: 10.1162/nol_a_00104.
Abstract
Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL-English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations, and the relationship between the two orthographies, remains unexplored. We examined the temporal dynamics of single English letter and ASL fingerspelling font processing in an unmasked priming paradigm with centrally presented targets for 200 ms preceded by 100 ms primes. Event-related brain potentials were recorded while participants performed a probe detection task. Experiment 1 examined English letter-to-letter priming in deaf signers and hearing non-signers. We found that English letter recognition is similar for deaf and hearing readers, extending previous findings with hearing readers to unmasked presentations. Experiment 2 examined priming effects between English letters and ASL fingerspelling fonts in deaf signers only. We found that fingerspelling fonts primed both fingerspelling fonts and English letters, but English letters did not prime fingerspelling fonts, indicating a priming asymmetry between letters and fingerspelling fonts. We also found an N400-like priming effect when the primes were fingerspelling fonts which might reflect strategic access to the lexical names of letters. The studies suggest that deaf ASL-English bilinguals process English letters and ASL fingerspelling differently and that the two systems may have distinct neural representations. However, the fact that fingerspelling fonts can prime English letters suggests that the two orthographies may share abstract representations to some extent.
Affiliation(s)
- Zed Sevcikova Sehyr
- San Diego State University Research Foundation, San Diego State University, San Diego, CA, USA
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
4
McGarry ME, Midgley KJ, Holcomb PJ, Emmorey K. How (and why) does iconicity effect lexical access: An electrophysiological study of American sign language. Neuropsychologia 2023; 183:108516. PMID: 36796720; PMCID: PMC10576952; DOI: 10.1016/j.neuropsychologia.2023.108516.
Abstract
Prior research has found that iconicity facilitates sign production in picture-naming paradigms and has effects on ERP components. These findings may be explained by two separate hypotheses: (1) a task-specific hypothesis that suggests these effects occur because visual features of the iconic sign form can map onto the visual features of the pictures, and (2) a semantic feature hypothesis that suggests that the retrieval of iconic signs results in greater semantic activation due to the robust representation of sensory-motor semantic features compared to non-iconic signs. To test these two hypotheses, iconic and non-iconic American Sign Language (ASL) signs were elicited from deaf native/early signers using a picture-naming task and an English-to-ASL translation task, while electrophysiological recordings were made. Behavioral facilitation (faster response times) and reduced negativities were observed for iconic signs (both prior to and within the N400 time window), but only in the picture-naming task. No ERP or behavioral differences were found between iconic and non-iconic signs in the translation task. This pattern of results supports the task-specific hypothesis and provides evidence that iconicity only facilitates sign production when the eliciting stimulus and the form of the sign can visually overlap (a picture-sign alignment effect).
Affiliation(s)
- Meghan E McGarry
- Joint Doctoral Program in Language and Communication Disorders, San Diego State University and University of California, San Diego, San Diego, CA, USA
- Phillip J Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, CA, USA
5
Lee B, Martinez PM, Midgley KJ, Holcomb PJ, Emmorey K. Sensitivity to orthographic vs. phonological constraints on word recognition: An ERP study with deaf and hearing readers. Neuropsychologia 2022; 177:108420. PMID: 36396091; PMCID: PMC10152474; DOI: 10.1016/j.neuropsychologia.2022.108420.
Abstract
The role of phonology in word recognition has previously been investigated using a masked lexical decision task and transposed letter (TL) nonwords that were either pronounceable (barve) or unpronounceable (brvae). We used event-related potentials (ERPs) to investigate these effects in skilled deaf readers, who may be more sensitive to orthotactic than phonotactic constraints, which are conflated in English. Twenty deaf and twenty hearing adults completed a masked lexical decision task while ERPs were recorded. The groups were matched in reading skill and IQ, but deaf readers had poorer phonological ability. Deaf readers were faster and more accurate at rejecting TL nonwords than hearing readers. Neither group exhibited an effect of nonword pronounceability in RTs or accuracy. For both groups, the N250 and N400 components were modulated by lexicality (more negative for nonwords). The N250 was not modulated by nonword pronounceability, but pronounceable nonwords elicited a larger amplitude N400 than unpronounceable nonwords. Because pronounceable nonwords are more word-like, they may incite activation that is unresolved when no lexical entry is found, leading to a larger N400 amplitude. Similar N400 pronounceability effects for deaf and hearing readers, despite differences in phonological sensitivity, suggest these TL effects arise from sensitivity to lexical-level orthotactic constraints. Deaf readers may have an advantage in processing TL nonwords because of enhanced early visual attention and/or tight orthographic-to-semantic connections, bypassing the phonologically mediated route to word recognition.
Affiliation(s)
- Brittany Lee
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, United States
6
Anderson EJ, Midgley KJ, Holcomb PJ, Riès SK. Taxonomic and thematic semantic relationships in picture naming as revealed by Laplacian-transformed event-related potentials. Psychophysiology 2022; 59:e14091. PMID: 35554943; PMCID: PMC9788343; DOI: 10.1111/psyp.14091.
Abstract
Semantically related concepts co-activate when we speak. Prior research reported both behavioral interference and facilitation due to co-activation during picture naming. Different word relationships may account for some of this discrepancy. Taxonomically related words (e.g., WOLF-DOG) have been associated with semantic interference; thematically related words (e.g., BONE-DOG) have been associated with facilitation. Although these different semantic relationships have been associated with opposite behavioral outcomes, electrophysiological studies have found inconsistent effects on event-related potentials. We conducted a picture-word interference electroencephalography experiment to examine word retrieval dynamics in these different semantic relationships. Importantly, we used traditional monopolar analysis as well as Laplacian transformation allowing us to examine spatially deblurred event-related components. Both analyses revealed greater negativity (150-250 ms) for unrelated than related taxonomic pairs, though this effect was more spatially restricted for thematic pairs. Critically, Laplacian analyses revealed a larger negative-going component in the 300 to 500 ms time window in taxonomically related versus unrelated pairs which was restricted to a left frontal recording site. In parallel, an opposite effect was found in the same time window but localized to a left parietal site. Finding these opposite effects in the same time window was feasible thanks to the use of the Laplacian transformation and suggests that frontal control processes are concurrently engaged with cascading effects of the spread of activation through semantically related representations.
Affiliation(s)
- Elizabeth J. Anderson
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University, San Diego, California, USA; Joint Doctoral Program in Language and Communicative Disorders, University of California San Diego, La Jolla, California, USA
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, California, USA
- Stephanie K. Riès
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, California, USA
7
Li C, Midgley KJ, Holcomb PJ. ERPs Reveal How Semantic and Syntactic Processing Unfold across Parafoveal and Foveal Vision during Sentence Comprehension. Lang Cogn Neurosci 2022; 38:88-104. PMID: 36776698; PMCID: PMC9916175; DOI: 10.1080/23273798.2022.2091150.
Abstract
We examined how readers process content and function words in sentence comprehension with ERPs. Participants read simple declarative sentences using a rapid serial visual presentation (RSVP) with flankers paradigm. Sentences contained either an unexpected semantically anomalous content word, an unexpected syntactically anomalous function word or were well formed with no anomalies. ERPs were examined when target words were in parafoveal or foveal vision. Unexpected content words elicited a typically distributed N400 when displayed in the parafovea, followed by a longer-lasting, widely distributed positivity starting around 300 ms once foveated. Unexpected function words elicited a left lateralized LAN-like component when presented in the parafovea, followed by a left lateralized, posteriorly distributed P600 when foveated. These results suggest that both semantic and syntactic processing involve two stages: an initial, fast process that can be completed in parafoveal vision, followed by a more in-depth, attentionally mediated assessment that occurs with direct attention.
8
Emmorey K, Midgley KJ, Holcomb PJ. Tracking the time course of sign recognition using ERP repetition priming. Psychophysiology 2022; 59:e13975. PMID: 34791683; PMCID: PMC9583460; DOI: 10.1111/psyp.13975.
Abstract
Repetition priming and event-related potentials (ERPs) were used to investigate the time course of sign recognition in deaf users of American Sign Language. Signers performed a go/no-go semantic categorization task to rare probe signs referring to people; critical target items were repeated and unrelated signs. In Experiment 1, ERPs were time-locked either to the onset of the video or to sign onset within the video; in Experiment 2, the same full videos were clipped so that video and sign onset were aligned (removing transitional movements), and ERPs were time-locked to video/sign onset. All analyses revealed an N400 repetition priming effect (less negativity for repeated than unrelated signs) but differed in the timing and/or duration of the N400 effect. Results from Experiment 1 revealed that repetition priming effects began before sign onset within a video, suggesting that signers are sensitive to linguistic information within the transitional movement to sign onset. The timing and duration of the N400 for clipped videos were more parallel to that observed previously for auditorily presented words and was 200 ms shorter than either time-locking analysis from Experiment 1. We conclude that time-locking to full video onset is optimal when early ERP components or sensitivity to transitional movements are of interest and that time-locking to the onset of clipped videos is optimal for priming studies with fluent signers.
Affiliation(s)
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, California, USA
- Katherine J. Midgley
- Department of Psychology, San Diego State University, San Diego, California, USA
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, California, USA
9
Declerck M, Meade G, Midgley KJ, Holcomb PJ, Roelofs A, Emmorey K. On the Connection Between Language Control and Executive Control-An ERP Study. Neurobiol Lang (Camb) 2021; 2:628-646. PMID: 37214623; PMCID: PMC10158610; DOI: 10.1162/nol_a_00032.
Abstract
Models vary in the extent to which language control processes are domain general. Those that posit that language control is at least partially domain general insist on an overlap between language control and executive control at the goal level. To further probe whether or not language control is domain general, we conducted the first event-related potential (ERP) study that directly compares language-switch costs, as an index of language control, and task-switch costs, as an index of executive control. The language switching and task switching methodologies were identical, except that the former required switching between languages (English or Spanish) whereas the latter required switching between tasks (color naming or category naming). This design allowed us to directly compare control processes at the goal level (cue-locked ERPs) and at the task performance level (picture-locked ERPs). We found no significant differences in the switch-related cue-locked and picture-locked ERP patterns across the language and task switching paradigms. These results support models of domain-general language control.
Affiliation(s)
- Mathieu Declerck
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, USA
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Department of Linguistics and Literary Studies, Vrije Universiteit Brussel, Brussels, Belgium
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, USA
- Ardi Roelofs
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, USA
10
Meade G, Lee B, Massa N, Holcomb PJ, Midgley KJ, Emmorey K. Are form priming effects phonological or perceptual? Electrophysiological evidence from American Sign Language. Cognition 2021; 220:104979. PMID: 34906848; PMCID: PMC9578293; DOI: 10.1016/j.cognition.2021.104979.
Abstract
Form priming has been used to identify and demarcate the processes that underlie word and sign recognition. The facilitation that results from the prime and target being related in form is typically interpreted in terms of pre-activation of linguistic representations, with little to no consideration for the potential contributions of increased perceptual overlap between related pairs. Indeed, isolating the contribution of perceptual similarity is impossible in spoken languages; there are no listeners who can perceive speech but have not acquired a sound-based phonological system. Here, we compared the electrophysiological indices of form priming effects in American Sign Language between hearing non-signers (i.e., who had no visual-manual phonological system) and deaf signers. We reasoned that similarities in priming effects between groups would most likely be perceptual in nature, whereas priming effects that are specific to the signer group would reflect pre-activation of phonological representations. Behavior in the go/no-go repetition detection task was remarkably similar between groups. Priming in a pre-N400 window was also largely similar across groups, consistent with an early effect of perceptual similarity. However, priming effects diverged between groups during the subsequent N400 and post-N400 windows. Signers had more typical form priming effects and were especially attuned to handshape overlap, whereas non-signers did not exhibit an N400 component and were more sensitive to location overlap. We attribute this pattern to an interplay between perceptual similarity and phonological knowledge. Perceptual similarity contributes to early phonological priming effects, while phonological knowledge tunes sensitivity to linguistically relevant dimensions of perceptual similarity.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, United States of America
- Brittany Lee
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, United States of America
- Natasja Massa
- Department of Psychology, San Diego State University, United States of America
- Phillip J Holcomb
- Department of Psychology, San Diego State University, United States of America
- Katherine J Midgley
- Department of Psychology, San Diego State University, United States of America
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States of America
11
McGarry ME, Massa N, Mott M, Midgley KJ, Holcomb PJ, Emmorey K. Matching pictures and signs: An ERP study of the effects of iconic structural alignment in American sign language. Neuropsychologia 2021; 162:108051. PMID: 34624260; DOI: 10.1016/j.neuropsychologia.2021.108051.
Abstract
Event-related potentials (ERPs) were used to explore the effects of iconicity and structural visual alignment between a picture-prime and a sign-target in a picture-sign matching task in American Sign Language (ASL). Half the targets were iconic signs and were presented after a) a matching visually-aligned picture (e.g., the shape and location of the hands in the sign COW align with the depiction of a cow with visible horns), b) a matching visually-nonaligned picture (e.g., the cow's horns were not clearly shown), and c) a non-matching picture (e.g., a picture of a swing instead of a cow). The other half of the targets were filler signs. Trials in the matching condition were responded to faster than those in the non-matching condition and were associated with smaller N400 amplitudes in deaf ASL signers. These effects were also observed for hearing non-signers performing the same task with spoken-English targets. Trials where the picture-prime was aligned with the sign target were responded to faster than non-aligned trials and were associated with a reduced P3 amplitude rather than a reduced N400, suggesting that picture-sign alignment facilitated the decision process, rather than lexical access. These ERP and behavioral effects of alignment were found only for the ASL signers. The results indicate that iconicity effects on sign comprehension may reflect a task-dependent strategic use of iconicity, rather than facilitation of lexical access.
Affiliation(s)
- Meghan E McGarry
- Joint Doctoral Program in Language and Communication Disorders, San Diego State University and University of California, San Diego, San Diego, CA, USA
- Natasja Massa
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Megan Mott
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Phillip J Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, CA, USA
12
Declerck M, Meade G, Midgley KJ, Holcomb PJ, Roelofs A, Emmorey K. Language control in bimodal bilinguals: Evidence from ERPs. Neuropsychologia 2021; 161:108019. PMID: 34487737; DOI: 10.1016/j.neuropsychologia.2021.108019.
Abstract
It is currently unclear to what degree language control, which minimizes non-target language interference and increases the probability of selecting target-language words, is similar for sign-speech (bimodal) bilinguals and spoken language (unimodal) bilinguals. To further investigate the nature of language control processes in bimodal bilinguals, we conducted the first event-related potential (ERP) language switching study with hearing American Sign Language (ASL)-English bilinguals. The results showed a pattern that has not been observed in any unimodal language switching study: a switch-related positivity over anterior sites and a switch-related negativity over posterior sites during ASL production in both early and late time windows. No such pattern was found during English production. We interpret these results as evidence that bimodal bilinguals uniquely engage language control at the level of output modalities.
Affiliation(s)
- Mathieu Declerck
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, USA; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands; Linguistics and Literary Studies, Vrije Universiteit Brussel, Brussels, Belgium
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, USA
- Phillip J Holcomb
- Department of Psychology, San Diego State University, San Diego, USA
- Ardi Roelofs
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, Netherlands
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, USA
13
Meade G, Lee B, Massa N, Holcomb PJ, Midgley KJ, Emmorey K. The organization of the American Sign Language lexicon: Comparing one- and two-parameter ERP phonological priming effects across tasks. Brain Lang 2021; 218:104960. PMID: 33940343; PMCID: PMC8543839; DOI: 10.1016/j.bandl.2021.104960.
Abstract
We used phonological priming and ERPs to investigate the organization of the lexicon in American Sign Language. Across go/no-go repetition detection and semantic categorization tasks, targets in related pairs that shared handshape and location elicited smaller N400s than targets in unrelated pairs, indicative of facilitated processing. Handshape-related targets also elicited smaller N400s than unrelated targets, but only in the repetition task. The location priming effect reversed direction across tasks, with slightly larger amplitude N400s for targets in related versus unrelated pairs in the semantic task, indicative of interference. These patterns imply that handshape and location play different roles during sign recognition and that there is a hierarchical organization for the sign lexicon. Similar to interactive-activation models of word recognition, we argue for differentiation between sublexical facilitation and lexical competition. Lexical competition is primarily driven by the location parameter and is more engaged when identification of single lexico-semantic entries is required.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University, University of California, San Diego, USA
- Brittany Lee
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University, University of California, San Diego, USA
- Natasja Massa
- Department of Psychology, San Diego State University, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, USA
14
Emmorey K, Holcomb PJ, Midgley KJ. Masked ERP repetition priming in deaf and hearing readers. Brain Lang 2021; 214:104903. PMID: 33486233; PMCID: PMC8299519; DOI: 10.1016/j.bandl.2020.104903.
Abstract
Deaf readers provide unique insights into how the reading circuit is modified by altered linguistic and sensory input. We investigated whether reading-matched deaf and hearing readers (n = 62) exhibit different ERP effects associated with orthographic to phonological mapping (N250) or lexico-semantic processes (N400). In a visual masked priming paradigm, participants performed a go/no-go categorization task; target words were preceded by repeated or unrelated primes. Prime duration and word frequency were manipulated. Hearing readers exhibited typical N250 and N400 priming effects with 50 ms primes (greater negativity for unrelated primes) and smaller effects with 100 ms primes. Deaf readers showed a surprising reversed priming effect with 50 ms primes (greater negativity for related primes), and more typical N250 and N400 effects with 100 ms primes. Correlation results suggested deaf readers with poorer phonological skills drove this effect. We suggest that weak phonological activation may create orthographic "repetition enhancement" or form/lexical competition in deaf readers.
Affiliation(s)
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, CA, USA
15
Mott M, Midgley KJ, Holcomb PJ, Emmorey K. Cross-modal translation priming and iconicity effects in deaf signers and hearing learners of American Sign Language. Biling (Camb Engl) 2020; 23:1032-1044. PMID: 33897272; PMCID: PMC8061897; DOI: 10.1017/s1366728919000889.
Abstract
This study used ERPs to a) assess the neural correlates of cross-linguistic, cross-modal translation priming in hearing beginning learners of American Sign Language (ASL) and deaf highly proficient signers and b) examine whether sign iconicity modulates these priming effects. Hearing learners exhibited translation priming for ASL signs preceded by English words (greater negativity for unrelated than translation primes) later in the ERP waveform than deaf signers and exhibited earlier and greater priming for iconic than non-iconic signs. Iconicity did not modulate translation priming effects either behaviorally or in the ERPs for deaf signers (except in an 800-1000 ms time window). Because deaf signers showed early translation priming effects (beginning at 400-600 ms), we suggest that iconicity did not facilitate lexical access, but deaf signers may have recognized sign iconicity later in processing. Overall, the results indicate that iconicity speeds lexical access for L2 sign language learners, but not for proficient signers.
Affiliation(s)
- Megan Mott
- Department of Psychology, San Diego State University
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University
16
Emmorey K, Mott M, Meade G, Holcomb PJ, Midgley KJ. Lexical selection in bimodal bilinguals: ERP evidence from picture-word interference. Lang Cogn Neurosci 2020; 36:840-853. PMID: 34485589; PMCID: PMC8411899; DOI: 10.1080/23273798.2020.1821905.
Abstract
The picture-word interference (PWI) paradigm and ERPs were used to investigate whether lexical selection in deaf and hearing ASL-English bilinguals occurs via lexical competition or whether the response exclusion hypothesis (REH) for PWI effects is supported. The REH predicts that semantic interference should not occur for bimodal bilinguals because sign and word responses do not compete within an output buffer. Bimodal bilinguals named pictures in ASL, preceded by either a translation equivalent, semantically-related, or unrelated English written word. In both the translation and semantically-related conditions bimodal bilinguals showed facilitation effects: reduced RTs and N400 amplitudes for related compared to unrelated prime conditions. We also observed an unexpected focal left anterior positivity that was stronger in the translation condition, which we speculate may be due to articulatory priming. Overall, the results support the REH and models of bilingual language production that assume lexical selection occurs without competition between languages.
Affiliation(s)
- Karen Emmorey
- Corresponding author: Laboratory for Language and Cognitive Neuroscience, 6495 Alvarado Road, Suite 200, San Diego, CA 92120
- Megan Mott
- Psychology Department, San Diego State University
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University, University of California, San Diego
17
Meade G, Grainger J, Midgley KJ, Holcomb PJ, Emmorey K. An ERP investigation of orthographic precision in deaf and hearing readers. Neuropsychologia 2020; 146:107542. PMID: 32590018; PMCID: PMC7502516; DOI: 10.1016/j.neuropsychologia.2020.107542.
Abstract
Phonology is often assumed to play a role in the tuning of orthographic representations, but it is unknown whether deaf readers' reduced access to spoken phonology reduces orthographic precision. To index how precisely deaf and hearing readers encode orthographic information, we used a masked transposed-letter (TL) priming paradigm. Word targets were preceded by TL primes formed by reversing two letters in the word and substitution primes in which the same two letters were replaced. The two letters that were manipulated were either in adjacent or non-adjacent positions, yielding four prime conditions: adjacent TL (e.g., chikcen-CHICKEN), adjacent substitution (e.g., chidven-CHICKEN), non-adjacent TL (e.g., ckichen-CHICKEN), and non-adjacent substitution (e.g., cticfen-CHICKEN). Replicating the standard TL priming effects, targets preceded by TL primes elicited smaller amplitude negativities and faster responses than those preceded by substitution primes overall. This indicates some degree of flexibility in the associations between letters and their positions within words. More flexible (i.e., less precise) representations are thought to be more susceptible to activation by TL primes, resulting in larger TL priming effects. However, the size of the TL priming effects was virtually identical between groups. Moreover, the ERP effects were shifted in time such that the adjacent TL priming effect arose earlier than the non-adjacent TL priming effect in both groups. These results suggest that phonological tuning is not required to represent orthographic information in a precise manner.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, USA
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, CNRS & Aix-Marseille Université, France
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, USA
18
McGarry ME, Mott M, Midgley KJ, Holcomb PJ, Emmorey K. Picture-naming in American Sign Language: an electrophysiological study of the effects of iconicity and structured alignment. Lang Cogn Neurosci 2020; 36:199-210. PMID: 33732747; PMCID: PMC7959108; DOI: 10.1080/23273798.2020.1804601.
Abstract
A picture-naming task and ERPs were used to investigate effects of iconicity and visual alignment between signs and pictures in American Sign Language (ASL). For iconic signs, half the pictures visually overlapped with phonological features of the sign (e.g., the fingers of CAT align with a picture of a cat with prominent whiskers), while half did not (whiskers are not shown). Iconic signs were produced numerically faster than non-iconic signs and were associated with larger N400 amplitudes, akin to concreteness effects. Pictures aligned with iconic signs were named faster than non-aligned pictures, and there was a reduction in N400 amplitude. No behavioral effects were observed for the control group (English speakers). We conclude that sensory-motoric semantic features are represented more robustly for iconic than non-iconic signs (eliciting a concreteness-like N400 effect) and visual overlap between pictures and the phonological form of iconic signs facilitates lexical retrieval (eliciting a reduced N400).
Affiliation(s)
- Meghan E. McGarry
- Joint Doctoral Program in Language and Communication Disorders, San Diego State University and University of California, San Diego, San Diego, CA, USA
- Megan Mott
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language and Hearing Sciences, San Diego State University, San Diego, CA, USA
19
Abstract
A domain-general monitoring mechanism is proposed to be involved in overt speech monitoring. This mechanism is reflected in a medial frontal component, the error negativity (Ne), present in both errors and correct trials (Ne-like wave) but larger in errors than correct trials. In overt speech production, this negativity starts to rise before speech onset and is therefore associated with inner speech monitoring. Here, we investigate whether the same monitoring mechanism is involved in sign language production. Twenty deaf signers (American Sign Language [ASL] dominant) and 16 hearing signers (English dominant) participated in a picture-word interference paradigm in ASL. As in previous studies, ASL naming latencies were measured using the keyboard release time. EEG results revealed a medial frontal negativity peaking within 15 msec after keyboard release in the deaf signers. This negativity was larger in errors than correct trials, as previously observed in spoken language production. No clear negativity was present in the hearing signers. In addition, the slope of the Ne was correlated with ASL proficiency (measured by the ASL Sentence Repetition Task) across signers. Our results indicate that a similar medial frontal mechanism is engaged in preoutput language monitoring in sign and spoken language production. These results suggest that the monitoring mechanism reflected by the Ne/Ne-like wave is independent of output modality (i.e., spoken or signed) and likely monitors prearticulatory representations of language. Differences between groups may be linked to several factors including differences in language proficiency or more variable lexical access to motor programming latencies for hearing than deaf signers.
Affiliation(s)
- Karen Emmorey
- San Diego State University
- University of California, San Diego
20
Sehyr ZS, Midgley KJ, Holcomb PJ, Emmorey K, Plaut DC, Behrmann M. Unique N170 signatures to words and faces in deaf ASL signers reflect experience-specific adaptations during early visual processing. Neuropsychologia 2020; 141:107414. PMID: 32142729; DOI: 10.1016/j.neuropsychologia.2020.107414.
Abstract
Previous studies with deaf adults reported reduced N170 waveform asymmetry to visual words, a finding attributed to reduced phonological mapping in left-hemisphere temporal regions compared to hearing adults. An open question is whether this pattern results from reduced phonological processing or from general neurobiological adaptations in the visual processing of deaf individuals. Deaf ASL signers and hearing nonsigners performed a same-different discrimination task with visually presented words, faces, or cars while scalp EEG, time-locked to the onset of the first item in each pair, was recorded. For word recognition, the typical left-lateralized N170 in hearing participants and the reduced left-sided asymmetry in deaf participants were replicated. The groups did not differ on word discrimination, but better orthographic skill was associated with a larger right-hemisphere N170 only for deaf participants. Face recognition was characterized by unique N170 signatures in both groups, and deaf individuals exhibited superior face discrimination performance. Neither the laterality nor the discrimination effects generalized to the N170 responses to cars, confirming that deaf signers' reduced lateralization is specific to words and, critically, supporting the phonological mapping hypothesis. P1 was attenuated for deaf participants compared to hearing participants, but in both groups P1 discriminated between highly familiar learned stimuli (words and faces) and less familiar objects (cars). These distinct electrophysiological signatures to words and faces reflect experience-driven adaptations that do not generalize to object recognition.
21
Midgley KJ, Medina YE, Lee B. Studying bilingual learners and users of spoken and signed languages: A neuro-cognitive approach. Psychology of Learning and Motivation 2020. DOI: 10.1016/bs.plm.2020.03.002.
22
Emmorey K, Winsler K, Midgley KJ, Grainger J, Holcomb PJ. Neurophysiological Correlates of Frequency, Concreteness, and Iconicity in American Sign Language. Neurobiol Lang (Camb) 2020; 1:249-267. PMID: 33043298; PMCID: PMC7544239; DOI: 10.1162/nol_a_00012.
Abstract
To investigate possible universal and modality-specific factors that influence the neurophysiological response during lexical processing, we recorded event-related potentials while a large group of deaf adults (n = 40) viewed 404 signs in American Sign Language (ASL) that varied in ASL frequency, concreteness, and iconicity. Participants performed a go/no-go semantic categorization task (does the sign refer to people?) to video clips of ASL signs (clips began with the signer's hands at rest). Linear mixed-effects regression models were fit with per-participant, per-trial, and per-electrode data, allowing us to identify unique effects of each lexical variable. We observed an early effect of frequency (greater negativity for less frequent signs) beginning at 400 ms post-video onset at anterior sites, which we interpreted as reflecting form-based lexical processing. This effect was followed by a more widely distributed posterior response that we interpreted as reflecting lexical-semantic processing. Paralleling spoken language, more concrete signs elicited greater negativities, beginning 600 ms post-video onset with a wide scalp distribution. Finally, there were no effects of iconicity (except for a weak effect in the latest epochs; 1,000-1,200 ms), suggesting that iconicity does not modulate the neural response during sign recognition. Despite the perceptual and sensorimotoric differences between signed and spoken languages, the overall results indicate very similar neurophysiological processes underlie lexical access for both signs and words.
Affiliation(s)
- Kurt Winsler
- Department of Psychology, University of California, Davis
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, Aix-Marseille University, Centre National de la Recherche Scientifique
23
Sheppard SM, Love T, Midgley KJ, Shapiro LP, Holcomb PJ. Using prosody during sentence processing in aphasia: Evidence from temporal neural dynamics. Neuropsychologia 2019; 134:107197. PMID: 31542361; PMCID: PMC6911311; DOI: 10.1016/j.neuropsychologia.2019.107197.
Affiliation(s)
- Shannon M Sheppard
- San Diego State University, USA; University of California, San Diego, USA
- Tracy Love
- San Diego State University, USA; University of California, San Diego, USA
- Lewis P Shapiro
- San Diego State University, USA; University of California, San Diego, USA
24
Lee B, Meade G, Midgley KJ, Holcomb PJ, Emmorey K. ERP Evidence for Co-Activation of English Words during Recognition of American Sign Language Signs. Brain Sci 2019; 9:E148. PMID: 31234356; PMCID: PMC6627215; DOI: 10.3390/brainsci9060148.
Abstract
Event-related potentials (ERPs) were used to investigate co-activation of English words during recognition of American Sign Language (ASL) signs. Deaf and hearing signers viewed pairs of ASL signs and judged their semantic relatedness. Half of the semantically unrelated signs had English translations that shared an orthographic and phonological rime (e.g., BAR-STAR) and half did not (e.g., NURSE-STAR). Classic N400 and behavioral semantic priming effects were observed in both groups. For hearing signers, targets in sign pairs with English rime translations elicited a smaller N400 compared to targets in pairs with unrelated English translations. In contrast, a reversed N400 effect was observed for deaf signers: target signs in English rime translation pairs elicited a larger N400 compared to targets in pairs with unrelated English translations. This reversed effect was overtaken by a later, more typical ERP priming effect for deaf signers who were aware of the manipulation. These findings provide evidence that implicit language co-activation in bimodal bilinguals is bidirectional. However, the distinct pattern of effects in deaf and hearing signers suggests that it may be modulated by differences in language proficiency and dominance as well as by asymmetric reliance on orthographic versus phonological representations.
Affiliation(s)
- Brittany Lee
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego, CA 92182, USA
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego, CA 92182, USA
- Karen Emmorey
- Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA 92182, USA
25
Pu H, Medina YE, Holcomb PJ, Midgley KJ. Testing for Nonselective Bilingual Lexical Access Using L1 Attrited Bilinguals. Brain Sci 2019; 9:126. PMID: 31159405; PMCID: PMC6628369; DOI: 10.3390/brainsci9060126.
Abstract
Research in the past few decades generally supported a nonselective view of bilingual lexical access, where a bilingual’s two languages are both active during monolingual processing. However, recent work by Costa et al. (2017) brought this into question by reinterpreting evidence for nonselectivity in a selective manner. We manipulated the factor of first language (L1) attrition in an event-related potential (ERP) experiment to disentangle Costa and colleagues’ selective processing proposal versus the traditional nonselective processing view of bilingual lexical access. Spanish–English bilinguals demonstrated an N400 effect of L1 attrition during implicit L1 processing in a second language (L2) semantic judgment task, indicating the contribution of variable L1 lexical access during L2 processing. These results are incompatible with Costa and colleagues’ selective model, adding to the literature supporting a nonselective view of bilingual lexical access.
Affiliation(s)
- He Pu
- Department of Psychology, Tufts University, Medford, MA 02155, USA
- Yazmin E Medina
- Department of Psychology, San Diego State University, San Diego, CA 92182, USA
- Phillip J Holcomb
- Department of Psychology, San Diego State University, San Diego, CA 92182, USA
- Katherine J Midgley
- Department of Psychology, San Diego State University, San Diego, CA 92182, USA
26
Meade G, Grainger J, Midgley KJ, Holcomb PJ, Emmorey K. ERP effects of masked orthographic neighbour priming in deaf readers. Lang Cogn Neurosci 2019; 34:1016-1026. PMID: 31595216; PMCID: PMC6781870; DOI: 10.1080/23273798.2019.1614201.
Abstract
In masked priming studies with hearing readers, neighbouring words (e.g., wine, vine) compete through lateral inhibition. Here, we asked whether lateral inhibition also characterizes visual word recognition in deaf readers and whether the neural signature of this competition is the same as for hearing readers. Only real words have lexical representations that engage in lateral inhibition. Therefore, we compared processing of target words following neighbouring prime words (e.g., wine-VINE) and pseudowords (e.g., bine-VINE). Targets following words elicited larger amplitude N400s and slower lexical decision responses than those following pseudowords, indicating more effortful processing due to lateral inhibition. Although these effects went in the same direction for hearing and deaf readers, the distribution of the N400 effect differed. We associate the more anterior effect in hearing readers with stronger co-activation of, and competition among, phonological representations. Thus, deaf readers use lexical competition to recognize visual words, but it is primarily restricted to orthographic representations.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego, San Diego, CA, USA
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, CNRS & Aix-Marseille Université, France
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
27
Glezer LS, Weisberg J, O'Grady Farnady C, McCullough S, Midgley KJ, Holcomb PJ, Emmorey K. Orthographic and phonological selectivity across the reading system in deaf skilled readers. Neuropsychologia 2018; 117:500-512. PMID: 30005927; DOI: 10.1016/j.neuropsychologia.2018.07.010.
Abstract
People who are born deaf often have difficulty learning to read. Recently, several studies have examined the neural substrates involved in reading in deaf people and found a left lateralized reading system similar to hearing people involving temporo-parietal, inferior frontal, and ventral occipito-temporal cortices. Previous studies in typical hearing readers show that within this reading network there are separate regions that specialize in processing orthography and phonology. We used fMRI rapid adaptation in deaf adults who were skilled readers to examine neural selectivity in three functional ROIs in the left hemisphere: temporoparietal cortex (TPC), inferior frontal gyrus (IFG), and the visual word form area (VWFA). Results show that in deaf skilled readers, the left VWFA showed selectivity for orthography similar to what has been reported for hearing readers, the TPC showed less sensitivity to phonology than previously reported for hearing readers using the same paradigm, and the IFG showed selectivity to orthography, but not phonology (similar to what has been reported previously for hearing readers). These results provide evidence that while skilled deaf readers demonstrate coarsely tuned phonological representations in the TPC, they develop finely tuned representations for the orthography of written words in the VWFA and IFG. This result suggests that phonological tuning in the TPC may have little impact on the neural network associated with skilled reading for deaf adults.
Affiliation(s)
- Laurie S Glezer
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States; Department of Psychology, San Diego State University, United States
- Jill Weisberg
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
- Cindy O'Grady Farnady
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
- Stephen McCullough
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
28
Abstract
A longstanding debate centers around how beginning adult bilinguals process words in their second language (L2). Do they access the meaning of the L2 words directly or do they first activate the native language (L1) translation equivalents in order to access meaning? To address this question, we used ERPs to investigate how newly learned L2 words influence processing of their L1 translation equivalents. We taught participants the meanings of 80 novel L2 (pseudo)words by presenting them with pictures of familiar objects. After 3 days of learning, participants were tested in a backward translation priming paradigm with a short (140 ms) stimulus onset asynchrony. L1 targets preceded by their L2 translations elicited faster responses and smaller amplitude negativities than the same L1 targets preceded by unrelated L2 words. The bulk of the ERP translation priming effect occurred within the N400 window (350–550 ms), suggesting that the new L2 words were automatically activating their semantic representations. A weaker priming effect in the preceding window (200–350 ms) was found at anterior sites, providing some evidence that the forms of the L1 translation equivalents had also been activated. These results have implications for models of L2 processing at the earliest stages of learning.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego, San Diego, CA, United States (corresponding author)
- Katherine J. Midgley
- Department of Psychology, San Diego State University, San Diego, CA, United States
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, United States
29
Meade G, Grainger J, Midgley KJ, Emmorey K, Holcomb PJ. From sublexical facilitation to lexical competition: ERP effects of masked neighbor priming. Brain Res 2018; 1685:29-41. PMID: 29407530; PMCID: PMC5840043; DOI: 10.1016/j.brainres.2018.01.029.
Abstract
Interactive-activation models posit that visual word recognition involves co-activation of orthographic neighbors (e.g., note, node) and competition among them via lateral inhibitory connections. Behavioral evidence of this lexical competition comes from masked priming paradigms, in which target words elicit slower responses when preceded by a neighbor (e.g., note-NODE) than when preceded by an unrelated word (e.g., kiss-NODE). In the present study, we used ERPs to investigate how masked high frequency word primes influence processing of low frequency word and pseudoword targets. Word targets preceded by a neighbor prime elicited larger negativities within the N400 window than those preceded by an unrelated prime across bilateral anterior sites, which we call a reversed N400 priming effect. Consistent with the behavioral literature, the size of the reversed N400 priming effect was larger for targets from high-density orthographic neighborhoods and for participants who scored higher on a behavioral measure of spelling recognition. Indeed, the opposite effect (i.e., smaller negativities within the N400 window for word targets preceded by a neighbor) was observed for words from low-density orthographic neighborhoods and for less-skilled spellers. Traditional priming was also observed within the N250 window for word targets and within both the N250 and N400 windows for pseudoword targets. The specificity of the reversed N400 priming effect to situations in which both words have precise lexical representations suggests that it, like the behavioral interference effect, indexes lexical competition during visual word recognition.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, United States
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, CNRS & Aix-Marseille University, France
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
30
Winsler K, Midgley KJ, Grainger J, Holcomb PJ. An electrophysiological megastudy of spoken word recognition. Lang Cogn Neurosci 2018; 33:1063-1082. PMID: 33912620; PMCID: PMC8078007; DOI: 10.1080/23273798.2018.1455985.
Abstract
This study used electrophysiological recordings elicited by a large sample of spoken words to track the time-course of word frequency, phonological neighbourhood density, concreteness and stimulus duration effects in two experiments. Fifty subjects were presented with more than a thousand spoken words during either a go/no-go lexical decision task (Experiment 1) or a go/no-go semantic categorisation task (Experiment 2) while EEG was collected. Linear mixed-effects modelling was used to analyse the data. Effects of word frequency were found on the N400 and also as early as 100 ms in Experiment 1 but not Experiment 2. Phonological neighbourhood density produced an early effect around 250 ms and the typical N400 effect. Concreteness elicited effects in later epochs on the N400. Stimulus duration affected all epochs and its influence reflected changes in the timing of the ERP components. Overall, the results support cascaded interactive models of spoken word recognition.
Affiliation(s)
- Kurt Winsler
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, CNRS and Aix-Marseille Université, Marseille, France
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
|
31
|
Meade G, Lee B, Midgley KJ, Holcomb PJ, Emmorey K. Phonological and semantic priming in American Sign Language: N300 and N400 effects. Lang Cogn Neurosci 2018; 33:1092-1106. [PMID: 30662923 PMCID: PMC6335044 DOI: 10.1080/23273798.2018.1446543]
Abstract
This study investigated the electrophysiological signatures of phonological and semantic priming in American Sign Language (ASL). Deaf signers made semantic relatedness judgments to pairs of ASL signs separated by a 1300 ms prime-target SOA. Phonologically related sign pairs shared two of three phonological parameters (handshape, location, and movement). Target signs preceded by phonologically related and semantically related prime signs elicited smaller negativities within the N300 and N400 windows than those preceded by unrelated primes. N300 effects, typically reported in studies of picture processing, are interpreted to reflect the mapping from the visual features of the signs to more abstract linguistic representations. N400 effects, consistent with rhyme priming effects in the spoken language literature, are taken to index lexico-semantic processes that appear to be largely modality independent. Together, these results highlight both the unique visual-manual nature of sign languages and the linguistic processing characteristics they share with spoken languages.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego, San Diego, CA, USA
- Brittany Lee
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and University of California, San Diego, San Diego, CA, USA
- Phillip J. Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
|
32
|
Abstract
This study examined how acquisition of novel words from an unknown language (L2) is influenced by their orthographic similarity with existing native language (L1) words in beginning adult learners. Participants were tested in a two-alternative forced-choice recognition task and a typing production task as they learned to associate 80 L2 (pseudo)words with pictures depicting their meanings. There was no effect of L1 orthographic neighborhood density on accuracy in the two-alternative forced-choice task, but typing accuracy was higher for L2 words with many L1 neighbors in the earliest stages of learning. ERPs recorded during a language decision task before and after learning also showed differences as a function of L1 neighborhood density. Across sessions, L2 words with many L1 neighbors elicited slower responses and larger N400s than words with fewer L1 neighbors, suggesting that L1 neighbors continued to influence processing of the L2 words after learning (though to a lesser extent). Finally, ERPs recorded during a typing task after learning also revealed an effect of L1 neighborhood that began about 700 msec after picture onset, suggesting that the cross-language neighborhood effects cannot solely be attributed to bottom-up activation of L1 neighbors. Together, these results demonstrate that strategic associations between novel L2 words and existing L1 neighbors scaffold learning and result in interactions among cross-language neighbors, suggestive of an integrated L1-L2 lexicon.
Affiliation(s)
- Gabriela Meade
- Radboud University Nijmegen
- San Diego State University
- University of California, San Diego
|
33
|
Emmorey K, Midgley KJ, Kohen CB, Sehyr ZS, Holcomb PJ. The N170 ERP component differs in laterality, distribution, and association with continuous reading measures for deaf and hearing readers. Neuropsychologia 2017; 106:298-309. [PMID: 28986268 PMCID: PMC5694363 DOI: 10.1016/j.neuropsychologia.2017.10.001]
Abstract
The temporo-occipitally distributed N170 ERP component is hypothesized to reflect print-tuning in skilled readers. This study investigated whether skilled deaf and hearing readers (matched on reading ability, but not phonological awareness) exhibit similar N170 patterns, given their distinct experiences learning to read. Thirty-two deaf and 32 hearing adults viewed words and symbol strings in a familiarity judgment task. In the N170 epoch (120-240 ms) hearing readers produced greater negativity for words than symbols at left hemisphere (LH) temporo-parietal and occipital sites, while deaf readers only showed this asymmetry at occipital sites. Linear mixed effects regression was used to examine the influence of continuous measures of reading, spelling, and phonological skills on the N170 (120-240 ms). For deaf readers, better reading ability was associated with a larger N170 over the right hemisphere (RH), but for hearing readers better reading ability was associated with a smaller RH N170. Better spelling ability was related to larger occipital N170s in deaf readers, but this relationship was weak in hearing readers. Better phonological awareness was associated with smaller N170s in the LH for hearing readers, but this association was weaker and in the RH for deaf readers. The results support the phonological mapping hypothesis for a left-lateralized temporo-parietal N170 in hearing readers and indicate that skilled reading is characterized by distinct patterns of neural tuning to print in deaf and hearing adults.
Affiliation(s)
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
- Casey B Kohen
- Department of Psychology, San Diego State University, United States
- Zed Sevcikova Sehyr
- School of Speech, Language, and Hearing Sciences, San Diego State University, United States
|
34
|
Sheppard SM, Midgley KJ, Love T, Shapiro LP, Holcomb PJ. Electrophysiological evidence for the interaction of prosody and thematic fit during sentence comprehension. Lang Cogn Neurosci 2017; 33:547-562. [PMID: 29904641 PMCID: PMC5997268 DOI: 10.1080/23273798.2017.1390143]
Abstract
This study investigated the interaction of prosody and thematic fit/plausibility information during the processing of sentences containing temporary early closure (correct) or late closure (incorrect) syntactic ambiguities using event-related potentials (ERPs). Early closure sentences with congruent and incongruent prosody were presented in which the temporarily ambiguous NP was either a plausible or an implausible continuation of the subordinate verb (e.g. "While the band played the song/beer pleased all the customers."). N400 and P600 components were examined at critical points in each condition. The closure positive shift (CPS) was examined in sentences with congruent prosody. Prosodic and thematic fit cues interacted immediately (N400-P600) at the implausible NP (beer) when it was paired with incongruent prosody. Incongruent prosody paired with a plausible NP (song) resulted in garden-path effects (N400-P600) at the critical verb (pleased). These findings provide strong evidence that prosodic and thematic fit/plausibility cues interact to aid the parser in syntactic structure building.
Affiliation(s)
- Shannon M Sheppard
- Department of Neurology, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Tracy Love
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Center for Research in Language, University of California, San Diego, La Jolla, CA, USA
- Lewis P Shapiro
- School of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
- Center for Research in Language, University of California, San Diego, La Jolla, CA, USA
- Phillip J Holcomb
- Department of Psychology, San Diego State University, San Diego, CA, USA
|
35
|
Sheppard SM, Love T, Midgley KJ, Holcomb PJ, Shapiro LP. Electrophysiology of prosodic and lexical-semantic processing during sentence comprehension in aphasia. Neuropsychologia 2017; 107:9-24. [PMID: 29061490 DOI: 10.1016/j.neuropsychologia.2017.10.023]
Abstract
Event-related potentials (ERPs) were used to examine how individuals with aphasia and a group of age-matched controls use prosody and thematic fit information in sentences containing temporary syntactic ambiguities. Two groups of individuals with aphasia were investigated: those demonstrating relatively good sentence comprehension whose primary language difficulty is anomia (individuals with anomic aphasia, IWAA), and those who demonstrate impaired sentence comprehension whose primary diagnosis is Broca's aphasia (individuals with Broca's aphasia, IWBA). The stimuli had early closure syntactic structure and contained a temporary early closure (correct)/late closure (incorrect) syntactic ambiguity. The prosody was manipulated to be either congruent or incongruent, and the temporarily ambiguous NP was also manipulated to be either a plausible or an implausible continuation of the subordinate verb (e.g., "While the band played the song/the beer pleased all the customers."). It was hypothesized that an implausible NP in sentences with incongruent prosody may provide the parser with a plausibility cue that could be used to predict syntactic structure. The results revealed that incongruent prosody paired with a plausibility cue resulted in an N400-P600 complex at the implausible NP (the beer) in both the controls and the IWAAs, yet incongruent prosody without a plausibility cue resulted in an N400-P600 at the critical verb (pleased) only in healthy controls. IWBAs did not show evidence of N400 or P600 effects at the ambiguous NP or critical verb, although they did show evidence of a delayed N400 effect at the sentence-final word in sentences with incongruent prosody. These results suggest that IWAAs have difficulty integrating prosodic cues with underlying syntactic structure when lexical-semantic information is not available to aid their parse. IWBAs have difficulty integrating both prosodic and lexical-semantic cues with syntactic structure, likely due to a processing delay.
Affiliation(s)
- Shannon M Sheppard
- Johns Hopkins University School of Medicine, United States; San Diego State University, United States
- Tracy Love
- San Diego State University, United States; University of California, San Diego, United States
|
36
|
Winsler K, Holcomb PJ, Midgley KJ, Grainger J. Evidence for Separate Contributions of High and Low Spatial Frequencies during Visual Word Recognition. Front Hum Neurosci 2017; 11:324. [PMID: 28690505 PMCID: PMC5480267 DOI: 10.3389/fnhum.2017.00324]
Abstract
Previous studies have shown that different spatial frequency information processing streams interact during the recognition of visual stimuli. However, the contributions of high and low spatial frequency (HSF and LSF) information to visual word recognition remain a matter of debate. This study examined the role of different spatial frequencies in visual word recognition using event-related potential (ERP) masked priming. EEG was recorded from 32 scalp sites in 30 English-speaking adults during a go/no-go semantic categorization task. Stimuli were white characters on a neutral gray background. Targets were uppercase five-letter words preceded by a forward mask (#######) and a 50 ms lowercase prime. Primes were either the same word (repeated) or a different word (un-repeated) than the subsequent target and contained only high, only low, or full spatial frequency information. Additionally, within each condition, half of the prime-target pairs were of high lexical frequency and half were of low lexical frequency. In the full spatial frequency condition, typical ERP masked priming effects were found, with an attenuated N250 (sub-lexical) and N400 (lexical-semantic) for repeated compared to un-repeated primes. For HSF primes there was a weaker N250 effect which interacted with lexical frequency, a significant reversal of the effect around 300 ms, and an N400-like effect for only high lexical frequency word pairs. LSF primes did not produce any of the classic ERP repetition priming effects; however, they did elicit a distinct early effect around 200 ms in the opposite direction of typical repetition effects. HSF information accounted for many of the masked repetition priming ERP effects, suggesting that HSFs are more crucial for word recognition. However, LSFs did produce their own pattern of priming effects, indicating that larger-scale information may still play a role in word recognition.
Affiliation(s)
- Kurt Winsler
- NeuroCognition Laboratory, Department of Psychology, San Diego State University, San Diego, CA, United States
- Phillip J Holcomb
- NeuroCognition Laboratory, Department of Psychology, San Diego State University, San Diego, CA, United States
- Katherine J Midgley
- NeuroCognition Laboratory, Department of Psychology, San Diego State University, San Diego, CA, United States
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, CNRS and Aix-Marseille Université, Marseille, France
|
37
|
Meade G, Midgley KJ, Sevcikova Sehyr Z, Holcomb PJ, Emmorey K. Implicit co-activation of American Sign Language in deaf readers: An ERP study. Brain Lang 2017; 170:50-61. [PMID: 28407510 PMCID: PMC5538318 DOI: 10.1016/j.bandl.2017.03.004]
Abstract
In an implicit phonological priming paradigm, deaf bimodal bilinguals made semantic relatedness decisions for pairs of English words. Half of the semantically unrelated pairs had phonologically related translations in American Sign Language (ASL). As in previous studies with unimodal bilinguals, targets in pairs with phonologically related translations elicited smaller negativities than targets in pairs with phonologically unrelated translations within the N400 window. This suggests that the same lexicosemantic mechanism underlies implicit co-activation of a non-target language, irrespective of language modality. In contrast to unimodal bilingual studies that find no behavioral effects, we observed phonological interference, indicating that bimodal bilinguals may not suppress the non-target language as robustly. Further, there was a subset of bilinguals who were aware of the ASL manipulation (determined by debrief), and they exhibited an effect of ASL phonology in a later time window (700-900 ms). Overall, these results indicate modality-independent language co-activation that persists longer for bimodal bilinguals.
Affiliation(s)
- Gabriela Meade
- Joint Doctoral Program in Language and Communicative Disorders, San Diego State University & University of California, San Diego, USA
- Zed Sevcikova Sehyr
- School of Speech, Language, and Hearing Sciences, San Diego State University, USA
- Karen Emmorey
- School of Speech, Language, and Hearing Sciences, San Diego State University, USA
38
|
Carrasco-Ortiz H, Midgley KJ, Grainger J, Holcomb PJ. Interactions in the neighborhood: Effects of orthographic and phonological neighbors on N400 amplitude. J Neurolinguistics 2017; 41:1-10. [PMID: 33911344 PMCID: PMC8078004 DOI: 10.1016/j.jneuroling.2016.06.007]
Abstract
The present study investigated effects of phonological and orthographic neighborhood density on event-related potentials, with an aim to better specify the factors that determine N400 amplitude in single word reading paradigms. We orthogonally manipulated the number of orthographic and phonological neighbors of words using the Levenshtein Distance metric (OLD20 and PLD20, respectively). The results showed opposite effects of phonological neighborhood density (PND) as a function of orthographic neighborhood density (OND). Larger N400 amplitudes were elicited by words with high PND compared with low PND when OND was high, and smaller N400 amplitudes were observed with high PND compared with low PND words when OND was low. We interpret these findings using the notion of cross-code consistency, according to which the compatibility of orthographic and phonological representations activated by a given word influences the process of recognizing that word. Words with similar numbers of orthographic and phonological neighbors have more consistent spellings and pronunciations across the neighborhood, and generate larger N400 amplitudes.
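For readers unfamiliar with the OLD20/PLD20 metrics referenced above, here is a small illustrative sketch (not the authors' code): OLD20 is the mean Levenshtein distance from a word to its 20 closest orthographic neighbors in a lexicon, and PLD20 applies the same computation to phonological transcriptions. The tiny lexicon in the usage example is made up.

```python
# Illustrative OLD20-style computation: mean edit distance from a word
# to its n closest neighbors in a lexicon (n = 20 for OLD20/PLD20).

def levenshtein(a, b):
    """Standard edit distance (insertions, deletions, substitutions)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def old_n(word, lexicon, n=20):
    """Mean distance to the n closest lexicon entries (OLD20 when n=20)."""
    dists = sorted(levenshtein(word, w) for w in lexicon if w != word)
    return sum(dists[:n]) / min(n, len(dists))

# Made-up mini-lexicon; real OLD20 values use a full lexical database.
toy_lexicon = ["node", "nose", "vote", "tone"]
density = old_n("note", toy_lexicon, n=2)
```

Lower OLD20/PLD20 values correspond to denser neighborhoods, which is how the density manipulation in the study is operationalized.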
Affiliation(s)
- Jonathan Grainger
- Aix-Marseille University, France
- Centre National de la Recherche Scientifique, France
|
39
|
Abstract
Research has shown neural changes following second language (L2) acquisition after weeks or months of instruction. But are such changes detectable even earlier than previously shown? The present study examines the electrophysiological changes underlying the earliest stages of second language vocabulary acquisition by recording event-related potentials (ERPs) within the first week of learning. Adult native English speakers with no previous Spanish experience completed less than four hours of Spanish vocabulary training, with pre- and post-training ERPs recorded during a backward translation task. Results indicate that beginning L2 learners show rapid neural changes following learning, manifested in changes to the N400, an ERP component sensitive to lexicosemantic processing and degree of L2 proficiency. Specifically, learners in early stages of L2 acquisition show growth in N400 amplitude to L2 words following learning as well as a backward translation N400 priming effect that was absent pre-training. These results were observed within days of minimal L2 training, suggesting that the neural changes captured during adult second language acquisition are more rapid than previously shown. Such findings are consistent with models of the early stages of bilingualism in adult learners (e.g., Kroll and Stewart's Revised Hierarchical Model) and reinforce the use of ERP measures to assess L2 learning.
Affiliation(s)
- He Pu
- Tufts University, Medford, MA 02155, USA
- Phillip J. Holcomb
- Tufts University, Medford, MA 02155, USA
- San Diego State University, San Diego, CA 92182, USA
|
40
|
Grainger J, Midgley KJ, Holcomb PJ. Trans-saccadic repetition priming: ERPs reveal on-line integration of information across words. Neuropsychologia 2016; 80:201-211. [PMID: 26656872 PMCID: PMC4698207 DOI: 10.1016/j.neuropsychologia.2015.11.025]
Abstract
We used a trans-saccadic priming paradigm combined with ERP recordings to track the time-course of integration of information across a prime word briefly presented at fixation and a subsequent target word presented 4 degrees to the right of fixation. Trans-saccadic repetition priming effects (Experiments 1 and 2) were compared with priming effects obtained with centrally located targets (Experiment 3). In Experiment 2, target stimuli were preceded by a 100 ms forward mask at the target location, hence allowing an attention shift to the target location prior to target onset. Compared with centrally located targets, repetition priming effects were found to onset later in Experiment 2 and even later in Experiment 1, and the growth of priming effects was slower in both Experiments 1 and 2 compared with Experiment 3. The results demonstrate integration of information across spatially distinct primes and targets, with the time-course of trans-saccadic priming being determined by the speed with which attention can be allocated to peripheral targets plus the quality of information available in peripheral vision prior to fixation of target stimuli.
Affiliation(s)
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, Aix-Marseille University & CNRS, France
- Brain and Language Research Institute, Aix-Marseille University, Marseille, France
- Phillip J Holcomb
- San Diego State University, San Diego, CA, USA
- Tufts University, Medford, MA, USA
|
41
|
Dufau S, Grainger J, Midgley KJ, Holcomb PJ. A Thousand Words Are Worth a Picture: Snapshots of Printed-Word Processing in an Event-Related Potential Megastudy. Psychol Sci 2015; 26:1887-97. [PMID: 26525074 DOI: 10.1177/0956797615603934]
Abstract
In the experiment reported here, approximately 1,000 words were presented to 75 participants in a go/no-go lexical decision task while event-related potentials (ERPs) were recorded. Partial correlations were computed for variables selected to reflect orthographic, lexical, and semantic processing, as well as for a novel measure of the visual complexity of written words. Correlations were based on the item-level ERPs at each electrode site and time slice while a false-discovery-rate correction was applied. Early effects of visual complexity were seen around 50 ms after word onset, followed by the earliest sustained orthographic effects around 100 to 150 ms, with the bulk of orthographic and lexical influences arising after 200 ms. Effects of a semantic variable (concreteness) emerged later, at around 300 ms. The overall time course of these ERP effects is in line with hierarchical, cascaded, interactive accounts of word recognition, in which fast feed-forward influences are consolidated by top-down feedback via recurrent processing loops.
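The analysis described above pairs item-level correlations with a false-discovery-rate correction across electrode sites and time slices. As a hedged illustration of that correction step only (not the authors' exact pipeline), here is a minimal Benjamini-Hochberg procedure:

```python
# Minimal Benjamini-Hochberg false-discovery-rate correction: given a
# set of p-values (e.g., one per electrode-by-time cell), mark which
# survive FDR control at level alpha. Illustrative sketch only.

def fdr_bh(pvals, alpha=0.05):
    """Return a list of booleans marking p-values significant under BH."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha;
    # every p-value of rank <= k is then declared significant.
    k_max = 0
    for rank, idx in enumerate(order, 1):
        if pvals[idx] <= rank / m * alpha:
            k_max = rank
    passed = [False] * m
    for rank, idx in enumerate(order, 1):
        if rank <= k_max:
            passed[idx] = True
    return passed
```

Applied cell-by-cell over an electrode-by-time grid, this keeps the expected proportion of false positives among the declared effects at or below alpha, which is why megastudy analyses favor it over a much harsher Bonferroni correction.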
Affiliation(s)
- Stéphane Dufau
- Laboratoire de Psychologie Cognitive, Aix-Marseille University & Centre National de la Recherche Scientifique (CNRS), Marseille, France
- Brain and Language Research Institute, Aix-Marseille University
- Jonathan Grainger
- Laboratoire de Psychologie Cognitive, Aix-Marseille University & Centre National de la Recherche Scientifique (CNRS), Marseille, France
- Brain and Language Research Institute, Aix-Marseille University
- Phillip J Holcomb
- Department of Psychology, San Diego State University
- Department of Psychology, Tufts University
|
42
|
Yum YN, Midgley KJ, Holcomb PJ, Grainger J. An ERP study on initial second language vocabulary learning. Psychophysiology 2014; 51:364-73. [PMID: 24660886 DOI: 10.1111/psyp.12183]
Abstract
This study examined the very initial phases of orthographic and semantic acquisition in monolingual native English speakers learning Chinese words under controlled laboratory conditions. Participants engaged in 10 sessions of vocabulary learning, four of which were used to obtain ERPs. Performance in behavioral tests improved over sessions, and these data were used to define fast and slow learners. Most importantly, ERPs in the two groups of learners revealed qualitatively distinct learning patterns. Only fast learners showed a left-lateralized increase in N170 amplitude with training. Furthermore, only fast learners showed an increased N400 amplitude with training, with a distinct anterior distribution. Slow learners, on the other hand, showed a posterior positive effect, with increasingly positive-going waveforms at occipital sites as training progressed. Possible mechanisms underlying these qualitative differences are discussed.
Affiliation(s)
- Yen Na Yum
- Psychology Department, Tufts University, Medford, Massachusetts, USA
|
43
|
Aparicio X, Midgley KJ, Holcomb PJ, Pu H, Lavaur JM, Grainger J. Language Effects in Trilinguals: An ERP Study. Front Psychol 2012; 3:402. [PMID: 23133428 PMCID: PMC3490278 DOI: 10.3389/fpsyg.2012.00402]
Abstract
Event-related potentials were recorded during the visual presentation of words in the three languages of French-English-Spanish trilinguals. Participants monitored a mixed list of unrelated non-cognate words in the three languages while performing a semantic categorization task. Words in L1 generated earlier N400 peaks than both L2 and L3 words, which peaked together. On the other hand, L2 and L3 words did differ significantly in terms of N400 amplitude, with L3 words generating greater mean amplitudes than L2 words. We interpret the effects of peak N400 latency as reflecting the special status of the L1 relative to later acquired languages, rather than proficiency in that language per se. The mean amplitude difference between L2 and L3, in turn, is thought to reflect different levels of fluency in these two languages.
|
44
|
Massol S, Grainger J, Midgley KJ, Holcomb PJ. Masked repetition priming of letter-in-string identification: an ERP investigation. Brain Res 2012; 1472:74-88. [PMID: 22824333 DOI: 10.1016/j.brainres.2012.07.018]
Abstract
In a post-cued letter identification task, participants were presented with 7-letter nonword target stimuli that were formed of a random string of consonants (DCMFPLR) or a pronounceable sequence of consonants and vowels (DAMOPUR). Targets were preceded by briefly presented pattern-masked primes that were either the same sequence of letters as the target, a string of seven different letters, or a string sharing either the first or last five letters of the target. There was some evidence for repetition priming effects that were independent of target type in an early component, the N/P150, thought to reflect the mapping of visual features onto letter representations and to be insensitive to orthographic structure. Following this, pronounceable nonwords showed significantly greater repetition priming effects than consonant strings, in line with the behavioral results. Initial versus final overlap only started to influence target processing at around 200-250 ms post-target onset, at about the same time as the effects of target type emerged. The results are in line with a model in which the initial parallel mapping of visual features onto a location-specific orthographic code is followed by the subsequent activation of location-invariant orthographic and phonological codes.
|
45
|
Carrasco-Ortiz H, Midgley KJ, Frenck-Mestre C. Are phonological representations in bilinguals language specific? An ERP study on interlingual homophones. Psychophysiology 2012; 49:531-43. [PMID: 22220969 DOI: 10.1111/j.1469-8986.2011.01333.x]
Abstract
Event-related potentials (ERPs) served to investigate whether phonological representations from both the first (L1) and second (L2) language of bilinguals are activated during silent reading of L2 words. French-English late bilinguals and control monolingual English speakers read interlingual homophones (e.g., "knee" in English, which has substantial phonological overlap with the French word "nid," meaning "nest") and matched control words. Results showed a reduction in N400 amplitude in response to interlingual homophones in comparison to control words for bilinguals, but not for English monolinguals. The reduced N400 response to homophones in bilinguals suggests facilitation of word recognition. These results suggest parallel activation of both L1 and L2 phonological representations when reading silently in the L2. These findings point to a language nonspecific model for bilinguals at the phonological level of representation.
Affiliation(s)
- Haydee Carrasco-Ortiz
- Aix-Marseille University, Laboratoire de Parole et Langage, 5 avenue Pasteur, 13604 Aix-en-Provence, France.
46
Geyer A, Holcomb PJ, Midgley KJ, Grainger J. Processing words in two languages: An event-related brain potential study of proficient bilinguals. J Neurolinguistics 2011; 24:338-351. [PMID: 21461123 PMCID: PMC3066444 DOI: 10.1016/j.jneuroling.2010.10.005] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/30/2023]
Abstract
In a previous study of native English-speaking university learners of a second language (Spanish), we observed an asymmetric pattern of ERP translation priming effects in L1 and L2 (Alvarez et al., 2003, Brain & Language, 87, 290-304), with larger and earlier priming on the N400 component in the L2 to L1 direction compared with the L1 to L2 direction. In the current study, 20 native Russian speakers who were also highly proficient in English participated in a mixed-language lexical decision task in which critical words were presented in Russian (L1) and English (L2), and repetitions of these words (within and between languages) were presented on subsequent trials. ERPs were recorded to all items, allowing for comparisons of repetition effects within and between (translation) languages. The results revealed a symmetrical pattern of within-language repetition and between-language translation ERP priming effects, which, in conjunction with Alvarez et al. (2003), supports the hypothesis that L2 proficiency level rather than age or order of language acquisition is responsible for the observed patterns of translation priming. The ramifications of these results for models of bilingual word processing are discussed.
47
Hoshino N, Midgley KJ, Holcomb PJ, Grainger J. An ERP investigation of masked cross-script translation priming. Brain Res 2010; 1344:159-72. [PMID: 20478274 PMCID: PMC2901627 DOI: 10.1016/j.brainres.2010.05.005] [Citation(s) in RCA: 41] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2010] [Revised: 04/22/2010] [Accepted: 05/03/2010] [Indexed: 11/20/2022]
Abstract
The time course of cross-script translation priming and repetition priming was examined in two different scripts using a combination of the masked priming paradigm with the recording of event-related potentials (ERPs). Japanese-English bilinguals performed a semantic categorization task in their second language (L2) English and in their first language (L1) Japanese. Targets were preceded by a visually presented related (translation equivalent/repeated) or unrelated prime. The results showed that the amplitudes of the N250 and N400 ERP components were significantly modulated for L2-L2 repetition priming, L1-L2 translation priming, and L1-L1 repetition priming, but not for L2-L1 translation priming. There was also evidence for priming effects in an earlier 100-200 ms time window for L1-L1 repetition priming and L1-L2 translation priming. We argue that a change in script across primes and targets provides optimal conditions for prime word processing, hence generating very fast-acting translation priming effects when primes are in L1.
Affiliation(s)
- Noriko Hoshino
- ESRC Centre for Research on Bilingualism in Theory and Practice, Bangor University, Bangor, Gwynedd LL57 2DG, UK.
48
Abstract
Using event-related potentials, we investigated how the brain extracts information from another's face and translates it into relevant action in real time. In Study 1, participants made between-hand sex categorizations of sex-typical and sex-atypical faces. Sex-atypical faces evoked negativity between 250 and 550 ms (N300/N400 effects), reflecting the integration of accumulating sex-category knowledge into a coherent sex-category interpretation. Additionally, the lateralized readiness potential revealed that the motor cortex began preparing for a correct hand response while social category knowledge was still gradually evolving in parallel. In Study 2, participants made between-hand eye-color categorizations as part of go/no-go trials that were contingent on a target's sex. On no-go trials, although the hand did not actually move, information about eye color partially prepared the motor cortex to move the hand before perception of sex had finalized. Together, these findings demonstrate the dynamic continuity between person perception and action, such that ongoing results from face processing are immediately and continuously cascaded into the motor system over time. The preparation of action begins based on tentative perceptions of another's face before perceivers have finished interpreting what they just saw.
49
Midgley KJ, Holcomb PJ, Grainger J. Effects of cognate status on word comprehension in second language learners: an ERP investigation. J Cogn Neurosci 2010; 23:1634-47. [PMID: 20476892 DOI: 10.1162/jocn.2010.21463] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
ERPs were used to explore the different patterns of processing of cognate and noncognate words in the first (L1) and second (L2) language of a population of second language learners. L1 English students of French were presented with blocked lists of L1 and L2 words, and ERPs to cognates and noncognates were compared within each language block. For both languages, cognates had smaller amplitudes in the N400 component when compared with noncognates. L1 items that were cognates showed early differences in amplitude in the N400 epoch when compared with noncognates. L2 items showed later differences between cognates and noncognates than L1 items. The results are discussed in terms of how cognate status affects word recognition in second language learners.
50
Midgley KJ, Holcomb PJ, Grainger J. Masked repetition and translation priming in second language learners: a window on the time-course of form and meaning activation using ERPs. Psychophysiology 2009; 46:551-65. [PMID: 19298629 PMCID: PMC2692905 DOI: 10.1111/j.1469-8986.2009.00784.x] [Citation(s) in RCA: 69] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
Event-related potentials (ERPs) and masked translation priming served to examine the time-course of form and meaning activation during word recognition in second language learners. Targets were repetitions of, translations of, or unrelated to the immediately preceding prime. In Experiment 1 all targets were in the participants' L2. In Experiment 2 all targets were in the participants' L1. In Experiment 1, both within-language repetition and L1-L2 translation priming produced effects on the N250 component and the N400 component. In Experiment 2, only within-language repetition produced N250 effects, while both types of priming produced N400 effects. These results suggest rapid involvement of semantic representations during on-going form-level processing of printed words, and an absence of facilitatory connections between the form representations of non-cognate translation equivalents in L2 learners. The implications for bilingual theories of word processing are discussed.
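The masked translation priming design described above (repetition, translation, and unrelated primes preceding each target) can be sketched as a trial-list builder. The word pairs, stimulus durations, and the fact that each target appears in all three conditions are illustrative assumptions; in the actual experiments each target would appear once per participant, rotated across counterbalanced lists:

```python
# Sketch: build a masked-priming trial list with three prime conditions.
import random
from dataclasses import dataclass

@dataclass
class Trial:
    prime: str
    target: str
    condition: str
    mask_ms: int = 500   # forward pattern mask duration (assumed)
    prime_ms: int = 50   # briefly presented, masked prime (assumed)

# Toy L1 (French) prime / L2 (English) target pairs — not the actual stimuli.
PAIRS = [("chien", "DOG"), ("livre", "BOOK"), ("pomme", "APPLE")]

def build_trials(pairs, seed=0):
    rng = random.Random(seed)
    trials = []
    for i, (l1, l2) in enumerate(pairs):
        trials.append(Trial(prime=l2.lower(), target=l2, condition="repetition"))
        trials.append(Trial(prime=l1, target=l2, condition="translation"))
        # Unrelated prime: the L1 translation of a *different* target.
        other = pairs[(i + 1) % len(pairs)][0]
        trials.append(Trial(prime=other, target=l2, condition="unrelated"))
    rng.shuffle(trials)
    return trials

trials = build_trials(PAIRS)
```

Primes are conventionally shown in lowercase and targets in uppercase so that repetition pairs never share a purely visual (case-specific) match, which the sketch mirrors.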