1. Sweet SJ, Van Hedger SC, Batterink LJ. Of words and whistles: Statistical learning operates similarly for identical sounds perceived as speech and non-speech. Cognition 2024;242:105649. PMID: 37871411. DOI: 10.1016/j.cognition.2023.105649.
Abstract
Statistical learning is an ability that allows individuals to effortlessly extract patterns from the environment, such as sound patterns in speech. Some prior evidence suggests that statistical learning operates more robustly for speech than for non-speech stimuli, supporting the idea that humans are predisposed to learn language. However, any apparent statistical learning advantage for speech could be driven by signal acoustics rather than by the subjective perception of the sounds as speech per se. To resolve this issue, the current study assessed whether there is a statistical learning advantage for ambiguous sounds that are subjectively perceived as speech-like compared to the same sounds perceived as non-speech, thereby controlling for acoustic features. We first induced participants to perceive sine-wave speech (SWS), a degraded form of speech not immediately perceptible as speech, as either speech or non-speech. After this induction phase, participants were exposed to a continuous stream of repeating trisyllabic nonsense words composed of SWS syllables, and then completed an explicit familiarity rating task and an implicit target detection task to assess learning. Critically, participants showed robust and equivalent performance on both measures, regardless of their subjective speech perception. In contrast, participants who perceived the SWS syllables as more speech-like showed better detection of individual syllables embedded in the speech streams. These results suggest that speech perception facilitates the processing of individual sounds, but not the ability to extract patterns across sounds. Our findings indicate that statistical learning is not influenced by the perceived linguistic relevance of sounds and that it may be conceptualized largely as an automatic, stimulus-driven mechanism.
Affiliation(s)
- Sierra J Sweet: Department of Psychology, Western University, London, ON, Canada.
- Stephen C Van Hedger: Department of Psychology, Western University, London, ON, Canada; Western Institute for Neuroscience, Western University, London, ON, Canada; Department of Psychology, Huron University College, London, ON, Canada.
- Laura J Batterink: Department of Psychology, Western University, London, ON, Canada; Western Institute for Neuroscience, Western University, London, ON, Canada.
2. Linguistic labels cue biological motion perception and misperception. Sci Rep 2021;11:17239. PMID: 34446746. PMCID: PMC8390742. DOI: 10.1038/s41598-021-96649-1.
Abstract
Linguistic labels exert a particularly strong top-down influence on perception. The potency of this influence has been ascribed to their ability to evoke category-diagnostic features of concepts. In doing this, they facilitate the formation of a perceptual template concordant with those features, effectively biasing perceptual activation towards the labelled category. In this study, we employ a cueing paradigm with moving, point-light stimuli across three experiments, in order to examine how the number of biological motion features (form and kinematics) encoded in lexical cues modulates the efficacy of lexical top-down influence on perception. We find that the magnitude of lexical influence on biological motion perception rises as a function of the number of biological motion-relevant features carried by both cue and target. When lexical cues encode multiple biological motion features, this influence is robust enough to mislead participants into reporting erroneous percepts, even when a masking level yielding high performance is used.
3. Hebert KP, Goldinger SD, Walenchok SC. Eye movements and the label feedback effect: Speaking modulates visual search via template integrity. Cognition 2021;210:104587. PMID: 33508577. DOI: 10.1016/j.cognition.2021.104587.
Abstract
The label-feedback hypothesis (Lupyan, 2012) proposes that language modulates low- and high-level visual processing, such as priming visual object perception. Lupyan and Swingley (2012) found that repeating target names facilitates visual search, resulting in shorter response times (RTs) and higher accuracy. In the present investigation, we conceptually replicated and extended their study, using additional control conditions and recording eye movements during search. Our goal was to evaluate whether self-directed speech influences target locating (i.e., attentional guidance) or object perception (i.e., distractor rejection and target appreciation). In three experiments, during object search, people spoke target names, nonwords, irrelevant (absent) object names, or irrelevant (present) object names (all within-participants). Experiments 1 and 2 examined search RTs and accuracy: Speaking target names improved performance, without differences among the remaining conditions. Experiment 3 incorporated eye-tracking: Gaze fixation patterns suggested that language does not affect attentional guidance, but instead affects both distractor rejection and target appreciation. When search trials were conditionalized according to distractor fixations, language effects became more orderly: Search was fastest while people spoke target names, followed in linear order by the nonword, distractor-absent, and distractor-present conditions. We suggest that language affects template maintenance during search, allowing fluent differentiation of targets and distractors. Materials, data, and analyses can be retrieved here: https://osf.io/z9ex2/.
4. Ding J, Wang Y, Wang C, d'Oleire Uquillas F, He Q, Cheng L, Zou Z. Negative Impact of Sadness on Response Inhibition in Females: An Explicit Emotional Stop Signal Task fMRI Study. Front Behav Neurosci 2020;14:119. PMID: 32903296. PMCID: PMC7396530. DOI: 10.3389/fnbeh.2020.00119.
Abstract
Response inhibition is a critical cognitive ability underlying executive control over reactions to external cues or inner requirements. Previous studies suggest that high-arousal negative emotions (e.g., anger or fear) can impair response inhibition in implicit emotional stop signal tasks (eSSTs). However, studies exploring how low-arousal negative emotions (e.g., sadness) influence response inhibition remain sparse. In the current study, 20 female college students performed an explicit eSST to explore the influence of sadness on response inhibition and its neural mechanism. Participants were instructed to press a button in response to sad or neutral facial stimuli while inhibiting their response during the presentation of a stop signal. Results showed that, compared with neutral stimuli, sad stimuli were associated with increased stop signal reaction time (SSRT), i.e., worse response inhibition. Compared with the neutral condition, higher activation during the sad condition was found in the right superior frontal gyrus (SFG), right insula, right middle cingulate cortex (MCC), bilateral superior temporal gyrus (STG), left lingual gyrus, and right motor cortex. These findings indicate that sadness, like other negative emotions, may impair response inhibition in an explicit task, and they highlight the explicit eSST as a new paradigm for investigating the subtle interaction between negative emotion processing and cognitive control.
Affiliation(s)
- Jianrui Ding: Faculty of Psychology, Southwest University, Chongqing, China
- Yongming Wang: Sino-Danish College, University of Chinese Academy of Sciences, Beijing, China; Sino-Danish Center for Education and Research, Beijing, China
- Chuan Wang: Faculty of Psychology, Southwest University, Chongqing, China
- Federico d'Oleire Uquillas: Department of Neurology, Massachusetts General Hospital, Harvard Medical School, Boston, MA, United States; Princeton Neuroscience Institute, Princeton University, Princeton, NJ, United States
- Qinghua He: Faculty of Psychology, Southwest University, Chongqing, China
- Li Cheng: Faculty of Education, Beijing Normal University, Beijing, China
- Zhiling Zou: Faculty of Psychology, Southwest University, Chongqing, China
5. Effects of meaningfulness on perception: Alpha-band oscillations carry perceptual expectations and influence early visual responses. Sci Rep 2018;8:6606. PMID: 29700428. PMCID: PMC5920106. DOI: 10.1038/s41598-018-25093-5.
Abstract
Perceptual experience results from a complex interplay of bottom-up input and prior knowledge about the world, yet the extent to which knowledge affects perception, the neural mechanisms underlying these effects, and the stages of processing at which these two sources of information converge, are still unclear. In several experiments we show that language, in the form of verbal labels, both aids recognition of ambiguous “Mooney” images and improves objective visual discrimination performance in a match/non-match task. We then used electroencephalography (EEG) to better understand the mechanisms of this effect. The improved discrimination of images previously labeled was accompanied by a larger occipital-parietal P1 evoked response to the meaningful versus meaningless target stimuli. Time-frequency analysis of the interval between the cue and the target stimulus revealed increases in the power of posterior alpha-band (8–14 Hz) oscillations when the meaning of the stimuli to be compared was trained. The magnitude of the pre-target alpha difference and the P1 amplitude difference were positively correlated across individuals. These results suggest that prior knowledge prepares the brain for upcoming perception via the modulation of alpha-band oscillations, and that this preparatory state influences early (~120 ms) stages of visual processing.
6. Yirdaw E, Monge A. Reconsidering what enclosure and exclosure mean in restoration ecology. Restor Ecol 2017. DOI: 10.1111/rec.12569.
Affiliation(s)
- Eshetu Yirdaw: Viikki Tropical Resources Institute (VITRI), Department of Forest Sciences, University of Helsinki, PO Box 27, FI-00014 Helsinki, Finland
- Adrian Monge: Viikki Tropical Resources Institute (VITRI), Department of Forest Sciences, University of Helsinki, PO Box 27, FI-00014 Helsinki, Finland
7.
Abstract
I applaud Firestone & Scholl (F&S) in calling for more rigor. But, although F&S are correct that some published work on top-down effects suffers from confounds, their sweeping claim that there are no top-down effects on perception is premised on incorrect assumptions. F&S's thesis is wrong. Perception is richly and interestingly influenced by cognition.
8. Thierry G. Neurolinguistic Relativity: How Language Flexes Human Perception and Cognition. Language Learning 2016;66:690-713. PMID: 27642191. PMCID: PMC5006882. DOI: 10.1111/lang.12186.
Abstract
The time has come, perhaps, to go beyond merely acknowledging that language is a core manifestation of the workings of the human mind and that it relates interactively to all aspects of thinking. The issue, thus, is not to decide whether language and human thought may be ineluctably linked (they just are), but rather to determine what the characteristics of this relationship may be and to understand how language influences, and may be influenced by, nonverbal information processing. In an attempt to demystify linguistic relativity, I review neurolinguistic studies from our research group showing a link between linguistic distinctions and perceptual or conceptual processing. On the basis of empirical evidence showing effects of terminology on perception, language-idiosyncratic relationships in semantic memory, grammatical skewing of event conceptualization, and unconscious modulation of executive functioning by verbal input, I advocate a neurofunctional approach through which we can systematically explore how languages shape human thought.
Affiliation(s)
- Guillaume Thierry: School of Psychology and Centre for Research on Bilingualism, Bangor University
9. Wiley RW, Wilson C, Rapp B. The effects of alphabet and expertise on letter perception. J Exp Psychol Hum Percept Perform 2016;42:1186-1203. PMID: 26913778. DOI: 10.1037/xhp0000213.
Abstract
Long-standing questions in human perception concern the nature of the visual features that underlie letter recognition and the extent to which the visual processing of letters is affected by differences in alphabets and levels of viewer expertise. We examined these issues in a novel approach using a same-different judgment task on pairs of letters from the Arabic alphabet with 2 participant groups: 1 with no prior exposure to Arabic and 1 with reading proficiency. Hierarchical clustering and linear mixed-effects modeling of reaction times and accuracy provide evidence that both the specific characteristics of the alphabet and observers' previous experience with it affect how letters are perceived and visually processed. The findings of this research further our understanding of the multiple factors that affect letter perception and support the view of a visual system that dynamically adjusts its weighting of visual features as expert readers come to more efficiently and effectively discriminate the letters of the specific alphabet they are viewing.
Affiliation(s)
- Robert W Wiley: Department of Cognitive Science, Johns Hopkins University
- Colin Wilson: Department of Cognitive Science, Johns Hopkins University
- Brenda Rapp: Department of Cognitive Science, Johns Hopkins University
10.
Abstract
The spectacularly varied responses to our target article raised big-picture questions about the nature of seeing and thinking, nitty-gritty experimental design details, and everything in between. We grapple with these issues, including the ready falsifiability of our view, neuroscientific theories that allow everything but demand nothing, cases where seeing and thinking conflict, mental imagery, the free press, an El Greco fallacy fallacy, hallucinogenic drugs, blue bananas, subatomic particles, Boeing 787s, and the racial identities of geometric shapes.
11. Exploring the automaticity of language-perception interactions: Effects of attention and awareness. Sci Rep 2015;5:17725. PMID: 26640162. PMCID: PMC4671057. DOI: 10.1038/srep17725.
Abstract
Previous studies have shown that language can modulate visual perception, by biasing and/or enhancing perceptual performance. However, it is still debated where in the brain visual and linguistic information are integrated, and whether the effects of language on perception are automatic and persist even in the absence of awareness of the linguistic material. Here, we aimed to explore the automaticity of language-perception interactions and the neural loci of these interactions in an fMRI study. Participants engaged in a visual motion discrimination task (upward or downward moving dots). Before each trial, a word prime was briefly presented that implied upward or downward motion (e.g., “rise”, “fall”). These word primes strongly influenced behavior: congruent motion words sped up reaction times and improved performance relative to incongruent motion words. Neural congruency effects were only observed in the left middle temporal gyrus, showing higher activity for congruent compared to incongruent conditions. This suggests that higher-level conceptual areas rather than sensory areas are the locus of language-perception interactions. When the motion words were masked from awareness, they still affected visual motion perception, suggesting that language-perception interactions may rely on automatic feed-forward integration of perceptual and semantic material in language areas of the brain.
12. Francken JC, Kok P, Hagoort P, de Lange FP. The behavioral and neural effects of language on motion perception. J Cogn Neurosci 2015;27:175-184. PMID: 25000524. DOI: 10.1162/jocn_a_00682.
Abstract
Perception does not function as an isolated module but is tightly linked with other cognitive functions. Several studies have demonstrated an influence of language on motion perception, but it remains debated at which level of processing this modulation takes place. Some studies argue for an interaction in perceptual areas, but it is also possible that the interaction is mediated by "language areas" that integrate linguistic and visual information. Here, we investigated whether language-perception interactions were specific to the language-dominant left hemisphere by comparing the effects of language on visual material presented in the right (RVF) and left visual fields (LVF). Furthermore, we determined the neural locus of the interaction using fMRI. Participants performed a visual motion detection task. On each trial, the visual motion stimulus was presented in either the LVF or the RVF, preceded by a centrally presented word (e.g., "rise"). The word could be congruent, incongruent, or neutral with regard to the direction of the visual motion stimulus that was presented subsequently. Participants were faster and more accurate when the direction implied by the motion word was congruent with the direction of the visual motion stimulus. Interestingly, the speed benefit was present only for motion stimuli that were presented in the RVF. We observed a neural counterpart of the behavioral facilitation effects in the left middle temporal gyrus, an area involved in semantic processing of verbal material. Together, our results suggest that semantic information about motion retrieved in language regions may automatically modulate perceptual decisions about motion.
13.
Abstract
What determines what we see? In contrast to the traditional “modular” understanding of perception, according to which visual processing is encapsulated from higher-level cognition, a tidal wave of recent research alleges that states such as beliefs, desires, emotions, motivations, intentions, and linguistic representations exert direct, top-down influences on what we see. There is a growing consensus that such effects are ubiquitous, and that the distinction between perception and cognition may itself be unsustainable. We argue otherwise: None of these hundreds of studies, either individually or collectively, provides compelling evidence for true top-down effects on perception, or “cognitive penetrability.” In particular, and despite their variety, we suggest that these studies all fall prey to only a handful of pitfalls. And whereas abstract theoretical challenges have failed to resolve this debate in the past, our presentation of these pitfalls is empirically anchored: In each case, we show not only how certain studies could be susceptible to the pitfall (in principle), but also how several alleged top-down effects actually are explained by the pitfall (in practice). Moreover, these pitfalls are perfectly general, with each applying to dozens of other top-down effects. We conclude by extracting the lessons provided by these pitfalls into a checklist that future work could use to convincingly demonstrate top-down effects on visual perception. The discovery of substantive top-down effects of cognition on perception would revolutionize our understanding of how the mind is organized; but without addressing these pitfalls, no such empirical report will license such exciting conclusions.
14. Boutonnet B, Lupyan G. Words Jump-Start Vision: A Label Advantage in Object Recognition. J Neurosci 2015;35:9329-9335. PMID: 26109657. PMCID: PMC6605198. DOI: 10.1523/jneurosci.5111-14.2015.
Abstract
People use language to shape each other's behavior in highly flexible ways. Effects of language are often assumed to be "high-level" in that, whereas language clearly influences reasoning, decision making, and memory, it does not influence low-level visual processes. Here, we test the prediction that words are able to provide top-down guidance at the very earliest stages of visual processing by acting as powerful categorical cues. We investigated whether visual processing of images of familiar animals and artifacts was enhanced after hearing their name (e.g., "dog") compared with hearing an equally familiar and unambiguous nonverbal sound (e.g., a dog bark) in 14 English monolingual speakers. Because the relationship between words and their referents is categorical, we expected words to deploy more effective categorical templates, allowing for more rapid visual recognition. By recording EEGs, we were able to determine whether this label advantage stemmed from changes to early visual processing or later semantic decision processes. The results showed that hearing a word affected early visual processes and that this modulation was specific to the named category. An analysis of ERPs showed that the P1 was larger when people were cued by labels compared with equally informative nonverbal cues, an enhancement occurring within 100 ms of image onset, which also predicted behavioral responses occurring almost 500 ms later. Hearing labels modulated the P1 such that it distinguished between target and nontarget images, showing that words rapidly guide early visual processing.
Affiliation(s)
- Bastien Boutonnet: Leiden Institute for Brain and Cognition, University of Leiden, NL-2300 RA Leiden, The Netherlands
- Gary Lupyan: Department of Psychology, University of Wisconsin-Madison, Madison, Wisconsin 53706
15. Francken JC, Meijs EL, Ridderinkhof OM, Hagoort P, de Lange FP, van Gaal S. Manipulating word awareness dissociates feed-forward from feedback models of language-perception interactions. Neurosci Conscious 2015;2015:niv003. PMID: 30135740. PMCID: PMC6089086. DOI: 10.1093/nc/niv003.
Abstract
Previous studies suggest that linguistic material can modulate visual perception, but it is unclear at which level of processing these interactions occur. Here we aim to dissociate between two competing models of language-perception interactions: a feed-forward and a feedback model. We capitalized on the fact that the models make different predictions on the role of feedback. We presented unmasked (aware) or masked (unaware) words implying motion (e.g. "rise," "fall"), directly preceding an upward or downward visual motion stimulus. Crucially, masking leaves intact feed-forward information processing from low- to high-level regions, whereas it abolishes subsequent feedback. Under this condition, participants remained faster and more accurate when the direction implied by the motion word was congruent with the direction of the visual motion stimulus. This suggests that language-perception interactions are driven by the feed-forward convergence of linguistic and perceptual information at higher-level conceptual and decision stages.
Affiliation(s)
- Jolien C. Francken: Donders Institute for Brain, Cognition and Behavior, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, Netherlands
- Erik L. Meijs: Donders Institute for Brain, Cognition and Behavior, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, Netherlands
- Odile M. Ridderinkhof: Department of Psychology, University of Amsterdam, Weesperplein 4, 1018 XA Amsterdam, Netherlands
- Peter Hagoort: Donders Institute for Brain, Cognition and Behavior, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, Netherlands; Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525 XD Nijmegen, Netherlands
- Floris P. de Lange: Donders Institute for Brain, Cognition and Behavior, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, Netherlands
- Simon van Gaal: Donders Institute for Brain, Cognition and Behavior, Radboud University Nijmegen, PO Box 9101, 6500 HB Nijmegen, Netherlands; Department of Psychology, University of Amsterdam, Weesperplein 4, 1018 XA Amsterdam, Netherlands
16. Webster J, Kay P, Webster MA. Perceiving the average hue of color arrays. J Opt Soc Am A Opt Image Sci Vis 2014;31:A283-A292. PMID: 24695184. PMCID: PMC3979548. DOI: 10.1364/josaa.31.00a283.
Abstract
The average of a color distribution has special significance for color coding (e.g., to estimate the illuminant), but how it depends on the visual representation (e.g., perceptual versus cone-opponent) or nonlinearities (e.g., categorical coding) is unknown. We measured the perceived average of two colors shown alternated in spatial arrays. Observers adjusted the components until the average equaled a specified reference hue. Matches for red, blue-red, or yellow-green were consistent with the arithmetic mean chromaticity, while blue-green settings deviated toward blue. The settings show little evidence for categorical coding, and cannot be predicted from the scaled appearances of the individual components.
Affiliation(s)
- Jacquelyn Webster (corresponding author): Department of Psychology, University of Nevada, Reno, Reno, Nevada 89557, USA
- Paul Kay: Department of Linguistics, University of California, Berkeley, Berkeley, California 94720, USA; International Computer Science Institute, Berkeley, California 94720-1776, USA
- Michael A. Webster: Department of Psychology, University of Nevada, Reno, Reno, Nevada 89557, USA
17. Boutonnet B, Dering B, Viñas-Guasch N, Thierry G. Seeing Objects through the Language Glass. J Cogn Neurosci 2013;25:1702-1710. DOI: 10.1162/jocn_a_00415.
Abstract
Recent streams of research support the Whorfian hypothesis according to which language affects one's perception of the world. However, studies of object categorization in different languages have heavily relied on behavioral measures that are fuzzy and inconsistent. Here, we provide the first electrophysiological evidence for unconscious effects of language terminology on object perception. Whereas English has two words for cup and mug, Spanish labels those two objects with the word “taza.” We tested native speakers of Spanish and English in an object detection task using a visual oddball paradigm, while measuring event-related brain potentials. The early deviant-related negativity elicited by deviant stimuli was greater in English than in Spanish participants. This effect, which relates to the existence of two labels in English versus one in Spanish, substantiates the neurophysiological evidence that language-specific terminology affects object categorization.
18. Myachykov A, Scheepers C, Shtyrov YY. Interfaces between language and cognition. Front Psychol 2013;4:258. PMID: 23653620. PMCID: PMC3644674. DOI: 10.3389/fpsyg.2013.00258.
Affiliation(s)
- Andriy Myachykov: Department of Psychology, Northumbria University, Newcastle upon Tyne, UK