1. Sato Y, Nishimaru H, Matsumoto J, Setogawa T, Nishijo H. Electroencephalographic Effective Connectivity Analysis of the Neural Networks during Gesture and Speech Production Planning in Young Adults. Brain Sci 2023;13(1):100. PMID: 36672081; PMCID: PMC9856316; DOI: 10.3390/brainsci13010100. Received 11/24/2022; revised 12/19/2022; accepted 12/29/2022.
Abstract
Gestures and speech, as linked communicative expressions, form an integrated system. Previous functional magnetic resonance imaging studies have suggested that the neural networks for gesture and spoken word production share similar fronto-temporo-parietal brain regions. However, information flow within the neural network may change dynamically during the planning of the two communicative expressions and may also differ between them. In this study, to investigate dynamic information flow in the neural network during the planning of gesture and spoken word generation, participants were presented with spatial images and were required to plan gestures or spoken words representing the same spatial situations. The evoked potentials in response to the spatial images were recorded to analyze effective connectivity within the neural network. An independent component analysis of the evoked potentials yielded 12 clusters of independent components, whose dipoles were located in the bilateral fronto-temporo-parietal brain regions and on the medial wall of the frontal and parietal lobes. Comparison of effective connectivity indicated that information flow from the right middle cingulate gyrus (MCG) to the left supplementary motor area (SMA), and from the left SMA to the left precentral area, increased during gesture planning compared with word planning. Furthermore, information flow from the right MCG to the left superior frontal gyrus also increased during gesture planning compared with word planning. These results suggest that information flow to the brain regions for hand praxis is more strongly activated during gesture planning than during word planning.
Affiliation(s)
- Yohei Sato
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-0194, Japan
- Hiroshi Nishimaru
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-0194, Japan
  - Research Center for Idling Brain Science (RCIBS), University of Toyama, Toyama 930-0194, Japan
- Jumpei Matsumoto
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-0194, Japan
  - Research Center for Idling Brain Science (RCIBS), University of Toyama, Toyama 930-0194, Japan
- Tsuyoshi Setogawa
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-0194, Japan
  - Research Center for Idling Brain Science (RCIBS), University of Toyama, Toyama 930-0194, Japan
- Hisao Nishijo (correspondence)
  - Department of System Emotional Science, Faculty of Medicine, University of Toyama, Toyama 930-0194, Japan
  - Research Center for Idling Brain Science (RCIBS), University of Toyama, Toyama 930-0194, Japan
2. He Y, Luell S, Muralikrishnan R, Straube B, Nagels A. Gesture's body orientation modulates the N400 for visual sentences primed by gestures. Hum Brain Mapp 2020;41:4901-4911. PMID: 32808721; PMCID: PMC7643362; DOI: 10.1002/hbm.25166. Received 07/03/2020; revised 07/16/2020; accepted 07/23/2020.
Abstract
Body orientation of a gesture conveys social-communicative intention, and may thus influence how gestures are perceived and comprehended together with auditory speech during face-to-face communication. To date, despite an emerging neuroscientific literature on the role of body orientation in hand-action perception, few studies have directly investigated the role of body orientation in the interaction between gesture and language. To address this question, we carried out an electroencephalography (EEG) experiment in which participants (n = 21) viewed 5-s videos of frontal and lateral communicative hand gestures (e.g., raising a hand), followed by visually presented sentences that were either congruent or incongruent with the gesture (e.g., "the mountain is high/low…"). Participants performed a semantic probe task, judging whether a target word was related or unrelated to the gesture-sentence event. The EEG results suggest that, during the perception phase of hand gestures, although both frontal and lateral gestures elicited a power decrease in both the alpha (8-12 Hz) and the beta (16-24 Hz) bands, lateral versus frontal gestures elicited a reduced power decrease in the beta band, source-located to the medial prefrontal cortex. For sentence comprehension, at the critical word whose meaning was congruent/incongruent with the gesture prime, frontal gestures elicited an N400 effect for gesture-sentence incongruency. More importantly, this incongruency effect was significantly reduced for lateral gestures. These findings suggest that body orientation plays an important role in gesture perception, and that its inferred social-communicative intention may influence gesture-language interaction at the semantic level.
Affiliation(s)
- Yifei He
  - Department of Psychiatry and Psychotherapy, Philipps‐University Marburg, Marburg, Germany
- Svenja Luell
  - Department of General Linguistics, Johannes‐Gutenberg University Mainz, Mainz, Germany
- R. Muralikrishnan
  - Department of Neuroscience, Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany
- Benjamin Straube
  - Department of Psychiatry and Psychotherapy, Philipps‐University Marburg, Marburg, Germany
- Arne Nagels
  - Department of General Linguistics, Johannes‐Gutenberg University Mainz, Mainz, Germany
3. Momsen J, Gordon J, Wu YC, Coulson S. Verbal working memory and co-speech gesture processing. Brain Cogn 2020;146:105640. PMID: 33171343; DOI: 10.1016/j.bandc.2020.105640. Received 04/09/2020; revised 09/21/2020; accepted 10/19/2020.
Abstract
Multimodal discourse requires an assembly of cognitive processes that are uniquely recruited for language comprehension in social contexts. In this study, we investigated the role of verbal working memory for the online integration of speech and iconic gestures. Participants memorized and rehearsed a series of auditorily presented digits in low (one digit) or high (four digits) memory load conditions. To observe how verbal working memory load impacts online discourse comprehension, ERPs were recorded while participants watched discourse videos containing either congruent or incongruent speech-gesture combinations during the maintenance portion of the memory task. While expected speech-gesture congruity effects were found in the low memory load condition, high memory load trials elicited enhanced frontal positivities that indicated a unique interaction between online speech-gesture integration and the availability of verbal working memory resources. This work contributes to an understanding of discourse comprehension by demonstrating that language processing in a multimodal context is subject to the relationship between cognitive resource availability and the degree of controlled processing required for task performance. We suggest that verbal working memory is less important for speech-gesture integration than it is for mediating speech processing under high task demands.
Affiliation(s)
- Jacob Momsen
  - Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and UC San Diego, United States
- Jared Gordon
  - Cognitive Science Department, UC San Diego, United States
- Ying Choon Wu
  - Swartz Center for Computational Neuroscience, UC San Diego, United States
- Seana Coulson
  - Joint Doctoral Program in Language and Communicative Disorders, San Diego State University and UC San Diego, United States
  - Cognitive Science Department, UC San Diego, United States
4. De Stefani E, De Marco D. Language, Gesture, and Emotional Communication: An Embodied View of Social Interaction. Front Psychol 2019;10:2063. PMID: 31607974; PMCID: PMC6769117; DOI: 10.3389/fpsyg.2019.02063. Received 04/12/2019; accepted 08/26/2019.
Abstract
Spoken language is an innate ability of the human being and represents the most widespread mode of social communication. The ability to share concepts, intentions, and feelings, and to respond to what others are feeling or saying, is crucial during social interactions. A growing body of evidence suggests that language evolved from manual gestures, gradually incorporating motor acts with vocal elements. In this evolutionary context, the human mirror mechanism (MM) would permit the passage from “doing something” to “communicating it to someone else.” In this perspective, the MM would mediate semantic processes, being involved both in the execution and in the understanding of messages expressed by words or gestures. Thus, the recognition of action-related words would activate somatosensory regions, reflecting the semantic grounding of these symbols in action information. Here, the role of the sensorimotor cortex, and of the human MM in general, in both language perception and understanding is addressed, focusing on recent studies of the integration between symbolic gestures and speech. We conclude by documenting evidence that the MM also codes the emotional aspects conveyed by manual, facial, and body signals during communication, and that these act in concert with language to modulate comprehension of others’ messages and behavior, in line with an “embodied” and integrated view of social interaction.
Affiliation(s)
- Doriana De Marco
  - Consiglio Nazionale delle Ricerche, Istituto di Neuroscienze, Parma, Italy
5. Children with facial paralysis due to Moebius syndrome exhibit reduced autonomic modulation during emotion processing. J Neurodev Disord 2019;11:12. PMID: 31291910; PMCID: PMC6617955; DOI: 10.1186/s11689-019-9272-2. Received 09/17/2018; accepted 06/21/2019.
Abstract
BACKGROUND: Facial mimicry is crucial to the recognition of others' emotional states: the observation of others' facial expressions activates the same neural representation of that affective state in the observer, along with related autonomic and somatic responses. What happens, then, when someone cannot mimic others' facial expressions? METHODS: We investigated whether psychophysiological emotional responses to others' facial expressions were impaired in 13 children (aged 9 years) with Moebius syndrome (MBS), an extremely rare neurological disorder (1/250,000 live births) characterized by congenital facial paralysis. We inspected autonomic responses and vagal regulation through facial cutaneous thermal variations and the computation of respiratory sinus arrhythmia (RSA). These parameters provide measures of emotional arousal and show the autonomic adaptation to others' social cues. Physiological responses in children with MBS were recorded during the observation of dynamic facial expressions and were compared to those of a control group (16 non-affected children, aged 9 years). RESULTS: There were significant group effects on thermal patterns and RSA, with lower values in children with MBS. We also observed a mild deficit in emotion recognition in these patients. CONCLUSION: The results support the "embodied" theory, whereby the congenital inability to produce facial expressions induces alterations in the processing of facial expressions of emotion. Such alterations may constitute a risk for emotion dysregulation.
6. De Stefani E, Nicolini Y, Belluardo M, Ferrari PF. Congenital facial palsy and emotion processing: The case of Moebius syndrome. Genes Brain Behav 2019;18:e12548. PMID: 30604920; DOI: 10.1111/gbb.12548. Received 09/17/2018; revised 11/16/2018; accepted 12/15/2018.
Abstract
According to the Darwinian perspective, facial expressions of emotion evolved to quickly communicate emotional states and serve adaptive functions that promote social interactions. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, the inability to form facial expressions would affect the experience of emotional understanding. In this review, we aimed to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and the inability to form facial expressions, making them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous/mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during developmental age.
Affiliation(s)
- Elisa De Stefani
  - Department of Medicine and Surgery, University of Parma, Parma, Italy
- Ylenia Nicolini
  - Department of Medicine and Surgery, University of Parma, Parma, Italy
- Mauro Belluardo
  - Department of Medicine and Surgery, University of Parma, Parma, Italy
- Pier Francesco Ferrari
  - Department of Medicine and Surgery, University of Parma, Parma, Italy
  - Institut des Sciences Cognitives Marc Jeannerod, CNRS, Université de Lyon, Lyon, France
7. Calbi M, Siri F, Heimann K, Barratt D, Gallese V, Kolesnikov A, Umiltà MA. How context influences the interpretation of facial expressions: a source localization high-density EEG study on the "Kuleshov effect". Sci Rep 2019;9:2107. PMID: 30765713; PMCID: PMC6376122; DOI: 10.1038/s41598-018-37786-y. Received 08/07/2018; accepted 12/12/2018.
Abstract
Few studies have explored the specifics of how context modulates the processing of facial expressions at the neuronal level. This study fills this gap by employing an original paradigm based on a version of the filmic “Kuleshov effect”. High-density EEG was recorded while participants watched film sequences consisting of three shots: a close-up of a target person’s neutral face (Face_1), the scene that the target person was looking at (happy, fearful, or neutral), and another close-up of the same target person’s neutral face (Face_2). The participants’ task was to rate both valence and arousal, and subsequently to categorize the target person’s emotional state. The results indicate that, despite a significant behavioural ‘context’ effect, the electrophysiological indices still show that the face is evaluated as neutral. Specifically, Face_2 elicited a high-amplitude N170 when preceded by neutral contexts, and a high-amplitude Late Positive Potential (LPP) when preceded by emotional contexts, thus showing sensitivity to the evaluative congruence (N170) and incongruence (LPP) between context and Face_2. The LPP activity was mainly underpinned by brain regions involved in the processing of facial expressions and emotion recognition. Our results shed new light on the temporal and neural correlates of context sensitivity in the interpretation of facial expressions.
Affiliation(s)
- Marta Calbi
  - Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Francesca Siri
  - Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Katrin Heimann
  - Interacting Minds Center, University of Aarhus, Aarhus, Denmark
- Daniel Barratt
  - Department of Management, Society and Communication, Copenhagen Business School, Copenhagen, Denmark
- Vittorio Gallese
  - Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
  - Institute of Philosophy, School of Advanced Study, University of London, London, UK
- Anna Kolesnikov
  - Department of Humanities, Social Sciences and Cultural Industries, University of Parma, Parma, Italy
8. Dalla Volta R, Avanzini P, De Marco D, Gentilucci M, Fabbri-Destro M. From meaning to categorization: The hierarchical recruitment of brain circuits selective for action verbs. Cortex 2017;100:95-110. PMID: 29079343; DOI: 10.1016/j.cortex.2017.09.012. Received 01/23/2017; revised 05/20/2017; accepted 09/20/2017.
Abstract
Sensorimotor and affective brain systems are known to be involved in language processing. However, it is still debated whether this involvement is a crucial step in semantic processing or, on the contrary, depends on the specific context or strategy adopted to solve the task at hand. The present electroencephalographic (EEG) study investigated which brain circuits are engaged when processing written verbs. By aligning event-related potentials (ERPs) both to verb onset and to the motor response indexing the accomplishment of a semantic categorization task, we were able to dissociate the stimulus-related and response-related cognitive components at play. EEG source reconstruction showed that while the recruitment of sensorimotor fronto-parietal circuits was time-locked to action verb onset, a left temporo-parietal circuit was time-locked to task accomplishment. Crucially, a comparison of the time courses of these bottom-up and top-down cognitive components shows that the frontal motor involvement precedes the task-related temporo-parietal activity. The present findings suggest that the recruitment of fronto-parietal sensorimotor circuits is independent of the specific strategy adopted to solve a semantic task and, given its temporal precedence, may provide crucial information to the brain circuits involved in the categorization task. Finally, we discuss how the present results may contribute to the clinical literature on patients affected by disorders that specifically impair the motor system.
Affiliation(s)
- Riccardo Dalla Volta
  - Dipartimento di Scienze Mediche e Chirurgiche, Università Magna Graecia, Catanzaro, Italy
- Pietro Avanzini
  - Consiglio Nazionale delle Ricerche (CNR), Istituto di Neuroscienze, Parma, Italy
- Doriana De Marco
  - Consiglio Nazionale delle Ricerche (CNR), Istituto di Neuroscienze, Parma, Italy
9. "Embodied Body Language": an electrical neuroimaging study with emotional faces and bodies. Sci Rep 2017;7:6875. PMID: 28761076; PMCID: PMC5537350; DOI: 10.1038/s41598-017-07262-0. Received 10/13/2016; accepted 06/28/2017.
Abstract
To date, most investigations in affective neuroscience have focused mainly on the processing of facial expressions, overlooking emotional body language (EBL) despite its capacity to express our emotions. Few electrophysiological studies have investigated the time course and neural correlates of EBL and the integration of face- and body-related emotional information. The aim of the present study was to investigate both the time course and the neural correlates underlying the integration of affective information conveyed by faces and bodies. We analysed EEG activity evoked during an expression-matching task requiring judgments of emotional congruence between sequentially presented pairs of stimuli belonging to the same category (face-face or body-body) or to different categories (face-body or body-face). We focused on the N400 time window, and the results showed that incongruent stimuli elicited a modulation of the N400 in all comparisons except the body-face condition. This modulation was mainly detected in the middle temporal gyrus and within regions related to the mirror mechanism. More specifically, while the perception of incongruent facial expressions activates somatosensory-related representations, incongruent emotional body postures also require the activation of motor and premotor representations, suggesting a strict link between emotion and action.
10. Hayes JC, Kraemer DJM. Grounded understanding of abstract concepts: The case of STEM learning. Cogn Res Princ Implic 2017;2:7. PMID: 28203635; PMCID: PMC5281667; DOI: 10.1186/s41235-016-0046-z. Received 08/04/2016; accepted 12/23/2016.
Abstract
Characterizing the neural implementation of abstract conceptual representations has long been a contentious topic in cognitive science. At the heart of the debate is whether the "sensorimotor" machinery of the brain plays a central role in representing concepts, or whether the involvement of these perceptual and motor regions is merely peripheral or epiphenomenal. The domain of science, technology, engineering, and mathematics (STEM) learning provides an important proving ground for sensorimotor (or grounded) theories of cognition, as concepts in science and engineering courses are often taught through laboratory-based and other hands-on methodologies. In this review of the literature, we examine evidence suggesting that sensorimotor processes strengthen learning associated with the abstract concepts central to STEM pedagogy. After considering how contemporary theories have defined abstraction in the context of semantic knowledge, we propose our own explanation for how body-centered information, as computed in sensorimotor brain regions and visuomotor association cortex, can form a useful foundation upon which to build an understanding of abstract scientific concepts, such as mechanical force. Drawing from theories in cognitive neuroscience, we then explore models elucidating the neural mechanisms involved in grounding intangible concepts, including Hebbian learning, predictive coding, and neuronal recycling. Empirical data on STEM learning through hands-on instruction are considered in light of these neural models. We conclude the review by proposing three distinct ways in which the field of cognitive neuroscience can contribute to STEM learning by bolstering our understanding of how the brain instantiates abstract concepts in an embodied fashion.
Affiliation(s)
- Justin C. Hayes
  - Department of Psychological and Brain Sciences, Dartmouth College, Moore Hall, 3 Maynard St., Hanover, NH 03755, USA
  - Department of Education, Dartmouth College, 5 Maynard St., Hanover, NH 03755, USA
- David J. M. Kraemer
  - Department of Education, Dartmouth College, 5 Maynard St., Hanover, NH 03755, USA