1
Quadrelli E, Roberti E, Polver S, Bulf H, Turati C. Sensorimotor Activity and Network Connectivity to Dynamic and Static Emotional Faces in 7-Month-Old Infants. Brain Sci 2021;11:1396. PMID: 34827394; PMCID: PMC8615901; DOI: 10.3390/brainsci11111396. Open access.
Abstract
The present study investigated whether, as in adults, 7-month-old infants’ sensorimotor brain areas are recruited in response to the observation of emotional facial expressions. Activity of the sensorimotor cortex, indexed by µ rhythm suppression, was recorded with electroencephalography (EEG) while infants observed neutral, angry, and happy facial expressions in either a static (N = 19) or a dynamic (N = 19) condition. Graph theory analysis was used to investigate to what extent neural activity was functionally localized in specific cortical areas. Happy facial expressions elicited greater sensorimotor activation than angry faces in the dynamic condition, whereas no difference was found among the three expressions in the static condition. Results also revealed that happy, but not angry or neutral, expressions elicited significant right-lateralized activation in the dynamic condition. Furthermore, dynamic emotional faces were processed more efficiently, eliciting higher global efficiency and lower network diameter than static faces. Overall, the current results suggest that, contrary to neutral and angry faces, happy expressions elicit sensorimotor activity at 7 months, and that dynamic emotional faces are processed more efficiently by functional brain networks. Finally, the current data provide evidence of right-lateralized activity for the processing of happy facial expressions.
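The two network metrics invoked in this abstract have standard graph-theoretic definitions: global efficiency is the mean inverse shortest-path length over all node pairs, and diameter is the longest shortest path. A minimal pure-Python sketch on toy graphs (illustrative adjacency data only, not the study's EEG connectivity networks) shows why a better-integrated network scores higher efficiency and lower diameter:

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS hop distances from src over an adjacency dict {node: [neighbors]}."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs."""
    n = len(adj)
    total = sum(1.0 / d for s in adj
                for d in shortest_paths(adj, s).values() if d > 0)
    return total / (n * (n - 1))

def diameter(adj):
    """Longest shortest path in the network."""
    return max(max(shortest_paths(adj, s).values()) for s in adj)

# Toy 6-node ring vs. the same ring with three cross-links added
# (stand-ins for a less vs. more integrated functional network).
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
linked = {i: list(ring[i]) for i in range(6)}
for a, b in [(0, 3), (1, 4), (2, 5)]:
    linked[a].append(b)
    linked[b].append(a)

print(round(global_efficiency(ring), 3), diameter(ring))      # 0.667 3
print(round(global_efficiency(linked), 3), diameter(linked))  # 0.8 2
```

Adding shortcuts raises efficiency and shrinks the diameter, the same direction of change the study reports for dynamic relative to static faces.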
Affiliation(s)
- Ermanno Quadrelli
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
- Correspondence: Tel. +39-026-448-3775
- Elisa Roberti
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
- Silvia Polver
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- Hermann Bulf
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
- Chiara Turati
- Department of Psychology, University of Milano-Bicocca, Edificio U6, Piazza dell’Ateneo Nuovo 1, 20126 Milano, Italy
- NeuroMI, Milan Center for Neuroscience, 20126 Milano, Italy
2
Wang Y, Chen J, Ku Y. Subliminal affective priming effect: Dissociated processes for intense versus normal facial expressions. Brain Cogn 2020;148:105674. PMID: 33388551; DOI: 10.1016/j.bandc.2020.105674.
Abstract
Positive and negative intense facial expressions are difficult to distinguish explicitly; whether they dissociate when presented subliminally remained unclear. In three experiments using affective priming paradigms, we assessed how intense facial expressions, presented briefly (17 ms) and masked, influenced judgments of subsequent ambiguous neutral words (Experiment 1) or visible facial expressions (Experiments 2 and 3). In each experiment, we also compared these results with those obtained using normal facial expressions as primes. All experiments showed masked affective priming effects (biased valence judgments of neutral words, or faster reaction times to faces sharing the prime's valence) for normal facial expressions, but not for intense ones. Experiment 3, using event-related potentials (ERPs), further revealed that two ERP components, the N250 and the LPP, tracked the behavioral effects in the normal condition (larger amplitudes when prime and target valences differed) but not in the intense condition. Taken together, our results provide behavioral and neural evidence for distinct processing of normal and intense facial expressions under masked conditions.
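The priming effect in reaction-time tasks of this kind is conventionally quantified as the mean RT cost of incongruent relative to congruent prime–target pairs. A minimal sketch with made-up RTs (the numbers are purely illustrative, not the study's data):

```python
from statistics import mean

# Hypothetical reaction times in ms, keyed by (prime type, congruency).
rts = {
    ("normal",  "congruent"):   [512, 498, 505, 490],
    ("normal",  "incongruent"): [548, 560, 541, 555],
    ("intense", "congruent"):   [530, 525, 534, 528],
    ("intense", "incongruent"): [531, 527, 533, 529],
}

def priming_effect(prime_type):
    """Incongruent minus congruent mean RT; positive = congruent primes speed responses."""
    return (mean(rts[(prime_type, "incongruent")])
            - mean(rts[(prime_type, "congruent")]))

print(priming_effect("normal"))   # 49.75 — sizeable effect for normal expressions
print(priming_effect("intense"))  # 0.75  — near zero for intense expressions
```

A sizeable effect for normal primes alongside a near-zero effect for intense primes is the dissociation pattern the abstract describes.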
Affiliation(s)
- Yanmei Wang
- School of Psychology and Cognitive Science, East China Normal University, Shanghai 200062, China
- Jie Chen
- School of Psychology and Cognitive Science, East China Normal University, Shanghai 200062, China
- Yixuan Ku
- Guangdong Provincial Key Laboratory of Social Cognitive Neuroscience and Mental Health, Department of Psychology, Sun Yat-Sen University, Guangzhou 510006, China; Peng Cheng Laboratory, Shenzhen 518055, China.
3
Ross P, Atkinson AP. Expanding Simulation Models of Emotional Understanding: The Case for Different Modalities, Body-State Simulation Prominence, and Developmental Trajectories. Front Psychol 2020;11:309. PMID: 32194476; PMCID: PMC7063097; DOI: 10.3389/fpsyg.2020.00309. Open access.
Abstract
Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, Durham, United Kingdom
4
Barabanschikov V, Korolkova O. Perception of “Live” Facial Expressions. Experimental Psychology (Russia) 2020. DOI: 10.17759/exppsy.2020130305.
Abstract
The article reviews experimental studies of interpersonal perception based on static and dynamic facial expressions, a unique source of information about a person’s inner world. The focus is on how a moving face is perceived when embedded in communication and joint activity, as an alternative to the more commonly studied perception of static images outside any behavioral context. The review covers four interrelated topics: facial statics and dynamics in the recognition of emotional expressions; the specificity of perceiving moving facial expressions; multimodal integration of emotional cues; and the generation and perception of facial expressions in communication. The analysis identifies the most promising directions for research on the face in motion. We show that the static and dynamic modes of facial perception complement each other, and we describe the role of qualitative features of facial expression dynamics in assessing a person’s emotional state. Facial expression is considered part of a holistic, multimodal manifestation of emotion, and the importance of facial movements as an instrument of social interaction is emphasized.