1. Tomberg C, Petagna M, de Selliers de Moranville LA. Horses (Equus caballus) facial micro-expressions: insight into discreet social information. Sci Rep 2023; 13:8625. PMID: 37244937; DOI: 10.1038/s41598-023-35807-z.
Abstract
Facial micro-expressions are facial expressions that are brief (less than 500 ms) and involuntary. Because they had been described only in humans, we investigated whether micro-expressions could also be expressed by non-human animal species. Using the Equine Facial Action Coding System (EquiFACS), an objective tool based on facial muscle actions, we demonstrated that a non-human species, Equus caballus, expresses facial micro-expressions in a social context. The AU17, AD38, and AD1 were selectively modulated as micro-expressions, but not as standard facial expressions (all durations included), in the presence of a human experimenter. As standard facial expressions, these actions have been associated with pain or stress, but our results did not support this association for micro-expressions, which may convey other information. As in humans, the neural mechanisms underlying the production of micro-expressions may differ from those underlying standard facial expressions. We found that some micro-expressions could be related to attention and involved in the multisensory processing of the 'fixed attention' observed in horses' high attentional states. Micro-expressions could be used by horses as social information in an interspecies relationship. We hypothesize that facial micro-expressions may offer a window onto the transient internal states of the animal and may provide subtle and discreet social signals.
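The abstract's operational criterion for a micro-expression is purely temporal: a coded facial action lasting under 500 ms. As an illustration of how coded EquiFACS events might be split by that criterion, here is a minimal sketch; the 500 ms threshold comes from the abstract, while the event records and field layout are hypothetical:

```python
# Split coded facial action events into micro vs. standard expressions
# by the < 500 ms duration criterion stated in the abstract.
# The (action unit, onset, offset) event records are hypothetical examples.
MICRO_THRESHOLD_S = 0.5  # 500 ms

def classify_events(events):
    """Return (micro, standard) lists of (action_unit, duration_s) pairs."""
    micro, standard = [], []
    for action_unit, onset_s, offset_s in events:
        duration = offset_s - onset_s
        target = micro if duration < MICRO_THRESHOLD_S else standard
        target.append((action_unit, duration))
    return micro, standard

events = [
    ("AU17", 10.00, 10.30),   # chin raiser, 300 ms -> micro
    ("AD38", 12.10, 12.35),   # nostril dilator, 250 ms -> micro
    ("AU101", 15.00, 16.20),  # inner brow raiser, 1.2 s -> standard
]
micro, standard = classify_events(events)
```

The same filter applied per action unit would yield the duration-stratified counts on which the abstract's micro- versus standard-expression comparison rests.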
Affiliation(s)
- Claude Tomberg
- Faculty of Medicine, Université Libre de Bruxelles, 808, Route de Lennik, CP 630, 1070, Brussels, Belgium.
- Maxime Petagna
- Faculty of Medicine, Université Libre de Bruxelles, 808, Route de Lennik, CP 630, 1070, Brussels, Belgium.
2. Wu Q, Peng K, Xie Y, Lai Y, Liu X, Zhao Z. An ingroup disadvantage in recognizing micro-expressions. Front Psychol 2022; 13:1050068. PMID: 36507018; PMCID: PMC9732534; DOI: 10.3389/fpsyg.2022.1050068.
Abstract
Micro-expression is a fleeting facial expression of emotion that usually occurs in high-stakes situations and reveals the true emotion that a person tries to conceal. Due to its unique nature, micro-expression recognition has great applications in fields like law enforcement, medical treatment, and national security. However, the psychological mechanism of micro-expression recognition is still poorly understood. In the present research, we sought to expand upon previous research by investigating whether the group membership of the expresser influences the recognition of micro-expressions. In two behavioral studies, we found that, contrary to the widespread ingroup advantage found in macro-expression recognition, there was a robust ingroup disadvantage in micro-expression recognition. Specifically, in Studies 1A and 1B, we found that participants were more accurate at recognizing the intense and subtle micro-expressions of their racial outgroups than those of their racial ingroups, and neither training experience nor the duration of micro-expressions moderated this ingroup disadvantage. In Studies 2A and 2B, we further found that mere social categorization alone was sufficient to elicit the ingroup disadvantage for the recognition of intense and subtle micro-expressions, and this effect was likewise unaffected by the duration of micro-expressions. These results suggest that individuals spontaneously employ the social category information of others to recognize micro-expressions, and that the ingroup disadvantage in micro-expression recognition stems partly from motivated differential processing of ingroup micro-expressions.
Affiliation(s)
- Qi Wu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Correspondence: Qi Wu
- Kunling Peng
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yanni Xie
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yeying Lai
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Xuanchen Liu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Ziwei Zhao
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
3. Lin Q, Dong Z, Zheng Q, Wang SJ. The effect of facial attractiveness on micro-expression recognition. Front Psychol 2022; 13:959124. PMID: 36186390; PMCID: PMC9524498; DOI: 10.3389/fpsyg.2022.959124.
Abstract
Micro-expression (ME) is an extremely quick and uncontrollable facial movement that lasts 40–200 ms and reveals thoughts and feelings that an individual attempts to cover up. Although MEs are much more difficult to detect and recognize, ME recognition resembles macro-expression recognition in that it is influenced by facial features. Previous studies suggested that facial attractiveness could influence the processing of facial expression recognition. However, it remains unclear whether facial attractiveness also influences ME recognition. Addressing this issue, this study tested 38 participants with two ME recognition tasks, one static and one dynamic, each presenting three MEs (positive, neutral, and negative) at two attractiveness levels (attractive, unattractive). The results showed that participants recognized MEs on attractive faces much faster than on unattractive ones, and there was a significant interaction between ME and facial attractiveness. Furthermore, attractive happy faces were recognized faster in both the static and the dynamic conditions, highlighting the happiness superiority effect. Our results therefore provide the first evidence that facial attractiveness can influence ME recognition in both static and dynamic conditions.
Affiliation(s)
- Qiongsi Lin
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Zizhao Dong
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Qiuqiang Zheng
- Teacher Education Curriculum Center, School of Educational Science, Huizhou University, Huizhou, China
- Su-Jing Wang
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Correspondence: Su-Jing Wang
4. Ge Y, Su R, Liang Z, Luo J, Tian S, Shen X, Wu H, Liu C. Transcranial Direct Current Stimulation Over the Right Temporal Parietal Junction Facilitates Spontaneous Micro-Expression Recognition. Front Hum Neurosci 2022; 16:933831. PMID: 35874155; PMCID: PMC9305610; DOI: 10.3389/fnhum.2022.933831.
Abstract
Micro-expressions are fleeting and subtle emotional expressions. As they are spontaneous and cannot be consciously controlled, micro-expressions are considered an indicator of genuine emotions. Their accurate recognition and interpretation promote interpersonal interaction and social communication. Therefore, enhancing the ability to recognize micro-expressions has attracted much attention. In the current study, we investigated the effects of training on micro-expression recognition with a Chinese version of the Micro-Expression Training Tool (METT). Our goal was to confirm whether the recognition accuracy for spontaneous micro-expressions could be improved through training and brain stimulation. Since the right temporal parietal junction (rTPJ) has been shown to be involved in the explicit processing of facial emotion recognition, we hypothesized that the rTPJ would play a role in facilitating the recognition of micro-expressions. The results showed that anodal transcranial direct current stimulation (tDCS) of the rTPJ indeed improved the recognition of spontaneous micro-expressions, especially those associated with fear. The improved accuracy in recognizing fearful spontaneous micro-expressions was positively correlated with personal distress in the anodal group but not in the sham group. Our study suggests that the combined use of tDCS and the METT can be a viable way to train and enhance micro-expression recognition.
Affiliation(s)
- Yue Ge
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Beijing Institute of Biomedicine, Beijing, China
- Rui Su
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Zilu Liang
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Jing Luo
- Beijing Institute of Biomedicine, Beijing, China
- Suizi Tian
- School of Psychology, Beijing Normal University, Beijing, China
- Xunbing Shen
- College of Humanities, Jiangxi University of Chinese Medicine, Nanchang, China
- Haiyan Wu
- Centre for Cognitive and Brain Sciences and Department of Psychology, University of Macau, Taipa, China
- Chao Liu
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
5. Wu Q, Xie Y, Liu X, Liu Y. Oxytocin Impairs the Recognition of Micro-Expressions of Surprise and Disgust. Front Psychol 2022; 13:947418. PMID: 35846599; PMCID: PMC9277341; DOI: 10.3389/fpsyg.2022.947418.
Abstract
As fleeting facial expressions that reveal the emotion a person tries to conceal, micro-expressions have great application potential in fields like security, national defense, and medical treatment. However, the physiological basis for the recognition of these facial expressions is poorly understood. In the present research, we utilized a double-blind, placebo-controlled, mixed-model experimental design to investigate the effects of oxytocin on the recognition of micro-expressions in three behavioral studies. Specifically, in Studies 1 and 2, participants performed a laboratory-based standardized micro-expression recognition task after self-administration of a single dose of intranasal oxytocin (40 IU) or placebo (containing all ingredients except the neuropeptide). In Study 3, we further examined the effects of oxytocin on the recognition of natural micro-expressions. The results showed that intranasal oxytocin decreased the recognition speed for standardized intense micro-expressions of surprise (Study 1) and decreased the recognition accuracy for standardized subtle micro-expressions of disgust (Study 2). The results of Study 3 further revealed that intranasal oxytocin administration significantly reduced the recognition accuracy for natural micro-expressions of surprise and disgust. The present research is the first to investigate the effects of oxytocin on micro-expression recognition. It suggests that oxytocin mainly plays an inhibitory role in the recognition of micro-expressions and that there are fundamental differences in the neurophysiological bases for the recognition of micro-expressions and macro-expressions.
Affiliation(s)
- Qi Wu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Correspondence: Qi Wu
- Yanni Xie
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Xuanchen Liu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yulong Liu
- School of Finance and Management, Changsha Social Work College, Changsha, China
6. Bello H, Zhou B, Lukowicz P. Facial Muscle Activity Recognition with Reconfigurable Differential Stethoscope-Microphones. Sensors 2020; 20:4904. PMID: 32872633; PMCID: PMC7506891; DOI: 10.3390/s20174904.
Abstract
Many human activities and states are related to the actions of the facial muscles: from the expression of emotions, stress, and non-verbal communication, through health-related actions such as coughing and sneezing, to nutrition and drinking. In this work, we describe in detail the design and evaluation of a wearable system for facial muscle activity monitoring based on a reconfigurable differential array of stethoscope-microphones. In our system, six stethoscopes are placed at locations that could easily be integrated into the frame of smart glasses. The paper describes the detailed hardware design and the selection and adaptation of appropriate signal processing and machine learning methods. For the evaluation, we asked eight participants to imitate a set of facial actions, such as expressions of happiness, anger, surprise, sadness, upset, and disgust, and gestures like kissing, winking, sticking the tongue out, and taking a pill. We evaluated a complete data set of 2640 events with a 66%/33% train/test split. Although we encountered high variability in the volunteers' expressions, our approach achieved a recall of 55%, precision of 56%, and F1-score of 54% in the user-independent scenario (9% chance level). On a user-dependent basis, our worst result had an F1-score of 60% and our best an F1-score of 89%, with a recall ≥60% for expressions like happiness, anger, kissing, sticking the tongue out, and neutral (null class).
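The reported F1-score follows the usual definition as the harmonic mean of precision and recall. A quick check against the user-independent figures above (precision 56%, recall 55%) gives roughly 55%; the abstract's 54% is presumably a macro-average of per-class F1-scores, which need not equal the F1 computed from the averaged precision and recall:

```python
# F1-score as the harmonic mean of precision and recall.
def f1_score(precision, recall):
    return 2 * precision * recall / (precision + recall)

# User-independent figures quoted in the abstract: precision 56%, recall 55%.
f1 = f1_score(0.56, 0.55)  # ~0.555, close to (but not exactly) the reported 54%
```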
Affiliation(s)
- Hymalai Bello
- German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
- Correspondence:
- Bo Zhou
- German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
- Paul Lukowicz
- German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
- Department of Computer Science, University of Kaiserslautern, 67663 Kaiserslautern, Germany
7. Matsumoto D, Hwang HC. Commentary: Electrophysiological Evidence Reveals Differences between the Recognition of Microexpressions and Macroexpressions. Front Psychol 2019; 10:1293. PMID: 31263437; PMCID: PMC6584814; DOI: 10.3389/fpsyg.2019.01293.
Affiliation(s)
- David Matsumoto
- Department of Psychology, San Francisco State University, San Francisco, CA, United States
- Humintell, El Cerrito, CA, United States
- Correspondence: David Matsumoto
8. Shen X, Chen W, Zhao G, Hu P. Editorial: Recognizing Microexpression: An Interdisciplinary Perspective. Front Psychol 2019; 10:1318. PMID: 31214101; PMCID: PMC6558206; DOI: 10.3389/fpsyg.2019.01318.
Affiliation(s)
- Xunbing Shen
- Department of Psychology, Jiangxi University of Traditional Chinese Medicine, Nanchang, China
- Wenfeng Chen
- Department of Psychology, Renmin University of China, Beijing, China
- Guoying Zhao
- Center for Machine Vision and Signal Analysis, University of Oulu, Oulu, Finland
- Ping Hu
- Department of Psychology, Renmin University of China, Beijing, China
9. Zhu C, Yin M, Chen X, Zhang J, Liu D. Ecological micro-expression recognition characteristics of young adults with subthreshold depression. PLoS One 2019; 14:e0216334. PMID: 31042784; PMCID: PMC6493753; DOI: 10.1371/journal.pone.0216334.
Abstract
The micro-expression (ME) processing characteristics of patients with depression have been studied but have not been investigated in people with subthreshold depression. Adopting the ecological ME recognition paradigm, this study therefore explored ME recognition in people with subthreshold depression. A 4 (background expression: happy, neutral, sad, and fearful) × 4 (ME: happy, neutral, sad, and fearful) study was designed; two groups of participants (an experimental group with subthreshold depression vs. a healthy control group, 32 participants each) completed the ecological ME recognition task, and the corresponding accuracy (ACC) and reaction time (RT) were analyzed. Results: (1) Under all background conditions, recognizing happy MEs had the highest ACC and shortest RT. (2) There was no significant difference in ACC and RT between the experimental and control groups. (3) In different contexts, individuals with subthreshold depression tended to misjudge neutral, sad, and fearful MEs as happy, while neutral MEs were also misjudged as sad and fearful. (4) The performance of individuals with subthreshold depression in the ecological ME recognition task was influenced by the type of ME; they showed the highest ACC and shortest RT when recognizing happy MEs (vs. the other MEs). Conclusions: (1) Individuals' ecological ME recognition was influenced by the background expression, underscoring the need for ecological ME recognition paradigms. (2) Individuals with subthreshold depression showed normal ecological ME recognition ability. (3) In terms of misjudgments, individuals with subthreshold depression showed both positive and negative biases when completing the ecological ME recognition task. (4) Compared with the other MEs, happy MEs showed a recognition advantage for individuals with subthreshold depression.
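Misjudgment patterns like those in result (3) are typically read off a confusion matrix of presented versus reported ME labels. A minimal sketch using the four labels from the study design; the trial records and counts here are purely illustrative, not the study's data:

```python
from collections import Counter

# Tally (presented ME, reported ME) pairs into a confusion matrix.
# Labels follow the 4 x 4 design in the abstract; trials are hypothetical.
def confusion_matrix(trials):
    return Counter(trials)

trials = [
    ("neutral", "happy"), ("neutral", "sad"),
    ("sad", "happy"), ("fearful", "happy"),
    ("happy", "happy"),
]
cm = confusion_matrix(trials)

# Positive bias: non-happy MEs misjudged as happy.
misjudged_as_happy = sum(
    count for (shown, reported), count in cm.items()
    if reported == "happy" and shown != "happy"
)
```

Summing off-diagonal cells by reported label, as above, is what distinguishes a positive bias (errors toward happy) from a negative bias (errors toward sad or fearful).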
Affiliation(s)
- Chuanlin Zhu
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Ming Yin
- Department of Criminal Investigation, Jiangsu Police Institute, Nanjing, Jiangsu, China
- Xinyun Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Jianxin Zhang
- School of Humanities, Jiangnan University, Wuxi, Jiangsu, China
- Dianzhi Liu
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
10. Zeng X, Wu Q, Zhang S, Liu Z, Zhou Q, Zhang M. A False Trail to Follow: Differential Effects of the Facial Feedback Signals From the Upper and Lower Face on the Recognition of Micro-Expressions. Front Psychol 2018; 9:2015. PMID: 30405497; PMCID: PMC6208096; DOI: 10.3389/fpsyg.2018.02015.
Abstract
Micro-expressions, as fleeting facial expressions, are very important for judging people's true emotions and thus can provide an essential behavioral clue for detecting lies and dangerous demeanor. From embodied accounts of cognition, we derived a novel hypothesis that facial feedback from the upper and lower facial regions has differential effects on micro-expression recognition. This hypothesis was tested and supported across three studies. Specifically, the results of Study 1 showed that people became better judges of intense micro-expressions with a duration of 450 ms when facial feedback from the upper face was enhanced via a restricting gel. The results of Study 2 showed that the recognition accuracy of subtle micro-expressions was significantly impaired under all duration conditions (50, 150, 333, and 450 ms) when facial feedback from the lower face was enhanced. In addition, the results of Study 3 revealed that blocking the facial feedback of the lower face significantly boosted the recognition accuracy of subtle and intense micro-expressions under both duration conditions (150 and 450 ms). Together, these results highlight the role of facial feedback in judging the subtle movements of micro-expressions.
Affiliation(s)
- Xuemei Zeng
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Qi Wu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Siwei Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Zheying Liu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Qing Zhou
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Meishan Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
11. Marshall CR, Hardy CJD, Russell LL, Clark CN, Bond RL, Dick KM, Brotherhood EV, Mummery CJ, Schott JM, Rohrer JD, Kilner JM, Warren JD. Motor signatures of emotional reactivity in frontotemporal dementia. Sci Rep 2018; 8:1030. PMID: 29348485; PMCID: PMC5773553; DOI: 10.1038/s41598-018-19528-2.
Abstract
Automatic motor mimicry is essential to the normal processing of perceived emotion, and disrupted automatic imitation might underpin socio-emotional deficits in neurodegenerative diseases, particularly the frontotemporal dementias. However, the pathophysiology of emotional reactivity in these diseases has not been elucidated. We studied facial electromyographic responses during emotion identification on viewing videos of dynamic facial expressions in 37 patients representing canonical frontotemporal dementia syndromes versus 21 healthy older individuals. Neuroanatomical associations of emotional expression identification accuracy and facial muscle reactivity were assessed using voxel-based morphometry. Controls showed characteristic profiles of automatic imitation, and this response predicted correct emotion identification. Automatic imitation was reduced in the behavioural and right temporal variant groups, while the normal coupling between imitation and correct identification was lost in the right temporal and semantic variant groups. Grey matter correlates of emotion identification and imitation were delineated within a distributed network including primary visual and motor, prefrontal, insular, anterior temporal and temporo-occipital junctional areas, with common involvement of supplementary motor cortex across syndromes. Impaired emotional mimesis may be a core mechanism of disordered emotional signal understanding and reactivity in frontotemporal dementia, with implications for the development of novel physiological biomarkers of socio-emotional dysfunction in these diseases.
Affiliation(s)
- Charles R Marshall
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK.
- Sobell Department of Motor Neuroscience and Movement Disorders, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK.
- Chris J D Hardy
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Lucy L Russell
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Camilla N Clark
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Rebecca L Bond
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Katrina M Dick
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Emilie V Brotherhood
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Cath J Mummery
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Jonathan M Schott
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Jonathan D Rohrer
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- James M Kilner
- Sobell Department of Motor Neuroscience and Movement Disorders, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
- Jason D Warren
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
12. Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review. Appl Sci (Basel) 2017. DOI: 10.3390/app7121239.