1
Jin H, He N, Li Z, Yang P. Micro-expression recognition based on multi-scale 3D residual convolutional neural network. Math Biosci Eng 2024; 21:5007-5031. [PMID: 38872524] [DOI: 10.3934/mbe.2024221]
Abstract
In demanding application scenarios such as clinical psychotherapy and criminal interrogation, accurate recognition of micro-expressions is of utmost importance but poses significant challenges, chief among them capturing weak and fleeting facial features well enough to improve recognition performance. To address this issue, this paper proposed a novel architecture based on a multi-scale 3D residual convolutional neural network. The algorithm used a deep 3D-ResNet50 as the backbone model and took the micro-expression optical flow feature map as the network input. To capture the complex spatial and temporal features inherent in micro-expressions, the network incorporated multi-scale convolutional modules of varying kernel sizes to integrate both global and local information. Furthermore, an attention-based feature fusion module was introduced to enhance the model's contextual awareness. Finally, to optimize the model's prediction, a discriminative network structure with multiple output channels was constructed. The algorithm's performance was evaluated on the public datasets SMIC, SAMM, and CASME II, where it achieved recognition accuracies of 74.6%, 84.77%, and 91.35%, respectively. This is a substantial improvement over existing mainstream methods for extracting subtle micro-expression features, and it makes the work a useful reference for researchers pursuing high-precision micro-expression recognition.
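The optical-flow input stage that this and several later entries rely on can be illustrated with a toy example. The sketch below, assuming only NumPy, estimates a single global displacement between two grayscale frames via the Lucas-Kanade normal equations; it is a stand-in for the dense flow fields (e.g., TV-L1 or Farneback) that such pipelines typically feed to the network, and the function name and whole-image window are illustrative choices, not the paper's implementation.

```python
import numpy as np

def lucas_kanade_global(prev, curr):
    """Estimate one (u, v) displacement between two grayscale frames
    by solving the Lucas-Kanade normal equations over the whole image
    (a toy stand-in for dense optical flow feature maps)."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    # Spatial gradients of the first frame (central differences)
    # and the temporal gradient between the frames.
    Ix = np.gradient(prev, axis=1)
    Iy = np.gradient(prev, axis=0)
    It = curr - prev
    # Normal equations: A @ [u, v] = b, summed over all pixels.
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    u, v = np.linalg.solve(A, b)
    return u, v
```

Applied to a frame pair where the face region shifts one pixel to the right, the estimate comes out close to (1, 0); real systems compute such flow densely, per pixel, between the onset and apex frames.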
Affiliation(s)
- Hongmei Jin
- College of Computer Science and Technology, Xi'an University of Science and Technology, Xi'an 710054, China
- Ning He
- College of Computer Science and Technology, Xi'an University of Science and Technology, Xi'an 710054, China
- Zhanli Li
- College of Computer Science and Technology, Xi'an University of Science and Technology, Xi'an 710054, China
- Pengcheng Yang
- College of Computer Science and Technology, Xi'an University of Science and Technology, Xi'an 710054, China
2
Sharma D, Singh J, Shah B, Ali F, AlZubi AA, AlZubi MA. Public mental health through social media in the post COVID-19 era. Front Public Health 2023; 11:1323922. [PMID: 38146469] [PMCID: PMC10749364] [DOI: 10.3389/fpubh.2023.1323922]
Abstract
Social media is a powerful communication tool and a reflection of our digital environment, and it acted as an augmenter and influencer during and after COVID-19. Many of the people sharing social media posts were not actually aware of their mental health status, which warrants automating the detection of mental disorders. This paper presents a methodology for detecting mental disorders using facial micro-expressions: momentary, involuntary facial expressions that can be indicative of deeper feelings and mental states. Manually detecting and interpreting micro-expressions, however, is rather challenging. A deep learning model, HybridMicroNet, based on convolutional neural networks, is proposed for emotion recognition from micro-expressions, and a case study on the detection of mental health conditions is undertaken. The findings demonstrate that the proposed model achieved high accuracy when diagnosing mental health disorders from micro-expressions: 99.08% on the CASME dataset and 97.62% on the SAMM dataset. Based on these findings, deep learning may prove to be an effective method for diagnosing mental health conditions by analyzing micro-expressions.
Affiliation(s)
- Deepika Sharma
- Chitkara University Institute of Engineering and Technology, Chitkara University, Punjab, India
- Jaiteg Singh
- Chitkara University Institute of Engineering and Technology, Chitkara University, Punjab, India
- Babar Shah
- College of Technological Innovation, Zayed University, Dubai, United Arab Emirates
- Farman Ali
- Department of Computer Science and Engineering, School of Convergence, College of Computing and Informatics, Sungkyunkwan University, Seoul, Republic of Korea
- Ahmad Ali AlZubi
- Department of Computer Science, Community College, King Saud University, Riyadh, Saudi Arabia
- Mallak Ahmad AlZubi
- Faculty of Medicine, Jordan University of Science and Technology, Irbid, Jordan
3
Fu C, Yang W, Chen D, Wei F. AM3F-FlowNet: Attention-Based Multi-Scale Multi-Branch Flow Network. Entropy (Basel) 2023; 25:1064. [PMID: 37510012] [PMCID: PMC10378207] [DOI: 10.3390/e25071064]
Abstract
Micro-expressions are the small, brief facial expression changes that humans momentarily show during emotional experiences; their data annotation is complicated, which leads to a scarcity of micro-expression data. To extract salient and distinguishing features from a limited dataset, we propose an attention-based multi-scale, multi-modal, multi-branch flow network that thoroughly learns the motion information of micro-expressions by exploiting the attention mechanism and the complementary properties of different optical flow representations. First, we extract optical flow information (horizontal optical flow, vertical optical flow, and optical strain) from the onset and apex frames of micro-expression videos, and each branch learns one kind of optical flow information separately. Second, we propose a multi-scale fusion module that extracts richer and more stable feature representations, using spatial attention to focus on locally important information at each scale. Then, we design a multi-optical-flow feature reweighting module that adaptively selects features for each optical flow via channel attention. Finally, to better integrate the information of the three branches and to alleviate the uneven distribution of micro-expression samples, we introduce a logarithmically adjusted prior knowledge weighting loss. This loss function weights the prediction scores of samples from different categories to mitigate the negative impact of category imbalance during classification. The effectiveness of the proposed model is demonstrated through extensive experiments and feature visualization on three benchmark datasets (CASME II, SAMM, and SMIC), and its performance is comparable to that of state-of-the-art methods.
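The logarithmically adjusted prior weighting described above can be sketched generically. The snippet below, assuming NumPy, implements a logit-adjustment-style cross-entropy in which each class score is shifted by tau * log(prior) before the softmax; the paper's exact weighting scheme may differ, and the function name and `tau` parameter are assumptions for illustration.

```python
import numpy as np

def logit_adjusted_ce(logits, labels, class_priors, tau=1.0):
    """Cross-entropy with a logarithmic prior adjustment: each class
    score is shifted by tau * log(prior) before the softmax, so that
    frequent classes must overcome a handicap and rare classes are
    not drowned out during training."""
    adjusted = logits + tau * np.log(class_priors)           # (N, C)
    adjusted = adjusted - adjusted.max(axis=1, keepdims=True)  # stability
    log_probs = adjusted - np.log(np.exp(adjusted).sum(axis=1, keepdims=True))
    # Mean negative log-likelihood of the true labels.
    return -log_probs[np.arange(len(labels)), labels].mean()
```

With uniform priors the shift is a constant and the loss reduces to plain cross-entropy; with skewed priors, a sample from a rare class incurs a larger loss, counteracting the imbalance.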
Affiliation(s)
- Chenghao Fu
- School of Information Science and Engineering, Xinjiang University, Urumqi 830017, China
- Wenzhong Yang
- School of Information Science and Engineering, Xinjiang University, Urumqi 830017, China
- Xinjiang Key Laboratory of Multilingual Information Technology, Xinjiang University, Urumqi 830017, China
- Danny Chen
- School of Information Science and Engineering, Xinjiang University, Urumqi 830017, China
- Fuyuan Wei
- School of Information Science and Engineering, Xinjiang University, Urumqi 830017, China
4
Tomberg C, Petagna M, de Selliers de Moranville LA. Horses (Equus caballus) facial micro-expressions: insight into discreet social information. Sci Rep 2023; 13:8625. [PMID: 37244937] [DOI: 10.1038/s41598-023-35807-z]
Abstract
Facial micro-expressions are facial expressions expressed briefly (less than 500 ms) and involuntarily. They had previously been described only in humans, so we investigated whether micro-expressions could also be expressed by non-human animal species. Using the Equine Facial Action Coding System (EquiFACS), an objective tool based on facial muscle actions, we demonstrated that a non-human species, Equus caballus, expresses facial micro-expressions in a social context. The AU17, AD38 and AD1 were selectively modulated as micro-expressions, but not as standard facial expressions (all durations included), in the presence of a human experimenter. As standard facial expressions, these units have been associated with pain or stress, but our results did not support this association for micro-expressions, which may convey other information. As in humans, the neural mechanisms underlying the production of micro-expressions may differ from those of standard facial expressions. We found that some micro-expressions could be related to attention and involved in the multisensory processing of the 'fixed attention' observed in horses' high attentional states. Micro-expressions could be used by horses as social information in an interspecies relationship. We hypothesize that facial micro-expressions could be a window onto transient internal states of the animal and may provide subtle and discreet social signals.
Affiliation(s)
- Claude Tomberg
- Faculty of Medicine, Université Libre de Bruxelles, 808, Route de Lennik, CP 630, 1070 Brussels, Belgium
- Maxime Petagna
- Faculty of Medicine, Université Libre de Bruxelles, 808, Route de Lennik, CP 630, 1070 Brussels, Belgium
5
Chamberland JA, Collin CA. Effects of forward mask duration variability on the temporal dynamics of brief facial expression categorization. Iperception 2023; 14:20416695231162580. [PMID: 36968319] [PMCID: PMC10031613] [DOI: 10.1177/20416695231162580]
Abstract
The Japanese and Caucasian Brief Affect Recognition Task (JACBART) has been proposed as a standardized method for measuring people's ability to accurately categorize briefly presented images of facial expressions. However, the factors that impact performance in this task are not entirely understood. The current study explored the role of the forward mask's duration (i.e., fixed vs. variable) in brief affect categorization across expressions of the six basic emotions (i.e., anger, disgust, fear, happiness, sadness, and surprise) and three presentation times (i.e., 17, 67, and 500 ms). The current findings provide no evidence that a variable-duration forward mask negatively impacts brief affect categorization. However, efficiency and necessity thresholds were observed to vary across the expressions of emotion. Further exploration of the temporal dynamics of facial affect categorization will therefore require a consideration of these differences.
Affiliation(s)
- Justin A. Chamberland
- School of Psychology/École de psychologie, University of Ottawa/Université d'Ottawa, Ottawa, Ontario, K1N 6N5, Canada
6
Yang C, You X, Xie X, Duan Y, Wang B, Zhou Y, Feng H, Wang W, Fan L, Huang G, Shen X. Development of a Chinese werewolf deception database. Front Psychol 2023; 13:1047427. [PMID: 36698609] [PMCID: PMC9869050] [DOI: 10.3389/fpsyg.2022.1047427]
Abstract
Although it is important to accurately detect deception, limited research in this area has involved Asian people. We aim to address this gap by studying the identification of deception in Asians in realistic environments. In this study, we develop a Chinese Werewolf Deception Database (C2W2D), which consists of 168 video clips (84 deception videos and 84 honest videos); a total of 1,738,760 frames of facial data are recorded. Fifty-eight healthy undergraduates (24 men and 34 women) and 26 drug addicts (all men) participated in a werewolf game. C2W2D is built on a "werewolf" deception game paradigm in which participants spontaneously tell the truth or lie, with two synced high-speed cameras capturing the game process. To explore the differences between lying and truth-telling in the database, descriptive statistics (e.g., duration and quantity) and hypothesis tests (e.g., t-tests) are conducted on the action units (AUs) of facial expressions. C2W2D contributes a relatively sizable number of deceptive and honest samples with high ecological validity, which can be used to study individual differences and the underlying mechanisms of lying and truth-telling in drug addicts and healthy people.
Affiliation(s)
- Chaocao Yang
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- School of Psychology, Shaanxi Normal University, Xi'an, China
- Shaanxi Provincial Key Laboratory of Behavior and Cognitive Neuroscience, Shaanxi Normal University, Xi'an, China
- Xuqun You
- School of Psychology, Shaanxi Normal University, Xi'an, China
- Shaanxi Provincial Key Laboratory of Behavior and Cognitive Neuroscience, Shaanxi Normal University, Xi'an, China
- Xudong Xie
- School of Psychology, Shaanxi Normal University, Xi'an, China
- Shaanxi Provincial Key Laboratory of Behavior and Cognitive Neuroscience, Shaanxi Normal University, Xi'an, China
- Yuanyuan Duan
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Buxue Wang
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Yuxi Zhou
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Hong Feng
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Wenjing Wang
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Ling Fan
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Genying Huang
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
- Xunbing Shen (corresponding author)
- Key Laboratory of Psychology of TCM and Brain Science, Jiangxi Administration of Traditional Chinese Medicine, Jiangxi University of Chinese Medicine, Nanchang, China
7
A Survey of Micro-expression Recognition Methods Based on LBP, Optical Flow and Deep Learning. Neural Process Lett 2023. [DOI: 10.1007/s11063-022-11123-x]
8
Wu Q, Peng K, Xie Y, Lai Y, Liu X, Zhao Z. An ingroup disadvantage in recognizing micro-expressions. Front Psychol 2022; 13:1050068. [PMID: 36507018] [PMCID: PMC9732534] [DOI: 10.3389/fpsyg.2022.1050068]
Abstract
Micro-expression is a fleeting facial expression of emotion that usually occurs in high-stakes situations and reveals the true emotion that a person tries to conceal. Due to its unique nature, recognizing micro-expressions has great applications in fields like law enforcement, medical treatment, and national security. However, the psychological mechanism of micro-expression recognition is still poorly understood. In the present research, we sought to expand upon previous work by investigating whether the group membership of the expresser influences the recognition of micro-expressions. In two behavioral studies, we found that, contrary to the widespread ingroup advantage found in macro-expression recognition, there was a robust ingroup disadvantage in micro-expression recognition. Specifically, in Studies 1A and 1B, participants were more accurate at recognizing the intense and subtle micro-expressions of their racial outgroups than those of their racial ingroups, and neither training experience nor the duration of the micro-expressions moderated this ingroup disadvantage. In Studies 2A and 2B, we further found that mere social categorization alone was sufficient to elicit the ingroup disadvantage for the recognition of intense and subtle micro-expressions, and this effect was also unaffected by micro-expression duration. These results suggest that individuals spontaneously employ the social category information of others to recognize micro-expressions, and that the ingroup disadvantage in micro-expression recognition stems partly from motivated differential processing of ingroup micro-expressions.
Affiliation(s)
- Qi Wu (corresponding author)
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Kunling Peng
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yanni Xie
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yeying Lai
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Xuanchen Liu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Ziwei Zhao
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
9
Lin Q, Dong Z, Zheng Q, Wang SJ. The effect of facial attractiveness on micro-expression recognition. Front Psychol 2022; 13:959124. [PMID: 36186390] [PMCID: PMC9524498] [DOI: 10.3389/fpsyg.2022.959124]
Abstract
Micro-expression (ME) is an extremely quick and uncontrollable facial movement that lasts for 40-200 ms and reveals thoughts and feelings that an individual attempts to cover up. Though much more difficult to detect, ME recognition is similar to macro-expression recognition in that it is influenced by facial features. Previous studies suggested that facial attractiveness can influence facial expression recognition. However, it remains unclear whether facial attractiveness also influences ME recognition. Addressing this issue, this study tested 38 participants on two ME recognition tasks, one static and one dynamic, using three different MEs (positive, neutral, and negative) at two attractiveness levels (attractive, unattractive). The results showed that participants recognized MEs on attractive faces much more quickly than on unattractive ones, and there was a significant interaction between ME and facial attractiveness. Furthermore, attractive happy faces were recognized faster in both the static and the dynamic conditions, highlighting the happiness superiority effect. Our results thus provide the first evidence that facial attractiveness can influence ME recognition, whether the MEs are presented statically or dynamically.
Affiliation(s)
- Qiongsi Lin
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Zizhao Dong
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Qiuqiang Zheng
- Teacher Education Curriculum Center, School of Educational Science, Huizhou University, Huizhou, China
- Su-Jing Wang (corresponding author)
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
10
Wu Q, Xie Y, Liu X, Liu Y. Oxytocin Impairs the Recognition of Micro-Expressions of Surprise and Disgust. Front Psychol 2022; 13:947418. [PMID: 35846599] [PMCID: PMC9277341] [DOI: 10.3389/fpsyg.2022.947418]
Abstract
As fleeting facial expressions that reveal the emotion a person tries to conceal, micro-expressions have great application potential in fields like security, national defense, and medical treatment. However, the physiological basis for the recognition of these facial expressions is poorly understood. In the present research, we used a double-blind, placebo-controlled, mixed-model experimental design to investigate the effects of oxytocin on the recognition of micro-expressions in three behavioral studies. Specifically, in Studies 1 and 2, participants performed a laboratory-based standardized micro-expression recognition task after self-administration of a single dose of intranasal oxytocin (40 IU) or placebo (containing all ingredients except the neuropeptide). In Study 3, we further examined the effects of oxytocin on the recognition of natural micro-expressions. The results showed that intranasal oxytocin decreased the recognition speed for standardized intense micro-expressions of surprise (Study 1) and decreased the recognition accuracy for standardized subtle micro-expressions of disgust (Study 2). Study 3 further revealed that intranasal oxytocin administration significantly reduced the recognition accuracy for natural micro-expressions of surprise and disgust. The present research is the first to investigate the effects of oxytocin on micro-expression recognition. It suggests that oxytocin mainly plays an inhibitory role in the recognition of micro-expressions and that there are fundamental differences in the neurophysiological bases of micro-expression and macro-expression recognition.
Affiliation(s)
- Qi Wu (corresponding author)
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yanni Xie
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Xuanchen Liu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China
- Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yulong Liu
- School of Finance and Management, Changsha Social Work College, Changsha, China
11
Micro-Expression Recognition Based on Optical Flow and PCANet+. Sensors (Basel) 2022; 22:4296. [PMID: 35684917] [PMCID: PMC9185295] [DOI: 10.3390/s22114296]
Abstract
Micro-expressions are rapid and subtle facial movements. Unlike the ordinary facial expressions of daily life, micro-expressions are very difficult to detect and recognize. In recent years, owing to a wide range of potential applications, micro-expression recognition has attracted extensive attention from the computer vision community. Because available micro-expression datasets are very small, deep neural network models with huge numbers of parameters are prone to over-fitting. In this article, we propose an OF-PCANet+ method for micro-expression recognition, in which we design a spatiotemporal feature learning strategy based on the shallow PCANet+ model and incorporate optical flow sequence stacking with the PCANet+ network to learn discriminative spatiotemporal features. Comprehensive experiments on the publicly available SMIC and CASME2 datasets show that our lightweight model clearly outperforms popular hand-crafted methods and achieves performance comparable to deep-learning-based methods such as 3D-FCNN and ELRCN.
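The core idea behind PCANet-style feature learning is simple enough to sketch: convolution filters are not trained by backpropagation but taken as the top principal components of mean-removed image patches. The snippet below, assuming NumPy, shows that first PCANet stage only; PCANet+ adds refinements not reproduced here, and the function name and patch size are illustrative.

```python
import numpy as np

def learn_pca_filters(images, patch=5, n_filters=4):
    """Learn PCANet-style convolution filters: collect mean-removed
    patches from the input maps and take the leading eigenvectors of
    their covariance as filter kernels."""
    patches = []
    for img in images:
        H, W = img.shape
        for i in range(H - patch + 1):
            for j in range(W - patch + 1):
                p = img[i:i + patch, j:j + patch].ravel()
                patches.append(p - p.mean())      # remove patch mean
    X = np.stack(patches)                          # (num_patches, patch*patch)
    cov = X.T @ X / len(X)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Top eigenvectors (largest eigenvalues) become the filter bank.
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_filters]]
    return top.T.reshape(n_filters, patch, patch)
```

In an OF-PCANet+-like pipeline, the input "images" would be stacked optical flow maps rather than raw frames, and the learned kernels would then be convolved with those maps to produce feature responses.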
12
Steinmair D, Löffler-Stastka H. Personalized treatment - which interaction ingredients should be focused to capture the unconscious. World J Clin Cases 2022; 10:2053-2062. [PMID: 35321177] [PMCID: PMC8895185] [DOI: 10.12998/wjcc.v10.i7.2053]
Abstract
A recent meta-analysis revealed that mental health and baseline psychological impairment affect quality of life and outcomes in different chronic conditions. The implementation of mental health care in physical care services is still insufficient, so interdisciplinary communication across treatment providers is essential. The standardized language provided by the diagnostic statistical manual favors a clear conceptualization, but this approach might not focus on the individual, as thinking in categories might impede recognizing the continuum from healthy to diseased. Psychoanalytic theory is concerned with an individual's unconscious conflictual wishes and motivations, which manifest through enactments such as psychic symptoms or (maladaptive) behavior, with long-term consequences if not considered. Such modifiable internal and external factors are often inadequately treated, yet together with the constraints of the chronic physical condition, they determine the degrees of freedom for a self-determined existence. The effect of therapeutic interventions, and especially therapy adherence, relies on a solid therapeutic relationship. Outcome and process research continues to investigate the mechanism of change in psychotherapeutic treatments, with psychoanalysis's focus on attachment problems. This article examines existing knowledge about the mechanism of change in psychoanalysis in light of current trends emerging from psychotherapy research; a clinical example is discussed, and further directions for research are given. The theoretical frame in psychoanalytic therapies is the affect-cognitive interface. Subliminal affect perception is enabled via awareness of subjective meanings in oneself and the other; shaping this awareness is the main intervention point. The interactional ingredients, i.e., the patient's inherent bioenvironmental history meeting the clinician, are the relevant variables. Several intrinsic, subliminal parameters relevant to changing behavior are observed. Therapeutic interventions aim at supporting the internalization of the superego's functions and at making this ability available in moments of self-reflection. By supporting mentalization abilities, a better understanding of oneself and higher self-regulation (including emotional regulation) can lead to better judgments (application of formal logic and abstract thinking), facilitating enduring behavior change with presumably positive effects on mental and physical health.
Affiliation(s)
- Dagmar Steinmair
- Department of Psychoanalysis and Psychotherapy, Medical University of Vienna, Vienna 1090, Austria
- Henriette Löffler-Stastka
- Department of Psychoanalysis and Psychotherapy, Medical University of Vienna, Vienna 1090, Austria
13
Shen X, Fan G, Niu C, Chen Z. Catching a Liar Through Facial Expression of Fear. Front Psychol 2021; 12:675097. [PMID: 34168597] [PMCID: PMC8217652] [DOI: 10.3389/fpsyg.2021.675097]
Abstract
High stakes can be stressful whether one is telling the truth or lying. However, liars can feel extra fear from worrying about being discovered, and according to the "leakage theory," this fear is almost impossible to repress. We therefore hypothesized that analyzing the facial expression of fear could reveal deceit. Detecting and analyzing subtly leaked fear expressions is a challenging task for laypeople but a relatively easy job for computer vision and machine learning. To test the hypothesis, we analyzed video clips from the game show "The Moment of Truth" using OpenFace (to output the action units (AUs) of fear and face landmarks) and WEKA (to classify the video clips in which the players were lying or telling the truth). The results showed that some algorithms achieved an accuracy of >80% using the AUs of fear alone. In addition, the total duration of AU20 of fear was found to be shorter under the lying condition than under the truth-telling condition; further analysis showed that this was because the time window from the peak to the offset of AU20 was shorter when lying. The results also showed that facial movements around the eyes were more asymmetrical when people were telling lies. All these results suggest that facial cues can be used to detect deception, and that fear could be a cue for distinguishing liars from truth-tellers.
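The timing measures in this abstract (total AU activation time and the peak-to-offset window) are straightforward to compute from a per-frame AU intensity series such as OpenFace exports. The sketch below assumes NumPy; the threshold, frame rate, and function name are illustrative choices, not the paper's exact analysis settings.

```python
import numpy as np

def au_timing(intensity, fps=30.0, thresh=0.5):
    """Summarize one AU intensity time series (e.g. an AU20 column):
    returns (total seconds above threshold, seconds from the peak
    frame to the last above-threshold frame at or after the peak)."""
    intensity = np.asarray(intensity, dtype=float)
    active = intensity >= thresh
    total = active.sum() / fps
    if not active.any():
        return total, 0.0
    peak = int(np.argmax(intensity))
    # Offset = last active frame at or after the peak.
    after = np.nonzero(active[peak:])[0]
    offset = peak + after[-1]
    return total, (offset - peak) / fps
```

Comparing these two numbers between lying and truth-telling clips is exactly the kind of contrast the study draws (shorter peak-to-offset under lying).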
Affiliation(s)
- Xunbing Shen
- Department of Psychology, Jiangxi University of Chinese Medicine, Nanchang, China
- Gaojie Fan
- Beck Visual Cognition Laboratory, Louisiana State University, Baton Rouge, LA, United States
- Caoyuan Niu
- Department of Psychology, Jiangxi University of Chinese Medicine, Nanchang, China
- Zhencai Chen
- Department of Psychology, Jiangxi University of Chinese Medicine, Nanchang, China
|
14
|
Avani VS, Shaila SG, Vadivel A. Geometrical features of lips using the properties of parabola for recognizing facial expression. Cogn Neurodyn 2020; 15:481-499. [PMID: 34040673] [DOI: 10.1007/s11571-020-09638-x]
Abstract
Various real-time applications, such as human-computer interaction and psychometric analysis, use facial expressions as one of their important parameters. Researchers have used Action Units (AUs) of the face as feature points, whose deformation is compared with reference points on the face to estimate facial expressions. Among the many parts of the face, features from the mouth contribute the most to all the well-known emotions. In this paper, parabola theory is used to identify and mark various points on the lips. These points are taken as feature points for constructing feature vectors. The latus rectum, focal point, directrix, vertex, etc. are also used to identify the feature points of the lower and upper lips. The proposed approach is evaluated on benchmark datasets such as JAFFE and Cohn-Kanade, and its performance in understanding facial expressions is found to be encouraging. The results are compared with contemporary methods, and the proposed approach gives good classification accuracy in recognizing facial expressions.
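The parabola properties named in this abstract (vertex, focus, directrix, latus rectum) can all be read off a quadratic fitted to lip landmarks. A minimal sketch, with hypothetical landmark coordinates (a real system would obtain them from a facial landmark detector):

```python
import numpy as np

# Hypothetical 2D lip landmark points (x, y) along one lip contour
points = np.array([(-2.0, 4.1), (-1.0, 1.2), (0.0, 0.0), (1.0, 0.9), (2.0, 3.8)])

# Least-squares fit of y = a*x^2 + b*x + c
A = np.column_stack([points[:, 0] ** 2, points[:, 0], np.ones(len(points))])
a, b, c = np.linalg.lstsq(A, points[:, 1], rcond=None)[0]

# Standard parabola properties, usable as geometric features
h = -b / (2 * a)              # vertex x
k = c - b ** 2 / (4 * a)      # vertex y
focal = 1 / (4 * a)           # distance from vertex to focus
features = {
    "vertex": (h, k),
    "focus": (h, k + focal),
    "directrix_y": k - focal,
    "latus_rectum": abs(1 / a),  # chord length through the focus
}
print(features)
```

Feature vectors for classification could then concatenate these quantities for the upper and lower lip curves.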
Affiliation(s)
- V Suma Avani
- Department of CSE, Dayananda Sagar University, Bangalore, India
- S G Shaila
- Department of CSE, Dayananda Sagar University, Bangalore, India
- A Vadivel
- Department of CSE, SRM University AP, Amaravati, Andhra Pradesh, India
|
15
|
Zhang M, Zhao K, Qu F, Li K, Fu X. Brain Activation in Contrasts of Microexpression Following Emotional Contexts. Front Neurosci 2020; 14:329. [PMID: 32410934] [PMCID: PMC7202324] [DOI: 10.3389/fnins.2020.00329]
Abstract
The recognition of microexpressions may be influenced by emotional contexts. A microexpression is recognized more poorly when it follows a negative context than when it follows a neutral context. Based on this behavioral evidence, we predicted that the effect of emotional contexts might depend on neural activities. Using the synthesized microexpressions task modified from the Micro-Expression Training Tool (METT), we performed a functional MRI (fMRI) study to compare brain responses in contrasts of the same targets following different contexts. Behaviorally, we observed that the accuracies of target microexpressions following neutral contexts were significantly higher than those following negative or positive contexts. At the neural level, we found increased brain activations in contrasts of the same targets following different contexts, which reflected the discrepancy in the processing of emotional contexts. The increased activations implied that different emotional contexts might differently influence the processing of subsequent target microexpressions, and further suggested interactions between the processing of emotional contexts and that of microexpressions.
Affiliation(s)
- Ming Zhang
- Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Ke Zhao
- Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Fangbing Qu
- Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- College of Preschool Education, Capital Normal University, Beijing, China
- Kaiyun Li
- Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- School of Education and Psychology, University of Jinan, Jinan, China
- Xiaolan Fu
- Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
|
16
|
Zhu C, Yin M, Chen X, Zhang J, Liu D. Ecological micro-expression recognition characteristics of young adults with subthreshold depression. PLoS One 2019; 14:e0216334. [PMID: 31042784] [PMCID: PMC6493753] [DOI: 10.1371/journal.pone.0216334]
Abstract
The micro-expression (ME) processing characteristics of patients with depression have been studied but have not been investigated in people with subthreshold depression. Accordingly, by adopting the ecological ME recognition paradigm, this study aimed to explore ME recognition in people with subthreshold depression. A 4 (background expression: happy, neutral, sad, and fearful) × 4 (ME: happy, neutral, sad, and fearful) study was designed; two groups of participants (an experimental group with subthreshold depression vs. a healthy control group, 32 participants in each) were asked to complete the ecological ME recognition task, and the corresponding accuracy (ACC) and reaction time (RT) were analyzed. Results: (1) Under different background conditions, recognizing happy MEs had the highest ACC and shortest RT. (2) There was no significant difference in ACC and RT between the experimental and control groups. (3) In different contexts, individuals with subthreshold depression tended to misjudge neutral, sad, and fearful MEs as happy, while neutral MEs were misjudged as sad and fearful. (4) The performance of individuals with subthreshold depression in the ecological ME recognition task was influenced by the type of ME; they showed the highest ACC and shortest RT when recognizing happy MEs (vs. the other MEs). Conclusions: (1) Individuals' ecological ME recognition performance was influenced by the background expression, which underscores the need for ecological ME recognition paradigms. (2) Individuals with subthreshold depression showed normal ecological ME recognition ability. (3) In terms of misjudgment, individuals with subthreshold depression showed both positive and negative biases when completing the ecological ME recognition task. (4) Compared with the other MEs, happy MEs showed a recognition advantage for individuals with subthreshold depression.
Affiliation(s)
- Chuanlin Zhu
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Ming Yin
- Department of Criminal Investigation, Jiangsu Police Institute, Nanjing, Jiangsu, China
- Xinyun Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Jianxin Zhang
- School of Humanities, Jiangnan University, Wuxi, Jiangsu, China
- Dianzhi Liu
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
|
17
|
|
18
|
Zeng X, Wu Q, Zhang S, Liu Z, Zhou Q, Zhang M. A False Trail to Follow: Differential Effects of the Facial Feedback Signals From the Upper and Lower Face on the Recognition of Micro-Expressions. Front Psychol 2018; 9:2015. [PMID: 30405497] [PMCID: PMC6208096] [DOI: 10.3389/fpsyg.2018.02015]
Abstract
Micro-expressions, as fleeting facial expressions, are very important for judging people's true emotions and thus can provide an essential behavioral clue for lie and dangerous-demeanor detection. From embodied accounts of cognition, we derived a novel hypothesis that facial feedback from the upper and lower facial regions has differential effects on micro-expression recognition. This hypothesis was tested and supported across three studies. Specifically, the results of Study 1 showed that people became better judges of intense micro-expressions with a duration of 450 ms when the facial feedback from the upper face was enhanced via a restricting gel. The results of Study 2 further showed that the recognition accuracy of subtle micro-expressions was significantly impaired under all duration conditions (50, 150, 333, and 450 ms) when facial feedback from the lower face was enhanced. In addition, the results of Study 3 revealed that blocking the facial feedback of the lower face significantly boosted the recognition accuracy of subtle and intense micro-expressions under all duration conditions (150 and 450 ms). Together, these results highlight the role of facial feedback in judging the subtle movements of micro-expressions.
Affiliation(s)
- Xuemei Zeng
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Qi Wu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Siwei Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Zheying Liu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Qing Zhou
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Meishan Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
|
19
|
Abstract
Micro-expressions are brief spontaneous facial expressions that appear on a face when a person conceals an emotion, making them different from normal facial expressions in subtlety and duration. Currently, emotion classes within the CASME II dataset (Chinese Academy of Sciences Micro-expression II) are based on Action Units and self-reports, creating conflicts during machine learning training. We show that classifying expressions using Action Units, instead of predicted emotion, removes the potential bias of human reporting. The proposed classes are tested using LBP-TOP (Local Binary Patterns from Three Orthogonal Planes), HOOF (Histograms of Oriented Optical Flow), and HOG 3D (3D Histogram of Oriented Gradient) feature descriptors. The experiments are evaluated on two benchmark FACS (Facial Action Coding System) coded datasets: CASME II and SAMM (A Spontaneous Micro-Facial Movement dataset). The best result achieves 86.35% accuracy when classifying the proposed 5 classes on CASME II using HOG 3D, outperforming the state-of-the-art 5-class emotion-based classification on CASME II. The results indicate that classification based on Action Units provides an objective method to improve micro-expression recognition.
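Of the descriptors named in this abstract, LBP-TOP is the simplest to sketch: LBP codes are computed on the XY, XT, and YT planes of a video cube and their histograms concatenated. The sketch below is a deliberately simplified variant that takes one central plane per axis (the full descriptor aggregates codes over all pixels and spatial blocks), so it illustrates the structure rather than reproducing the published feature.

```python
import numpy as np

def lbp_plane(img):
    """8-neighbour LBP codes for one 2D plane (interior pixels only)."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= ((nb >= c).astype(np.uint8) << bit)  # set bit where neighbour >= centre
    return code

def lbp_top(video):
    """Concatenated LBP histograms from central XY, XT and YT planes of a clip."""
    t, h, w = video.shape
    planes = [video[t // 2],        # XY plane (middle frame)
              video[:, h // 2, :],  # XT plane (one row over time)
              video[:, :, w // 2]]  # YT plane (one column over time)
    hists = [np.bincount(lbp_plane(p).ravel(), minlength=256) for p in planes]
    return np.concatenate(hists)

rng = np.random.default_rng(1)
clip = rng.integers(0, 256, size=(16, 32, 32)).astype(np.uint8)
desc = lbp_top(clip)
print(desc.shape)  # (768,) = 3 planes x 256-bin histograms
```

The resulting fixed-length vector is what would be fed to a classifier over the AU-based classes the paper proposes.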
|
20
|
Felisberti FM. Long-lasting effects of family-related factors on adults' ability to recognise brief facial expressions of emotion. Q J Exp Psychol (Hove) 2018; 71:1512-1525. [PMID: 29926784] [DOI: 10.1177/1747021817742080]
Abstract
This study investigated whether adults' ability to attribute emotions to brief facial expressions (microexpressions) is associated with family-related environmental factors (FrFs) such as one's number of siblings (Experiment 1), attachment style (Experiment 2), or perceived parental authority style (Experiment 3). Participants' accuracy and reaction time (RT) in recognising anger, contempt, disgust, fear, happiness, and sadness in facial microexpressions (exposure: 100 ms) were measured with a six-alternative forced-choice computerised method (6AFC). Participants' attachment style and the authority style of their parents were assessed using questionnaires. The findings revealed that up to 13% of the variance in participants' responses could be explained by FrFs, with modest to moderate effect sizes. Microexpressions linked to signs of hostility or threat (i.e., contempt and fear) were decoded faster and/or more accurately by adults with few or no siblings or with a fearful attachment. Conversely, participants who recalled their fathers as authoritarian were worse at recognising contempt and fear than participants who perceived them as permissive or authoritative. The findings suggest that early FrFs may still be involved in the fine-tuning of responses to signs of contextual danger when the time for cognitive processing of facial expressions is severely restricted.
|
21
|
Zhang M, Fu Q, Chen YH, Fu X. Emotional context modulates micro-expression processing as reflected in event-related potentials. Psych J 2018; 7:13-24. [DOI: 10.1002/pchj.196]
Affiliation(s)
- Ming Zhang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, Dalian Medical University, Dalian, China
- Qiufang Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yu-Hsin Chen
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Institute of Psychology and Behavior Sciences, Wenzhou University, Wenzhou, China
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
|
22
|
High-Speed Video System for Micro-Expression Detection and Recognition. Sensors 2017; 17:2913. [PMID: 29240700] [PMCID: PMC5751645] [DOI: 10.3390/s17122913]
Abstract
Micro-expressions play an essential part in understanding non-verbal communication and deceit detection. They are involuntary, brief facial movements that are shown when a person is trying to conceal something. Automatic analysis of micro-expressions is challenging due to their low amplitude and short duration (they occur as fast as 1/15 to 1/25 of a second). We propose a full micro-expression analysis system consisting of a high-speed image acquisition setup and a software framework that can detect the frames in which micro-expressions occur as well as determine the type of the emerging expression. The detection and classification methods use fast and simple motion descriptors based on absolute image differences; the recognition module only involves the computation of several 2D Gaussian probabilities. The software framework was tested on two publicly available high-speed micro-expression databases, and the whole system was used to acquire new data. The experiments we performed show that our solution outperforms state-of-the-art works that use more complex and computationally intensive descriptors.
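The two building blocks this abstract names, absolute-image-difference motion descriptors and classification via 2D Gaussian probabilities, can be sketched as follows. The descriptor layout, the class names, and the Gaussian models are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def motion_descriptor(frames):
    """Mean absolute inter-frame difference: one motion value per transition."""
    diffs = np.abs(np.diff(frames.astype(np.int16), axis=0))
    return diffs.mean(axis=(1, 2))

def gaussian2d_logpdf(x, mean, cov):
    """Log density of a 2D Gaussian, used to score a descriptor per class."""
    d = x - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ inv @ d + logdet + 2 * np.log(2 * np.pi))

# Motion profile of a synthetic clip with shape (T, H, W)
rng = np.random.default_rng(2)
clip = rng.integers(0, 256, size=(10, 24, 24))
profile = motion_descriptor(clip)
print(profile.shape)  # one value per frame transition: (9,)

# Hypothetical per-class Gaussian models over a 2D descriptor
models = {
    "micro-expression": (np.array([3.0, 3.5]), np.eye(2)),
    "neutral": (np.array([0.5, 0.6]), np.eye(2)),
}
x = np.array([2.8, 3.2])  # 2D descriptor of a clip to classify
label = max(models, key=lambda k: gaussian2d_logpdf(x, *models[k]))
print(label)  # prints "micro-expression"
```

The appeal of this design, as the abstract notes, is its low cost: frame differencing and a handful of Gaussian evaluations are cheap enough for high-speed video.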
|
23
|
Høyland AL, Nærland T, Engstrøm M, Lydersen S, Andreassen OA. The relation between face-emotion recognition and social function in adolescents with autism spectrum disorders: A case control study. PLoS One 2017; 12:e0186124. [PMID: 29020059] [PMCID: PMC5636137] [DOI: 10.1371/journal.pone.0186124]
Abstract
Altered processing of emotions may contribute to a reduced ability for social interaction and communication in autism spectrum disorder (ASD). We investigated how face-emotion recognition in ASD differs from that of typically developing adolescents across age groups. Fifty adolescents diagnosed with ASD and 49 typically developing adolescents (age 12–21 years) were included. The ASD diagnosis was underpinned by the parent-rated Social Communication Questionnaire. We used a cued GO/NOGO task with pictures of facial expressions and recorded reaction time, intra-individual variability of reaction time, and omissions/commissions. The Social Responsiveness Scale was used as a measure of social function. Analyses were conducted for the whole group and for young (<16 years) and old (≥16 years) age groups. We found no significant differences in any task measures between the whole typically developing and ASD groups, and no significant correlations with the Social Responsiveness Scale. However, there was a non-significant tendency for longer reaction time in the young group with ASD (p = 0.099). The Social Responsiveness Scale correlated positively with reaction time (r = 0.30, p = 0.032) and intra-individual variability in reaction time (r = 0.29, p = 0.037) in the young group and, in contrast, negatively in the old group (r = -0.23, p = 0.13; r = -0.38, p = 0.011, respectively), giving significant age-group interactions for both reaction time (p = 0.008) and intra-individual variability in reaction time (p = 0.001). Our findings suggest an age-dependent association between emotion recognition and severity of social problems, indicating a delayed development of emotional understanding in ASD. They also point towards alterations in top-down attention control in the ASD group. This suggests novel disease-related features that should be investigated in more detail in experimental settings.
Affiliation(s)
- Anne Lise Høyland
- Regional Centre for Child and Youth Mental Health and Child Welfare, Norwegian University of Science and Technology, Trondheim, Norway
- Department of Pediatrics, St. Olavs Hospital, Trondheim University Hospital, Norway
- Terje Nærland
- NevSom, Department of Rare Disorders and Disabilities, Oslo University Hospital, Norway
- NORMENT, KG Jebsen Centre for Psychosis Research, University of Oslo, Oslo, Norway
- Morten Engstrøm
- Department of Neurology and Clinical Neurophysiology, St. Olavs Hospital, Trondheim University Hospital, Norway
- Department of Neuroscience, Norwegian University of Science and Technology, Trondheim, Norway
- Stian Lydersen
- Regional Centre for Child and Youth Mental Health and Child Welfare, Norwegian University of Science and Technology, Trondheim, Norway
- Ole Andreas Andreassen
- NORMENT, KG Jebsen Centre for Psychosis Research, University of Oslo, Oslo, Norway
- Division of Mental Health and Addiction, Oslo University Hospital, Oslo, Norway
|
24
|
Qu F, Yan WJ, Chen YH, Li K, Zhang H, Fu X. "You Should Have Seen the Look on Your Face…": Self-awareness of Facial Expressions. Front Psychol 2017; 8:832. [PMID: 28611703] [PMCID: PMC5447732] [DOI: 10.3389/fpsyg.2017.00832]
Abstract
The awareness of facial expressions allows one to better understand, predict, and regulate one's states to adapt to different social situations. The present research investigated individuals' awareness of their own facial expressions and the influence of the duration and intensity of expressions in two self-reference modalities: a real-time condition and a video-review condition. The participants were instructed to respond as soon as they became aware of any facial movements. The results revealed awareness rates of 57.79% in the real-time condition and 75.92% in the video-review condition. The awareness rate was influenced by the intensity and/or the duration. The intensity thresholds at which individuals become aware of their own facial expressions were calculated using logistic regression models. The results of Generalized Estimating Equations (GEE) revealed that video-review awareness was a significant predictor of real-time awareness. These findings extend understanding of human facial expression self-awareness in the two modalities.
Affiliation(s)
- Fangbing Qu
- College of Preschool Education, Capital Normal University, Beijing, China
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Wen-Jing Yan
- Institute of Psychology and Behavioral Sciences, Wenzhou University, Wenzhou, China
- Yu-Hsin Chen
- Institute of Psychology and Behavioral Sciences, Wenzhou University, Wenzhou, China
- Kaiyun Li
- School of Education and Psychology, University of Jinan, Jinan, China
- Hui Zhang
- Department of Biostatistics, St. Jude Children's Research Hospital, Memphis, TN, United States
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
|
25
|
Yu EH, Choi EJ, Lee SY, Im SJ, Yune SJ, Baek SY. Effects of micro- and subtle-expression reading skill training in medical students: A randomized trial. Patient Educ Couns 2016; 99:1670-1675. [PMID: 27134051] [DOI: 10.1016/j.pec.2016.04.013]
Abstract
OBJECTIVE To investigate the effectiveness of the Micro Expression Training Tool (METT) and the Subtle Expression Training Tool (SETT) in improving the non-verbal communication skills of medical students. METHODS In a randomized controlled trial, all participants were randomly allocated to either a training group (n=41) or a control group (n=41) and were pre-tested with the METT and SETT at baseline. The training students then took second tests after a 1-h class on interpreting micro and subtle expressions, while the control students took the second tests without the class. RESULTS METT pre-test scores were positively related to female gender and agreeableness, whereas SETT pre-test scores were negatively related to age and positively related to female gender. A mean METT score increase of 29.3% and a mean SETT score increase of 36.2% were observed after training, whereas the control group achieved only a mean METT score increase of 11.0% at second testing. The increases in both test scores in the training group were significantly higher than in the control group. CONCLUSION The METT and SETT are effective, simple tools for improving the micro- and subtle-expression reading skills of medical students. PRACTICE IMPLICATIONS The METT and SETT can be effective for improving the non-verbal communication skills of medical students.
Affiliation(s)
- Eun Ho Yu
- Department of Medical Education, Pusan National University School of Medicine, Yangsan, South Korea
- Eun Jung Choi
- Department of Family Medicine, Pusan National University School of Medicine, Yangsan, South Korea
- Sang Yeoup Lee
- Department of Medical Education, Pusan National University School of Medicine, Yangsan, South Korea
- Family Medicine Clinic and Research Institute of Convergence of Biomedical Science and Technology, Pusan National University Yangsan Hospital, Yangsan, South Korea
- Sun Ju Im
- Department of Medical Education, Pusan National University School of Medicine, Yangsan, South Korea
- So Jung Yune
- Department of Medical Education, Pusan National University School of Medicine, Yangsan, South Korea
- Sun Yong Baek
- Department of Medical Education, Pusan National University School of Medicine, Yangsan, South Korea
|
26
|
Shen X, Wu Q, Zhao K, Fu X. Electrophysiological Evidence Reveals Differences between the Recognition of Microexpressions and Macroexpressions. Front Psychol 2016; 7:1346. [PMID: 27630610] [PMCID: PMC5005928] [DOI: 10.3389/fpsyg.2016.01346]
Abstract
Microexpressions are fleeting facial expressions that are important for judging people's true emotions. Little is known about the neural mechanisms underlying the recognition of microexpressions (with durations of less than 200 ms) and macroexpressions (with durations of greater than 200 ms). We used an affective priming paradigm, in which a picture of a facial expression is the prime and an emotional word is the target, together with electroencephalography (EEG) and event-related potentials (ERPs), to examine neural activities associated with recognizing microexpressions and macroexpressions. The results showed significant main effects of duration and valence for the N170/vertex positive potential, and a significant main effect of congruence for the N400. Further, sLORETA showed that the brain regions responsible for these significant differences included the inferior temporal gyrus and widespread regions of the frontal lobe. The results also suggested that the left hemisphere was more involved than the right hemisphere in processing a microexpression. The main effect of duration for the event-related spectral perturbation (ERSP) was significant, and theta oscillations (4 to 8 Hz) increased when recognizing expressions with a duration of 40 ms compared with 300 ms. Thus, there are different EEG/ERP neural mechanisms for recognizing microexpressions compared to macroexpressions.
Affiliation(s)
- Xunbing Shen
- Department of Psychology, Jiangxi University of Traditional Chinese Medicine, Nanchang, China
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Qi Wu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, Hunan Normal University, Changsha, China
- Ke Zhao
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
|
27
|
Zhao MF, Zimmer HD, Shen X, Chen W, Fu X. Exploring the Cognitive Processes Causing the Age-Related Categorization Deficit in the Recognition of Facial Expressions. Exp Aging Res 2016; 42:348-64. [DOI: 10.1080/0361073x.2016.1191854]
|
28
|
Ben X, Zhang P, Yan R, Yang M, Ge G. Gait recognition and micro-expression recognition based on maximum margin projection with tensor representation. Neural Comput Appl 2015. [DOI: 10.1007/s00521-015-2031-8]
|
29
|
Felisberti F, Terry P. The effects of alcohol on the recognition of facial expressions and microexpressions of emotion: enhanced recognition of disgust and contempt. Hum Psychopharmacol 2015; 30:384-92. [PMID: 26073552] [DOI: 10.1002/hup.2488]
Abstract
OBJECTIVE The study compared alcohol's effects on the recognition of briefly displayed facial expressions of emotion (so-called microexpressions) with expressions presented for a longer period. METHOD Using a repeated-measures design, we tested 18 participants three times (counterbalanced): after (i) a placebo drink, (ii) a low-to-moderate dose of alcohol (0.17 g/kg women; 0.20 g/kg men), and (iii) a moderate-to-high dose of alcohol (0.52 g/kg women; 0.60 g/kg men). In each session, participants were presented with stimuli representing six emotions (happiness, sadness, anger, fear, disgust, and contempt) overlaid on a generic avatar in a six-alternative forced-choice paradigm. A neutral expression (1 s) preceded and followed a target expression presented for 200 ms (microexpressions) or 400 ms. Participants mouse-clicked the correct answer. RESULTS The recognition of disgust was significantly better after the high dose of alcohol than after the low dose or placebo drinks at both durations of stimulus presentation. A similar profile of effects was found for the recognition of contempt. There were no effects on response latencies. CONCLUSION Alcohol can increase sensitivity to expressions of disgust and contempt. Such effects are not dependent on stimulus duration up to 400 ms and may reflect contextual modulation of alcohol's effects on emotion recognition.
Affiliation(s)
- Fatima Felisberti
- Department of Psychology, Kingston University, Kingston upon Thames, UK
- Philip Terry
- Department of Psychology, Kingston University, Kingston upon Thames, UK
|
30
|
Lodder GMA, Scholte RHJ, Goossens L, Engels RCME, Verhagen M. Loneliness and the social monitoring system: Emotion recognition and eye gaze in a real-life conversation. Br J Psychol 2015; 107:135-53. [PMID: 25854912] [DOI: 10.1111/bjop.12131]
Abstract
Based on the belongingness regulation theory (Gardner et al., 2005, Pers. Soc. Psychol. Bull., 31, 1549), this study focuses on the relationship between loneliness and social monitoring. Specifically, we examined whether loneliness relates to performance on three emotion recognition tasks and whether lonely individuals show increased gazing towards their conversation partners' faces in a real-life conversation. Study 1 examined 170 college students (Mage = 19.26; SD = 1.21) who completed an emotion recognition task with dynamic stimuli (morph task) and a micro(-emotion) expression recognition task. Study 2 examined 130 college students (Mage = 19.33; SD = 2.00) who completed the Reading the Mind in the Eyes Test and who had a conversation with an unfamiliar peer while their gaze direction was videotaped. In both studies, loneliness was measured using the UCLA Loneliness Scale version 3 (Russell, 1996, J. Pers. Assess., 66, 20). The results showed that loneliness was unrelated to emotion recognition on all emotion recognition tasks, but that it was related to increased gaze towards conversation partners' faces. Implications for the belongingness regulation system of lonely individuals are discussed.
Affiliation(s)
- Gerine M A Lodder
- Behavioural Science Institute, Radboud University Nijmegen, The Netherlands
- Ron H J Scholte
- Behavioural Science Institute, Radboud University Nijmegen, The Netherlands
- Luc Goossens
- Research Group School Psychology and Child and Adolescent Development, KU Leuven - University of Leuven, Belgium
- Maaike Verhagen
- Behavioural Science Institute, Radboud University Nijmegen, The Netherlands
31
Prochnow D, Brunheim S, Kossack H, Eickhoff SB, Markowitsch HJ, Seitz RJ. Anterior and posterior subareas of the dorsolateral frontal cortex in socially relevant decisions based on masked affect expressions. F1000Res 2014; 3:212. [PMID: 26236464 PMCID: PMC4516020 DOI: 10.12688/f1000research.4734.1] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 07/01/2015] [Indexed: 08/22/2023] Open
Abstract
Socially-relevant decisions are based on clearly recognizable but also not consciously accessible affective stimuli. We studied the role of the dorsolateral frontal cortex (DLFC) in decision-making on masked affect expressions using functional magnetic resonance imaging. Our paradigm permitted us to capture brain activity during a pre-decision phase when the subjects viewed emotional expressions below the threshold of subjective awareness, and during the decision phase, which was based on verbal descriptions as the choice criterion. Using meta-analytic connectivity modeling, we found that the preparatory phase of the decision was associated with activity in a right-posterior portion of the DLFC featuring co-activations in the left-inferior frontal cortex. During the subsequent decision a right-anterior and more dorsal portion of the DLFC became activated, exhibiting a different co-activation pattern. These results provide evidence for partially independent sub-regions within the DLFC, supporting the notion of dual associative processes in intuitive judgments.
Affiliation(s)
- Denise Prochnow
- Department of Neurology, Heinrich-Heine University Düsseldorf, Düsseldorf, D-40225, Germany
- Sascha Brunheim
- Department of Neurology, Heinrich-Heine University Düsseldorf, Düsseldorf, D-40225, Germany
- Hannes Kossack
- Department of Neurology, Heinrich-Heine University Düsseldorf, Düsseldorf, D-40225, Germany
- Simon B Eickhoff
- Institute for Clinical Neuroscience and Medical Psychology, University of Düsseldorf, Düsseldorf, D-40225, Germany
- Hans J Markowitsch
- Department of Psychology, Bielefeld University, Bielefeld, D-33615, Germany
- Rüdiger J Seitz
- Department of Neurology, Heinrich-Heine University Düsseldorf, Düsseldorf, D-40225, Germany
33
Zhang M, Fu Q, Chen YH, Fu X. Emotional context influences micro-expression recognition. PLoS One 2014; 9:e95018. [PMID: 24736491 PMCID: PMC3988169 DOI: 10.1371/journal.pone.0095018] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2013] [Accepted: 03/21/2014] [Indexed: 11/19/2022] Open
Abstract
Micro-expressions are often embedded in a flow of expressions including both neutral and other facial expressions. However, it remains unclear whether the types of facial expressions appearing before and after the micro-expression, i.e., the emotional context, influence micro-expression recognition. To address this question, the present study used a modified METT (Micro-Expression Training Tool) paradigm that required participants to recognize the target micro-expressions presented briefly between two identical emotional faces. The results of Experiments 1 and 2 showed that negative context impaired the recognition of micro-expressions regardless of the duration of the target micro-expression. Stimulus-difference between the context and target micro-expression was accounted for in Experiment 3. Results showed that a context effect on micro-expression recognition persists even when the stimulus similarity between the context and target micro-expressions was controlled. Therefore, our results not only provided evidence for the context effect on micro-expression recognition but also suggested that the context effect might result from both the stimulus and valence differences.
Affiliation(s)
- Ming Zhang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- University of Chinese Academy of Sciences, Beijing, China
- Qiufang Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Yu-Hsin Chen
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- University of Chinese Academy of Sciences, Beijing, China
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
34
Kovács-Bálint Z, Stefanics G, Trunk A, Hernádi I. Automatic detection of trustworthiness of the face: a visual mismatch negativity study. ACTA BIOLOGICA HUNGARICA 2014; 65:1-12. [PMID: 24561890 DOI: 10.1556/abiol.65.2014.1.1] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
Recognizing intentions of strangers from facial cues is crucial in everyday social interactions. Recent studies demonstrated enhanced event-related potential (ERP) responses to untrustworthy compared to trustworthy faces. The aim of the present study was to investigate the electrophysiological correlates of automatic processing of trustworthiness cues in a visual oddball paradigm in two consecutive experimental blocks. In one block, frequent trustworthy (p = 0.9) and rare untrustworthy face stimuli (p = 0.1) were briefly presented on a computer screen with each stimulus consisting of four peripherally positioned faces. In the other block stimuli were presented with reversed probabilities enabling the comparison of ERPs evoked by physically identical deviant and standard stimuli. To avoid attentional effects participants engaged in a central detection task. Analyses of deviant minus standard difference waveforms revealed that deviant untrustworthy but not trustworthy faces elicited the visual mismatch negativity (vMMN) component. The present results indicate that adaptation occurred to repeated unattended trustworthy (but not untrustworthy) faces, i.e., an automatic expectation was elicited towards trustworthiness signals, which was violated by deviant untrustworthy faces. As an evolutionary adaptive mechanism, the observed fast detection of trustworthiness-related social facial cues may serve as the basis of conscious recognition of reliable partners.
Affiliation(s)
- Z Kovács-Bálint
- Department of Experimental Zoology and Neurobiology, Institute of Biology, University of Pécs, Pécs, Hungary
- G Stefanics
- Translational Neuromodeling Unit, Institute for Biomedical Engineering, University of Zurich & ETH Zurich, Zürich, Switzerland
- Laboratory for Social and Neural Systems Research, Institute for Empirical Research in Economics, University of Zurich, Zürich, Switzerland
- A Trunk
- Department of Experimental Zoology and Neurobiology, Institute of Biology, University of Pécs, Pécs, Hungary
- I Hernádi
- Department of Experimental Zoology and Neurobiology, Institute of Biology, University of Pécs, Pécs, Hungary
35
How Fast are the Leaked Facial Expressions: The Duration of Micro-Expressions. JOURNAL OF NONVERBAL BEHAVIOR 2013. [DOI: 10.1007/s10919-013-0159-8] [Citation(s) in RCA: 207] [Impact Index Per Article: 18.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]