1. Wezowski K, Penton-Voak IS. Relationship between low mood and micro-expression processing: evidence of negative bias in interpreting fleeting facial expressions. Royal Society Open Science 2024; 11:231944. [PMID: 39086818] [PMCID: PMC11288663] [DOI: 10.1098/rsos.231944]
Abstract
Depression affects the recognition of emotion in facial expressions by reducing detection accuracy and adding a bias towards negativity. However, no study has examined associations between depression and the recognition of microfacial expressions (fleeting facial cues of emotion). We therefore investigated associations between low mood and micro-expression processing using video stimuli of micro-expressions. We examined whether individuals with low mood (i) had trouble recognizing emotions, (ii) were more likely to perceive happy facial expressions as neutral and neutral facial expressions as sad, and (iii) recognized sad emotional expressions better than control subjects (n = 349). Participants with low mood showed poorer performance when judging emotions in faces (p = 0.03), with a specific deficit in recognizing happiness, and were more likely to perceive neutral faces as sad (p = 0.042). However, we found no evidence that individuals with low mood mistook happy faces for neutral ones or outperformed the control group at recognizing sad faces. Our results show that mood affects the perception of emotion in facial expressions, which has the potential to negatively affect interpersonal interactions and, ultimately, quality of life.
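The negativity-bias result above rests on simple per-participant rates, such as the proportion of neutral faces labelled as sad, compared between groups. The sketch below only illustrates that kind of measure, not the authors' actual analysis; the column names, the toy data, and the choice of a Mann-Whitney U test are assumptions.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

# One row per trial; all columns and values here are hypothetical toy data.
trials = pd.DataFrame({
    "participant":  [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "group":        ["low_mood"] * 6 + ["control"] * 6,
    "true_emotion": ["neutral"] * 12,
    "response":     ["sad", "neutral", "sad", "neutral", "sad", "neutral",
                     "neutral", "neutral", "sad", "neutral", "neutral", "neutral"],
})

# Per-participant bias: proportion of neutral faces labelled "sad".
neutral = trials[trials["true_emotion"] == "neutral"].copy()
neutral["labeled_sad"] = neutral["response"].eq("sad")
bias = neutral.groupby(["participant", "group"], as_index=False)["labeled_sad"].mean()

# Compare the bias between the low-mood and control groups.
low = bias.loc[bias["group"] == "low_mood", "labeled_sad"]
ctrl = bias.loc[bias["group"] == "control", "labeled_sad"]
stat, p = mannwhitneyu(low, ctrl, alternative="greater")
print(f"neutral-as-sad bias, low mood vs. control: U = {stat:.1f}, p = {p:.3f}")
```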
Affiliation(s)
- Kasia Wezowski
- School of Psychological Science, University of Bristol, 12a Priory Road, Bristol BS8 1TU, UK
- Ian S. Penton-Voak
- School of Psychological Science, University of Bristol, 12a Priory Road, Bristol BS8 1TU, UK
- National Institute for Health Research Bristol Biomedical Research Centre, University Hospitals Bristol NHS Foundation Trust and University of Bristol, Bristol, UK
2. Chamberland JA, Collin CA. Effects of forward mask duration variability on the temporal dynamics of brief facial expression categorization. Iperception 2023; 14:20416695231162580. [PMID: 36968319] [PMCID: PMC10031613] [DOI: 10.1177/20416695231162580]
Abstract
The Japanese and Caucasian Brief Affect Recognition Task (JACBART) has been proposed as a standardized method for measuring people's ability to accurately categorize briefly presented images of facial expressions. However, the factors that impact performance in this task are not entirely understood. The current study explored the role of the forward mask's duration (i.e., fixed vs. variable) in brief affect categorization across expressions of the six basic emotions (i.e., anger, disgust, fear, happiness, sadness, and surprise) and three presentation times (i.e., 17, 67, and 500 ms). The findings provide no evidence that a variable-duration forward mask negatively impacts brief affect categorization. However, efficiency and necessity thresholds were observed to vary across the expressions of emotion. Further exploration of the temporal dynamics of facial affect categorization will therefore require consideration of these differences.
Affiliation(s)
- Justin A. Chamberland
- School of Psychology/École de psychologie, University of Ottawa/Université d’Ottawa, Ottawa, Ontario, K1N 6N5, Canada
3. Wu Q, Peng K, Xie Y, Lai Y, Liu X, Zhao Z. An ingroup disadvantage in recognizing micro-expressions. Front Psychol 2022; 13:1050068. [PMID: 36507018] [PMCID: PMC9732534] [DOI: 10.3389/fpsyg.2022.1050068]
Abstract
Micro-expression is a fleeting facial expression of emotion that usually occurs in high-stakes situations and reveals the true emotion that a person tries to conceal. Due to its unique nature, recognizing micro-expressions has important applications in fields such as law enforcement, medical treatment, and national security. However, the psychological mechanism of micro-expression recognition is still poorly understood. In the present research, we sought to expand upon previous work by investigating whether the group membership of the expresser influences the recognition of micro-expressions. In two behavioral studies, we found that, contrary to the widespread ingroup advantage found in macro-expression recognition, there was a robust ingroup disadvantage in micro-expression recognition. Specifically, in Studies 1A and 1B, participants were more accurate at recognizing the intense and subtle micro-expressions of their racial outgroups than those of their racial ingroups, and neither training experience nor the duration of the micro-expressions moderated this ingroup disadvantage. In Studies 2A and 2B, we further found that mere social categorization alone was sufficient to elicit the ingroup disadvantage for the recognition of intense and subtle micro-expressions, and this effect was likewise unaffected by micro-expression duration. These results suggest that individuals spontaneously use the social category information of others when recognizing micro-expressions, and that the ingroup disadvantage in micro-expression recognition stems partly from motivated differential processing of ingroup micro-expressions.
Affiliation(s)
- Qi Wu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China; Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- *Correspondence: Qi Wu
- Kunling Peng
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China; Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yanni Xie
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China; Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Yeying Lai
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China; Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Xuanchen Liu
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China; Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
- Ziwei Zhao
- Department of Psychology, School of Educational Science, Hunan Normal University, Changsha, China; Cognition and Human Behavior Key Laboratory of Hunan Province, Hunan Normal University, Changsha, China
4. Lin Q, Dong Z, Zheng Q, Wang SJ. The effect of facial attractiveness on micro-expression recognition. Front Psychol 2022; 13:959124. [PMID: 36186390] [PMCID: PMC9524498] [DOI: 10.3389/fpsyg.2022.959124]
Abstract
Micro-expression (ME) is an extremely quick and uncontrollable facial movement that lasts for 40–200 ms and reveals thoughts and feelings that an individual attempts to cover up. Although MEs are much more difficult to detect and recognize, ME recognition resembles macro-expression recognition in that it is influenced by facial features. Previous studies have suggested that facial attractiveness can influence facial expression recognition processing. However, it remains unclear whether facial attractiveness also influences ME recognition. To address this issue, this study tested 38 participants on two ME recognition tasks, one with static and one with dynamic stimuli, using three types of ME (positive, neutral, and negative) at two attractiveness levels (attractive, unattractive). The results showed that participants recognized MEs on attractive faces more quickly than on unattractive ones, and there was a significant interaction between ME and facial attractiveness. Furthermore, attractive happy faces were recognized faster in both the static and the dynamic conditions, highlighting the happiness superiority effect. Our results therefore provide the first evidence that facial attractiveness can influence ME recognition in both static and dynamic conditions.
Affiliation(s)
- Qiongsi Lin
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Zizhao Dong
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- Qiuqiang Zheng
- Teacher Education Curriculum Center, School of Educational Science, Huizhou University, Huizhou, China
- Su-Jing Wang
- Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, China
- *Correspondence: Su-Jing Wang
5. Ge Y, Su R, Liang Z, Luo J, Tian S, Shen X, Wu H, Liu C. Transcranial Direct Current Stimulation Over the Right Temporal Parietal Junction Facilitates Spontaneous Micro-Expression Recognition. Front Hum Neurosci 2022; 16:933831. [PMID: 35874155] [PMCID: PMC9305610] [DOI: 10.3389/fnhum.2022.933831]
Abstract
Micro-expressions are fleeting and subtle emotional expressions. Because they are spontaneous and not under conscious control, micro-expressions are considered an indicator of genuine emotions. Their accurate recognition and interpretation promote interpersonal interaction and social communication, so enhancing the ability to recognize micro-expressions has captured much attention. In the current study, we investigated the effects of training on micro-expression recognition with a Chinese version of the Micro-Expression Training Tool (METT). Our goal was to test whether the recognition accuracy of spontaneous micro-expressions could be improved through training and brain stimulation. Since the right temporal parietal junction (rTPJ) has been shown to be involved in the explicit processing of facial emotion recognition, we hypothesized that the rTPJ would play a role in facilitating the recognition of micro-expressions. The results showed that anodal transcranial direct-current stimulation (tDCS) of the rTPJ indeed improved the recognition of spontaneous micro-expressions, especially those associated with fear. The improvement in recognizing spontaneous fear micro-expressions was positively correlated with personal distress in the anodal group but not in the sham group. Our study supports the combined use of tDCS and the METT as a viable way to train and enhance micro-expression recognition.
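As a purely illustrative companion to the per-group correlation reported above, the sketch below computes a Pearson correlation between training improvement and a personal-distress score separately for an "anodal" and a "sham" group; all data, sample sizes, and variable names are simulated assumptions, not the study's data or analysis pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 20  # hypothetical participants per group

def simulate_group(slope: float) -> tuple[np.ndarray, np.ndarray]:
    """Simulate distress scores and accuracy gains with a given linear relation."""
    distress = rng.normal(50, 10, size=n)
    gain = slope * distress + rng.normal(0, 0.5, size=n)
    return distress, gain

# A positive slope stands in for the anodal group, a zero slope for sham.
for label, slope in [("anodal", 0.03), ("sham", 0.0)]:
    distress, gain = simulate_group(slope)
    r, p = pearsonr(distress, gain)
    print(f"{label}: r = {r:.2f}, p = {p:.3f}")
```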
Affiliation(s)
- Yue Ge
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Beijing Institute of Biomedicine, Beijing, China
- Rui Su
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Zilu Liang
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
- Jing Luo
- Beijing Institute of Biomedicine, Beijing, China
- Suizi Tian
- School of Psychology, Beijing Normal University, Beijing, China
- Xunbing Shen
- College of Humanities, Jiangxi University of Chinese Medicine, Nanchang, China
- Haiyan Wu
- Centre for Cognitive and Brain Sciences and Department of Psychology, University of Macau, Taipa, China
- Chao Liu
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Center for Collaboration and Innovation in Brain and Learning Sciences, Beijing Normal University, Beijing, China
- Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, China
6. Döllinger L, Laukka P, Högman LB, Bänziger T, Makower I, Fischer H, Hau S. Training Emotion Recognition Accuracy: Results for Multimodal Expressions and Facial Micro Expressions. Front Psychol 2021; 12:708867. [PMID: 34475841] [PMCID: PMC8406528] [DOI: 10.3389/fpsyg.2021.708867]
Abstract
Nonverbal emotion recognition accuracy (ERA) is a central feature of successful communication and interaction, and is of importance for many professions. We developed and evaluated two ERA training programs: one focusing on dynamic multimodal expressions (audio, video, audio-video) and one focusing on facial micro expressions. Sixty-seven subjects were randomized to one of two experimental groups (multimodal, micro expression) or an active control group (emotional working memory task). Participants trained once weekly with a brief computerized training program for three consecutive weeks. Pre-post outcome measures consisted of a multimodal ERA task, a micro expression recognition task, and a task assessing recognition of patients' emotional cues. The post measurement took place approximately a week after the last training session. Non-parametric mixed analyses of variance using the Aligned Rank Transform were used to evaluate the effectiveness of the training programs. The multimodal training was significantly more effective in improving multimodal ERA than the micro expression training or the control training, and the micro expression training was significantly more effective in improving micro expression ERA than the other two training conditions. Both pre-post effects can be interpreted as large. No group differences were found for the outcome measure assessing recognition of patients' emotional cues. There were no transfer effects of the training programs: participants only improved significantly on the specific facet of ERA that they had trained. Further, low baseline ERA was associated with larger ERA improvements. Results are discussed with regard to methodological and conceptual aspects, and practical implications and future directions are explored.
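The key analysis above is a non-parametric factorial ANOVA based on the Aligned Rank Transform: align the data for one effect, rank the aligned values, then run an ordinary factorial ANOVA on the ranks and interpret only the aligned effect. The sketch below illustrates that idea for the interaction term only, using a simplified fully between-subjects design with simulated improvement scores; the study itself used a mixed pre-post design, so treat this as a minimal sketch under those assumptions rather than a reproduction of the analysis.

```python
import numpy as np
import pandas as pd
from scipy.stats import rankdata
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
groups = ["multimodal", "micro", "control"]
tasks = ["multimodal_ERA", "micro_ERA"]

# Simulated improvement scores: each training group improves only on "its" task.
rows = []
for g in groups:
    for t in tasks:
        boost = 1.0 if t.startswith(g) else 0.0
        rows += [{"group": g, "task": t, "improvement": rng.normal(boost, 1.0)}
                 for _ in range(20)]
df = pd.DataFrame(rows)

# Align for the group x task interaction: subtract both main effects, keeping the
# interaction estimate plus residuals (y - group mean - task mean + grand mean).
grand = df["improvement"].mean()
g_mean = df.groupby("group")["improvement"].transform("mean")
t_mean = df.groupby("task")["improvement"].transform("mean")
df["aligned"] = df["improvement"] - g_mean - t_mean + grand
df["ranked"] = rankdata(df["aligned"])

# Factorial ANOVA on the ranks; only the interaction term is interpreted.
model = smf.ols("ranked ~ C(group) * C(task)", data=df).fit()
print(anova_lm(model, typ=2))
```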
Affiliation(s)
- Lillian Döllinger
- Department of Psychology, Faculty of Social Sciences, Stockholm University, Stockholm, Sweden
- Petri Laukka
- Department of Psychology, Faculty of Social Sciences, Stockholm University, Stockholm, Sweden
- Lennart Björn Högman
- Department of Psychology, Faculty of Social Sciences, Stockholm University, Stockholm, Sweden
- Tanja Bänziger
- Department of Psychology and Social Work, Mid Sweden University, Sundsvall, Sweden
- Håkan Fischer
- Department of Psychology, Faculty of Social Sciences, Stockholm University, Stockholm, Sweden
- Stephan Hau
- Department of Psychology, Faculty of Social Sciences, Stockholm University, Stockholm, Sweden
7. Schlegel K. The Effects of Emotion Recognition Training on Interpersonal Effectiveness. Basic and Applied Social Psychology 2021. [DOI: 10.1080/01973533.2021.1883021]
8. Zhang J, Yin M, Shu D, Liu D. The Establishment of Pseudorandom Ecological Microexpression Recognition Test (PREMERT) and Its Relevant Resting-State Brain Activity. Front Hum Neurosci 2020; 14:281. [PMID: 32848665] [PMCID: PMC7406786] [DOI: 10.3389/fnhum.2020.00281]
Abstract
The ecological microexpression recognition test (EMERT) of Zhang et al. (2017) used a between-subjects Latin square block design for backgrounds, so participants' scores were not directly comparable. The current study used a within-subject pseudorandom design for backgrounds to improve the EMERT into the PREMERT (pseudorandom EMERT), and used eyes-closed and eyes-open resting-state functional magnetic resonance imaging to detect brain activity relevant to the PREMERT for the first time. The results showed that: (1) Two new summary indexes of the PREMERT were adopted, microexpression M and microexpression SD. With the pseudorandom design, participants could effectively identify almost all the microexpressions, and each microexpression type showed a significant background effect. The PREMERT had good split-half reliability, parallel-forms reliability, criterion validity, and ecological validity, and could therefore stably and efficiently detect participants' microexpression recognition abilities. Because of its pseudorandom design, all participants took the same test, so their scores could be compared with each other. (2) The amplitude of low-frequency fluctuations (ALFF; 0.01-0.1 Hz) in both eyes-closed and eyes-open resting states, as well as the ALFF difference, predicted microexpression M, with the ALFF difference being less predictive. The resting-state brain areas relevant to microexpression M included parts of the frontal lobes, insula, cingulate cortex, hippocampus, amygdala, fusiform gyrus, parietal lobe, caudate nucleus, precuneus, thalamus, putamen, temporal lobe, and cerebellum. (3) ALFF in both eyes-closed and eyes-open resting states and the ALFF difference also predicted microexpression SD, with the ALFF difference being more predictive. The resting-state brain areas relevant to microexpression SD included parts of the frontal lobes, central anterior gyrus, supplementary motor area, insula, hippocampus, amygdala, cuneus, occipital lobe, fusiform gyrus, parietal lobe, caudate nucleus, pallidum, putamen, thalamus, temporal lobe, and cerebellum. (4) Microexpression M and SD shared many relevant resting-state brain areas, such as areas involved in expression recognition, microexpression consciousness and attention, and the change from expression backgrounds to microexpressions, but also differed in some areas, such as the precuneus, insula, and pallidum. The ALFF difference was more sensitive to PREMERT fluctuations.
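The ALFF measure that carries points (2) and (3) above is, in essence, the average amplitude of a voxel's resting-state signal within the 0.01-0.1 Hz band. The sketch below shows that core computation for a single simulated time series; the TR, the signal, and the normalization are assumptions, and real pipelines add detrending, nuisance regression, and standardization (e.g., mALFF or zALFF).

```python
import numpy as np

def alff(timeseries: np.ndarray, tr: float, band=(0.01, 0.1)) -> float:
    """Mean amplitude of the signal's spectrum inside the low-frequency band."""
    x = timeseries - timeseries.mean()          # remove the mean before the FFT
    n = x.size
    freqs = np.fft.rfftfreq(n, d=tr)            # frequencies in Hz
    amp = np.abs(np.fft.rfft(x)) / n            # amplitude spectrum
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(amp[mask].mean())

# Example: 240 volumes at TR = 2 s, with a slow 0.05 Hz fluctuation plus noise.
rng = np.random.default_rng(0)
t = np.arange(240) * 2.0
signal = np.sin(2 * np.pi * 0.05 * t) + rng.normal(0, 0.5, t.size)
print(f"ALFF in 0.01-0.1 Hz: {alff(signal, tr=2.0):.4f}")
```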
Affiliation(s)
- Jianxin Zhang
- School of Humanities, Jiangnan University, Wuxi, China
- Ming Yin
- Jiangsu Police Institute, Nanjing, China
- Deming Shu
- School of Education, Soochow University, Suzhou, China
- Dianzhi Liu
- School of Education, Soochow University, Suzhou, China
9. Stanley JT, Webster BA. A comparison of the effectiveness of two types of deceit detection training methods in older adults. Cognitive Research: Principles and Implications 2019; 4:26. [PMID: 31332602] [PMCID: PMC6646507] [DOI: 10.1186/s41235-019-0178-z]
Abstract
Background: In general, people are poor at detecting deception. Older adults are even worse than young adults at detecting deceit, which might make them uniquely vulnerable to certain types of financial fraud. One reason for poor deceit detection is that lay theories of cues to deception are not valid. This study compared the effectiveness of two training methods to improve deceit detection among older adults: valid facial cues versus valid verbal cues to deception. Approximately 150 older adults were randomly assigned to facial training, verbal training, or a control condition. Participants completed a pre-test deceit detection task, their assigned training, and a post-test deceit detection task. Results: Both training groups significantly improved at recognizing their respectively trained cues after training. However, the facial cue training group was less accurate at detecting deception at post-test than at pre-test, whereas the control group improved in deceit detection accuracy from pre-test to post-test. Conclusions: These results are consistent with the body of literature on deception suggesting that people hover around chance accuracy, even after training. Older adults' facial and verbal cue recognition can be improved with training, but these improvements did not translate into more accurate deceit detection and actually hampered performance in the facial condition. Older adults showed the most benefit from sheer practice at detecting deception (in the control condition), perhaps because this condition encouraged implicit rather than explicit judgments of deception.
Affiliation(s)
- Britney A Webster
- Department of Psychology, University of Akron, Akron, OH, 44325-4301, USA
10. Matsumoto D, Hwang HC. Commentary: Electrophysiological Evidence Reveals Differences between the Recognition of Microexpressions and Macroexpressions. Front Psychol 2019; 10:1293. [PMID: 31263437] [PMCID: PMC6584814] [DOI: 10.3389/fpsyg.2019.01293]
Affiliation(s)
- David Matsumoto
- Department of Psychology, San Francisco State University, San Francisco, CA, United States
- Humintell, El Cerrito, CA, United States
- *Correspondence: David Matsumoto
11. Zeng X, Wu Q, Zhang S, Liu Z, Zhou Q, Zhang M. A False Trail to Follow: Differential Effects of the Facial Feedback Signals From the Upper and Lower Face on the Recognition of Micro-Expressions. Front Psychol 2018; 9:2015. [PMID: 30405497] [PMCID: PMC6208096] [DOI: 10.3389/fpsyg.2018.02015]
Abstract
Micro-expressions, as fleeting facial expressions, are very important for judging people's true emotions and can thus provide an essential behavioral cue for detecting lies and dangerous demeanor. From embodied accounts of cognition, we derived the novel hypothesis that facial feedback from the upper and lower facial regions has differential effects on micro-expression recognition. This hypothesis was tested and supported across three studies. Specifically, Study 1 showed that people became better judges of intense micro-expressions with a duration of 450 ms when facial feedback from the upper face was enhanced via a restricting gel. Study 2 showed that recognition accuracy for subtle micro-expressions was significantly impaired under all duration conditions (50, 150, 333, and 450 ms) when facial feedback from the lower face was enhanced. In addition, Study 3 revealed that blocking facial feedback from the lower face significantly boosted recognition accuracy for both subtle and intense micro-expressions under all duration conditions (150 and 450 ms). Together, these results highlight the role of facial feedback in judging the subtle movements of micro-expressions.
Affiliation(s)
- Xuemei Zeng
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Qi Wu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Siwei Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Zheying Liu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Qing Zhou
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
- Meishan Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
12. Hutchison A, Gerstein L, Kasai M. A Cross-Cultural Comparison of U.S. and Japanese Trainees’ Emotion-Recognition Ability. Japanese Psychological Research 2017. [DOI: 10.1111/jpr.12182]
13. Piccolo LD, Finset A, Mellblom AV, Figueiredo-Braga M, Korsvold L, Zhou Y, Zimmermann C, Humphris G. Verona Coding Definitions of Emotional Sequences (VR-CoDES): Conceptual framework and future directions. Patient Education and Counseling 2017; 100:2303-2311. [PMID: 28673489] [DOI: 10.1016/j.pec.2017.06.026]
Abstract
OBJECTIVE: To discuss the theoretical and empirical framework of the VR-CoDES and potential future directions for research based on the coding system. METHODS: The paper is based on a selective review of papers relevant to the construction and application of the VR-CoDES. RESULTS: The VR-CoDES system is rooted in the patient-centered and biopsychosocial models of healthcare consultations and in a functional approach to emotion theory. According to the VR-CoDES, emotional interaction is studied in terms of sequences consisting of an eliciting event, an emotional expression by the patient, and the immediate response by the clinician. The rationale for the emphasis on sequences, on the detailed classification of cues and concerns, and on the choice of explicit vs. non-explicit responses and providing vs. reducing room for further disclosure as the basic categories of clinician responses is described. CONCLUSIONS: Results from research on the VR-CoDES may help raise awareness of emotional sequences. Future directions in applying the VR-CoDES in research may include studies predicting patient and clinician behavior within the consultation, qualitative analyses of longer sequences including several VR-CoDES triads, and studies of the effects of emotional communication on health outcomes. PRACTICE IMPLICATIONS: The VR-CoDES may be applied to develop interventions that promote good handling of patients' emotions in healthcare encounters.
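To make the coding framework concrete, the sketch below represents one VR-CoDES-style sequence as a small data structure: an eliciting event, the patient's cue or concern, and the clinician's immediate response classified on the two dimensions named above (explicit vs. non-explicit, providing vs. reducing room). The class and field names are illustrative assumptions, not the official coding manual.

```python
from dataclasses import dataclass
from enum import Enum

class Expression(Enum):
    CUE = "cue"            # hint of an underlying unpleasant emotion
    CONCERN = "concern"    # explicitly stated unpleasant emotion

class Explicitness(Enum):
    EXPLICIT = "explicit"
    NON_EXPLICIT = "non_explicit"

class Room(Enum):
    PROVIDES = "provides_room"   # invites further disclosure
    REDUCES = "reduces_room"     # closes down further disclosure

@dataclass
class EmotionalSequence:
    eliciting_event: str
    expression: Expression
    response_explicitness: Explicitness
    response_room: Room

# Example coding of a single triad.
seq = EmotionalSequence(
    eliciting_event="clinician asks about sleep",
    expression=Expression.CONCERN,
    response_explicitness=Explicitness.EXPLICIT,
    response_room=Room.PROVIDES,
)
print(seq)
```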
Affiliation(s)
- Lidia Del Piccolo
- Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Verona, Italy.
- Arnstein Finset
- Department of Behavioral Sciences in Medicine, Institute of Basic Medical Science, Faculty of Medicine, University of Oslo, Oslo, Norway
- Anneli V Mellblom
- Department of Behavioral Sciences in Medicine, Institute of Basic Medical Science, Faculty of Medicine, University of Oslo, Oslo, Norway; Department of Pediatric Medicine, Women and Children's Unit, Oslo University Hospital, Rikshospitalet, Oslo, Norway
- Margarida Figueiredo-Braga
- Faculty of Medicine, University of Porto, Portugal; I3S Instituto de Investigação e Inovação em Saúde, Porto, Portugal
- Live Korsvold
- Department of Behavioral Sciences in Medicine, Institute of Basic Medical Science, Faculty of Medicine, University of Oslo, Oslo, Norway
- Yuefang Zhou
- University of St Andrews, Medical School, North Haugh, St Andrews, UK
- Christa Zimmermann
- Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Verona, Italy
- Gerald Humphris
- University of St Andrews, Medical School, North Haugh, St Andrews, UK
14. McDonald K, Newby-Clark IR, Walker J, Henselwood K. It is written all over your face: Socially rejected people display microexpressions that are detectable after training in the Micro Expression Training Tool (METT). European Journal of Social Psychology 2017. [DOI: 10.1002/ejsp.2301]
15. Effectiveness of a short audiovisual emotion recognition training program in adults. Motivation and Emotion 2017. [DOI: 10.1007/s11031-017-9631-9]
16. Svetieva E, Frank MG. Empathy, emotion dysregulation, and enhanced microexpression recognition ability. Motivation and Emotion 2015. [DOI: 10.1007/s11031-015-9528-4]
17. Sweeney CD, Ceci SJ. Deception detection, transmission, and modality in age and sex. Front Psychol 2014; 5:590. [PMID: 24982645] [PMCID: PMC4056559] [DOI: 10.3389/fpsyg.2014.00590]
Abstract
This study is the first to create and use spontaneous (i.e., unrehearsed) pro-social lies in an ecological setting. Creation of the stimuli involved 51 older adult and 44 college student “senders” who lied “authentically,” in that their lies were spontaneous and in the service of protecting a research assistant. In the main study, 77 older adult and 84 college student raters attempted to detect lies from the older adult and college senders in three modalities: audio, visual, and audiovisual. Raters of both age groups were best at detecting lies in the audiovisual modality and worst in the visual modality. Overall, college students were better detectors than older adults. There was an age-matching effect for college students but not for older adults. Older adult males were the hardest to detect, and the older the adult, the worse their ability to detect deception.
Affiliation(s)
- Charlotte D Sweeney
- Union Theological Seminary, New York NY, USA; Department of Human Development, Cornell University, Ithaca NY, USA
- Stephen J Ceci
- Department of Human Development, Cornell University, Ithaca NY, USA
18. Hurley CM, Anker AE, Frank MG, Matsumoto D, Hwang HC. Background factors predicting accuracy and improvement in micro expression recognition. Motivation and Emotion 2014. [DOI: 10.1007/s11031-014-9410-9]