1. Fino E, Menegatti M, Avenanti A, Rubini M. Reading of ingroup politicians' smiles triggers smiling in the corner of one's eyes. PLoS One 2024; 19:e0290590. [PMID: 38635525] [PMCID: PMC11025833] [DOI: 10.1371/journal.pone.0290590]
Abstract
Spontaneous smiles in response to politicians can serve as an implicit barometer for gauging electorate preferences. However, it is unclear whether a subtle Duchenne smile, an authentic expression involving the coactivation of the zygomaticus major (ZM) and orbicularis oculi (OO) muscles, would be elicited while reading about a favored politician smiling, indicating a more positive disposition and political endorsement. From an embodied simulation perspective, we investigated whether written descriptions of a politician's smile would trigger morphologically different smiles in readers depending on shared or opposing political orientation. In a controlled reading task in the laboratory, participants were presented with subject-verb phrases describing left- and right-wing politicians smiling or frowning. Concurrently, their facial muscular reactions were measured via electromyography (EMG) at three facial muscles: the ZM and OO, which are coactive during Duchenne smiles, and the corrugator supercilii (CS), which is involved in frowning. We found that participants responded with a Duchenne smile, detected at the ZM and OO muscles, when exposed to portrayals of smiling politicians of the same political orientation, and reported more positive emotions towards these politicians. In contrast, when reading about outgroup politicians smiling, there was weaker activation of the ZM muscle and no activation of the OO muscle, suggesting a weak non-Duchenne smile, while emotions reported towards outgroup politicians were significantly more negative. In addition, a stronger frown response in the CS was found for ingroup than for outgroup politicians' frown expressions. The present findings suggest that a politician's smile may go a long way toward influencing electorates through both non-verbal and verbal pathways. They add another layer to our understanding of how language and social information shape embodied effects in a highly nuanced manner. Implications for verbal communication in the political context are discussed.
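As an illustration of the facial EMG logic described above, the sketch below classifies a trial as a Duchenne or non-Duchenne smile from baseline-corrected zygomaticus major (ZM) and orbicularis oculi (OO) activity. The z-score threshold, the classification rule, and the simulated data are assumptions for demonstration only, not the article's analysis pipeline.

```python
import numpy as np

def classify_smile(zm_uv, oo_uv, zm_base, oo_base, z_thresh=1.0):
    """Classify a trial's smile type from zygomaticus major (ZM) and
    orbicularis oculi (OO) EMG amplitudes.

    zm_uv, oo_uv    : rectified EMG samples recorded during the trial
    zm_base, oo_base: rectified EMG samples from a pre-stimulus baseline
    z_thresh        : activation threshold in baseline SD units (illustrative)
    """
    # Express mean trial activity as z-scores relative to the baseline period
    zm_z = (zm_uv.mean() - zm_base.mean()) / zm_base.std(ddof=1)
    oo_z = (oo_uv.mean() - oo_base.mean()) / oo_base.std(ddof=1)

    zm_active = zm_z > z_thresh
    oo_active = oo_z > z_thresh

    if zm_active and oo_active:
        return "Duchenne smile"        # ZM + OO coactivation
    if zm_active:
        return "non-Duchenne smile"    # ZM only
    return "no smile"

# Example with simulated data (arbitrary units)
rng = np.random.default_rng(0)
baseline = rng.normal(2.0, 0.5, 500)   # resting EMG level
trial_zm = rng.normal(3.5, 0.5, 500)   # elevated ZM activity
trial_oo = rng.normal(3.0, 0.5, 500)   # elevated OO activity
print(classify_smile(trial_zm, trial_oo, baseline, baseline))
```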
Affiliation(s)
- Edita Fino
- Department of Psychology “Renzo Canestrari”, Alma Mater Studiorum Università di Bologna, Bologna, Italy
- Michela Menegatti
- Department of Psychology “Renzo Canestrari”, Alma Mater Studiorum Università di Bologna, Bologna, Italy
- Alessio Avenanti
- Department of Psychology “Renzo Canestrari”, Alma Mater Studiorum Università di Bologna, Bologna, Italy
- Centro Studi e Ricerche in Neuroscienze Cognitive, Department of Psychology “Renzo Canestrari”, Alma Mater Studiorum Università di Bologna, Campus di Cesena, Cesena, Italy
- Centro de Investigación en Neuropsicología y Neurociencias Cognitivas, Universidad Católica del Maule, Talca, Chile
- Monica Rubini
- Department of Psychology “Renzo Canestrari”, Alma Mater Studiorum Università di Bologna, Bologna, Italy
2. Efthimiou TN, Hernandez MP, Elsenaar A, Mehu M, Korb S. Application of facial neuromuscular electrical stimulation (fNMES) in psychophysiological research: Practical recommendations based on a systematic review of the literature. Behav Res Methods 2024; 56:2941-2976. [PMID: 37864116] [PMCID: PMC11133044] [DOI: 10.3758/s13428-023-02262-7]
Abstract
Facial neuromuscular electrical stimulation (fNMES), which allows for the non-invasive and physiologically sound activation of facial muscles, has great potential for investigating fundamental questions in psychology and neuroscience, such as the role of proprioceptive facial feedback in emotion induction and emotion recognition, and may also serve clinical applications, such as alleviating symptoms of depression. However, despite illustrious origins in the 19th-century work of Duchenne de Boulogne, the practical application of fNMES remains largely unknown to today's researchers in psychology. In addition, published studies vary dramatically in the stimulation parameters used, such as stimulation frequency, amplitude, duration, and electrode size, and in the way these parameters are reported. Because fNMES parameters impact the comfort and safety of volunteers, as well as the physiological (and psychological) effects of stimulation, it is of paramount importance to establish recommendations for good practice and to ensure that studies can be better compared and integrated. Here, we provide an introduction to fNMES, systematically review the existing literature with a focus on the stimulation parameters used, and offer recommendations on how to deliver fNMES safely and reliably and on how to report fNMES parameters to allow better cross-study comparison. In addition, we provide a free webpage for easily visualising fNMES parameters and verifying their safety based on current density. As an example of a potential application, we focus on the use of fNMES for the investigation of the facial feedback hypothesis.
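Since electrode current density (stimulation amplitude divided by electrode contact area) is the quantity used to gauge safety on the authors' webpage, a minimal sketch of that calculation follows. The numeric safety limit in the code is a placeholder assumption, not a value taken from the review; actual limits should come from the paper's recommendations.

```python
def current_density_ma_per_cm2(amplitude_ma: float, electrode_area_cm2: float) -> float:
    """Current density = stimulation amplitude / electrode contact area."""
    return amplitude_ma / electrode_area_cm2

# Example: 5 mA delivered through a 2 cm x 2 cm electrode
amplitude_ma = 5.0
area_cm2 = 2.0 * 2.0
density = current_density_ma_per_cm2(amplitude_ma, area_cm2)

# Placeholder limit for illustration only; NOT a value from the review.
ASSUMED_LIMIT_MA_PER_CM2 = 2.0
verdict = "within assumed limit" if density <= ASSUMED_LIMIT_MA_PER_CM2 else "exceeds assumed limit"
print(f"{density:.2f} mA/cm^2 - {verdict}")
```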
Affiliation(s)
- Arthur Elsenaar
- ArtScience Interfaculty, Royal Academy of Art, Royal Conservatory, The Hague, Netherlands
- Marc Mehu
- Department of Psychology, Webster Vienna Private University, Vienna, Austria
- Sebastian Korb
- Department of Psychology, University of Essex, Colchester, UK
- Department of Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna, Austria
3. Efthimiou TN, Baker J, Clarke A, Elsenaar A, Mehu M, Korb S. Zygomaticus activation through facial neuromuscular electrical stimulation (fNMES) induces happiness perception in ambiguous facial expressions and affects neural correlates of face processing. Soc Cogn Affect Neurosci 2024; 19:nsae013. [PMID: 38334739] [PMCID: PMC10873823] [DOI: 10.1093/scan/nsae013]
Abstract
The role of facial feedback in facial emotion recognition remains controversial, partly due to limitations of the existing methods to manipulate the activation of facial muscles, such as voluntary posing of facial expressions or holding a pen in the mouth. These procedures are indeed limited in their control over which muscles are (de)activated when and to what degree. To overcome these limitations and investigate in a more controlled way if facial emotion recognition is modulated by one's facial muscle activity, we used computer-controlled facial neuromuscular electrical stimulation (fNMES). In a pre-registered EEG experiment, ambiguous facial expressions were categorised as happy or sad by 47 participants. In half of the trials, weak smiling was induced through fNMES delivered to the bilateral Zygomaticus Major muscle for 500 ms. The likelihood of categorising ambiguous facial expressions as happy was significantly increased with fNMES, as shown with frequentist and Bayesian linear mixed models. Further, fNMES resulted in a reduction of P1, N170 and LPP amplitudes. These findings suggest that fNMES-induced facial feedback can bias facial emotion recognition and modulate the neural correlates of face processing. We conclude that fNMES has potential as a tool for studying the effects of facial feedback.
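A rough illustration of the behavioural effect described above: per-participant proportions of "happy" categorisations with and without fNMES are compared with a paired test on simulated data. This is a simplified stand-in for, not a reproduction of, the frequentist and Bayesian linear mixed models reported in the article; all numbers below are fabricated for demonstration.

```python
import numpy as np
import pandas as pd
from scipy import stats

# One simulated trial per row: participant id, fNMES condition, and response
rng = np.random.default_rng(1)
rows = []
for pid in range(47):
    for fnmes in (0, 1):
        p_happy = 0.45 + 0.10 * fnmes + rng.normal(0, 0.05)
        for _ in range(40):
            rows.append({"participant": pid, "fnmes": fnmes,
                         "happy": rng.random() < p_happy})
trials = pd.DataFrame(rows)

# Proportion of "happy" categorisations per participant and condition
props = trials.groupby(["participant", "fnmes"])["happy"].mean().unstack("fnmes")

# Paired comparison across participants (simplified stand-in for the mixed models)
t, p = stats.ttest_rel(props[1], props[0])
print(props.mean())
print(f"t = {t:.2f}, p = {p:.4f}")
```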
Affiliation(s)
- Joshua Baker
- Department of Psychology, University of Essex, Colchester CO4 3SQ, United Kingdom
- Alasdair Clarke
- Department of Psychology, University of Essex, Colchester CO4 3SQ, United Kingdom
- Arthur Elsenaar
- ArtScience Interfaculty, Royal Academy of Art, Royal Conservatory, The Hague 2514 AN, Netherlands
- Marc Mehu
- Department of Psychology, Webster Vienna Private University, Vienna 1020, Austria
- Sebastian Korb
- Department of Psychology, University of Essex, Colchester CO4 3SQ, United Kingdom
- Department of Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna 1010, Austria
4. Fu G, Yu Y, Ye J, Zheng Y, Li W, Cui N, Wang Q. A method for diagnosing depression: Facial expression mimicry is evaluated by facial expression recognition. J Affect Disord 2023; 323:809-818. [PMID: 36535548] [DOI: 10.1016/j.jad.2022.12.029]
Abstract
BACKGROUND: Considerable evidence has shown that facial expression mimicry is impaired in patients with depression. We aimed to evaluate voluntary expression mimicry via facial expression recognition as an aid to diagnosing depression. METHODS: A total of 168 participants performed a voluntary expression mimicry task, posing anger, disgust, fear, happiness, neutrality, sadness, and surprise. Nine healthy raters then performed a facial expression recognition task, using an observer scoring method to evaluate the seven expressions imitated by participants. Emotional scores were calculated to measure differences between the two groups of participants and to provide a basis for the clinical diagnosis of depression. RESULTS: Compared with the control group, the depression group had lower accuracy in imitating happiness. The depression group showed a higher neutrality bias when imitating sadness, surprise, happiness and disgust, and a lower happiness bias when imitating sadness and surprise; when imitating happiness, the depression group showed higher anger, disgust, fear, neutrality, and surprise biases; when imitating neutrality, it showed a higher sadness bias and a lower happiness bias. Raters also took longer to recognize happiness imitated by the depression group, and this reaction time was positively correlated with depression severity. Depression severity was further negatively correlated with accuracy in imitating happiness and positively correlated with the neutrality bias when imitating surprise. LIMITATIONS: The ecological validity of static stimulus materials is lower than that of dynamic stimuli, and without synchronized functional imaging the findings cannot be linked to brain activation patterns. CONCLUSION: The ability of patients with depression to voluntarily imitate facial expressions is reduced, mainly in terms of accuracy, bias and recognizability. These deficits may serve as the basis for a method of diagnosing depression.
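The accuracy and bias measures described above can be thought of as entries of a row-normalised confusion matrix between intended and recognised expressions. The sketch below computes such a matrix from simulated rater judgments; the labels, sample size, and scoring rule are illustrative assumptions, not the article's exact scoring procedure.

```python
import numpy as np
import pandas as pd

EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutrality", "sadness", "surprise"]

# One row per rated clip: the emotion the participant was asked to imitate and
# the emotion the rater recognised (simulated labels for illustration only)
rng = np.random.default_rng(2)
intended = rng.choice(EMOTIONS, size=400)
recognised = np.where(rng.random(400) < 0.7, intended, rng.choice(EMOTIONS, size=400))

df = pd.DataFrame({"intended": intended, "recognised": recognised})
confusion = pd.crosstab(df["intended"], df["recognised"], normalize="index")
confusion = confusion.reindex(index=EMOTIONS, columns=EMOTIONS, fill_value=0)

# Per-emotion imitation accuracy = diagonal of the row-normalised confusion matrix
accuracy = pd.Series({e: confusion.loc[e, e] for e in EMOTIONS})
# Example bias measure: proportion of happiness imitations recognised as neutral
neutrality_bias_for_happiness = confusion.loc["happiness", "neutrality"]
print(accuracy.round(2))
print(f"neutrality bias (happiness): {neutrality_bias_for_happiness:.2f}")
```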
Affiliation(s)
- Gang Fu
- School of Computer Science and Technology, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
- Yanhong Yu
- College of Traditional Chinese Medicine, Shandong University of Traditional Chinese Medicine, Jinan 250355, China
- Jiayu Ye
- School of Computer Science and Technology, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
- Yunshao Zheng
- Shandong Provincial Mental Health Center, Jinan 250014, China
- Wentao Li
- School of Computer Science and Technology, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
- Ning Cui
- College of Health, Shandong University of Traditional Chinese Medicine, Jinan 250355, China
- Qingxiang Wang
- School of Computer Science and Technology, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
5. Lee M, Lori A, Langford NA, Rilling JK. The neural basis of smile authenticity judgments and the potential modulatory role of the oxytocin receptor gene (OXTR). Behav Brain Res 2023; 437:114144. [PMID: 36216140] [DOI: 10.1016/j.bbr.2022.114144]
Abstract
Accurate perception of genuine vs. posed smiles is crucial for successful social navigation in humans. While people vary in their ability to assess the authenticity of smiles, little is known about the specific biological mechanisms underlying this variation. We investigated the neural substrates of smile authenticity judgments using functional magnetic resonance imaging (fMRI). We also tested a preliminary hypothesis that a common polymorphism in the oxytocin receptor gene (OXTR) rs53576 would modulate the behavioral and neural indices of accurate smile authenticity judgments. A total of 185 healthy adult participants (Neuroimaging arm: N = 44, Behavioral arm: N = 141) determined the authenticity of dynamic facial expressions of genuine and posed smiles either with or without fMRI scanning. Correctly identified genuine vs. posed smiles activated brain areas involved with reward processing, facial mimicry, and mentalizing. Activation within the inferior frontal gyrus and dorsomedial prefrontal cortex correlated with individual differences in sensitivity (d') and response criterion (C), respectively. Our exploratory genetic analysis revealed that rs53576 G homozygotes in the neuroimaging arm had a stronger tendency to judge posed smiles as genuine than did A allele carriers and showed decreased activation in the medial prefrontal cortex when viewing genuine vs. posed smiles. Yet, OXTR rs53576 did not modulate task performance in the behavioral arm, which calls for further studies to evaluate the legitimacy of this result. Our findings extend previous literature on the biological foundations of smile authenticity judgments, particularly emphasizing the involvement of brain regions implicated in reward, facial mimicry, and mentalizing.
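Sensitivity (d') and response criterion (C) are standard signal detection indices; a minimal sketch of their computation for genuine-vs-posed smile judgments is shown below. The log-linear correction and the example counts are assumptions for illustration, not details reported in the article.

```python
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Signal detection indices for genuine-vs-posed smile judgments.

    A 'hit' is a genuine smile correctly judged as genuine, a 'false alarm'
    is a posed smile judged as genuine. A log-linear correction avoids
    infinite z-scores when a rate is 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa                 # sensitivity (d')
    criterion = -0.5 * (z_hit + z_fa)      # response criterion (C)
    return d_prime, criterion

# Example: 40 genuine and 40 posed smiles
print(dprime_and_criterion(hits=32, misses=8, false_alarms=14, correct_rejections=26))
```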
Affiliation(s)
- Adriana Lori
- Department of Psychiatry and Behavioral Science, USA
- Nicole A Langford
- Department of Psychiatry and Behavioral Science, USA; Nell Hodgson Woodruff School of Nursing, USA
- James K Rilling
- Department of Anthropology, USA; Department of Psychiatry and Behavioral Science, USA; Center for Behavioral Neuroscience, USA; Emory National Primate Research Center, USA; Center for Translational Social Neuroscience, USA
6. Massaccesi C, Korb S, Willeit M, Quednow BB, Silani G. Effects of the mu-opioid receptor agonist morphine on facial mimicry and emotion recognition. Psychoneuroendocrinology 2022; 142:105801. [PMID: 35609510] [DOI: 10.1016/j.psyneuen.2022.105801]
Abstract
Facial mimicry and emotion recognition are two socio-cognitive abilities involved in adaptive socio-emotional behavior, promoting affiliation and the establishment of social bonds. The mu-opioid receptor (MOR) system plays a key role in affiliation and social bonding. However, it remains unclear whether MORs are involved in the categorization and spontaneous mimicry of emotional facial expressions. Using a randomized, placebo-controlled, double-blind, between-subjects design, we investigated in 82 healthy female volunteers the effects of the specific MOR agonist morphine on the recognition accuracy of emotional faces (happiness, anger, fear), and on their facial mimicry (measured with electromyography). Frequentist statistics did not reveal any significant effects of drug administration on facial mimicry or emotion recognition abilities. However, post hoc Bayesian analyses provided support for an effect of morphine on facial mimicry of fearful facial expressions. Specifically, compared to placebo, morphine reduced mimicry of fear, as shown by lower activity of the frontalis muscle. Bayesian analyses also provided support for the absence of a drug effect on mimicry of happy and angry facial expressions, which were assessed with the zygomaticus major and corrugator supercilii muscles, as well as on emotion recognition accuracy. These findings suggest that MOR activity is involved in automatic facial responses to fearful stimuli, but not in their identification. Overall, the current results, together with the previously reported small effects of opioid compounds, suggest a relatively marginal role of the MOR system in emotion simulation and perception.
Affiliation(s)
- Claudia Massaccesi
- Department of Clinical and Health Psychology, Faculty of Psychology, University of Vienna, Austria
- Sebastian Korb
- Department of Psychology, University of Essex, Colchester, United Kingdom; Department of Cognition, Emotion, and Methods in Psychology, Faculty of Psychology, University of Vienna, Austria
- Matthaeus Willeit
- Department of Psychiatry and Psychotherapy, Medical University of Vienna, Austria
- Boris B Quednow
- Experimental and Clinical Pharmacopsychology, Department of Psychiatry, Psychotherapy and Psychosomatics, Psychiatric Hospital of the University of Zurich, Zurich, Switzerland; Neuroscience Center Zurich, University of Zurich and Swiss Federal Institute of Technology, Zurich, Switzerland
- Giorgia Silani
- Department of Clinical and Health Psychology, Faculty of Psychology, University of Vienna, Austria
7. Namba S, Sato W, Nakamura K, Watanabe K. Computational Process of Sharing Emotion: An Authentic Information Perspective. Front Psychol 2022; 13:849499. [PMID: 35645906] [PMCID: PMC9134197] [DOI: 10.3389/fpsyg.2022.849499]
Abstract
Although many psychology studies have shown that sharing emotion supports dyadic interaction, no report has examined how authentic information about an expresser's feeling state is transmitted to perceivers through emotional expressions. In this study, we used computational modeling, specifically a multinomial processing tree, to formally quantify the process of sharing emotion, with an emphasis on the perception of authentic information about expressers' feeling states from facial expressions. Results indicated, first, that authentic information about feeling states is more likely to be perceived from a happy expression than from an angry expression. Second, happy facial expressions can activate both emotional elicitation and emotion sharing in perceivers, whereas for angry facial expressions only emotional elicitation, rather than emotion sharing, appears to operate. Third, parameters reflecting the detection of anger experiences were positively correlated with those for happiness. No robust correlation was found between the parameters extracted from this experimental task and questionnaire measures of emotional contagion, empathy, and social anxiety. These results suggest that this new computational approach can contribute to describing emotion-sharing processes.
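For readers unfamiliar with multinomial processing trees, the sketch below fits a generic two-parameter tree (a detection parameter and a guessing parameter) to hypothetical authenticity judgments by maximum likelihood. The tree structure, parameter names, and counts are illustrative assumptions and do not correspond to the model or data of the article.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical counts: responses "authentic" for expressions that truly were
# authentic or posed (100 trials each).
n_auth, k_auth = 100, 78    # authentic expressions judged "authentic"
n_posed, k_posed = 100, 35  # posed expressions judged "authentic"

def neg_log_lik(params):
    d, g = params               # d: detection probability, g: guessing "authentic"
    p_auth = d + (1 - d) * g    # detect authenticity, or fail and guess "authentic"
    p_posed = (1 - d) * g       # fail to detect posedness and guess "authentic"
    eps = 1e-9
    ll = (k_auth * np.log(p_auth + eps) + (n_auth - k_auth) * np.log(1 - p_auth + eps)
          + k_posed * np.log(p_posed + eps) + (n_posed - k_posed) * np.log(1 - p_posed + eps))
    return -ll

fit = minimize(neg_log_lik, x0=[0.5, 0.5], bounds=[(0.001, 0.999)] * 2)
print("detection d = %.2f, guessing g = %.2f" % tuple(fit.x))
```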
Affiliation(s)
- Shushi Namba
- Psychological Process Research Team, Guardian Robot Project, RIKEN, Kyoto, Japan
- Wataru Sato
- Psychological Process Research Team, Guardian Robot Project, RIKEN, Kyoto, Japan
- Koyo Nakamura
- Department of Cognition, Emotion, and Methods in Psychology, Faculty of Psychology, University of Vienna, Vienna, Austria
- Japan Society for the Promotion of Science, Tokyo, Japan
- Faculty of Science and Engineering, Waseda University, Tokyo, Japan
- Katsumi Watanabe
- Faculty of Science and Engineering, Waseda University, Tokyo, Japan
- Faculty of Arts, Design and Architecture, University of New South Wales, Sydney, NSW, Australia
8. Duchenne Smiles of White American College Students in Same-Race and Interracial Interactions. J Nonverbal Behav 2022. [DOI: 10.1007/s10919-021-00393-z]
9. Zhang H, Sun Y. Modulation effect of motor activity on limbic areas: An fMRI study. J Mech Med Biol 2021. [DOI: 10.1142/s0219519421400637]
Abstract
Neural activation of the motor cortex has been consistently reported during the emotional processing of facial expressions, but it is poorly understood whether and how the motor system influences the activity of limbic areas while participants perceive emotional expressions. In this study, we proposed that motor activations evoked by emotional processing influence activations in limbic areas such as the amygdala during the perception of facial expressions. To examine this issue, a masked priming paradigm was adopted in our fMRI experiment, which could modulate activation within the motor cortex while healthy participants perceived sad or happy facial expressions. We found that the first presented stimulus (masked prime) in each trial reduced activations in the premotor cortex and inferior frontal gyrus when the movement of facial muscles implied by the arrows on the prime stimulus was consistent with that implied by the target facial expression (compatible condition), but increased activations in these two areas when the implied movements were inconsistent (incompatible condition). The superior temporal gyrus, middle cingulate gyrus and amygdala showed a response tendency similar to that of the motor cortex. Moreover, psychophysiological interaction (PPI) analysis showed that both the right middle cingulate gyrus and the bilateral superior temporal gyrus were more closely linked to the premotor cortex and inferior frontal gyrus during incompatible than compatible trials. Together with the significant activation correlations between the motor cortex and the limbic areas, these results reveal a modulatory effect of the motor cortex on brain regions related to emotion perception, suggesting that motor representation of facial movements can affect emotional experience. Our results provide new evidence for the functional role of the motor system in the perception of facial emotions and may contribute to understanding the social-interaction deficits of patients with autism or schizophrenia.
Affiliation(s)
- Hong Zhang
- Department of Computer Science and Technology, Taiyuan Normal University, Taiyuan, Shanxi 030619, P. R. China
- Yaoru Sun
- Department of Computer Science and Technology, Tongji University, Shanghai 201804, P. R. China
10. Contagious laughter as a vocalization that elicits facial and electromyographic expressions related to positive emotions in listeners. Acta Colombiana de Psicología 2021. [DOI: 10.14718/acp.2021.24.2.5]
Abstract
Despite its relevance for understanding vocal emotional expression, research on contagious laughter is still in its early stages, and neither its nature nor that of the responses it elicits has been established. With this in mind, the purpose of this study was to determine whether acoustic stimuli of contagious laughter, in addition to generating laughing or smiling behavior, elicit in listeners the facial, electromyographic, and cardiac expressions of a positive emotion. Sixty university students of both sexes, aged 18 to 30 years, took part in a within-subject experimental design with measurements at baseline and during exposure to different contagious-laughter stimuli. Three hypotheses were tested by comparing facial expressions of joy (measured with the FaceReader software), electromyographic (EMG) amplitude of the zygomaticus major muscle (measured with the Biopac EMG-100 module), and R-R intervals as indicators of heart rate (measured with the Biopac ECG-100 module) across conditions. Significant differences were found in the percentages of joyful facial expressions and in zygomaticus EMG amplitude when comparing baseline with the most contagious laughter stimuli, and when comparing more and less contagious laughs; however, no significant differences were found in R-R intervals in any of the comparisons. In conclusion, the study confirmed the positive emotional nature of the laughter/smiling elicited by contagious-laughter stimuli, as well as the proportionality between the intensity of the facial expressions and EMG responses elicited by the laughter and its perceived degree of contagiousness.
11. Kuehne M, Zaehle T, Lobmaier JS. Effects of posed smiling on memory for happy and sad facial expressions. Sci Rep 2021; 11:10477. [PMID: 34006957] [PMCID: PMC8131584] [DOI: 10.1038/s41598-021-89828-7]
Abstract
The perception and storage of facial emotional expressions constitutes an important human skill that is essential for our daily social interactions. While previous research revealed that facial feedback can influence the perception of facial emotional expressions, it is unclear whether facial feedback also plays a role in memory processes of facial emotional expressions. In the present study we investigated the impact of facial feedback on the performance in emotional visual working memory (WM). For this purpose, 37 participants underwent a classical facial feedback manipulation (FFM) (holding a pen with the teeth—inducing a smiling expression vs. holding a pen with the non-dominant hand—as a control condition) while they performed a WM task on varying intensities of happy or sad facial expressions. Results show that the smiling manipulation improved memory performance selectively for happy faces, especially for highly ambiguous facial expressions. Furthermore, we found that in addition to an overall negative bias specifically for happy faces (i.e. happy faces are remembered as more negative than they initially were), FFM induced a positivity bias when memorizing emotional facial information (i.e. faces were remembered as being more positive than they actually were). Finally, our data demonstrate that men were affected more by FFM: during induced smiling men showed a larger positive bias than women did. These data demonstrate that facial feedback not only influences our perception but also systematically alters our memory of facial emotional expressions.
Affiliation(s)
- Maria Kuehne
- Department of Social Neuroscience and Social Psychology, Institute of Psychology, University of Bern, Bern, Switzerland; Department of Neurology, Otto-von-Guericke-University Magdeburg, Leipziger Straße 44, 39120 Magdeburg, Germany
- Tino Zaehle
- Department of Neurology, Otto-von-Guericke-University Magdeburg, Leipziger Straße 44, 39120 Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany
- Janek S Lobmaier
- Department of Social Neuroscience and Social Psychology, Institute of Psychology, University of Bern, Bern, Switzerland
12. Lima CF, Arriaga P, Anikin A, Pires AR, Frade S, Neves L, Scott SK. Authentic and posed emotional vocalizations trigger distinct facial responses. Cortex 2021; 141:280-292. [PMID: 34102411] [DOI: 10.1016/j.cortex.2021.04.015]
Abstract
The ability to recognize the emotions of others is a crucial skill. In the visual modality, sensorimotor mechanisms provide an important route for emotion recognition. Perceiving facial expressions often evokes activity in facial muscles and in motor and somatosensory systems, and this activity relates to performance in emotion tasks. It remains unclear whether and how similar mechanisms extend to audition. Here we examined facial electromyographic and electrodermal responses to nonverbal vocalizations that varied in emotional authenticity. Participants (N = 100) passively listened to laughs and cries that could reflect an authentic or a posed emotion. Bayesian mixed models indicated that listening to laughter evoked stronger facial responses than listening to crying. These responses were sensitive to emotional authenticity. Authentic laughs evoked more activity than posed laughs in the zygomaticus and orbicularis, muscles typically associated with positive affect. We also found that activity in the orbicularis and corrugator related to subjective evaluations in a subsequent authenticity perception task. Stronger responses in the orbicularis predicted higher perceived laughter authenticity. Stronger responses in the corrugator, a muscle associated with negative affect, predicted lower perceived laughter authenticity. Moreover, authentic laughs elicited stronger skin conductance responses than posed laughs. This arousal effect did not predict task performance, however. For crying, physiological responses were not associated with authenticity judgments. Altogether, these findings indicate that emotional authenticity affects peripheral nervous system responses to vocalizations. They also point to a role of sensorimotor mechanisms in the evaluation of authenticity in the auditory modality.
Affiliation(s)
- César F Lima
- Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal; Institute of Cognitive Neuroscience, University College London, London, UK
- Andrey Anikin
- Equipe de Neuro-Ethologie Sensorielle (ENES)/Centre de Recherche en Neurosciences de Lyon (CRNL), University of Lyon/Saint-Etienne, CNRS UMR5292, INSERM UMR_S 1028, Saint-Etienne, France; Division of Cognitive Science, Lund University, Lund, Sweden
- Ana Rita Pires
- Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal
- Sofia Frade
- Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal
- Leonor Neves
- Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal
- Sophie K Scott
- Institute of Cognitive Neuroscience, University College London, London, UK
13. Nasir J, Bruno B, Chetouani M, Dillenbourg P. What if Social Robots Look for Productive Engagement? Int J Soc Robot 2021. [DOI: 10.1007/s12369-021-00766-w]
Abstract
In educational HRI, it is generally believed that a robot's behavior has a direct effect on the user's engagement with the robot, with the task at hand, and with their partner in the case of a collaborative activity. Increasing this engagement is then held responsible for increased learning and productivity. The state of the art usually investigates the relationship between the robot's behaviors and the user's engagement state while assuming a linear relationship between engagement and the end goal: learning. However, is it correct to assume that to maximise learning, one needs to maximise engagement? Furthermore, conventional supervised models of engagement require human annotators to obtain labels. This is not only laborious but also introduces further subjectivity into an already subjective construct of engagement. Can we have machine-learning models for engagement detection whose annotations do not rely on human annotators? Looking deeper at the behavioral patterns, the learning outcomes, and a performance metric in a multi-modal data set collected in an educational human–human–robot setup with 68 students, we observe a hidden link that we term Productive Engagement. We theorize that a robot incorporating this knowledge will (1) distinguish teams based on engagement that is conducive to learning; and (2) adopt behaviors that eventually lead the users to increased learning by means of being productively engaged. Furthermore, this seminal link paves the way for machine-learning models in educational HRI with automatic labelling based on the data.
14. Soundirarajan M, Pakniyat N, Sim S, Nathan V, Namazi H. Information-based analysis of the relationship between brain and facial muscle activities in response to static visual stimuli. Technol Health Care 2021; 29:99-109. [DOI: 10.3233/thc-192085]
Abstract
BACKGROUND: Human facial muscles react differently to different visual stimuli, and it is known that the human brain controls and regulates muscle activity. OBJECTIVE: In this research, for the first time, we investigate how the facial muscle reaction is related to the reaction of the human brain. METHODS: Since both electromyography (EMG) and electroencephalography (EEG) signals, as measures of muscle and brain activity, carry information, we drew on information theory and computed the Shannon entropy of EMG and EEG signals while subjects were exposed to static visual stimuli with different Shannon entropies (information content). RESULTS: Based on the obtained results, variations in the information content of the EMG signal are related to variations in the information content of the EEG signal and of the visual stimuli. Statistical analysis also supported these results, indicating that visual stimuli with greater information content have a greater effect on the variation of the information content of both EEG and EMG signals. CONCLUSION: This investigation can be extended to analyze the relationship between facial muscle and brain reactions for other types of stimuli.
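The Shannon entropy of a signal can be estimated from a normalised histogram of its amplitude values, H = -sum(p_i * log2 p_i). The sketch below applies this to simulated signals; the bin count and example signals are arbitrary choices for illustration, not the study's settings.

```python
import numpy as np

def shannon_entropy(signal, n_bins=64):
    """Shannon entropy (in bits) of a signal's amplitude distribution,
    estimated from a normalised histogram."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # ignore empty bins (0 * log 0 = 0)
    return -np.sum(p * np.log2(p))

# Example: a structured (sinusoidal) signal vs. white noise of equal length
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 2000)
sine = np.sin(2 * np.pi * 10 * t)
noise = rng.normal(size=t.size)
print(f"sine: {shannon_entropy(sine):.2f} bits, noise: {shannon_entropy(noise):.2f} bits")
```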
Affiliation(s)
- Sue Sim
- School of Engineering, Monash University, Selangor, Malaysia
- Visvamba Nathan
- School of Engineering, Monash University, Selangor, Malaysia
15. Bowdring MA, Sayette MA, Girard JM, Woods WC. In the Eye of the Beholder: A Comprehensive Analysis of Stimulus Type, Perceiver, and Target in Physical Attractiveness Perceptions. J Nonverbal Behav 2021. [DOI: 10.1007/s10919-020-00350-2]
16. Holland AC, O’Connell G, Dziobek I. Facial mimicry, empathy, and emotion recognition: a meta-analysis of correlations. Cogn Emot 2020; 35:150-168. [DOI: 10.1080/02699931.2020.1815655]
Affiliation(s)
- Alison C. Holland
- Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
- Garret O’Connell
- Berlin School of Mind and Brain, Institute of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Isabel Dziobek
- Berlin School of Mind and Brain, Institute of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
17. Arnold AJ, Winkielman P. Smile (but only deliberately) though your heart is aching: Loneliness is associated with impaired spontaneous smile mimicry. Soc Neurosci 2020; 16:26-38. [PMID: 32835612] [DOI: 10.1080/17470919.2020.1809516]
Abstract
As social beings, humans harbor an evolved capacity for loneliness - perceived social isolation. Loneliness is associated with atypical affective and social processing, as well as physiological dysregulation. We investigated how loneliness influences spontaneous facial mimicry (SFM), an interpersonal response involved in social connection and emotional contagion. We presented participants with emotional stimuli, such as video clips of actors expressing anger, fear, sadness, or joy, and emotional IAPS images. We measured participants' zygomaticus major ("smiling") muscle and their corrugator supercilii ("frowning") muscle with facial electromyography (fEMG). We also measured self-reported loneliness, depression, and extraversion levels. For socially connected individuals we found intact SFM, as reflected in greater fEMG activity of the zygomaticus and corrugator to positive and negative expressions, respectively. However, individuals reporting higher levels of loneliness lacked SFM for expressions of joy. Loneliness did not impair deliberate mimicry activity to the same expressions, or spontaneous reactions to positive, negative, or neutral IAPS images. Depression and extraversion did not predict any differences in fEMG responses. We suggest that impairments in spontaneous "smiling back" at another - a decreased interpersonal resonance - could contribute to negative social and emotional consequences of loneliness and may facilitate loneliness contagion.
Affiliation(s)
- Andrew J Arnold
- Department of Psychology, University of California, San Diego, La Jolla, CA, USA
- Piotr Winkielman
- Department of Psychology, University of California, San Diego, La Jolla, CA, USA; Department of Psychology, SWPS University of Social Sciences and Humanities, Warsaw, Poland
18. FaReT: A free and open-source toolkit of three-dimensional models and software to study face perception. Behav Res Methods 2020; 52:2604-2622. [PMID: 32519291] [DOI: 10.3758/s13428-020-01421-4]
Abstract
A problem in the study of face perception is that results can be confounded by poor stimulus control. Ideally, experiments should precisely manipulate facial features under study and tightly control irrelevant features. Software for 3D face modeling provides such control, but there is a lack of free and open source alternatives specifically created for face perception research. Here, we provide such tools by expanding the open-source software MakeHuman. We present a database of 27 identity models and six expression pose models (sadness, anger, happiness, disgust, fear, and surprise), together with software to manipulate the models in ways that are common in the face perception literature, allowing researchers to: (1) create a sequence of renders from interpolations between two or more 3D models (differing in identity, expression, and/or pose), resulting in a "morphing" sequence; (2) create renders by extrapolation in a direction of face space, obtaining 3D "anti-faces" and caricatures; (3) obtain videos of dynamic faces from rendered images; (4) obtain average face models; (5) standardize a set of models so that they differ only in selected facial shape features, and (6) communicate with experiment software (e.g., PsychoPy) to render faces dynamically online. These tools vastly improve both the speed at which face stimuli can be produced and the level of control that researchers have over face stimuli. We validate the face model database and software tools through a small study on human perceptual judgments of stimuli produced with the toolkit.
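The morphing and caricaturing operations described above amount to interpolation and extrapolation in a model parameter space. The toy sketch below illustrates the idea on small numeric vectors; the parameter dimensions and values are hypothetical and unrelated to the actual MakeHuman model parameters used by FaReT.

```python
import numpy as np

def morph_sequence(start_params, end_params, n_steps=5):
    """Linearly interpolate between two model parameter vectors (e.g., two
    identities or expression poses), returning one vector per morph step."""
    start = np.asarray(start_params, dtype=float)
    end = np.asarray(end_params, dtype=float)
    alphas = np.linspace(0.0, 1.0, n_steps)
    return [(1 - a) * start + a * end for a in alphas]

def caricature(average_params, target_params, strength=1.5):
    """Extrapolate past a target face along the average-to-target direction;
    strength > 1 exaggerates the face, strength < 0 yields an 'anti-face'."""
    avg = np.asarray(average_params, dtype=float)
    tgt = np.asarray(target_params, dtype=float)
    return avg + strength * (tgt - avg)

# Example with toy 4-dimensional shape parameters
neutral = [0.0, 0.0, 0.0, 0.0]
happy = [0.8, 0.2, 0.0, 0.5]
for step in morph_sequence(neutral, happy, n_steps=3):
    print(step)
print(caricature(neutral, happy, strength=1.5))
```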
19. Gino F, Sezer O, Huang L. To be or not to be your authentic self? Catering to others’ preferences hinders performance. Organ Behav Hum Decis Process 2020. [DOI: 10.1016/j.obhdp.2020.01.003]
20. Palagi E, Celeghin A, Tamietto M, Winkielman P, Norscia I. The neuroethology of spontaneous mimicry and emotional contagion in human and non-human animals. Neurosci Biobehav Rev 2020; 111:149-165. [DOI: 10.1016/j.neubiorev.2020.01.020]
21. Minio-Paluello I, Porciello G, Gandolfo M, Boukarras S, Aglioti SM. The enfacement illusion boosts facial mimicry. Cortex 2020; 123:113-123. [DOI: 10.1016/j.cortex.2019.10.001]
22. Facial responses of adult humans during the anticipation and consumption of touch and food rewards. Cognition 2020; 194:104044. [DOI: 10.1016/j.cognition.2019.104044]
23. Lin XX, Sun YB, Wang YZ, Fan L, Wang X, Wang N, Luo F, Wang JY. Ambiguity Processing Bias Induced by Depressed Mood Is Associated with Diminished Pleasantness. Sci Rep 2019; 9:18726. [PMID: 31822749] [PMCID: PMC6904491] [DOI: 10.1038/s41598-019-55277-6]
Abstract
Depressed individuals are biased to perceive, interpret, and judge ambiguous cues in a negative/pessimistic manner. Depressed mood can induce and exacerbate these biases, but the underlying mechanisms are not fully understood. We theorize that depressed mood can bias ambiguity processing by altering one's subjective emotional feelings (e.g. pleasantness/unpleasantness) of the cues. This is because when there is limited objective information, individuals often rely on subjective feelings as a source of information for cognitive processing. To test this theory, three groups (induced depression vs. spontaneous depression vs. neutral) were tested in the Judgement Bias Task (JBT), a behavioral assay of ambiguity processing bias. Subjective pleasantness/unpleasantness of cues was measured by facial electromyography (EMG) from the zygomaticus major (ZM, "smiling") and from the corrugator supercilii (CS, "frowning") muscles. As predicted, induced sad mood (vs. neutral mood) yielded a negative bias with a magnitude comparable to that in a spontaneous depressed mood. The facial EMG data indicates that the negative judgement bias induced by depressed mood was associated with a decrease in ZM reactivity (i.e., diminished perceived pleasantness of cues). Our results suggest that depressed mood may bias ambiguity processing by affecting the reward system.
Affiliation(s)
- Xiao-Xiao Lin
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Ya-Bin Sun
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yu-Zheng Wang
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Lu Fan
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Xin Wang
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Sino-Danish Center for Education and Research, Beijing, China
- Ning Wang
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Fei Luo
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Jin-Yan Wang
- CAS Key Laboratory of Mental Health, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
24. The Mimicry Among Us: Intra- and Inter-Personal Mechanisms of Spontaneous Mimicry. J Nonverbal Behav 2019. [DOI: 10.1007/s10919-019-00324-z]
Abstract
This review explores spontaneous mimicry in the context of three questions. The first question concerns the role of spontaneous mimicry in processing conceptual information. The second question concerns the debate over whether spontaneous mimicry is driven by simple associative processes or reflects higher-order processes such as goals, intentions, and social context. The third question addresses the implications of these debates for understanding atypical individuals and states. We review relevant literature and argue for a dynamic, context-sensitive role of spontaneous mimicry in social cognition and behavior. We highlight how the modulation of mimicry is often adaptive but also point out some cases of maladaptive modulations that impair an individual's engagement in social life.
25. Kowallik AE, Schweinberger SR. Sensor-Based Technology for Social Information Processing in Autism: A Review. Sensors (Basel, Switzerland) 2019; 19:E4787. [PMID: 31689906] [PMCID: PMC6864871] [DOI: 10.3390/s19214787]
Abstract
The prevalence of autism spectrum disorders (ASD) has increased strongly over the past decades, and so has the demand for adequate behavioral assessment and support for persons affected by ASD. Here we provide a review on original research that used sensor technology for an objective assessment of social behavior, either with the aim to assist the assessment of autism or with the aim to use this technology for intervention and support of people with autism. Considering rapid technological progress, we focus (1) on studies published within the last 10 years (2009-2019), (2) on contact- and irritation-free sensor technology that does not constrain natural movement and interaction, and (3) on sensory input from the face, the voice, or body movements. We conclude that sensor technology has already demonstrated its great potential for improving both behavioral assessment and interventions in autism spectrum disorders. We also discuss selected examples for recent theoretical questions related to the understanding of psychological changes and potentials in autism. In addition to its applied potential, we argue that sensor technology-when implemented by appropriate interdisciplinary teams-may even contribute to such theoretical issues in understanding autism.
Affiliation(s)
- Andrea E Kowallik
- Early Support and Counselling Center Jena, Herbert Feuchte Stiftungsverbund, 07743 Jena, Germany
- Social Potential in Autism Research Unit, Friedrich Schiller University, 07743 Jena, Germany
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Am Steiger 3/Haus 1, 07743 Jena, Germany
- Stefan R Schweinberger
- Early Support and Counselling Center Jena, Herbert Feuchte Stiftungsverbund, 07743 Jena, Germany
- Social Potential in Autism Research Unit, Friedrich Schiller University, 07743 Jena, Germany
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Am Steiger 3/Haus 1, 07743 Jena, Germany
- Michael Stifel Center Jena for Data-Driven and Simulation Science, Friedrich Schiller University, 07743 Jena, Germany
- Swiss Center for Affective Science, University of Geneva, 1202 Geneva, Switzerland
26. Improvement of Radial Cheek Lines With Hyaluronic Acid-Based Dermal Filler VYC-17.5L: Results of the BEAM Study. Dermatol Surg 2019; 46:376-385. [PMID: 31449079] [DOI: 10.1097/dss.0000000000002057]
Abstract
BACKGROUND Radial cheek lines (RCL) may convey an older, potentially less attractive appearance. OBJECTIVE To evaluate the effectiveness of hyaluronic acid-based dermal filler VYC-17.5L for correcting RCL. MATERIALS AND METHODS Fifty-three women (40-65 years) received injections of VYC-17.5L in both cheeks on Day 1 (optional Day 14 touch-up). Effectiveness was evaluated on Day 45 by subject-rated dynamic RCL improvement (Global Aesthetic Improvement Scale [GAIS]; primary end point) and independent, noninjecting investigator-rated GAIS; subject Self-Perception of Age (SPA); subject-assessed satisfaction with and natural look of treatment; and instrument-assessed changes in static and dynamic RCL roughness, amplitude, and texture (secondary end points). Safety assessments included injection site responses (ISRs). RESULTS On Day 45, 98% of subjects rated RCL as improved or much improved (investigator rated: 95%). Subjects with same or older SPA before treatment (n = 38) perceived themselves as 2.0 and 5.5 average years younger after treatment, respectively. Day 45 mean satisfaction with and natural look of treated areas was 7.9/10 and 7.2/10, respectively. Treatment significantly improved RCL roughness, amplitude, and texture (all p < .001). Most common ISRs were hematoma (35.9%), bruising (30.2%), and irregularities/bumps (22.6%); most ISRs were mild. CONCLUSION VYC-17.5L effectively corrected dynamic RCL, improved instrument-assessed indicators of skin quality, and resulted in younger age perception.
27. Haines N, Bell Z, Crowell S, Hahn H, Kamara D, McDonough-Caplan H, Shader T, Beauchaine TP. Using automated computer vision and machine learning to code facial expressions of affect and arousal: Implications for emotion dysregulation research. Dev Psychopathol 2019; 31:871-886. [PMID: 30919792] [PMCID: PMC7319037] [DOI: 10.1017/s0954579419000312]
Abstract
As early as infancy, caregivers' facial expressions shape children's behaviors, help them regulate their emotions, and encourage or dissuade their interpersonal agency. In childhood and adolescence, proficiencies in producing and decoding facial expressions promote social competence, whereas deficiencies characterize several forms of psychopathology. To date, however, studying facial expressions has been hampered by the labor-intensive, time-consuming nature of human coding. We describe a partial solution: automated facial expression coding (AFEC), which combines computer vision and machine learning to code facial expressions in real time. Although AFEC cannot capture the full complexity of human emotion, it codes positive affect, negative affect, and arousal (core Research Domain Criteria constructs) as accurately as humans, and it characterizes emotion dysregulation with greater specificity than other objective measures such as autonomic responding. We provide an example in which we use AFEC to evaluate emotion dynamics in mother-daughter dyads engaged in conflict. Among other findings, AFEC (a) shows convergent validity with a validated human coding scheme, (b) distinguishes among risk groups, and (c) detects developmental increases in positive dyadic affect correspondence as teen daughters age. Although more research is needed to realize the full potential of AFEC, findings demonstrate its current utility in research on emotion dysregulation.
Affiliation(s)
- Nathaniel Haines
- Department of Psychology, Ohio State University, Columbus, OH, USA
- Ziv Bell
- Department of Psychology, Ohio State University, Columbus, OH, USA
- Sheila Crowell
- Department of Psychology, University of Utah, Salt Lake City, UT, USA
- Department of Psychiatry, University of Utah, Salt Lake City, UT, USA
- Hunter Hahn
- Department of Psychology, Ohio State University, Columbus, OH, USA
- Dana Kamara
- Department of Psychology, Ohio State University, Columbus, OH, USA
- Tiffany Shader
- Department of Psychology, Ohio State University, Columbus, OH, USA
28. Korb S, Goldman R, Davidson RJ, Niedenthal PM. Increased Medial Prefrontal Cortex and Decreased Zygomaticus Activation in Response to Disliked Smiles Suggest Top-Down Inhibition of Facial Mimicry. Front Psychol 2019; 10:1715. [PMID: 31402888] [PMCID: PMC6677088] [DOI: 10.3389/fpsyg.2019.01715]
Abstract
Spontaneous facial mimicry is modulated by many factors, and often needs to be suppressed to comply with social norms. The neural basis for the inhibition of facial mimicry was investigated in a combined functional magnetic resonance imaging and electromyography study in 39 healthy participants. In an operant conditioning paradigm, face identities were associated with reward or punishment and were later shown expressing dynamic smiles and anger expressions. Face identities previously associated with punishment, compared to reward, were disliked by participants overall, and their smiles generated less mimicry. Consistent with previous research on the inhibition of finger/hand movements, the medial prefrontal cortex (mPFC) was activated when previous conditioning was incongruent with the valence of the expression. On such trials there was also greater functional connectivity of the mPFC with insula and premotor cortex as tested with psychophysiological interaction, suggesting inhibition of areas associated with the production of facial mimicry and the processing of facial feedback. The findings suggest that the mPFC supports the inhibition of facial mimicry, and support the claim of theories of embodied cognition that facial mimicry constitutes a spontaneous low-level motor imitation.
Collapse
Affiliation(s)
- Sebastian Korb
- Department of Applied Psychology: Health, Development, Enhancement and Intervention, Faculty of Psychology, University of Vienna, Vienna, Austria; Department of Psychology, University of Wisconsin-Madison, Madison, WI, United States
| | - Robin Goldman
- Center for Healthy Minds, University of Wisconsin-Madison, Madison, WI, United States
| | - Richard J Davidson
- Department of Psychology, University of Wisconsin-Madison, Madison, WI, United States; Center for Healthy Minds, University of Wisconsin-Madison, Madison, WI, United States
| | - Paula M Niedenthal
- Department of Psychology, University of Wisconsin-Madison, Madison, WI, United States
| |
Collapse
|
29
|
Haines N, Southward MW, Cheavens JS, Beauchaine T, Ahn WY. Using computer-vision and machine learning to automate facial coding of positive and negative affect intensity. PLoS One 2019; 14:e0211735. [PMID: 30721270 PMCID: PMC6363175 DOI: 10.1371/journal.pone.0211735] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2018] [Accepted: 01/18/2019] [Indexed: 11/26/2022] Open
Abstract
Facial expressions are fundamental to interpersonal communication, including social interaction, and allow people of different ages, cultures, and languages to quickly and reliably convey emotional information. Historically, facial expression research has followed from discrete emotion theories, which posit a limited number of distinct affective states that are represented with specific patterns of facial action. Much less work has focused on dimensional features of emotion, particularly positive and negative affect intensity. This is likely, in part, because achieving inter-rater reliability for facial action and affect intensity ratings is painstaking and labor-intensive. We use computer vision and machine learning (CVML) to identify patterns of facial actions in 4,648 video recordings of 125 human participants, patterns that show strong correspondences to positive and negative affect intensity ratings obtained from highly trained coders. Our results show that CVML can both (1) determine the importance of different facial actions that human coders use to derive positive and negative affective ratings when combined with interpretable machine learning methods, and (2) efficiently automate positive and negative affect intensity coding on large facial expression databases. Further, we show that CVML can be applied to individual human judges to infer which facial actions they use to generate perceptual emotion ratings from facial expressions.
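One way to approximate the "importance of different facial actions" analysis is to fit an interpretable regressor on action-unit features and inspect its importances. A minimal sketch with simulated data; the action-unit set, the simulated ratings, and the choice of a random forest are illustrative assumptions rather than the authors' exact method:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
# Illustrative FACS action units: AU6 cheek raiser, AU12 lip corner puller,
# AU4 brow lowerer, AU15 lip corner depressor.
au_names = ["AU6", "AU12", "AU4", "AU15"]
X = rng.uniform(0, 5, size=(1000, len(au_names)))                      # simulated AU intensities per clip
y = 0.8 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=1000)   # affect driven by AU6/AU12

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(au_names, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: importance = {imp:.2f}")
```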
Collapse
Affiliation(s)
- Nathaniel Haines
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Matthew W. Southward
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Jennifer S. Cheavens
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Theodore Beauchaine
- Department of Psychology, The Ohio State University, Columbus, Ohio, United States of America
| | - Woo-Young Ahn
- Department of Psychology, Seoul National University, Seoul, Korea
| |
Collapse
|
30
|
Slepian ML, Carr EW. Facial expressions of authenticity: Emotion variability increases judgments of trustworthiness and leadership. Cognition 2018; 183:82-98. [PMID: 30445313 DOI: 10.1016/j.cognition.2018.10.009] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2017] [Revised: 10/10/2018] [Accepted: 10/11/2018] [Indexed: 01/10/2023]
Abstract
People automatically generate first impressions from others' faces, even with limited time and information. Most research on social face evaluation focuses on static morphological features that are embedded "in the face" (e.g., overall average of facial features, masculinity/femininity, cues related to positivity/negativity, etc.). Here, we offer the first investigation of how variability in facial emotion affects social evaluations. Participants evaluated targets that, over time, displayed either high-variability or low-variability distributions of positive (happy) and/or negative (angry/fearful/sad) facial expressions, despite the overall averages of those facial features always being the same across conditions. We found that high variability led to consistently positive perceptions of authenticity and, thereby, to more positive judgments of perceived happiness, trustworthiness, leadership, and team-member desirability. These effects were driven specifically by variability in emotional displays (not by emotional intensity) and specifically increased the positivity of social judgments (not their extremity). Overall, people do not merely average or summarize over facial expressions to arrive at a judgment, but instead also draw inferences from the variability of those expressions.
Collapse
|
31
|
Zeng X, Wu Q, Zhang S, Liu Z, Zhou Q, Zhang M. A False Trail to Follow: Differential Effects of the Facial Feedback Signals From the Upper and Lower Face on the Recognition of Micro-Expressions. Front Psychol 2018; 9:2015. [PMID: 30405497 PMCID: PMC6208096 DOI: 10.3389/fpsyg.2018.02015] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2018] [Accepted: 10/01/2018] [Indexed: 01/24/2023] Open
Abstract
Micro-expressions, as fleeting facial expressions, are very important for judging people's true emotions and can thus provide an essential behavioral cue for detecting lies and dangerous demeanor. From embodied accounts of cognition, we derived a novel hypothesis that facial feedback from the upper and lower facial regions has differential effects on micro-expression recognition. This hypothesis was tested and supported across three studies. Specifically, the results of Study 1 showed that people became better judges of intense micro-expressions with a duration of 450 ms when facial feedback from the upper face was enhanced via a restricting gel. Additional results of Study 2 showed that the recognition accuracy of subtle micro-expressions was significantly impaired under all duration conditions (50, 150, 333, and 450 ms) when facial feedback from the lower face was enhanced. In addition, the results of Study 3 revealed that blocking the facial feedback of the lower face significantly boosted the recognition accuracy of subtle and intense micro-expressions under both duration conditions (150 and 450 ms). Together, these results highlight the role of facial feedback in judging the subtle movements of micro-expressions.
Collapse
Affiliation(s)
- Xuemei Zeng
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
| | - Qi Wu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
| | - Siwei Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
| | - Zheying Liu
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
| | - Qing Zhou
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
| | - Meishan Zhang
- Cognition and Human Behavior Key Laboratory of Hunan Province, Department of Psychology, Hunan Normal University, Changsha, China
| |
Collapse
|
32
|
Paracampo R, Pirruccio M, Costa M, Borgomaneri S, Avenanti A. Visual, sensorimotor and cognitive routes to understanding others' enjoyment: An individual differences rTMS approach to empathic accuracy. Neuropsychologia 2018; 116:86-98. [DOI: 10.1016/j.neuropsychologia.2018.01.043] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2017] [Revised: 01/15/2018] [Accepted: 01/31/2018] [Indexed: 01/26/2023]
|
33
|
Orlowska AB, Krumhuber EG, Rychlowska M, Szarota P. Dynamics Matter: Recognition of Reward, Affiliative, and Dominance Smiles From Dynamic vs. Static Displays. Front Psychol 2018; 9:938. [PMID: 29942274 PMCID: PMC6004382 DOI: 10.3389/fpsyg.2018.00938] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2018] [Accepted: 05/22/2018] [Indexed: 11/13/2022] Open
Abstract
Smiles are distinct and easily recognizable facial expressions, yet they markedly differ in their meanings. According to a recent theoretical account, smiles can be classified based on three fundamental social functions which they serve: expressing positive affect and rewarding self and others (reward smile), creating and maintaining social bonds (affiliative smile), and negotiating social status (dominance smile) (Niedenthal et al., 2010; Martin et al., 2017). While there is evidence for distinct morphological features of these smiles, their categorization in human faces is only beginning to be investigated. Moreover, the factors influencing this process, such as facial mimicry or display mode, remain largely unknown. In the present study, we examine the recognition of reward, affiliative, and dominance smiles in static and dynamic portrayals, and explore how interfering with facial mimicry affects such classification. Participants (N = 190) were presented with either static or dynamic displays of the three smile types, whilst their ability to mimic was free or restricted via a pen-in-mouth procedure. For each stimulus they rated the extent to which the expression represents a reward, an affiliative, or a dominance smile. Higher than chance accuracy rates revealed that participants were generally able to differentiate between the three smile types. In line with our predictions, recognition performance was lower in the static than the dynamic condition, but this difference was only significant for affiliative smiles. No significant effects of facial muscle restriction were observed, suggesting that the ability to mimic might not be necessary for the distinction between the three functional smiles. Together, our findings support previous evidence on reward, affiliative, and dominance smiles by documenting their perceptual distinctiveness. They also replicate extant observations on the dynamic advantage in expression perception and suggest that this effect may be especially pronounced for ambiguous facial expressions, such as affiliative smiles.
Collapse
Affiliation(s)
- Anna B Orlowska
- Institute of Psychology, Polish Academy of Sciences, Warsaw, Poland
| | - Eva G Krumhuber
- Department of Experimental Psychology, University College London, London, United Kingdom
| | | | - Piotr Szarota
- Institute of Psychology, Polish Academy of Sciences, Warsaw, Poland
| |
Collapse
|
34
|
Argaud S, Vérin M, Sauleau P, Grandjean D. Facial emotion recognition in Parkinson's disease: A review and new hypotheses. Mov Disord 2018; 33:554-567. [PMID: 29473661 PMCID: PMC5900878 DOI: 10.1002/mds.27305] [Citation(s) in RCA: 114] [Impact Index Per Article: 19.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2017] [Revised: 12/21/2017] [Accepted: 12/22/2017] [Indexed: 02/02/2023] Open
Abstract
Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional-processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia-based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society.
Collapse
Affiliation(s)
- Soizic Argaud
- Behavior and Basal Ganglia Research Unit (EA4712), University of Rennes 1, Rennes, France
- Neuroscience of Emotion and Affective Dynamics laboratory, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
| | - Marc Vérin
- Behavior and Basal Ganglia Research Unit (EA4712), University of Rennes 1, Rennes, France
- Department of Neurology, Rennes University Hospital, Rennes, France
| | - Paul Sauleau
- Behavior and Basal Ganglia Research Unit (EA4712), University of Rennes 1, Rennes, France
- Department of Neurophysiology, Rennes University Hospital, Rennes, France
| | - Didier Grandjean
- Neuroscience of Emotion and Affective Dynamics laboratory, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
- Swiss Center for Affective Sciences, Campus Biotech, Geneva, Switzerland
| |
Collapse
|
35
|
Kawakami K, Friesen J, Vingilis-Jaremko L. Visual attention to members of own and other groups: Preferences, determinants, and consequences. Soc Personal Psychol Compass 2018. [DOI: 10.1111/spc3.12380] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/21/2022]
|
36
|
Juslin PN, Harmat L, Laukka P. The wisdom of the body: Listeners' autonomic arousal distinguishes between spontaneous and posed vocal emotions. Scand J Psychol 2018; 59:105-112. [PMID: 29411386 DOI: 10.1111/sjop.12429] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/05/2017] [Accepted: 12/19/2017] [Indexed: 11/26/2022]
Abstract
It has been a matter of much debate whether perceivers are able to distinguish spontaneous vocal expressions of emotion from posed vocal expressions (e.g., emotion portrayals). In this experiment, we show that such discrimination can be manifested in the autonomic arousal of listeners during implicit processing of vocal emotions. Participants (N = 21, age: 20-55 years) listened to two consecutive blocks of brief voice clips and judged the gender of the speaker in each clip, while we recorded three measures of sympathetic arousal of the autonomic nervous system (skin conductance level, mean arterial blood pressure, pulse rate). Unbeknownst to the listeners, the blocks consisted of two types of emotional speech: spontaneous and posed clips. As predicted, spontaneous clips yielded higher arousal levels than posed clips, suggesting that listeners implicitly distinguished between the two kinds of expression, even in the absence of any requirement to retrieve emotional information from the voice. We discuss the results with regard to theories of emotional contagion and the use of posed stimuli in studies of emotions.
Collapse
Affiliation(s)
| | | | - Petri Laukka
- Uppsala University, Uppsala, Sweden; Stockholm University, Stockholm, Sweden
| |
Collapse
|
37
|
Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions. Front Psychol 2018; 9:52. [PMID: 29467691 PMCID: PMC5807922 DOI: 10.3389/fpsyg.2018.00052] [Citation(s) in RCA: 33] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2017] [Accepted: 01/12/2018] [Indexed: 11/13/2022] Open
Abstract
Facial mimicry (FM) is an automatic response to imitate the facial expressions of others. However, the neural correlates of the phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in the ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., the inferior frontal gyrus, part of the classical mirror neuron system (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., the insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions.
Collapse
Affiliation(s)
- Krystyna Rymarczyk
- Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
| | - Łukasz Żurawski
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
| | - Kamila Jankowiak-Siuda
- Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, SWPS University of Social Sciences and Humanities, Warsaw, Poland
| | - Iwona Szatkowska
- Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
| |
Collapse
|
38
|
Busin Y, Lukasova K, Asthana MK, Macedo EC. Hemiface Differences in Visual Exploration Patterns When Judging the Authenticity of Facial Expressions. Front Psychol 2018; 8:2332. [PMID: 29367851 PMCID: PMC5767895 DOI: 10.3389/fpsyg.2017.02332] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2017] [Accepted: 12/21/2017] [Indexed: 11/29/2022] Open
Abstract
Past studies have found asymmetry biases in human emotion recognition. The left side bias refers to preferential looking at the left hemiface when actively exploring face images. However, these studies have been mainly conducted with static and frontally oriented stimuli, whereas real-life emotion recognition takes place on dynamic faces viewed from different angles. The aim of this study was to assess the judgment of genuine vs. masked expressions in dynamic movie clips of faces rotated to the right or left side. Forty-eight participants judged the expressions on faces displaying genuine or masked happy, sad, and fearful emotions. The head of the actor was either rotated to the left by a 45° angle, thus showing the left side of the face (standard orientation), or inverted, with the same face shown from the right side perspective. Eye movements were recorded with an eye tracker, and the data were analyzed for the inverse efficiency score (IES), the number of fixations, and gaze time on the whole face and in the regions of interest. Results showed shorter IESs and gaze times for happy compared to sad and fearful emotions, but no difference was found for these variables between sad and fearful emotions. The left side preference was evident from comparisons of the number of fixations. Standard stimuli received a higher number of fixations than inverted ones; however, gaze time was longer on inverted compared to standard faces. The number of fixations on the exposed hemiface interacted with emotion, decreasing from happy to sad and fearful; an opposite pattern was found for the occluded hemiface. These results suggest a change in fixation patterns in the rotated faces that may be beneficial for the judgments of expressions. Furthermore, this study replicated the effects of the judgment of genuine and masked emotions using dynamic faces.
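The inverse efficiency score used here is conventionally computed as the mean response time on correct trials divided by the proportion of correct responses, so that slower or less accurate conditions both raise the score. A small helper with made-up trial data (the numbers are purely illustrative):

```python
import numpy as np

def inverse_efficiency(rt_ms, correct):
    """IES = mean RT on correct trials / proportion of correct trials."""
    rt_ms = np.asarray(rt_ms, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    return rt_ms[correct].mean() / correct.mean()

# Hypothetical trials for one participant in one condition:
# three correct responses (620, 580, 710 ms) and one error.
print(inverse_efficiency([620, 580, 710, 655], [True, True, True, False]))  # ~849 ms
```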
Collapse
Affiliation(s)
- Yuri Busin
- Social and Cognitive Neuroscience Laboratory and Developmental Disorders Program, Center for Health and Biological Sciences, Mackenzie Presbyterian University, São Paulo, Brazil
| | - Katerina Lukasova
- Social and Cognitive Neuroscience Laboratory and Developmental Disorders Program, Center for Health and Biological Sciences, Mackenzie Presbyterian University, São Paulo, Brazil; Center of Mathematics, Computation and Cognition, Federal University of ABC (UFABC), São Bernardo, Brazil
| | - Manish K Asthana
- Department of Humanities and Social Sciences, Indian Institute of Technology Kanpur, Kanpur, India
| | - Elizeu C Macedo
- Social and Cognitive Neuroscience Laboratory and Developmental Disorders Program, Center for Health and Biological Sciences, Mackenzie Presbyterian University, São Paulo, Brazil
| |
Collapse
|
39
|
Marshall CR, Hardy CJD, Russell LL, Clark CN, Bond RL, Dick KM, Brotherhood EV, Mummery CJ, Schott JM, Rohrer JD, Kilner JM, Warren JD. Motor signatures of emotional reactivity in frontotemporal dementia. Sci Rep 2018; 8:1030. [PMID: 29348485 PMCID: PMC5773553 DOI: 10.1038/s41598-018-19528-2] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/27/2017] [Accepted: 01/04/2018] [Indexed: 11/18/2022] Open
Abstract
Automatic motor mimicry is essential to the normal processing of perceived emotion, and disrupted automatic imitation might underpin socio-emotional deficits in neurodegenerative diseases, particularly the frontotemporal dementias. However, the pathophysiology of emotional reactivity in these diseases has not been elucidated. We studied facial electromyographic responses during emotion identification on viewing videos of dynamic facial expressions in 37 patients representing canonical frontotemporal dementia syndromes versus 21 healthy older individuals. Neuroanatomical associations of emotional expression identification accuracy and facial muscle reactivity were assessed using voxel-based morphometry. Controls showed characteristic profiles of automatic imitation, and this response predicted correct emotion identification. Automatic imitation was reduced in the behavioural and right temporal variant groups, while the normal coupling between imitation and correct identification was lost in the right temporal and semantic variant groups. Grey matter correlates of emotion identification and imitation were delineated within a distributed network including primary visual and motor, prefrontal, insular, anterior temporal and temporo-occipital junctional areas, with common involvement of supplementary motor cortex across syndromes. Impaired emotional mimesis may be a core mechanism of disordered emotional signal understanding and reactivity in frontotemporal dementia, with implications for the development of novel physiological biomarkers of socio-emotional dysfunction in these diseases.
Collapse
Affiliation(s)
- Charles R Marshall
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK.
- Sobell Department of Motor Neuroscience and Movement Disorders, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK.
| | - Chris J D Hardy
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Lucy L Russell
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Camilla N Clark
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Rebecca L Bond
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Katrina M Dick
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Emilie V Brotherhood
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Cath J Mummery
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Jonathan M Schott
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Jonathan D Rohrer
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - James M Kilner
- Sobell Department of Motor Neuroscience and Movement Disorders, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| | - Jason D Warren
- Dementia Research Centre, Department of Neurodegenerative Disease, Institute of Neurology, University College London, Queen Square, London, WC1N 3BG, UK
| |
Collapse
|
40
|
Spies M, Sevincer AT. Women outperform men in distinguishing between authentic and nonauthentic smiles. J Soc Psychol 2018; 158:574-579. [DOI: 10.1080/00224545.2017.1409187] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
41
|
Neves L, Cordeiro C, Scott SK, Castro SL, Lima CF. High emotional contagion and empathy are associated with enhanced detection of emotional authenticity in laughter. Q J Exp Psychol (Hove) 2018; 71:2355-2363. [PMID: 30362411 DOI: 10.1177/1747021817741800] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/30/2022]
Abstract
Nonverbal vocalisations such as laughter pervade social interactions, and the ability to accurately interpret them is an important skill. Previous research has probed the general mechanisms supporting vocal emotional processing, but the factors that determine individual differences in this ability remain poorly understood. Here, we ask whether the propensity to resonate with others' emotions, as measured by trait levels of emotional contagion and empathy, relates to the ability to perceive different types of laughter. We focus on emotional authenticity detection in spontaneous and voluntary laughs: spontaneous laughs reflect a less controlled and genuinely felt emotion, and voluntary laughs reflect a more deliberate communicative act (e.g., polite agreement). In total, 119 participants evaluated the authenticity and contagiousness of spontaneous and voluntary laughs and completed two self-report measures of resonance with others' emotions: the Emotional Contagion Scale and the Empathic Concern scale of the Interpersonal Reactivity Index. We found that higher scores on these measures predict enhanced ability to detect laughter authenticity. We further observed that perceived contagion responses during listening to laughter significantly relate to authenticity detection. These findings suggest that resonating with others' emotions provides a mechanism for processing complex aspects of vocal emotional information.
Collapse
Affiliation(s)
- Leonor Neves
- Faculty of Psychology and Education Sciences, University of Porto, Porto, Portugal
| | - Carolina Cordeiro
- Faculty of Psychology and Education Sciences, University of Porto, Porto, Portugal
| | - Sophie K Scott
- Institute of Cognitive Neuroscience, University College London, London, UK
| | - São Luís Castro
- Faculty of Psychology and Education Sciences, University of Porto, Porto, Portugal
| | - César F Lima
- Faculty of Psychology and Education Sciences, University of Porto, Porto, Portugal; Institute of Cognitive Neuroscience, University College London, London, UK; Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal
| |
Collapse
|
42
|
Blocker HS, McIntosh DN. Not All Outgroups Are Equal: Group Type May Influence Group Effect on Matching Behavior. JOURNAL OF NONVERBAL BEHAVIOR 2017. [DOI: 10.1007/s10919-017-0258-z] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
|
43
|
Namba S, Kabir RS, Miyatani M, Nakao T. Spontaneous Facial Actions Map onto Emotional Experiences in a Non-social Context: Toward a Component-Based Approach. Front Psychol 2017; 8:633. [PMID: 28522979 PMCID: PMC5415601 DOI: 10.3389/fpsyg.2017.00633] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/30/2017] [Accepted: 04/09/2017] [Indexed: 11/20/2022] Open
Abstract
While numerous studies have examined the relationships between facial actions and emotions, they have yet to account for the ways that specific spontaneous facial expressions map onto emotional experiences induced without expressive intent. Moreover, previous studies emphasized that a fine-grained investigation of facial components could establish the coherence of facial actions with actual internal states. Therefore, this study aimed to accumulate evidence for the correspondence between spontaneous facial components and emotional experiences. We reinvestigated data from previous research that covertly recorded the spontaneous facial expressions of Japanese participants as they watched film clips designed to evoke four different target emotions: surprise, amusement, disgust, and sadness. The participants rated their emotional experiences via a self-reported questionnaire of 16 emotions. These spontaneous facial expressions were coded using the Facial Action Coding System, the gold standard for classifying visible facial movements. We examined the correspondence between each observed facial action and the emotional experience ratings by applying stepwise regression models. The results showed that spontaneous facial components occurred in ways consistent with their proposed evolutionary functions, based on the ratings of emotional experiences (e.g., the inner brow raiser might be involved in the evaluation of novelty). This study provides new empirical evidence for the correspondence between each spontaneous facial component and first-person internal states of emotion as reported by the expresser.
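A forward stepwise regression of the kind described, selecting for each emotion rating the facial action units that improve model fit, can be sketched with statsmodels; the action-unit predictors, the simulated ratings, and the AIC-based selection criterion are illustrative assumptions rather than the study's exact procedure:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X, y):
    """Greedily add the action unit that most lowers AIC; stop when none helps."""
    selected, remaining = [], list(X.columns)
    best_aic = sm.OLS(y, np.ones(len(y))).fit().aic   # intercept-only baseline
    improved = True
    while improved and remaining:
        improved = False
        aics = {c: sm.OLS(y, sm.add_constant(X[selected + [c]])).fit().aic for c in remaining}
        candidate, aic = min(aics.items(), key=lambda kv: kv[1])
        if aic < best_aic:
            selected.append(candidate)
            remaining.remove(candidate)
            best_aic, improved = aic, True
    return selected

# Simulated example: "surprise" ratings driven mainly by AU1/AU2 (inner/outer brow raiser).
rng = np.random.default_rng(3)
X = pd.DataFrame(rng.uniform(0, 5, (150, 4)), columns=["AU1", "AU2", "AU12", "AU9"])
y = 1.2 * X["AU1"] + 0.9 * X["AU2"] + rng.normal(scale=0.8, size=150)
print(forward_stepwise(X, y))   # typically ['AU1', 'AU2']
```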
Collapse
Affiliation(s)
- Shushi Namba
- Graduate School of Education, Hiroshima University, Hiroshima, Japan
| | - Russell S Kabir
- Graduate School of Education, Hiroshima University, Hiroshima, Japan
| | - Makoto Miyatani
- Department of Psychology, Hiroshima University, Hiroshima, Japan
| | - Takashi Nakao
- Department of Psychology, Hiroshima University, Hiroshima, Japan
| |
Collapse
|
44
|
Korb S, Osimo SA, Suran T, Goldstein A, Rumiati RI. Face proprioception does not modulate access to visual awareness of emotional faces in a continuous flash suppression paradigm. Conscious Cogn 2017; 51:166-180. [PMID: 28388482 DOI: 10.1016/j.concog.2017.03.008] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2016] [Revised: 03/09/2017] [Accepted: 03/26/2017] [Indexed: 11/26/2022]
Abstract
An important question in neuroscience is which multisensory information, presented outside of awareness, can influence the nature and speed of conscious access to our percepts. Recently, proprioceptive feedback of the hand was reported to lead to faster awareness of congruent hand images in a breaking continuous flash suppression (b-CFS) paradigm. Moreover, a vast literature suggests that spontaneous facial mimicry can improve emotion recognition, even without awareness of the stimulus face. However, integration of visual and proprioceptive information about the face to date has not been tested with CFS. The modulation of visual awareness of emotional faces by facial proprioception was investigated across three separate experiments. Face proprioception was induced with voluntary facial expressions or with spontaneous facial mimicry. Frequentist statistical analyses were complemented with Bayesian statistics. No evidence of multisensory integration was found, suggesting that proprioception does not modulate access to visual awareness of emotional faces in a CFS paradigm.
Collapse
Affiliation(s)
- Sebastian Korb
- Neuroscience and Society Lab, SISSA, Via Bonomea 265, 34136 Trieste, Italy; Faculty of Psychology, Department of Applied Psychology: Health, Development, Enhancement and Intervention, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria.
| | - Sofia A Osimo
- Neuroscience and Society Lab, SISSA, Via Bonomea 265, 34136 Trieste, Italy.
| | - Tiziano Suran
- Neuroscience and Society Lab, SISSA, Via Bonomea 265, 34136 Trieste, Italy.
| | - Ariel Goldstein
- Cognitive Science Department, The Hebrew University of Jerusalem, Mount Scopus, Jerusalem 91905, Israel.
| | | |
Collapse
|
45
|
Gernot G, Pelowski M, Leder H. Empathy, Einfühlung, and aesthetic experience: the effect of emotion contagion on appreciation of representational and abstract art using fEMG and SCR. Cogn Process 2017; 19:147-165. [DOI: 10.1007/s10339-017-0800-2] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2016] [Accepted: 03/09/2017] [Indexed: 12/30/2022]
|
46
|
Sensorimotor simulation and emotion processing: Impairing facial action increases semantic retrieval demands. Cogn Affect Behav Neurosci 2017; 17:652-664. [DOI: 10.3758/s13415-017-0503-2] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
47
|
Baumeister JC, Papa G, Foroni F. Deeper than skin deep – The effect of botulinum toxin-A on emotion processing. Toxicon 2016; 118:86-90. [DOI: 10.1016/j.toxicon.2016.04.044] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2016] [Revised: 04/23/2016] [Accepted: 04/25/2016] [Indexed: 10/21/2022]
|
48
|
Korb S, Malsert J, Strathearn L, Vuilleumier P, Niedenthal P. Sniff and mimic - Intranasal oxytocin increases facial mimicry in a sample of men. Horm Behav 2016; 84:64-74. [PMID: 27283377 DOI: 10.1016/j.yhbeh.2016.06.003] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/11/2015] [Revised: 05/29/2016] [Accepted: 06/04/2016] [Indexed: 01/18/2023]
Abstract
The neuropeptide oxytocin (OT) has many potential social benefits. For example, intranasal administration of OT appears to trigger caregiving behavior and to improve the recognition of emotional facial expressions. But the mechanism for these effects is not yet clear. Recent findings relating OT to action imitation and to the visual processing of the eye region of faces point to mimicry as a mechanism through which OT improves processing of emotional expression. To test the hypothesis that increased levels of OT in the brain enhance facial mimicry, 60 healthy male participants were administered, in a double-blind between-subjects design, 24 international units (IUs) of OT or placebo (PLA) through nasal spray. Facial mimicry and emotion judgments were recorded in response to movie clips depicting changing facial expressions. As expected, facial mimicry was increased in the OT group, but effects were strongest for angry infant faces. These findings provide further evidence for the importance of OT in social cognitive skills, and suggest that facial mimicry mediates the effects of OT on improved emotion recognition.
Collapse
Affiliation(s)
- Sebastian Korb
- Swiss Center for Affective Sciences, Campus Biotech, 9 Chemin des Mines, 1202 Geneva, Switzerland; Department of Psychology, University of Wisconsin, 1202 West Johnson street, Madison, WI 53706, USA.
| | - Jennifer Malsert
- Swiss Center for Affective Sciences, Campus Biotech, 9 Chemin des Mines, 1202 Geneva, Switzerland; Department of Psychology, University of Geneva, 40 bd du Pont d'Arve, 1205 Geneva, Switzerland.
| | - Lane Strathearn
- Stead Family Department of Pediatrics, University of Iowa, 213F CDD Center for Disabilities and Development, 100 Hawkins Dr, Iowa City, IA 52246, USA.
| | - Patrik Vuilleumier
- Department of Fundamental Neurosciences, University of Geneva, 1 rue Michel-Servet, 1205 Geneva, Switzerland.
| | - Paula Niedenthal
- Department of Psychology, University of Wisconsin, 1202 West Johnson street, Madison, WI 53706, USA.
| |
Collapse
|
49
|
Argaud S, Delplanque S, Houvenaghel JF, Auffret M, Duprez J, Vérin M, Grandjean D, Sauleau P. Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson's Disease. PLoS One 2016; 11:e0160329. [PMID: 27467393 PMCID: PMC4965153 DOI: 10.1371/journal.pone.0160329] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/13/2016] [Accepted: 07/18/2016] [Indexed: 11/28/2022] Open
Abstract
According to embodied simulation theory, understanding other people’s emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson’s disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry in emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results showed a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of a reaction of the zygomaticus major and orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional support for embodied simulation theory and suggest that facial mimicry is a potential lever for therapeutic action in PD, even though it does not appear to be strictly required for emotion recognition as such.
Collapse
Affiliation(s)
- Soizic Argaud
- Behavior and Basal Ganglia" research unit (EA4712), University of Rennes 1, Rennes, France
- Neuroscience of Emotion and Affective Dynamics laboratory, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
| | - Sylvain Delplanque
- Swiss Center for Affective Sciences, Campus Biotech, University of Geneva, Geneva, Switzerland
| | - Jean-François Houvenaghel
- Behavior and Basal Ganglia" research unit (EA4712), University of Rennes 1, Rennes, France
- Department of Neurology, Rennes University Hospital, Rennes, France
| | - Manon Auffret
- Behavior and Basal Ganglia" research unit (EA4712), University of Rennes 1, Rennes, France
| | - Joan Duprez
- Behavior and Basal Ganglia" research unit (EA4712), University of Rennes 1, Rennes, France
| | - Marc Vérin
- Behavior and Basal Ganglia" research unit (EA4712), University of Rennes 1, Rennes, France
- Department of Neurology, Rennes University Hospital, Rennes, France
| | - Didier Grandjean
- Neuroscience of Emotion and Affective Dynamics laboratory, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
- Swiss Center for Affective Sciences, Campus Biotech, University of Geneva, Geneva, Switzerland
| | - Paul Sauleau
- Behavior and Basal Ganglia" research unit (EA4712), University of Rennes 1, Rennes, France
- Department of Neurophysiology, Rennes University Hospital, Rennes, France
| |
Collapse
|
50
|
Korb S, Wood A, Banks CA, Agoulnik D, Hadlock TA, Niedenthal PM. Asymmetry of Facial Mimicry and Emotion Perception in Patients With Unilateral Facial Paralysis. JAMA Facial Plast Surg 2016; 18:222-7. [DOI: 10.1001/jamafacial.2015.2347] [Citation(s) in RCA: 24] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Affiliation(s)
- Sebastian Korb
- Neuroscience Area, International School for Advanced Studies (SISSA), Trieste, Italy
| | - Adrienne Wood
- Department of Psychology, University of Wisconsin, Madison
| | - Caroline A. Banks
- Department of Otology and Laryngology, Massachusetts Eye and Ear Infirmary, Harvard Medical School, Boston, Massachusetts
| | - Dasha Agoulnik
- Department of Otology and Laryngology, Massachusetts Eye and Ear Infirmary, Harvard Medical School, Boston, Massachusetts
| | - Tessa A. Hadlock
- Division of Facial Plastic and Reconstructive Surgery, Facial Nerve Center, Boston, Massachusetts
| | | |
Collapse
|