1. Martineau S, Perrin L, Kerleau H, Rahal A, Marcotte K. Comparison of Objective Facial Metrics on Both Sides of the Face Among Patients with Severe Bell's Palsy Treated with Mirror Effect Plus Protocol Rehabilitation Versus Controls. Facial Plast Surg Aesthet Med 2024;26:172-179. PMID: 37819748. DOI: 10.1089/fpsam.2023.0087.
Abstract
Objective: The extent to which the healthy hemiface dynamically contributes to facial synchronization during facial rehabilitation has been largely unstudied. This study compares the synchronization of both hemifaces in patients with severe Bell's palsy who received either a facial rehabilitation program called the "Mirror Effect Plus Protocol" (MEPP) or basic counseling. Methods: Baseline and 1-year postonset data from 39 patients (19 MEPP, 20 basic counseling) were retrospectively analyzed using Emotrics+, software that generates facial metrics with artificial intelligence (AI) algorithms. Paired t-tests were used for intrasubject comparisons of the hemifaces, and mixed-model analyses were used for between-group comparisons. Results: For voluntary movements, a significant difference in favor of the MEPP group was found only for smiling (p = 0.025). However, at 1 year postonset, the control group showed significant variability between hemifaces for most synkinesis measurements [nasolabial fold (p = 0.029); eye area (p = 0.043); palpebral fissure (p = 0.011)]. Conclusion: In this study, better synchronization of both hemifaces was found in the MEPP group. Interestingly, motor adaptation in the movement amplitude of the healthy hemiface seemed to contribute to this synchronization in MEPP patients. Further studies are needed to standardize AI-based measurement procedures and to adapt them for clinical use.
Affiliation(s)
- Sarah Martineau
  - Département de chirurgie et Direction des Services Multidisciplinaires, Hôpital Maisonneuve-Rosemont, Montréal, Québec, Canada
  - Centre de recherche du Centre intégré universitaire de santé et services sociaux du Nord-de-l'île-de-Montréal, Hôpital du Sacré-Cœur de Montréal, Montréal, Québec, Canada
  - École d'Orthophonie et d'Audiologie, Faculté de Médecine, Université de Montréal, Montréal, Québec, Canada
- Lucie Perrin
  - Département universitaire d'enseignement et de formation en orthophonie, Faculté de Médecine, Université de Sorbonne, Paris, France
- Hélène Kerleau
  - Département universitaire d'enseignement et de formation en orthophonie, Faculté de Médecine, Université de Sorbonne, Paris, France
- Akram Rahal
  - Département de chirurgie et Direction des Services Multidisciplinaires, Hôpital Maisonneuve-Rosemont, Montréal, Québec, Canada
  - École d'Orthophonie et d'Audiologie, Faculté de Médecine, Université de Montréal, Montréal, Québec, Canada
- Karine Marcotte
  - Centre de recherche du Centre intégré universitaire de santé et services sociaux du Nord-de-l'île-de-Montréal, Hôpital du Sacré-Cœur de Montréal, Montréal, Québec, Canada
  - École d'Orthophonie et d'Audiologie, Faculté de Médecine, Université de Montréal, Montréal, Québec, Canada
2. Alagapan S, Choi KS, Heisig S, Riva-Posse P, Crowell A, Tiruvadi V, Obatusin M, Veerakumar A, Waters AC, Gross RE, Quinn S, Denison L, O'Shaughnessy M, Connor M, Canal G, Cha J, Hershenberg R, Nauvel T, Isbaine F, Afzal MF, Figee M, Kopell BH, Butera R, Mayberg HS, Rozell CJ. Cingulate dynamics track depression recovery with deep brain stimulation. Nature 2023;622:130-138. PMID: 37730990. PMCID: PMC10550829. DOI: 10.1038/s41586-023-06541-3.
Abstract
Deep brain stimulation (DBS) of the subcallosal cingulate (SCC) can provide long-term symptom relief for treatment-resistant depression (TRD)1. However, achieving stable recovery is unpredictable2, typically requiring trial-and-error stimulation adjustments due to individual recovery trajectories and subjective symptom reporting3. We currently lack objective brain-based biomarkers to guide clinical decisions by distinguishing natural transient mood fluctuations from situations requiring intervention. To address this gap, we used a new device that enables electrophysiological recording while delivering SCC DBS in ten TRD participants (ClinicalTrials.gov identifier NCT01984710). At the study endpoint of 24 weeks, 90% of participants demonstrated robust clinical response, and 70% achieved remission. Using SCC local field potentials available from six participants, we deployed an explainable artificial intelligence approach to identify SCC local field potential changes indicating the patient's current clinical state. This biomarker is distinct from transient stimulation effects, sensitive to therapeutic adjustments, and accurate in capturing individual recovery states. Variable recovery trajectories are predicted by the degree of preoperative damage to the structural integrity and functional connectivity within the targeted white matter treatment network, and are matched by objective facial expression changes detected using data-driven video analysis. Our results demonstrate the utility of objective biomarkers in the management of personalized SCC DBS and provide new insight into the relationship between multifaceted (functional, anatomical and behavioural) features of TRD pathology, motivating further research into causes of variability in depression treatment.
Affiliation(s)
- Sankaraleengam Alagapan
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
- Ki Sueng Choi
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Radiology, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neurosurgery, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Stephen Heisig
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Patricio Riva-Posse
  - Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine, Atlanta, GA, USA
- Andrea Crowell
  - Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine, Atlanta, GA, USA
- Vineet Tiruvadi
  - Wallace H. Coulter Department of Biomedical Engineering at Georgia Institute of Technology and Emory University, Atlanta, GA, USA
  - Emory University School of Medicine, Atlanta, GA, USA
- Mosadoluwa Obatusin
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Ashan Veerakumar
  - Department of Psychiatry, Schulich School of Medicine and Dentistry at Western University, London, Ontario, Canada
- Allison C Waters
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Robert E Gross
  - Wallace H. Coulter Department of Biomedical Engineering at Georgia Institute of Technology and Emory University, Atlanta, GA, USA
  - Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, USA
  - Department of Neurology, Emory University School of Medicine, Atlanta, GA, USA
- Sinead Quinn
  - Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine, Atlanta, GA, USA
- Lydia Denison
  - Emory University School of Medicine, Atlanta, GA, USA
- Matthew O'Shaughnessy
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
- Marissa Connor
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
- Gregory Canal
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
- Jungho Cha
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Rachel Hershenberg
  - Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine, Atlanta, GA, USA
- Tanya Nauvel
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Faical Isbaine
  - Department of Neurosurgery, Emory University School of Medicine, Atlanta, GA, USA
- Muhammad Furqan Afzal
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Martijn Figee
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Brian H Kopell
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neurosurgery, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neurology, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Robert Butera
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
  - Wallace H. Coulter Department of Biomedical Engineering at Georgia Institute of Technology and Emory University, Atlanta, GA, USA
- Helen S Mayberg
  - Nash Family Center for Advanced Circuit Therapeutics, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neurosurgery, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neuroscience, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Psychiatry, Icahn School of Medicine at Mount Sinai, New York, NY, USA
  - Department of Neurology, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Christopher J Rozell
  - School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
3. Straulino E, Scarpazza C, Spoto A, Betti S, Chozas Barrientos B, Sartori L. The Spatiotemporal Dynamics of Facial Movements Reveals the Left Side of a Posed Smile. Biology (Basel) 2023;12:1160. PMID: 37759560. PMCID: PMC10525663. DOI: 10.3390/biology12091160.
Abstract
Humans can recombine thousands of different facial expressions. This variability is due to the ability to voluntarily or involuntarily modulate emotional expressions, which, in turn, depends on the existence of two anatomically separate pathways. The Voluntary (VP) and Involuntary (IP) pathways mediate the production of posed and spontaneous facial expressions, respectively, and might also affect the left and right sides of the face differently. This is a neglected aspect in the literature on emotion, where posed expressions rather than genuine expressions are often used as stimuli. Two experiments with different induction methods were specifically designed to investigate the unfolding of spontaneous and posed facial expressions of happiness along the facial vertical axis (left, right) with a high-definition 3-D optoelectronic system. In both experiments, spontaneous expressions were reliably distinguished from posed facial movements by key kinematic patterns of space and speed. Moreover, VP activation produced a lateralization effect: compared with the felt smile, the posed smile involved an initial acceleration of the left corner of the mouth, while an early deceleration of the right corner occurred in the second phase of the movement, after the velocity peak.
Affiliation(s)
- Elisa Straulino
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
- Cristina Scarpazza
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
  - Translational Neuroimaging and Cognitive Lab, IRCCS San Camillo Hospital, Via Alberoni 70, 30126 Venice, Italy
- Andrea Spoto
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
- Sonia Betti
  - Department of Psychology, Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Viale Rasi e Spinelli 176, 47521 Cesena, Italy
- Beatriz Chozas Barrientos
  - Department of Chiropractic Medicine, University of Zurich, Balgrist University Hospital, Forchstrasse 340, 8008 Zürich, Switzerland
- Luisa Sartori
  - Department of General Psychology, University of Padova, Via Venezia 8, 35131 Padova, Italy
  - Padova Neuroscience Center, University of Padova, Via Giuseppe Orus 2, 35131 Padova, Italy
4. Straulino E, Scarpazza C, Sartori L. What is missing in the study of emotion expression? Front Psychol 2023;14:1158136. PMID: 37179857. PMCID: PMC10173880. DOI: 10.3389/fpsyg.2023.1158136.
Abstract
As celebrations approach for the 150th anniversary of "The Expression of the Emotions in Man and Animals", scientists' conclusions on emotion expression are still debated. Emotion expression has traditionally been anchored to prototypical and mutually exclusive facial expressions (e.g., anger, disgust, fear, happiness, sadness, and surprise). However, people express emotions in nuanced patterns and, crucially, not everything is in the face. In recent decades, considerable work has critiqued this classical view, calling for a more fluid and flexible approach that considers how humans dynamically perform genuine expressions with their bodies in context. A growing body of evidence suggests that each emotional display is a complex, multi-component, motoric event. The human face is never static but continuously acts and reacts to internal and environmental stimuli, with the coordinated action of muscles throughout the body. Moreover, two anatomically and functionally different neural pathways sub-serve voluntary and involuntary expressions. An interesting implication is that we have distinct and independent pathways for genuine and posed facial expressions, and different combinations may occur across the vertical facial axis. Investigating the time course of these facial blends, which can be consciously controlled only in part, has recently provided a useful operational test for comparing the predictions of various models of the lateralization of emotions. This concise review identifies shortcomings and new challenges in the study of emotion expression at the face, body, and contextual levels, eventually calling for a theoretical and methodological shift in the study of emotions. We contend that the most feasible way to address the complex world of emotion expression is to define a completely new and more complete approach to emotional investigation. This approach can potentially lead us to the roots of emotional display, and to the individual mechanisms underlying its expression (i.e., individual emotional signatures).
Affiliation(s)
- Elisa Straulino
  - Department of General Psychology, University of Padova, Padova, Italy
- Cristina Scarpazza
  - Department of General Psychology, University of Padova, Padova, Italy
  - IRCCS San Camillo Hospital, Venice, Italy
- Luisa Sartori
  - Department of General Psychology, University of Padova, Padova, Italy
  - Padova Neuroscience Center, University of Padova, Padova, Italy
5. Das A, Mock J, Irani F, Huang Y, Najafirad P, Golob E. Multimodal explainable AI predicts upcoming speech behavior in adults who stutter. Front Neurosci 2022;16:912798. PMID: 35979337. PMCID: PMC9376608. DOI: 10.3389/fnins.2022.912798.
Abstract
A key goal of cognitive neuroscience is to better understand how dynamic brain activity relates to behavior. Such dynamics, in terms of spatial and temporal patterns of brain activity, are directly measured with neurophysiological methods such as EEG, but can also be indirectly expressed by the body. Autonomic nervous system activity is the best-known example, but muscles in the eyes and face can also index brain activity. Mostly parallel lines of artificial intelligence research show that EEG and facial muscles both encode information about emotion, pain, attention, and social interactions, among other topics. In this study, we examined adults who stutter (AWS) to understand the relations between dynamic brain and facial muscle activity and predictions about future behavior (fluent or stuttered speech). AWS can provide insight into brain-behavior dynamics because they naturally fluctuate between episodes of fluent and stuttered speech. We focused on the period when speech preparation occurs, and used EEG and facial muscle activity measured from video to predict whether the upcoming speech would be fluent or stuttered. An explainable self-supervised multimodal architecture learned the temporal dynamics of both EEG and facial muscle movements during speech preparation in AWS, and predicted fluent or stuttered speech with 80.8% accuracy (chance = 50%). Specific EEG and facial muscle signals distinguished fluent from stuttered trials, and varied systematically from early to late speech preparation periods. The self-supervised architecture successfully identified multimodal activity that predicted upcoming behavior on a trial-by-trial basis. This approach could be applied to understanding the neural mechanisms driving variable behavior and symptoms in a wide range of neurological and psychiatric disorders. The combination of direct measures of neural activity and simple video data may be applied to developing technologies that estimate brain state from subtle bodily signals.
Affiliation(s)
- Arun Das
  - Secure AI and Autonomy Laboratory, University of Texas at San Antonio, San Antonio, TX, United States
  - UPMC Hillman Cancer Center, University of Pittsburgh Medical Center, Pittsburgh, PA, United States
- Jeffrey Mock
  - Cognitive Neuroscience Laboratory, University of Texas at San Antonio, San Antonio, TX, United States
- Farzan Irani
  - Department of Communication Disorders, Texas State University, San Marcos, TX, United States
- Yufei Huang
  - UPMC Hillman Cancer Center, University of Pittsburgh Medical Center, Pittsburgh, PA, United States
- Peyman Najafirad
  - Secure AI and Autonomy Laboratory, University of Texas at San Antonio, San Antonio, TX, United States
- Edward Golob
  - Cognitive Neuroscience Laboratory, University of Texas at San Antonio, San Antonio, TX, United States
6. Delor B, D'Hondt F, Philippot P. The Influence of Facial Asymmetry on Genuineness Judgment. Front Psychol 2021;12:727446. PMID: 34899469. PMCID: PMC8655228. DOI: 10.3389/fpsyg.2021.727446.
Abstract
This study investigates how asymmetry, expressed emotion, and sex of the expresser impact the perception of emotional facial expressions (EFEs) in terms of perceived genuineness. Thirty-five undergraduate women completed a task using chimeric stimuli with artificial human faces. They were required to judge whether the expressed emotion was genuinely felt. The results revealed that (a) symmetrical faces are judged as more genuine than asymmetrical faces and (b) EFEs' decoding is modulated by complex interplays between emotion and sex of the expresser.
Affiliation(s)
- Bérénice Delor
  - Louvain Experimental Psychopathology, Psychological Sciences Research Institute, Catholic University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium
- Fabien D'Hondt
  - Inserm, CHU Lille, U1172-LilNCog-Lille Neuroscience and Cognition, Université de Lille, Lille, France
  - Clinique de Psychiatrie, Unité CURE, CHU Lille, Lille, France
  - Centre national de ressources et de résilience Lille-Paris (CN2R), Lille, France
- Pierre Philippot
  - Louvain Experimental Psychopathology, Psychological Sciences Research Institute, Catholic University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium
7. Human face and gaze perception is highly context specific and involves bottom-up and top-down neural processing. Neurosci Biobehav Rev 2021;132:304-323. PMID: 34861296. DOI: 10.1016/j.neubiorev.2021.11.042.
Abstract
This review summarizes human perception and processing of face and gaze signals. Face and gaze signals are important means of non-verbal social communication. The review highlights that: (1) some evidence suggests that the perception and processing of facial information starts in the prenatal period; (2) the perception and processing of face identity, expression and gaze direction is highly context specific, the effect of race and culture being a case in point. Through experiential shaping and social categorization, culture affects the way in which information on face and gaze is collected and perceived; (3) face and gaze processing occurs in the so-called 'social brain'. Accumulating evidence suggests that the processing of facial identity, facial emotional expression and gaze involves two parallel and interacting pathways: a fast and crude subcortical route and a slower cortical pathway. The flow of information is bi-directional and includes bottom-up and top-down processing. The cortical networks particularly include the fusiform gyrus, superior temporal sulcus (STS), intraparietal sulcus, temporoparietal junction and medial prefrontal cortex.
8. Ross ED. Differential Hemispheric Lateralization of Emotions and Related Display Behaviors: Emotion-Type Hypothesis. Brain Sci 2021;11:1034. PMID: 34439653. PMCID: PMC8393469. DOI: 10.3390/brainsci11081034.
Abstract
There are two well-known hypotheses regarding hemispheric lateralization of emotions. The Right Hemisphere Hypothesis (RHH) postulates that emotions and associated display behaviors are a dominant and lateralized function of the right hemisphere. The Valence Hypothesis (VH) posits that negative emotions and related display behaviors are modulated by the right hemisphere, while positive emotions and related display behaviors are modulated by the left hemisphere. Although both the RHH and VH are supported by extensive research data, they are mutually exclusive, suggesting that there may be a missing factor in play that could provide a more accurate description of how emotions are lateralized in the brain. Evidence will be presented that provides a much broader perspective on emotions by embracing the concept that emotions can be classified into primary and social types and that hemispheric lateralization is better explained by the Emotion-type Hypothesis (ETH). The ETH posits that primary emotions and related display behaviors are modulated by the right hemisphere, whereas social emotions and related display behaviors are modulated by the left hemisphere.
Affiliation(s)
- Elliott D. Ross
  - Department of Neurology, University of Oklahoma Health Sciences Center, Oklahoma City, OK 73104, USA
  - Department of Neurology, University of Colorado School of Medicine, Aurora, CO 80045, USA
9. Changes in Computer-Analyzed Facial Expressions with Age. Sensors (Basel) 2021;21:4858. PMID: 34300600. PMCID: PMC8309819. DOI: 10.3390/s21144858.
Abstract
Facial expressions are well known to change with age, but the quantitative properties of facial aging remain unclear. In the present study, we investigated the differences in the intensity of facial expressions between older (n = 56) and younger adults (n = 113). In laboratory experiments, the participants' posed facial expressions were obtained based on six basic emotions and neutral facial expression stimuli, and the intensities of their expressions were analyzed using a computer vision tool, the OpenFace software. Our results showed that the older adults expressed some negative emotions and neutral faces more strongly. Furthermore, when making facial expressions, older adults used more facial muscles than younger adults across emotions. These results may help characterize facial expressions in aging and can provide empirical evidence for other fields regarding facial recognition.
10. Facial expressions can be categorized along the upper-lower facial axis, from a perceptual perspective. Atten Percept Psychophys 2021;83:2159-2173. PMID: 33759116. DOI: 10.3758/s13414-021-02281-6.
Abstract
A critical question, fundamental for building models of emotion, is how to categorize emotions. Previous studies have typically taken one of two approaches: (a) they focused on the pre-perceptual visual cues, i.e., how salient facial features or configurations were displayed; or (b) they focused on the post-perceptual affective experiences, i.e., how emotions affected behavior. In this study, we attempted to group emotions at a peri-perceptual processing level: it is well known that humans perceive different facial expressions differently; can we therefore classify facial expressions into distinct categories in terms of their perceptual similarities? Here, using a novel non-lexical paradigm, we assessed the perceptual dissimilarities between 20 facial expressions using reaction times. Multidimensional-scaling analysis revealed that facial expressions were organized predominantly along the upper-lower face axis. Cluster analysis of behavioral data delineated three superordinate categories, and eye-tracking measurements validated these clustering results. Interestingly, these superordinate categories can be conceptualized according to how facial displays interact with acoustic communication: one group comprises expressions that have salient mouth features and likely link to species-specific vocalization, for example, crying and laughing. The second group comprises visual displays with diagnostic features in both the mouth and eye regions; they are not directly articulable but can be expressed prosodically, for example, sad and angry. Expressions in the third group are also whole-face expressions but are completely independent of vocalization, and are likely blends of two or more elementary expressions. We propose a theoretical framework to interpret this tripartite division, in which distinct expression subsets are interpreted as successive phases in an evolutionary chain.
11. Plouffe-Demers MP, Fiset D, Saumure C, Duncan J, Blais C. Strategy Shift Toward Lower Spatial Frequencies in the Recognition of Dynamic Facial Expressions of Basic Emotions: When It Moves It Is Different. Front Psychol 2019;10:1563. PMID: 31379648. PMCID: PMC6650765. DOI: 10.3389/fpsyg.2019.01563.
Abstract
Facial expressions of emotion play a key role in social interactions. While in everyday life their dynamic and transient nature calls for fast processing of the visual information they contain, the majority of studies investigating the visual processes underlying their recognition have focused on their static display. The present study aimed to gain a better understanding of these processes by using more ecological dynamic facial expressions. In two experiments, we directly compared the spatial frequency (SF) tuning during the recognition of static and dynamic facial expressions. Experiment 1 revealed a shift toward lower SFs for dynamic expressions in comparison to static ones. Experiment 2 was designed to verify whether changes in SF tuning curves were specific to the presence of emotional information in motion by comparing the SF tuning profiles for static, dynamic, and shuffled dynamic expressions. Results showed a similar shift toward lower SFs for shuffled expressions, suggesting that the difference found between dynamic and static expressions might not be linked to informative motion per se but to the presence of motion, regardless of its nature.
Affiliation(s)
- Marie-Pier Plouffe-Demers
  - Département de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
  - Département de Psychologie, Université du Québec à Montréal, Montreal, QC, Canada
- Daniel Fiset
  - Département de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
- Camille Saumure
  - Département de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
- Justin Duncan
  - Département de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
  - Département de Psychologie, Université du Québec à Montréal, Montreal, QC, Canada
- Caroline Blais
  - Département de Psychologie, Université du Québec en Outaouais, Gatineau, QC, Canada
12. Saumure C, Plouffe-Demers MP, Estéphan A, Fiset D, Blais C. The use of visual information in the recognition of posed and spontaneous facial expressions. J Vis 2019;18:21. PMID: 30372755. DOI: 10.1167/18.9.21.
Abstract
Recognizing facial expressions is crucial for the success of social interactions, and the visual processes underlying this ability have been the subject of many studies in the field of face perception. Nevertheless, the stimuli used in the majority of these studies consist of facial expressions that were produced on request rather than spontaneously induced. In the present study, we directly compared the visual strategies underlying the recognition of posed and spontaneous expressions of happiness, disgust, surprise, and sadness. We used the Bubbles method with pictures of the same individuals spontaneously expressing an emotion or posing with an expression on request. Two key findings were obtained: visual strategies were less systematic with spontaneous than with posed expressions, suggesting higher heterogeneity in the useful facial cues across identities; and with spontaneous expressions, the relative reliance on the mouth and eye areas was more evenly distributed, contrasting with the higher reliance on the mouth compared with the eye area observed with posed expressions.
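The Bubbles method reveals a face through a set of randomly placed Gaussian apertures and then relates response accuracy to which regions happened to be visible. A sketch of the mask-generation step, with assumed parameter values (illustrative, not the authors' code):

```python
import numpy as np

def bubbles_mask(shape, n_bubbles, sigma, rng):
    """Opacity mask in [0, 1]: sum of randomly centered Gaussian apertures."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

rng = np.random.default_rng(0)
face = rng.random((128, 128))                 # stand-in for a grayscale face image
mask = bubbles_mask((128, 128), n_bubbles=10, sigma=8, rng=rng)
stimulus = mask * face                        # only the 'bubbled' regions remain
```

Trial-by-trial accuracy can then be regressed onto the aperture locations to estimate which facial regions carry diagnostic information.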
Collapse
Affiliation(s)
- Camille Saumure
- Department of Psychoeducation and Psychology, Université du Québec en Outaouais, Gatineau, Québec, Canada
| | - Marie-Pier Plouffe-Demers
- Department of Psychoeducation and Psychology, Université du Québec en Outaouais, Gatineau, Québec, Canada
| | - Amanda Estéphan
- Department of Psychoeducation and Psychology, Université du Québec en Outaouais, Gatineau, Québec, Canada
| | - Daniel Fiset
- Department of Psychoeducation and Psychology, Université du Québec en Outaouais, Gatineau, Québec, Canada
| | - Caroline Blais
- Department of Psychoeducation and Psychology, Université du Québec en Outaouais, Gatineau, Québec, Canada
| |
Collapse
|
13
|
Kang J, Derva D, Kwon DY, Wallraven C. Voluntary and spontaneous facial mimicry toward other's emotional expression in patients with Parkinson's disease. PLoS One 2019; 14:e0214957. [PMID: 30973893 PMCID: PMC6459535 DOI: 10.1371/journal.pone.0214957] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/11/2018] [Accepted: 03/23/2019] [Indexed: 01/31/2023] Open
Abstract
A "masked face", that is, decreased facial expression, is considered one of the cardinal symptoms among individuals with Parkinson's disease (PD). Both spontaneous and voluntary mimicry of others' emotional expressions is essential for social communication and for emotional sharing with others. Despite many studies showing impairments in facial movements in PD in general, it is still unclear whether voluntary mimicry, spontaneous mimicry, or both are affected, and how the impairments affect patients' quality of life. We investigated whether impairments in facial movements occur for spontaneous as well as voluntary expressions by quantitatively comparing muscle activations using surface electromyography. Dynamic facial expressions of Neutral, Anger, Joy, and Sad were presented during recordings over the corrugator and zygomatic areas. In the spontaneous condition, participants were instructed to simply watch the clips, whereas in the voluntary condition they were instructed to actively mimic the stimuli. We found that PD patients showed decreased mimicry in both spontaneous and voluntary conditions compared with a matched control group, although movement patterns for each emotion were similar in the two groups. Moreover, the decrease in mimicry did not correlate with a health-related quality of life index (PDQ), but it did correlate with a more subjective measure of general quality of life (SWB). The correlation between facial mimicry and the subjective well-being index suggests that the 'masked face' symptom deteriorates patients' quality of life in a complex way, affecting social and psychological aspects, which in turn may be linked to the increased depression risk among individuals with PD.
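Surface-EMG mimicry of this kind is commonly quantified as the mean rectified signal during the stimulus relative to a resting baseline. A toy sketch with simulated signals (the distributions and effect sizes are invented, not the study's data):

```python
import numpy as np

def mimicry_amplitude(emg, baseline):
    """Mean rectified EMG during the clip minus the resting level (arbitrary units)."""
    return np.mean(np.abs(emg)) - np.mean(np.abs(baseline))

rng = np.random.default_rng(1)
baseline = rng.normal(0.0, 1.0, 2000)   # rest
control = rng.normal(0.0, 3.0, 2000)    # simulated zygomaticus burst, control group
patient = rng.normal(0.0, 1.5, 2000)    # simulated attenuated burst, PD group
amp_control = mimicry_amplitude(control, baseline)
amp_patient = mimicry_amplitude(patient, baseline)
```

Group comparisons would then be run on these per-trial amplitudes, separately for the spontaneous and voluntary conditions.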
Collapse
Affiliation(s)
- June Kang
- Korea University, Department of Brain and Cognitive Engineering, Seoul, South Korea
- Empathy Research Institute, Seoul, South Korea
| | - Dilara Derva
- Korea University, Department of Brain and Cognitive Engineering, Seoul, South Korea
| | - Do-Young Kwon
- Korea University Ansan hospital, Department of Neurology, Ansan City, South Korea
| | - Christian Wallraven
- Korea University, Department of Brain and Cognitive Engineering, Seoul, South Korea
| |
Collapse
|
14
|
Neurophysiology of spontaneous facial expressions: II. Motor control of the right and left face is partially independent in adults. Cortex 2018; 111:164-182. [PMID: 30502646 DOI: 10.1016/j.cortex.2018.10.027] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/07/2018] [Revised: 10/18/2018] [Accepted: 10/31/2018] [Indexed: 12/22/2022]
Abstract
Facial expressions are described traditionally as monolithic or unitary entities. However, humans have the capacity to produce facial blends of emotion in which the upper and lower face simultaneously display different expressions. Recent neuroanatomical studies in monkeys have demonstrated that there are separate cortical motor areas for controlling the upper and lower face in each hemisphere that, presumably, also occur in humans. Using high-speed videography, we began measuring the movement dynamics of spontaneous facial expressions, including facial blends, to develop a more complete understanding of the neurophysiology underlying facial expressions. In our part 1 publication in Cortex (2016), we found that hemispheric motor control of the upper and lower face is overwhelmingly independent; 242 (99%) of the expressions were classified as demonstrating independent hemispheric motor control whereas only 3 (1%) were classified as demonstrating dependent hemispheric motor control. In this companion paper (part 2), 251 unitary facial expressions that occurred on either the upper or lower face were analyzed. 164 (65%) expressions demonstrated dependent hemispheric motor control whereas 87 (35%) expressions demonstrated independent or dual hemispheric motor control, indicating that some expressions represent facial blends of emotion that occur across the vertical facial axis. These findings also support the concepts that 1) spontaneous facial expressions are organized predominantly across the horizontal facial axis and secondarily across the vertical facial axis and 2) facial expressions are complex, multi-component, motoric events. Based on the Emotion-type hypothesis of cerebral lateralization, we propose that facial expressions modulated by a primary-emotional response to an environmental event are initiated by the right hemisphere on the left side of the face whereas facial expressions modulated by a social-emotional response to an environmental event are initiated by the left hemisphere on the right side of the face.
Collapse
|
15
|
Recio G, Sommer W. Copycat of dynamic facial expressions: Superior volitional motor control for expressions of disgust. Neuropsychologia 2018; 119:512-523. [PMID: 30176302 DOI: 10.1016/j.neuropsychologia.2018.08.027] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/30/2018] [Revised: 07/31/2018] [Accepted: 08/29/2018] [Indexed: 10/28/2022]
Abstract
In social situations, facial expressions are often strategically employed. Despite much research on the motor control of limb movements, little is known about the control of facial expressions. Using a response-priming task, we investigated motor control over three facial expressions: smiles, disgust expressions, and emotionally neutral jaw drops. Prime stimuli consisted of videos of a facial expression to be prepared or, as a neutral prime, an abstract symbol superimposed on a scrambled face. In valid trials, an equal sign (=) instructed participants to produce the primed expression. In invalid trials, an unequal sign (≠) prompted participants to produce an alternative, unprimed expression. We examined the impact of emotion on preparing and revoking a prepared expression, and possible facilitation for dynamic facial expressions relative to symbolic primes. Participants' facial responses were scored using automated software analysis of facial expressions. The underlying neurocognitive processes were tracked with event-related potentials. Reprogramming costs, in the form of longer reaction times (RTs) in trials where participants had prepared an invalidly primed expression and had to quickly switch to the correct one, were more pronounced for smiles and jaw drops than for disgust, possibly indicating the need to be fast when showing disgust. Data from the P3 component related the behavioral effect to more efficient updating of the correct response in brain systems responsible for motor control. Priming participants with dynamic facial expressions as examples for imitation improved performance accuracy compared with the symbolic abstract stimuli, but it did not affect RTs. Priming with dynamic videos also resulted in larger validity effects on the P3 component when disgust was the target response, indicating that the perceptual system might trigger automatic emotional responses, at least for negative affect.
Collapse
Affiliation(s)
- Guillermo Recio
- Differential Psychology and Psychological Assessment, Universität Hamburg, Von-Melle-Park 5, R4020b, D-20146 Hamburg, Germany.
| | - Werner Sommer
- Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, D-10099 Berlin, Germany.
| |
Collapse
|
16
|
Lee WJ, Choi SH, Jang JH, Moon JY, Kim YC, Noh E, Shin JE, Shin H, Kang DH. Different patterns in mental rotation of facial expressions in complex regional pain syndrome patients. Medicine (Baltimore) 2017; 96:e7990. [PMID: 28953620 PMCID: PMC5626263 DOI: 10.1097/md.0000000000007990] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/27/2022] Open
Abstract
Although facial pain expressions are considered the most visible pain behaviors, the association between pain intensity and facial pain expression is known to be weak in chronic pain. The authors hypothesized that facial pain expressiveness is altered in chronic pain and investigated this with a mental rotation task using various facial expressions, a task thought to be associated with actual facial movements. Four types of facial stimuli were used: upper (tightening of the eyes and furrowed brows) and lower (raising the upper lip) pain-specific facial expressions, and upper (eyeball deviation) and lower (tongue protrusion) facial movements that do not use facial muscles. Participants were asked to judge whether a stimulus presented at various rotation angles was left- or right-sided. The authors tested 40 patients with complex regional pain syndrome (CRPS) (12 women, age range 21-60) and 35 healthy controls (15 women, age range 26-64). In an analysis of reaction time (RT) using a linear mixed model, patients were slower to react to all types of stimuli (P = .001), and a significant interaction between group (patient or control) and type of facial expression was observed (P = .01). In the post hoc analysis, only patients showed longer RTs for raising the upper lip than for the other types of facial expressions. This reflects a deficit in mental rotation, especially for lower-face pain expressions, in CRPS, which may be related to the psychosocial aspects of pain. However, comprehensive intra- and interpersonal influences should be further investigated.
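The group x expression interaction that the mixed model tests can be illustrated with a simple difference-of-differences contrast on per-subject mean RTs; all numbers below are simulated for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
# per-subject mean RTs (s); columns: [raising upper lip, other expressions]
patients = np.column_stack([rng.normal(1.9, 0.2, n), rng.normal(1.6, 0.2, n)])
controls = np.column_stack([rng.normal(1.3, 0.2, n), rng.normal(1.3, 0.2, n)])

# positive value: the lip-specific slowdown is larger in patients than in
# controls, mirroring the reported group x expression interaction
interaction = (patients[:, 0] - patients[:, 1]).mean() \
    - (controls[:, 0] - controls[:, 1]).mean()
```

A full analysis would model trial-level RTs with subject as a random effect; this contrast only shows what the interaction term captures.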
Collapse
Affiliation(s)
- Won Joon Lee
- Department of Psychiatry, Seoul National University Hospital, Seoul
- Department of Psychiatry, Armed Forces Capital Hospital, Seongnam
| | - Soo-Hee Choi
- Department of Psychiatry, Seoul National University Hospital, Seoul
- Department of Psychiatry and Institute of Human Behavioral Sciences, Seoul National University College of Medicine
| | - Joon Hwan Jang
- Department of Medicine, Seoul National University College of Medicine
| | - Jee Youn Moon
- Department of Anesthesiology and Pain Medicine, Seoul National University Hospital
| | - Yong Chul Kim
- Department of Anesthesiology and Pain Medicine, Seoul National University Hospital
| | - EunChung Noh
- Interdisciplinary Program of Neuroscience, Seoul National University, Seoul
| | - Jung Eun Shin
- Department of Psychiatry, Seoul National University Hospital, Seoul
| | - HyunSoon Shin
- Electronics and Telecommunications Research Institute, Daejeon, Republic of Korea
| | - Do-Hyung Kang
- Department of Psychiatry, Seoul National University Hospital, Seoul
- Department of Psychiatry and Institute of Human Behavioral Sciences, Seoul National University College of Medicine
| |
Collapse
|
17
|
Boutsen FA, Dvorak JD, Pulusu VK, Ross ED. Altered saccadic targets when processing facial expressions under different attentional and stimulus conditions. Vision Res 2017; 133:150-160. [PMID: 28279711 DOI: 10.1016/j.visres.2016.07.012] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2015] [Revised: 05/16/2016] [Accepted: 07/09/2016] [Indexed: 10/20/2022]
Abstract
Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on upper/lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased to the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence other perceptual factors must be in play, such as the phenomenon of virtual scanning.
Collapse
Affiliation(s)
- Frank A Boutsen
- Department of Communication Sciences and Disorders, University of Oklahoma Health Sciences, 1200 North Stonewall Ave., Oklahoma City, OK 73117, USA
| | - Justin D Dvorak
- Department of Communication Sciences and Disorders, University of Oklahoma Health Sciences, 1200 North Stonewall Ave., Oklahoma City, OK 73117, USA
| | - Vinay K Pulusu
- Department of Neurology, University of Oklahoma Health Sciences Center, and the VA Medical Center (127), 921 NE 13th Street, Oklahoma City, OK 73104, USA
| | - Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center, and the VA Medical Center (127), 921 NE 13th Street, Oklahoma City, OK 73104, USA; Department of Communication Sciences and Disorders, University of Oklahoma Health Sciences, 1200 North Stonewall Ave., Oklahoma City, OK 73117, USA.
| |
Collapse
|
18
|
Müri RM. Cortical control of facial expression. J Comp Neurol 2017; 524:1578-85. [PMID: 26418049 DOI: 10.1002/cne.23908] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2015] [Revised: 06/03/2015] [Accepted: 09/25/2015] [Indexed: 11/10/2022]
Abstract
The present review deals with the motor control of facial expressions in humans. Facial expressions are a central part of human communication, and emotional facial expressions play a crucial role in human nonverbal behavior, allowing a rapid transfer of information between individuals. Facial expressions can be controlled either voluntarily or emotionally. Recent studies in nonhuman primates and humans have revealed that the motor control of facial expressions has a distributed neural representation. At least five cortical regions on the medial and lateral aspects of each hemisphere are involved: the primary motor cortex, the ventral lateral premotor cortex, the supplementary motor area on the medial wall, and the rostral and caudal cingulate cortex. The results of studies in humans and nonhuman primates suggest that the innervation of the face is bilaterally controlled for the upper part and mainly contralaterally controlled for the lower part. Furthermore, the primary motor cortex, the ventral lateral premotor cortex, and the supplementary motor area are essential for the voluntary control of facial expressions. In contrast, the cingulate cortical areas are important for emotional expression, because they receive input from different structures of the limbic system.
Collapse
Affiliation(s)
- René M Müri
- Division of Cognitive and Restorative Neurology, Departments of Neurology and Clinical Research, University Hospital Inselspital, 3010 Bern, Switzerland; Gerontechnology and Rehabilitation Group, University of Bern, 3012 Bern, Switzerland; Center for Cognition, Learning, and Memory, University of Bern, 3012 Bern, Switzerland
| |
Collapse
|
19
|
Ross ED, Gupta SS, Adnan AM, Holden TL, Havlicek J, Radhakrishnan S. Neurophysiology of spontaneous facial expressions: I. Motor control of the upper and lower face is behaviorally independent in adults. Cortex 2016; 76:28-42. [DOI: 10.1016/j.cortex.2016.01.001] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/25/2015] [Revised: 12/29/2015] [Accepted: 01/05/2016] [Indexed: 12/01/2022]
|
20
|
Krippl M, Karim AA, Brechmann A. Neuronal correlates of voluntary facial movements. Front Hum Neurosci 2015; 9:598. [PMID: 26578940 PMCID: PMC4623161 DOI: 10.3389/fnhum.2015.00598] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/01/2014] [Accepted: 10/14/2015] [Indexed: 11/30/2022] Open
Abstract
Whereas the somatotopy of finger movements has been extensively studied with neuroimaging, the neural foundations of facial movements remain elusive. We therefore systematically studied the neuronal correlates of voluntary facial movements using the Facial Action Coding System (FACS; Ekman et al., 2002). The facial movements performed in the MRI scanner were defined as Action Units (AUs) and were controlled by a certified FACS coder. The main goal of the study was to investigate the detailed somatotopy of the facial primary motor area (facial M1). Eighteen participants were asked to produce four facial movements in the fMRI scanner: AU1+2 (brow raiser), AU4 (brow lowerer), AU12 (lip corner puller), and AU24 (lip presser), each alternating with a resting phase. The facial movement task induced generally high activation in brain motor areas (e.g., M1, premotor cortex, supplementary motor area, putamen), as well as in the thalamus, insula, and visual cortex. BOLD activations revealed overlapping representations for the four facial movements. However, within the activated facial M1 areas we found distinct peak activations in the left and right hemispheres, supporting a rough somatotopic upper-to-lower face organization within the right facial M1 area and a somatotopic organization within the right M1 upper-face part. In both hemispheres, the somatotopic order was inverted within the lower-face representations. In contrast to the right hemisphere, in the left hemisphere the representation of AU4 was more lateral and anterior than that of the other facial movements. Our findings support the notion of a partial somatotopic order within the M1 face area, confirming the "like attracts like" principle (Donoghue et al., 1992): AUs that are often used together, or that are similar, are located close to each other in the motor cortex.
Collapse
Affiliation(s)
- Martin Krippl
- Department of Methodology, Psychodiagnostics and Evaluation Research, Institute of Psychology, Otto-von-Guericke University Magdeburg, Magdeburg, Germany
| | - Ahmed A Karim
- Department of Psychiatry and Psychotherapy, Universitätsklinikum Tübingen, Tübingen, Germany; Department of Prevention and Health Psychology, SRH Fernhochschule Riedlingen, Riedlingen, Germany
| | - André Brechmann
- Special Lab Non-Invasive Brain Imaging, Leibniz Institute for Neurobiology, Magdeburg, Germany
| |
Collapse
|
21
|
Carr EW, Korb S, Niedenthal PM, Winkielman P. The two sides of spontaneity: Movement onset asymmetries in facial expressions influence social judgments. JOURNAL OF EXPERIMENTAL SOCIAL PSYCHOLOGY 2014. [DOI: 10.1016/j.jesp.2014.05.008] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
|
22
|
von Piekartz H, Mohr G. Reduction of head and face pain by challenging lateralization and basic emotions: a proposal for future assessment and rehabilitation strategies. J Man Manip Ther 2014; 22:24-35. [PMID: 24976745 DOI: 10.1179/2042618613y.0000000063] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/31/2022] Open
Abstract
Chronic facial pain has many of the clinical characteristics found in other persistent musculoskeletal conditions, such as low back and cervical pain syndromes. Unique to this condition, however, is that painful facial movements may result in rigidity or an altered ability to demonstrate mimicry, defined as the natural tendency to adopt the behavioral expressions of other persons involved in an interaction. Loss of the ability to communicate through emotional expression can lead to impaired processing of emotions and, ultimately, social isolation. Diminished quality and quantity of facial expression is associated with chronic facial pain, temporomandibular dysfunction, facial asymmetries, and neurological disorders. This report provides a framework for the assessment of impaired emotional processing and associated somatosensory alterations. Principles of management for chronic facial pain should include graded motor imagery in addition to the standard treatments of manual therapy, exercise, and patient education. A case study illustrating these principles is provided.
Collapse
Affiliation(s)
- Harry von Piekartz
- University of Applied Science, Osnabrueck, Germany ; Cranial Facial Therapy Academy (CRAFTA), Hamburg, Germany
| | - Gesche Mohr
- University of Applied Science, Osnabrueck, Germany
| |
Collapse
|
23
|
Cattaneo L, Pavesi G. The facial motor system. Neurosci Biobehav Rev 2013; 38:135-59. [PMID: 24239732 DOI: 10.1016/j.neubiorev.2013.11.002] [Citation(s) in RCA: 116] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2013] [Revised: 10/18/2013] [Accepted: 11/02/2013] [Indexed: 12/23/2022]
Abstract
Facial movements support a variety of functions in human behavior. They participate in automatic somatic and visceral motor programs, they are essential in producing communicative displays of affective states, and they are also subject to voluntary control. The multiplicity of functions of facial muscles, compared with limb muscles, is reflected in the heterogeneity of their anatomical and histological characteristics, which goes well beyond the conventional classification into single facial muscles. This parcellation into different functional muscular units is maintained throughout the central representation of facial movements, from the brainstem up to the neocortex. Facial movements peculiarly lack a conventional proprioceptive feedback system, which is only partly substituted by cutaneous or auditory afferents. Facial motor activity is the main marker of endogenous affective states and of the affective valence of external stimuli. At the cortical level, a complex network of specialized motor areas supports voluntary facial movements; unlike for upper-limb movements, the primary motor cortex does not appear to be the prime actor in this network.
Collapse
Affiliation(s)
- Luigi Cattaneo
- Center for Mind/Brain Sciences, University of Trento, Via delle Regole 101, Mattarello, Trento 38123, Italy.
| | - Giovanni Pavesi
- Department of Neuroscience, University of Parma, Via Gramsci 14, Parma 43100, Italy
| |
Collapse
|
24
|
Abstract
Objective: To evaluate laterality and upper/lower face dominance of expressiveness during prescribed speech using a unique validated image-subtraction system capable of sensitive and reliable measurement of facial surface deformation. Rationale: Observations and experiments on the central control of facial expressions during speech and social utterances in humans and animals suggest that the right side of the mouth moves more than the left during nonemotional speech. However, proficient lip readers seem to attend to the whole face to interpret meaning from expressed facial cues, also implicating a horizontal (upper face-lower face) axis. Study Design: Prospective experimental design; experimental maneuver: recited speech; outcome measure: image-subtraction strength-duration curve amplitude. Methods: Thirty normal human adults were evaluated during memorized, nonemotional recitation of two short sentences. Facial movements were assessed using a video-image subtraction system capable of simultaneously measuring upper and lower specific areas of each hemiface. Results: Both axes influence facial expressiveness in human communication; however, the horizontal axis (upper versus lower face) appears dominant, especially during what would appear to be spontaneous, breakthrough, unplanned expressiveness. Conclusion: These data are congruent with the concept that the left cerebral hemisphere controls nonemotionally stimulated speech; however, multisynaptic brainstem extrapyramidal pathways may override hemiface laterality and preferentially take control of the upper face. Additionally, these data demonstrate the importance of the often-ignored brow in facial expressiveness. Level of Evidence: Experimental study; EBM levels not applicable.
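Image-subtraction measures of facial movement can be approximated by averaging absolute differences between successive video frames within a hemiface region of interest. A minimal sketch (illustrative only; the published system is more elaborate):

```python
import numpy as np

def movement_signal(frames, roi):
    """Mean |frame-to-frame difference| inside a rectangular region of interest."""
    r0, r1, c0, c1 = roi
    region = frames[:, r0:r1, c0:c1].astype(float)
    return np.abs(np.diff(region, axis=0)).mean(axis=(1, 2))

# toy clip: 5 frames; the upper-left quadrant 'moves' between frames 2 and 3
frames = np.zeros((5, 40, 40))
frames[3:, :20, :20] = 1.0
signal = movement_signal(frames, (0, 20, 0, 20))  # peaks at the moving transition
```

Running this over four ROIs (upper/lower x left/right hemiface) yields the per-region amplitude traces that the axis comparisons require.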
Collapse
|
25
|
Ross ED, Shayya L, Champlain A, Monnot M, Prodan CI. Decoding facial blends of emotion: visual field, attentional and hemispheric biases. Brain Cogn 2013; 83:252-61. [PMID: 24091036 DOI: 10.1016/j.bandc.2013.09.001] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/16/2012] [Revised: 07/23/2013] [Accepted: 09/02/2013] [Indexed: 10/26/2022]
Abstract
Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced.
Collapse
Affiliation(s)
- Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center and the VA Medical Center 127, 921 NE 13th Street, Oklahoma City, OK 73104, USA.
| | | | | | | | | |
Collapse
|
26
|
|
27
|
Ross ED, Pulusu VK. Posed versus spontaneous facial expressions are modulated by opposite cerebral hemispheres. Cortex 2012; 49:1280-91. [PMID: 22699022 DOI: 10.1016/j.cortex.2012.05.002] [Citation(s) in RCA: 47] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2012] [Revised: 04/17/2012] [Accepted: 05/07/2012] [Indexed: 01/04/2023]
Abstract
Clinical research has indicated that the left face is more expressive than the right face, suggesting that modulation of facial expressions is lateralized to the right hemisphere. The findings, however, are controversial because the results explain, on average, approximately 4% of the data variance. Using high-speed videography, we sought to determine if movement-onset asymmetry was a more powerful research paradigm than terminal movement asymmetry. The results were very robust, explaining up to 70% of the data variance. Posed expressions began overwhelmingly on the right face whereas spontaneous expressions began overwhelmingly on the left face. This dichotomy was most robust for upper facial expressions. In addition, movement-onset asymmetries did not predict terminal movement asymmetries, which were not significantly lateralized. The results support recent neuroanatomic observations that upper versus lower facial movements have different forebrain motor representations and recent behavioral constructs that posed versus spontaneous facial expressions are modulated preferentially by opposite cerebral hemispheres and that spontaneous facial expressions are graded rather than non-graded movements.
Collapse
Affiliation(s)
- Elliott D Ross
- Department of Neurology, University of Oklahoma Health Sciences Center and The VA Medical Center, Oklahoma City, OK 73104, USA.
| | | |
Collapse
|
28
|
Exploring the association between pain intensity and facial display in term newborns. Pain Res Manag 2011; 16:10-2. [PMID: 21369535 DOI: 10.1155/2011/873103] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/12/2023]
Abstract
BACKGROUND Facial expression is widely used to judge pain in neonates. However, little is known about the relationship between intensity of the painful stimulus and the nature of the expression in term neonates. OBJECTIVES To describe differences in the movement of key facial areas between two groups of term neonates experiencing painful stimuli of different intensities. METHODS Video recordings from two previous studies were used to select study subjects. Four term neonates undergoing circumcision without analgesia were compared with four similar male term neonates undergoing a routine heel stick. Facial movements were measured with a computer using a previously developed 'point-pair' system that focuses on movement in areas implicated in neonatal pain expression. Measurements were expressed in pixels, standardized to percentage of individual infant face width. RESULTS Point pairs measuring eyebrow and eye movement were similar, as was the sum of change across the face (41.15 in the circumcision group versus 40.33 in the heel stick group). Point pair 4 (horizontal change of the mouth) was higher for the heel stick group at 9.09 versus 3.93 for the circumcision group, while point pair 5 (vertical change of the mouth) was higher for the circumcision group (23.32) than for the heel stick group (15.53). CONCLUSION Little difference was noted in eye and eyebrow movement between pain intensities. The mouth opened wider (vertically) in neonates experiencing the higher pain stimulus. Qualitative differences in neonatal facial expression to pain intensity may exist, and the mouth may be an area in which to detect them. Further study of the generalizability of these findings is needed.
29. Ross ED, Monnot M. Affective prosody: What do comprehension errors tell us about hemispheric lateralization of emotions, sex and aging effects, and the role of cognitive appraisal. Neuropsychologia 2011; 49:866-877. [DOI: 10.1016/j.neuropsychologia.2010.12.024] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2010] [Revised: 12/10/2010] [Accepted: 12/13/2010] [Indexed: 10/18/2022]
30. Meyer-Marcotty P, Stellzig-Eisenhauer A, Bareis U, Hartmann J, Kochel J. Three-dimensional perception of facial asymmetry. Eur J Orthod 2011; 33:647-53. [DOI: 10.1093/ejo/cjq146] [Citation(s) in RCA: 71] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
31. Wallez C, Vauclair J. Right hemisphere dominance for emotion processing in baboons. Brain Cogn 2010; 75:164-9. [PMID: 21131120 DOI: 10.1016/j.bandc.2010.11.004] [Citation(s) in RCA: 37] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2010] [Revised: 10/30/2010] [Accepted: 11/10/2010] [Indexed: 11/17/2022]
Abstract
Asymmetries of emotional facial expressions in humans offer reliable indexes to infer brain lateralization and mostly revealed right hemisphere dominance. Studies concerned with oro-facial asymmetries in nonhuman primates largely showed a left-sided asymmetry in chimpanzees, marmosets and macaques. The presence of asymmetrical oro-facial productions was assessed in Olive baboons in order to determine the functional cerebral asymmetries. Two affiliative behaviors (lipsmack, copulation call) and two agonistic ones (screeching, eyebrow-raising) were recorded. For screeching, a strong and significant left hemimouth bias was found, but no significant bias was observed for the other behaviors. These results are discussed in the light of the available literature concerning asymmetrical oro-facial productions in nonhuman primates. In addition, these findings suggest that human hemispheric specialization for emotions has precursors in primate evolution.
Affiliation(s)
- Catherine Wallez
- University of Provence, Center of Research in the Psychology of Cognition, Language & Emotion, Department of Psychology, 29 Ave. Robert Schuman, 13621 Aix-en-Provence Cedex 1, France
32. Cattaneo L, Saccani E, De Giampaulis P, Crisi G, Pavesi G. Central facial palsy revisited: A clinical-radiological study. Ann Neurol 2010; 68:404-8. [DOI: 10.1002/ana.22069] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
33. Liu L, Ioannides AA. Emotion separation is completed early and it depends on visual field presentation. PLoS One 2010; 5:e9790. [PMID: 20339549 PMCID: PMC2842434 DOI: 10.1371/journal.pone.0009790] [Citation(s) in RCA: 41] [Impact Index Per Article: 2.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2009] [Accepted: 02/27/2010] [Indexed: 11/19/2022] Open
Abstract
It is now apparent that the visual system reacts to stimuli very fast, with many brain areas activated within 100 ms. It is, however, unclear how much detail is extracted about stimulus properties in the early stages of visual processing. Here, using magnetoencephalography we show that the visual system separates different facial expressions of emotion well within 100 ms after image onset, and that this separation is processed differently depending on where in the visual field the stimulus is presented. Seven right-handed males participated in a face affect recognition experiment in which they viewed happy, fearful and neutral faces. Blocks of images were shown either at the center or in one of the four quadrants of the visual field. For centrally presented faces, the emotions were separated fast, first in the right superior temporal sulcus (STS; 35-48 ms), followed by the right amygdala (57-64 ms) and medial pre-frontal cortex (83-96 ms). For faces presented in the periphery, the emotions were separated first in the ipsilateral amygdala and contralateral STS. We conclude that amygdala and STS likely play a different role in early visual processing, recruiting distinct neural networks for action: the amygdala alerts sub-cortical centers for appropriate autonomic system response for fight or flight decisions, while the STS facilitates more cognitive appraisal of situations and links appropriate cortical sites together. It is then likely that different problems may arise when either network fails to initiate or function properly.
Affiliation(s)
- Lichan Liu
- Lab for Human Brain Dynamics, RIKEN Brain Science Institute, Wakoshi, Saitama, Japan.
34. Meyer-Marcotty P, Alpers GW, Gerdes ABM, Stellzig-Eisenhauer A. Impact of facial asymmetry in visual perception: a 3-dimensional data analysis. Am J Orthod Dentofacial Orthop 2010; 137:168.e1-8; discussion 168-9. [PMID: 20152669 DOI: 10.1016/j.ajodo.2008.11.023] [Citation(s) in RCA: 59] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2008] [Revised: 11/01/2008] [Accepted: 11/01/2008] [Indexed: 11/16/2022]
Abstract
INTRODUCTION The aim of this controlled study was to analyze the degree and localization of 3-dimensional (3D) facial asymmetry in adult patients with cleft lip and palate (CLP) compared with a control group and its impact on the visual perception of faces. METHODS The degree of 3D asymmetry was analyzed with a novel method without landmarks in 18 adults with complete unilateral CLP and 18 adults without congenital anomalies. Furthermore, the CLP and control faces were rated for appearance, symmetry, and facial expression by 30 participants. RESULTS The results showed that adults with CLP had significantly greater asymmetry in their facial soft tissues compared with the control group. Moreover, the lower face, and particularly the midface, had greater asymmetry in the CLP patients. The perceptual ratings showed that adults with CLP were judged much more negatively than those in the control group. CONCLUSIONS With sophisticated 3D analysis, the real morphology of a face can be calculated and asymmetric regions precisely identified. The greatest asymmetry in CLP patients is in the midface. These results underline the importance of symmetry in the perception of faces. In general, the greater the facial asymmetry near the midline of the face, the more negative the evaluation of the face in direct face-to-face interactions.
35. Meyer-Marcotty P, Gerdes ABM, Reuther T, Stellzig-Eisenhauer A, Alpers GW. Persons with cleft lip and palate are looked at differently. J Dent Res 2010; 89:400-4. [PMID: 20164498 DOI: 10.1177/0022034509359488] [Citation(s) in RCA: 50] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
There is evidence that persons with cleft lip and palate (CLP) suffer psychosocial consequences as a result of their facial appearance. However, no data exist on how they are perceived by others. Our hypothesis was that CLP faces were looked at differently compared with faces lacking an anomaly. Eye movements of 30 healthy participants were recorded (via an eye-tracking camera) while they viewed photographs of faces with/without a CLP. Subsequently, the faces were rated for appearance, symmetry, and facial expression. When the CLP faces were viewed, there were significantly more initial fixations in the mouth and longer fixations in the mouth and nose regions, compared with reactions when control faces were viewed. Moreover, CLP faces were rated more negatively overall. When faces with CLP were viewed, attention was directed to the mouth and nose region. Together with the negative ratings, this may explain at least some of the social deprivations in persons with CLP, probably due to residual asymmetry.
Affiliation(s)
- P Meyer-Marcotty
- Department of Orthodontics, University of Wuerzburg, Dental Clinic of the Medical Faculty, Pleicherwall 2, D-97070 Wuerzburg, Germany.
36. Leppänen JM, Niehaus DJH, Koen L, Schoeman R, Emsley R. Allocation of attention to the eye and mouth region of faces in schizophrenia patients. Cogn Neuropsychiatry 2008; 13:505-19. [PMID: 19048442 DOI: 10.1080/13546800802608452] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Abstract
INTRODUCTION The present study examined whether reduced attentiveness to facial features and biased weighting of attention to the eye and mouth region might explain deficits in face processing in schizophrenia. METHODS Healthy controls (n=21) and schizophrenia patients (n=28) from an African Xhosa population were asked to detect target stimuli (dots) superimposed on pictures of faces. General attentiveness to facial features was assessed by measuring overall reaction times to targets superimposed on feature areas of faces, and attentiveness to the eye versus mouth region by comparing reaction times to targets on the upper and lower parts of faces. RESULTS Patients exhibited generally slower target detection speed than comparison subjects, but the strength of the attentional bias towards the eyes (i.e., the reaction time gain for targets in the eye region) did not differ between groups. A regression analysis indicated, however, that generally slower target detection speed and an attentional bias away from the mouth predicted a deficit in the recognition of open-mouth angry facial expressions in schizophrenia patients. CONCLUSIONS The results give partial support for the hypothesis that reduced overall attentiveness to faces and a failure to utilise visual information in salient facial features may underlie affect processing deficits in schizophrenia.
Affiliation(s)
- Jukka M Leppänen
- Human Information Processing Laboratory, Department of Psychology, University of Tampere, Finland.
37. Ross ED. Facial expressions, pain and nociception--are they related? Nat Clin Pract Neurol 2008; 4:304-305. [PMID: 18431381 DOI: 10.1038/ncpneuro0791] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/30/2008] [Accepted: 03/07/2008] [Indexed: 05/26/2023]
Affiliation(s)
- Elliott D Ross
- University of Oklahoma Health Sciences Center, Oklahoma City, OK 73104, USA.
38.
Abstract
Facial expression is a mode of close-proximity non-vocal communication used by primates and is produced by mimetic/facial musculature. Arguably, primates make the most-intricate facial displays and have some of the most-complex facial musculature of all mammals. Most of the earlier ideas of primate mimetic musculature, involving its function in facial displays and its evolution, were essentially linear "scala naturae" models of increasing complexity. More-recent work has challenged these ideas, suggesting that ecological factors and social systems have played a much larger role in explaining the diversity of structures than previously believed. The present review synthesizes the evidence from gross muscular, microanatomical, behavioral and neurobiological studies in order to provide a preliminary analysis of the factors responsible for the evolution of primate facial musculature with comparisons to general mammals. In addition, the unique structure, function and evolution of human mimetic musculature are discussed, along with the potential influential roles of human speech and eye gaze.
Affiliation(s)
- Anne M Burrows
- Department of Physical Therapy, Duquesne University, PA and Department of Anthropology, University of Pittsburgh, PA, USA.