1. Hur MS, Lee S, Jung HS, Schneider RA. Crossing fibers may underlie the dynamic pulling forces of muscles that attach to cartilage at the tip of the nose. Sci Rep 2023; 13:18948. PMID: 37919340; PMCID: PMC10622497; DOI: 10.1038/s41598-023-45781-1.
Abstract
The present study used microdissection, histology, and microcomputed tomography (micro-CT) to determine the prevalence and patterns of the depressor septi nasi (DSN) and orbicularis oris (OOr) muscles attached to the footplate of the medial crus (fMC) of the major alar cartilage, focusing on their crossing fibers. The DSN and OOr attached to the fMC of the major alar cartilage were investigated in 76 samples from 38 embalmed Korean adult cadavers (20 males, 18 females; mean age 70 years). The DSN, OOr, or both were attached to the fMC. When the DSN ran unilaterally or was absent, some OOr fibers ascended to attach to the fMC in place of the DSN in 20.6% of the samples. Crossing fibers of the DSN or OOr attached to the fMC were found in 82.4% of the samples; bilateral and unilateral crossing fibers were found in 32.4% and 50.0%, respectively, and no crossing fibers were found in 17.6%. The DSN and OOr that attached to the fMC could be categorized into six types according to the presence of the DSN and the crossing patterns of the DSN and OOr. These anatomical findings were confirmed by histology and micro-CT imaging. They offer insights into the anatomical mechanisms that may underlie the dynamic pulling forces generated by muscles attaching to the fMCs and into the evolutionary variation observed in human facial expressions, and they provide useful information for guiding rhinoplasty of the nasal tip.
Affiliation(s)
- Mi-Sun Hur
- Department of Anatomy, Daegu Catholic University School of Medicine, Daegu, Korea
- Seunggyu Lee
- Division of Applied Mathematical Sciences, Korea University, Sejong, Korea
- Biomedical Mathematics Group, Institute for Basic Science, Daejeon, Korea
- Han-Sung Jung
- Division in Anatomy and Developmental Biology, Department of Oral Biology, Taste Research Center, BK21 FOUR Project, Oral Science Research Center, Yonsei University College of Dentistry, Seoul, Korea
- Richard A Schneider
- Department of Orthopaedic Surgery, University of California at San Francisco, 513 Parnassus Avenue, S-1161, San Francisco, CA, 94143-0514, USA
2. Zijlstra TW, van Berlo E, Kret ME. Attention Towards Pupil Size in Humans and Bonobos (Pan paniscus). Affect Sci 2022; 3:761-771. PMID: 36519142; PMCID: PMC9743857; DOI: 10.1007/s42761-022-00146-1.
Abstract
Previous work has established that humans have an attentional bias towards emotional signals, and there is some evidence that this phenomenon is shared with bonobos, our closest relatives. Although many emotional signals are explicit and overt, implicit cues such as pupil size also contain emotional information for observers. Pupil size can impact social judgment, foster trust and social support, and is automatically mimicked, suggesting a communicative role. While an attentional bias towards more obvious emotional expressions has been shown, it is unclear whether this bias extends to a more subtle implicit cue such as changes in pupil size. The current study therefore investigated whether attention is biased towards pupils of differing sizes in humans and bonobos. A total of 150 human participants (141 female), with a mean age of 19.13 years (range 18-32), completed an online dot-probe task. Four female bonobos (6 to 17 years old) completed the dot-probe task presented via a touch screen. We used linear mixed multilevel models to examine the effect of pupil size on reaction times. In humans, our analysis showed a small but significant attentional bias towards dilated pupils compared with intermediate-sized pupils, and towards intermediate-sized pupils compared with small pupils. Our analysis did not show a significant effect in bonobos. These results suggest that the attentional bias towards emotions in humans extends to a subtle, unconsciously produced signal, namely changes in pupil size. Due to methodological differences between the two experiments, more research is needed before drawing a conclusion regarding bonobos. Supplementary information: The online version contains supplementary material available at 10.1007/s42761-022-00146-1.
Affiliation(s)
- T. W. Zijlstra
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, the Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, the Netherlands
- E. van Berlo
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, the Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, the Netherlands
- Institute for Biodiversity and Ecosystem Dynamics, University of Amsterdam, Amsterdam, the Netherlands
- M. E. Kret
- Cognitive Psychology Unit, Institute of Psychology, Leiden University, Leiden, the Netherlands
- Leiden Institute for Brain and Cognition (LIBC), Leiden, the Netherlands
3. Jeganathan J, Campbell M, Hyett M, Parker G, Breakspear M. Quantifying dynamic facial expressions under naturalistic conditions. eLife 2022; 11:79581. PMID: 36043464; PMCID: PMC9439684; DOI: 10.7554/elife.79581.
Abstract
Facial affect is expressed dynamically – a giggle, grimace, or an agitated frown. However, the characterisation of human affect has relied almost exclusively on static images. This approach cannot capture the nuances of human communication or support the naturalistic assessment of affective disorders. Using the latest in machine vision and systems modelling, we studied dynamic facial expressions of people viewing emotionally salient film clips. We found that the apparent complexity of dynamic facial expressions can be captured by a small number of simple spatiotemporal states – composites of distinct facial actions, each expressed with a unique spectral fingerprint. Sequential expression of these states is common across individuals viewing the same film stimuli but varies in those with the melancholic subtype of major depressive disorder. This approach provides a platform for translational research, capturing dynamic facial expressions under naturalistic conditions and enabling new quantitative tools for the study of affective disorders and related mental illnesses.
Affiliation(s)
- Jayson Jeganathan
- School of Psychology, University of Newcastle Australia, Newcastle, Australia
- Megan Campbell
- School of Psychology, University of Newcastle Australia, Newcastle, Australia
- Matthew Hyett
- School of Psychological Sciences, University of Western Australia, Perth, Australia
- Gordon Parker
- School of Psychiatry, University of New South Wales, Kensington, Australia
- Michael Breakspear
- School of Psychology, University of Newcastle Australia, Newcastle, Australia
4. Facial hair may slow detection of happy facial expressions in the face in the crowd paradigm. Sci Rep 2022; 12:5911. PMID: 35396450; PMCID: PMC8993935; DOI: 10.1038/s41598-022-09397-1.
Abstract
Human visual systems have evolved to extract ecologically relevant information from complex scenery. In some cases, the face in the crowd visual search task demonstrates an anger superiority effect, where anger is allocated preferential attention. Across three studies (N = 419), we tested whether facial hair guides attention in visual search and influences the speed of detecting angry and happy facial expressions in large arrays of faces. In Study 1, participants were faster to search through clean-shaven crowds and detect bearded targets than to search through bearded crowds and detect clean-shaven targets. In Study 2, targets were angry and happy faces presented in neutral backgrounds, and the facial hair of the target faces was manipulated. An anger superiority effect emerged that was augmented by the presence of facial hair, driven by the slower detection of happiness on bearded faces. In Study 3, targets were happy and angry faces presented in either bearded or clean-shaven backgrounds, and the facial hair of the background faces was systematically manipulated. A significant anger superiority effect was revealed, although it was not moderated by the target's facial hair; rather, the effect was larger in clean-shaven than in bearded face backgrounds. Together, the results suggest that facial hair does influence the detection of emotional expressions in visual search. However, rather than facilitating an anger superiority effect as part of a potential threat-detection system, facial hair may reduce the detection of happy faces within the face in the crowd paradigm.
5. Menting-Henry S, Hidalgo-Lopez E, Aichhorn M, Kronbichler M, Kerschbaum H, Pletzer B. Oral Contraceptives Modulate the Relationship Between Resting Brain Activity, Amygdala Connectivity and Emotion Recognition – A Resting State fMRI Study. Front Behav Neurosci 2022; 16:775796. PMID: 35368304; PMCID: PMC8967165; DOI: 10.3389/fnbeh.2022.775796.
Abstract
Recent research into the effects of hormonal contraceptives on emotion processing and brain function suggests that hormonal contraceptive users show (a) reduced accuracy in recognizing emotions compared to naturally cycling women, and (b) alterations in amygdala volume and connectivity at rest. To date, these observations have not been linked, although the amygdala has been identified as a core region activated during emotion recognition. To assess whether the volume, oscillatory activity, and connectivity of emotion-related brain areas at rest are predictive of participants' ability to recognize facial emotional expressions, 72 participants (20 men, 20 naturally cycling women, 16 users of androgenic contraceptives, 16 users of anti-androgenic contraceptives) completed a structural and resting-state fMRI scan, as well as an emotion recognition task. Our results showed that resting brain characteristics did not mediate oral contraceptive effects on emotion recognition performance. However, sex and oral contraceptive use emerged as moderators of brain-behavior associations. Sex differences emerged in the prediction of emotion recognition performance by the left amygdala amplitude of low-frequency fluctuations (ALFF) for anger, as well as by left and right amygdala connectivity for fear. Users of anti-androgenic oral contraceptives (OCs) stood out in that they showed strong brain-behavior associations, usually in the direction opposite to that of naturally cycling women, while androgenic OC users showed a pattern similar to, but weaker than, that of naturally cycling women. These results suggest that amygdala ALFF and connectivity have predictive value for facial emotion recognition, with the importance of the different connections depending heavily on sex hormones and oral contraceptive use.
Affiliation(s)
- Shanice Menting-Henry
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Esmeralda Hidalgo-Lopez
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Markus Aichhorn
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Martin Kronbichler
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
- Neuroscience Institute, Paracelsus Medical University, Salzburg, Austria
- Hubert Kerschbaum
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Biosciences, University of Salzburg, Salzburg, Austria
- Belinda Pletzer
- Center for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria
- Department of Psychology, University of Salzburg, Salzburg, Austria
6. Franz M, Müller T, Hahn S, Lundqvist D, Rampoldt D, Westermann JF, Nordmann MA, Schäfer R. Creation and validation of the Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE). PLoS One 2021; 16:e0260871. PMID: 34874965; PMCID: PMC8651117; DOI: 10.1371/journal.pone.0260871.
Abstract
The immediate detection and correct processing of affective facial expressions are among the most important competences in social interaction and thus a central subject in emotion and affect research. Studies in these research domains generally use pictures of adults displaying affective facial expressions as experimental stimuli. However, studies investigating developmental psychology and attachment behaviour require age-matched stimuli in which children display the affective expressions. PSYCAFE is a newly developed picture set of children's faces. It includes reference portraits of girls and boys aged 4 to 6 years, averaged digitally from different individual pictures that were assigned by cluster analysis to six basic affects (fear, disgust, happiness, sadness, anger and surprise) plus a neutral facial expression. This procedure led to deindividualized, affect-prototypical portraits. Individual affect-expressive portraits of adults from an already validated picture set (KDEF) were processed in a similar way to create affect-prototypical images of adults as well. The stimulus set has been validated with human observers and includes emotion recognition accuracy rates as well as intensity, authenticity and likeability ratings for the specific affect displayed. Moreover, the stimuli have been characterized with the iMotions Facial Expression Analysis Module, providing additional probability values representing the likelihood that a stimulus depicts the expected affect. Finally, the validation data from human observers and iMotions are compared with data on the facial mimicry of healthy adults in response to these portraits, measured by facial EMG (m. zygomaticus major and m. corrugator supercilii).
Affiliation(s)
- Matthias Franz
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
- Tobias Müller
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
- Sina Hahn
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
- Daniel Lundqvist
- Karolinska Institute, Department of Clinical Neuroscience, NatMEG, Solna, Sweden
- Dirk Rampoldt
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
- Jan-Frederik Westermann
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
- Marc A. Nordmann
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
- Ralf Schäfer
- Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy (15.16), University Hospital Düsseldorf, Düsseldorf, Germany
7. Liu L, du Toit M, Weidemann G. Infants are sensitive to cultural differences in emotions at 11 months. PLoS One 2021; 16:e0257655. PMID: 34591863; PMCID: PMC8483341; DOI: 10.1371/journal.pone.0257655.
Abstract
A myriad of emotion perception studies have shown infants' ability to discriminate between emotional categories, yet there has been little investigation of infants' perception of cultural differences in emotions, so little is known about the extent to which culture-specific emotion information is recognised early in life. Caucasian Australian infants aged 10-12 months participated in a visual paired-comparison task in which their preferential looking patterns to three types of infant-directed emotions (anger, happiness, surprise) from two cultures (Australian, Japanese) were examined, with differences in racial appearance controlled. Infants exhibited preferential looking to Japanese over Caucasian Australian mothers' angry and surprised expressions, whereas no difference was observed in trials involving East-Asian Australian mothers. In addition, infants preferred Caucasian Australian mothers' happy expressions. These findings suggest that 11-month-olds are sensitive to cultural differences in spontaneous infant-directed emotional expressions when they are combined with a difference in racial appearance.
Affiliation(s)
- Liquan Liu
- School of Psychology, Western Sydney University, Sydney, Australia
- MARCS Institute for Brain and Behaviour, Western Sydney University, Sydney, Australia
- Center for Multilingualism in Society Across the Lifespan, University of Oslo, Oslo, Norway
- Mieke du Toit
- School of Psychology, Western Sydney University, Sydney, Australia
- Gabrielle Weidemann
- School of Psychology, Western Sydney University, Sydney, Australia
- MARCS Institute for Brain and Behaviour, Western Sydney University, Sydney, Australia
8. Kim SA, Kim SH. Neurocognitive Effects of Preceding Facial Expressions on Perception of Subsequent Emotions. Front Behav Neurosci 2021; 15:683833. PMID: 34393734; PMCID: PMC8363130; DOI: 10.3389/fnbeh.2021.683833.
Abstract
In everyday life, individuals successively and simultaneously encounter multiple stimuli that are emotionally incongruent. Emotional incongruence elicited by preceding stimuli may alter emotional experience with ongoing stimuli. However, the underlying neural mechanisms of the modulatory influence of preceding emotional stimuli on subsequent emotional processing remain unclear. In this study, we examined self-reported and neural responses to negative and neutral pictures whose emotional valence was incongruent with that of preceding images of facial expressions. Twenty-five healthy participants performed an emotional intensity rating task inside a brain scanner. Pictures of negative and neutral scenes appeared, each of which was preceded by a pleasant, neutral, or unpleasant facial expression to elicit a degree of emotional incongruence. Behavioral results showed that emotional incongruence based on preceding facial expressions did not influence ratings of subsequent pictures' emotional intensity. On the other hand, neuroimaging results revealed greater activation of the right dorsomedial prefrontal cortex (dmPFC) in response to pictures that were more emotionally incongruent with preceding facial expressions. The dmPFC had stronger functional connectivity with the right ventrolateral prefrontal cortex (vlPFC) during the presentation of negative pictures that followed pleasant facial expressions compared to those that followed unpleasant facial expressions. Interestingly, increased functional connectivity of the dmPFC was associated with the reduced modulatory influence of emotional incongruence on the experienced intensity of negative emotions. These results indicate that functional connectivity of the dmPFC contributes to the resolution of emotional incongruence, reducing the emotion modulation effect of preceding information on subsequent emotional processes.
Affiliation(s)
- Sang Hee Kim
- Department of Brain and Cognitive Engineering, Korea University, Seoul, South Korea
9. Taubert N, Stettler M, Siebert R, Spadacenta S, Sting L, Dicke P, Thier P, Giese MA. Shape-invariant encoding of dynamic primate facial expressions in human perception. eLife 2021; 10:61197. PMID: 34115584; PMCID: PMC8195610; DOI: 10.7554/elife.61197.
Abstract
Dynamic facial expressions are crucial for communication in primates. Because of the difficulty of controlling the shape and dynamics of facial expressions across species, it is unknown how species-specific facial expressions are perceptually encoded and how they interact with the representation of facial shape. While popular neural network models predict a joint encoding of facial shape and dynamics, the neuromuscular control of faces evolved more slowly than facial shape, suggesting a separate encoding. To investigate these alternative hypotheses, we developed photo-realistic human and monkey heads that were animated with motion capture data from monkeys and humans. Exact control of expression dynamics was accomplished with a Bayesian machine-learning technique. Consistent with our hypothesis, we found that human observers learned cross-species expressions very quickly, with facial dynamics represented largely independently of facial shape. This result supports the co-evolution of the visual processing and motor control of facial expressions, while challenging appearance-based neural network theories of dynamic expression recognition.
Affiliation(s)
- Nick Taubert
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen, Germany
- Michael Stettler
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen, Germany
- International Max Planck Research School for Intelligent Systems (IMPRS-IS), Tübingen, Germany
- Ramona Siebert
- Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Silvia Spadacenta
- Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Louisa Sting
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen, Germany
- Peter Dicke
- Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Peter Thier
- Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
- Martin A Giese
- Section for Computational Sensomotorics, Centre for Integrative Neuroscience & Hertie Institute for Clinical Brain Research, University Clinic Tübingen, Tübingen, Germany
10. Bjork JM, Keyser-Marcus L, Vassileva J, Ramey T, Houghton DC, Moeller FG. Social Information Processing in Substance Use Disorders: Insights From an Emotional Go-Nogo Task. Front Psychiatry 2021; 12:672488. PMID: 34122188; PMCID: PMC8193089; DOI: 10.3389/fpsyt.2021.672488.
Abstract
Positive social connections are crucial for recovery from substance use disorder (SUD). Of interest is understanding the potential social information processing (SIP) mediators of this effect. To explore whether persons with different SUDs show idiosyncratic biases toward social signals, we administered an emotional go-nogo task (EGNG) to 31 individuals with Cocaine Use Disorder (CoUD), 31 with Cannabis Use Disorder (CaUD), 79 with Opioid Use Disorder (OUD), and 58 controls. Participants were instructed to respond to emotional faces (fearful/happy) but withhold responses to expressionless faces in two task blocks, with the reverse instruction in the other two blocks. Emotional faces as non-targets elicited more "false alarm" (FA) commission errors as a main effect. Groups did not differ in overall rates of hits (correct responses to target faces), but participants with CaUD and CoUD showed reduced hit rates (relative to controls) when expressionless faces were targets. OUD participants had worse hit rates and slower reaction times (RT) when fearful faces (but not happy faces) were targets. CaUD participants were most affected by instruction effects (respond/"go" vs withhold response/"no-go" to emotional faces) on the discriminability statistic A. Participants were faster to respond to happy face targets than to expressionless faces; however, this pattern was reversed in fearful face blocks in OUD and CoUD participants. This experiment replicated previous findings of the greater salience of expressive face images and extended them to SUD, where persons with CaUD may show an even greater bias toward emotional faces. Conversely, OUD participants showed idiosyncratic behavior in response to fearful faces, suggestive of increased attentional disruption by fear. These data suggest a mechanism by which positive social signals may contribute to recovery.
Affiliation(s)
- James M. Bjork
- Institute for Drug and Alcohol Studies, Virginia Commonwealth University, Richmond, VA, United States
- Lori Keyser-Marcus
- Institute for Drug and Alcohol Studies, Virginia Commonwealth University, Richmond, VA, United States
- Jasmin Vassileva
- Institute for Drug and Alcohol Studies, Virginia Commonwealth University, Richmond, VA, United States
- Tatiana Ramey
- Division of Therapeutics and Medical Consequences, National Institute on Drug Abuse, Bethesda, MD, United States
- David C. Houghton
- Department of Psychiatry and Behavioral Sciences, Center for Addiction Research, University of Texas Medical Branch, Galveston, TX, United States
- F. Gerard Moeller
- Institute for Drug and Alcohol Studies, Virginia Commonwealth University, Richmond, VA, United States
11. Carrard V. Non-verbal Adaptation to the Interlocutors' Inner Characteristics: Relevance, Challenges, and Future Directions. Front Psychol 2021; 12:612664. PMID: 33959067; PMCID: PMC8093557; DOI: 10.3389/fpsyg.2021.612664.
Abstract
Human diversity cannot be denied. In our everyday social interactions, we constantly experience the fact that each individual is a unique combination of characteristics, with specific cultural norms, roles, personality, and mood. Efficient social interaction thus requires adapting communication behaviors to each specific interlocutor one encounters. This is especially true for non-verbal communication, which is more unconscious and automatic than verbal communication. Consequently, non-verbal communication needs to be understood as a dynamic and adaptive process in the theoretical modeling and study of social interactions. This perspective paper presents the relevance, challenges, and future directions of the study of non-verbal adaptation in social interactions. It proposes that non-verbal adaptability is more pertinently studied as adaptation to an interlocutor's inner characteristics (i.e., expectations or preferences) than to an interlocutor's behaviors per se, because behaviors are communication messages that individuals interpret in order to understand their interlocutors. The affiliation and control dimensions of the Interpersonal Circumplex Model are proposed as a framework to measure both the interlocutors' inner characteristics (self-reported) and the individuals' non-verbal responses (rated by external coders). These measures can then be compared across different interactions to assess actual changes in behavior tailored to different interlocutors. These recommendations are offered in the hope of generating more research on non-verbal adaptability. Indeed, having gathered evidence on the average effects of non-verbal behaviors, the field can go beyond a "one size fits all" approach by investigating the predictors, moderators, and outcomes of non-verbal adaptation to the interlocutors' inner characteristics.
Affiliation(s)
- Valerie Carrard
- Swiss Paraplegic Research (SPF), Nottwil, Switzerland
- Department of Health Sciences and Medicine, University of Lucerne, Lucerne, Switzerland
12. Chen L, Jiang J, Li X, Ding J, Paterson KB, Rao LL. Beyond Smiles: Static Expressions in Maxillary Protrusion and Associated Positivity. Front Psychol 2021; 12:514016. PMID: 33859586; PMCID: PMC8042222; DOI: 10.3389/fpsyg.2021.514016.
Abstract
Smiles play an important role in social perception. However, it is unclear whether a similar role is played by static facial features associated with smiles (e.g., a stretched mouth and visible teeth). In dental science, maxillary dental protrusions increase the baring of the teeth and thus produce partial facial features of a smile even when the individual is not choosing to smile, whereas mandibular dental protrusions do not. We conducted three experiments to assess whether individuals ascribe positive evaluations to these facial features, which are not genuine emotional expressions. In Experiment 1, participants viewed facial photographs of maxillary and mandibular protrusions and indicated the smiling and emotional status of the faces. While no difference was observed in participants' perception of the presence of a smile across the two types of dental protrusion, participants responded more positively to faces with maxillary than with mandibular protrusions. In Experiment 2, participants completed an Implicit Association Test (IAT) measuring implicit attitudes toward faces with maxillary vs. mandibular protrusions; they held more positive attitudes toward faces with maxillary protrusions. In Experiment 3, individuals with either maxillary or mandibular protrusions completed the same IAT to assess whether the preference would be affected by in-group/out-group biases. Both groups had more positive attitudes toward faces with maxillary protrusions, indicating that this preference is independent of group membership. These findings suggest that facial features associated with smiles are viewed positively in social situations. We discuss this in terms of the social-function account.
Affiliation(s)
- Lijing Chen
- School of Psychology, Fujian Normal University, Fuzhou, China; CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Jiuhui Jiang
- School of Stomatology, Peking University, Beijing, China
- Xingshan Li
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Jinfeng Ding
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Kevin B Paterson
- Department of Neuroscience, Psychology and Behaviour, University of Leicester, Leicester, United Kingdom
- Li-Lin Rao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
|
13
|
Drimalla H, Baskow I, Behnia B, Roepke S, Dziobek I. Imitation and recognition of facial emotions in autism: a computer vision approach. Mol Autism 2021; 12:27. [PMID: 33823922 PMCID: PMC8025560 DOI: 10.1186/s13229-021-00430-0] [Received: 05/14/2019] [Accepted: 03/01/2021]
Abstract
Background: Imitation of facial expressions plays an important role in social functioning. However, little is known about the quality of facial imitation in individuals with autism and its relationship with their defining difficulties in emotion recognition. Methods: We investigated imitation and recognition of facial expressions in 37 individuals with autism spectrum conditions and 43 neurotypical controls. Using a novel computer-based face analysis, we measured instructed imitation of facial emotional expressions and related it to emotion recognition abilities. Results: Individuals with autism imitated facial expressions when instructed to do so, but their imitation was both slower and less precise than that of neurotypical individuals. In both groups, more precise imitation scaled positively with participants' accuracy of emotion recognition. Limitations: Given the study's focus on adults with autism without intellectual impairment, it is unclear whether the results generalize to children with autism or to individuals with intellectual disability. Further, the new automated facial analysis, despite being less intrusive than electromyography, might be less sensitive. Conclusions: Group differences in emotion recognition, imitation and their interrelationships highlight potential targets for the treatment of social interaction problems in individuals with autism.
Affiliation(s)
- Hanna Drimalla
- Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany; Clinical Psychology of Social Interaction, Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany; Digital Health Center, Hasso Plattner Institute, University of Potsdam, Am Neuen Palais 10, 14469, Potsdam, Germany; Multimodal Behavior Processing, Faculty of Technology, Bielefeld University, Inspiration 1, 33619, Bielefeld, Germany
- Irina Baskow
- Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany; Department of Psychiatry and Psychotherapy, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Campus Benjamin Franklin, Hindenburgdamm 30, 12203, Berlin, Germany
- Behnoush Behnia
- Department of Psychiatry and Psychotherapy, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Campus Benjamin Franklin, Hindenburgdamm 30, 12203, Berlin, Germany
- Stefan Roepke
- Department of Psychiatry and Psychotherapy, Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin and Humboldt-Universität zu Berlin, Campus Benjamin Franklin, Hindenburgdamm 30, 12203, Berlin, Germany
- Isabel Dziobek
- Department of Psychology, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany; Clinical Psychology of Social Interaction, Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099, Berlin, Germany
|
14
|
Tereshenko V, Dotzauer DC, Maierhofer U, Festin C, Luft M, Laengle G, Politikou O, Klein HJ, Blumer R, Aszmann OC, Bergmeister KD. Selective Denervation of the Facial Dermato-Muscular Complex in the Rat: Experimental Model and Anatomical Basis. Front Neuroanat 2021; 15:650761. [PMID: 33828465 PMCID: PMC8019738 DOI: 10.3389/fnana.2021.650761] [Received: 01/07/2021] [Accepted: 03/01/2021]
Abstract
The facial dermato-muscular system consists of highly specialized muscles that adhere tightly to the overlying skin, forming a complex morphological conglomerate. This is the anatomical and functional basis for versatile facial expressions, which are essential for human social interaction. The neural innervation of the facial skin and muscles occurs via branches of the trigeminal and facial nerves. These are also the most commonly pathologically affected cranial nerves, often requiring surgical treatment. Hence, experimental models for researching these nerves and their pathologies are highly relevant for studying pathophysiology and nerve regeneration, yet models for the distinctive investigation of the complex afferent and efferent interplay within facial structures are scarce. In this study, we established a robust surgical model for the distinctive exploration of facial structures after complete elimination of afferent or efferent innervation in the rat. Animals were allocated into two groups according to the surgical procedure: in the first group the facial nerve, and in the second all distal cutaneous branches of the trigeminal nerve, were transected unilaterally. All animals survived, and the procedures caused no higher burden. Whisker pad movements were documented with video recordings 4 weeks after surgery and confirmed successful denervation. Whole-mount immunofluorescent staining of facial muscles was performed to visualize the innervation pattern of the neuromuscular junctions. Comprehensive quantitative analysis revealed large differences in afferent axon counts among the cutaneous branches of the trigeminal nerve: axon number was highest in the infraorbital nerve (28,625 ± 2,519), followed by the mental nerve (3,062 ± 341), the supraorbital nerve (2,131 ± 413), and the cutaneous branch of the mylohyoid nerve (343 ± 78). Overall, this surgical model is robust and reliable for distinctive surgical deafferentation or deefferentation of the face. It may be used for investigating cortical plasticity and the neurobiological mechanisms behind various clinically relevant conditions such as facial paralysis or trigeminal neuralgia, as well as local anesthesia in the face and oral cavity.
Affiliation(s)
- Vlad Tereshenko
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Center for Biomedical Research, Medical University of Vienna, Vienna, Austria
- Dominik C Dotzauer
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria
- Udo Maierhofer
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Center for Biomedical Research, Medical University of Vienna, Vienna, Austria
- Christopher Festin
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Center for Biomedical Research, Medical University of Vienna, Vienna, Austria
- Matthias Luft
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Center for Biomedical Research, Medical University of Vienna, Vienna, Austria
- Gregor Laengle
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Center for Biomedical Research, Medical University of Vienna, Vienna, Austria
- Olga Politikou
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Center for Biomedical Research, Medical University of Vienna, Vienna, Austria
- Holger J Klein
- Department of Plastic Surgery and Hand Surgery, University Hospital Zurich, Zurich, Switzerland
- Roland Blumer
- Center for Anatomy and Cell Biology, Medical University of Vienna, Vienna, Austria
- Oskar C Aszmann
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria
- Konstantin D Bergmeister
- Clinical Laboratory for Bionic Extremity Reconstruction, Department of Plastic, Reconstructive and Aesthetic Surgery, Medical University of Vienna, Vienna, Austria; Department of Plastic, Aesthetic and Reconstructive Surgery, University Hospital St. Poelten, Karl Landsteiner University of Health Sciences, Krems, Austria
|
15
|
Webb ALM. Reversing the Luminance Polarity of Control Faces: Why Are Some Negative Faces Harder to Recognize, but Easier to See? Front Psychol 2021; 11:609045. [PMID: 33551920 PMCID: PMC7858267 DOI: 10.3389/fpsyg.2020.609045] [Received: 09/22/2020] [Accepted: 12/15/2020]
Abstract
Control stimuli are key for understanding the extent to which face processing relies on holistic processing and affective evaluation versus the encoding of low-level image properties. Luminance polarity (LP) reversal combined with face inversion is a popular tool for severely disrupting the recognition of face controls. However, recent findings demonstrate visibility-recognition trade-offs for LP-reversed faces, which sometimes appear more salient despite being harder to recognize. The present report brings together findings from image analysis, simple stimuli, and behavioral data on facial recognition and visibility, in an attempt to disentangle instances where LP-reversed control faces are associated with a performance bias in terms of their perceived salience. These findings have important implications for studies of subjective face appearance and highlight that future research must be aware of behavioral artifacts arising from possible trade-off effects.
Affiliation(s)
- Abigail L M Webb
- Department of Psychology, University of Essex, Colchester, United Kingdom
|
16
|
Kawulok M, Nalepa J, Kawulok J, Smolka B. Dynamics of facial actions for assessing smile genuineness. PLoS One 2021; 16:e0244647. [PMID: 33400708 PMCID: PMC7785114 DOI: 10.1371/journal.pone.0244647] [Received: 08/07/2020] [Accepted: 12/14/2020]
Abstract
Applying computer vision techniques to distinguish between spontaneous and posed smiles is an active research topic in affective computing. Although many works addressing this problem have been published and several excellent benchmark databases have been created, existing state-of-the-art approaches do not exploit the action units defined within the Facial Action Coding System, which has become a standard in facial expression analysis. In this work, we explore the possibilities of extracting discriminative features directly from the dynamics of facial action units to differentiate between genuine and posed smiles. We report the results of an experimental study showing that the proposed features offer performance competitive with features based on facial landmark analysis and on textural descriptors extracted from spatio-temporal blocks. We make these features publicly available for the UvA-NEMO and BBC databases, which will allow other researchers to further improve the classification scores while preserving the interpretation capabilities attributed to the use of facial action units. Moreover, we have developed a new technique for identifying the smile phases, which is robust to noise and allows for continuous analysis of facial videos.
Affiliation(s)
- Michal Kawulok
- Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Gliwice, Poland
- Jakub Nalepa
- Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Gliwice, Poland
- Jolanta Kawulok
- Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Gliwice, Poland
- Bogdan Smolka
- Faculty of Automatic Control, Electronics and Computer Science, Silesian University of Technology, Gliwice, Poland
|
17
|
Donnier S, Kovács G, Oña LS, Bräuer J, Amici F. Experience has a limited effect on humans' ability to predict the outcome of social interactions in children, dogs and macaques. Sci Rep 2020; 10:21240. [PMID: 33277580 PMCID: PMC7718882 DOI: 10.1038/s41598-020-78275-5] [Received: 07/29/2020] [Accepted: 11/23/2020]
Abstract
The ability to predict others' behaviour represents a crucial mechanism that allows individuals to react faster and more appropriately. To date, several studies have investigated humans' ability to predict conspecifics' behaviour, but little is known about our ability to predict behaviour in other species. Here, we aimed to test humans' ability to predict social behaviour in dogs, macaques and humans, and to assess the roles played by experience and evolution in the emergence of this ability. For this purpose, we presented participants with short videoclips of real-life social interactions in dog, child and macaque dyads, and then asked them to predict the outcome of the observed interactions (i.e. aggressive, neutral or playful). Participants were selected according to their previous species-specific experience with dogs, children and non-human primates. Our results showed a limited effect of experience on the ability to predict the outcome of social interactions, which was mainly restricted to macaques. Moreover, we found no support for the co-domestication hypothesis, in that participants were not especially skilled at predicting dog behaviour. Finally, aggressive outcomes in dogs were predicted significantly worse than playful or neutral ones. Based on our findings, we suggest possible lines for future research, such as the inclusion of other primate species and the assessment of cultural influences on the ability to predict behaviour across species.
Affiliation(s)
- Sasha Donnier
- Fundació UdG: Innovació I Formació, Universitat de Girona, Carrer Pic de Peguera 11, 17003, Girona, Spain
- Gyula Kovács
- Institute of Psychology, Friedrich Schiller University Jena, Leutragraben 1, 07743, Jena, Germany
- Linda S Oña
- Max Planck Research Group 'Naturalistic Social Cognition', Max Planck Institute for Human Development, Berlin, Germany
- Juliane Bräuer
- Institute of Psychology, Friedrich Schiller University Jena, Leutragraben 1, 07743, Jena, Germany; Max-Planck-Institute for the Science of Human History, Jena, Germany
- Federica Amici
- Department of Human Behavior, Ecology and Culture, Research Group "Primate Behavioural Ecology", Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany; Institute of Biology, Behavioral Ecology Research Group, University of Leipzig Faculty of Life Science, Leipzig, Germany
|
18
|
Ruan QN, Liang J, Hong JY, Yan WJ. Focusing on Mouth Movement to Improve Genuine Smile Recognition. Front Psychol 2020; 11:1126. [PMID: 32848960 PMCID: PMC7399707 DOI: 10.3389/fpsyg.2020.01126] [Received: 01/31/2020] [Accepted: 05/04/2020]
Abstract
Smiles are the facial expressions most commonly and frequently used by human beings. Some scholars have claimed that the low accuracy in recognizing genuine smiles is explained by the perceptual-attentional hypothesis: observers either do not pay attention to the responsible cues or are unable to recognize them (usually the Duchenne marker, AU6, which appears as a contraction of the muscles in the eye region). We investigated whether training (instructing participants to pay attention either to the Duchenne marker or to mouth movement) might help improve the recognition of genuine smiles, in terms of both accuracy and confidence. Results indicated that attention to mouth movement improves participants' ability to distinguish between genuine and posed smiles, while ruling out alternative explanations such as sample distribution and the intensity of lip pulling (AU12). Generalization of this conclusion requires further investigation. This study further argues that the perceptual-attentional hypothesis can explain the recognition of smile genuineness.
Affiliation(s)
- Jing Liang
- School of Educational Science, Ludong University, Yantai, China
- Jin-Yu Hong
- College of Education, Wenzhou University, Wenzhou, China
- Wen-Jing Yan
- College of Education, Wenzhou University, Wenzhou, China
|
19
|
Gupta T, Haase CM, Strauss GP, Cohen AS, Ricard JR, Mittal VA. Alterations in facial expressions of emotion: Determining the promise of ultrathin slicing approaches and comparing human and automated coding methods in psychosis risk. Emotion 2020; 22:714-724. [PMID: 32584067 DOI: 10.1037/emo0000819]
Abstract
Alterations in facial expressions of emotion are a hallmark of psychopathology and may be present before the onset of mental illness. Technological advances have spurred interest in examining alterations based on "thin slices" of behavior using automated approaches. However, questions remain. First, can alterations be detected in ultrathin slices of behavior? Second, how do automated approaches converge with human coding techniques? The present study examined ultrathin (i.e., 1-min) slices of video-recorded clinical interviews of 42 individuals at clinical high risk (CHR) for psychosis and 42 matched controls. Facial expressions of emotion (e.g., joy, anger) were examined using two automated facial analysis programs and coded by trained human raters (using the Expressive Emotional Behavior Coding System). Results showed that ultrathin slices of behavior were sufficient to reveal alterations in facial expressions of emotion, specifically blunted joy expressions in individuals at CHR (with supplementary analyses probing links with attenuated positive symptoms and functioning). Furthermore, both automated analysis programs converged in the ability to detect blunted joy expressions and were consistent with human coding at the level of both second-by-second and aggregate data. Finally, there were areas of divergence across approaches for other emotional expressions beyond joy. These data suggest that ultrathin slices of behavior can yield clues about emotional dysfunction. Further, automated approaches (which do not require lengthy training and coder time, and lend themselves well to mobile assessment and computational modeling) show promise, but careful evaluation of convergence with human coding is needed. (PsycInfo Database Record (c) 2020 APA, all rights reserved).
|
20
|
Mühlenbeck C, Pritsch C, Wartenburger I, Telkemeyer S, Liebal K. Attentional Bias to Facial Expressions of Different Emotions - A Cross-Cultural Comparison of ≠Akhoe Hai||om and German Children and Adolescents. Front Psychol 2020; 11:795. [PMID: 32411056 PMCID: PMC7199105 DOI: 10.3389/fpsyg.2020.00795] [Received: 12/19/2019] [Accepted: 03/31/2020]
Abstract
The attentional bias to negative information enables humans to quickly identify and respond appropriately to potentially threatening situations. Because of its adaptive function, this enhanced sensitivity to negative information is expected to represent a universal trait, shared by all humans regardless of their cultural background. However, existing research focuses almost exclusively on humans from Western industrialized societies, who are not representative of the human species. Therefore, we compared humans from two distinct cultural contexts: adolescents and children from Germany, a Western industrialized society, and from the ≠Akhoe Hai||om, semi-nomadic hunter-gatherers in Namibia. We predicted that both groups would show an attentional bias toward negative facial expressions as compared to neutral or positive faces. We used eye-tracking to measure their fixation durations on facial expressions depicting different emotions, including negative (fear, anger), positive (happy), and neutral faces. Both the Germans and the ≠Akhoe Hai||om gazed longer at fearful faces but for shorter durations at angry faces, challenging the notion of a general bias toward negative emotions. For happy faces, fixation durations varied between the two groups, suggesting more flexibility in the response to positive emotions. Our findings emphasize the need to place research on emotion perception into an evolutionary, cross-cultural comparative framework that considers the adaptive significance of specific emotions, rather than simply differentiating between positive and negative information, and that enables systematic comparisons across participants from diverse cultural backgrounds.
Affiliation(s)
- Cordelia Mühlenbeck
- Department of Psychology, Brandenburg Medical School Theodor Fontane, Neuruppin, Germany; Comparative Developmental Psychology, Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
- Carla Pritsch
- Graduate School "Languages of Emotion", Freie Universität Berlin, Berlin, Germany
- Isabell Wartenburger
- Department of Linguistics, Cognitive Sciences, University of Potsdam, Potsdam, Germany
- Silke Telkemeyer
- Department of Linguistics, Cognitive Sciences, University of Potsdam, Potsdam, Germany
- Katja Liebal
- Comparative Developmental Psychology, Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
|
21
|
Girard JM, Shandar G, Liu Z, Cohn JF, Yin L, Morency LP. Reconsidering the Duchenne Smile: Indicator of Positive Emotion or Artifact of Smile Intensity? Proceedings of the International Conference on Affective Computing and Intelligent Interaction (ACII) 2019:594-599. [PMID: 32363090 DOI: 10.1109/acii.2019.8925535]
Abstract
The Duchenne smile hypothesis is that smiles that include eye constriction (AU6) are the product of genuine positive emotion, whereas smiles that do not are either falsified or related to negative emotion. This hypothesis has become very influential and is often used in scientific and applied settings to justify the inference that a smile is either true or false. However, empirical support for this hypothesis has been equivocal and some researchers have proposed that, rather than being a reliable indicator of positive emotion, AU6 may just be an artifact produced by intense smiles. Initial support for this proposal has been found when comparing smiles related to genuine and feigned positive emotion; however, it has not yet been examined when comparing smiles related to genuine positive and negative emotion. The current study addressed this gap in the literature by examining spontaneous smiles from 136 participants during the elicitation of amusement, embarrassment, fear, and pain (from the BP4D+ dataset). Bayesian multilevel regression models were used to quantify the associations between AU6 and self-reported amusement while controlling for smile intensity. Models were estimated to infer amusement from AU6 and to explain the intensity of AU6 using amusement. In both cases, controlling for smile intensity substantially reduced the hypothesized association, whereas the effect of smile intensity itself was quite large and reliable. These results provide further evidence that the Duchenne smile is likely an artifact of smile intensity rather than a reliable and unique indicator of genuine positive emotion.
Affiliation(s)
- Jeffrey M Girard
- Language Technologies Institute, Carnegie Mellon University, Pittsburgh, PA
- Gayatri Shandar
- Language Technologies Institute, Carnegie Mellon University, Pittsburgh, PA
- Zhun Liu
- Language Technologies Institute, Carnegie Mellon University, Pittsburgh, PA
- Jeffrey F Cohn
- Department of Psychology, University of Pittsburgh, Pittsburgh, PA
- Lijun Yin
- Department of Computer Science, Binghamton University, Binghamton, NY
|
22
|
The Priming Effect of a Facial Expression of Surprise on the Discrimination of a Facial Expression of Fear. Curr Psychol 2019. [DOI: 10.1007/s12144-017-9719-0]
|
23
|
Cultural Moderation of Unconscious Hedonic Responses to Food. Nutrients 2019; 11:2832. [PMID: 31752310 PMCID: PMC6893624 DOI: 10.3390/nu11112832] [Received: 08/21/2019] [Revised: 11/14/2019] [Accepted: 11/15/2019]
Abstract
Previous psychological studies have shown that images of food elicit hedonic responses, either consciously or unconsciously, and that participants' cultural experiences moderate conscious hedonic ratings of food. However, whether cultural factors moderate unconscious hedonic responses to food remains unknown. We investigated this issue in Polish and Japanese participants using the subliminal affective priming paradigm. Images of international fast food and domestic Japanese food were presented subliminally as prime stimuli. Participants rated their preferences for the subsequently presented target ideographs. Participants also rated their preferences for supraliminally presented food images. In the subliminal rating task, Polish participants showed higher preference ratings for fast food primes than for Japanese food primes, whereas Japanese participants showed comparable preference ratings across these two conditions. In the supraliminal rating task, both Polish and Japanese participants reported comparable preferences for fast and Japanese food stimuli. These results suggest that cultural experiences moderate unconscious hedonic responses to food, which may not be detected based on explicit ratings.
|
24
|
Amici F, Waterman J, Kellermann CM, Karimullah K, Bräuer J. The ability to recognize dog emotions depends on the cultural milieu in which we grow up. Sci Rep 2019; 9:16414. [PMID: 31712680 PMCID: PMC6848084 DOI: 10.1038/s41598-019-52938-4] [Received: 01/02/2019] [Accepted: 10/26/2019]
Abstract
Inter-specific emotion recognition is especially adaptive when species spend a long time in close association, like dogs and humans. Here, we comprehensively studied the human ability to recognize facial expressions associated with dog emotions (hereafter, emotions). Participants were presented with pictures of dogs, humans and chimpanzees, showing angry, fearful, happy, neutral and sad emotions, and had to assess which emotion was shown, and the context in which the picture had been taken. Participants were recruited among children and adults with different levels of general experience with dogs, resulting from different personal (i.e. dog ownership) and cultural experiences (i.e. growing up or being exposed to a cultural milieu in which dogs are highly valued and integrated in human lives). Our results showed that some dog emotions such as anger and happiness are recognized from early on, independently of experience. However, the ability to recognize dog emotions is mainly acquired through experience. In adults, the probability of recognizing dog emotions was higher for participants grown up in a cultural milieu with a positive attitude toward dogs, which may result in different passive exposure, interest or inclination toward this species.
Affiliation(s)
- Federica Amici
- Research Group "Primate Behavioural Ecology", Department of Human Behavior, Ecology and Culture, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany; Behavioral Ecology Research Group, Institute of Biology, Faculty of Life Science, University of Leipzig, Leipzig, Germany; Leipzig Research Center for Early Child Development, University of Leipzig, Leipzig, Germany
- James Waterman
- School of Psychology, University of Lincoln, Lincoln, UK
- Christina Maria Kellermann
- Leipzig Research Center for Early Child Development, University of Leipzig, Leipzig, Germany; Faculty of Social and Behavioral Sciences, Friedrich Schiller University, Jena, Germany
- Karimullah Karimullah
- Behavioral Ecology Research Group, Institute of Biology, Faculty of Life Science, University of Leipzig, Leipzig, Germany
- Juliane Bräuer
- Department of Linguistic and Cultural Evolution, Max Planck Institute for the Science of Human History, Jena, Germany; Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University, Jena, Germany
|
25
|
Cowen AS, Keltner D. What the face displays: Mapping 28 emotions conveyed by naturalistic expression. Am Psychol 2019; 75:349-364. [PMID: 31204816 DOI: 10.1037/amp0000488]
Abstract
What emotions do the face and body express? Guided by new conceptual and quantitative approaches (Cowen, Elfenbein, Laukka, & Keltner, 2018; Cowen & Keltner, 2017, 2018), we explore the taxonomy of emotion recognized in facial-bodily expression. Participants (N = 1,794; 940 female, ages 18-76 years) judged the emotions captured in 1,500 photographs of facial-bodily expression in terms of emotion categories, appraisals, free response, and ecological validity. We find that facial-bodily expressions can reliably signal at least 28 distinct categories of emotion that occur in everyday life. Emotion categories, more so than appraisals such as valence and arousal, organize emotion recognition. However, categories of emotion recognized in naturalistic facial and bodily behavior are not discrete but bridged by smooth gradients that correspond to continuous variations in meaning. Our results support a novel view that emotions occupy a high-dimensional space of categories bridged by smooth gradients of meaning. They offer an approximation of a taxonomy of facial-bodily expressions, visualized within an online interactive map.
26
Schanz L, Krueger K, Hintze S. Sex and Age Don't Matter, but Breed Type Does-Factors Influencing Eye Wrinkle Expression in Horses. Front Vet Sci 2019; 6:154. [PMID: 31192235 PMCID: PMC6549476 DOI: 10.3389/fvets.2019.00154] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2019] [Accepted: 05/02/2019] [Indexed: 11/13/2022] Open
Abstract
Identifying valid indicators to assess animals' emotional states is a critical objective of animal welfare science. In horses, eye wrinkles above the eyeball have been shown to be affected by pain and other emotional states. From other species we know that individual characteristics, e.g., age in humans, affect facial wrinkles, but it has not yet been investigated whether eye wrinkle expression in horses is systematically affected by such characteristics. Therefore, the aim of this study was to assess how age, sex, breed type, body condition, and coat colour affect the expression and/or the assessment of eye wrinkles in horses. To this end, we adapted the eye wrinkle assessment scale from Hintze et al. (1) and assessed eye wrinkle expression in pictures taken of the left and the right eye of 181 horses in a presumably neutral situation, using five outcome measures: a qualitative first impression reflecting how worried the horse appears to human observers, the extent to which the brow is raised, the number of wrinkles, their markedness, and the angle between a line through both corners of the eye and the topmost wrinkle. All measures could be assessed with high reliability with respect to intra- and inter-observer agreement. Breed type affected the width of the angle [F(2,114) = 8.20, p < 0.001], with thoroughbreds having the narrowest angle (M = 23.80, SD = 1.60), followed by warmbloods (M = 28.00, SD = 0.60) and coldbloods (M = 31.00, SD = 0.90). None of the other characteristics affected any of the outcome measures, and eye wrinkle expression did not differ between the left and the right eye area (all p-values > 0.05). In conclusion, horses' eye wrinkle expression and its assessment in neutral situations were not systematically affected by the investigated characteristics, except for "breed type", which accounted for some variation in "angle"; how much eye wrinkle expression is affected by emotion or perhaps mood needs further investigation and validation.
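The MDC95 values reported above can be related to the ICC through the standard error of measurement. As a hedged sketch (the conventional definition, not necessarily the exact computation used in the paper):

```python
import math

def sem(sd_between_sessions: float, icc: float) -> float:
    """Standard error of measurement from the pooled SD and the test-retest ICC."""
    return sd_between_sessions * math.sqrt(1.0 - icc)

def mdc95(sd_between_sessions: float, icc: float) -> float:
    """95% minimal detectable change: the smallest change exceeding measurement noise."""
    return 1.96 * math.sqrt(2.0) * sem(sd_between_sessions, icc)

# A perfectly reliable measure (ICC = 1) has a detectable-change floor of zero.
print(mdc95(2.0, 1.0))   # 0.0
print(round(mdc95(2.0, 0.5), 2))   # 3.92
```

Higher ICCs shrink the MDC95, which is why reliability and detectable change are reported together.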
Affiliation(s)
- Lisa Schanz
- Division of Livestock Sciences, Department of Sustainable Agricultural Systems, University of Natural Resources and Life Sciences, Vienna, Austria
- Department of Equine Economics, Nuertingen-Geislingen University of Applied Sciences, Nürtingen, Germany
- Konstanze Krueger
- Department of Equine Economics, Nuertingen-Geislingen University of Applied Sciences, Nürtingen, Germany
- Biology I, University of Regensburg, Regensburg, Germany
- Sara Hintze
- Division of Livestock Sciences, Department of Sustainable Agricultural Systems, University of Natural Resources and Life Sciences, Vienna, Austria
27
Orienting asymmetries and physiological reactivity in dogs' response to human emotional faces. Learn Behav 2019; 46:574-585. [PMID: 29923158 DOI: 10.3758/s13420-018-0325-2] [Citation(s) in RCA: 34] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Recent scientific literature shows that emotional cues conveyed by human vocalizations and odours are processed asymmetrically by the canine brain. In the present study, dogs engaged in feeding behaviour were suddenly presented with 2-D stimuli depicting human faces expressing Ekman's six basic emotions (anger, fear, happiness, sadness, surprise, and disgust) plus a neutral expression, simultaneously in the left and right visual hemifields. A bias to turn the head towards the left (right hemisphere) rather than the right side was observed with human faces expressing anger, fear, and happiness, but an opposite bias (left hemisphere) was observed with human faces expressing surprise. Furthermore, dogs displayed higher behavioural and cardiac activity to pictures of human faces expressing clearly aroused emotional states. Overall, the results demonstrate that dogs are sensitive to emotional cues conveyed by human faces, supporting the existence of an asymmetrical emotional modulation of the canine brain in processing basic human emotions.
28
A Review on Automatic Facial Expression Recognition Systems Assisted by Multimodal Sensor Data. Sensors 2019; 19:1863. [PMID: 31003522 PMCID: PMC6514576 DOI: 10.3390/s19081863] [Citation(s) in RCA: 36] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/28/2019] [Revised: 04/15/2019] [Accepted: 04/15/2019] [Indexed: 11/28/2022]
Abstract
Facial Expression Recognition (FER) can be widely applied in various research areas, such as mental disease diagnosis and the detection of human social/physiological interactions. With emerging advanced technologies in hardware and sensors, FER systems have been developed to support real-world application scenes rather than laboratory environments. Although laboratory-controlled FER systems achieve very high accuracy, around 97%, transferring the technology from the laboratory to real-world applications faces a great barrier of very low accuracy, approximately 50%. In this survey, we comprehensively discuss three significant challenges of unconstrained real-world environments, namely illumination variation, head pose, and subject-dependence, which may not be resolved by analysing images/videos alone in the FER system. We focus on sensors that may provide extra information and help FER systems detect emotion in both static images and video sequences. We introduce three categories of sensors that may help improve the accuracy and reliability of an expression recognition system by tackling the challenges mentioned above in pure image/video processing. The first group is detailed-face sensors, which detect small dynamic changes of a face component, such as eye-trackers, which may help differentiate background noise from facial features. The second is non-visual sensors, such as audio, depth, and EEG sensors, which provide extra information in addition to the visual dimension and improve recognition reliability, for example under illumination variation and position shifts. The last is target-focused sensors, such as infrared thermal sensors, which can help FER systems filter useless visual content and may resist illumination variation. We also discuss methods of fusing the different inputs obtained from multimodal sensors in an emotion system.
We comparatively review the most prominent multimodal emotional expression recognition approaches and point out their advantages and limitations. We briefly introduce the benchmark data sets related to FER systems for each category of sensors and extend our survey to the open challenges and issues. Meanwhile, we design a framework of an expression recognition system, which uses multimodal sensor data (provided by the three categories of sensors) to provide complete information about emotions to assist the pure face image/video analysis. We theoretically analyse the feasibility and achievability of our new expression recognition system, especially for the use in the wild environment, and point out the future directions to design an efficient, emotional expression recognition system.
29
Pahnke R, Mau-Moeller A, Junge M, Wendt J, Weymar M, Hamm AO, Lischke A. Oral Contraceptives Impair Complex Emotion Recognition in Healthy Women. Front Neurosci 2019; 12:1041. [PMID: 30804733 PMCID: PMC6378414 DOI: 10.3389/fnins.2018.01041] [Citation(s) in RCA: 25] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2018] [Accepted: 12/21/2018] [Indexed: 12/21/2022] Open
Abstract
Despite the widespread use of oral contraceptives (OCs), remarkably little is known about the effects of OCs on emotion, cognition, and behavior. However, coincidental findings suggest that OCs impair the ability to recognize others’ emotional expressions, which may have serious consequences in interpersonal contexts. To further investigate the effects of OCs on emotion recognition, we tested whether women who were using OCs (n = 42) would be less accurate in the recognition of complex emotional expressions than women who were not using OCs (n = 53). In addition, we explored whether these differences in emotion recognition would depend on women’s menstrual cycle phase. We found that women with OC use were indeed less accurate in the recognition of complex expressions than women without OC use, in particular during the processing of expressions that were difficult to recognize. These differences in emotion recognition did not depend on women’s menstrual cycle phase. Our findings, thus, suggest that OCs impair women’s emotion recognition, which should be taken into account when informing women about the side-effects of OC use.
Affiliation(s)
- Rike Pahnke
- Department of Sport Sciences, University of Rostock, Rostock, Germany
- Anett Mau-Moeller
- Department of Sport Sciences, University of Rostock, Rostock, Germany; Department of Orthopaedics, University Medicine Rostock, Rostock, Germany
- Martin Junge
- Institute for Community Medicine, University Medicine Greifswald, Greifswald, Germany
- Julia Wendt
- Department of Psychology, University of Greifswald, Greifswald, Germany
- Mathias Weymar
- Department of Psychology, University of Potsdam, Potsdam, Germany
- Alfons O Hamm
- Department of Psychology, University of Greifswald, Greifswald, Germany
- Alexander Lischke
- Department of Psychology, University of Greifswald, Greifswald, Germany
30
Yamashiro A, Sorcinelli A, Rahman T, Elbogen R, Curtin S, Vouloumanos A. Shifting Preferences for Primate Faces in Neurotypical Infants and Infants Later Diagnosed With ASD. Autism Res 2019; 12:249-262. [PMID: 30561908 PMCID: PMC6368880 DOI: 10.1002/aur.2043] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2018] [Accepted: 10/22/2018] [Indexed: 11/09/2022]
Abstract
Infants look at others' faces to gather social information. Newborns look equally at human and monkey faces but prefer human faces by 1 month, helping them learn to communicate and interact with others. Infants later diagnosed with autism spectrum disorder (ASD) look at human faces less than neurotypical infants, which may underlie some deficits in social-communication later in life. Here, we asked whether infants later diagnosed with ASD differ in their preferences for both human and nonhuman primate faces compared to neurotypical infants over their first 2 years of life. We compare infants' relative looking times to human or monkey faces paired with nonface controls (Experiment 1) and infants' total looking times to pairs of human and monkey faces (Experiment 2). Across two experiments, we find that between 6 and 18 months, infants later diagnosed with ASD show a greater downturn (decrease after an initial increase) in looking at both primate faces than neurotypical infants. A decrease in attention to primate faces may partly underlie the social-communicative difficulties in children with ASD and could reveal how early perceptual experiences with faces affect development. LAY SUMMARY: Looking at faces helps infants learn to interact with others. Infants look equally at human and monkey faces at birth but prefer human faces by 1 month. Infants later diagnosed with ASD who show deficits in social-communication look at human faces less than neurotypical infants. We find that a downturn (decline after an initial increase) in attention to both human and monkey faces between 6 and 18 months may partly underlie the social-communicative difficulties in children with ASD.
31
Mitrovic A, Goller J, Tinio PPL, Leder H. How relationship status and sociosexual orientation influence the link between facial attractiveness and visual attention. PLoS One 2018; 13:e0207477. [PMID: 30427937 PMCID: PMC6241135 DOI: 10.1371/journal.pone.0207477] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2018] [Accepted: 10/31/2018] [Indexed: 11/18/2022] Open
Abstract
Facial attractiveness captures and binds visual attention, thus affecting visual exploration of our environment. It is often argued that this effect on attention has evolutionary functions related to mating. Although plausible, such perspectives have been challenged by recent behavioral and eye-tracking studies, which have shown that the effect on attention is moderated by various sex- and goal-related variables such as sexual orientation. In the present study, we examined how relationship status and sociosexual orientation moderate the link between attractiveness and visual attention. We hypothesized that attractiveness leads to longer looks and that being single as well as being more sociosexually unrestricted, enhances the effect of attractiveness. Using an eye-tracking free-viewing paradigm, we tested 150 heterosexual men and women looking at images of urban real-world scenes depicting two people differing in facial attractiveness. Participants additionally provided attractiveness ratings of all stimuli. We analyzed the correlations between how long faces were looked at and participants’ ratings of attractiveness and found that more attractive faces—especially of the other sex—were looked at longer. We also found that more sociosexually unrestricted participants who were single had the highest attractiveness-attention correlation. Our results show that evolutionary predictions cannot fully explain the attractiveness-attention correlation; perceiver characteristics and motives moderate this relationship.
Affiliation(s)
- Aleksandra Mitrovic
- Department of Basic Psychological Research and Research Methods, University of Vienna, Vienna, Austria
- Juergen Goller
- Department of Basic Psychological Research and Research Methods, University of Vienna, Vienna, Austria
- Pablo P. L. Tinio
- College of Education and Human Services, Montclair State University, Montclair, New Jersey, United States of America
- Helmut Leder
- Department of Basic Psychological Research and Research Methods, University of Vienna, Vienna, Austria
32
Can grimace scales estimate the pain status in horses and mice? A statistical approach to identify a classifier. PLoS One 2018; 13:e0200339. [PMID: 30067759 PMCID: PMC6070187 DOI: 10.1371/journal.pone.0200339] [Citation(s) in RCA: 31] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/01/2018] [Accepted: 06/25/2018] [Indexed: 11/19/2022] Open
Abstract
Pain recognition is fundamental for safeguarding animal welfare. Facial expressions have been investigated in several species, and grimace scales have been developed as a pain assessment tool in many species, including horses (HGS) and mice (MGS). This study is intended to progress the validation of grimace scales by proposing a statistical approach to identify a classifier that can estimate the pain status of the animal based on the Facial Action Units (FAUs) included in HGS and MGS. To achieve this aim, through a validity study, the relation between the FAUs included in HGS and MGS and the real pain condition was investigated. A specific statistical approach (Cumulative Link Mixed Model, inter-rater reliability, Multiple Correspondence Analysis, Linear Discriminant Analysis, and Support Vector Machines) was applied to two datasets. Our results confirm the reliability of both scales and show that individual FAU scores of HGS and MGS are related to the pain state of the animal. Finally, we identified the optimal weights of the FAU scores that can be used to best classify animals in pain, with an accuracy greater than 70%. For the first time, this study describes a statistical approach to develop a classifier, based on HGS and MGS, for estimating the pain status of animals. The proposed classifier is the starting point for developing computer-based image analysis for the automatic recognition of pain in horses and mice.
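The core idea of the classifier described above is to weight FAU scores and threshold the weighted sum. As an illustrative sketch only (synthetic data, and ordinary least squares as a simplified stand-in for the paper's LDA/SVM pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated ordinal FAU scores (0-2, as in grimace scales): 6 FAUs, 200 animals.
n, n_faus = 200, 6
pain = rng.integers(0, 2, size=n)                      # 0 = no pain, 1 = pain
base = rng.integers(0, 3, size=(n, n_faus)).astype(float)
scores = base + pain[:, None] * rng.uniform(0.5, 1.5, size=(n, n_faus))

# Fit weights over the FAU scores, then threshold the weighted sum.
X = np.column_stack([np.ones(n), scores])
w, *_ = np.linalg.lstsq(X, pain, rcond=None)
pred = (X @ w > 0.5).astype(int)
accuracy = float((pred == pain).mean())
print(f"accuracy = {accuracy:.2f}")   # well above the 70% threshold on this toy data
```

The fitted weights play the same role as the "optimal weights of the FAU scores" in the abstract; a real replication would use the actual HGS/MGS scores and the discriminant/SVM methods named there.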
33
Smith FW, Rossit S. Identifying and detecting facial expressions of emotion in peripheral vision. PLoS One 2018; 13:e0197160. [PMID: 29847562 PMCID: PMC5976168 DOI: 10.1371/journal.pone.0197160] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2017] [Accepted: 04/27/2018] [Indexed: 11/24/2022] Open
Abstract
Facial expressions of emotion are signals of high biological value. Whilst recognition of facial expressions has been much studied in central vision, the ability to perceive these signals in peripheral vision has only seen limited research to date, despite the potential adaptive advantages of such perception. In the present experiment, we investigate facial expression recognition and detection performance for each of the basic emotions (plus neutral) at up to 30 degrees of eccentricity. We demonstrate, as expected, a decrease in recognition and detection performance with increasing eccentricity, with happiness and surprised being the best recognized expressions in peripheral vision. In detection however, while happiness and surprised are still well detected, fear is also a well detected expression. We show that fear is a better detected than recognized expression. Our results demonstrate that task constraints shape the perception of expression in peripheral vision and provide novel evidence that detection and recognition rely on partially separate underlying mechanisms, with the latter more dependent on the higher spatial frequency content of the face stimulus.
Affiliation(s)
- Fraser W. Smith
- School of Psychology, University of East Anglia, Norwich, United Kingdom
- Stephanie Rossit
- School of Psychology, University of East Anglia, Norwich, United Kingdom
34
Mortillaro M, Dukes D. Jumping for Joy: The Importance of the Body and of Dynamics in the Expression and Recognition of Positive Emotions. Front Psychol 2018; 9:763. [PMID: 29867704 PMCID: PMC5962906 DOI: 10.3389/fpsyg.2018.00763] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2018] [Accepted: 04/30/2018] [Indexed: 11/15/2022] Open
Abstract
The majority of research on emotion expression has focused on static facial prototypes of a few selected, mostly negative emotions. Implicitly, most researchers seem to have considered all positive emotions as sharing one common signal (namely, the smile), and consequently as being largely indistinguishable from each other in terms of expression. Recently, a new wave of studies has started to challenge the traditional assumption by considering the role of multiple modalities and the dynamics in the expression and recognition of positive emotions. Based on these recent studies, we suggest that positive emotions are better expressed and correctly perceived when (a) they are communicated simultaneously through the face and body and (b) perceivers have access to dynamic stimuli. Notably, we argue that this improvement is comparatively more important for positive emotions than for negative emotions. Our view is that the misperception of positive emotions has fewer immediate and potentially life-threatening consequences than the misperception of negative emotions; therefore, from an evolutionary perspective, there was only limited benefit in the development of clear, quick signals that allow observers to draw fine distinctions between them. Consequently, we suggest that the successful communication of positive emotions requires a stronger signal than that of negative emotions, and that this signal is provided by the use of the body and the way those movements unfold. We hope our contribution to this growing field provides a new direction and a theoretical grounding for the many lines of empirical research on the expression and recognition of positive emotions.
Affiliation(s)
- Marcello Mortillaro
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Daniel Dukes
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Psychology Research Institute, University of Amsterdam, Amsterdam, Netherlands
35
Not just for fun! Social play as a springboard for adult social competence in human and non-human primates. Behav Ecol Sociobiol 2018. [DOI: 10.1007/s00265-018-2506-6] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
36
Plusquellec P, Denault V. The 1000 Most Cited Papers on Visible Nonverbal Behavior: A Bibliometric Analysis. J Nonverbal Behav 2018. [DOI: 10.1007/s10919-018-0280-9] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/30/2023]
37
Tanikawa C, Takada K. Test-retest reliability of smile tasks using three-dimensional facial topography. Angle Orthod 2018; 88:319-328. [PMID: 29509027 DOI: 10.2319/062617-425.1] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022] Open
Abstract
OBJECTIVES To evaluate the test-retest reliability of three types of facial expression tasks using three-dimensional (3D) facial topography. MATERIALS AND METHODS Twelve adult volunteers were enrolled in this study. They were instructed to perform three different facial expression tasks: rest posture, posed smile, and maximum effort smile. Each task was recorded using a 3D image-capturing device on two separate occasions with an interval of 1 week between sessions. The images of two sessions were superimposed based on the forehead. For each participant and for each facial expression, a wire mesh fitting was conducted. This method generated 6,017 points on the wire mesh. Intraindividual reliability between sessions for each task was statistically tested by intraclass correlation coefficients (ICCs) and the 95% confidence interval minimal detectable change (MDC95). RESULTS The MDC95 for the repeated measures of the rest posture, posed smile, and maximum effort smile exhibited means of 0.8, 1.5, and 1.3 mm, respectively, on the z-axis. The ICCs ranged from substantial to almost perfect agreement for repeated measures for the rest posture and maximum effort smile (0.60 < ICC ≤ 1.00). The right corner of the mouth in the posed smile showed moderate agreement (0.40 < ICC ≤ 0.60). CONCLUSIONS The overall test-retest reliability of the maximum effort smile and rest posture showed substantial to almost perfect agreement, and this was clinically acceptable.
38
Klasen M, von Marschall C, Isman G, Zvyagintsev M, Gur RC, Mathiak K. Prosody production networks are modulated by sensory cues and social context. Soc Cogn Affect Neurosci 2018. [PMID: 29514331 PMCID: PMC5928400 DOI: 10.1093/scan/nsy015] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022] Open
Abstract
The neurobiology of emotional prosody production is not well investigated. In particular, the effects of cues and social context are not known. The present study sought to differentiate cued from free emotion generation and the effect of social feedback from a human listener. Online speech filtering enabled functional magnetic resonance imaging during prosodic communication in 30 participants. Emotional vocalizations were (i) free, (ii) auditorily cued, (iii) visually cued or (iv) with interactive feedback. In addition to distributed language networks, cued emotions increased activity in auditory and—in case of visual stimuli—visual cortex. Responses were larger in posterior superior temporal gyrus at the right hemisphere and the ventral striatum when participants were listened to and received feedback from the experimenter. Sensory, language and reward networks contributed to prosody production and were modulated by cues and social context. The right posterior superior temporal gyrus is a central hub for communication in social interactions—in particular for interpersonal evaluation of vocal emotions.
Affiliation(s)
- Martin Klasen
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Pauwelsstraße 30, 52074 Aachen, Germany; JARA - Translational Brain Medicine, 52074 Aachen, Germany
- Clara von Marschall
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Pauwelsstraße 30, 52074 Aachen, Germany; JARA - Translational Brain Medicine, 52074 Aachen, Germany
- Güldehen Isman
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Pauwelsstraße 30, 52074 Aachen, Germany
- Mikhail Zvyagintsev
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Pauwelsstraße 30, 52074 Aachen, Germany; JARA - Translational Brain Medicine, 52074 Aachen, Germany
- Ruben C Gur
- Department of Psychiatry, University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104, USA
- Klaus Mathiak
- Department of Psychiatry, Psychotherapy and Psychosomatics, Medical Faculty, RWTH Aachen, Pauwelsstraße 30, 52074 Aachen, Germany; JARA - Translational Brain Medicine, 52074 Aachen, Germany
39
Abstract
The smile is a frequently expressed facial expression that typically conveys a positive emotional state and friendly intent. However, human beings have also learned how to fake smiles, typically by controlling the mouth to provide a genuine-looking expression. This is often accompanied by inaccuracies that can allow others to determine that the smile is false. Mouth movement is one of the most striking features of the smile, yet our understanding of its dynamic elements is still limited. The present study analyzes the dynamic features of lip corners, and considers how they differ between genuine and posed smiles. Employing computer vision techniques, we investigated elements such as the duration, intensity, speed, symmetry of the lip corners, and certain irregularities in genuine and posed smiles obtained from the UvA-NEMO Smile Database. After utilizing the facial analysis tool OpenFace, we further propose a new approach to segmenting the onset, apex, and offset phases of smiles, as well as a means of measuring irregularities and symmetry in facial expressions. We extracted these features according to 2D and 3D coordinates, and conducted an analysis. The results reveal that genuine smiles have higher values for onset, offset, apex, and total durations, as well as offset displacement, and a variable we termed Irregularity-b (the SD of the apex phase) than do posed smiles. Conversely, values tended to be lower for onset and offset Speeds, and Irregularity-a (the rate of peaks), Symmetry-a (the correlation between left and right facial movements), and Symmetry-d (differences in onset frame numbers between the left and right faces). The findings from the present study have been compared to those of previous research, and certain speculations are made.
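The symmetry measure described above (Symmetry-a, the correlation between left and right facial movements) can be sketched in a few lines. The trajectories and the function name `symmetry_a` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def symmetry_a(left: np.ndarray, right: np.ndarray) -> float:
    """Pearson correlation between left and right lip-corner displacement
    trajectories (one value per video frame); 1.0 means perfectly symmetric."""
    return float(np.corrcoef(left, right)[0, 1])

# Idealized onset-apex-offset displacement arc over 50 frames.
t = np.linspace(0.0, 1.0, 50)
smile = np.sin(np.pi * t)

sym = symmetry_a(smile, smile)               # identical sides: correlation of 1.0
asym = symmetry_a(smile, np.roll(smile, 5))  # right corner lags by 5 frames
```

A lag between the two corners (as in the paper's Symmetry-d, the onset-frame difference) lowers the correlation, so the two symmetry measures capture related but distinct asymmetries.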
Affiliation(s)
- Hui Guo
- Wenzhou 7th People's Hospital, Wenzhou, China
- Xiao-Hui Zhang
- Institute of Psychology and Behavior Sciences, Wenzhou University, Wenzhou, China
- Jun Liang
- Institute of Psychology and Behavior Sciences, Wenzhou University, Wenzhou, China
- Wen-Jing Yan
- Institute of Psychology and Behavior Sciences, Wenzhou University, Wenzhou, China
40
Caeiro C, Guo K, Mills D. Dogs and humans respond to emotionally competent stimuli by producing different facial actions. Sci Rep 2017; 7:15525. [PMID: 29138393 PMCID: PMC5686192 DOI: 10.1038/s41598-017-15091-4] [Citation(s) in RCA: 25] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2017] [Accepted: 10/20/2017] [Indexed: 01/17/2023] Open
Abstract
The commonality of facial expressions of emotion has been studied in different species since Darwin, with most of the research focusing on closely related primate species. However, it is unclear to what extent there exists common facial expression in species more phylogenetically distant, but sharing a need for common interspecific emotional understanding. Here we used the objective, anatomically-based tools, FACS and DogFACS (Facial Action Coding Systems), to quantify and compare human and domestic dog facial expressions in response to emotionally-competent stimuli associated with different categories of emotional arousal. We sought to answer two questions: Firstly, do dogs display specific discriminatory facial movements in response to different categories of emotional stimuli? Secondly, do dogs display similar facial movements to humans when reacting in emotionally comparable contexts? We found that dogs displayed distinctive facial actions depending on the category of stimuli. However, dogs produced different facial movements to humans in comparable states of emotional arousal. These results refute the commonality of emotional expression across mammals, since dogs do not display human-like facial expressions. Given the unique interspecific relationship between dogs and humans, two highly social but evolutionarily distant species sharing a common environment, these findings give new insight into the origin of emotion expression.
Affiliation(s)
- Cátia Caeiro: School of Psychology, University of Lincoln, Lincoln, UK; School of Life Sciences, University of Lincoln, Lincoln, UK
- Kun Guo: School of Psychology, University of Lincoln, Lincoln, UK
- Daniel Mills: School of Life Sciences, University of Lincoln, Lincoln, UK
41. Rychlowska M, Jack RE, Garrod OGB, Schyns PG, Martin JD, Niedenthal PM. Functional Smiles: Tools for Love, Sympathy, and War. Psychol Sci 2017; 28:1259-1270. [PMID: 28741981 DOI: 10.1177/0956797617706082]
Abstract
A smile is the most frequent facial expression, but not all smiles are equal. A social-functional account holds that smiles of reward, affiliation, and dominance serve basic social functions, including rewarding behavior, bonding socially, and negotiating hierarchy. Here, we characterize the facial-expression patterns associated with these three types of smiles. Specifically, we modeled the facial expressions using a data-driven approach and showed that reward smiles are symmetrical and accompanied by eyebrow raising, affiliative smiles involve lip pressing, and dominance smiles are asymmetrical and contain nose wrinkling and upper-lip raising. A Bayesian-classifier analysis and a detection task revealed that the three smile types are highly distinct. Finally, social judgments made by a separate participant group showed that the different smile types convey different social messages. Our results provide the first detailed description of the physical form and social messages conveyed by these three types of functional smiles and document the versatility of these facial expressions.
Affiliation(s)
- Rachael E Jack: School of Psychology, University of Glasgow; Institute of Neuroscience and Psychology, University of Glasgow
- Jared D Martin: Department of Psychology, University of Wisconsin-Madison
42. Leppanen J, Dapelo MM, Davies H, Lang K, Treasure J, Tchanturia K. Computerised analysis of facial emotion expression in eating disorders. PLoS One 2017; 12:e0178972. [PMID: 28575109 PMCID: PMC5456367 DOI: 10.1371/journal.pone.0178972]
Abstract
Background: Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has frequently been reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. Method: 297 participants with AN, BN, or RecAN and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. Results: The findings mirrored those from previous work showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip than the AN group. There were no differences in emotion expression during the sad film clip. Discussion: These findings support the use of computerised methods to analyse emotion expression in EDs. They also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile.
Affiliation(s)
- Jenni Leppanen: Department of Psychological Medicine, Institute of Psychology, Psychiatry, and Neuroscience, King's College London, London, United Kingdom
- Marcela Marin Dapelo: Department of Psychological Medicine, Institute of Psychology, Psychiatry, and Neuroscience, King's College London, London, United Kingdom
- Helen Davies: Department of Psychological Medicine, Institute of Psychology, Psychiatry, and Neuroscience, King's College London, London, United Kingdom
- Katie Lang: Department of Psychological Medicine, Institute of Psychology, Psychiatry, and Neuroscience, King's College London, London, United Kingdom
- Janet Treasure: Department of Psychological Medicine, Institute of Psychology, Psychiatry, and Neuroscience, King's College London, London, United Kingdom
- Kate Tchanturia: Department of Psychological Medicine, Institute of Psychology, Psychiatry, and Neuroscience, King's College London, London, United Kingdom; Department of Psychology, Illia State University, Tbilisi, Georgia
43. Albuquerque N, Guo K, Wilkinson A, Savalli C, Otta E, Mills D. Dogs recognize dog and human emotions. Biol Lett 2017; 12:20150883. [PMID: 26763220 DOI: 10.1098/rsbl.2015.0883]
Abstract
The perception of emotional expressions allows animals to evaluate each other's social intentions and motivations. This usually takes place within species; however, in the case of domestic dogs, it might be advantageous to recognize the emotions of humans as well as of other dogs. In this sense, combining visual and auditory cues to categorize others' emotions facilitates information processing and indicates high-level cognitive representations. Using a cross-modal preferential looking paradigm, we presented dogs with either human or dog faces with different emotional valences (happy/playful versus angry/aggressive) paired with a single vocalization from the same individual with either a positive or negative valence, or with Brownian noise. Dogs looked significantly longer at the face whose expression was congruent with the valence of the vocalization, for both conspecifics and heterospecifics, an ability previously known only in humans. These results demonstrate that dogs can extract and integrate bimodal sensory emotional information, and discriminate between positive and negative emotions from both humans and dogs.
Affiliation(s)
- Natalia Albuquerque: School of Life Sciences, University of Lincoln, Lincoln LN6 7DL, UK; Department of Experimental Psychology, Institute of Psychology, University of São Paulo, São Paulo 05508-030, Brazil
- Kun Guo: School of Psychology, University of Lincoln, Lincoln LN6 7DL, UK
- Anna Wilkinson: School of Life Sciences, University of Lincoln, Lincoln LN6 7DL, UK
- Carine Savalli: Department of Public Politics and Public Health, Federal University of São Paulo, Santos 11015-020, Brazil
- Emma Otta: Department of Experimental Psychology, Institute of Psychology, University of São Paulo, São Paulo 05508-030, Brazil
- Daniel Mills: School of Life Sciences, University of Lincoln, Lincoln LN6 7DL, UK
44. Quinto-Sánchez M, Cintas C, Silva de Cerqueira CC, Ramallo V, Acuña-Alonzo V, Adhikari K, Castillo L, Gomez-Valdés J, Everardo P, De Avila F, Hünemeier T, Jaramillo C, Arias W, Fuentes M, Gallo C, Poletti G, Schuler-Faccini L, Bortolini MC, Canizales-Quinteros S, Rothhammer F, Bedoya G, Rosique J, Ruiz-Linares A, González-José R. Socioeconomic Status Is Not Related with Facial Fluctuating Asymmetry: Evidence from Latin-American Populations. PLoS One 2017; 12:e0169287. [PMID: 28060876 PMCID: PMC5218465 DOI: 10.1371/journal.pone.0169287]
Abstract
The expression of facial asymmetries has recurrently been related to poverty and/or disadvantaged socioeconomic status. Starting from developmental instability theory, previous approaches attempted to test the statistical relationship between the stress experienced by individuals raised in poor conditions and an increase in facial and corporal asymmetry. Here we aim to further evaluate this hypothesis in a large sample of admixed Latin American individuals by exploring whether individuals of low socioeconomic status tend to exhibit greater facial fluctuating asymmetry. To do so, we implement Procrustes analysis of variance and hierarchical linear modelling (HLM) to estimate potential associations between facial fluctuating asymmetry values and socioeconomic status. We report significant relationships between facial fluctuating asymmetry values and age, sex, and genetic ancestry, while socioeconomic status failed to exhibit any strong statistical relationship with facial asymmetry. These results persist after the effect of heterozygosity (a proxy for genetic ancestry) is controlled for in the model. Our results indicate that, at least in the studied sample, there is no relationship between socioeconomic stress (understood here as low socioeconomic status) and facial asymmetries.
Affiliation(s)
- Mirsha Quinto-Sánchez: Grupo de Investigación en Biología Evolutiva Humana, Instituto Patagónico de Ciencias Sociales y Humanas, Centro Nacional Patagónico, CONICET, Puerto Madryn, Chubut, Argentina; Ciencia Forense, Facultad de Medicina, Universidad Nacional Autónoma de México, Ciudad de México, México
- Celia Cintas: Grupo de Investigación en Biología Evolutiva Humana, Instituto Patagónico de Ciencias Sociales y Humanas, Centro Nacional Patagónico, CONICET, Puerto Madryn, Chubut, Argentina
- Caio Cesar Silva de Cerqueira: Grupo de Investigación en Biología Evolutiva Humana, Instituto Patagónico de Ciencias Sociales y Humanas, Centro Nacional Patagónico, CONICET, Puerto Madryn, Chubut, Argentina; Ciencia Forense, Facultad de Medicina, Universidad Nacional Autónoma de México, Ciudad de México, México; Superintendência da Polícia Técnico-Científica do Estado de São Paulo, Equipe de Perícias Criminalísticas de Ourinhos, São Paulo, Brazil
- Virginia Ramallo: Grupo de Investigación en Biología Evolutiva Humana, Instituto Patagónico de Ciencias Sociales y Humanas, Centro Nacional Patagónico, CONICET, Puerto Madryn, Chubut, Argentina
- Victor Acuña-Alonzo: Department of Genetics, Evolution and Environment, and UCL Genetics Institute, University College London, London, United Kingdom; Escuela Nacional de Antropología e Historia, Instituto Nacional de Antropología e Historia, Ciudad de México, México
- Kaustubh Adhikari: Department of Genetics, Evolution and Environment, and UCL Genetics Institute, University College London, London, United Kingdom
- Lucía Castillo: Grupo de Investigación en Biología Evolutiva Humana, Instituto Patagónico de Ciencias Sociales y Humanas, Centro Nacional Patagónico, CONICET, Puerto Madryn, Chubut, Argentina
- Jorge Gomez-Valdés: Posgrado en Antropología Física, Escuela Nacional de Antropología e Historia, Ciudad de México, México
- Paola Everardo: Escuela Nacional de Antropología e Historia, Instituto Nacional de Antropología e Historia, Ciudad de México, México
- Francisco De Avila: Escuela Nacional de Antropología e Historia, Instituto Nacional de Antropología e Historia, Ciudad de México, México
- Tábita Hünemeier: Departamento de Genética e Biologia Evolutiva, Instituto de Biociências, Universidade de São Paulo
- Macarena Fuentes: Department of Genetics, Evolution and Environment, and UCL Genetics Institute, University College London, London, United Kingdom; Departamento de Tecnología Médica, Facultad de Ciencias de la Salud, Universidad de Tarapacá, Arica, Chile
- Carla Gallo: Laboratorios de Investigación y Desarrollo, Facultad de Ciencias y Filosofía, Universidad Peruana Cayetano Heredia, Lima, Perú
- Giovani Poletti: Laboratorios de Investigación y Desarrollo, Facultad de Ciencias y Filosofía, Universidad Peruana Cayetano Heredia, Lima, Perú
- Lavinia Schuler-Faccini: Departamento de Genética, Instituto de Biociências, Universidade Federal do Rio Grande do Sul, Porto Alegre, Brasil
- Maria Cátira Bortolini: Departamento de Genética, Instituto de Biociências, Universidade Federal do Rio Grande do Sul, Porto Alegre, Brasil
- Samuel Canizales-Quinteros: Unidad de Genómica de Poblaciones Aplicada a la Salud, Facultad de Química, UNAM-Instituto Nacional de Medicina Genómica, Ciudad de México, México
- Javier Rosique: Departamento de Antropología, Facultad de Ciencias Sociales y Humanas, Universidad de Antioquia, Medellín, Colombia
- Andrés Ruiz-Linares: Department of Genetics, Evolution and Environment, and UCL Genetics Institute, University College London, London, United Kingdom; MOE Key Laboratory of Contemporary Anthropology, Fudan University, Shanghai, China; Aix Marseille Univ, CNRS, EFS, ADES, Marseille, France
- Rolando González-José: Grupo de Investigación en Biología Evolutiva Humana, Instituto Patagónico de Ciencias Sociales y Humanas, Centro Nacional Patagónico, CONICET, Puerto Madryn, Chubut, Argentina
45. Du S, Martinez AM. Compound facial expressions of emotion: from basic research to clinical applications. Dialogues Clin Neurosci 2016. [PMID: 26869845 PMCID: PMC4734882 DOI: 10.31887/dcns.2015.17.4/sdu]
Abstract
Emotions are sometimes revealed through facial expressions. When these natural facial articulations involve the contraction of the same muscle groups in people of distinct cultural upbringings, this is taken as evidence of a biological origin of these emotions. While past research had identified facial expressions associated with a single internally felt category (e.g., the facial expression of happiness when we feel joyful), we have recently studied facial expressions observed when people experience compound emotions (e.g., the facial expression of happy surprise when we feel joyful in a surprised way, as, for example, at a surprise birthday party). Our research has identified 17 compound expressions consistently produced across cultures, suggesting that the number of facial expressions of emotion of biological origin is much larger than previously believed. The present paper provides an overview of these findings and shows evidence supporting the view that spontaneous expressions are produced using the same facial articulations previously identified in laboratory experiments. We also discuss the implications of our results for the study of psychopathologies, and consider several open research questions.
Affiliation(s)
- Shichuan Du: LENA Research Foundation, Boulder, Colorado, USA
46. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze. PLoS One 2016; 11:e0168307. [PMID: 27942030 PMCID: PMC5152920 DOI: 10.1371/journal.pone.0168307]
Abstract
The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each with three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, but slowest and least accurate for fearful expressions. More intense expressions were also classified most accurately. Reaction time showed a different pattern, with the slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than to the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to the most for fearful, angry, and disgusted expressions and least for surprise. These results extend previous findings by showing important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down.
47. Tramacere A, Pievani T, Ferrari PF. Mirror neurons in the tree of life: mosaic evolution, plasticity and exaptation of sensorimotor matching responses. Biol Rev Camb Philos Soc 2016; 92:1819-1841. [PMID: 27862868 DOI: 10.1111/brv.12310]
Abstract
Considering the properties of mirror neurons (MNs) in terms of development and phylogeny, we offer a novel, unifying, and testable account of their evolution according to the available data, and try to reconcile apparently discordant research, including the plasticity of MNs during development, their adaptive value, and their phylogenetic relationships and continuity. We hypothesize that the MN system reflects a set of interrelated traits, each with an independent natural history due to unique selective pressures, and propose that there are at least three evolutionarily significant trends that gave rise to three subtypes: hand visuomotor, mouth visuomotor, and audio-vocal. Specifically, we put forward a mosaic evolution hypothesis, which posits that different types of MNs may have evolved at different rates within and among species. This evolutionary hypothesis represents an alternative to both adaptationist and associative models. Finally, the review offers strong heuristic potential in predicting the circumstances under which specific variations and properties of MNs are expected. Such predictive value is critical for testing new hypotheses about MN activity and its plastic changes, depending on the species, the neuroanatomical substrates, and the ecological niche.
Affiliation(s)
- Antonella Tramacere: Department of Neuroscience, University of Parma, Parma, 43100, Italy; Deutsche Primaten Zentrum - Lichtenberg-Kolleg, Institute for Advanced Study, 37083, Göttingen, Germany
- Telmo Pievani: Department of Biology, University of Padua, Padua, 35131, Italy
- Pier F Ferrari: Department of Neuroscience, University of Parma, Parma, 43100, Italy; Institut des Sciences Cognitives 'Marc Jeannerod', CNRS/Université Claude Bernard Lyon, 69675, Bron Cedex, France
48. Vieira JB, Tavares TP, Marsh AA, Mitchell DGV. Emotion and personal space: Neural correlates of approach-avoidance tendencies to different facial expressions as a function of coldhearted psychopathic traits. Hum Brain Mapp 2016; 38:1492-1506. [PMID: 27859920 DOI: 10.1002/hbm.23467]
Abstract
In social interactions, humans are expected to regulate interpersonal distance in response to the emotions displayed by others. Yet the neural mechanisms implicated in approach-avoidance tendencies to distinct emotional expressions have not been fully described. Here, we investigated the neural systems implicated in regulating distance to different emotions, and how they vary as a function of empathy. Twenty-three healthy participants assessed for psychopathic traits underwent fMRI scanning while they viewed approaching and withdrawing angry, fearful, happy, sad, and neutral faces. Participants were also asked to set the distance to those faces on a computer screen, and to adjust their physical distance from the experimenter outside the scanner. Participants kept the greatest distances from angry faces, and the shortest from happy expressions. This was accompanied by increased activation in the dorsomedial prefrontal and orbitofrontal cortices, inferior frontal gyrus, and temporoparietal junction for angry and happy expressions relative to the other emotions. Irrespective of emotion, longer distances were kept from approaching faces, which was associated with increased activation in the amygdala and insula, as well as parietal and prefrontal regions. Amygdala activation was positively correlated with greater preferred distances to angry, fearful, and sad expressions. Moreover, participants scoring higher on coldhearted psychopathic traits (lower empathy) showed reduced amygdala activation to sad expressions. These findings elucidate the neural mechanisms underlying social approach-avoidance, and how they are related to variations in empathy.
Affiliation(s)
- Joana B Vieira: Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada; Department of Anatomy and Cell Biology, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada
- Tamara P Tavares: Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada
- Abigail A Marsh: Department of Psychology, Georgetown University, Washington, DC, USA
- Derek G V Mitchell: Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada; Department of Anatomy and Cell Biology, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada; Department of Psychiatry, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada; Department of Psychology, The University of Western Ontario, London, Ontario, Canada
49. Wierzchoń M, Wronka E, Paulewicz B, Szczepanowski R. Post-Decision Wagering Affects Metacognitive Awareness of Emotional Stimuli: An Event Related Potential Study. PLoS One 2016; 11:e0159516. [PMID: 27490816 PMCID: PMC4973871 DOI: 10.1371/journal.pone.0159516]
Abstract
The present research investigated metacognitive awareness of emotional stimuli and its psychophysiological correlates. We used a backward masking task presenting participants with fearful or neutral faces. We asked participants to discriminate the faces and then probed their metacognitive awareness with confidence rating (CR) and post-decision wagering (PDW) scales. We also analysed psychophysiological correlates of awareness with event-related potential (ERP) components: P1, N170, early posterior negativity (EPN), and P3. We did not observe any differences between the PDW and CR conditions in the emotion identification task. However, "aware" ratings were associated with increased accuracy. This effect was more pronounced in PDW, especially for fearful faces, suggesting that awareness of emotional stimuli may be enhanced by monetary incentives. EEG analysis showed larger N170, EPN, and P3 amplitudes in aware compared to unaware trials. It also appeared that both the EPN and P3 ERP components were more pronounced in the PDW condition, especially when emotional faces were presented. Taken together, our ERP findings suggest that metacognitive awareness of emotional stimuli depends on the effectiveness of both early and late visual information processing. Our study also indicates that awareness of emotional stimuli can be enhanced by the motivation induced by wagering.
Affiliation(s)
- Michał Wierzchoń: Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland
- Eligiusz Wronka: Psychophysiology Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland
- Borysław Paulewicz: SWPS University of Social Science and Humanities, Faculty in Katowice, Poland
50. Rymarczyk K, Żurawski Ł, Jankowiak-Siuda K, Szatkowska I. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry? PLoS One 2016; 11:e0158534. [PMID: 27390867 PMCID: PMC4938565 DOI: 10.1371/journal.pone.0158534]
Abstract
Facial mimicry is the spontaneous response to others' facial expressions, mirroring or matching the expression of the interaction partner. Recent evidence suggests that mimicry may not be only an automatic reaction but may depend on many factors, including social context, the type of task in which the participant is engaged, and stimulus properties (dynamic vs static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgments of emotional intensity. Electromyographic activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. Ratings of the emotional intensity of the facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major activity in response to dynamic happiness stimuli than to static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing.
Affiliation(s)
- Krystyna Rymarczyk: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland; Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Łukasz Żurawski: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland
- Kamila Jankowiak-Siuda: Department of Experimental Psychology, Institute of Cognitive and Behavioural Neuroscience, University of Social Sciences and Humanities, Warsaw, Poland
- Iwona Szatkowska: Laboratory of Psychophysiology, Department of Neurophysiology, Nencki Institute of Experimental Biology of Polish Academy of Sciences, Warsaw, Poland