76. Krumhuber EG, Tamarit L, Roesch EB, Scherer KR. FACSGen 2.0 animation software: generating three-dimensional FACS-valid facial expressions for emotion research. Emotion 2012;12:351-63. PMID: 22251045. DOI: 10.1037/a0026632.
Abstract
In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants' recognition rates for nine emotions were high, and human and FACSGen expressions were rated as very similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research.
77. Krumhuber EG, Scherer KR. Affect bursts: dynamic patterns of facial expressions. Emotion 2011.
Abstract
Affect bursts consist of spontaneous and short emotional expressions in which facial, vocal, and gestural components are highly synchronized. Although the vocal characteristics have been examined in several recent studies, the facial modality remains largely unexplored. This study investigated the facial correlates of affect bursts that expressed five different emotions: anger, fear, sadness, joy, and relief. Detailed analysis of 59 facial actions with the Facial Action Coding System revealed a reasonable degree of emotion differentiation for individual action units (AUs). However, less convergence was shown for specific AU combinations for a limited number of prototypes. Moreover, expression of facial actions peaked in a cumulative-sequential fashion with significant differences in their sequential appearance between emotions. When testing for the classification of facial expressions within a dimensional approach, facial actions differed significantly as a function of the valence and arousal level of the five emotions, thereby allowing further distinction between joy and relief. The findings cast doubt on the existence of fixed patterns of facial responses for each emotion, resulting in unique facial prototypes. Rather, the results suggest that each emotion can be portrayed by several different expressions that share multiple facial actions.
78. Mortillaro M, Meuleman B, Scherer KR. Advocating a componential appraisal model to guide emotion recognition. International Journal of Synthetic Emotions 2012. DOI: 10.4018/jse.2012010102.
Abstract
Most models of automatic emotion recognition use a discrete perspective and a black-box approach, i.e., they output an emotion label chosen from a limited pool of candidate terms, on the basis of purely statistical methods. Although these models are successful in emotion classification, a number of practical and theoretical drawbacks limit the range of possible applications. In this paper, the authors suggest the adoption of an appraisal perspective in modeling emotion recognition. The authors propose to use appraisals as an intermediate layer between expressive features (input) and emotion labeling (output). The model would then be made of two parts: first, expressive features would be used to estimate appraisals; second, resulting appraisals would be used to predict an emotion label. While the second part of the model has already been the object of several studies, the first is unexplored. The authors argue that this model should be built on the basis of both theoretical predictions and empirical results about the link between specific appraisals and expressive features. For this purpose, the authors suggest using the component process model of emotion, which includes detailed predictions of efferent effects of appraisals on facial expression, voice, and body movements.
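The proposed two-layer architecture (expressive features → appraisals → emotion label) can be sketched compactly. The sketch below is illustrative only: the three appraisal dimensions, the emotion profiles, and the linear mapping from appraisals to expressive features are invented for the example and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical appraisal profiles (novelty, pleasantness, goal conduciveness)
# for three emotion labels -- illustrative values, not from the paper.
PROFILES = {
    "joy":   np.array([0.3, 0.9, 0.8]),
    "anger": np.array([0.6, 0.1, 0.1]),
    "fear":  np.array([0.9, 0.2, 0.2]),
}

# Simulate training data: expressive features as a noisy linear mix of appraisals.
W_true = rng.normal(size=(3, 6))          # appraisals -> 6 expressive features
X, A = [], []
for profile in PROFILES.values():
    for _ in range(50):
        a = profile + rng.normal(scale=0.05, size=3)
        A.append(a)
        X.append(a @ W_true + rng.normal(scale=0.05, size=6))
X, A = np.array(X), np.array(A)

# Layer 1: estimate appraisals from expressive features (least squares).
W_hat, *_ = np.linalg.lstsq(X, A, rcond=None)

def recognize(features):
    """Layer 2: map features -> appraisals -> nearest emotion profile."""
    appraisals = features @ W_hat
    return min(PROFILES, key=lambda e: np.linalg.norm(PROFILES[e] - appraisals))

test_feat = PROFILES["joy"] @ W_true      # a noise-free "joy" expression
print(recognize(test_feat))
```

The intermediate appraisal layer is what distinguishes this design from a black-box classifier: the estimated appraisal vector is itself interpretable output, available even when no discrete label fits well.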
79. Dael N, Mortillaro M, Scherer KR. Emotion expression in body action and posture. Emotion 2012;12:1085-101. DOI: 10.1037/a0025737.
80. Bänziger T, Mortillaro M, Scherer KR. Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception. Emotion 2012;12:1161-79. PMID: 22081890. DOI: 10.1037/a0025827.
Abstract
Research on the perception of emotional expressions in faces and voices is exploding in psychology, the neurosciences, and affective computing. This article provides an overview of some of the major emotion expression (EE) corpora currently available for empirical research and introduces a new, dynamic, multimodal corpus of emotion expressions, the Geneva Multimodal Emotion Portrayals Core Set (GEMEP-CS). The design features of the corpus are outlined and justified, and detailed validation data for the core set selection are presented and discussed. Finally, an associated database with microcoded facial, vocal, and body action elements, as well as observer ratings, is introduced.
81. Scherer KR, Scherer U. Assessing the ability to recognize facial and vocal expressions of emotion: construction and validation of the Emotion Recognition Index. Journal of Nonverbal Behavior 2011. DOI: 10.1007/s10919-011-0115-4.
82. Brosch T, Coppin G, Scherer KR, Schwartz S, Sander D. Generating value(s): psychological value hierarchies reflect context-dependent sensitivity of the reward system. Soc Neurosci 2011;6:198-208. DOI: 10.1080/17470919.2010.506754.
83. Aue T, Scherer KR. Effects of intrinsic pleasantness and goal conduciveness appraisals on somatovisceral responding: somewhat similar, but not identical. Biol Psychol 2011;86:65-73. DOI: 10.1016/j.biopsycho.2010.10.008.
84. Krumhuber EG, Scherer KR. "Affect bursts: dynamic patterns of facial expressions": correction to Krumhuber and Scherer (2011). Emotion 2011. DOI: 10.1037/a0025007.
85. Mortillaro M, Mehu M, Scherer KR. Subtly different positive emotions can be distinguished by their facial expressions. Social Psychological and Personality Science 2010. DOI: 10.1177/1948550610389080.
Abstract
Positive emotions are crucial to social relationships and social interaction. Although smiling is a frequently studied facial action, investigations of positive emotional expressions are underrepresented in the literature. This may be partly because of the assumption that all positive emotions share the smile as a common signal but lack specific facial configurations. The present study investigated prototypical expressions of four positive emotions—interest, pride, pleasure, and joy. The Facial Action Coding System was used to microcode facial expression of representative samples of these emotions taken from the Geneva Multimodal Emotion Portrayal corpus. The data showed that the frequency and duration of several action units differed between emotions, indicating that actors did not use the same pattern of expression to encode them. The authors argue that an appraisal perspective is suitable to describe how subtly differentiated positive emotional states differ in their prototypical facial expressions.
86. Korb S, Grandjean D, Scherer KR. Timing and voluntary suppression of facial mimicry to smiling faces in a Go/NoGo task: an EMG study. Biol Psychol 2010;85:347-9. DOI: 10.1016/j.biopsycho.2010.07.012.
87. Roesch EB, Sander D, Mumenthaler C, Kerzel D, Scherer KR. Psychophysics of emotion: the QUEST for emotional attention. J Vis 2010;10:4.1-9. PMID: 20377281. DOI: 10.1167/10.3.4.
Abstract
To investigate the mechanisms involved in automatic processing of facial expressions, we used the QUEST procedure to measure the display durations needed to make a gender decision on emotional faces portraying fearful, happy, or neutral facial expressions. In line with predictions of appraisal theories of emotion, our results showed greater processing priority of emotional stimuli regardless of their valence. Whereas all experimental conditions led to an averaged threshold of about 50 ms, fearful and happy facial expressions led to significantly less variability in the responses than neutral faces. Results suggest that attention may have been automatically drawn by the emotion portrayed by face targets, yielding more informative perceptions and less variable responses. The temporal resolution of the perceptual system (expressed by the thresholds) and the processing priority of the stimuli (expressed by the variability in the responses) may influence subjective and objective measures of awareness, respectively.
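The QUEST procedure referenced above is a Bayesian adaptive staircase: it maintains a posterior distribution over the observer's threshold and places each trial at the current best estimate. A minimal sketch under simplifying assumptions follows — a logistic psychometric function stands in for QUEST's Weibull, probes are placed at the posterior mode, and the simulated observer's 50 ms threshold is chosen to mirror the average threshold reported in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate thresholds (display duration, ms) and a simulated observer.
grid = np.linspace(10.0, 120.0, 221)
true_thresh, slope, guess, lapse = 50.0, 0.12, 0.5, 0.02

def p_correct(duration, thresh):
    """Probability of a correct gender decision at a given display duration
    (logistic psychometric function, standing in for QUEST's Weibull)."""
    return guess + (1.0 - guess - lapse) / (1.0 + np.exp(-slope * (duration - thresh)))

posterior = np.full(grid.size, 1.0 / grid.size)   # flat prior over thresholds

for _ in range(80):
    probe = grid[np.argmax(posterior)]            # test at the posterior mode
    correct = rng.random() < p_correct(probe, true_thresh)
    likelihood = p_correct(probe, grid)
    posterior *= likelihood if correct else (1.0 - likelihood)
    posterior /= posterior.sum()                  # Bayesian update

estimate = float(grid[np.argmax(posterior)])      # threshold estimate after 80 trials
print(round(estimate, 1))
```

With an adaptive placement rule like this, most trials fall near the threshold, which is what makes the procedure efficient; the exact estimate here depends on the simulated observer and the random seed.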
88. Scherer KR. Emotions are emergent processes: they require a dynamic computational architecture. Philos Trans R Soc Lond B Biol Sci 2010;364:3459-74. PMID: 19884141. DOI: 10.1098/rstb.2009.0141.
Abstract
Emotion is a cultural and psychobiological adaptation mechanism which allows each individual to react flexibly and dynamically to environmental contingencies. From this claim flows a description of the elements theoretically needed to construct a virtual agent with the ability to display human-like emotions and to respond appropriately to human emotional expression. This article offers a brief survey of the desirable features of emotion theories that make them ideal blueprints for agent models. In particular, the component process model of emotion is described, a theory which postulates emotion-antecedent appraisal on different levels of processing that drive response system patterning predictions. In conclusion, investing seriously in emergent computational modelling of emotion using a nonlinear dynamic systems approach is suggested.
89. Deonna JA, Scherer KR. The case of the disappearing intentional object: constraints on a definition of emotion. Emotion Review 2009. DOI: 10.1177/1754073909345544.
Abstract
Taking our lead from Solomon’s emphasis on the importance of the intentional object of emotion, we review the history of repeated attempts to make this object disappear. We adduce evidence suggesting that in the case of James and Schachter, the intentional object got lost unintentionally. By contrast, modern constructivists (in particular Barrett) seem quite determined to deny the centrality of the intentional object in accounting for the occurrence of emotions. Griffiths, however, downplays the role objects have in emotion, noting that these do not qualify as intentional. We argue that these disappearing acts, deliberate or not, generate fruitless debate and add little to the advancement of our understanding of emotion as an adaptive mechanism to cope with events that are relevant to an organism’s life.
91. Brosch T, Grandjean D, Sander D, Scherer KR. Cross-modal emotional attention: emotional voices modulate early stages of visual processing. J Cogn Neurosci 2009;21:1670-9. DOI: 10.1162/jocn.2009.21110.
Abstract
Emotional attention, the boosting of the processing of emotionally relevant stimuli, has, up to now, mainly been investigated within a sensory modality, for instance, by using emotional pictures to modulate visual attention. In real-life environments, however, humans typically encounter simultaneous input to several different senses, such as vision and audition. As multiple signals entering different channels might originate from a common, emotionally relevant source, the prioritization of emotional stimuli should be able to operate across modalities. In this study, we explored cross-modal emotional attention. Spatially localized utterances with emotional and neutral prosody served as cues for a visually presented target in a cross-modal dot-probe task. Participants were faster to respond to targets that appeared at the spatial location of emotional compared to neutral prosody. Event-related brain potentials revealed emotional modulation of early visual target processing at the level of the P1 component, with neural sources in the striate visual cortex being more active for targets that appeared at the spatial location of emotional compared to neutral prosody. These effects were not found using synthesized control sounds matched for mean fundamental frequency and amplitude envelope. These results show that emotional attention can operate across sensory modalities by boosting early sensory stages of processing, thus facilitating the multimodal assessment of emotionally relevant stimuli in the environment.
92. Delplanque S, Grandjean D, Chrea C, Coppin G, Aymard L, Cayeux I, Margot C, Velazco MI, Sander D, Scherer KR. Sequential unfolding of novelty and pleasantness appraisals of odors: evidence from facial electromyography and autonomic reactions. Emotion 2009;9:316-28. PMID: 19485609. DOI: 10.1037/a0015369.
Abstract
We investigated the effects of odors on appraisal processes and consequent emotional responses. The main goal was to test whether an odor is detected as novel or familiar before it is evaluated as pleasant or unpleasant. Participants performed a recognition task in which they were presented with pairs of unpleasant or pleasant odors (sample and target odors). Within a pair, the sample and target were either identical or different to assess participants' novelty detection; unpleasant and pleasant target odors were contrasted to examine participants' appraisal of intrinsic pleasantness. We measured facial expressions using electromyography and physiological reactions using electrocardiogram and electrodermal activity in response to odors. The earliest effects on facial muscles and heart rate occurred in response to novelty detection. Later effects on facial muscles and heart rate were related to pleasantness evaluation. This study is the first to demonstrate the existence of a sequence of appraisal checks for odors eliciting emotional reaction.
93. Scherer KR. Stereotype change following exposure to counter-stereotypical media heroes. Journal of Broadcasting 2009. DOI: 10.1080/08838157009363629.
94. Bänziger T, Grandjean D, Scherer KR. Emotion recognition from expressions in face, voice, and body: the Multimodal Emotion Recognition Test (MERT). Emotion 2009;9:691-704. DOI: 10.1037/a0017088.
95. Flykt A, Dan ES, Scherer KR. Using a probe detection task to assess the timing of intrinsic pleasantness appraisals. Swiss Journal of Psychology 2009. DOI: 10.1024/1421-0185.68.3.161.
Abstract
The occurrence and timing of emotion-antecedent appraisal checks are difficult to assess. We report an attempt to estimate the time window of the intrinsic pleasantness check using a dual-task probe paradigm. In three experiments, participants viewed negative and positive pictures. Their second task was a speeded response to a probe superimposed on the pictures at different stimulus onset asynchronies (SOAs). Longer probe-reaction times were observed for negative than for positive pictures. This effect appeared at SOA 300 ms or 350 ms, suggesting that the intrinsic pleasantness appraisal check yields a differential behavioral outcome around 300 ms after stimulus onset, and seems to continue unless attention to picture content is inhibited. This paradigm might be successfully used for the mental chronometry of appraisal processes.
96. Aue T, Scherer KR. Appraisal-driven somatovisceral response patterning: effects of intrinsic pleasantness and goal conduciveness. Biol Psychol 2008;79:158-64. DOI: 10.1016/j.biopsycho.2008.04.004.
97. Scherer KR, Grandjean D. Facial expressions allow inference of both emotions and their components. Cogn Emot 2008. DOI: 10.1080/02699930701516791.
98. Brosch T, Sander D, Pourtois G, Scherer KR. Beyond fear: rapid spatial orienting toward positive emotional stimuli. Psychol Sci 2008;19:362-70. PMID: 18399889. DOI: 10.1111/j.1467-9280.2008.02094.x.
Abstract
There is much empirical evidence for modulation of attention by negative, particularly fear-relevant, emotional stimuli. This modulation is often explained in terms of a fear module. Appraisal theories of emotion posit a more general mechanism, predicting attention capture by stimuli that are relevant for the needs and goals of the organism, regardless of valence. To examine the brain-activation patterns underlying attentional modulation, we recorded event-related potentials from 20 subjects performing a dot-probe task in which the cues were fear-inducing and nurturance-inducing stimuli (i.e., anger faces and baby faces). Highly similar validity modulation was found for the P1 time-locked to target onset, indicating early attentional capture by both positive and negative emotional stimuli. Topographic segmentation analysis and source localization indicate that the same amplification process is involved whether attention orienting is triggered by negative, fear-relevant stimuli or positive, nurturance-relevant stimuli. These results confirm that biological relevance, and not exclusively fear, produces an automatic spatial orienting toward the location of a stimulus.
100. Grandjean D, Sander D, Scherer KR. Conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization. Conscious Cogn 2008;17:484-95. DOI: 10.1016/j.concog.2008.03.019.