1. Pisanski K, Reby D, Oleszkiewicz A. Humans need auditory experience to produce typical volitional nonverbal vocalizations. Communications Psychology 2024; 2:65. PMID: 39242947; PMCID: PMC11332021; DOI: 10.1038/s44271-024-00104-6.
Abstract
Human nonverbal vocalizations such as screams and cries often reflect their evolved functions. Although the universality of these putatively primordial vocal signals and their phylogenetic roots in animal calls suggest a strong reflexive foundation, many of the emotional vocalizations that we humans produce are under our voluntary control. This suggests that, like speech, volitional vocalizations may require auditory input to develop typically. Here, we acoustically analyzed hundreds of volitional vocalizations produced by profoundly deaf adults and typically-hearing controls. We show that deaf adults produce unconventional and homogenous vocalizations of aggression and pain that are unusually high-pitched, unarticulated, and with extremely few harsh-sounding nonlinear phenomena compared to controls. In contrast, fear vocalizations of deaf adults are relatively acoustically typical. In four lab experiments involving a range of perception tasks with 444 participants, listeners were less accurate in identifying the intended emotions of vocalizations produced by deaf vocalizers than by controls, perceived their vocalizations as less authentic, and reliably detected deafness. Vocalizations of congenitally deaf adults with zero auditory experience were most atypical, suggesting additive effects of auditory deprivation. Vocal learning in humans may thus be required not only for speech, but also to acquire the full repertoire of volitional non-linguistic vocalizations.
Affiliation(s)
- Katarzyna Pisanski
  - ENES Bioacoustics Research Laboratory, CRNL Center for Research in Neuroscience in Lyon, University of Saint-Étienne, 42023 Saint-Étienne, France
  - CNRS French National Centre for Scientific Research, DDL Dynamics of Language Lab, University of Lyon 2, 69007 Lyon, France
  - Institute of Psychology, University of Wrocław, 50-527 Wrocław, Poland
- David Reby
  - ENES Bioacoustics Research Laboratory, CRNL Center for Research in Neuroscience in Lyon, University of Saint-Étienne, 42023 Saint-Étienne, France
  - Institut Universitaire de France, Paris, France
- Anna Oleszkiewicz
  - Institute of Psychology, University of Wrocław, 50-527 Wrocław, Poland
  - Department of Otorhinolaryngology, Smell and Taste Clinic, Carl Gustav Carus Medical School, Technische Universitaet Dresden, 01307 Dresden, Germany
2. Crawford MT, Maymon C, Miles NL, Blackburne K, Tooley M, Grimshaw GM. Emotion in motion: perceiving fear in the behaviour of individuals from minimal motion capture displays. Cogn Emot 2024; 38:451-462. PMID: 38354068; DOI: 10.1080/02699931.2023.2300748.
Abstract
The ability to quickly and accurately recognise emotional states is adaptive for numerous social functions. Although body movements are a potentially crucial cue for inferring emotions, few studies have examined the perception of body movements made in naturalistic emotional states. The current research focuses on the use of body movement information in the perception of fear expressed by targets in a virtual heights paradigm. Across three studies, participants made judgments about the emotional states of others based on motion-capture body movement recordings of those individuals actively engaged in walking a virtual plank at ground level or 80 stories above a city street. Results indicated that participants were reliably able to differentiate between height and non-height conditions (Studies 1 and 2), were more likely to spontaneously describe target behaviour in the height condition as fearful (Study 2), and gave fear estimates that were highly calibrated with the fear ratings from the targets (Studies 1-3). The findings show that VR height scenarios can induce fearful behaviour and that people can perceive fear in minimal representations of body movement.
Affiliation(s)
- Matthew T Crawford, Christopher Maymon, Nicola L Miles, Katie Blackburne, Michael Tooley, Gina M Grimshaw
  - School of Psychology, Victoria University of Wellington, Wellington, New Zealand
3. Fernandez-Fresard G, Acevedo K. Voice Performance Chart: A Pedagogical Tool to Enhance Vocal Expressive Ability in Acting Students. J Voice 2024; 38:797.e1-797.e10. PMID: 34895988; DOI: 10.1016/j.jvoice.2021.10.020.
Abstract
The purpose of the present study was to explore the effectiveness of the Voice Performance Chart (VPCH) as a pedagogical training tool to enhance the vocal expressive ability of first-year acting students. Forty recorded audio samples were perceptually assessed by six blinded raters, using a five-point Likert scale for each of the dependent variables observed. Results showed that loudness, pitch, and speech rate variations differed significantly between the vocal conditions before and after an 11-week training period. This suggests that the VPCH can be an effective pedagogical tool for developing the vocal expressive capabilities of acting students by enhancing their level of expressive nuance according to the text content. Additionally, it might be argued that the VPCH is an effective pedagogical tool not only for acting students, but also for individuals from any discipline requiring the use of the spoken voice in a professional context and/or in an expressive sense.
Affiliation(s)
- Gala Fernandez-Fresard
  - Drama School, Faculty of Arts, Pontifical Catholic University of Chile, Santiago, Chile; Metropolitan University of Education Sciences (UMCE) of Chile, Santiago, Chile
- Karol Acevedo
  - Drama School, Faculty of Arts, Pontifical Catholic University of Chile, Santiago, Chile
4. Aydın S, Onbaşı L. Graph theoretical brain connectivity measures to investigate neural correlates of music rhythms associated with fear and anger. Cogn Neurodyn 2024; 18:49-66. PMID: 38406195; PMCID: PMC10881947; DOI: 10.1007/s11571-023-09931-5.
Abstract
The present study tests the hypothesis that the emotions of fear and anger are associated with distinct psychophysiological and neural circuitry, in line with the discrete emotion model and their contrasting neurotransmitter activities, despite often being grouped together in many studies because of their similar arousal-valence scores in dimensional emotion models. EEG data were downloaded from the OpenNeuro platform (accession number ds002721). Brain connectivity estimates were obtained using both functional and effective connectivity estimators in the analysis of short (2 s) and long (6 s) EEG segments across the cortex. In tests, discrete emotions and resting states were identified by frequency-band-specific brain network measures, and contrasting emotional states were then classified with 5-fold cross-validated Long Short-Term Memory networks. Logistic regression modeling was also examined to provide robust performance criteria. Overall, the best results were obtained using Partial Directed Coherence (PDC) in the Gamma (31.5-60.5 Hz) sub-band of short EEG segments; in particular, fear and anger were classified with an accuracy of 91.79%. The overall results thus support our hypothesis. Compared with fear, anger was characterized by increased transitivity, decreased local efficiency, and lower modularity in the Gamma band. Local efficiency refers to functional brain segregation, arising from the brain's ability to exchange information locally. Transitivity refers to the overall probability that adjacent neural populations are interconnected, revealing the existence of tightly connected cortical regions. Modularity quantifies how well the brain can be partitioned into functional cortical regions. In conclusion, PDC is proposed for graph-theoretical analysis of short EEG epochs, providing robust emotional indicators sensitive to the perception of affective sounds.
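The three network measures named in this abstract (transitivity, local efficiency, modularity) can be computed on any graph. A minimal sketch with networkx on a toy graph standing in for a thresholded connectivity network; the graph and the greedy partition method are illustrative assumptions, not the study's PDC-based pipeline:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

# Toy undirected graph standing in for a thresholded EEG connectivity network
# (illustrative only; not the study's data).
G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)])

# Transitivity: overall probability that connected triples of nodes
# close into triangles (tightly interconnected neighborhoods).
t = nx.transitivity(G)

# Local efficiency: mean efficiency of each node's neighborhood subgraph,
# a measure of functional segregation (local information exchange).
le = nx.local_efficiency(G)

# Modularity of a greedy community partition: how cleanly the graph
# divides into densely connected modules.
q = modularity(G, greedy_modularity_communities(G))

print(t, le, q)
```

Higher transitivity with lower local efficiency and modularity, as reported for anger versus fear, would correspond to denser triangle closure but weaker segregation into distinct modules.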
Affiliation(s)
- Serap Aydın
  - Department of Biophysics, Faculty of Medicine, Hacettepe University, Sıhhiye, Ankara, Turkey
- Lara Onbaşı
  - School of Medicine, Hacettepe University, Sıhhiye, Ankara, Turkey
5. Emotion Elicitation through Vibrotactile Stimulation as an Alternative for Deaf and Hard of Hearing People: An EEG Study. Electronics 2022. DOI: 10.3390/electronics11142196.
Abstract
Despite technological and accessibility advances, the performing arts and their cultural offerings remain inaccessible to many people. By using vibrotactile stimulation as an alternative channel, we explored a different way to enhance the emotional processes produced while watching audiovisual media and, thus, to elicit a greater emotional reaction in hearing-impaired people. We recorded the brain activity of 35 participants with normal hearing and 8 participants with severe or total hearing loss. The results showed activation of the same areas both in participants with normal hearing while watching a video and in hearing-impaired participants while watching the same video with synchronized soft vibrotactile stimulation delivered to both hands through a proprietary stimulation glove. These brain areas (bilateral middle frontal orbitofrontal, bilateral superior frontal gyrus, and left cingulum) have been reported as emotional and attentional areas. We conclude that vibrotactile stimulation can elicit the appropriate cortical activation while watching audiovisual media.
6. Perrett D. Representations of facial expressions since Darwin. Evolutionary Human Sciences 2022; 4:e22. PMID: 37588914; PMCID: PMC10426120; DOI: 10.1017/ehs.2022.10.
Abstract
Darwin's book on expressions of emotion was one of the first publications to include photographs (Darwin, The Expression of the Emotions in Man and Animals, 1872). The inclusion of expression photographs meant that readers could form their own opinions and could, like Darwin, survey others for their interpretations. As such, the images provided an evidence base and an 'open source'. Since Darwin, increases in the representativeness and realism of emotional expressions have come from the use of composite images, colour, multiple views and dynamic displays. Research on understanding emotional expressions has been aided by the use of computer graphics to interpolate parametrically between different expressions and to extrapolate exaggerations. This review tracks the developments in how emotions are illustrated and studied and considers where to go next.
Affiliation(s)
- David Perrett
  - School of Psychology and Neuroscience, University of St Andrews, St Mary's Quad, St Andrews, Fife KY16 9JP, UK
7. Le Mau T, Hoemann K, Lyons SH, Fugate JMB, Brown EN, Gendron M, Barrett LF. Professional actors demonstrate variability, not stereotypical expressions, when portraying emotional states in photographs. Nat Commun 2021; 12:5037. PMID: 34413313; PMCID: PMC8376986; DOI: 10.1038/s41467-021-25352-6.
Abstract
It has long been hypothesized that there is a reliable, specific mapping between certain emotional states and the facial movements that express those states. This hypothesis is often tested by asking untrained participants to pose the facial movements they believe they use to express emotions during generic scenarios. Here, we test this hypothesis using, as stimuli, photographs of facial configurations posed by professional actors in response to contextually-rich scenarios. The scenarios portrayed in the photographs were rated by a convenience sample of participants for the extent to which they evoked an instance of 13 emotion categories, and actors' facial poses were coded for their specific movements. Both unsupervised and supervised machine learning find that in these photographs, the actors portrayed emotional states with variable facial configurations; instances of only three emotion categories (fear, happiness, and surprise) were portrayed with moderate reliability and specificity. The photographs were separately rated by another sample of participants for the extent to which they portrayed an instance of the 13 emotion categories; they were rated when presented alone and when presented with their associated scenarios, revealing that emotion inferences by participants also vary in a context-sensitive manner. Together, these findings suggest that facial movements and perceptions of emotion vary by situation and transcend stereotypes of emotional expressions. Future research may build on these findings by incorporating dynamic stimuli rather than photographs and studying a broader range of cultural contexts.
Affiliation(s)
- Tuan Le Mau
  - Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
  - Institute for High Performance Computing, Social and Cognitive Computing, Connexis North, Singapore
- Katie Hoemann
  - Department of Psychology, Katholieke Universiteit Leuven, Leuven, Belgium
- Sam H Lyons
  - Department of Neurology, University of Pennsylvania, Philadelphia, PA, USA
- Jennifer M B Fugate
  - Department of Psychology, University of Massachusetts at Dartmouth, Dartmouth, MA, 02747, USA
- Emery N Brown
  - Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
- Maria Gendron
  - Department of Psychology, Yale University, New Haven, CT, USA
- Lisa Feldman Barrett
  - Department of Psychology, Northeastern University, Boston, MA, USA
  - Massachusetts General Hospital/Martinos Center for Biomedical Imaging, Charlestown, MA, USA
8. Hintze S, Schanz L. Using the Judgment Bias Task to Identify Behavioral Indicators of Affective State: Do Eye Wrinkles in Horses Reflect Mood? Front Vet Sci 2021; 8:676888. PMID: 34307525; PMCID: PMC8295722; DOI: 10.3389/fvets.2021.676888.
Abstract
Identifying and validating behavioral indicators of mood are important for the assessment of animal welfare. Here, we investigated whether horses' eye wrinkle expression in a presumably neutral situation is a measure of mood as assessed in a cognitive judgment bias task (JBT). To this end, we scored pictures of the left and right eyes of 16 stallions for different aspects of eye wrinkle expression and tested the same individuals on a spatial JBT with active trial initiation. Eye wrinkle expressions were assessed by a qualitative assessment, i.e., the overall assessment of how "worried" horses look, the number of wrinkles, and the angle measured at the intersection of lines drawn through the eyeball and the topmost wrinkle. Correlations between the three eye wrinkle measures and the optimism index as a measure of horses' decisions in the JBT were not statistically significant, but with increasing optimism index, horses tended to be scored as looking less worried (qualitative assessment). We discuss our findings from different perspectives and make suggestions for future research, e.g., by calling for experimental induction of mood and thus greater variation within and/or between individuals and by investigating the interplay between shorter-lasting emotional and longer-lasting mood states to further explore the potential use of the JBT to validate eye wrinkles and other facial or body expressions as indicators of mood.
Affiliation(s)
- Sara Hintze, Lisa Schanz
  - Division of Livestock Sciences, Department of Sustainable Agricultural Systems, University of Natural Resources and Life Sciences, Vienna, Austria
9.
Abstract
With a shift in interest toward dynamic expressions, numerous corpora of dynamic facial stimuli have been developed over the past two decades. The present research aimed to test existing sets of dynamic facial expressions (published between 2000 and 2015) in a cross-corpus validation effort. For this, 14 dynamic databases were selected that featured facial expressions of the basic six emotions (anger, disgust, fear, happiness, sadness, surprise) in posed or spontaneous form. In Study 1, a subset of stimuli from each database (N = 162) was presented to human observers and machine analysis, yielding considerable variance in emotion recognition performance across the databases. Classification accuracy further varied with perceived intensity and naturalness of the displays, with posed expressions being judged more accurately and as more intense, but less natural, compared to spontaneous ones. Study 2 aimed for a full validation of the 14 databases by subjecting the entire stimulus set (N = 3812) to machine analysis. A FACS-based Action Unit (AU) analysis revealed that facial AU configurations were more prototypical in posed than spontaneous expressions. The prototypicality of an expression in turn predicted emotion classification accuracy, with higher performance observed for more prototypical facial behavior. Furthermore, technical features of each database (i.e., duration, face box size, head rotation, and motion) had a significant impact on recognition accuracy. Together, the findings suggest that existing databases vary in their ability to signal specific emotions, thereby facing a trade-off between realism and ecological validity on the one hand, and expression uniformity and comparability on the other.
10. Yelderman LA, Estrada-Reynolds V, Lawrence TI. Release or Denial: Evaluating the Roles of Emotion and Risk in Parole Decisions. Psychol Rep 2021; 125:2088-2108. PMID: 33845670; DOI: 10.1177/00332941211007929.
Abstract
Parole boards often incorporate numerous factors when making release decisions. These factors are typically related to the inmates' case files. However, in some instances, parole boards' decisions are influenced by factors outside of the case files, sometimes referred to as extra-legal factors. According to the emotion as social information model, emotion can communicate specific messages to others, and in this case, parole board members might unknowingly incorporate their own emotions and inmates' emotional displays into their decisions. The current study examines the role of parole board member and inmate emotional expressions as predictors of parole release decisions. Parole hearings were coded for emotion, parole board and inmate gender, supporter presence, and risk scores. Overall, risk scores and parole board members' emotions predicted release decisions. Higher risk scores were associated with a lower likelihood of release, and inmates' negative emotion was related to a lower likelihood of release. Implications are discussed.
Affiliation(s)
- Logan A Yelderman
  - Department of Psychology, College of Juvenile Justice and Psychology, Prairie View A&M University, Prairie View, TX, USA
- Timothy I Lawrence
  - Department of Psychology, College of Juvenile Justice and Psychology, Prairie View A&M University, Prairie View, TX, USA
11. Fugate JMB, MacDonald C, O'Hare AJ. Emotion Words' Effect on Visual Awareness and Attention of Emotional Faces. Front Psychol 2020; 10:2896. PMID: 32010012; PMCID: PMC6974626; DOI: 10.3389/fpsyg.2019.02896.
Abstract
To explore whether the meaning of a word changes visual processing of emotional faces (i.e., visual awareness and visual attention), we performed two complementary studies. In Experiment 1, we presented participants with emotion and control words and then tracked their visual awareness for two competing emotional faces using a binocular rivalry paradigm. Participants experienced the emotional face congruent with the emotion word for longer than a word-incongruent emotional face, as would be expected if the word was biasing awareness toward the (unseen) face. In Experiment 2, we similarly presented participants with emotion and control words prior to presenting emotional faces using a divided visual field paradigm. Emotion words were congruent with either the emotional face in the right or left visual field. After the presentation of faces, participants saw a dot in either the left or right visual field. Participants were slower to identify the location of the dot when it appeared in the same visual field as the emotional face congruent with the emotion word. The effect was limited to the left hemisphere (RVF), as would be expected for linguistic integration of the word with the face. Since the task was not linguistic, but rather a simple dot-probe task, participants were slower in their responses under these conditions because they likely had to disengage from the additional linguistic processing caused by the word-face integration. These findings indicate that emotion words bias visual awareness for congruent emotional faces, as well as shift attention toward congruent emotional faces.
Affiliation(s)
- Jennifer M. B. Fugate, Cameron MacDonald
  - Department of Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States
- Aminda J. O'Hare
  - Department of Psychology, Weber State University, Ogden, UT, United States
12. Franco CL, Fugate JMB. Emoji Face Renderings: Exploring the Role Emoji Platform Differences have on Emotional Interpretation. Journal of Nonverbal Behavior 2020. DOI: 10.1007/s10919-019-00330-1.
13. Barrett LF, Adolphs R, Marsella S, Martinez A, Pollak SD. Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements. Psychol Sci Public Interest 2019; 20:1-68. PMID: 31313636; PMCID: PMC6640856; DOI: 10.1177/1529100619832930.
Abstract
It is commonly assumed that a person's emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. 
We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.
Affiliation(s)
- Lisa Feldman Barrett
  - Northeastern University, Department of Psychology, Boston, MA
  - Massachusetts General Hospital, Department of Psychiatry and the Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA
  - Harvard Medical School, Department of Psychiatry, Boston, MA
- Ralph Adolphs
  - California Institute of Technology, Departments of Psychology, Neuroscience, and Biology, Pasadena, CA
- Stacy Marsella
  - Northeastern University, Department of Psychology, Boston, MA
  - Northeastern University, College of Computer and Information Science, Boston, MA
  - University of Glasgow, Glasgow, Scotland
- Aleix Martinez
  - The Ohio State University, Department of Electrical and Computer Engineering, and Center for Cognitive and Brain Sciences, Columbus, OH
- Seth D. Pollak
  - University of Wisconsin-Madison, Department of Psychology, Madison, WI
14. Guo K, Li Z, Yan Y, Li W. Viewing heterospecific facial expressions: an eye-tracking study of human and monkey viewers. Exp Brain Res 2019; 237:2045-2059. PMID: 31165915; PMCID: PMC6647127; DOI: 10.1007/s00221-019-05574-3.
Abstract
Common facial expressions of emotion have distinctive patterns of facial muscle movements that are culturally similar among humans, and perceiving these expressions is associated with stereotypical gaze allocation at local facial regions that are characteristic for each expression, such as the eyes in angry faces. It is, however, unclear to what extent this 'universality' view extends to the processing of heterospecific facial expressions, and how a 'social learning' process contributes to heterospecific expression perception. In this eye-tracking study, we examined the face-viewing gaze allocation of human (dog owners and non-dog owners) and monkey observers while they explored expressive human, chimpanzee, monkey and dog faces (positive, neutral and negative expressions in human and dog faces; neutral and negative expressions in chimpanzee and monkey faces). Human observers showed species- and experience-dependent expression categorization accuracy. Furthermore, both human and monkey observers demonstrated different, species-dependent face-viewing gaze distributions. Specifically, when judging facial expressions, humans predominantly attended to the eyes of human faces but to the mouth of animal faces. Monkeys' gaze distributions when exploring human and monkey faces were qualitatively different from those when exploring chimpanzee and dog faces. Interestingly, the gaze behaviour of both human and monkey observers was further affected by their prior experience of the viewed species. It seems that facial expression processing is species dependent, and social learning may play a significant role in discriminating even rudimentary types of heterospecific expressions.
Affiliation(s)
- Kun Guo
  - School of Psychology, University of Lincoln, Lincoln, LN6 7TS, UK
- Zhihan Li, Yin Yan, Wu Li
  - State Key Laboratory of Cognitive Neuroscience and Learning, and IDG, Beijing Normal University, Beijing, 100875, China
15. Spinelli S, Jaeger SR. What do we know about the sensory drivers of emotions in foods and beverages? Curr Opin Food Sci 2019. DOI: 10.1016/j.cofs.2019.06.007.
16. Fugate JMB, Franco CL. What Color Is Your Anger? Assessing Color-Emotion Pairings in English Speakers. Front Psychol 2019; 10:206. PMID: 30863330; PMCID: PMC6399154; DOI: 10.3389/fpsyg.2019.00206.
Abstract
Do English speakers think about anger as “red” and sadness as “blue”? Some theories of emotion suggest that colors, like other biologically derived signals, should be reliably paired with an emotion, and that colors should differentiate across emotions. We assessed consistency and specificity of color-emotion pairings among English-speaking adults. In Study 1, participants (n = 73) completed an online survey in which they could select up to three colors from 23 colored swatches (varying in hue, saturation, and lightness) for each of ten emotion words. In Study 2, different participants (n = 52) completed a similar online survey, except that we added additional emotions and colors (which better sampled color space). Participants in both studies indicated the strength of the relationship between the selected color(s) and the emotion. In Study 1, four of the ten emotions showed consistency and about one-third of the colors showed specificity, yet agreement was low to moderate among raters even in these cases. When we resampled our data, however, none of these effects were likely to replicate with statistical confidence. In Study 2, only two of 20 emotions showed consistency, and three colors showed specificity. As in the first study, no color-emotion pairings were both specific and consistent. In addition, in Study 2, we found that saturation and lightness, and to a lesser extent hue, predicted color-emotion agreement rather than perceived color. The results suggest that the emotion-color pairings reported in previous studies are likely best thought of as experiment-specific. The results are discussed with respect to constructionist theories of emotion.
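Consistency of a color-emotion pairing can be quantified as the share of raters choosing the modal color, and the resampling check described in this abstract can be sketched as a bootstrap over raters. A minimal sketch; the color labels and counts below are invented for illustration and are not the study's data:

```python
import random
from collections import Counter

random.seed(1)

# Hypothetical color choices for one emotion word from 52 raters
# (labels and counts invented for illustration; not the study's data).
choices = ["red"] * 20 + ["dark red"] * 10 + ["orange"] * 8 + ["black"] * 14

def modal_agreement(sample):
    """Proportion of raters who picked the most common color."""
    return Counter(sample).most_common(1)[0][1] / len(sample)

observed = modal_agreement(choices)

# Bootstrap: resample raters with replacement, recompute agreement,
# and take the 2.5th/97.5th percentiles as an interval estimate.
boot = sorted(modal_agreement(random.choices(choices, k=len(choices)))
              for _ in range(2000))
ci = (boot[49], boot[1949])
print(observed, ci)
```

A wide interval straddling chance-level agreement would signal, as the authors report, that an apparent pairing is unlikely to replicate.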
Affiliation(s)
- Courtny L Franco
- Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States

17
Olderbak S, Semmler M, Doebler P. Four-Branch Model of Ability Emotional Intelligence With Fluid and Crystallized Intelligence: A Meta-Analysis of Relations. EMOTION REVIEW 2018. [DOI: 10.1177/1754073918776776] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
We meta-analytically investigated relations between the four-branch model of ability emotional intelligence (EI) and fluid (Gf) and crystallized intelligence (Gc; 352 effect sizes; ntotal = 15,333). We found that for each branch, the strengths of the relations with Gf and Gc were equivalent. Understanding emotions has the strongest relation with Gf/Gc combined (ρ = .43, k = 81, n = 11,524), relative to facilitating thought using emotion (ρ = .19, k = 51, n = 7,254), managing emotions (ρ = .20, k = 74, n = 11,359), and perceiving emotion (ρ = .20, k = 79, n = 9,636); for the latter, relations were also moderated by stimulus type. We conclude with implications and recommendations for the study of ability EI.
Affiliation(s)
- Sally Olderbak
- Individual Differences and Psychological Assessment Department, Institute for Psychology and Education, Ulm University, Germany
- Statistical Methods in Social Research, TU Dortmund University, Germany
- Martin Semmler
- Individual Differences and Psychological Assessment Department, Institute for Psychology and Education, Ulm University, Germany
- Statistical Methods in Social Research, TU Dortmund University, Germany
- Philipp Doebler
- Statistical Methods in Social Research, TU Dortmund University, Germany

18
Zhao G, Zhang Y, Ge Y. Frontal EEG Asymmetry and Middle Line Power Difference in Discrete Emotions. Front Behav Neurosci 2018; 12:225. [PMID: 30443208 PMCID: PMC6221898 DOI: 10.3389/fnbeh.2018.00225] [Citation(s) in RCA: 46] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2018] [Accepted: 09/10/2018] [Indexed: 12/25/2022] Open
Abstract
A traditional model of emotion cannot explain the differences in brain activity between two discrete emotions that occupy similar positions in the valence-arousal coordinate space. The current study elicited two positive emotions (amusement and tenderness) and two negative emotions (anger and fear) that are similar in both valence and arousal to examine the differences in brain activity across these emotional states. Frontal electroencephalographic (EEG) asymmetry and midline power in three bands (theta, alpha, and beta) were measured while participants watched affective film excerpts. Significant differences were detected between tenderness and amusement in FP1/FP2 theta asymmetry and F3/F4 theta and alpha asymmetry. Significant differences between anger and fear in FP1/FP2 theta asymmetry and F3/F4 alpha asymmetry were also observed. For midline power, midline theta power could distinguish the two negative emotions, while midline alpha and beta power could effectively differentiate the two positive emotions. Liking and dominance were also related to EEG features. Stepwise multiple linear regression revealed that frontal alpha and theta asymmetry could predict the subjective feelings of the two positive and two negative emotions in different patterns. Binary classification using frontal EEG asymmetry and midline power as features and a support vector machine (SVM) as the classifier reached accuracies of 64.52% for tenderness versus amusement and 78.79% for anger versus fear, and accuracy improved further when these features were added to other features extracted across the scalp. These findings indicate that frontal EEG asymmetry and midline power may have the potential to distinguish discrete emotions that are similar in the valence-arousal coordinate space.
Affiliation(s)
- Guozhen Zhao
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yulin Zhang
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yan Ge
- CAS Key Laboratory of Behavioral Science, Institute of Psychology, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China

19
Wu Y, Baker CL, Tenenbaum JB, Schulz LE. Rational Inference of Beliefs and Desires From Emotional Expressions. Cogn Sci 2018; 42:850-884. [PMID: 28986938 PMCID: PMC6033160 DOI: 10.1111/cogs.12548] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2016] [Revised: 07/14/2017] [Accepted: 07/19/2017] [Indexed: 01/09/2023]
Abstract
We investigated people's ability to infer others' mental states from their emotional reactions, manipulating whether agents wanted, expected, and caused an outcome. Participants recovered agents' desires throughout. When the agent observed, but did not cause the outcome, participants' ability to recover the agent's beliefs depended on the evidence they got (i.e., her reaction only to the actual outcome or to both the expected and actual outcomes; Experiments 1 and 2). When the agent caused the event, participants' judgments also depended on the probability of the action (Experiments 3 and 4); when actions were improbable given the mental states, people failed to recover the agent's beliefs even when they saw her react to both the anticipated and actual outcomes. A Bayesian model captured human performance throughout (rs ≥ .95), consistent with the proposal that people rationally integrate information about others' actions and emotional reactions to infer their unobservable mental states.
Affiliation(s)
- Yang Wu
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology
- Chris L. Baker
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology
- Joshua B. Tenenbaum
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology
- Laura E. Schulz
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology

20
Guo K, Soornack Y, Settle R. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion. Vision Res 2018; 157:112-122. [PMID: 29496513 DOI: 10.1016/j.visres.2018.02.001] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/06/2017] [Revised: 02/02/2018] [Accepted: 02/04/2018] [Indexed: 11/29/2022]
Abstract
Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how susceptible expression perception is to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (Experiment 1) and blur (Experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution to as little as 48 × 64 pixels or increasing image blur to as much as 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity ratings, increased reaction time and fixation duration, and a stronger central fixation bias that was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent, with less impact on happy and surprise expressions, suggesting that this distortion-invariant facial expression perception might be achieved through a categorical model involving a non-linear, configural combination of local facial features.
Affiliation(s)
- Kun Guo
- School of Psychology, University of Lincoln, UK.

21
Barrett LF. The theory of constructed emotion: an active inference account of interoception and categorization. Soc Cogn Affect Neurosci 2017; 12:1-23. [PMID: 27798257 PMCID: PMC5390700 DOI: 10.1093/scan/nsw154] [Citation(s) in RCA: 297] [Impact Index Per Article: 42.4] [Reference Citation Analysis] [Abstract] [Key Words] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2016] [Accepted: 10/11/2016] [Indexed: 12/21/2022] Open
Abstract
The science of emotion has been using folk psychology categories derived from philosophy to search for the brain basis of emotion. The last two decades of neuroscience research, however, have brought us to the brink of a paradigm shift in understanding the workings of the brain, setting the stage to revolutionize our understanding of what emotions are and how they work. In this article, we begin with the structure and function of the brain, and from there deduce what the biological basis of emotions might be. The answer is a brain-based, computational account called the theory of constructed emotion.
Affiliation(s)
- Lisa Feldman Barrett
- Department of Psychology, Northeastern University, Boston, MA, USA; Athinoula A. Martinos Center for Biomedical Imaging; Psychiatric Neuroimaging Division, Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, USA

22
Moors A. Integration of Two Skeptical Emotion Theories: Dimensional Appraisal Theory and Russell's Psychological Construction Theory. PSYCHOLOGICAL INQUIRY 2017. [DOI: 10.1080/1047840x.2017.1235900] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Affiliation(s)
- Agnes Moors
- Research Group of Quantitative Psychology and Individual Differences, KU Leuven, Leuven, Belgium
- Center for Social and Cultural Psychology, KU Leuven, Leuven, Belgium
- Experimental Clinical and Health Psychology, Department of Psychology, Ghent University, Ghent, Belgium
23
Zhou H, Majka EA, Epley N. Inferring Perspective Versus Getting Perspective: Underestimating the Value of Being in Another Person’s Shoes. Psychol Sci 2017; 28:482-493. [DOI: 10.1177/0956797616687124] [Citation(s) in RCA: 28] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
People use at least two strategies to solve the challenge of understanding another person’s mind: inferring that person’s perspective by reading his or her behavior (theorization) and getting that person’s perspective by experiencing his or her situation (simulation). The five experiments reported here demonstrate a strong tendency for people to underestimate the value of simulation. Predictors estimated a stranger’s emotional reactions toward 50 pictures. They could either infer the stranger’s perspective by reading his or her facial expressions or simulate the stranger’s perspective by watching the pictures he or she viewed. Predictors were substantially more accurate when they got perspective through simulation, but overestimated the accuracy they had achieved by inferring perspective. Predictors’ miscalibrated confidence stemmed from overestimating the information revealed through facial expressions and underestimating the similarity in people’s reactions to a given situation. People seem to underappreciate a useful strategy for understanding the minds of others, even after they gain firsthand experience with both strategies.
Affiliation(s)
- Haotian Zhou
- School of Entrepreneurship and Management, Shanghai Tech University

24
Affiliation(s)
- Paul L. Harris
- Graduate School of Education, Harvard University, Cambridge, MA, USA
25
Rethinking primate facial expression: A predictive framework. Neurosci Biobehav Rev 2016; 82:13-21. [PMID: 27637495 DOI: 10.1016/j.neubiorev.2016.09.005] [Citation(s) in RCA: 43] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2016] [Revised: 08/31/2016] [Accepted: 09/12/2016] [Indexed: 11/24/2022]
Abstract
Primate facial expression has long been studied within a framework of emotion that has heavily influenced both theoretical approaches and scientific methods. For example, our understanding of the adaptive function and cognition of facial expression is tied to the assumption that facial expression is accompanied by an emotional internal state that is decipherable by others. Here, we challenge this view and instead support the alternative that facial expression should also be conceptualised as an indicator of future behaviour, rather than of current emotional state alone (the Behavioural Ecology View; Fridlund, 1994). We also advocate the use of standardised, objective methodology, such as the Facial Action Coding System, to avoid making assumptions about the underlying emotional state of animals producing facial expressions. We argue that broadening our approach to facial expression in this way will open new avenues to explore the underlying neurobiology, cognition and evolution of facial communication in both human and non-human primates.
26
Corneanu CA, Simon MO, Cohn JF, Guerrero SE. Survey on RGB, 3D, Thermal, and Multimodal Approaches for Facial Expression Recognition: History, Trends, and Affect-Related Applications. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 2016; 38:1548-68. [PMID: 26761193 PMCID: PMC7426891 DOI: 10.1109/tpami.2016.2515606] [Citation(s) in RCA: 90] [Impact Index Per Article: 11.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/10/2023]
Abstract
Facial expressions are an important way through which humans interact socially. Building a system capable of automatically recognizing facial expressions from images and video has been an intense field of study in recent years. Interpreting such expressions remains challenging, and much research is needed on the way they relate to human affect. This paper presents a general overview of automatic RGB, 3D, thermal and multimodal facial expression analysis. We define a new taxonomy for the field, encompassing all steps from face detection to facial expression recognition, and describe and classify state-of-the-art methods accordingly. We also present the important datasets and the benchmarking of the most influential methods. We conclude with a general discussion of trends, important questions and future lines of research.
27

28
Hildebrandt A, Sommer W, Schacht A, Wilhelm O. Perceiving and remembering emotional facial expressions — A basic facet of emotional intelligence. INTELLIGENCE 2015. [DOI: 10.1016/j.intell.2015.02.003] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
29
Jürgens R, Grass A, Drolet M, Fischer J. Effect of Acting Experience on Emotion Expression and Recognition in Voice: Non-Actors Provide Better Stimuli than Expected. JOURNAL OF NONVERBAL BEHAVIOR 2015; 39:195-214. [PMID: 26246649 PMCID: PMC4519627 DOI: 10.1007/s10919-015-0209-5] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
Both in the performing arts and in emotion research, professional actors are assumed to be capable of delivering emotional expressions comparable to spontaneous ones. This study examines the effects of acting training on vocal emotion depiction and recognition. We predicted that professional actors would express emotions in a more realistic fashion than non-professional actors. However, professional acting training may lead to a particular speech pattern, which might make vocal expressions by trained actors less comparable to authentic samples than those by non-professional actors. We compared 80 emotional speech tokens from radio interviews with 80 re-enactments each by professional and by inexperienced actors. We analyzed recognition accuracies for emotion and authenticity ratings and compared the acoustic structure of the speech tokens. Both play-acted conditions yielded similar recognition accuracies and possessed more variable pitch contours than the spontaneous recordings. However, professional actors exhibited signs of articulation patterns different from those of non-trained speakers. Our results indicate that, for emotion research, emotional expressions by professional actors are no better suited than those from non-actors.
Affiliation(s)
- Rebecca Jürgens
- Cognitive Ethology Laboratory, German Primate Center, Kellnerweg 4, 37077 Göttingen, Germany; Courant Research Centre "Evolution of Social Behaviour", University of Göttingen, Göttingen, Germany
- Annika Grass
- Cognitive Ethology Laboratory, German Primate Center, Kellnerweg 4, 37077 Göttingen, Germany; Courant Research Centre "Text Structures", University of Göttingen, Göttingen, Germany
- Matthis Drolet
- Cognitive Ethology Laboratory, German Primate Center, Kellnerweg 4, 37077 Göttingen, Germany
- Julia Fischer
- Cognitive Ethology Laboratory, German Primate Center, Kellnerweg 4, 37077 Göttingen, Germany; Courant Research Centre "Evolution of Social Behaviour", University of Göttingen, Göttingen, Germany

30
Touroutoglou A, Lindquist KA, Dickerson BC, Barrett LF. Intrinsic connectivity in the human brain does not reveal networks for 'basic' emotions. Soc Cogn Affect Neurosci 2015; 10:1257-65. [PMID: 25680990 DOI: 10.1093/scan/nsv013] [Citation(s) in RCA: 73] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/27/2014] [Accepted: 02/09/2015] [Indexed: 11/14/2022] Open
Abstract
We tested two competing models of the brain basis of emotion, the basic emotion theory and the conceptual act theory of emotion, using resting-state functional connectivity magnetic resonance imaging (rs-fcMRI). The basic emotion view hypothesizes that anger, sadness, fear, disgust and happiness each arise from a brain network that is innate, anatomically constrained and homologous in other animals. The conceptual act theory of emotion hypothesizes that an instance of emotion is a brain state constructed from the interaction of domain-general core systems within the brain, such as the salience, default mode and frontoparietal control networks. Using peak coordinates derived from a meta-analysis of task-evoked emotion fMRI studies, we generated a set of whole-brain rs-fcMRI 'discovery' maps for each emotion category and examined the spatial overlap in their conjunctions. Instead of discovering a specific network for each emotion category, we found that variance in the discovery maps was accounted for by known domain-general networks; furthermore, the salience network was observed as part of every emotion category. These results indicate that specific networks for each emotion do not exist within the intrinsic architecture of the human brain, and instead support the conceptual act theory of emotion.
Affiliation(s)
- Alexandra Touroutoglou
- Department of Neurology, Athinoula A. Martinos Center for Biomedical Imaging, and Psychiatric Neuroimaging Division, Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, USA
- Kristen A Lindquist
- Department of Psychology and Biomedical Research Imaging Center, University of North Carolina, Chapel Hill, NC, USA
- Bradford C Dickerson
- Athinoula A. Martinos Center for Biomedical Imaging, and Frontotemporal Disorders Unit, Department of Neurology, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, USA
- Lisa Feldman Barrett
- Athinoula A. Martinos Center for Biomedical Imaging, and Psychiatric Neuroimaging Division, Department of Psychiatry, Massachusetts General Hospital and Harvard Medical School, Charlestown, MA, USA; Department of Psychology, Northeastern University, Boston, MA, USA

31
Stringaris A, Castellanos-Ryan N, Banaschewski T, Barker GJ, Bokde AL, Bromberg U, Büchel C, Fauth-Bühler M, Flor H, Frouin V, Gallinat J, Garavan H, Gowland P, Heinz A, Itterman B, Lawrence C, Nees F, Paillere-Martinot ML, Paus T, Pausova Z, Rietschel M, Smolka MN, Schumann G, Goodman R, Conrod P. Dimensions of manic symptoms in youth: psychosocial impairment and cognitive performance in the IMAGEN sample. J Child Psychol Psychiatry 2014; 55:1380-9. [PMID: 24865127 PMCID: PMC4167034 DOI: 10.1111/jcpp.12255] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 03/25/2014] [Indexed: 11/28/2022]
Abstract
BACKGROUND: It has been reported that mania may be associated with superior cognitive performance. In this study, we test the hypothesis that manic symptoms in youth separate along two correlated dimensions and that a symptom constellation of high energy and cheerfulness is associated with superior cognitive performance. METHOD: We studied 1755 participants of the IMAGEN study, of average age 14.4 years (SD = 0.43), 50.7% girls. Manic symptoms were assessed using the Development and Wellbeing Assessment by interviewing parents and young people. Cognition was assessed using the Wechsler Intelligence Scale for Children (WISC-IV) and a response inhibition task. RESULTS: Manic symptoms in youth formed two correlated dimensions: one termed exuberance, characterized by high energy and cheerfulness, and one of undercontrol, with distractibility, irritability and risk-taking behavior. Only the undercontrol dimension, but not the exuberant dimension, was independently associated with measures of psychosocial impairment. In multivariate regression models, the exuberant, but not the undercontrolled, dimension was positively and significantly associated with verbal IQ by both parent- and self-report; conversely, the undercontrolled, but not the exuberant, dimension was associated with poor performance on a response inhibition task. CONCLUSIONS: Our findings suggest that manic symptoms in youth may form dimensions with distinct correlates. The results are in keeping with previous findings of superior performance associated with mania. Further research is required to study etiological differences between these symptom dimensions and their implications for clinical practice.
Affiliation(s)
- Natalie Castellanos-Ryan
- Department of Psychiatry, Centre de recherche du CHU Ste-Justine, University of Montreal, Montreal, QC, Canada
- Tobias Banaschewski
- Department of Child and Adolescent Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Germany; Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- Arun L Bokde
- Institute of Neuroscience, School of Medicine, Trinity College Dublin, Dublin, Ireland
- Uli Bromberg
- University Medical Centre Hamburg-Eppendorf, Hamburg, Germany
- Mira Fauth-Bühler
- Department of Child and Adolescent Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Germany
- Herta Flor
- Department of Child and Adolescent Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Germany; Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- Juergen Gallinat
- Department of Psychiatry and Psychology, Campus Charite Mitte, Charite Universitatsmedizin Berlin, Berlin, Germany
- Hugh Garavan
- Institute of Neuroscience, School of Medicine, Trinity College Dublin, Dublin, Ireland; Department of Psychiatry and Psychology, University of Vermont, Burlington, VT, USA
- Penny Gowland
- School of Physics and Astronomy, University of Nottingham, Nottingham, UK
- Andreas Heinz
- Department of Psychiatry and Psychology, Campus Charite Mitte, Charite Universitatsmedizin Berlin, Berlin, Germany
- Bernd Itterman
- Physikalisch-Technische Bundesanstalt (PTB), Berlin, Germany
- Claire Lawrence
- School of Psychology, University of Nottingham, Nottingham, UK
- Frauke Nees
- Department of Child and Adolescent Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Germany; Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- Marie-Laure Paillere-Martinot
- Institut National de la Sante et de la Recherche Medicale, INSERM CEA Unit 1000 Imaging & Psychiatry, University Paris Sud, Orsay, France
- Tomas Paus
- School of Psychology, University of Nottingham, Nottingham, UK; Rotman Research Institute, University of Toronto, Toronto, ON, Canada; Montreal Neurological Institute, McGill University, Montreal, QC, Canada
- Zdenka Pausova
- The Hospital for Sick Children, University of Toronto, Toronto, ON, Canada
- Marcella Rietschel
- Department of Child and Adolescent Psychiatry and Psychotherapy, Central Institute of Mental Health, Mannheim, Germany; Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany; Department of Genetic Epidemiology, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
- Michael N Smolka
- Neuroimaging Centre, Department of Psychiatry and Psychotherapy, Dresden University of Technology, Dresden, Germany
- Robert Goodman
- Institute of Psychiatry, King's College London, London, UK

32
Abstract
While emotion is a central component of human health and well-being, traditional approaches to understanding its biological function have been wanting. A dynamic systems model, however, broadly redefines and recasts emotion as a primary sensory system, perhaps the first sensory system to have emerged, serving the ancient autopoietic function of "self-regulation." Drawing upon molecular biology and revelations from the field of epigenetics, the model suggests that human emotional perceptions provide an ongoing stream of "self-relevant" sensory information concerning optimally adaptive states between the organism and its immediate environment, along with coupled behavioral corrections that honor a universal self-regulatory logic, one still encoded within cellular signaling and immune functions. Exemplified by the fundamental molecular circuitry of sensorimotor control in the E. coli bacterium, the model suggests that the hedonic (affective) categories emerge directly from positive and negative feedback processes, their good/bad binary appraisals relating to dual self-regulatory behavioral regimes (evolutionary purposes) through which organisms actively participate in natural selection, and through which humans can interpret optimal or deficit states of balanced being and becoming. The self-regulatory sensory paradigm transcends anthropomorphism and unites divergent theoretical perspectives and isolated bodies of literature, while challenging time-honored assumptions. While suppressive regulatory strategies abound, it suggests that emotions are better understood as regulating us, providing a service crucial to all semantic language, learning systems, and evaluative decision-making, and fundamental to optimal physical, mental, and social health.
Affiliation(s)
- Katherine T Peil
- College of Professional Studies, Northeastern University, Boston, Massachusetts; Harvard Divinity School, Cambridge, Massachusetts, United States

33
Gendron M, Roberson D, van der Vyver JM, Barrett LF. Cultural relativity in perceiving emotion from vocalizations. Psychol Sci 2014; 25:911-20. [PMID: 24501109 DOI: 10.1177/0956797613517239] [Citation(s) in RCA: 75] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
A central question in the study of human behavior is whether certain emotions, such as anger, fear, and sadness, are recognized in nonverbal cues across cultures. We predicted and found that in a concept-free experimental task, participants from an isolated cultural context (the Himba ethnic group from northwestern Namibia) did not freely label Western vocalizations with expected emotion terms. Responses indicate that Himba participants perceived more basic affective properties of valence (positivity or negativity) and to some extent arousal (high or low activation). In a second, concept-embedded task, we manipulated whether the target and foil on a given trial matched in both valence and arousal, neither valence nor arousal, valence only, or arousal only. Himba participants achieved above-chance accuracy only when foils differed from targets in valence only. Our results indicate that the voice can reliably convey affective meaning across cultures, but that perceptions of emotion from the voice are culturally variable.
Affiliation(s)
- Maria Gendron
- Affective Science Institute, Northeastern University

34
Lee DH, Mirza R, Flanagan JG, Anderson AK. Optical origins of opposing facial expression actions. Psychol Sci 2014; 25:745-52. [PMID: 24463554 DOI: 10.1177/0956797613514451] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
Darwin theorized that emotional expressions originated as opposing functional adaptations for the expresser, not as distinct categories of social signals. Given that two thirds of the eye's refractive power comes from the cornea, we examined whether opposing expressive behaviors that widen the eyes (e.g., fear) or narrow the eyes (e.g., disgust) may have served as an optical trade-off, enhancing either sensitivity or acuity and thereby promoting stimulus localization ("where") or stimulus discrimination ("what"), respectively. An optical model based on eye apertures of posed fear and disgust expressions supported this functional trade-off. We then tested the model using standardized optometric measures of sensitivity and acuity. We demonstrated that eye widening enhanced stimulus detection, whereas eye narrowing enhanced discrimination, each at the expense of the other. Opposing expressive actions around the eye may thus reflect origins in an optical principle, shaping visual encoding at its earliest stage: how light is cast onto the retina.
Affiliation(s)
- Daniel H Lee
- Department of Psychology, University of Toronto
35
Harkness AR, Reynolds SM, Lilienfeld SO. A review of systems for psychology and psychiatry: adaptive systems, personality psychopathology five (PSY-5), and the DSM-5. J Pers Assess 2013; 96:121-39. [PMID: 23941204 DOI: 10.1080/00223891.2013.823438] [Citation(s) in RCA: 50] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
We outline a crisis in clinical description, in which atheoretical categorical descriptors, as in the Diagnostic and Statistical Manual of Mental Disorders (DSM), have turned focus away from the obvious: evolved major adaptive systems. Adaptive systems, at the core of a medical review of systems (ROS), allow models of pathology to be layered over an understanding of systems as they normally function. We argue that clinical psychology and psychiatry would develop more programmatically by incorporating 5 systems evolved for adaptation to the external environment: reality modeling for action, short-term danger detection, long-term cost-benefit projection, resource acquisition, and agenda protection. These systems, although not exhaustive, coincide with great historical issues in psychology, psychopathology, and individual differences. Readers of this journal should be interested in this approach because personality is seen as a relatively stable property of these systems. Thus, an essential starting point in ROS-based clinical description involves personality assessment. But this approach also places demands on scientist-practitioners to integrate across sciences. An ROS promotes theories that are (a) compositional, answering the question: What elements comprise the system?; (b) dynamic, answering: How do the elements and other systems interact?; and (c) developmental, answering: How do systems change over time? The proposed ROS corresponds well with the National Institute of Mental Health's recent research domain criteria (RDoC) approach. We urge that in the RDoC approach, measurement variables should be treated as falsifiable and theory-laden markers, not unfalsifiable criteria. We argue that our proposed ROS promotes integration across sciences, rather than fostering the isolation of sciences allowed by atheoretical observation terms, as in the DSM.
Affiliation(s)
- Allan R Harkness
- Department of Psychology and Institute for Biochemical and Psychological Study of Individual Differences, The University of Tulsa
36
Farb NAS, Chapman HA, Anderson AK. Emotions: form follows function. Curr Opin Neurobiol 2013; 23:393-8. [PMID: 23375166 DOI: 10.1016/j.conb.2013.01.015] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2013] [Accepted: 01/13/2013] [Indexed: 10/27/2022]
Abstract
Emotion research has been divided by debate as to whether emotions are universal in form or cognitively constructed. We review an emerging approach that focuses on function rather than form. Functional affective science suggests that the particular origin of an emotion is relatively unimportant; instead, emotions can be understood in terms of a rapidly deployed set of mechanisms that structure perception, cognition and behavior to facilitate goal fulfillment. Evidence from this approach suggests at least three major functions of emotion: sensory gating, embodying affect, and integrating knowledge toward goal resolution. These functions appear to be universal and automatically activated, yet also moderated by conscious representation and regulatory efforts.
37
Abstract
In this review, we highlight evidence suggesting that concepts represented in language are used to create a perception of emotion from the constant ebb and flow of other people's facial muscle movements. In this "construction hypothesis" (cf. Gendron, Lindquist, Barsalou, & Barrett, 2012; see also Barrett, 2006b; Barrett, Lindquist, & Gendron, 2007; Barrett, Mesquita, & Gendron, 2011), language plays a constitutive role in emotion perception because words ground the otherwise highly variable instances of an emotion category. We demonstrate this constitutive role by discussing findings from behavior, neuropsychology, development, and neuroimaging. We close by discussing implications of a constructionist view for the science of emotion.
Affiliation(s)
- Maria Gendron
- Department of Psychology, Boston College, USA; Department of Psychology, Northeastern University, USA
38
Abstract
Translational research on emotion in schizophrenia has revealed deficits in emotion perception and expression, as well as intact areas, including emotional experience and brain activation in the presence of emotionally evocative material. Yet, a closer look at emotional experience reveals that all is not well in the experience domain. People with schizophrenia have difficulty anticipating emotional events and maintaining or savoring their emotional experiences, as evidenced in behavioral, psychophysiological, and brain imaging studies. Furthermore, people with schizophrenia have difficulty integrating emotion perception with context and reporting on feelings that are differently valenced than presented emotional stimuli. Differences in brain activation are typically observed in areas tightly coupled with cognitive control, such as the dorsolateral prefrontal cortex, and thus the latest research on emotion in schizophrenia explicitly integrates emotion and cognition. Translational research holds promise to identify when in the course of the disorder emotion deficits emerge and to develop more effective interventions for schizophrenia.
Affiliation(s)
- Ann M Kring
- Department of Psychology, University of California, Berkeley, California 94720, USA.
39
Abstract
In our response, we clarify important theoretical differences between basic emotion and psychological construction approaches. We evaluate the empirical status of the basic emotion approach, addressing whether it requires brain localization, whether localization can be observed with better analytic tools, and whether evidence for basic emotions exists in other types of measures. We then revisit the issue of whether the key hypotheses of psychological construction are supported by our meta-analytic findings. We close by elaborating on commentator suggestions for future research.
40
Hamann S. Mapping discrete and dimensional emotions onto the brain: controversies and consensus. Trends Cogn Sci 2012; 16:458-66. [DOI: 10.1016/j.tics.2012.07.006] [Citation(s) in RCA: 133] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/13/2012] [Revised: 07/21/2012] [Accepted: 07/22/2012] [Indexed: 10/28/2022]
41
Abstract
Although research on the nonverbal expression of emotion has played a prominent role throughout psychology during the past two decades—including an instrumental role in the development of contemporary evolutionary psychology—little research has focused on the evolutionary origins and functions of the emotional expressions themselves. However, recent findings from psychophysical, comparative, social, and cross-cultural psychology are converging to produce a compelling functionalist account, suggesting that emotional expressions serve critical adaptive purposes. Most of these studies have narrowly focused on single emotions—an approach that has been very useful for providing new insights about specific expressions but not for developing a broader understanding of why humans universally display and recognize distinct emotions. Here we unify these disparate findings in order to illuminate this fundamental form of social communication.
42
Abstract
We appreciate Barrett's (2011, this issue) comments and her discussion of how our two-stage model is and is not consistent with Darwin's views on the evolution of emotion expressions. Like many pioneering books, Darwin's The Expression of the Emotions in Man and Animals represents a flurry of novel and revolutionary, yet often inconsistent, ideas, which lend themselves to different readings. However, while the historical perspective Barrett provides is useful, the scientific conversation on emotion expressions has evolved since Darwin. Here, we briefly discuss why the two alternative explanations Barrett offers for the origins of emotion expressions, expressions as cultural symbols and/or as evolutionary byproducts, are both untenable in light of existing research. We also note that although evidence for our two-stage model is currently incomplete, our goal was not to tell a complete story. Instead, we sought to offer the best emerging explanation for the existing research and provide a path for future empirical work that can test it.