1. Yang X, Wang W, Yang M, Qiu C, Xu Q. The influence of contextual uncertainty on facial expression processing: Evidence from behavior and ERPs. Biol Psychol 2024; 193:108861. [PMID: 39293553] [DOI: 10.1016/j.biopsycho.2024.108861]
Abstract
The brain helps individuals build expectations based on emotional prediction, facilitating the processing of faces in social interactions. Because of the intricacy of the environment, accurately predicting emotions remains difficult. Contextual uncertainty is a state characterized by the inability to predict when, how, and why events occur. It intensifies feelings and triggers adverse emotions such as anxiety, so comprehending its influence is important. The present study used event-related potentials (ERPs) to investigate the influence of contextual uncertainty on facial expression processing. We employed a novel S1-S2 paradigm, using scene images as S1 and faces as S2. From the learning phase to the testing phase, the certain-to-uncertain group (CER to UNC) experienced more unpredictability (increased uncertainty), whereas the uncertain-to-certain group (UNC to CER) experienced more predictability (decreased uncertainty), allowing dynamic alterations in predictive relationships to be manipulated. The behavioral results showed that valence ratings of neutral facial expressions were more negative in the CER to UNC group, which experienced increased contextual uncertainty. The ERP results showed more negative stimulus-preceding negativity (SPN) amplitudes and more positive late positive potential (LPP) amplitudes in the UNC to CER group with decreased contextual uncertainty than in the CER to UNC group with increased contextual uncertainty. These findings indicate that contextual uncertainty affects facial expression processing and contribute to a better understanding of contextual uncertainty.
Affiliation(s)
- Xinchao Yang
- Department and Institute of Psychology, Ningbo University, Ningbo 315211, China
- Weihan Wang
- Department and Institute of Psychology, Ningbo University, Ningbo 315211, China
- Mingkui Yang
- Department and Institute of Psychology, Ningbo University, Ningbo 315211, China
- Chunying Qiu
- Department and Institute of Psychology, Ningbo University, Ningbo 315211, China
- Qiang Xu
- Department and Institute of Psychology, Ningbo University, Ningbo 315211, China
2. Trujillo-Llano C, Sainz-Ballesteros A, Suarez-Ardila F, Gonzalez-Gadea ML, Ibáñez A, Herrera E, Baez S. Neuroanatomical markers of social cognition in neglected adolescents. Neurobiol Stress 2024; 31:100642. [PMID: 38800539] [PMCID: PMC11127280] [DOI: 10.1016/j.ynstr.2024.100642]
Abstract
Growing up in neglectful households can impact multiple aspects of social cognition. However, research on the effects of neglect on social cognition processes and their neuroanatomical correlates during adolescence is scarce. Here, we aimed to comprehensively assess social cognition processes (recognition of basic and contextual emotions, theory of mind, the experience of envy and Schadenfreude, and empathy for pain) and their structural brain correlates in adolescents with legal neglect records within family-based care. First, we compared neglected adolescents (n = 27) with control participants (n = 25) on context-sensitive social cognition tasks while controlling for physical and emotional abuse and for executive and intellectual functioning. Additionally, we explored the grey matter correlates of these domains through voxel-based morphometry. Compared to controls, neglected adolescents exhibited lower performance in contextual emotion recognition and theory of mind, higher levels of envy and Schadenfreude, and diminished empathy. Physical and emotional abuse and executive or intellectual functioning did not explain these effects. Moreover, social cognition scores correlated with grey matter volumes in regions subserving social cognition and emotional processing. Our results underscore the potential impact of neglect on different aspects of social cognition during adolescence and emphasize the need for preventive and intervention strategies to address these deficits in this population.
Affiliation(s)
- Catalina Trujillo-Llano
- Department of Neurology, Universitätsmedizin Greifswald, Greifswald, Germany
- Facultad de Psicología, Universidad Del Valle, Cali, Colombia
- Agustín Sainz-Ballesteros
- Department of Psychology, University of Tübingen, Tübingen, Germany
- Centre for Integrative Neuroscience, Tübingen, Germany
- Department for High-Field Magnetic Resonance, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- María Luz Gonzalez-Gadea
- Cognitive Neuroscience Center, Universidad de San Andres, Buenos Aires, Argentina
- National Scientific and Technical Research Council (CONICET), Buenos Aires, Argentina
- Agustín Ibáñez
- Cognitive Neuroscience Center, Universidad de San Andres, Buenos Aires, Argentina
- Latin American Brain Health (BrainLat), Universidad Adolfo Ibáñez, Santiago, Chile
- Global Brain Health Institute, University of California-San Francisco, San Francisco, CA, United States
- Trinity College Dublin, Dublin, Ireland
- Eduar Herrera
- Universidad Icesi, Departamento de Estudios Psicológicos, Cali, Colombia
- Sandra Baez
- Global Brain Health Institute, University of California-San Francisco, San Francisco, CA, United States
- Trinity College Dublin, Dublin, Ireland
- Universidad de Los Andes, Bogotá, Colombia
3. Huang X, Sun Y, Tao R, Yan K, Zhang E. Morality or competence is more important? The effect of evaluation dimensions on ERP responses to neutral faces depends on contextual valence and self-relevance. Int J Psychophysiol 2024; 200:112358. [PMID: 38710371] [DOI: 10.1016/j.ijpsycho.2024.112358]
Abstract
Recent studies have shown that the processing of neutral facial expressions can be modulated by the valence and self-relevance of preceding verbal evaluations. However, these studies have not distinguished the dimension of those evaluations (i.e., morality versus competence). In fact, there is an ongoing controversy about whether morality or competence receives more weight. Therefore, using the ERP technique, the current study addressed this issue by comparing the influence of morality and competence evaluations on behavioral and neural responses to neutral facial expressions when these evaluations varied in contextual valence and self-relevance. Our ERP results revealed that early EPN amplitudes were larger for neutral faces after evaluations about the self than after evaluations about the senders. Moreover, the EPN was more negative after a competence evaluation than after a morality evaluation when the evaluations were positive, whereas this effect was absent when they were negative. The late LPP was larger after a morality evaluation than after a competence evaluation when the evaluations were negative and directed at the self; no significant LPP difference between morality and competence evaluations was observed when they were positive. The present study extends previous work by showing that early and late stages of face processing are affected by the evaluation dimension in a top-down manner and are further modulated by contextual valence and self-relevance.
Affiliation(s)
- Xiaoyang Huang
- Institute of Cognition, Brain & Health, Henan University, Kaifeng, China
- Institute of Psychology and Behavior, Henan University, Kaifeng, China
- Yuliu Sun
- Zhengzhou University of Railway Engineering, Zhengzhou, China
- Ruiwen Tao
- Center for Magnetic Resonance Imaging Research & Key Laboratory of Applied Brain and Cognitive Sciences, Shanghai International Studies University, Shanghai, China
- College of International Business, Shanghai International Studies University, Shanghai, China
- Kaikai Yan
- Center for Magnetic Resonance Imaging Research & Key Laboratory of Applied Brain and Cognitive Sciences, Shanghai International Studies University, Shanghai, China
- College of International Business, Shanghai International Studies University, Shanghai, China
- Entao Zhang
- Institute of Cognition, Brain & Health, Henan University, Kaifeng, China
- Institute of Psychology and Behavior, Henan University, Kaifeng, China
4. Brooks JA, Kim L, Opara M, Keltner D, Fang X, Monroy M, Corona R, Tzirakis P, Baird A, Metrick J, Taddesse N, Zegeye K, Cowen AS. Deep learning reveals what facial expressions mean to people in different cultures. iScience 2024; 27:109175. [PMID: 38433918] [PMCID: PMC10906517] [DOI: 10.1016/j.isci.2024.109175]
Abstract
Cross-cultural studies of the meaning of facial expressions have largely focused on judgments of small sets of stereotypical images by small numbers of people. Here, we used large-scale data collection and machine learning to map what facial expressions convey in six countries. Using a mimicry paradigm, 5,833 participants formed facial expressions found in 4,659 naturalistic images, resulting in 423,193 participant-generated facial expressions. In their own language, participants also rated each expression in terms of 48 emotions and mental states. A deep neural network tasked with predicting the culture-specific meanings people attributed to facial movements while ignoring physical appearance and context discovered 28 distinct dimensions of facial expression, with 21 dimensions showing strong evidence of universality and the remainder showing varying degrees of cultural specificity. These results capture the underlying dimensions of the meanings of facial expressions within and across cultures in unprecedented detail.
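The idea of discovering a small number of latent dimensions underlying a large ratings matrix can be loosely illustrated with a classical technique. The sketch below uses principal component analysis on synthetic data to recover how many dimensions explain most of the variance in an expressions-by-emotion-scales matrix; this is a generic illustration, not the authors' deep-learning pipeline, and all sizes and variable names are made up.

```python
import numpy as np

# Synthetic stand-in for a ratings matrix: expressions x 48 emotion scales.
rng = np.random.default_rng(42)
n_expr, n_scales, n_latent = 300, 48, 5

# Ratings driven by a small number of latent dimensions plus noise.
latent = rng.normal(size=(n_expr, n_latent))
loadings = rng.normal(size=(n_latent, n_scales))
ratings = latent @ loadings + 0.1 * rng.normal(size=(n_expr, n_scales))

# PCA via SVD of the mean-centered matrix.
centered = ratings - ratings.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
var_explained = s**2 / np.sum(s**2)

# Number of components needed to explain 95% of the variance.
n_dims = int(np.searchsorted(np.cumsum(var_explained), 0.95) + 1)
```

With low noise, the recovered dimensionality matches the number of planted latent factors; the study's deep network additionally had to ignore physical appearance and context, which PCA cannot do.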
Affiliation(s)
- Jeffrey A. Brooks
- Research Division, Hume AI, New York, NY 10010, USA
- Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Lauren Kim
- Research Division, Hume AI, New York, NY 10010, USA
- Dacher Keltner
- Research Division, Hume AI, New York, NY 10010, USA
- Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Xia Fang
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, Zhejiang, China
- Maria Monroy
- Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Rebecca Corona
- Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
- Alice Baird
- Research Division, Hume AI, New York, NY 10010, USA
- Alan S. Cowen
- Research Division, Hume AI, New York, NY 10010, USA
- Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA
5. Chiu CD, Lo APK, Mak FKL, Hui KH, Lynn SJ, Cheng SK. Remember walking in their shoes? The relation of self-referential source memory and emotion recognition. Cogn Emot 2024; 38:120-130. [PMID: 37882206] [DOI: 10.1080/02699931.2023.2274040]
Abstract
Deficits in the ability to read the emotions of others have been demonstrated in mental disorders that involve a distorted sense of self, such as dissociation and schizophrenia. This study examined whether weakened self-referential source memory, i.e., being unable to remember whether a piece of information was processed with reference to oneself, is linked to ineffective emotion recognition. In two samples, one from a college and one from the community, we quantified participants' ability to remember the self-generated versus non-self-generated origins of sentences they had previously read or partially generated. We also measured their ability to read others' emotions accurately when viewing photos of people in affect-charged situations. Multinomial processing tree modelling was applied to obtain a measure of self-referential source memory that was not biased by non-mnemonic factors. Our first experiment, with college participants, revealed a positive correlation between correctly remembering the origins of sentences and accurately recognising the emotions of others. This correlation was replicated in the second experiment with community participants. The current study offers evidence of a link between self-referential source memory and emotion recognition.
Affiliation(s)
- Chui-De Chiu
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong S.A.R., People's Republic of China
- Alfred Pak-Kwan Lo
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong S.A.R., People's Republic of China
- Frankie Ka-Lun Mak
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong S.A.R., People's Republic of China
- Kam-Hei Hui
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong S.A.R., People's Republic of China
- Steven Jay Lynn
- Department of Psychology, Binghamton University, Binghamton, NY, USA
- Shih-Kuen Cheng
- Institute of Cognitive Neuroscience, National Central University, Taoyuan, Taiwan
6. Ziereis A, Schacht A. Gender congruence and emotion effects in cross-modal associative learning: Insights from ERPs and pupillary responses. Psychophysiology 2023; 60:e14380. [PMID: 37387451] [DOI: 10.1111/psyp.14380]
Abstract
Social and emotional cues from faces and voices are highly relevant and have reliably been demonstrated to attract attention involuntarily. However, findings are mixed as to what degree associating emotional valence with faces occurs automatically. In the present study, we tested whether inherently neutral faces gain additional relevance by being conditioned with positive, negative, or neutral vocal affect bursts. During learning, participants performed a gender-matching task on face-voice pairs without explicit emotion judgments of the voices. In the test session on a subsequent day, only the previously associated faces were presented and had to be categorized by gender. We analyzed event-related potentials (ERPs), pupil diameter, and response times (RTs) of N = 32 subjects. Emotion effects were found in auditory ERPs and RTs during the learning session, suggesting that task-irrelevant emotion was automatically processed. However, ERPs time-locked to the conditioned faces were modulated mainly by the task-relevant information, that is, the gender congruence of face and voice, but not by emotion. Importantly, these ERP and RT effects of learned congruence were not limited to learning but extended to the test session, that is, after the auditory stimuli were removed. These findings indicate successful associative learning in our paradigm, but the learning did not extend to the task-irrelevant dimension of emotional relevance. Cross-modal associations of emotional relevance may therefore not be completely automatic, even though the emotion was processed in the voice.
Affiliation(s)
- Annika Ziereis
- Department for Cognition, Emotion and Behavior, Affective Neuroscience and Psychophysiology Laboratory, Institute of Psychology, Georg-August-University of Göttingen, Göttingen, Germany
- Anne Schacht
- Department for Cognition, Emotion and Behavior, Affective Neuroscience and Psychophysiology Laboratory, Institute of Psychology, Georg-August-University of Göttingen, Göttingen, Germany
7. Li Z, Li K, Liu Y, Gong M, Shang J, Liu W, Liu Y, Jiang Z. Semantic satiation of emotional words impedes facial expression processing in two stages. Heliyon 2023; 9:e18341. [PMID: 37539095] [PMCID: PMC10395535] [DOI: 10.1016/j.heliyon.2023.e18341]
Abstract
To explore the mechanism by which semantic satiation of emotional words affects facial expression processing, participants judged the facial expression (happiness or sadness) presented after an emotional word ("cry" or "smile") or a neutral word ("Ah"; baseline condition) had been shown for 20 s. The results revealed that participants were slower in judging valence-congruent facial expressions, which elicited a larger (Experiment 1) and more prolonged (Experiment 2) N170 component than in the baseline condition. No significant differences in behavior or N170 appeared between the valence-incongruent and baseline conditions. However, the amplitude of the late positive complex (LPC) under both the valence-congruent and valence-incongruent conditions was smaller than in the baseline condition. This indicates that, in the early stage, the impeding effect of satiated emotional words is specifically constrained to facial expressions with the same emotional valence; in the late stage, this impeding effect may spread to facial expressions with the opposite valence of the satiated emotional word.
Affiliation(s)
- Zhao Li
- School of Psychology, Liaoning Normal University, Dalian, China
- Kewei Li
- Tianjin Vocational College of Sports, Tianjin, China
- Ying Liu
- School of Psychology, Liaoning Normal University, Dalian, China
- Mingliang Gong
- School of Psychology, Jiangxi Normal University, Nanchang, China
- Junchen Shang
- School of Humanities, Southeast University, Nanjing, China
- Wen Liu
- School of Psychology, Liaoning Normal University, Dalian, China
- Yangtao Liu
- School of Psychology, Liaoning Normal University, Dalian, China
- Zhongqing Jiang
- School of Psychology, Liaoning Normal University, Dalian, China
8. Ortega J, Chen Z, Whitney D. Inferential Emotion Tracking reveals impaired context-based emotion processing in individuals with high Autism Quotient scores. Sci Rep 2023; 13:8093. [PMID: 37208368] [DOI: 10.1038/s41598-023-35371-6]
Abstract
Emotion perception is essential for successful social interactions and for maintaining long-term relationships with friends and family. Individuals with autism spectrum disorder (ASD) experience social communication deficits and report difficulties in facial expression recognition. However, emotion recognition depends on more than processing facial expression; context is critical for correctly inferring the emotions of others. Whether context-based emotion processing is affected in autism remains unclear. Here, we used a recently developed context-based emotion perception task, Inferential Emotion Tracking (IET), to investigate whether individuals who scored high on the Autism Spectrum Quotient (AQ) show deficits in context-based emotion perception. Using 34 videos (including Hollywood movies, home videos, and documentaries), we tested 102 participants as they continuously tracked the affect (valence and arousal) of a blurred-out, invisible character. We found that individual differences in AQ scores were more strongly correlated with IET task accuracy than with traditional face emotion perception tasks. This correlation remained significant even when controlling for potential covarying factors, general intelligence, and performance on traditional face perception tasks. These findings suggest that individuals with ASD may have impaired perception of contextual information; they underscore the importance of developing ecologically relevant emotion perception tasks to better assess and treat ASD, and they open a new direction for research on context-based emotion perception deficits in ASD.
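The covariate-controlled correlation reported above can be illustrated with a partial correlation: regress the covariates out of both variables and correlate the residuals. The sketch below uses synthetic data and is a generic illustration, not the authors' analysis; all variable names and effect sizes are invented.

```python
import numpy as np

def partial_corr(x, y, covariates):
    """Correlate x and y after regressing covariates out of both.

    x, y: 1-D arrays; covariates: 2-D array (n_samples, n_covariates).
    """
    # Design matrix with an intercept column.
    Z = np.column_stack([np.ones(len(x)), covariates])
    # Residualize x and y via least squares.
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

# Synthetic demo: a trait score and task accuracy share a covariate
# (e.g., a general-ability measure) as well as a direct negative link.
rng = np.random.default_rng(0)
n = 500
iq = rng.normal(size=n)                              # covariate
aq = 0.5 * iq + rng.normal(size=n)                   # predictor
acc = -0.4 * aq + 0.5 * iq + rng.normal(size=n)      # outcome

raw_r = float(np.corrcoef(aq, acc)[0, 1])
part_r = partial_corr(aq, acc, iq.reshape(-1, 1))
```

Here the shared covariate masks part of the direct effect, so the partial correlation is more negative than the raw one; a surviving partial correlation is the pattern the study reports.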
Affiliation(s)
- Jefferson Ortega
- Department of Psychology, University of California, Berkeley, CA, 94720, USA
- Zhimin Chen
- Department of Psychology, University of California, Berkeley, CA, 94720, USA
- David Whitney
- Department of Psychology, University of California, Berkeley, CA, 94720, USA
- Vision Science Program, University of California, Berkeley, CA, 94720, USA
- Helen Wills Neuroscience Institute, University of California, Berkeley, CA, 94720, USA
9. Effects of social context on facial trustworthiness judgments. Curr Psychol 2022. [DOI: 10.1007/s12144-022-04143-2]
10. Gagnon M, Chérif L, Roy-Charland A. Contextual cues about reciprocity impact ratings of smile sincerity. Cogn Emot 2022; 36:1181-1195. [PMID: 35731119] [DOI: 10.1080/02699931.2022.2090903]
Abstract
Research has shown that context influences how sincere a smile appears to observers. That said, most studies on this topic have focused exclusively on situational cues (e.g. smiling while at a party versus smiling during a job interview) and few have examined other elements of context. One important element concerns any knowledge an observer might have about the smiler as an individual (e.g. their habitual behaviours, traits or attitudes). In this manuscript, we present three experiments that explored the influence of such knowledge on ratings of smile sincerity. In Experiments 1 and 2, participants rated the sincerity of Duchenne and non-Duchenne smiles after having been exposed to cues about the smiler's tendency to reciprocate (this person always, never or occasionally returns favours). In Experiment 3 they performed the same task but with cues about the smiler's love of learning (this person always, never or occasionally enjoys learning new tasks). The results show that cues about the smiler's reciprocity tendency influenced participants' ratings of smile sincerity and did so in a stronger manner than cues about the smiler's love of learning. Overall, these results both strengthen and broaden the literature on the role of context on judgements of smile sincerity.
Affiliation(s)
- Mathieu Gagnon
- Department of Military Psychology and Leadership, Royal Military College of Canada, Kingston, ON, Canada
- Lobna Chérif
- Department of Military Psychology and Leadership, Royal Military College of Canada, Kingston, ON, Canada
11. Chen Y, Xu Q, Fan C, Wang Y, Jiang Y. Eye gaze direction modulates nonconscious affective contextual effect. Conscious Cogn 2022; 102:103336. [DOI: 10.1016/j.concog.2022.103336]
12. An S, Zhao M, Qin F, Zhang H, Mao W. High Emotional Similarity Will Enhance the Face Memory and Face-Context Associative Memory. Front Psychol 2022; 13:877375. [PMID: 35615173] [PMCID: PMC9126175] [DOI: 10.3389/fpsyg.2022.877375]
Abstract
Previous research has explored how emotional valence (positive or negative) affected face-context associative memory, while little is known about how arousing stimuli that share the same valence but differ in emotionality are bound together and retained in memory. In this study, we manipulated the emotional similarity between the target face and the face associated with the context emotion (i.e., congruent, high similarity, and low similarity), and examined the effect of emotional similarity of negative emotion (i.e., disgust, anger, and fear) on face-context associative memory. Our results showed that the greater the emotional similarity between the faces, the better the face memory and face-context associative memory were. These findings suggest that the processing of facial expression and its associated context may benefit from taking into account the emotional similarity between the faces.
Affiliation(s)
- Weibin Mao
- School of Psychology, Shandong Normal University, Jinan, China
13. Nudelman MF, Portugal LCL, Mocaiber I, David IA, Rodolpho BS, Pereira MG, de Oliveira L. Long-Term Influence of Incidental Emotions on the Emotional Judgment of Neutral Faces. Front Psychol 2022; 12:772916. [PMID: 35069355] [PMCID: PMC8773088] [DOI: 10.3389/fpsyg.2021.772916]
Abstract
Background: Evidence indicates that the processing of facial stimuli may be influenced by incidental factors, and these influences are particularly powerful when facial expressions are ambiguous, as with neutral faces. However, limited research has investigated whether emotional contextual information presented in a preceding, unrelated experiment can carry over to another experiment and modulate neutral face processing. Objective: The present study investigated whether an emotional text presented in a first experiment could generate negative emotion toward neutral faces in a second, unrelated experiment. Methods: Ninety-nine students (all women) were randomly assigned to read and evaluate either a negative text (negative context) or a neutral text (neutral context) in the first experiment. In the subsequent second experiment, participants performed two tasks: (1) an attentional task in which neutral faces were presented as distractors and (2) an emotional judgment task on neutral faces. Results: Compared to the neutral context, participants in the negative context rated more faces as negative. No significant result was found in the attentional task. Conclusion: Our study demonstrates that incidental emotional information from a previous experiment can increase participants' propensity to interpret neutral faces as more negative when emotional information is directly evaluated. The present study thus adds important evidence that our behavior, judgments, and emotions are modulated by previous information acquired incidentally and with little awareness, as occurs in everyday life.
Affiliation(s)
- Marta F Nudelman
- Laboratory of Neurophysiology of Behaviour, Department of Physiology and Pharmacology, Biomedical Institute, Universidade Federal Fluminense, Niterói, Brazil
- Liana C L Portugal
- Laboratory of Neurophysiology of Behaviour, Department of Physiology and Pharmacology, Biomedical Institute, Universidade Federal Fluminense, Niterói, Brazil
- Department of Physiological Sciences, Biomedical Center, Roberto Alcantara Gomes Biology Institute, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, Brazil
- Izabela Mocaiber
- Laboratory of Cognitive Psychophysiology, Department of Natural Sciences, Institute of Humanities and Health, Universidade Federal Fluminense, Rio das Ostras, Brazil
- Isabel A David
- Laboratory of Neurophysiology of Behaviour, Department of Physiology and Pharmacology, Biomedical Institute, Universidade Federal Fluminense, Niterói, Brazil
- Beatriz S Rodolpho
- Laboratory of Neurophysiology of Behaviour, Department of Physiology and Pharmacology, Biomedical Institute, Universidade Federal Fluminense, Niterói, Brazil
- Mirtes G Pereira
- Laboratory of Neurophysiology of Behaviour, Department of Physiology and Pharmacology, Biomedical Institute, Universidade Federal Fluminense, Niterói, Brazil
- Leticia de Oliveira
- Laboratory of Neurophysiology of Behaviour, Department of Physiology and Pharmacology, Biomedical Institute, Universidade Federal Fluminense, Niterói, Brazil
14. Yonemitsu F, Sasaki K, Gobara A, Yamada Y. The clone devaluation effect: does duplication of local facial features matter? BMC Res Notes 2021; 14:400. [PMID: 34715916] [PMCID: PMC8555204] [DOI: 10.1186/s13104-021-05815-1]
Abstract
Objective: The clone devaluation effect is a recently reported phenomenon in which eeriness is evoked when people observe individuals with the same face (clone faces) compared with individuals with different faces. Two possibilities could explain the effect: either the duplicated facial features shared by clone faces induce it, or the duplication of identities between people with clone faces is what matters. We therefore conducted an experiment to investigate whether the duplication of identities or of facial features induces the clone devaluation effect. Results: Participants evaluated the eeriness of scrambled clone faces and scrambled different faces using the paired comparison method. There was only a slight difference in subjective eeriness between scrambled clone faces and scrambled different faces. This study therefore suggests that the duplication of local facial features does not play a key role in inducing the clone devaluation effect.
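Paired comparison judgments like those above are commonly scaled with a Bradley-Terry model, which converts pairwise choice counts into a strength score per stimulus. The sketch below fits the model with the standard iterative (minorization-maximization) update on made-up counts; it is a generic illustration, not the authors' analysis.

```python
def bradley_terry(wins, n_items, iters=200):
    """Fit Bradley-Terry strengths from a win-count matrix.

    wins[i][j] = number of times item i was chosen (e.g., as eerier) over item j.
    Returns strengths normalized to sum to 1; higher = chosen more often.
    """
    p = [1.0] * n_items
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # Total wins of item i, and the MM denominator over its opponents.
            num = sum(wins[i][j] for j in range(n_items) if j != i)
            den = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                      for j in range(n_items) if j != i)
            new_p.append(num / den if den > 0 else p[i])
        s = sum(new_p)
        p = [v / s for v in new_p]
    return p

# Hypothetical counts: stimulus 0 chosen as eerier than stimulus 1 on
# 15 of 20 trials, and than stimulus 2 on 12 of 20 (all numbers invented).
wins = [[0, 15, 12],
        [5, 0, 10],
        [8, 10, 0]]
strengths = bradley_terry(wins, 3)
```

With balanced comparison counts, the fitted ordering matches the total-wins ordering, so stimulus 0 comes out strongest here.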
Affiliation(s)
- Fumiya Yonemitsu
- Graduate School of Humanities and Social Sciences, Hiroshima University, 1-7-1 Kagamiyama, Higashihiroshima, Hiroshima, 739-8521, Japan. .,Japan Society for the Promotion of Science, Kojimachi Business Center Building, 5-3-1 Kojimachi, Chiyoda-ku, Tokyo, 102-0083, Japan.
| | - Kyoshiro Sasaki
- Faculty of Informatics, Kansai University, 2-1-1 Ryozenji-cho, Takatsuki, Osaka, 569-1095, Japan
| | - Akihiko Gobara
- Faculty of Arts and Science, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka, 819-0395, Japan.,BKC Research Organization of Social Sciences, Ritsumeikan University, 1-1-1 Noji-higashi, Kusatsu, Shiga, 525-8577, Japan
| | - Yuki Yamada
- Faculty of Arts and Science, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka, 819-0395, Japan
15
Li Z, Zhu P, Liu Y, Jiang Z. Gender Word Semantic Satiation Inhibits Facial Gender Information Processing. J Psychophysiol 2021. [DOI: 10.1027/0269-8803/a000274]
Abstract
To explore the time course of the influence of gender-word semantic satiation on facial gender information processing, the semantic satiation paradigm was used: satiation was induced by presenting the Chinese gender words “男, 女 (Male, Female)” for a long duration (25 s), with the conjunction words “及 (And), 且 (Moreover)” serving as the baseline (the Chinese words and their English translations are not completely equivalent in pronunciation, form, and sense). Participants judged whether two simultaneously presented faces (Experiment 1) or two successively presented faces (Experiment 2) were of the same gender. The results of Experiment 1 showed that response times in the semantic satiation condition were significantly longer than in the baseline condition. The event-related potential (ERP) results of Experiment 2 showed that, in the early stage of face processing, the peak amplitude of the P1 component was significantly smaller in the semantic satiation condition than in the baseline condition; the N170, a face-specific perceptual component, was significantly larger in the semantic satiation condition than in the baseline condition; and the average amplitude of the LPC was significantly smaller in the semantic satiation condition than in the baseline condition. This study shows that facial gender information processing is affected by semantic contextual information. The inhibitory effect of gender-word semantic satiation on facial gender information processing begins at the attention-orientation stage, continues through the face structural-encoding stage, and ends at the advanced cognitive-response stage.
Affiliation(s)
- Zhao Li
- School of Psychology, Liaoning Normal University, Dalian, PR China
- Peng Zhu
- School of Teacher Education, Huzhou University, Huzhou, PR China
- Ying Liu
- School of Psychology, Liaoning Normal University, Dalian, PR China
- Zhongqing Jiang
- School of Psychology, Liaoning Normal University, Dalian, PR China
16
Korb S, Deniz TC, Ünal B, Clarke A, Silani G. Emotion perception bias associated with the hijab in Austrian and Turkish participants. Q J Exp Psychol (Hove) 2021; 75:796-807. [PMID: 34507515] [PMCID: PMC8958558] [DOI: 10.1177/17470218211048317]
Abstract
In a cross-cultural study, we investigated the link between explicit attitudes towards the hijab and implicit measures of cultural and religious bias during the recognition of emotions. Participants tested in Austria (N = 71) and in Turkey (N = 70) reported their attitude towards the hijab and, in a mouse-tracker task, categorised happy and sad faces of women shown at five levels of intensity and framed either by a hijab or by an oval-shaped mask. The two samples did not differ in their explicit attitudes towards the hijab. However, a negative attitude towards the hijab predicted greater sadness attribution to happy faces with the hijab in Austrian participants. Unrelated to their explicit attitudes, Turkish participants attributed more sadness to happy faces with than without the hijab. Results suggest that the sight of the hijab activated, in both Austrian and Turkish participants, implicit biases resulting in associations with sadness and negative emotions.
Affiliation(s)
- Sebastian Korb
- Department of Psychology, University of Essex, Colchester, UK
- Department of Cognition, Emotion, and Methods in Psychology, University of Vienna, Wien, Austria
- Correspondence: Sebastian Korb, Department of Psychology, University of Essex, Wivenhoe Park, Colchester CO4 3SQ, UK
- Tugba Ceren Deniz
- Faculty of Arts and Sciences, Department of Psychology, TED University, Ankara, Turkey
- Bengi Ünal
- Faculty of Arts and Sciences, Department of Psychology, TED University, Ankara, Turkey
- Alasdair Clarke
- Department of Psychology, University of Essex, Colchester, UK
- Giorgia Silani
- Department of Clinical and Health Psychology, University of Vienna, Wien, Austria
17
Ong CW, Ito K. Can't fight seeing sadness in tears: Measuring the implicit association between tears and sadness. Br J Soc Psychol 2021; 61:672-687. [PMID: 34569070] [DOI: 10.1111/bjso.12503]
Abstract
Visible tears have been shown to enhance the perception of sadness. Whether the sadness perception from visible tears can occur automatically, which is essential for the rapid identification of emotional cues in real-life social interactions, is still unclear. We employed the reaction-time-based Implicit Association Test (IAT) to assess the implicit association of tears and sadness in two studies. Study 1 (N = 58) used sadness/non-sadness or negative/positive affect words as attribute pairs and images of tearless or tearful neutral expressions as targeted concepts. In Study 2 (N = 54), the neutral expressions were replaced with anger, disgust, fear, surprise, and happiness expressions with or without tears. Both studies revealed a strong tendency among participants to implicitly associate tears with sadness and negative affect. The results complemented findings from self-report measures by showing that the perception of sadness from visible tears can occur efficiently with little control.
Affiliation(s)
- Chew Wei Ong
- School of Social Sciences, Nanyang Technological University, Singapore
- Kenichi Ito
- School of Social Sciences, Nanyang Technological University, Singapore
18
The Sexual OBjectification and EMotion database: A free stimulus set and norming data of sexually objectified and non-objectified female targets expressing multiple emotions. Behav Res Methods 2021; 54:541-555. [PMID: 34291433] [PMCID: PMC9046321] [DOI: 10.3758/s13428-021-01640-3]
Abstract
Sexual objectification - perceiving or treating a woman as a sexual object - is a widespread phenomenon. Studies on sexual objectification and its consequences have grown dramatically over the last decades covering multiple and diverse areas of research. However, research studying sexual objectification might have limited internal and external validity due to the lack of a controlled and standardized picture database. Moreover, there is a need to extend this research to other fields including the study of emotions. Therefore, in this paper we introduce the SOBEM Database, a free tool consisting of 280 high-resolution pictures depicting objectified and non-objectified female models expressing a neutral face and three different emotions (happiness, anger, and sadness) with different intensity. We report the validation of this dataset by analyzing results of 134 participants judging pictures on the six basic emotions and on a range of social judgments related to sexual objectification. Results showed how the SOBEM can constitute an appropriate instrument to study both sexual objectification per se and its relation with emotions. This database could therefore become an important instrument able to improve the experimental control in future studies on sexual objectification and to create new links with different fields of research.
19
Israelashvili J, Perry A. Nuancing Perspective. Soc Psychol 2021. [DOI: 10.1027/1864-9335/a000452]
Abstract
Two experiments manipulated participants' familiarity with another person and examined their subsequent ability to understand that person's emotions. To gain familiarity, participants watched several videos of the target sharing experiences and rated her emotions. In the Feedback condition, perceivers learned the actual emotions the target felt; in the Control condition, perceivers completed identical recognition tasks but did not see the target's own emotion ratings. Across studies (Ntotal = 398; one preregistered), the Feedback group was more accurate than the Control group in later understanding of the target's emotions. The results provide a proof-of-concept demonstration that brief preliminary learning about another person's past emotional experiences can yield a more accurate understanding of that person in the future.
Affiliation(s)
- Anat Perry
- Department of Psychology, The Hebrew University of Jerusalem, Israel
20
Minton AR, Mienaltowski A. More than Face Value: Context and Age Differences in Negative Emotion Discrimination. J Nonverbal Behav 2021. [DOI: 10.1007/s10919-021-00369-z]
21
Matyjek M, Kroczek B, Senderecka M. Socially induced negative affective knowledge modulates early face perception but not gaze cueing of attention. Psychophysiology 2021; 58:e13876. [PMID: 34110019] [PMCID: PMC8459251] [DOI: 10.1111/psyp.13876]
Abstract
Prior affective and social knowledge about other individuals has been shown to modulate the perception of their faces and gaze-related attentional processes. However, it remains unclear whether emotionally charged knowledge acquired through interactive social learning also modulates face processing and attentional control. The aim of this study was therefore to test whether affective knowledge induced through social interactions in a naturalistic exchange game can influence early stages of face processing and attentional shifts in a subsequent gaze-cueing task. As indicated by self-reported ratings, the game was successful in inducing valenced affective knowledge towards positive and negative players. In the subsequent task, in which the locations of future targets were cued by the gaze of the game players, we observed enhanced early neural activity (a larger amplitude of the P1 component) in response to a photograph of the negative player. This indicates that negative affective knowledge about an individual modulates very early stages of the processing of that individual's face. Our study contributes to the existing literature by providing further evidence for the saliency of interactive social exchange paradigms used to induce affective knowledge, and it extends previous research by demonstrating a very early modulation of perception by socially learned affective knowledge. It also offers increased ecological validity due to the use of naturalistic social exchange in the study design. In sum, this research complements previous evidence that experimentally induced socio-affective knowledge about other individuals modulates the processing of their faces, shows that negative (but not positive) affect enhances very early face processing (the P1), and provides an effective affect-induction tool, an interactive social exchange game, which offers increased social ecological validity in experimental settings.
Affiliation(s)
- Bartłomiej Kroczek
- Faculty of Mathematics and Computer Science, Jagiellonian University, Kraków, Poland
22
Lecker M, Aviezer H. More than Words? Semantic Emotion Labels Boost Context Effects on Faces. Affect Sci 2021; 2:163-170. [PMID: 36043174] [PMCID: PMC9382963] [DOI: 10.1007/s42761-021-00043-z]
Abstract
Semantic emotion labels can influence the recognition of isolated facial expressions. However, it is unknown whether labels also influence the susceptibility of facial expressions to context. To examine this, participants categorized expressive faces presented with emotionally congruent or incongruent bodies serving as context. Face-body composites were presented either aligned in their natural form or spatially misaligned, with the head shifted horizontally beside the body, a condition known to reduce the contextual impact of the body on the face. Critically, participants responded either by choosing emotion labels or by perceptually matching the target expression with expression probes. The results show a label dominance effect: face-body congruency effects were larger with semantic labels than with perceptual expression matching, indicating that facial expressions are more prone to contextual influence when categorized with emotion labels; this effect was found only when faces and bodies were aligned. These findings suggest that the role of conceptual language in face-body context effects may be larger than previously assumed.
Affiliation(s)
- Maya Lecker
- Department of Psychology, Hebrew University of Jerusalem, Mt. Scopus, 9190501 Jerusalem, Israel
- Hillel Aviezer
- Department of Psychology, Hebrew University of Jerusalem, Mt. Scopus, 9190501 Jerusalem, Israel
23
Rodger H, Lao J, Stoll C, Richoz AR, Pascalis O, Dye M, Caldara R. The recognition of facial expressions of emotion in deaf and hearing individuals. Heliyon 2021; 7:e07018. [PMID: 34041389] [PMCID: PMC8141778] [DOI: 10.1016/j.heliyon.2021.e07018]
Abstract
During real-life interactions, facial expressions of emotion are perceived dynamically with multimodal sensory information. In the absence of auditory sensory channel inputs, it is unclear how facial expressions are recognised and internally represented by deaf individuals. Few studies have investigated facial expression recognition in deaf signers using dynamic stimuli, and none have included all six basic facial expressions of emotion (anger, disgust, fear, happiness, sadness, and surprise) with stimuli fully controlled for their low-level visual properties, leaving the question of whether or not a dynamic advantage for deaf observers exists unresolved. We hypothesised, in line with the enhancement hypothesis, that the absence of auditory sensory information might have forced the visual system to better process visual (unimodal) signals, and predicted that this greater sensitivity to visual stimuli would result in better recognition performance for dynamic compared to static stimuli, and for deaf-signers compared to hearing non-signers in the dynamic condition. To this end, we performed a series of psychophysical studies with deaf signers with early-onset severe-to-profound deafness (dB loss >70) and hearing controls to estimate their ability to recognize the six basic facial expressions of emotion. Using static, dynamic, and shuffled (randomly permuted video frames of an expression) stimuli, we found that deaf observers showed similar categorization profiles and confusions across expressions compared to hearing controls (e.g., confusing surprise with fear). In contrast to our hypothesis, we found no recognition advantage for dynamic compared to static facial expressions for deaf observers. This observation shows that the decoding of dynamic facial expression emotional signals is not superior even in the deaf expert visual system, suggesting the existence of optimal signals in static facial expressions of emotion at the apex. 
Deaf individuals match hearing individuals in the recognition of facial expressions of emotion.
Affiliation(s)
- Helen Rodger
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Junpeng Lao
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Chloé Stoll
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes, France
- Olivier Pascalis
- Laboratoire de Psychologie et de Neurocognition (CNRS-UMR5105), Université Grenoble-Alpes, France
- Matthew Dye
- National Technical Institute for the Deaf, Rochester Institute of Technology, Rochester, New York, USA
- Roberto Caldara
- Department of Psychology, University of Fribourg, Fribourg, Switzerland
24
McElvaney TJ, Osman M, Mareschal I. Perceiving threat in others: The role of body morphology. PLoS One 2021; 16:e0249782. [PMID: 33831099] [PMCID: PMC8031394] [DOI: 10.1371/journal.pone.0249782]
Abstract
People make judgments of others based on appearance, and these inferences can affect social interactions. Although the importance of facial appearance in these judgments is well established, the impact of body morphology remains unclear. Specifically, it is unknown whether experimentally varied body morphology affects the perception of threat in others. In two preregistered experiments (N = 250), participants judged the perceived threat of body stimuli of varying morphology, both in the absence (Experiment 1) and presence (Experiment 2) of facial information. Bodies were perceived as more threatening as they increased in mass with added musculature and portliness, and less threatening as they increased in emaciation. The impact of musculature endured even in the presence of faces, although faces contributed more to the overall threat judgment. The relative contributions of faces and bodies seemed to be driven by discordance, such that threatening faces exerted the most influence when paired with non-threatening bodies, and vice versa. This suggests that faces and bodies were not perceived as entirely independent, separate components. Overall, these findings suggest that body morphology plays an important role in perceived threat and may bias real-world judgments.
Affiliation(s)
- Terence J. McElvaney
- Department of Biological and Experimental Psychology, School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
- Magda Osman
- Department of Biological and Experimental Psychology, School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
- Isabelle Mareschal
- Department of Biological and Experimental Psychology, School of Biological and Chemical Sciences, Queen Mary University of London, London, United Kingdom
25
Fugate JMB, Franco CL. Implications for Emotion: Using Anatomically Based Facial Coding to Compare Emoji Faces Across Platforms. Front Psychol 2021; 12:605928. [PMID: 33716870] [PMCID: PMC7947884] [DOI: 10.3389/fpsyg.2021.605928]
Abstract
Emoji faces, which are ubiquitous in our everyday communication, are thought to resemble human faces and aid emotional communication. Yet few studies examine whether emojis are perceived as a particular emotion and whether that perception changes based on rendering differences across electronic platforms. The current paper draws upon emotion theory to evaluate whether emoji faces depict the anatomical differences proposed to differentiate human depictions of emotion (hereafter, “facial expressions”). We modified the existing Facial Action Coding System (FACS) (Ekman and Rosenberg, 1997) to apply to emoji faces. An equivalent “emoji FACS” rubric allowed us to evaluate two important questions. First, anatomically, does the same emoji face “look” the same across platforms and versions? Second, do emoji faces perceived as a particular emotion category resemble the proposed human facial expression for that emotion? To answer these questions, we compared the anatomically based codes for 31 emoji faces across three platforms and two version updates. We then compared those codes to the proposed human facial expression prototype for the emotion perceived within the emoji face. Overall, emoji faces across platforms and versions were not anatomically equivalent. Moreover, the majority of emoji faces did not conform to human facial expressions for an emotion, although basic anatomical codes were shared among human and emoji faces. Some emotion categories were better predicted by the assortment of anatomical codes than others, with some individual differences among platforms. We discuss theories of emotion that help explain how emoji faces are perceived as an emotion, even when anatomical differences are not always consistent or specific to an emotion.
Affiliation(s)
- Jennifer M B Fugate
- Department of Psychology, University of Massachusetts Dartmouth, Dartmouth, MA, United States
- Courtny L Franco
- Department of Communication and Information Science, University of Alabama, Tuscaloosa, AL, United States
26
Camilo C, Vaz Garrido M, Calheiros MM. Recognizing children's emotions in child abuse and neglect. Aggress Behav 2021; 47:161-172. [PMID: 33164223] [DOI: 10.1002/ab.21935]
Abstract
Past research has suggested that parents' ability to recognize their children's emotions is associated with an enhanced quality of parent-child interactions and appropriateness of parental caregiving behavior. Although this association has also been examined in abusive and neglectful parents, the results are mixed and do not adequately address child neglect. Based on the Social Information Processing model of child abuse and neglect, we examined the association between mothers' ability to recognize children's emotions and self- and professional-reported child abuse and neglect. The ability to recognize children's emotions was assessed with an implicit valence classification task and an emotion labeling task. A convenience sample of 166 mothers (78 with at least one child referred to Child Protection Services) completed the tasks. Child abuse and neglect were measured with self-report and professional-report instruments. The moderating roles of mothers' intellectual functioning and socioeconomic status were also examined. Results revealed that abusive mothers performed more poorly on the negative emotion recognition task, while neglectful mothers demonstrated a lower overall ability to recognize children's emotions. When classifying the valence of emotions, mothers who obtained higher scores on child neglect showed a greater positivity bias, particularly when their scores on measures of intellectual functioning were low. There was no moderation effect for socioeconomic status. Moreover, the results for child abuse were mainly observed with self-report measures, while those for child neglect predominantly emerged with professional reports. Our findings highlight the important contribution of the social information processing model in the context of child maltreatment, and implications for prevention and intervention are discussed.
Affiliation(s)
- Maria Manuela Calheiros
- Iscte – Instituto Universitário de Lisboa, Lisboa, Portugal
- Faculdade de Psicologia, CICPSI, Universidade de Lisboa, Lisboa, Portugal
27
Fondevila S, Espuny J, Hernández-Gutiérrez D, Jiménez-Ortega L, Casado P, Muñoz-Muñoz F, Sánchez-García J, Martín-Loeches M. How society modulates our behavior: Effects on error processing of masked emotional cues contextualized in social status. Soc Neurosci 2021; 16:153-165. [PMID: 33494660] [DOI: 10.1080/17470919.2021.1879255]
Abstract
In the present study, we investigate whether subliminal complex social cues have an impact on error-monitoring processes. For this purpose, we presented two social status ranks (high and low) with three possible emotional expressions (happy, neutral, angry), using a backward masking paradigm. Participants were instructed to perform a flanker task while recording Event-Related brain Potentials. Results showed larger amplitudes for the Error-Related Negativity index after the presentation of high relative to low social ranks, only for neutral expressions. Neither the angry nor the happy faces induced significant differences in social rank processing. This indicates that subliminal high social ranks, specifically with neutral expressions, increase error processing by boosting attentional control to perform the ongoing task. Our findings extend current knowledge on the automaticity of social and emotional processing and its influence on performance monitoring mechanisms.
Affiliation(s)
- Sabela Fondevila
- Center for Human Evolution and Behavior, UCM-ISCIII, Madrid, Spain; Departamento de Psicobiología y Metodología de las Ciencias del Comportamiento, Universidad Complutense de Madrid, Spain
- Javier Espuny
- Center for Human Evolution and Behavior, UCM-ISCIII, Madrid, Spain
- Laura Jiménez-Ortega
- Center for Human Evolution and Behavior, UCM-ISCIII, Madrid, Spain; Departamento de Psicobiología y Metodología de las Ciencias del Comportamiento, Universidad Complutense de Madrid, Spain
- Pilar Casado
- Center for Human Evolution and Behavior, UCM-ISCIII, Madrid, Spain; Departamento de Psicobiología y Metodología de las Ciencias del Comportamiento, Universidad Complutense de Madrid, Spain
- Francisco Muñoz-Muñoz
- Center for Human Evolution and Behavior, UCM-ISCIII, Madrid, Spain; Departamento de Psicobiología y Metodología de las Ciencias del Comportamiento, Universidad Complutense de Madrid, Spain
- Manuel Martín-Loeches
- Center for Human Evolution and Behavior, UCM-ISCIII, Madrid, Spain; Departamento de Psicobiología y Metodología de las Ciencias del Comportamiento, Universidad Complutense de Madrid, Spain
28
Atkinson L, Murray JE, Halberstadt J. Older Adults' Emotion Recognition Ability Is Unaffected by Stereotype Threat. Front Psychol 2021; 11:605724. [PMID: 33488464] [PMCID: PMC7817847] [DOI: 10.3389/fpsyg.2020.605724]
Abstract
Eliciting negative stereotypes about ageing commonly results in worse performance on many physical, memory, and cognitive tasks in adults aged over 65. The current studies explored the potential effect of this “stereotype threat” phenomenon on older adults’ emotion recognition, a cognitive ability that has been demonstrated to decline with age. In Study 1, stereotypes about emotion recognition ability across the lifespan were established. In Study 2, these stereotypes were utilised in a stereotype threat manipulation that framed an emotion recognition task as assessing either cognitive ability (stereotypically believed to worsen with age), social ability (believed to be stable across lifespan), or general abilities (control). Participants then completed an emotion recognition task in which they labelled dynamic expressions of negative and positive emotions. Self-reported threat concerns were also measured. Framing an emotion recognition task as assessing cognitive ability significantly heightened older adults’ (but not younger adults’) reports of stereotype threat concerns. Despite this, older adults’ emotion recognition performance was unaffected. Unlike other cognitive abilities, recognising facially expressed emotions may be unaffected by stereotype threat, possibly because emotion recognition is automatic, making it less susceptible to the cognitive load that stereotype threat produces.
Affiliation(s)
- Lianne Atkinson
- Department of Psychology, University of Otago, Dunedin, New Zealand
- Janice E Murray
- Department of Psychology, University of Otago, Dunedin, New Zealand
29
Chen Z, Whitney D. Inferential affective tracking reveals the remarkable speed of context-based emotion perception. Cognition 2020; 208:104549. [PMID: 33340812] [DOI: 10.1016/j.cognition.2020.104549]
Abstract
Understanding the emotional states of others is important for social functioning. Recent studies show that context plays an essential role in emotion recognition. However, it remains unclear whether emotion inference from visual scene context is as efficient as emotion recognition from faces. Here, we measured the speed of context-based emotion perception, using Inferential Affective Tracking (IAT) with naturalistic and dynamic videos. Using cross-correlation analyses, we found that inferring affect based on visual context alone is just as fast as tracking affect with all available information including face and body. We further demonstrated that this approach has high precision and sensitivity to sub-second lags. Our results suggest that emotion recognition from dynamic contextual information might be automatic and immediate. Seemingly complex context-based emotion perception is far more efficient than previously assumed.
Affiliation(s)
- Zhimin Chen, Department of Psychology, University of California, Berkeley, CA 94720, United States of America
- David Whitney, Department of Psychology; Vision Science Program; Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94720, United States of America
30
Smith KE, Leitzke BT, Pollak SD. Youths' processing of emotion information: Responses to chronic and video-based laboratory stress. Psychoneuroendocrinology 2020; 122:104873. [PMID: 33070023 PMCID: PMC7686118 DOI: 10.1016/j.psyneuen.2020.104873]
Abstract
Integrating multiple sources of information about others' emotional states is critical to making accurate emotional inferences. There is evidence that both acute and chronic stress influence how individuals perceive emotional information. However, there is little research examining how acute and chronic stress interact to impact these processes. The current study examined whether acute and chronic stress interact to influence how children make emotional inferences. Eighty-nine youths (aged 11-15 years) underwent a novel video-based social stressor. Children completed an emotion recognition task prior to and after the stressor in which they saw integrated displays of facial expressions and contexts depicting congruent or incongruent emotional information. Eye tracking assessed changes in attention to the stimuli. Children became more likely to use and attended more to facial information than contextual information when labeling emotions following exposure to acute stress. Moreover, the effect of acute stress on use of facial information to label emotions was stronger for children who experienced higher levels of chronic stress. These data suggest that acute stress shifts attention towards facial information while suppressing processing of other sources of emotional information, and that youths with a history of chronic stress are more susceptible to these effects.
Affiliation(s)
- Karen E. Smith, Department of Psychology and Waisman Center, University of Wisconsin - Madison. Correspondence: Waisman Center, University of Wisconsin - Madison, 1500 Highland Ave, Rm 392, Madison, WI 53705
- Brian T. Leitzke, Department of Psychology and Waisman Center, University of Wisconsin - Madison
- Seth D. Pollak, Department of Psychology and Waisman Center, University of Wisconsin - Madison
31
Karaaslan A, Durmuş B, Amado S. Does body context affect facial emotion perception and eliminate emotional ambiguity without visual awareness? Visual Cognition 2020. [DOI: 10.1080/13506285.2020.1846649]
Affiliation(s)
- Aslan Karaaslan, Psychology Department, Faculty of Letters, Ege University, Izmir, Turkey
- Belkıs Durmuş, Psychology Department, Faculty of Letters, Ege University, Izmir, Turkey
- Sonia Amado, Psychology Department, Faculty of Letters, Ege University, Izmir, Turkey
32
Consistent behavioral and electrophysiological evidence for rapid perceptual discrimination among the six human basic facial expressions. Cogn Affect Behav Neurosci 2020; 20:928-948. [PMID: 32918269 DOI: 10.3758/s13415-020-00811-7]
Abstract
The extent to which the six basic human facial expressions perceptually differ from one another remains controversial. For instance, despite the importance of rapidly decoding fearful faces, this expression often is confused with other expressions, such as Surprise in explicit behavioral categorization tasks. We quantified implicit visual discrimination among rapidly presented facial expressions with an oddball periodic visual stimulation approach combined with electroencephalography (EEG), testing for the relationship with behavioral explicit measures of facial emotion discrimination. We report robust facial expression discrimination responses bilaterally over the occipito-temporal cortex for each pairwise expression change. While fearful faces presented as repeated stimuli led to the smallest deviant responses from all other basic expressions, deviant fearful faces were well discriminated overall and to a larger extent than expressions of Sadness and Anger. Expressions of Happiness did not differ quantitatively as much in EEG as for behavioral subjective judgments, suggesting that the clear dissociation between happy and other expressions, typically observed in behavioral studies, reflects higher-order processes. However, this expression differed from all others in terms of scalp topography, pointing to a qualitative rather than quantitative difference. Despite this difference, overall, we report for the first time a tight relationship of the similarity matrices across facial expressions obtained for implicit EEG responses and behavioral explicit measures collected under the same temporal constraints, paving the way for new approaches of understanding facial expression discrimination in developmental, intercultural, and clinical populations.
33
Clark GM, McNeel C, Bigelow FJ, Enticott PG. The effect of empathy and context on face-processing ERPs. Neuropsychologia 2020; 147:107612. [PMID: 32882241 DOI: 10.1016/j.neuropsychologia.2020.107612]
Abstract
The investigation of emotional face processing has largely used faces devoid of context, and does not account for within-perceiver differences in empathy. The importance of context in face perception has become apparent in recent years. This study examined the interaction of the contextual factors of facial expression, knowledge of a person's character, and within-perceiver empathy levels on face processing event-related potentials (ERPs). Forty-two adult participants learned background information about six individuals' character. Three types of character were described, in which the character was depicted as deliberately causing harm to others, accidentally causing harm to others, or undertaking neutral actions. Subsequently, EEG was recorded while participants viewed the characters' faces displaying neutral or emotional expressions. Participants' empathy was assessed using the Empathy Quotient survey. Results showed a significant interaction of character type and empathy on the early posterior negativity (EPN) ERP component. These results suggested that for those with either low or high empathy, more attention was paid to the face stimuli, with more distinction between the different characters. In contrast, those in the middle range of empathy tended to produce smaller EPN with less distinction between character types. Findings highlight the importance of trait empathy in accounting for how faces in context are perceived.
Affiliation(s)
- Gillian M Clark, Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, Australia
- Claire McNeel, Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, Australia
- Felicity J Bigelow, Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, Australia
- Peter G Enticott, Cognitive Neuroscience Unit, School of Psychology, Deakin University, Geelong, Australia
34
Barisnikov K, Theurel A, Lejeune F. Emotion knowledge in neurotypical children and in those with Down syndrome. Appl Neuropsychol Child 2020; 11:197-211. [PMID: 32579087 DOI: 10.1080/21622965.2020.1777131]
Abstract
This research aimed to assess two components of emotion knowledge (EK): receptive EK, with face emotion identification and matching tasks, and emotion situation knowledge, with the emotion attribution task (EAT). Study 1 assessed the development of EK in 265 neurotypical (NT) children (4-11 years), divided into four age groups. Overall, results showed a significant improvement of EK with age in the NT population for the three tasks, especially between the ages of 4/5 and 6/7. Children were less successful at the EAT than at the other two tasks, indicating that receptive EK develops earlier than emotion situation knowledge. The presence of visual context (EAT) did not help to improve children's overall facial emotion recognition, especially for anger and sadness, even though these emotions were well recognized in isolated facial expressions (emotion identification). Study 2 compared EK between 32 children with Down syndrome (DS; CA: M = 13 years, SD = 2.13) and 32 NT children (CA: M = 5.3 years, SD = 1.36), matched on a vocabulary task. Children with DS had more difficulties in EK than NT children. They had lower performances on the identification and EAT tasks, while exhibiting performances similar to their NT controls on the emotion matching task. Moreover, good ability to identify emotion expressions seems to be a prerequisite for successful face-context recognition in NT children, but not in children with DS. The difficulties encountered by children with DS could result from executive dysfunction when dealing with complex visual information, in addition to emotion processing difficulties.
Affiliation(s)
- Koviljka Barisnikov, Child Clinical Neuropsychology Unit, FPSE, University of Geneva, Geneva, Switzerland
- Fleur Lejeune, Child Clinical Neuropsychology Unit, FPSE, University of Geneva, Geneva, Switzerland
35
Bublatzky F, Kavcıoğlu F, Guerra P, Doll S, Junghöfer M. Contextual information resolves uncertainty about ambiguous facial emotions: Behavioral and magnetoencephalographic correlates. Neuroimage 2020; 215:116814. [PMID: 32276073 DOI: 10.1016/j.neuroimage.2020.116814]
Abstract
Environmental conditions bias our perception of other people's facial emotions. This becomes quite relevant in potentially threatening situations, when a fellow's facial expression might indicate potential danger. The present study tested the prediction that a threatening environment biases the recognition of facial emotions. To this end, low- and medium-expressive happy and fearful faces (morphed to 10%, 20%, 30%, or 40% emotional) were presented within a context of instructed threat-of-shock or safety. Self-reported data revealed that instructed threat led to a biased recognition of fearful, but not happy facial expressions. Magnetoencephalographic correlates revealed spatio-temporal clusters of neural network activity associated with emotion recognition and contextual threat/safety in early to mid-latency time intervals in the left parietal cortex, bilateral prefrontal cortex, and the left temporal pole regions. Early parietal activity revealed a double dissociation of face-context information as a function of the expressive level of facial emotions: When facial expressions were difficult to recognize (low-expressive), contextual threat enhanced fear processing and contextual safety enhanced processing of subtle happy faces. However, for rather easily recognizable faces (medium-expressive) the left hemisphere (parietal cortex, PFC, and temporal pole) showed enhanced activity to happy faces during contextual threat and fearful faces during safety. Thus, contextual settings reduce the salience threshold and boost early face processing of low-expressive congruent facial emotions, whereas face-context incongruity or mismatch effects drive neural activity of easier recognizable facial emotions. These results elucidate how environmental settings help recognize facial emotions, and the brain mechanisms underlying the recognition of subtle nuances of fear.
Affiliation(s)
- Florian Bublatzky, Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim/Heidelberg University, Germany
- Fatih Kavcıoğlu, Chair of Biological Psychology, Clinical Psychology and Psychotherapy, University of Würzburg, Germany
- Pedro Guerra, Department of Personality, University of Granada, Spain
- Sarah Doll, Institute for Biomagnetism and Biosignalanalysis, University Hospital Münster, Münster, Germany
- Markus Junghöfer, Institute for Biomagnetism and Biosignalanalysis, University Hospital Münster, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, Germany
36
Integrating faces and bodies: Psychological and neural perspectives on whole person perception. Neurosci Biobehav Rev 2020; 112:472-486. [PMID: 32088346 DOI: 10.1016/j.neubiorev.2020.02.021]
Abstract
The human "person" is a common percept we encounter. Research on person perception has been focused either on face or body perception-with less attention paid to whole person perception. We review psychological and neuroscience studies aimed at understanding how face and body processing operate in concert to support intact person perception. We address this question considering: a.) the task to be accomplished (identification, emotion processing, detection), b.) the neural stage of processing (early/late visual mechanisms), and c.) the relevant brain regions for face/body/person processing. From the psychological perspective, we conclude that the integration of faces and bodies is mediated by the goal of the processing (e.g., emotion analysis, identification, etc.). From the neural perspective, we propose a hierarchical functional neural architecture of face-body integration that retains a degree of separation between the dorsal and ventral visual streams. We argue for two centers of integration: a ventral semantic integration hub that is the result of progressive, posterior-to-anterior, face-body integration; and a social agent integration hub in the dorsal stream STS.
37
Franco CL, Fugate JMB. Emoji Face Renderings: Exploring the Role Emoji Platform Differences have on Emotional Interpretation. J Nonverbal Behav 2020. [DOI: 10.1007/s10919-019-00330-1]
38
Barabanschikov V, Korolkova O. Perception of “Live” Facial Expressions. Experimental Psychology (Russia) 2020. [DOI: 10.17759/exppsy.2020130305]
Abstract
The article provides a review of experimental studies of interpersonal perception based on static and dynamic facial expressions as a unique source of information about the person’s inner world. The focus is on the patterns of perception of a moving face, included in the processes of communication and joint activities (an alternative to the most commonly studied perception of static images of a person outside of a behavioral context). The review includes four interrelated topics: face statics and dynamics in the recognition of emotional expressions; specificity of perception of moving face expressions; multimodal integration of emotional cues; and generation and perception of facial expressions in communication processes. The analysis identifies the most promising areas of research on the face in motion. We show that the static and dynamic modes of facial perception complement each other, and describe the role of qualitative features of facial expression dynamics in assessing the emotional state of a person. Facial expression is considered as part of a holistic multimodal manifestation of emotions. The importance of facial movements as an instrument of social interaction is emphasized.
39
Aguirre-Loaiza H, Arenas J, Arias I, Franco-Jímenez A, Barbosa-Granados S, Ramos-Bermúdez S, Ayala-Zuluaga F, Núñez C, García-Mas A. Effect of Acute Physical Exercise on Executive Functions and Emotional Recognition: Analysis of Moderate to High Intensity in Young Adults. Front Psychol 2019; 10:2774. [PMID: 31920823 PMCID: PMC6937985 DOI: 10.3389/fpsyg.2019.02774]
Abstract
Physical exercise (PE) is associated with changes in cognition and brain function. However, the effects of PE at different intensities, in different population groups, and for different durations on specific cognitive domains remain to be clarified. Moreover, no studies are known to have evaluated contextual emotion recognition. Therefore, we studied the effect of acute PE at moderate to high intensity on executive functions and contextual emotion recognition. Participants were evaluated and classified into two experiments according to the IPAQ short-form self-report and control measures. In both experiments, the groups were randomized, controlled, and exposed to one session of indoor cycling in intervals of moderate to high intensity (75–85% HRmax). Experiment 1 comprised young adults who were physically active (PA) and apparently healthy (n = 54, Mage = 20.7, SD = 2.5). Experiment 2 involved young adults who were physically inactive (PI) and apparently healthy (n = 36, Mage = 21.6, SD = 1.8). The duration was the only factor that varied: 45 min for PA and 30 min for PI. Executive functions were evaluated with the Stroop, TMT A/B, and verbal fluency tasks, and emotion recognition with a task that simultaneously presents body and facial emotions in context. Factorial mixed ANOVA showed effects of indoor cycling on correct choices in the PA group and on response times in the PI group; other effects were observed in the control groups. TMT A/B measures showed pre-test to post-test changes in both experiments. Verbal fluency performance favored the control group in both experiments. Meanwhile, emotion recognition showed an effect of PE on error reduction and enhanced scores for correct choices of body emotions. These results suggest that PE at these intensities favored cognitive processes such as inhibitory control and emotion recognition in context.
We took into account the importance of designing high-complexity tasks that avoid a ceiling effect. This study is the first to report a positive effect of PE on contextual emotion recognition. Important clinical and educational implications are presented, which highlight the modulatory role of PE at moderate to high intensities.
Affiliation(s)
- Jaime Arenas, Physical Education, University of Quindío, Armenia, Colombia
- Ianelleen Arias, Physical Education, University of Quindío, Armenia, Colombia
- Federico Ayala-Zuluaga, Research Group Physical Activity, Cumanday, Manizales, Colombia; Department of Physical Action, Caldas University, Manizales, Colombia
- César Núñez, Psychology Program, Universidad de Medellín, Medellín, Colombia
- Alexandre García-Mas, Department of Basic Psychology, University of the Balearic Islands, Palma, Spain
40
Mullennix J, Barber J, Cory T. An examination of the Kuleshov effect using still photographs. PLoS One 2019; 14:e0224623. [PMID: 31671134 PMCID: PMC6822748 DOI: 10.1371/journal.pone.0224623]
Abstract
The goal of the present study was to examine whether the effect of visual context on the interpretation of facial expression from an actor’s face could be produced using isolated photographic stills, instead of the typical dynamic film sequences used to demonstrate the effect. Two-photograph sequences consisting of a context photograph varying in pleasantness and a photograph of an actor’s neutral face were presented. Participants performed a liking rating task for the context photograph (to ensure attention to the stimulus) and they performed three tasks for the face stimulus: labeling the emotion portrayed by the actor, rating valence, and rating arousal. The results of the labeling data confirmed the existence of a visual context effect, with more faces labeled as “happy” after viewing pleasant context and more faces labeled “sad” or “fearful” after viewing unpleasant context. This effect was demonstrated when no explicit connection between the context stimulus and face stimulus was invoked, with the contextual information exerting its effect on labeling after being held in memory for at least 10 seconds. The results for ratings of valence and arousal were mixed. Overall, the results suggest that isolated photograph sequences produce a Kuleshov-type context effect on attributions of emotion to actors’ faces, replicating previous research conducted with dynamic film sequences.
Affiliation(s)
- John Mullennix, University of Pittsburgh at Johnstown, Johnstown, PA, United States of America
- Jeremy Barber, University of Pittsburgh at Johnstown, Johnstown, PA, United States of America
- Trista Cory, University of Pittsburgh at Johnstown, Johnstown, PA, United States of America
41
Neurocognitive determinants of theory of mind across the adult lifespan. Brain Cogn 2019; 136:103588. [DOI: 10.1016/j.bandc.2019.103588]
42
Skottnik L, Linden DEJ. Mental Imagery and Brain Regulation-New Links Between Psychotherapy and Neuroscience. Front Psychiatry 2019; 10:779. [PMID: 31736799 PMCID: PMC6831624 DOI: 10.3389/fpsyt.2019.00779]
Abstract
Mental imagery is a promising tool and mechanism of psychological interventions, particularly for mood and anxiety disorders. In parallel developments, neuromodulation techniques have shown promise as add-on therapies in psychiatry, particularly non-invasive brain stimulation for depression. However, these techniques have not yet been combined in a systematic manner. One novel technology that may be able to achieve this is neurofeedback, which entails the self-regulation of activation in specific brain areas or networks (or the self-modulation of distributed activation patterns) by the patients themselves, through real-time feedback of brain activation (for example, from functional magnetic resonance imaging). One of the key mechanisms by which patients learn such self-regulation is mental imagery. Here, we will first review the main mental imagery approaches in psychotherapy and the implicated brain networks. We will then discuss how these networks can be targeted with neuromodulation (neurofeedback or non-invasive or invasive brain stimulation). We will review the clinical evidence for neurofeedback and discuss possible ways of enhancing it through systematic combination with psychological interventions, with a focus on depression, anxiety disorders, and addiction. The overarching aim of this perspective paper will be to open a debate on new ways of developing neuropsychotherapies.
Affiliation(s)
- David E. J. Linden, School for Mental Health and Neuroscience, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, Netherlands
43
Zhang M, Liu T, Jin Y, He W, Huang Y, Luo W. The asynchronous influence of facial expressions on bodily expressions. Acta Psychol (Amst) 2019; 200:102941. [PMID: 31677428 DOI: 10.1016/j.actpsy.2019.102941]
Abstract
The ability to extract correct emotional information from facial and bodily expressions is fundamental for the development of social skills. Previous studies have shown that bodily expressions affect the recognition of basic facial expressions dramatically. However, few studies have considered the view that facial expressions may influence the recognition of bodily expressions. Further, previous studies have failed to consider a comprehensive set of emotional categories. The present study sought to examine whether facial expressions would impact the recognition of bodily expressions asynchronously, using four basic emotions. Participants performed an affective priming task, in which the priming stimuli included four facial expressions (happy, sad, fearful, and angry), and the target stimuli were bodily expressions matching the same emotions. The results indicated that the perception of affective facial expressions significantly influenced the accuracy and reaction time for body-based emotion categorization, particularly for bodily expression of happiness. The recognition accuracy of congruent expressions was higher, relative to that of incongruent expressions. The findings show that facial expressions influence the recognition of bodily expressions, despite the asynchrony.
44
The neural representation of facial-emotion categories reflects conceptual structure. Proc Natl Acad Sci U S A 2019; 116:15861-15870. [PMID: 31332015 DOI: 10.1073/pnas.1816408116]
Abstract
Humans reliably categorize configurations of facial actions into specific emotion categories, leading some to argue that this process is invariant between individuals and cultures. However, growing behavioral evidence suggests that factors such as emotion-concept knowledge may shape the way emotions are visually perceived, leading to variability-rather than universality-in facial-emotion perception. Understanding variability in emotion perception is only emerging, and the neural basis of any impact from the structure of emotion-concept knowledge remains unknown. In a neuroimaging study, we used a representational similarity analysis (RSA) approach to measure the correspondence between the conceptual, perceptual, and neural representational structures of the six emotion categories Anger, Disgust, Fear, Happiness, Sadness, and Surprise. We found that subjects exhibited individual differences in their conceptual structure of emotions, which predicted their own unique perceptual structure. When viewing faces, the representational structure of multivoxel patterns in the right fusiform gyrus was significantly predicted by a subject's unique conceptual structure, even when controlling for potential physical similarity in the faces themselves. Finally, cross-cultural differences in emotion perception were also observed, which could be explained by individual differences in conceptual structure. Our results suggest that the representational structure of emotion expressions in visual face-processing regions may be shaped by idiosyncratic conceptual understanding of emotion categories.
45
Li S, Zhu X, Ding R, Ren J, Luo W. The effect of emotional and self-referential contexts on ERP responses towards surprised faces. Biol Psychol 2019; 146:107728. [PMID: 31306692 DOI: 10.1016/j.biopsycho.2019.107728]
Abstract
The perception of surprised faces is demonstrably modulated by emotional context. However, the influence of self-relevance and its interaction with emotional context have not been explored. The present study investigated the effects of contextual valence and self-reference on the perception of surprised faces. Our results revealed that faces in a negative context elicited a larger N170 than those in a neutral context. The EPN was affected by the interaction between contextual valence and self-reference, with larger amplitudes for faces in self-related positive contexts and sender-related negative contexts. Additionally, LPP amplitudes were enhanced for faces in negative contexts relative to neutral and positive contexts, as well as for self-related contexts in comparison to sender-related contexts. Together, these findings help to elucidate the psychophysiological mechanisms underlying the effects of emotional and self-referential contexts on the perception of surprised faces, which are characterized by distinctive ERPs.
Affiliation(s)
- Shuaixia Li, Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Xiangru Zhu, Institute of Cognition, Brain and Health, Henan University, Kaifeng, China; Institute of Psychology and Behavior, Henan University, Kaifeng, China
- Rui Ding, Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Jie Ren, Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
- Wenbo Luo, Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian, China
46
Pereira EJ, Birmingham E, Ristic J. Contextually-Based Social Attention Diverges across Covert and Overt Measures. Vision (Basel) 2019; 3:E29. [PMID: 31735830 PMCID: PMC6802786 DOI: 10.3390/vision3020029]
Abstract
Humans spontaneously attend to social cues like faces and eyes. However, recent data show that this behavior is significantly weakened when visual content (such as luminance and the configuration of internal features) and visual context (such as background and facial expression) are controlled. Here, we investigated attentional biasing elicited in response to information presented within appropriate background contexts. Using a dot-probe task, participants were presented with a face-house cue pair, with a person sitting in a room and a house positioned within a picture hanging on a wall. A response target occurred at the previous location of the eyes, mouth, top of the house, or bottom of the house. Experiment 1 measured covert attention by assessing manual responses while participants maintained central fixation. Experiment 2 measured overt attention by assessing eye movements with an eye tracker. The data from both experiments indicated no evidence of spontaneous attentional biasing towards faces or facial features in manual responses; however, an infrequent but reliable overt bias towards the eyes of faces emerged. Together, these findings suggest that contextually based social information does not determine spontaneous social attentional biasing in manual measures, although it may act to facilitate oculomotor behavior.
Affiliation(s)
- Effie J. Pereira
- Department of Psychology, McGill University, 1205 Dr. Penfield Avenue, Montreal, QC H3A 1B1, Canada
- Elina Birmingham
- Faculty of Education, Simon Fraser University, 8888 University Drive, Burnaby, BC V5A 1S6, Canada
- Jelena Ristic
- Department of Psychology, McGill University, 1205 Dr. Penfield Avenue, Montreal, QC H3A 1B1, Canada
47
Ikeda S. Influence of Color on Emotion Recognition Is Not Bidirectional: An Investigation of the Association Between Color and Emotion Using a Stroop-Like Task. Psychol Rep 2019; 123:1226-1239. [DOI: 10.1177/0033294119850480] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
The association between color and emotion is well established, with red facilitating recognition of anger and green facilitating recognition of happiness. However, it has been unclear whether emotional stimuli conversely facilitate or inhibit recognition of those colors. This study used a Stroop-like task, which required participants to ignore facial expressions and identify color, to investigate the influence of emotion on color recognition. In addition, this study examined the association between color and emotion recognition from emoticons, as it has recently been suggested that emotion recognition from emoticons involves a different process than recognition from actual faces. Results revealed that for both facial expressions and emoticons, color influenced emotion recognition, in line with previous studies. Conversely, facial expression did not influence recognition of color. The results suggest that in emotion recognition, people consider surrounding contextual information and integrate it automatically; in color recognition, however, they do not.
48
Ito K, Ong CW, Kitada R. Emotional Tears Communicate Sadness but Not Excessive Emotions Without Other Contextual Knowledge. Front Psychol 2019; 10:878. [PMID: 31068868 PMCID: PMC6491854 DOI: 10.3389/fpsyg.2019.00878] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2019] [Accepted: 04/03/2019] [Indexed: 12/17/2022] Open
Abstract
Contexts of face perception are diverse, ranging from the social environment to body posture, and from the expresser's gaze direction to the tone of voice. Extending the research on contexts of face perception, we investigated people's perception of tears on a face. The act of shedding tears is often perceived as an expression of sad feelings aroused by experiencing loss, disappointment, or helplessness. Alternatively, tears may represent the excessive intensity of any emotion, such as extreme fear during an unexpected encounter with a giant bear or extreme happiness upon winning a competition. Investigating these competing interpretations of tears, we found that the addition of tears to different facial expressions made the expressions conceptually closer to sad expressions. In particular, the results of a similarity analysis showed that, after the addition of tears, rating patterns for anger, fear, disgust, and neutral facial expressions became more similar to those for sadness expressions. The effect of tears on the ratings of basic emotions and their patterns in facial expressions are discussed.
Affiliation(s)
- Kenichi Ito
- Division of Psychology, School of Social Sciences, College of Humanities, Arts, and Social Sciences, Nanyang Technological University, Singapore, Singapore
- Ryo Kitada
- Division of Psychology, School of Social Sciences, College of Humanities, Arts, and Social Sciences, Nanyang Technological University, Singapore, Singapore
49
Abstract
Emotion recognition is an essential human ability critical for social functioning. It is widely assumed that identifying facial expression is the key to this, and models of emotion recognition have mainly focused on facial and bodily features in static, unnatural conditions. We developed a method called affective tracking to reveal and quantify the enormous contribution of visual context to affect (valence and arousal) perception. When characters' faces and bodies were masked in silent videos, viewers inferred the affect of the invisible characters successfully and in high agreement based solely on visual context. We further show that the context is not only sufficient but also necessary to accurately perceive human affect over time, as it provides a substantial and unique contribution beyond the information available from face and body. Our method (which we have made publicly available) reveals that emotion recognition is, at its heart, an issue of context as much as it is about faces.
50
Calbi M, Siri F, Heimann K, Barratt D, Gallese V, Kolesnikov A, Umiltà MA. How context influences the interpretation of facial expressions: a source localization high-density EEG study on the "Kuleshov effect". Sci Rep 2019; 9:2107. [PMID: 30765713 PMCID: PMC6376122 DOI: 10.1038/s41598-018-37786-y] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/07/2018] [Accepted: 12/12/2018] [Indexed: 11/24/2022] Open
Abstract
Few studies have explored the specificities of contextual modulations of the processing of facial expressions at a neuronal level. This study fills this gap by employing an original paradigm, based on a version of the filmic “Kuleshov effect”. High-density EEG was recorded while participants watched film sequences consisting of three shots: the close-up of a target person’s neutral face (Face_1), the scene that the target person was looking at (happy, fearful, or neutral), and another close-up of the same target person’s neutral face (Face_2). The participants’ task was to rate both valence and arousal, and subsequently to categorize the target person’s emotional state. The results indicate that despite a significant behavioural ‘context’ effect, the electrophysiological indexes still indicate that the face is evaluated as neutral. Specifically, Face_2 elicited a high amplitude N170 when preceded by neutral contexts, and a high amplitude Late Positive Potential (LPP) when preceded by emotional contexts, thus showing sensitivity to the evaluative congruence (N170) and incongruence (LPP) between context and Face_2. The LPP activity was mainly underpinned by brain regions involved in facial expressions and emotion recognition processing. Our results shed new light on temporal and neural correlates of context-sensitivity in the interpretation of facial expressions.
Affiliation(s)
- Marta Calbi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Francesca Siri
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Katrin Heimann
- Interacting Minds Center, University of Aarhus, Aarhus, Denmark
- Daniel Barratt
- Department of Management, Society and Communication, Copenhagen Business School, Copenhagen, Denmark
- Vittorio Gallese
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy; Institute of Philosophy, School of Advanced Study, University of London, London, UK
- Anna Kolesnikov
- Department of Humanities, Social Sciences and Cultural Industries, University of Parma, Parma, Italy