1. Li S, Dang W, Zhang Y, Hao B, Zhao D, Luo W. Perceiving emotions in the eyes: The biasing role of a fearful mouth. Biol Psychol 2025;194:108968. PMID: 39709097. DOI: 10.1016/j.biopsycho.2024.108968.
Abstract
The role of the eye region in interpersonal communication and emotional recognition is widely acknowledged. However, the influence of mouth expression on perceiving and recognizing genuine emotions in the eye region, especially with limited attentional resources, remains unclear. Thirty-four participants in this study completed a dual-target rapid serial visual presentation (RSVP) task while their event-related potential (ERP) data were simultaneously recorded. They were instructed to identify the type of houses and the emotional expression displayed in the eye region. The first target (T1) consisted of three upright houses, and the second target (T2) included fearful and neutral normal faces, mouth-scrambled faces, as well as composite faces (fearful eye + neutral mouth, neutral eye + fearful mouth). A robust mass univariate statistics approach was utilized to analyze the EEG data. Behaviorally, the presence of a fearful mouth facilitated recognition of the fearful eye region but hindered recognition of the neutral eye region compared to a neutral mouth. The ERP results showed that fearful expressions elicited larger N170, early posterior negativity (EPN), and P3 amplitudes relative to neutral expressions. The P1 amplitudes were increased, whereas the N170 and EPN amplitudes were reduced in response to normal and composite faces compared to mouth-scrambled faces. Collectively, these findings indicate that an unattended fearful mouth can capture covert attention and shape evaluation of eye expressions within a face, providing novel insights into the impact of visually salient mouth cues on cognitive processes involved in mind reading.
Affiliation(s)
- Shuaixia Li, Wei Dang, Yihan Zhang, Bin Hao, Dongfang Zhao, Wenbo Luo
- Research Center of Brain and Cognitive Neuroscience, Liaoning Normal University, Dalian 116029, China; Key Laboratory of Brain and Cognitive Neuroscience, Liaoning Province, Dalian 116029, China
2. Ma J, Liu X, Li Y. A Comparative Study Recognizing the Expression of Information Between Elderly Individuals and Young Individuals. Psychol Res Behav Manag 2024;17:3111-3120. PMID: 39253353. PMCID: PMC11382663. DOI: 10.2147/prbm.s471196.
Abstract
Background: Studies have shown that elderly individuals have significantly worse facial expression recognition scores than young adults. Some have suggested that this difference is due to perceptual degradation, while others suggest it is due to decreased attention of elderly individuals to the most informative regions of the face.
Methods: To resolve this controversy, this study recruited 85 participants and used a behavioral task and eye-tracking techniques (EyeLink 1000 Plus eye tracker). It adopted the "study-recognition" paradigm, and a mixed experimental design of 3 (facial expressions: positive, neutral, negative) × 2 (subjects' age: young, old) × 3 (facial areas of interest: eyes, nose, and mouth) was used to explore whether there was perceptual degradation in older people's attention to facial expressions and to investigate the differences in diagnostic areas between young and older people.
Results: The behavioral results revealed that young participants had significantly higher facial expression recognition scores than older participants did; moreover, the eye-tracking results revealed that younger people generally fixated on faces significantly more than elderly people, demonstrating perceptual degradation in elderly people. Young people primarily looked at the eyes, followed by the nose and, finally, the mouth when examining facial expressions. The elderly participants primarily focused on the eyes, followed by the mouth and then the nose.
Conclusion: The findings confirmed that young participants have better facial expression recognition performance than elderly participants, which may be related more to perceptual degradation than to decreased attention to informative areas of the face. For elderly people, the duration of gaze toward the facial diagnostic area (such as the eyes) should be increased when recognizing faces to compensate for the disadvantage of decreased facial recognition performance caused by perceptual aging.
Affiliation(s)
- Jialin Ma, Xiaojing Liu, Yongxin Li
- Faculty of Education, Henan University, Kaifeng, Henan Province, People's Republic of China
3. Sueur C, Piermattéo A, Pelé M. Eye image effect in the context of pedestrian safety: a French questionnaire study. F1000Res 2023;11:218. PMID: 37822956. PMCID: PMC10562793. DOI: 10.12688/f1000research.76062.2.
Abstract
Human behavior is influenced by the presence of others, which scientists also call 'the audience effect'. The use of social control to produce more cooperative behaviors may positively influence road use and safety. This study uses an online questionnaire to test how eye images affect the behavior of pedestrians when crossing a road. Different eye images of men, women, and a child with different facial expressions (neutral, friendly, and angry) were presented to participants, who were asked what they would feel when looking at these images before crossing a signalized road. Participants also completed a 20-item questionnaire about pedestrian behaviors (PBQ). The questionnaire was received by 1,447 French participants, 610 of whom answered the entire questionnaire. Seventy-one percent of participants were women, and the mean age was 35 ± 14 years. Eye images gave individuals the feeling of being observed (33%), scared (5%), or surprised (26%), and thus seem to indicate mixed results about avoiding crossing at the red light. The expressions shown in the eyes are also an important factor: feelings of being observed increased by about 10-15%, whilst feelings of being scared or inhibited increased by about 5%, as the expression changed from neutral to friendly to angry. No link was found between the results of our questionnaire and those of the Pedestrian Behavior Questionnaire (PBQ). This study shows that the use of eye images could reduce illegal crossings by pedestrians and is thus of key interest as a practical road safety tool. However, the effect is limited, and how to increase this nudge effect needs further consideration.
Affiliation(s)
- Cédric Sueur
- Institut Universitaire de France, Paris, France
- IPHC, UMR7178, Université de Strasbourg, CNRS, Strasbourg, France
- Marie Pelé
- ETHICS EA7446, Lille Catholic University, Lille, France
4. Kim H, Küster D, Girard JM, Krumhuber EG. Human and machine recognition of dynamic and static facial expressions: prototypicality, ambiguity, and complexity. Front Psychol 2023;14:1221081. PMID: 37794914. PMCID: PMC10546417. DOI: 10.3389/fpsyg.2023.1221081.
Abstract
A growing body of research suggests that movement aids facial expression recognition. However, less is known about the conditions under which the dynamic advantage occurs. The aim of this research was to test emotion recognition in static and dynamic facial expressions, thereby exploring the role of three featural parameters (prototypicality, ambiguity, and complexity) in human and machine analysis. In two studies, facial expression videos and corresponding images depicting the peak of the target and non-target emotion were presented to human observers and the machine classifier (FACET). Results revealed higher recognition rates for dynamic stimuli compared to non-target images. Such benefit disappeared in the context of target-emotion images which were similarly well (or even better) recognised than videos, and more prototypical, less ambiguous, and more complex in appearance than non-target images. While prototypicality and ambiguity exerted more predictive power in machine performance, complexity was more indicative of human emotion recognition. Interestingly, recognition performance by the machine was found to be superior to humans for both target and non-target images. Together, the findings point towards a compensatory role of dynamic information, particularly when static-based stimuli lack relevant features of the target emotion. Implications for research using automatic facial expression analysis (AFEA) are discussed.
Affiliation(s)
- Hyunwoo Kim
- Department of Experimental Psychology, University College London, London, United Kingdom
- Dennis Küster
- Cognitive Systems Lab, Department of Mathematics and Computer Science, University of Bremen, Bremen, Germany
- Jeffrey M. Girard
- Department of Psychology, University of Kansas, Lawrence, KS, United States
- Eva G. Krumhuber
- Department of Experimental Psychology, University College London, London, United Kingdom
5. Ma J, Zhang R, Li Y. Age Weakens the Other-Race Effect among Han Subjects in Recognizing Own- and Other-Ethnicity Faces. Behav Sci (Basel) 2023;13:675. PMID: 37622815. PMCID: PMC10452021. DOI: 10.3390/bs13080675.
Abstract
The development and change in the other-race effect (ORE) in different age groups have always been a focus of researchers. Previous studies have mainly focused on the influence of maturity of life (from infancy to early adulthood) on the ORE, while few researchers have explored the ORE in older people. Therefore, this study used behavioral and eye movement techniques to explore the influence of age on the ORE and the visual scanning pattern of Han subjects recognizing own- and other-ethnicity faces. All participants were asked to complete a study-recognition task for faces, and the behavioral results showed that the ORE of elderly Han subjects was significantly lower than that of young Han subjects. The results of eye movement showed that there were significant differences in the visual scanning pattern of young subjects in recognizing the faces of individuals of their own ethnicity and other ethnicities, which were mainly reflected in the differences in looking at the nose and mouth, while the differences were reduced in the elderly subjects. The elderly subjects used similar scanning patterns to recognize the own- and other-ethnicity faces. This indicates that as age increases, the ORE of older people in recognizing faces of those from different ethnic groups becomes weaker, and elderly subjects have more similar visual scanning patterns in recognizing faces of their own and other ethnicities.
Affiliation(s)
- Jialin Ma
- Faculty of Education, School of Psychology, Henan University, Kaifeng 475000, China
6. Zhang J, Wang R, Xue Y. Deconstructing Mechanisms of Abnormal Categorical Perception of Emotional Facial Expressions in Schizophrenia Patients. Psychiatry Investig 2022;19:991-999. PMID: 36588433. PMCID: PMC9806511. DOI: 10.30773/pi.2022.0215.
Abstract
OBJECTIVE: The current study aims to identify potential reasons why most schizophrenia patients have a relatively low sensitivity in classifying emotional facial expressions.
METHODS: Using an emotional categorical perception task, eighty-three schizophrenia patients and seventy-one healthy adults were presented with morphed emotional continuums between two facial expressions (a positive emotional valence: happy; a negative emotional valence: sad).
RESULTS: Comparing schizophrenia patients and healthy adults in estimating facial expressions with ambiguous emotions, we found that the pattern of emotional categorical perception in schizophrenia patients differed significantly from that of healthy controls when processing signals from local facial areas. Compared with healthy people, schizophrenia patients showed a markedly separate classification pattern when processing emotional signals from the eye and mouth regions, indicating larger conflicts in integrating emotional signals from different facial areas. Overcoming such conflicts requires more cognitive resources; a lack of these resources leads to failed integration, which further increases the difficulty of estimating facial expressions with ambiguous emotions and ultimately produces the relatively low sensitivity in classifying emotional facial expressions.
CONCLUSION: In sum, the abnormal perception of emotional facial expressions in schizophrenia patients results from a deficit in integrating signals across facial areas.
Affiliation(s)
- Jian Zhang
- School of Psychology, Guizhou Normal University, Guiyang, China
- Ruomin Wang
- Department of Economics, University of Illinois Urbana-Champaign, Urbana, IL, USA
- Yunzhen Xue
- Department of Humanities and Social Science, Shanxi Medical University, Taiyuan, China
7. Verroca A, de Rienzo CM, Gambarota F, Sessa P. Mapping the perception-space of facial expressions in the era of face masks. Front Psychol 2022;13:956832. PMID: 36176786. PMCID: PMC9514388. DOI: 10.3389/fpsyg.2022.956832.
Abstract
With the advent of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic, the theme of emotion recognition from facial expressions has become highly relevant due to the widespread use of face masks as one of the main devices imposed to counter the spread of the virus. Unsurprisingly, several studies published in the last 2 years have shown that accuracy in the recognition of basic emotions expressed by faces wearing masks is reduced. However, less is known about the impact that wearing face masks has on the ability to recognize emotions from subtle expressions. Furthermore, even less is known regarding the role of interindividual differences (such as alexithymic and autistic traits) in emotion processing. This study investigated the perception of all six basic emotions (anger, disgust, fear, happiness, sadness, and surprise), both as a function of the face mask and as a function of the facial expressions' intensity (full vs. subtle), in terms of participants' uncertainty in their responses, misattribution errors, and perceived intensity. The experiment was conducted online on a large sample of participants (N = 129). Participants completed the 20-item Toronto Alexithymia Scale and the Autistic Spectrum Quotient and then performed an emotion-recognition task that involved face stimuli wearing a mask or not, and displaying full or subtle expressions. Each face stimulus was presented alongside the Geneva Emotion Wheel (GEW), and participants had to indicate what emotion they believed the other person was feeling and its intensity using the GEW. For each combination of our variables, we computed the indices of 'uncertainty' (i.e., the spread of responses around the correct emotion category), 'bias' (i.e., the systematic errors in recognition), and 'perceived intensity' (i.e., the distance from the center of the GEW). We found that face masks increase uncertainty for all facial expressions of emotion, except for fear when intense, and that disgust was systematically confused with anger (i.e., response bias). Furthermore, when faces were covered by the mask, all the emotions were perceived as less intense, and this was particularly evident for subtle expressions. Finally, we did not find any evidence of a relationship between these indices and alexithymic/autistic traits.
Affiliation(s)
- Alessia Verroca
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Filippo Gambarota
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Padova Neuroscience Center (PNC), University of Padova, Padova, Italy
- Paola Sessa
- Department of Developmental and Social Psychology, University of Padova, Padova, Italy
- Padova Neuroscience Center (PNC), University of Padova, Padova, Italy
8. Amadeo MB, Escelsior A, Amore M, Serafini G, Pereira da Silva B, Gori M. Face masks affect perception of happy faces in deaf people. Sci Rep 2022;12:12424. PMID: 35858937. PMCID: PMC9298172. DOI: 10.1038/s41598-022-16138-x.
Abstract
The SARS-CoV-2 pandemic has led to significant social repercussions and forced people to wear face masks. Recent research has demonstrated that the human ability to infer emotions from facial configurations is significantly reduced when face masks are worn. Since the mouth region is specifically crucial for deaf people who speak sign language, the current study assessed the impact of face masks on inferring emotional facial expressions in a population of adult deaf signers. A group of 34 congenitally deaf individuals and 34 normal-hearing individuals were asked to identify happiness, sadness, fear, anger, and neutral expression on static human pictures with and without facial masks presented through smartphones. For each emotion, the percentage of correct responses with and without face masks was calculated and compared between groups. Results indicated that face masks, such as those worn due to the SARS-CoV-2 pandemic, limit the ability of people to infer emotions from facial expressions. The negative impact of face masks is particularly pronounced when deaf people have to recognize low-intensity expressions of happiness. These findings are of essential importance because difficulties in recognizing emotions from facial expressions due to mask wearing may contribute to the communication challenges experienced by the deaf community during the SARS-CoV-2 pandemic, generating feelings of frustration and exclusion.
Affiliation(s)
- Maria Bianca Amadeo
- U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via Enrico Melen 83, 16152, Genoa, Italy; Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy
- Andrea Escelsior
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Mario Amore
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Gianluca Serafini
- Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Beatriz Pereira da Silva
- U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via Enrico Melen 83, 16152, Genoa, Italy; Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Monica Gori
- U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via Enrico Melen 83, 16152, Genoa, Italy; Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy
9. Muukkonen I, Kilpeläinen M, Turkkila R, Saarela T, Salmela V. Obligatory integration of face features in expression discrimination. Visual Cognition 2022. DOI: 10.1080/13506285.2022.2046222.
Affiliation(s)
- I. Muukkonen, M. Kilpeläinen, R. Turkkila, T. Saarela, V. Salmela
- Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
10. Effects of face masks on the appearance of emotional expressions and invariant characteristics. Open Psychology 2021. DOI: 10.1515/psych-2020-0113.
Abstract
Faces convey a lot of information about a person. However, the usage of face masks occludes important parts of the face. There is already evidence that face masks alter the processing of variable characteristics such as emotional expressions and the identity of a person. To investigate whether masks influenced the processing of facial information, we compared ratings of full faces and those covered by face masks. A total of 196 participants completed one of two parallel versions of the experiment. The data demonstrated varying effects of face masks on various characteristics. First, we showed that the perceived intensity of emotional expressions was reduced when the face was covered by face masks. This can be regarded as a conceptual replication and extension of the impairing effects of face masks on the recognition of emotional expressions. Next, by analyzing valence and arousal ratings, the data illustrated that emotional expressions were regressed toward neutrality for masked faces relative to no-masked faces. This effect was most pronounced for happy facial expressions, less for neutral expressions, and absent for sad expressions. The sex of masked faces was also less accurately identified. Finally, masked faces looked older and less attractive. Post hoc correlational analyses revealed correlation coefficient differences between no-masked and masked faces. The differences occurred in some characteristic pairs (e.g., Age and Attractiveness, Age and Trustworthiness) but not in others. This suggested that the ratings for some characteristics could be influenced by the presence of face masks. Similarly, the ratings of some characteristics could also be influenced by other characteristics, irrespective of face masks. We speculate that the amount of information available on a face could drive our perception of others during social communication. Future directions for research were discussed.
11. Lau WK. Face Masks Bolsters the Characteristics From Looking at a Face Even When Facial Expressions Are Impaired. Front Psychol 2021;12:704916. PMID: 34955943. PMCID: PMC8702500. DOI: 10.3389/fpsyg.2021.704916.
Abstract
Face masks impact social interactions because emotion recognition is difficult due to face occlusion. However, is this enough to conclude that face masks negatively impact social interactions? We investigated the impact of face masks on invariant characteristics (sex, age), trait-like characteristics (trustworthiness, attractiveness, and approachability), and emotional expressions (happiness and excitability). Participants completed an online survey and rated masked and no-masked faces. The same face remained masked or no-masked throughout the survey. Results revealed that, when compared to no-masked faces, masked happy faces appeared less happy. Face masks did not negatively impact the ratings of other characteristics. Participants were better at judging the sex of masked faces. Masked faces also appeared younger, more trustworthy, more attractive, and more approachable. Therefore, face masks did not always result in unfavorable ratings. Additional post hoc modeling revealed that trustworthiness and attractiveness ratings for masked faces predicted the same trait ratings for no-masked faces, whereas approachability ratings for no-masked faces predicted the same trait ratings for masked faces. This hinted that information from masked and no-masked faces, such as from the eyes and eye region, could aid in the understanding of others during social interaction. Future directions were proposed to expand the research.
Affiliation(s)
- Wee Kiat Lau
- Department of General Psychology, Ulm University, Ulm, Germany
12. Philipp MC, Bernstein MJ, Vanman EJ, Johnston L. Social exclusion enhances affiliative signaling. J Soc Psychol 2021;161:508-518. PMID: 33357078. DOI: 10.1080/00224545.2020.1854648.
Abstract
Reciprocating smiles is important for maintaining social bonds, as it both signals affiliative intent and elicits affiliative responses. Feelings of social exclusion may increase mimicry as a means to regulate affiliative bonds with others. In this study, we examined whether feelings of exclusion lead people to selectively reciprocate the facial expressions of more affiliative-looking people. Participants first wrote about either a time they were excluded or a neutral event. They then classified 20 smiles, half spontaneous and half posed. Facial electromyography recorded smile muscle activity. Excluded participants distinguished the two smile types better than controls. Excluded participants also showed greater zygomaticus major (mouth smiling) activity toward enjoyment smiles compared to posed smiles; control participants did not. Orbicularis oculi (eye crinkle) activity matched that of the smile type viewed but did not vary by exclusion condition. Affiliative social regulation is discussed as a possible explanation for these effects.
13. Ruba AL, Pollak SD. Children's emotion inferences from masked faces: Implications for social interactions during COVID-19. PLoS One 2020;15:e0243708. PMID: 33362251. PMCID: PMC7757816. DOI: 10.1371/journal.pone.0243708.
Abstract
To slow the progression of COVID-19, the Centers for Disease Control (CDC) and the World Health Organization (WHO) have recommended wearing face coverings. However, very little is known about how occluding parts of the face might impact the emotion inferences that children make during social interactions. The current study recruited a racially diverse sample of school-aged (7- to 13-years) children from publicly funded after-school programs. Children made inferences from facial configurations that were not covered, wearing sunglasses to occlude the eyes, or wearing surgical masks to occlude the mouth. Children were still able to make accurate inferences about emotions, even when parts of the faces were covered. These data suggest that while there may be some challenges for children incurred by others wearing masks, in combination with other contextual cues, masks are unlikely to dramatically impair children's social interactions in their everyday lives.
Affiliation(s)
- Ashley L. Ruba, Seth D. Pollak
- Department of Psychology and Waisman Center, University of Wisconsin–Madison, Madison, Wisconsin, United States of America
14. Kilpeläinen M, Salmela V. Perceived emotional expressions of composite faces. PLoS One 2020;15:e0230039. PMID: 32155204. PMCID: PMC7064203. DOI: 10.1371/journal.pone.0230039.
Abstract
The eye and mouth regions serve as the primary sources of facial information regarding an individual's emotional state. The aim of this study was to provide a comprehensive assessment of the relative importance of those two information sources in the identification of different emotions. The stimuli were composite facial images, in which different expressions (Neutral, Anger, Disgust, Fear, Happiness, Contempt, and Surprise) were presented in the eyes and the mouth. Participants (21 women, 11 men, mean age 25 years) rated the expressions of 7 congruent and 42 incongruent composite faces by clicking on a point within the valence-arousal emotion space. Eye movements were also monitored. With most incongruent composite images, the perceived emotion corresponded to the expression of either the eye region or the mouth region or an average of those. The happy expression was different. Happy eyes often shifted the perceived emotion towards a slightly negative point in the valence-arousal space, not towards the location associated with a congruent happy expression. The eye-tracking data revealed significant effects of congruency, expressions and interaction on total dwell time. Our data indicate that whether a face that combines features from two emotional expressions leads to a percept based on only one of the expressions (categorical perception) or integration of the two expressions (dimensional perception), or something altogether different, strongly depends upon the expressions involved.
Affiliation(s)
- Markku Kilpeläinen
- Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
- Viljami Salmela
- Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
15. Shimada K, Kasaba R, Yao A, Tomoda A. Less efficient detection of positive facial expressions in parents at risk of engaging in child physical abuse. BMC Psychol 2019;7:56. PMID: 31455426. PMCID: PMC6712715. DOI: 10.1186/s40359-019-0333-9.
Abstract
Background: Parental physical punishment (e.g., spanking) of children can gradually escalate into child physical abuse (CPA). According to social-information processing (SIP) models of aggressive behaviors, distorted social cognitive mechanisms can increase the risk of maladaptive parenting behaviors by changing how parents detect, recognize, and act on information from their social environments. In this study, we aimed to identify differences between mothers with a low and high risk of CPA regarding how quickly they detect positive facial expressions.
Methods: Based on their use of spanking to discipline children, 52 mothers were assigned to a low- (n = 39) or high-CPA-risk group (n = 13). A single-target facial emotional search (face-in-the-crowd) task was used, which required participants to search through an array of faces to determine whether a target emotional face was present in a crowd of non-target neutral faces. A search efficiency index was computed by subtracting the search time for target-present trials from that for target-absent trials.
Results: The high-CPA-risk group searched significantly less efficiently for the happy, but not sad, faces than did the low-CPA-risk group; meanwhile, self-reported emotional ratings (i.e., valence and arousal) of the faces did not differ between the groups.
Conclusions: Consistent with the SIP models, our findings suggest that low- and high-CPA-risk mothers differ in how rapidly they detect positive facial expressions, but not in how they explicitly evaluate them. On a CPA-risk continuum, less efficient detection of positive facial expressions in the initial processes of the SIP system may begin to occur in the physical-discipline stage and decrease the likelihood of positive interpersonal experiences, consequently leading to an increased risk of CPA.
Affiliation(s)
- Koji Shimada
- Research Center for Child Mental Development, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan; Biomedical Imaging Research Center, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan; Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan
- Ryoko Kasaba
- Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan
- Akiko Yao
- Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan
- Akemi Tomoda
- Research Center for Child Mental Development, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan; Division of Developmental Higher Brain Functions, United Graduate School of Child Development, University of Fukui, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan; Department of Child and Adolescent Psychological Medicine, University of Fukui Hospital, 23-3 Matsuoka-Shimoaizuki, Eiheiji-cho, Yoshida-gun, Fukui 910-1193, Japan
16. Craig BM, Nelson NL, Dixson BJW. Sexual Selection, Agonistic Signaling, and the Effect of Beards on Recognition of Men's Anger Displays. Psychol Sci 2019;30:728-738. DOI: 10.1177/0956797619834876.
Abstract
The beard is arguably one of the most obvious signals of masculinity in humans. Almost 150 years ago, Darwin suggested that beards evolved to communicate formidability to other males, but no studies have investigated whether beards enhance recognition of threatening expressions, such as anger. We found that the presence of a beard increased the speed and accuracy with which participants recognized displays of anger but not happiness (Experiment 1, N = 219). This effect was not due to negative evaluations shared by beardedness and anger or to negative stereotypes associated with beardedness, as beards did not facilitate recognition of another negative expression, sadness (Experiment 2, N = 90), and beards increased the rated prosociality of happy faces in addition to the rated masculinity and aggressiveness of angry faces (Experiment 3, N = 445). A computer-based emotion classifier reproduced the influence of beards on emotion recognition (Experiment 4). The results suggest that beards may alter perceived facial structure, facilitating rapid judgments of anger in ways that conform to evolutionary theory.
Affiliation(s)
- Belinda M. Craig
- School of Psychology, Curtin University
- School of Psychology, University of New England
17. Discrimination between smiling faces: Human observers vs. automated face analysis. Acta Psychol (Amst) 2018;187:19-29. PMID: 29758397. DOI: 10.1016/j.actpsy.2018.04.019.
Abstract
This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes).
18. Krumhuber EG, Scherer KR. The Look of Fear from the Eyes Varies with the Dynamic Sequence of Facial Actions. Swiss Journal of Psychology 2016. DOI: 10.1024/1421-0185/a000166.
Abstract
Most research on the ability to interpret expressions from the eyes has utilized static information. This research investigates whether the dynamic sequence of facial actions in the eye region influences the judgments of perceivers. Dynamic fear expressions involving the eye region and eyebrows were created which systematically differed in the sequential occurrence of facial actions. Participants rated the intensity of sequential fear expressions, either in addition to a simultaneous, full-blown expression (Experiment 1) or in combination with different levels of eye gaze (Experiment 2). The results showed that the degree of attributed emotion and the appraisal ratings differed as a function of the sequence of facial expressions of fear, with direct gaze resulting in stronger subjective responses. The findings challenge current notions surrounding the study of static facial displays from the eyes and suggest that emotion perception is a dynamic process shaped by the time course of the facial actions of an expression. Possible implications for the field of affective computing and clinical research are discussed.
Affiliation(s)
- Eva G. Krumhuber
- Department of Experimental Psychology, University College London, UK
- Klaus R. Scherer
- Swiss Center for Affective Sciences, University of Geneva, Switzerland
19. Gutiérrez-García A, Calvo MG. Discrimination thresholds for smiles in genuine versus blended facial expressions. Cogent Psychology 2015. DOI: 10.1080/23311908.2015.1064586.
Affiliation(s)
- Manuel G. Calvo
- Department of Cognitive Psychology, University of La Laguna, Tenerife 38205, Spain
20. Calvo MG, Nummenmaa L. Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cogn Emot 2015. PMID: 26212348. DOI: 10.1080/02699931.2015.1049124.
Abstract
Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.
Affiliation(s)
- Manuel G Calvo
- Department of Cognitive Psychology, University of La Laguna, Tenerife, Spain
- Lauri Nummenmaa
- School of Science, Aalto University, Espoo, Finland
- Department of Psychology and Turku PET Centre, University of Turku, Turku, Finland
21. Whitaker L, Jones CRG, Wilkins AJ, Roberson D. Judging the Intensity of Emotional Expression in Faces: the Effects of Colored Tints on Individuals With Autism Spectrum Disorder. Autism Res 2015;9:450-459. PMID: 26058998. DOI: 10.1002/aur.1506.
Abstract
Individuals with autism spectrum disorder (ASD) often show atypical processing of facial expressions, which may result from visual stress. In the current study, children with ASD and matched controls judged which member of a pair of faces displayed the more intense emotion. Both faces showed anger, disgust, fear, happiness, sadness or surprise but to different degrees. Faces were presented on a monitor that was tinted either gray or with a color previously selected by the participant individually as improving the clarity of text. Judgments of emotional intensity improved significantly with the addition of the preferred colored tint in the ASD group but not in controls, a result consistent with a link between visual stress and impairments in processing facial expressions in individuals with ASD.
Affiliation(s)
- Lydia Whitaker
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester, United Kingdom
- Catherine R G Jones
- School of Psychology, Cardiff University, Tower Building, Park Place, Cardiff, United Kingdom
- Arnold J Wilkins
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester, United Kingdom
- Debi Roberson
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester, United Kingdom
22. Prazak ER, Burgund ED. Keeping it real: Recognizing expressions in real compared to schematic faces. Visual Cognition 2014. DOI: 10.1080/13506285.2014.914991.
23. Calvo MG, Fernández-Martín A, Nummenmaa L. A smile biases the recognition of eye expressions: Configural projection from a salient mouth. Q J Exp Psychol (Hove) 2013;66:1159-1181. DOI: 10.1080/17470218.2012.732586.
Abstract
A smile is visually highly salient and grabs attention automatically. We investigated how extrafoveally seen smiles influence the viewers' perception of non-happy eyes in a face. A smiling mouth appeared in composite faces with incongruent non-happy (fearful, neutral, etc.) eyes, thus producing blended expressions, or it appeared in intact faces with genuine expressions. Attention to the eye region was spatially cued while foveal vision of the mouth was blocked by gaze-contingent masking. Participants judged whether the eyes were happy or not. Results indicated that the smile biased the evaluation of the eye expression: The same non-happy eyes were more likely to be judged as happy and categorized more slowly as not happy in a face with a smiling mouth than in a face with a non-smiling mouth or with no mouth. This bias occurred when the mouth and the eyes appeared simultaneously and aligned, but also to some extent when they were misaligned and when the mouth appeared after the eyes. We conclude that the highly salient smile projects to other facial regions, thus influencing the perception of the eye expression. Projection serves spatial and temporal integration of face parts and changes.
Affiliation(s)
- Manuel G. Calvo
- Department of Cognitive Psychology, University of La Laguna, Tenerife, Spain
- Lauri Nummenmaa
- Department of Biomedical Engineering and Computational Science, and Brain Research Unit, O.V. Lounasmaa Laboratory, School of Science, Aalto University, Espoo, Finland
- Turku PET Centre, Turku, Finland
24. Gutiérrez-García A, Calvo MG. Social anxiety and interpretation of ambiguous smiles. Anxiety Stress Coping 2013;27:74-89. DOI: 10.1080/10615806.2013.794941.