1
Borten JBL, Barros MCM, Silva ES, Carlini LP, Balda RCX, Orsi RN, Heiderich TM, Sanudo A, Thomaz CE, Guinsburg R. Looking through Providers' Eyes: Pain in the Neonatal Intensive Care Unit. Am J Perinatol 2024;41:e3242-e3248. [PMID: 37973154] [DOI: 10.1055/a-2212-0578]
Abstract
OBJECTIVE Evaluating the pain of critically ill newborns is a challenge because of the devices used for cardiorespiratory support. This study aimed to characterize adults' gaze when assessing critically ill neonates' pain at the bedside. STUDY DESIGN Cross-sectional study in which pediatricians, nursing technicians, and parents, wearing eye-tracking glasses, evaluated critically ill neonates' pain at the bedside for 20 seconds. At the end, they answered whether or not the neonate was in pain. The visual tracking outcomes, namely the number and duration of visual fixations in four areas of interest (AOI: face, trunk, and upper [UL] and lower [LL] limbs), were compared between groups and according to pain perception (present/absent). RESULTS A total of 62 adults (21 pediatricians, 23 nursing technicians, 18 parents) evaluated 27 neonates (gestational age: 31.8 ± 4.4 weeks; birth weight: 1,645 ± 1,234 g). More adults fixed their gaze on the face (96.8%) and trunk (96.8%) than on the UL (74.2%) and LL (66.1%). Parents performed a greater number of fixations on the trunk than pediatricians and nursing technicians (11.0 vs. 5.5 vs. 6.0; p = 0.023). Controlling for visual tracking variables, each second of eye fixation in the AOI (1.21; 95% confidence interval [CI]: 1.03-1.42; p = 0.018) and on the UL (1.07; 95% CI: 1.03-1.10; p < 0.001) increased the chance of perceiving the presence of pain. CONCLUSION When assessing critically ill newborns' pain at the bedside, adults fixed their eyes mainly on the face and trunk. The time spent looking at the UL was associated with the perception of pain. KEY POINTS · Pain assessment in critically ill newborns is a challenge. · To assess critically ill neonates' pain, adults look mainly at the face and trunk. · Looking at the upper limbs also helps in assessing critically ill neonates' pain.
Affiliation(s)
- Julia B L Borten
- Division of Neonatal Medicine, Department of Pediatrics at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
- Marina C M Barros
- Division of Neonatal Medicine, Department of Pediatrics at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
- Erica S Silva
- Division of Neonatal Medicine, Department of Pediatrics at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
- Lucas P Carlini
- Image Processing Laboratory, Department of Electrical Engineering, Centro Universitario FEI, Sao Bernardo do Campo, São Paulo, Brazil
- Rita C X Balda
- Division of Neonatal Medicine, Department of Pediatrics at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
- Rafael N Orsi
- Epidemiology and Biostatistics, Department of Preventive Medicine at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
- Tatiany M Heiderich
- Image Processing Laboratory, Department of Electrical Engineering, Centro Universitario FEI, Sao Bernardo do Campo, São Paulo, Brazil
- Adriana Sanudo
- Epidemiology and Biostatistics, Department of Preventive Medicine at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
- Carlos E Thomaz
- Image Processing Laboratory, Department of Electrical Engineering, Centro Universitario FEI, Sao Bernardo do Campo, São Paulo, Brazil
- Ruth Guinsburg
- Division of Neonatal Medicine, Department of Pediatrics at Escola Paulista de Medicina, Universidade Federal de São Paulo, São Paulo, Brazil
2
Ghandchi A, Golbabaei S, Borhani K. Effects of two different social exclusion paradigms on ambiguous facial emotion recognition. Cogn Emot 2024;38:296-314. [PMID: 38678446] [DOI: 10.1080/02699931.2023.2285862]
Abstract
Social exclusion is an emotionally painful experience that leads to various alterations in socio-emotional processing. The perceptual and emotional consequences of experiencing social exclusion can vary depending on the paradigm used to manipulate it. Exclusion paradigms differ in the severity and duration of the exclusion experience they induce, which can accordingly be classified as either short-term or long-term. The present study examined the impact of exclusion on socio-emotional processing using paradigms that induced the experience of short-term exclusion or the imagination of long-term exclusion, with ambiguous facial emotions as socio-emotional cues. In study 1, the Ostracism Online paradigm was used to manipulate short-term exclusion. In study 2, a new sample of participants imagined long-term exclusion through the future life alone paradigm. Participants in both studies then completed a facial emotion recognition task consisting of morphed ambiguous facial emotions. Point of Subjective Equivalence analyses indicated that the experience of short-term exclusion hinders recognising happy facial expressions, whereas imagining long-term exclusion causes difficulties in recognising sad facial expressions. These findings extend the current literature, suggesting that not all social exclusion paradigms affect socio-emotional processing similarly.
Affiliation(s)
- Arezoo Ghandchi
- Institute for Cognitive and Brain Sciences, Shahid Beheshti University, Tehran, Iran
- Soroosh Golbabaei
- Institute for Cognitive and Brain Sciences, Shahid Beheshti University, Tehran, Iran
- Khatereh Borhani
- Institute for Cognitive and Brain Sciences, Shahid Beheshti University, Tehran, Iran
3
Thomas PJN, Caharel S. Do masks cover more than just a face? A study on how facemasks affect the perception of emotional expressions according to their degree of intensity. Perception 2024;53:3-16. [PMID: 37709269] [DOI: 10.1177/03010066231201230]
Abstract
Emotional facial expressions convey crucial information in nonverbal communication and mediate face-to-face relationships. Their recognition is thought to rely on specific facial traits depending on the perceived emotion. During the COVID-19 pandemic, wearing a facemask has thus disrupted the human ability to read emotions from faces. Yet these effects are usually assessed using faces expressing stereotypical and exaggerated emotions, which is far removed from real-life conditions. The objective of the present study was to evaluate the impact of facemasks through an emotion categorization task using morphs ranging between a neutral face and an expressive face (anger, disgust, fear, happiness, and sadness), from 0% neutral to 100% expressive in 20% steps. Our results revealed a strong impact of facemasks on the recognition of expressions of disgust, happiness, and sadness, resulting in a decrease in performance and an increase in misinterpretations at both low and high levels of intensity. In contrast, the recognition of anger and fear, as well as of the neutral expression, was less impacted by mask-wearing. Future studies should address this issue from a more ecological point of view, with the aim of taking concrete adaptive measures in the context of daily interactions.
4
Silva ES, Barros MCDM, Borten JBL, Carlini LP, Balda RDCX, Orsi RN, Heiderich TM, Thomaz CE, Guinsburg R. Pediatricians' focus of sight at pain assessment during a neonatal heel puncture. Rev Paul Pediatr 2023;42:e2023089. [PMID: 38088681] [PMCID: PMC10712942] [DOI: 10.1590/1984-0462/2024/42/2023089]
Abstract
OBJECTIVE To evaluate the focus of pediatricians' gaze during the heel prick of neonates. METHODS Prospective study in which pediatricians wearing eye-tracker glasses evaluated neonatal pain before and after a heel prick. Pediatricians scored the pain they perceived in the neonate on a verbal numerical analogue scale (0 = no pain; 10 = maximum pain). The outcomes measured were the number and duration of visual fixations on the upper face, lower face, and hands in two 10-second periods, before (pre) and after (post) the puncture. These outcomes were compared between the periods and according to pediatricians' pain perception: absent/mild (score 0-5) and moderate/intense (score 6-10). RESULTS Twenty-four pediatricians (31 years old; 92% female) evaluated 24 neonates. The median score attributed to neonatal pain during the heel prick was 7.0 (interquartile range: 5-8). Compared with the pre-puncture period, in the post-puncture period more pediatricians fixed their gaze on the lower face (63 vs. 92%; p = 0.036) and the number of visual fixations on the lower face was greater (2.0 vs. 5.0; p = 0.018). There was no difference in the number and duration of visual fixations according to the intensity of perceived pain. CONCLUSIONS At the bedside, pediatricians change their focus of attention on the neonatal face after a painful procedure, focusing mainly on the lower part of the face.
Affiliation(s)
- Erica Souza Silva
- Universidade Federal de São Paulo, Escola Paulista de Medicina, Departamento de Pediatria, Disciplina de Pediatria Neonatal – São Paulo, SP, Brasil
- Marina Carvalho de Moraes Barros
- Universidade Federal de São Paulo, Escola Paulista de Medicina, Departamento de Pediatria, Disciplina de Pediatria Neonatal – São Paulo, SP, Brasil
- Julia Baptista Lopes Borten
- Universidade Federal de São Paulo, Escola Paulista de Medicina, Departamento de Pediatria, Disciplina de Pediatria Neonatal – São Paulo, SP, Brasil
- Lucas Pereira Carlini
- Centro Universitario FEI, Departamento de Engenharia Elétrica, Laboratório de Processamento de Imagens – São Bernardo do Campo, SP, Brasil
- Rita de Cássia Xavier Balda
- Universidade Federal de São Paulo, Escola Paulista de Medicina, Departamento de Pediatria, Disciplina de Pediatria Neonatal – São Paulo, SP, Brasil
- Rafael Nobre Orsi
- Centro Universitario FEI, Departamento de Engenharia Elétrica, Laboratório de Processamento de Imagens – São Bernardo do Campo, SP, Brasil
- Tatiany Marcondes Heiderich
- Centro Universitario FEI, Departamento de Engenharia Elétrica, Laboratório de Processamento de Imagens – São Bernardo do Campo, SP, Brasil
- Carlos Eduardo Thomaz
- Centro Universitario FEI, Departamento de Engenharia Elétrica, Laboratório de Processamento de Imagens – São Bernardo do Campo, SP, Brasil
- Ruth Guinsburg
- Universidade Federal de São Paulo, Escola Paulista de Medicina, Departamento de Pediatria, Disciplina de Pediatria Neonatal – São Paulo, SP, Brasil
5
Franca M, Bolognini N, Brysbaert M. Seeing emotions in the eyes: a validated test to study individual differences in the perception of basic emotions. Cogn Res Princ Implic 2023;8:67. [PMID: 37919608] [PMCID: PMC10622392] [DOI: 10.1186/s41235-023-00521-x]
Abstract
People are able to perceive emotions in the eyes of others and can therefore see emotions when individuals wear face masks. Research has been hampered by the lack of a good test to measure the perception of basic emotions in the eyes. In two studies with 358 and 200 participants, respectively, we developed a test of the ability to see anger, disgust, fear, happiness, sadness, and surprise in images of eyes. Each emotion is measured with 8 stimuli (4 male actors and 4 female actors), matched in terms of difficulty and item discrimination. Participants reliably differed in their performance on the Seeing Emotions in the Eyes test (SEE-48). The test correlated well not only with the Reading the Mind in the Eyes Test (RMET) but also with the Situational Test of Emotion Understanding (STEU), indicating that the SEE-48 measures not only low-level perceptual skills but also broader skills of emotion perception and emotional intelligence. The test is freely available for research and clinical purposes.
Affiliation(s)
- Maria Franca
- Ph.D. Program in Neuroscience, School of Medicine and Surgery, University of Milano-Bicocca, Monza, Italy
- Nadia Bolognini
- Department of Psychology and NeuroMI - Milan Centre for Neuroscience, University of Milano-Bicocca, Milan, Italy
- Laboratory of Neuropsychology, Department of Neurorehabilitation Sciences, IRCCS Istituto Auxologico Italiano, Via Mercalli 32, 20122 Milan, Italy
- Marc Brysbaert
- Department of Experimental Psychology, Ghent University, H. Dunantlaan 2, 9000 Ghent, Belgium
6
Fujihara Y, Guo K, Liu CH. Relationship between types of anxiety and the ability to recognize facial expressions. Acta Psychol (Amst) 2023;241:104100. [PMID: 38041913] [DOI: 10.1016/j.actpsy.2023.104100]
Abstract
This study examined whether three subtypes of anxiety (trait anxiety, state anxiety, and social anxiety) have different effects on the recognition of facial expressions. One hundred and thirty-eight participants matched facial expressions of three intensity levels (20%, 40%, 100%) with one of six emotion labels ("happy", "sad", "fear", "angry", "disgust", and "surprise"). Using a conventional method of analysis, we replicated some significant correlations between each anxiety type and recognition performance reported in the literature. However, when we used partial correlation to isolate the effect of each anxiety type, most of these correlations were no longer significant, apart from the negative correlation between the Beck Anxiety Inventory and reaction time to fearful faces displayed at the 40% intensity level, and the correlations between anxiety and categorisation errors. Specifically, social anxiety was positively correlated with misidentifying a happy face as a disgusted face at the 40% intensity level, and state anxiety was negatively correlated with misidentifying a happy face as a sad face at the 20% intensity level. However, these partial correlations became non-significant after p-value adjustment for multiple comparisons. Our eye-tracking data also showed that state anxiety may be associated with reduced fixations on the eye regions of low-intensity sad or fearful faces. These analyses cast doubt on some effects reported in previous studies, because they are likely to reflect a mixture of influences from highly correlated anxiety subtypes.
Affiliation(s)
- Yuya Fujihara
- Department of Psychology, Yasuda Women's University, Japan
- Kun Guo
- School of Psychology, University of Lincoln, Brayford Pool, Lincoln, Lincolnshire LN6 7TS, United Kingdom
- Chang Hong Liu
- Department of Psychology, Bournemouth University, United Kingdom
7
Itier RJ, Durston AJ. Mass-univariate analysis of scalp ERPs reveals large effects of gaze fixation location during face processing that only weakly interact with face emotional expression. Sci Rep 2023;13:17022. [PMID: 37813928] [PMCID: PMC10562468] [DOI: 10.1038/s41598-023-44355-5]
Abstract
Decoding others' facial expressions is critical for social functioning. To clarify the neural correlates of expression perception depending on where we look on the face, three combined gaze-contingent ERP experiments were analyzed using robust mass-univariate statistics. Regardless of task, fixation location impacted face processing from 50 to 350 ms, maximally around 120 ms, reflecting retinotopic mapping around the C2 and P1 components. Fixation location also had a major impact on the N170-P2 interval, while weak effects were seen at the face-sensitive N170 peak. These results question the widespread assumption that faces are processed holistically into an indecomposable perceptual whole around the N170. Rather, face processing is a complex and view-dependent process that continues well beyond the N170. Expression and fixation location interacted weakly during the P1-N170 interval, supporting a role for the mouth and left eye in decoding fearful and happy expressions. Expression effects were weakest at the N170 peak but strongest around P2, especially for fear, reflecting task-independent affective processing. The results suggest that the N170 reflects a transition between processes rather than the maximum of a holistic face-processing stage. Focus on this peak should be replaced by data-driven analyses of the whole epoch using robust statistics to fully unravel the early visual processing of faces and their affective content.
Affiliation(s)
- Roxane J Itier
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON N2L 3G1, Canada
- Amie J Durston
- Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo, ON N2L 3G1, Canada
8
Mulder MJ, Prummer F, Terburg D, Kenemans JL. Drift-diffusion modeling reveals that masked faces are preconceived as unfriendly. Sci Rep 2023;13:16982. [PMID: 37813970] [PMCID: PMC10562405] [DOI: 10.1038/s41598-023-44162-y]
Abstract
During the COVID-19 pandemic, the use of face masks has become a daily routine. Studies have shown that face masks increase the ambiguity of facial expressions which not only affects (the development of) emotion recognition, but also interferes with social interaction and judgement. To disambiguate facial expressions, we rely on perceptual (stimulus-driven) as well as preconceptual (top-down) processes. However, it is unknown which of these two mechanisms accounts for the misinterpretation of masked expressions. To investigate this, we asked participants (N = 136) to decide whether ambiguous (morphed) facial expressions, with or without a mask, were perceived as friendly or unfriendly. To test for the independent effects of perceptual and preconceptual biases we fitted a drift-diffusion model (DDM) to the behavioral data of each participant. Results show that face masks induce a clear loss of information leading to a slight perceptual bias towards friendly choices, but also a clear preconceptual bias towards unfriendly choices for masked faces. These results suggest that, although face masks can increase the perceptual friendliness of faces, people have the prior preconception to interpret masked faces as unfriendly.
Affiliation(s)
- Martijn J Mulder
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- Franziska Prummer
- School of Computing and Communications, Lancaster University, Lancaster, UK
- David Terburg
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
- J Leon Kenemans
- Department of Experimental Psychology, Helmholtz Institute, Utrecht University, Utrecht, The Netherlands
9
Vicente-Querol MA, Fernández-Caballero A, González P, González-Gualda LM, Fernández-Sotos P, Molina JP, García AS. Effect of Action Units, Viewpoint and Immersion on Emotion Recognition Using Dynamic Virtual Faces. Int J Neural Syst 2023;33:2350053. [PMID: 37746831] [DOI: 10.1142/s0129065723500533]
Abstract
Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the unique contributions of different factors in the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), VHs with facial animations, were used to represent the six basic emotions and the neutral expression. The results highlight the important role of the accurate implementation of AUs in virtual faces for emotion recognition. Furthermore, it is observed that frontal views outperform mid-profile views in both test conditions, while immersive VR shows a slight improvement in emotion recognition. This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.
Affiliation(s)
- Miguel A Vicente-Querol
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Antonio Fernández-Caballero
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III, Madrid 28029, Spain
- Pascual González
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III, Madrid 28029, Spain
- Luz M González-Gualda
- Servicio de Salud Mental, Complejo Hospitalario Universitario de Albacete, Albacete 02004, Spain
- Patricia Fernández-Sotos
- Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III, Madrid 28029, Spain
- Servicio de Salud Mental, Complejo Hospitalario Universitario de Albacete, Albacete 02004, Spain
- José P Molina
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Arturo S García
- Instituto de Investigación en Informática, Universidad de Castilla-La Mancha, Albacete 02071, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, Albacete 02071, Spain
10
Sun J, Dong T, Liu P. Holistic processing and visual characteristics of regulated and spontaneous expressions. J Vis 2023;23:6. [PMID: 36912592] [PMCID: PMC10019490] [DOI: 10.1167/jov.23.3.6]
Abstract
The rapid and efficient recognition of facial expressions is crucial for adaptive behaviors, and holistic processing is one of the critical processing mechanisms supporting this adaptation. This study therefore examined how the authenticity of facial expressions affects holistic processing and its attentional characteristics. The results show that both regulated and spontaneous expressions were processed holistically. However, spontaneous expressions did not show the typical pattern of holistic processing in detail: the congruency effect was observed equally in the aligned and misaligned conditions. No significant difference between the two types of expression was observed in reaction times or eye-movement characteristics (i.e., total fixation duration, fixation counts, and first fixation duration). These findings suggest that holistic processing strategies differ between the two types of expression, although the difference was not reflected in attentional engagement.
Affiliation(s)
- Juncai Sun
- School of Psychology, Qufu Normal University, Qufu, China
- Tiantian Dong
- Department of Psychology, Shanghai Normal University, Shanghai, China
- Ping Liu
- Department of Psychology, Shaoxing University, Shaoxing, China
11
Abstract
The judgment of female body appearance has been reported to be affected by a range of internal (e.g., viewers' sexual cognition) and external factors (e.g., viewed clothing type and colour). This eye-tracking study aimed to complement previous research by examining the effect of facial expression on female body perception and associated body-viewing gaze behaviour. We presented female body images of Caucasian avatars in a continuum of common dress sizes posing seven basic facial expressions (neutral, happiness, sadness, anger, fear, surprise, and disgust), and asked both male and female participants to rate the perceived body attractiveness and body size. The analysis revealed an evident modulatory role of avatar facial expressions on body attractiveness and body size ratings, but not on the amount of viewing time directed at individual body features. Specifically, happy and angry avatars attracted the highest and lowest body attractiveness ratings, respectively, and fearful and surprised avatars tended to be rated slimmer. Interestingly, the impact of facial expression on female body assessment was not further influenced by viewers' gender, suggesting a 'universal' role of common facial expressions in modifying the perception of female body appearance.
Affiliation(s)
- Kun Guo
- School of Psychology, University of Lincoln, Lincoln, LN6 7TS, UK
12
Shepherd JL, Rippon D. The impact of briefly observing faces in opaque facial masks on emotion recognition and empathic concern. Q J Exp Psychol (Hove) 2023;76:404-418. [PMID: 35319298] [PMCID: PMC9896299] [DOI: 10.1177/17470218221092590]
Abstract
Since the outbreak of SARS-CoV-2 in 2019, global public health initiatives have advocated the community use of face masks to reduce the spread of the virus. Although the community use of facial coverings has been deemed essential for public health, there have been calls for enquiries to ascertain how face masks may impact non-verbal methods of communication. This study aimed to ascertain how briefly observing faces in opaque facial coverings affects facial emotion recognition, and whether levels of empathic concern are associated with facial emotion recognition when viewing masked faces. An opportunity sample of 199 participants, who resided in the United Kingdom, were randomly assigned to briefly observe either masked (n = 102) or unmasked (n = 97) faces. Participants in both conditions viewed a series of facial expressions from the Radboud Faces Database, with models conveying the emotional states of anger, disgust, fear, happiness, sadness, and surprise. Each face was presented for 250 ms in both the masked and unmasked conditions. A 6 (emotion type) × 2 (masked/unmasked condition) mixed ANOVA revealed that viewing masked faces significantly reduced recognition of disgust, fear, happiness, sadness, and surprise, whereas there was no difference between the masked and unmasked conditions in the success rate of recognising anger. Furthermore, higher levels of empathic concern were associated with greater success in recognising disgust. The results suggest that significant reductions in emotion recognition when viewing faces in opaque masks can be observed even when people are exposed to facial stimuli for a brief period of time.
Affiliation(s)
- Daniel Rippon
- Faculty of Health and Life Sciences, Northumbria University, Northumberland Building, Newcastle Upon Tyne NE1 8ST, UK
13
Akiyama T, Matsumoto K, Osaka K, Tanioka R, Betriana F, Zhao Y, Kai Y, Miyagawa M, Yasuhara Y, Ito H, Soriano G, Tanioka T. Comparison of Subjective Facial Emotion Recognition and "Facial Emotion Recognition Based on Multi-Task Cascaded Convolutional Network Face Detection" between Patients with Schizophrenia and Healthy Participants. Healthcare (Basel) 2022;10:2363. [PMID: 36553887] [PMCID: PMC9777528] [DOI: 10.3390/healthcare10122363]
Abstract
Patients with schizophrenia may exhibit a flat affect and poor facial expressions. This study aimed to compare subjective facial emotion recognition (FER) and FER based on multi-task cascaded convolutional network (MTCNN) face detection in 31 patients with schizophrenia (patient group) and 40 healthy participants (healthy participant group). A Pepper robot was used to converse with the 71 participants, and these conversations were recorded on video. Subjective FER (assigned by medical experts based on the video recordings) and FER based on MTCNN face detection were used to characterize facial expressions during the conversations. This study confirmed the discriminant accuracy of FER based on MTCNN face detection. Analysis of the smiles of healthy participants revealed that subjective FER (by six examiners) and FER based on MTCNN face detection concurred (κ = 0.63). The perfect agreement rates between subjective FER (by three medical experts) and FER based on MTCNN face detection in the patient and healthy participant groups were compared using Fisher's exact probability test, and no significant difference was observed (p = 0.72). Validity and reliability were assessed by comparing subjective FER and FER based on MTCNN face detection. The reliability coefficient of FER based on MTCNN face detection was low for both the patient and healthy participant groups.
Affiliation(s)
- Toshiya Akiyama, Graduate School of Health Sciences, Tokushima University, Tokushima 770-8509, Japan
- Kazuyuki Matsumoto, Graduate School of Engineering, Tokushima University, Tokushima 770-8506, Japan
- Kyoko Osaka, Department of Psychiatric Nursing, Nursing Course of Kochi Medical School, Kochi University, Kochi 783-8505, Japan
- Ryuichi Tanioka, Department of Physical Therapy, Hiroshima Cosmopolitan University, Hiroshima 734-0014, Japan
- Yueren Zhao, Department of Psychiatry, Fujita Health University, Nagoya 470-1192, Japan
- Yoshihiro Kai, Department of Mechanical Engineering, Tokai University, Tokyo 151-8677, Japan
- Misao Miyagawa, Department of Nursing, Faculty of Health and Welfare, Tokushima Bunri University, Tokushima 770-8514, Japan
- Yuko Yasuhara, Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
- Hirokazu Ito, Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
- Gil Soriano, Department of Nursing, College of Allied Health, National University Philippines, Manila 1008, Philippines
- Tetsuya Tanioka, Institute of Biomedical Sciences, Tokushima University, Tokushima 770-8509, Japan
14
Rabadan V, Ricou C, Latinus M, Aguillon-Hernandez N, Wardak C. Facial mask disturbs ocular exploration but not pupil reactivity. Front Neurosci 2022; 16:1033243. [DOI: 10.3389/fnins.2022.1033243]
Abstract
Introduction: The COVID-19 pandemic imposed the wearing of a face mask, which may have negative consequences for social interactions despite its health benefits. Many recent studies have focused on emotion recognition of masked faces, as the mouth is, together with the eyes, essential for conveying emotional content. However, none have studied neurobehavioral and neurophysiological markers of the perception of masked faces, such as ocular exploration and pupil reactivity. The purpose of this eye-tracking study was to quantify how wearing a facial accessory, and in particular a face mask, affected the ocular and pupillary response to a face, emotional or not. Methods: We used videos of actors wearing a facial accessory to characterize visual exploration and pupillary response in several occlusion conditions (no accessory, sunglasses, scarf, and mask) and emotional conditions (neutral, happy, and sad) in a population of 44 adults. Results: We showed that ocular exploration differed for faces covered with an accessory, in particular a mask, compared with the classical visual scanning pattern of an uncovered face. The covered areas of the face were explored less. Pupil reactivity seemed only slightly affected by the mask, while its sensitivity to emotions was observed even in the presence of a facial accessory. Discussion: These results suggest a mixed impact of the mask on attentional capture and physiological adjustment, which does not seem reconcilable with its strong effect on behavioral emotion recognition described previously.
15
Amadeo MB, Escelsior A, Amore M, Serafini G, Pereira da Silva B, Gori M. Face masks affect perception of happy faces in deaf people. Sci Rep 2022; 12:12424. [PMID: 35858937 PMCID: PMC9298172 DOI: 10.1038/s41598-022-16138-x]
Abstract
The SARS-CoV-2 pandemic has led to significant social repercussions and forced people to wear face masks. Recent research has demonstrated that the human ability to infer emotions from facial configurations is significantly reduced when face masks are worn. Since the mouth region is particularly crucial for deaf people who use sign language, the current study assessed the impact of face masks on inferring emotional facial expressions in a population of adult deaf signers. A group of 34 congenitally deaf individuals and 34 normal-hearing individuals were asked to identify happiness, sadness, fear, anger, and neutral expressions on static human pictures, with and without facial masks, presented through smartphones. For each emotion, the percentage of correct responses with and without face masks was calculated and compared between groups. Results indicated that face masks, such as those worn during the SARS-CoV-2 pandemic, limit the ability of people to infer emotions from facial expressions. The negative impact of face masks is significantly more pronounced when deaf people have to recognize low-intensity expressions of happiness. These findings are of essential importance because difficulties in recognizing emotions from facial expressions due to mask wearing may contribute to the communication challenges experienced by the deaf community during the SARS-CoV-2 pandemic, generating feelings of frustration and exclusion.
Affiliation(s)
- Maria Bianca Amadeo, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Via Enrico Melen 83, 16152, Genoa, Italy; Applied Neurosciences for Technological Advances in Rehabilitation Systems (ANTARES) Joint Lab, Clinica Psichiatrica Ed SPDC, Largo Rosanna Benzi 10, 16132, Genoa, Italy
- Andrea Escelsior, ANTARES Joint Lab, Genoa, Italy; Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Mario Amore, ANTARES Joint Lab, Genoa, Italy; DINOGMI, Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Gianluca Serafini, ANTARES Joint Lab, Genoa, Italy; DINOGMI, Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Beatriz Pereira da Silva, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy; ANTARES Joint Lab, Genoa, Italy; DINOGMI, Section of Psychiatry, University of Genoa, Genoa, Italy; IRCCS Ospedale Policlinico San Martino, Genoa, Italy
- Monica Gori, U-VIP Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy; ANTARES Joint Lab, Genoa, Italy
16
Soares JDCA, Barros MCDM, da Silva GVT, Carlini LP, Heiderich TM, Orsi RN, Balda RDCX, Silva PASO, Thomaz CE, Guinsburg R. Looking at neonatal facial features of pain: do health and non-health professionals differ? J Pediatr (Rio J) 2022; 98:406-412. [PMID: 34914897 PMCID: PMC9432145 DOI: 10.1016/j.jped.2021.10.006]
Abstract
OBJECTIVE To analyze the regions that trigger the attention of adults' gaze when assessing pain in newborn infants' pictures and to verify whether there are differences between health and non-health professionals. METHOD Experimental study with 84 health professionals and 59 non-health professionals, who evaluated two images of each of 10 neonates, one at rest and the other during a painful procedure. Each image was shown for 7 seconds on a computer screen, while eye movements were tracked by the Tobii TX300 EyeTracker. After evaluating each image, participants gave a score from 0 (absent pain) to 10 (maximum pain), according to their perception of neonatal pain. For each image, the number and total time of gaze fixations on the forehead, eyes, nasolabial furrow, and mouth were studied. Comparisons between the two groups of adults were made with the intraclass correlation coefficient, Student's t-test, and Bland-Altman plots. RESULTS Health professionals (93% female; 34 ± 9 years old), compared to non-health professionals (64% female; 35 ± 11 years old), gave lower scores for images at rest (0.81 ± 0.50 vs. 1.59 ± 0.76; p = 0.010), with no difference for those obtained during the painful procedure (6.98 ± 1.08 vs. 6.73 ± 0.82). There was a strong or almost perfect correlation between groups for the number of fixations on the mouth, eyes, and forehead, and for the total fixation time on the eyes and forehead. CONCLUSIONS Adults, irrespective of their profession, showed a homogeneous gaze pattern when evaluating pictures of neonates at rest or during a painful procedure.
Affiliation(s)
- Juliana do Carmo Azevedo Soares, Universidade Federal de São Paulo, Escola Paulista de Medicina, Department of Pediatrics, Division of Neonatal Medicine, São Paulo, SP, Brazil
- Marina Carvalho de Moraes Barros, Universidade Federal de São Paulo, Escola Paulista de Medicina, Department of Pediatrics, Division of Neonatal Medicine, São Paulo, SP, Brazil
- Giselle Valério Teixeira da Silva, Universidade Federal de São Paulo, Escola Paulista de Medicina, Department of Pediatrics, Division of Neonatal Medicine, São Paulo, SP, Brazil
- Lucas Pereira Carlini, Centro Universitario FEI, Department of Electrical Engineering, Image Processing Laboratory, São Bernardo do Campo, SP, Brazil
- Tatiany Marcondes Heiderich, Centro Universitario FEI, Department of Electrical Engineering, Image Processing Laboratory, São Bernardo do Campo, SP, Brazil
- Rafael Nobre Orsi, Centro Universitario FEI, Department of Electrical Engineering, Image Processing Laboratory, São Bernardo do Campo, SP, Brazil
- Rita de Cássia Xavier Balda, Universidade Federal de São Paulo, Escola Paulista de Medicina, Department of Pediatrics, Division of Neonatal Medicine, São Paulo, SP, Brazil
- Carlos Eduardo Thomaz, Centro Universitario FEI, Department of Electrical Engineering, Image Processing Laboratory, São Bernardo do Campo, SP, Brazil
- Ruth Guinsburg, Universidade Federal de São Paulo, Escola Paulista de Medicina, Department of Pediatrics, Division of Neonatal Medicine, São Paulo, SP, Brazil
17
Vicente-Querol MA, Fernandez-Caballero A, Molina JP, Gonzalez-Gualda LM, Fernandez-Sotos P, Garcia AS. Facial Affect Recognition in Immersive Virtual Reality: Where Is the Participant Looking? Int J Neural Syst 2022; 32:2250029. [DOI: 10.1142/s0129065722500290]
18
Grahlow M, Rupp CI, Derntl B. The impact of face masks on emotion recognition performance and perception of threat. PLoS One 2022; 17:e0262840. [PMID: 35148327 PMCID: PMC8836371 DOI: 10.1371/journal.pone.0262840]
Abstract
Facial emotion recognition is crucial for social interaction. However, in times of a global pandemic, where wearing a face mask covering mouth and nose is widely encouraged to prevent the spread of disease, successful emotion recognition may be challenging. In the current study, we investigated whether emotion recognition, assessed by a validated emotion recognition task, is impaired for faces wearing a mask compared to uncovered faces, in a sample of 790 participants between 18 and 89 years (condition mask vs. original). In two more samples of 395 and 388 participants between 18 and 70 years, we assessed emotion recognition performance for faces that are occluded by something other than a mask, i.e., a bubble as well as only showing the upper part of the faces (condition half vs. bubble). Additionally, perception of threat for faces with and without occlusion was assessed. We found impaired emotion recognition for faces wearing a mask compared to faces without mask, for all emotions tested (anger, fear, happiness, sadness, disgust, neutral). Further, we observed that perception of threat was altered for faces wearing a mask. Upon comparison of the different types of occlusion, we found that, for most emotions and especially for disgust, there seems to be an effect that can be ascribed to the face mask specifically, both for emotion recognition performance and perception of threat. Methodological constraints as well as the importance of wearing a mask despite temporarily compromised social interaction are discussed.
Affiliation(s)
- Melina Grahlow, Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany; Graduate Training Centre of Neuroscience, University of Tübingen, Tübingen, Germany; Tübingen Center for Mental Health (TüCMH), Tübingen, Germany
- Claudia Ines Rupp, Department of Psychiatry, Psychotherapy and Psychosomatics, Medical University of Innsbruck, Innsbruck, Austria
- Birgit Derntl, Department of Psychiatry and Psychotherapy, University of Tübingen, Tübingen, Germany; Tübingen Center for Mental Health (TüCMH), Tübingen, Germany; Tübingen Neuro Campus, University of Tübingen, Tübingen, Germany; Lead Graduate School, University of Tübingen, Tübingen, Germany
19
Ramachandra V, Longacre H. Unmasking the psychology of recognizing emotions of people wearing masks: The role of empathizing, systemizing, and autistic traits. Personality and Individual Differences 2022; 185:111249. [DOI: 10.1016/j.paid.2021.111249]
20
Parada-Fernández P, Herrero-Fernández D, Jorge R, Comesaña P. Wearing mask hinders emotion recognition, but enhances perception of attractiveness. Personality and Individual Differences 2022; 184:111195. [DOI: 10.1016/j.paid.2021.111195]
21
Guo K, Hare A, Liu CH. Impact of Face Masks and Viewers' Anxiety on Ratings of First Impressions from Faces. Perception 2021; 51:37-50. [PMID: 34904869 PMCID: PMC8772253 DOI: 10.1177/03010066211065230]
Abstract
Face masks are now a common feature of our social environment. Although face covering reduces our ability to recognize others' facial identity and expressions, little is known about its impact on the formation of first impressions from faces. In two online experiments, we presented unfamiliar faces displaying neutral expressions with and without face masks, and participants rated the perceived approachableness, trustworthiness, attractiveness, and dominance of each face on a 9-point scale. Their anxiety levels were measured by the State-Trait Anxiety Inventory and the Social Interaction Anxiety Scale. In comparison with the mask-off condition, wearing face masks (mask-on) significantly increased perceived approachableness and trustworthiness ratings, but had little impact on increasing attractiveness or decreasing dominance ratings. Furthermore, both trait and state anxiety scores were negatively correlated with approachableness and trustworthiness ratings in both mask-off and mask-on conditions. Social anxiety scores, on the other hand, were negatively correlated with approachableness but not with trustworthiness ratings. It seems that the presence of a face mask can alter our first impressions of strangers. Although the ratings for approachableness, trustworthiness, attractiveness, and dominance were positively correlated, they appeared to be distinct constructs that were differentially influenced by face coverings and participants' anxiety types and levels.
Affiliation(s)
- Kun Guo, School of Psychology, University of Lincoln, UK
22
Duran N, Atkinson AP. Foveal processing of emotion-informative facial features. PLoS One 2021; 16:e0260814. [PMID: 34855898 PMCID: PMC8638924 DOI: 10.1371/journal.pone.0260814]
Abstract
Certain facial features provide useful information for the recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised, and disgusted expressions improved emotion recognition compared to foveating an eye, a cheek, or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combinations of emotions used. There was no consistent evidence that reflexive first saccades targeted emotion-relevant features; instead, they targeted the feature closest to the initial fixation. In a third experiment, angry, fearful, surprised, and disgusted expressions were presented for 5 seconds. The duration of task-related fixations in the eyes, brow, nose, and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth positively correlated with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features contributes to emotion recognition, but such features are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.
Affiliation(s)
- Nazire Duran, Department of Psychology, Durham University, Durham, United Kingdom
- Anthony P. Atkinson, Department of Psychology, Durham University, Durham, United Kingdom
23
Mills E, Guo K. Impact of Face Masks on Female Body Perception is Modulated by Facial Expressions. Perception 2021; 51:51-59. [PMID: 34821177 PMCID: PMC8771895 DOI: 10.1177/03010066211061092]
Abstract
People routinely wear face masks during the pandemic, but little is known about their impact on body perception. In this online study, we presented female body images of Caucasian avatars in common dress sizes displaying happy, angry, and neutral facial expressions with and without face masks, and asked women to rate the perceived body attractiveness and body size. In comparison with mask-off condition, mask-on decreased body attractiveness ratings for happy avatars but did not affect ratings for neutral avatars irrespective of avatar dress sizes. For avatars displaying angry expressions, mask-on increased body attractiveness ratings for slimmer avatars but did not affect ratings for larger avatars. On the other hand, body size estimation was not systematically affected by face masks and facial expressions. It appears that face masks mainly show an expression-dependent influence on body attractiveness judgement, possibly through suppressing the perceived facial expressions.
Affiliation(s)
- Eleanor Mills, School of Psychology, University of Lincoln, Lincoln, UK
- Kun Guo, School of Psychology, University of Lincoln, Lincoln, UK
24
Matt S, Dzhelyova M, Maillard L, Lighezzolo-Alnot J, Rossion B, Caharel S. The rapid and automatic categorization of facial expression changes in highly variable natural images. Cortex 2021; 144:168-184. [PMID: 34666300 DOI: 10.1016/j.cortex.2021.08.005]
Abstract
Emotional expressions are quickly and automatically read from human faces under natural viewing conditions. Yet, categorization of facial expressions is typically measured in experimental contexts with homogenous sets of face stimuli. Here we evaluated how the 6 basic facial emotions (Fear, Disgust, Happiness, Anger, Surprise or Sadness) can be rapidly and automatically categorized with faces varying in head orientation, lighting condition, identity, gender, age, ethnic origin and background context. High-density electroencephalography was recorded in 17 participants viewing 50 s sequences with natural variable images of neutral-expression faces alternating at a 6 Hz rate. Every five stimuli (1.2 Hz), variable natural images of one of the six basic expressions were presented. Despite the wide physical variability across images, a significant F/5 = 1.2 Hz response and its harmonics (e.g., 2F/5 = 2.4 Hz, etc.) was observed for all expression changes at the group-level and in every individual participant. Facial categorization responses were found mainly over occipito-temporal sites, with distinct hemispheric lateralization and cortical topographies according to the different expressions. Specifically, a stronger response was found to Sadness categorization, especially over the left hemisphere, as compared to Fear and Happiness, together with a right hemispheric dominance for categorization of Fearful faces. Importantly, these differences were specific to upright faces, ruling out the contribution of low-level visual cues. Overall, these observations point to robust rapid and automatic facial expression categorization processes in the human brain.
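The frequency-tagging logic of this paradigm (a 6 Hz base rate with an expression change every fifth image, tagged at 1.2 Hz) can be sketched with a toy signal; the sampling rate, amplitudes, and single-channel signal below are illustrative assumptions, not the study's EEG parameters:

```python
import math

fs, dur = 120, 50                 # sampling rate (Hz) and duration (s); illustrative
n = fs * dur
# Toy "EEG" containing a 6 Hz base response plus a smaller 1.2 Hz
# expression-change (oddball) response.
signal = [math.sin(2 * math.pi * 6.0 * t / fs)
          + 0.3 * math.sin(2 * math.pi * 1.2 * t / fs)
          for t in range(n)]

def amplitude_at(freq, x, fs):
    """Single-bin DFT amplitude; assumes freq falls exactly on a frequency bin."""
    n = len(x)
    k = round(freq * n / fs)
    re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
    im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
    return 2 * math.sqrt(re * re + im * im) / n

print(amplitude_at(1.2, signal, fs))  # oddball response, ≈ 0.3
print(amplitude_at(6.0, signal, fs))  # base response, ≈ 1.0
```

With a 50 s recording, both 1.2 Hz and 6 Hz fall exactly on DFT bins, which is why such paradigms use stimulation rates that divide evenly into the sequence length; harmonics (2.4 Hz, 3.6 Hz, ...) would be read off the same spectrum.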
Affiliation(s)
- Stéphanie Matt, Université de Lorraine, 2LPN, Nancy, France; Université de Lorraine, Laboratoire INTERPSY, Nancy, France
- Milena Dzhelyova, Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium
- Louis Maillard, Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Bruno Rossion, Université Catholique de Louvain, Institute of Research in Psychological Science, Louvain-la-Neuve, Belgium; Université de Lorraine, CNRS, CRAN, Nancy, France; Université de Lorraine, CHRU-Nancy, Service de Neurologie, Nancy, France
- Stéphanie Caharel, Université de Lorraine, 2LPN, Nancy, France; Institut Universitaire de France, Paris, France
25
Identification of pain in neonates: the adults' visual perception of neonatal facial features. J Perinatol 2021; 41:2304-2308. [PMID: 34253842 DOI: 10.1038/s41372-021-01143-1]
Abstract
OBJECTIVE To verify the visual attention of adults when assessing neonatal pain. STUDY DESIGN 143 adults (59% health professionals) evaluated 20 pictures (2 pictures of each of 10 neonates' faces: one at rest, one during a painful procedure). The Tobii TX300 tracked the participants' eye movements. For each picture, adults scored pain intensity (0 = no pain; 10 = maximum). Latent class analysis was applied using cognitive diagnosis models (GDINA) with two attributes (knowledge of pain presence/absence). Variables associated with belonging to the class of adults that correctly identified pictures of newborns with and without pain were identified by logistic regression. RESULTS To identify neonatal pain, adults looked at the mouth, eyes, and forehead in the facial pictures. The latent class analysis identified four classes of adults: those that identified both painful and painless neonates (YY-Class; n = 80); only painful neonates (n = 28); only painless neonates (n = 34); and neither (n = 1). Being a health professional (OR: 2.29; 95% CI: 1.16-4.51) and each look at the nasolabial furrow (2.07; 1.19-3.62) increased the chance of belonging to the YY-Class. CONCLUSIONS Being a health professional and visual fixation on the nasolabial furrow helped to identify the presence or absence of neonatal pain.
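The reported effects are odds ratios from the logistic regression (OR = exp(coefficient)). A brief sketch of that relationship, using the paper's ORs but an arbitrary, made-up intercept of -1.0:

```python
import math

def predicted_prob(intercept, coefs, x):
    """Logistic model: P(correct) = 1 / (1 + exp(-(intercept + sum(coef * x))))."""
    z = intercept + sum(c * xi for c, xi in zip(coefs, x))
    return 1 / (1 + math.exp(-z))

beta_health = math.log(2.29)  # coefficient implied by OR 2.29 (health professional)
beta_furrow = math.log(2.07)  # coefficient implied by OR 2.07 (per furrow fixation)
intercept = -1.0              # arbitrary illustrative value, not from the paper

# One extra fixation on the nasolabial furrow multiplies the odds by 2.07:
p0 = predicted_prob(intercept, [beta_health, beta_furrow], [1, 0])
p1 = predicted_prob(intercept, [beta_health, beta_furrow], [1, 1])
odds0, odds1 = p0 / (1 - p0), p1 / (1 - p1)
print(round(odds1 / odds0, 2))  # → 2.07
```

The odds ratio is constant across covariate values, while the change in predicted probability is not; that is why the abstract reports ORs rather than probability differences.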
26
Wiesmann M, Franz C, Sichtermann T, Minkenberg J, Mathern N, Stockero A, Iordanishvili E, Freiherr J, Hodson J, Habel U, Nikoubashman O. Seeing faces, when faces can't be seen: Wearing portrait photos has a positive effect on how patients perceive medical staff when face masks have to be worn. PLoS One 2021; 16:e0251445. [PMID: 34010319 PMCID: PMC8133480 DOI: 10.1371/journal.pone.0251445]
Abstract
Introduction: Since the onset of the coronavirus disease 2019 (COVID-19) pandemic, wearing surgical face masks has become mandatory for healthcare staff in many countries when interacting with patients. Recently, it has been shown that wearing face masks impairs social interaction by diminishing a person's ability to read the emotions of their counterparts, an essential prerequisite for responding adequately in social situations. It is easily conceivable that this may have a tangible negative influence on the communication and relationship between patients and healthcare personnel. We therefore investigated whether it affects how patients perceive healthcare professionals when physicians and nursing staff wear portrait photos of their smiling faces in addition to face masks. Methods: During the study period of 16 days, the medical staff of our department wore surgical face masks at all times during any kind of interaction with patients. In a pseudorandomized order, all members of our staff additionally affixed their portrait photos to their work clothes on 8 of the 16 days. After completion of their visit, 226 patients were interviewed anonymously in a cross-sectional study design using a questionnaire in which they rated the following three items: friendliness of staff, medical quality of treatment, and how well they felt taken care of during treatment in our department. Results: On days on which staff wore photos, mean questionnaire scores were significantly higher than on non-photo days (p = 0.013; mean ± standard deviation = 92.8 ± 11.3 vs. 91.0 ± 12.6; median (range) = 97 (98) vs. 96 (76)). When analyzed separately, the increase in scores was only significant for the item friendliness of staff (p = 0.009; mean ± standard deviation = 95.8 ± 6.3 vs. 92.2 ± 11.5; median (range) = 98 (39) vs. 97 (54)). Conclusion: Our study suggests that the use of portrait photos with smiling faces has a positive effect on how patients perceive healthcare staff.
Affiliation(s)
- Martin Wiesmann, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Christiane Franz, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Thorsten Sichtermann, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Jan Minkenberg, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Nathalie Mathern, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Andrea Stockero, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Elene Iordanishvili, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
- Jessica Freiherr, Department of Psychiatry and Psychotherapy, Friedrich Alexander University Erlangen, Erlangen, Germany; Sensory Analytics, Fraunhofer Institute for Process Engineering and Packaging IVV, Freising, Germany
- Julian Hodson, Faculty of Business and Economics, Leuphana University of Lueneburg, Lueneburg, Germany
- Ute Habel, Department of Psychiatry, Psychotherapy and Psychosomatics, University Hospital RWTH Aachen, Aachen, Germany
- Omid Nikoubashman, Department of Diagnostic and Interventional Neuroradiology, University Hospital RWTH Aachen, Aachen, Germany
27
Kinchella J, Guo K. Facial Expression Ambiguity and Face Image Quality Affect Differently on Expression Interpretation Bias. Perception 2021; 50:328-342. [PMID: 33709837 DOI: 10.1177/03010066211000270]
Abstract
We often show an invariant or comparable recognition performance for perceiving prototypical facial expressions, such as happiness and anger, under different viewing settings. However, it is unclear to what extent the categorisation of ambiguous expressions and associated interpretation bias are invariant in degraded viewing conditions. In this exploratory eye-tracking study, we systematically manipulated both facial expression ambiguity (via morphing happy and angry expressions in different proportions) and face image clarity/quality (via manipulating image resolution) to measure participants' expression categorisation performance, perceived expression intensity, and associated face-viewing gaze distribution. Our analysis revealed that increasing facial expression ambiguity and decreasing face image quality induced the opposite direction of expression interpretation bias (negativity vs. positivity bias, or increased anger vs. increased happiness categorisation), the same direction of deterioration impact on rating expression intensity, and qualitatively different influence on face-viewing gaze allocation (decreased gaze at eyes but increased gaze at mouth vs. stronger central fixation bias). These novel findings suggest that in comparison with prototypical facial expressions, our visual system has less perceptual tolerance in processing ambiguous expressions which are subject to viewing condition-dependent interpretation bias.
28
Cui S, Song S, Si J, Wu M, Feng J. The influence of mouth opening and closing degrees on processing in NimStim facial expressions: An ERP study from Chinese college students. Int J Psychophysiol 2021; 162:157-165. [PMID: 33548347 DOI: 10.1016/j.ijpsycho.2021.01.013]
Abstract
The degree of mouth opening and closing is one of the most important attributes of expression, reflecting the intensity of facial expression, and can help people recognize expressions more accurately. The NimStim set of facial expressions contains open- and closed-mouth expression pictures of the same actors. Although this expression set has been widely used, there is little research on its intensity effect. In this study, 32 Chinese college students were recruited to view the pictures passively in an ERP experiment, aiming to investigate the electrophysiological intensity effect of the open- and closed-mouth versions of angry, disgusted, sad, happy, and neutral expressions in the NimStim set. Our results showed that expression intensity affected the VPP at an early stage and mainly affected the LPP, with open-mouth expressions eliciting larger activity; no intensity effect was found for P1, N170, or EPN. Notably, culture and social environment may influence the intensity effect of different emotions. In the future, researchers should use methods that ensure subjects pay more attention to the intensity effect of the NimStim facial set.
Affiliation(s)
- Shuang Cui
- School of Psychology, Shandong Normal University, Jinan, China
- Sutao Song
- School of Information Science and Engineering, Shandong Normal University, Jinan, China; School of Education and Psychology, University of Jinan, Jinan, China
- Jiwei Si
- School of Psychology, Shandong Normal University, Jinan, China
- Meiyun Wu
- School of Education and Psychology, University of Jinan, Jinan, China; State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
- Jieyin Feng
- School of Education and Psychology, University of Jinan, Jinan, China
29
ERP evidence for emotional sensitivity in social anxiety. J Affect Disord 2021; 279:361-367. [PMID: 33099050 DOI: 10.1016/j.jad.2020.09.111]
Abstract
BACKGROUND Emotional sensitivity involves the ability to recognize and interpret facial expressions. This is very important for interpersonal communication. Previous studies found differences in emotional sensitivity between high social anxiety (HSA) individuals and low social anxiety (LSA) individuals. However, the underlying neural mechanisms are still unclear. The present study explored the effects of expression intensity and social anxiety on emotional sensitivity and their neural mechanisms. METHODS The HSA group (n = 20) and the LSA group (n = 20) were asked to recognize anger expressions with different intensities in an emotion recognition task. The hit rate, reaction time, early time window (P1, N170), and late time window (LPP) were recorded. RESULTS The results showed that individuals with HSA had a significantly higher hit rate and shorter reaction time than individuals with LSA (p < 0.01). Event-related potential (ERP) results showed that, compared to the LSA group, the HSA group exhibited significantly enhanced N170 and LPP amplitude (p < 0.01). However, the difference in P1 amplitude was not significant (p > 0.05). LIMITATIONS The participants in this study were a subclinical social anxiety sample, and the effects of other mood disorders were not excluded, partially limiting the generalizability of the results. CONCLUSIONS Our findings suggest that, compared to LSA individuals, HSA individuals are more sensitive to all presented faces. The ERP results indicated that HSA individuals' high sensitivity to threatening expressions is related to stronger structural encoding and fine processing.
30
Visual exploration of emotional body language: a behavioural and eye-tracking study. Psychol Res 2020; 85:2326-2339. [PMID: 32920675 DOI: 10.1007/s00426-020-01416-y]
Abstract
Bodily postures are essential to correctly comprehend others' emotions and intentions. Nonetheless, very few studies have focused on the pattern of eye movements implicated in the recognition of emotional body language (EBL), demonstrating significant differences in relation to different emotions. A yet unanswered question regards the presence of the "left-gaze bias" (i.e., the tendency to look first, make more fixations, and spend more looking time on the left side of centrally presented stimuli) while scanning bodies. Hence, the present study aims at exploring both the presence of a left-gaze bias and the modulation of EBL visual exploration mechanisms, by investigating the fixation patterns (number of fixations and latency of the first fixation) of participants while judging the emotional intensity of static bodily postures (Angry, Happy and Neutral, without head). While results on the latency of first fixations demonstrate for the first time the presence of the left-gaze bias while scanning bodies, suggesting that it could be related to the stronger expressiveness of the left hand (from the observer's point of view), results on the number of fixations only partially fulfil our hypothesis. Moreover, an opposite viewing pattern between Angry and Happy bodily postures is shown. In sum, the present results, by integrating the spatial and temporal dimensions of gaze exploration patterns, shed new light on EBL visual exploration mechanisms.
31
Schindler S, Bublatzky F. Attention and emotion: An integrative review of emotional face processing as a function of attention. Cortex 2020; 130:362-386. [DOI: 10.1016/j.cortex.2020.06.010]
32
Nestor MS, Fischer DL, Arnold D. "Masking" our emotions: Botulinum toxin, facial expression, and well-being in the age of COVID-19. J Cosmet Dermatol 2020; 19:2154-2160. [PMID: 32592268 PMCID: PMC7361553 DOI: 10.1111/jocd.13569]
Abstract
Background The globally devastating effects of COVID‐19 breach not only the realm of public health, but of psychosocial interaction and communication as well, particularly with the advent of mask‐wearing. Methods A review of the literature and understanding of facial anatomy and expressions as well as the effect of botulinum toxin on emotions and nonverbal communication. Results Today, the mask has become a semi‐permanent accessory to the face, blocking our ability to express and perceive each other’s facial expressions by dividing it into a visible top half and invisible bottom half. This significantly restricts our ability to accurately interpret emotions based on facial expressions and strengthens our perceptions of negative emotions produced by frowning. The addition of botulinum toxin (BTX)–induced facial muscle paralysis to target the muscles of the top (visible) half of the face, especially the corrugator and procerus muscles, may act as a therapeutic solution by its suppression of glabellar lines and our ability to frown. The treatment of the glabella complex not only has been shown to inhibit the negative emotions of the treated individual but also can reduce the negative emotions in those who come in contact with the treated individual. Conclusions Mask‐wearing in the wake of COVID‐19 brings new challenges to our ability to communicate and perceive emotion through full facial expression, our most effective and universally shared form of communication, and BTX may offer a positive solution to decrease negative emotions and promote well‐being for both the mask‐wearer and all who come in contact with that individual.
Affiliation(s)
- Mark S Nestor
- Center for Clinical and Cosmetic Research, Aventura, FL, USA; Department of Dermatology and Cutaneous Surgery, University of Miami, Miller School of Medicine, Miami, FL, USA; Department of Surgery, Division of Plastic Surgery, University of Miami, Miller School of Medicine, Miami, FL, USA
- David Arnold
- Center for Clinical and Cosmetic Research, Aventura, FL, USA
33
Guo K, Calver L, Soornack Y, Bourke P. Valence-dependent Disruption in Processing of Facial Expressions of Emotion in Early Visual Cortex—A Transcranial Magnetic Stimulation Study. J Cogn Neurosci 2020; 32:906-916. [DOI: 10.1162/jocn_a_01520]
Abstract
Our visual inputs are often entangled with affective meanings in natural vision, implying the existence of extensive interaction between visual and emotional processing. However, little is known about the neural mechanism underlying such interaction. This exploratory transcranial magnetic stimulation (TMS) study examined the possible involvement of the early visual cortex (EVC, Area V1/V2/V3) in perceiving facial expressions of different emotional valences. Across three experiments, single-pulse TMS was delivered at different time windows (50–150 msec) after a brief 10-msec onset of face images, and participants reported the visibility and perceived emotional valence of faces. Interestingly, earlier TMS at ∼90 msec only reduced the face visibility irrespective of displayed expressions, but later TMS at ∼120 msec selectively disrupted the recognition of negative facial expressions, indicating the involvement of EVC in the processing of negative expressions at a later time window, possibly beyond the initial processing of fed-forward facial structure information. The observed TMS effect was further modulated by individuals' anxiety level. TMS at ∼110–120 msec disrupted the recognition of anger significantly more for those scoring relatively low in trait anxiety than the high scorers, suggesting that cognitive bias influences the processing of facial expressions in EVC. Taken together, it seems that EVC is involved in structural encoding of (at least) negative facial emotional valence, such as fear and anger, possibly under modulation from higher cortical areas.
34
Bublatzky F, Kavcıoğlu F, Guerra P, Doll S, Junghöfer M. Contextual information resolves uncertainty about ambiguous facial emotions: Behavioral and magnetoencephalographic correlates. Neuroimage 2020; 215:116814. [PMID: 32276073 DOI: 10.1016/j.neuroimage.2020.116814]
Abstract
Environmental conditions bias our perception of other people's facial emotions. This becomes quite relevant in potentially threatening situations, when a fellow's facial expression might indicate potential danger. The present study tested the prediction that a threatening environment biases the recognition of facial emotions. To this end, low- and medium-expressive happy and fearful faces (morphed to 10%, 20%, 30%, or 40% emotional) were presented within a context of instructed threat-of-shock or safety. Self-reported data revealed that instructed threat led to a biased recognition of fearful, but not happy facial expressions. Magnetoencephalographic correlates revealed spatio-temporal clusters of neural network activity associated with emotion recognition and contextual threat/safety in early to mid-latency time intervals in the left parietal cortex, bilateral prefrontal cortex, and the left temporal pole regions. Early parietal activity revealed a double dissociation of face-context information as a function of the expressive level of facial emotions: When facial expressions were difficult to recognize (low-expressive), contextual threat enhanced fear processing and contextual safety enhanced processing of subtle happy faces. However, for rather easily recognizable faces (medium-expressive) the left hemisphere (parietal cortex, PFC, and temporal pole) showed enhanced activity to happy faces during contextual threat and fearful faces during safety. Thus, contextual settings reduce the salience threshold and boost early face processing of low-expressive congruent facial emotions, whereas face-context incongruity or mismatch effects drive neural activity of more easily recognizable facial emotions. These results elucidate how environmental settings help recognize facial emotions, and the brain mechanisms underlying the recognition of subtle nuances of fear.
Affiliation(s)
- Florian Bublatzky
- Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim/Heidelberg University, Germany
- Fatih Kavcıoğlu
- Chair of Biological Psychology, Clinical Psychology and Psychotherapy, University of Würzburg, Germany
- Pedro Guerra
- Department of Personality, University of Granada, Spain
- Sarah Doll
- Institute for Biomagnetism and Biosignalanalysis, University Hospital Münster, Münster, Germany
- Markus Junghöfer
- Institute for Biomagnetism and Biosignalanalysis, University Hospital Münster, Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, Germany
35
Maza A, Moliner B, Ferri J, Llorens R. Visual Behavior, Pupil Dilation, and Ability to Identify Emotions From Facial Expressions After Stroke. Front Neurol 2020; 10:1415. [PMID: 32116988 PMCID: PMC7016192 DOI: 10.3389/fneur.2019.01415]
Abstract
Social cognition is the innate human ability to interpret the emotional state of others from contextual verbal and non-verbal information, and to self-regulate accordingly. Facial expressions are one of the most relevant sources of non-verbal communication, and their interpretation has been extensively investigated in the literature, using both behavioral and physiological measures, such as those derived from visual activity and visual responses. The decoding of facial expressions of emotion is performed by conscious and unconscious cognitive processes that involve a complex brain network that can be damaged after cerebrovascular accidents. A diminished ability to identify facial expressions of emotion has been reported after stroke, which has traditionally been attributed to impaired emotional processing. While this can be true, an alteration in visual behavior after brain injury could also negatively contribute to this ability. This study investigated the accuracy, distribution of responses, visual behavior, and pupil dilation of individuals with stroke while identifying emotional facial expressions. Our results corroborated impaired performance after stroke and showed decreased attention to the eyes, evidenced by less time and fewer fixations in this area compared with healthy subjects, together with comparable pupil dilation. The differences in visual behavior reached statistical significance for some emotions when comparing individuals with stroke with impaired performance against healthy subjects, but not when individuals post-stroke with comparable performance were considered. The performance dependence of visual behavior, although not determinant, might indicate that altered visual behavior could be a negatively contributing factor for emotion recognition from facial expressions.
Affiliation(s)
- Anny Maza
- Neurorehabilitation and Brain Research Group, Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, Valencia, Spain
- Belén Moliner
- NEURORHB, Servicio de Neurorrehabilitación de Hospitales Vithas, Valencia, Spain
- Joan Ferri
- NEURORHB, Servicio de Neurorrehabilitación de Hospitales Vithas, Valencia, Spain
- Roberto Llorens
- Neurorehabilitation and Brain Research Group, Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, Valencia, Spain; NEURORHB, Servicio de Neurorrehabilitación de Hospitales Vithas, Valencia, Spain
36
Correia-Caeiro C, Guo K, Mills DS. Perception of dynamic facial expressions of emotion between dogs and humans. Anim Cogn 2020; 23:465-476. [PMID: 32052285 PMCID: PMC7181561 DOI: 10.1007/s10071-020-01348-5]
Abstract
Facial expressions are a core component of the emotional response of social mammals. In contrast to Darwin's original proposition, expressive facial cues of emotion appear to have evolved to be species-specific. Faces trigger an automatic perceptual process, and so inter-specific emotion perception is potentially a challenge, since observers should not try to “read” heterospecific facial expressions in the same way that they do conspecific ones. Using dynamic spontaneous facial expression stimuli, we report the first inter-species eye-tracking study on fully unrestrained participants, without pre-experiment training to maintain attention to stimuli, comparing how two different species living in the same ecological niche, humans and dogs, perceive each other’s facial expressions of emotion. Humans and dogs showed different gaze distributions when viewing the same facial expressions of either humans or dogs. Humans modulated their gaze depending on the area of interest (AOI) being examined, emotion, and species observed, but dogs modulated their gaze depending on AOI only. We also analysed whether the gaze distribution was random across AOIs in both species: in humans, eye movements were not correlated with the diagnostic facial movements occurring in the emotional expression, and in dogs, there was only a partial relationship. This suggests that the scanning of facial expressions is a relatively automatic process. Thus, to read other species’ facial emotions successfully, individuals must overcome these automatic perceptual processes and employ learning strategies to appreciate the inter-species emotional repertoire.
Affiliation(s)
- Catia Correia-Caeiro
- School of Psychology, University of Lincoln, Lincoln, UK; School of Life Sciences, University of Lincoln, Lincoln, UK
- Kun Guo
- School of Psychology, University of Lincoln, Lincoln, UK
- Daniel S Mills
- School of Life Sciences, University of Lincoln, Lincoln, UK
37
Hills PJ, Roberts AL, Boobyer C. Being observed detrimentally affects face perception. J Cogn Psychol 2019. [DOI: 10.1080/20445911.2019.1685528]
Affiliation(s)
- Peter J. Hills
- Department of Psychology, Bournemouth University, Poole, UK
- Aimee Lee Roberts
- Anglia Ruskin University, Chelmsford, UK
- Charlotte Boobyer
- Department of Psychology, Bournemouth University, Poole, UK
38
Guo K, Li Z, Yan Y, Li W. Viewing heterospecific facial expressions: an eye-tracking study of human and monkey viewers. Exp Brain Res 2019; 237:2045-2059. [PMID: 31165915 PMCID: PMC6647127 DOI: 10.1007/s00221-019-05574-3]
Abstract
Common facial expressions of emotion have distinctive patterns of facial muscle movements that are culturally similar among humans, and perceiving these expressions is associated with stereotypical gaze allocation at local facial regions that are characteristic for each expression, such as eyes in angry faces. It is, however, unclear to what extent this 'universality' view can be extended to the processing of heterospecific facial expressions, and how the 'social learning' process contributes to heterospecific expression perception. In this eye-tracking study, we examined face-viewing gaze allocation of human (including dog owners and non-dog owners) and monkey observers while exploring expressive human, chimpanzee, monkey and dog faces (positive, neutral and negative expressions in human and dog faces; neutral and negative expressions in chimpanzee and monkey faces). Human observers showed species- and experience-dependent expression categorization accuracy. Furthermore, both human and monkey observers demonstrated different face-viewing gaze distributions which were also species dependent. Specifically, humans predominantly attended to the eyes of human faces but the mouth of animal faces when judging facial expressions. Monkeys' gaze distributions in exploring human and monkey faces were qualitatively different from those in exploring chimpanzee and dog faces. Interestingly, the gaze behaviour of both human and monkey observers was further affected by their prior experience of the viewed species. It seems that facial expression processing is species dependent, and social learning may play a significant role in discriminating even rudimentary types of heterospecific expressions.
Affiliation(s)
- Kun Guo
- School of Psychology, University of Lincoln, Lincoln, LN6 7TS, UK
- Zhihan Li
- State Key Laboratory of Cognitive Neuroscience and Learning, and IDG, Beijing Normal University, Beijing, 100875, China
- Yin Yan
- State Key Laboratory of Cognitive Neuroscience and Learning, and IDG, Beijing Normal University, Beijing, 100875, China
- Wu Li
- State Key Laboratory of Cognitive Neuroscience and Learning, and IDG, Beijing Normal University, Beijing, 100875, China
39
Emotional expressions with minimal facial muscle actions. Report 1: Cues and targets. Curr Psychol 2019. [DOI: 10.1007/s12144-019-0151-5]
40
Guo K, Soornack Y, Settle R. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion. Vision Res 2018; 157:112-122. [PMID: 29496513 DOI: 10.1016/j.visres.2018.02.001]
Abstract
Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features.
Affiliation(s)
- Kun Guo
- School of Psychology, University of Lincoln, UK
41
Looking Behavior and Audiovisual Speech Understanding in Children With Normal Hearing and Children With Mild Bilateral or Unilateral Hearing Loss. Ear Hear 2017; 39:783-794. [PMID: 29252979 DOI: 10.1097/aud.0000000000000534]
Abstract
OBJECTIVES Visual information from talkers facilitates speech intelligibility for listeners when audibility is challenged by environmental noise and hearing loss. Less is known about how listeners actively process and attend to visual information from different talkers in complex multi-talker environments. This study tracked looking behavior in children with normal hearing (NH), mild bilateral hearing loss (MBHL), and unilateral hearing loss (UHL) in a complex multi-talker environment to examine the extent to which children look at talkers and whether looking patterns relate to performance on a speech-understanding task. It was hypothesized that performance would decrease as perceptual complexity increased and that children with hearing loss would perform more poorly than their peers with NH. Children with MBHL or UHL were expected to demonstrate greater attention to individual talkers during multi-talker exchanges, indicating that they were more likely to attempt to use visual information from talkers to assist in speech understanding in adverse acoustics. It also was of interest to examine whether MBHL, versus UHL, would differentially affect performance and looking behavior. DESIGN Eighteen children with NH, eight children with MBHL, and 10 children with UHL participated (8-12 years). They followed audiovisual instructions for placing objects on a mat under three conditions: a single talker providing instructions via a video monitor, four possible talkers alternately providing instructions on separate monitors in front of the listener, and the same four talkers providing both target and nontarget information. Multi-talker background noise was presented at a 5 dB signal-to-noise ratio during testing. An eye tracker monitored looking behavior while children performed the experimental task. RESULTS Behavioral task performance was higher for children with NH than for either group of children with hearing loss. There were no differences in performance between children with UHL and children with MBHL. Eye-tracker analysis revealed that children with NH looked more at the screens overall than did children with MBHL or UHL, though individual differences were greater in the groups with hearing loss. Listeners in all groups spent a small proportion of time looking at relevant screens as talkers spoke. Although looking was distributed across all screens, there was a bias toward the right side of the display. There was no relationship between overall looking behavior and performance on the task. CONCLUSIONS The present study examined the processing of audiovisual speech in the context of a naturalistic task. Results demonstrated that children distributed their looking to a variety of sources during the task, but that children with NH were more likely to look at screens than were those with MBHL/UHL. However, all groups looked at the relevant talkers as they were speaking only a small proportion of the time. Despite variability in looking behavior, listeners were able to follow the audiovisual instructions and children with NH demonstrated better performance than children with MBHL/UHL. These results suggest that performance on some challenging multi-talker audiovisual tasks is not dependent on visual fixation to relevant talkers for children with NH or with MBHL/UHL.
42
Mishra MV, Srinivasan N. Exogenous attention intensifies perceived emotion expressions. Neurosci Conscious 2017; 2017:nix022. [PMID: 30042853 PMCID: PMC6007186 DOI: 10.1093/nc/nix022]
Abstract
Spatial attention not only enhances early visual processing and improves performance but also alters the phenomenology of basic perceptual features. However, in spite of extensive research on attention altering appearance, it is still unknown whether attention also intensifies perceived facial emotional expressions. We investigated the effect of exogenous attention on two categories of emotions, one positive (happy) and one negative (sad), in separate sessions. Exogenous attention was manipulated using peripheral cues followed by two faces varying in emotional intensity that were presented on either side of fixation. Participants were asked to report the location of the emotional face displaying higher intensity of emotion. At a short cue-to-target interval [CTI, Experiment 1 (60 ms)], participants reported the cued emotional face as more intense in expression compared with the uncued face. However, at a longer CTI [Experiment 2 (500 ms)], this effect was absent. Results show that exogenous attention enhances the appearance of higher-level features, such as emotional intensity, irrespective of valence. Further, two experiments investigated the mediating role of facial contrast as a possible underlying mechanism for the observed effect. Although the results show that higher-contrast faces are judged as higher in emotional intensity, spatial attention effects seem to be dependent on task instructions. Possible mechanisms underlying the attentional effects on emotion intensity are discussed.
Affiliation(s)
- Maruti V Mishra
- Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, Uttar Pradesh, India
- Narayanan Srinivasan
- Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, Uttar Pradesh, India
43
Boldness psychopathic traits predict reduced gaze toward fearful eyes in men with a history of violence. Biol Psychol 2017; 128:29-38. [DOI: 10.1016/j.biopsycho.2017.07.003] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2016] [Revised: 06/28/2017] [Accepted: 07/08/2017] [Indexed: 11/21/2022]
44
Ardizzi M, Evangelista V, Ferroni F, Umiltà MA, Ravera R, Gallese V. Evidence for Anger Saliency during the Recognition of Chimeric Facial Expressions of Emotions in Underage Ebola Survivors. Front Psychol 2017; 8:1026. [PMID: 28690565 PMCID: PMC5482096 DOI: 10.3389/fpsyg.2017.01026] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/19/2017] [Accepted: 06/02/2017] [Indexed: 11/17/2022] Open
Abstract
One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, which normally allows the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus, unbalancing victims' recognition of negative emotions. Despite the plethora of studies on this topic, to date it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8, SE 0.46; 14 males) and a control group (mean age 14.55 years, SE 0.30; years of schooling 8.07, SE 0.30; 15 males) performed a forced-choice chimeric facial expression recognition task. The chimeric facial expressions were obtained by pairing upper and lower half-faces of two different negative emotions (selected from anger, fear, and sadness, for a total of six combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger: in this case, differently from controls, survivors recognized anger regardless of whether the facial expressive cues of this emotion appeared in the upper or lower position. The present results demonstrate that victims' performance in recognizing the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, they show that traumatic experiences deeply modify the perceptual analysis of phylogenetically old behavioral patterns such as the facial expressions of emotions.
Affiliation(s)
- Martina Ardizzi
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Ravera Children Rehabilitation Centre, Freetown, Sierra Leone
- Francesca Ferroni
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Maria A. Umiltà
- Department of Food and Drug Sciences, University of Parma, Parma, Italy
- Roberto Ravera
- Ravera Children Rehabilitation Centre, Freetown, Sierra Leone
- Department of Health Psychology, ASL 1 (Azienda Sanitaria Locale) Imperiese, Sanremo, Italy
- Vittorio Gallese
- Department of Medicine and Surgery, Unit of Neuroscience, University of Parma, Parma, Italy
- Institute of Philosophy, School of Advanced Study, University of London, London, United Kingdom
45
Wegrzyn M, Westphal S, Kissler J. In your face: the biased judgement of fear-anger expressions in violent offenders. BMC Psychol 2017; 5:16. [PMID: 28499409 PMCID: PMC5429544 DOI: 10.1186/s40359-017-0186-z] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2016] [Accepted: 04/24/2017] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Why is it that certain violent criminals repeatedly find themselves engaged in brawls? Many inmates report having felt provoked or threatened by their victims, which might be due to a tendency to ascribe malicious intentions when faced with ambiguous social signals, termed the hostile attribution bias. METHODS The present study presented morphed fear-anger faces to prison inmates with a history of violent crimes, inmates with a history of child sexual abuse, and matched controls from the general population. Participants performed a fear-anger decision task. Analyses compared both response frequencies and measures derived from psychophysical functions fitted to the data. In addition, a test to distinguish basic facial expressions and questionnaires for aggression, psychopathy, and personality disorders were administered. RESULTS Violent offenders present with a reliable hostile attribution bias, in that they rate ambiguous fear-anger expressions as more angry compared with both the control population and perpetrators of child sexual abuse. Psychometric functions show a lowered threshold to detect anger in violent offenders compared with the general population. This effect is especially pronounced for male faces, correlates with self-reported aggression, and presents in the absence of a general emotion recognition impairment. CONCLUSIONS The results indicate that a hostile attribution bias, related to individual levels of aggression and pronounced for male faces, might be one mechanism mediating physical violence.
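The psychometric-function analysis this abstract refers to can be illustrated with a minimal sketch: fit a cumulative Gaussian to the proportion of "anger" responses at each fear-to-anger morph level, and read the anger-detection threshold off as the 50% point. The response proportions and the grid-search fitting routine below are hypothetical placeholders for illustration; the study's actual fitting procedure is not specified here.

```python
import math

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function: P("anger") at morph level x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_psychometric(morph_levels, p_anger):
    """Least-squares grid search; returns (mu, sigma).
    mu is the 50% anger threshold (point of subjective equality)."""
    best_err, best_mu, best_sigma = float("inf"), None, None
    for mu in (m / 100.0 for m in range(0, 101)):        # candidate thresholds
        for sigma in (s / 100.0 for s in range(1, 51)):  # candidate slopes
            err = sum((cum_gauss(x, mu, sigma) - p) ** 2
                      for x, p in zip(morph_levels, p_anger))
            if err < best_err:
                best_err, best_mu, best_sigma = err, mu, sigma
    return best_mu, best_sigma

# Hypothetical proportions of "anger" responses at 7 fear-to-anger morph steps
levels    = [0.0, 1/6, 2/6, 3/6, 4/6, 5/6, 1.0]
controls  = [0.02, 0.05, 0.20, 0.50, 0.80, 0.95, 0.98]  # threshold near 0.5
offenders = [0.05, 0.15, 0.45, 0.75, 0.92, 0.97, 0.99]  # shifted leftward

mu_c, _ = fit_psychometric(levels, controls)
mu_o, _ = fit_psychometric(levels, offenders)
print(f"control threshold ~ {mu_c:.2f}, offender threshold ~ {mu_o:.2f}")
```

A smaller fitted threshold for the offender group corresponds to the "lowered threshold to detect anger" reported above: less morph intensity is needed before the face is judged angry.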
Affiliation(s)
- Martin Wegrzyn
- Department of Psychology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Sina Westphal
- Department of Psychology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany
- Johanna Kissler
- Department of Psychology, Bielefeld University, Postfach 10 01 31, 33501 Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
46
Wegrzyn M, Vogt M, Kireclioglu B, Schneider J, Kissler J. Mapping the emotional face. How individual face parts contribute to successful emotion recognition. PLoS One 2017; 12:e0177239. [PMID: 28493921 PMCID: PMC5426715 DOI: 10.1371/journal.pone.0177239] [Citation(s) in RCA: 140] [Impact Index Per Article: 20.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2016] [Accepted: 04/24/2017] [Indexed: 11/18/2022] Open
Abstract
Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and to assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing the importance of different face areas for each expression to be visualized. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (reliance on the mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion by delineating the mapping from facial features to psychological representation.
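The "continuous space" grouping described in this abstract can be sketched by ordering expressions on a single eyes-versus-mouth reliance index. The importance scores below are illustrative placeholders, not the study's measured values.

```python
# Hypothetical per-region importance scores (e.g., fraction of correct
# recognitions in which uncovering that region preceded the response).
importance = {
    "sadness":   {"eyes": 0.70, "mouth": 0.25},
    "fear":      {"eyes": 0.65, "mouth": 0.30},
    "anger":     {"eyes": 0.55, "mouth": 0.40},
    "surprise":  {"eyes": 0.45, "mouth": 0.50},
    "disgust":   {"eyes": 0.30, "mouth": 0.65},
    "happiness": {"eyes": 0.20, "mouth": 0.75},
}

def eye_mouth_index(scores):
    """Positive -> recognition relies more on the eyes; negative -> on the mouth."""
    return scores["eyes"] - scores["mouth"]

# Order expressions from most eyes-reliant to most mouth-reliant.
ordered = sorted(importance, key=lambda e: eye_mouth_index(importance[e]), reverse=True)
print(ordered)
```

With these invented numbers the ordering runs from sadness and fear (eyes) down to disgust and happiness (mouth), mirroring the continuum the abstract describes.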
Affiliation(s)
- Martin Wegrzyn
- Department of Psychology, Bielefeld University, Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Maria Vogt
- Department of Psychology, Bielefeld University, Bielefeld, Germany
- Julia Schneider
- Department of Psychology, Bielefeld University, Bielefeld, Germany
- Johanna Kissler
- Department of Psychology, Bielefeld University, Bielefeld, Germany
- Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
47
Abstract
Individuals vary in perceptual accuracy when categorising facial expressions, yet it is unclear how these individual differences in a non-clinical population relate to cognitive processing stages at facial information acquisition and interpretation. We tested 104 healthy adults in a facial expression categorisation task and correlated their categorisation accuracy with face-viewing gaze allocation and with personal traits assessed with the Autism Quotient, an anxiety inventory, and the Self-Monitoring Scale. Gaze allocation had a limited but emotion-specific impact on categorising expressions. Specifically, longer gazes at the eyes and nose regions were coupled with more accurate categorisation of disgust and sad expressions, respectively. Regarding trait measurements, a higher autistic score was coupled with better recognition of sad but worse recognition of anger expressions, and contributed to a categorisation bias towards sad expressions; a higher anxiety level was associated with greater categorisation accuracy across all expressions and with an increased tendency to gaze at the nose region. It seems that both anxiety and autistic-like traits are associated with individual variation in expression categorisation, but this association is not necessarily mediated by variation in gaze allocation at expression-specific local facial regions. The results suggest that both facial information acquisition and interpretation capabilities contribute to individual differences in expression categorisation within non-clinical populations.
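The trait-gaze associations reported in this abstract come down to bivariate correlations between per-participant measures. The sketch below implements a plain Pearson correlation in stdlib Python; the participant values are invented for illustration and do not come from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant data: proportion of dwell time on the nose
# region vs. trait-anxiety score (illustrative values only).
nose_dwell = [0.10, 0.15, 0.12, 0.22, 0.18, 0.25, 0.30, 0.28]
anxiety    = [20,   25,   22,   35,   30,   40,   45,   42]

print(f"r = {pearson_r(nose_dwell, anxiety):.2f}")
```

A strongly positive r in this toy example corresponds to the abstract's finding that higher anxiety went with a greater tendency to fixate the nose region; in the real analysis one would also test the coefficient for significance.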
Affiliation(s)
- Corinne Green
- School of Psychology, University of Lincoln, Lincoln, UK
- Kun Guo
- School of Psychology, University of Lincoln, Lincoln, UK
48
Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze. PLoS One 2016; 11:e0168307. [PMID: 27942030 PMCID: PMC5152920 DOI: 10.1371/journal.pone.0168307] [Citation(s) in RCA: 72] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2016] [Accepted: 11/30/2016] [Indexed: 11/19/2022] Open
Abstract
The identification of emotional expressions is vital for social interaction and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each at three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, and slowest and least accurate for fearful expressions. More intense expressions were also classified more accurately. Reaction time showed a different pattern, with the slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than to the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to most for fearful, angry, and disgusted expressions and least for surprise. These results extend previous findings by showing important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down.
49
Meaux E, Vuilleumier P. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks. Neuroimage 2016; 141:154-173. [DOI: 10.1016/j.neuroimage.2016.07.004] [Citation(s) in RCA: 34] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2015] [Revised: 06/26/2016] [Accepted: 07/02/2016] [Indexed: 11/27/2022] Open
50
Luke CJ, Pollux PMJ. Lateral presentation of faces alters overall viewing strategy. PeerJ 2016; 4:e2241. [PMID: 27547549 PMCID: PMC4958001 DOI: 10.7717/peerj.2241] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2016] [Accepted: 06/21/2016] [Indexed: 11/20/2022] Open
Abstract
Eye tracking has been used during face categorisation and identification tasks to identify perceptually salient facial features and to infer underlying cognitive processes. However, viewing patterns are influenced by a variety of gaze biases, drawing fixations to the centre of the screen and horizontally to the left side of face images (left-gaze bias). To investigate potential interactions between gaze biases uniquely associated with facial expression processing and those associated with screen location, face stimuli were presented in three possible screen positions: left, right, and centre. Comparisons of fixations between screen locations highlight a significant impact of the screen-centre bias, pulling fixations towards the centre of the screen and modifying the gaze biases generally observed during facial categorisation tasks. A left horizontal bias for fixations was found to be independent of screen position but to interact with the screen-centre bias, drawing fixations to the left hemi-face rather than simply to the left of the screen. Implications for eye-tracking studies utilising centrally presented faces are discussed.
Affiliation(s)
- Petra M J Pollux
- School of Psychology, University of Lincoln, Lincoln, United Kingdom