1.
Paredes V, Pino FJ, Díaz D. Does facial structure explain differences in student evaluations of teaching? The role of fWHR as a proxy for perceived dominance. Econ Hum Biol 2024; 54:101381. [PMID: 38642450] [DOI: 10.1016/j.ehb.2024.101381]
Abstract
Dominance is usually viewed as a positive male attribute, but this is typically not the case for women. Using a novel dataset of student evaluations of teaching at the business and economics school of a selective university, we construct the facial width-to-height ratio (fWHR) as a proxy for perceived dominance to assess whether individuals with a higher ratio obtain better evaluations. Our results show that a higher fWHR is associated with better evaluations for male faculty, while the opposite holds for female faculty. These results are not due to differences in teacher quality or beauty. In terms of magnitude, the effect of the fWHR is much larger for female professors. To the extent that fWHR is a good proxy for perceived dominance, it appears that conformity to traditional gender norms pays off for both men and women; however, the cost of challenging these norms is much larger for women than for men.
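The fWHR used above as a dominance proxy is conventionally computed as bizygomatic width divided by upper-face height (roughly, upper lip to brow). A minimal sketch of that arithmetic, with hypothetical landmark names and coordinates (not the authors' code or data):

```python
# Facial width-to-height ratio (fWHR): bizygomatic width divided by
# upper-face height. Landmark names and pixel values are illustrative.

def fwhr(left_zygion, right_zygion, upper_lip, brow):
    """Each argument is an (x, y) pixel coordinate on a frontal photo."""
    width = abs(right_zygion[0] - left_zygion[0])   # cheekbone to cheekbone
    height = abs(brow[1] - upper_lip[1])            # upper lip to brow line
    return width / height

# Made-up landmarks: width 140 px, height 75 px.
print(round(fwhr((40, 120), (180, 120), (110, 170), (110, 95)), 3))  # → 1.867
```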
Affiliation(s)
- Francisco J Pino
- Department of Economics, University of Chile, Chile; IZA, Germany
- David Díaz
- Department of Management, University of Chile, Chile
2.
González-Gualda LM, Vicente-Querol MA, García AS, Molina JP, Latorre JM, Fernández-Sotos P, Fernández-Caballero A. An exploratory study of the effect of age and gender on face scanning during affect recognition in immersive virtual reality. Sci Rep 2024; 14:5553. [PMID: 38448515] [PMCID: PMC10918108] [DOI: 10.1038/s41598-024-55774-3]
Abstract
A person with impaired emotion recognition cannot correctly identify the facial expressions of other individuals. The aim of the present study was to assess eye gaze and facial emotion recognition in a healthy population using dynamic avatars in immersive virtual reality (IVR). For the first time, viewing of each area of interest (AOI) of the face in IVR was studied by gender and age. This work in healthy people was conducted to assess the future usefulness of IVR in patients with deficits in the recognition of facial expressions. Seventy-four healthy volunteers participated in the study. The materials used were a laptop computer, a game controller, and a head-mounted display. Dynamic virtual faces randomly representing the six basic emotions plus a neutral expression served as stimuli. After the virtual human represented an emotion, a response panel was displayed with the seven possible options. Besides storing hits and misses, the software internally divided the faces into different AOIs and recorded how long participants looked at each one. Regarding overall response accuracy, hits decreased from the youngest group to the middle-aged and older adults. All three groups spent the highest percentage of time looking at the eyes, with younger adults showing the highest percentage; attention to the face relative to the background also decreased with age. Hits for women and men were remarkably similar, with no statistically significant differences between them. In general, men paid more attention to the eyes than women, whereas women paid more attention to the forehead and mouth. In contrast to previous work, our study indicates that there are no differences between men and women in facial emotion recognition. In line with previous work, the percentage of face-viewing time was higher for younger adults than for older adults; however, contrary to earlier studies, older adults looked more at the eyes than at the mouth. Consistent with other studies, the eyes were the AOI with the highest percentage of viewing time. For men, the eyes were the most-viewed AOI for all emotions, in both hits and misses. Women looked most at the eyes for all emotions except joy, fear, and anger on hits; on misses, they looked most at the eyes for all emotions except surprise and fear.
Affiliation(s)
- Luz M González-Gualda
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Miguel A Vicente-Querol
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Arturo S García
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José P Molina
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- José M Latorre
- Departamento de Psicología, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Patricia Fernández-Sotos
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
- Antonio Fernández-Caballero
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
3.
Monferrer M, García AS, Ricarte JJ, Montes MJ, Fernández-Caballero A, Fernández-Sotos P. Facial emotion recognition in patients with depression compared to healthy controls when using human avatars. Sci Rep 2023; 13:6007. [PMID: 37045889] [PMCID: PMC10097677] [DOI: 10.1038/s41598-023-31277-5]
Abstract
The negative, mood-congruent cognitive bias described in depression, together with excessive rumination, has been found to interfere with emotional processing. This study assesses facial recognition of emotions in patients with depression using a new set of dynamic virtual faces (DVFs). The sample consisted of 54 stable patients and 54 healthy controls. The experiment consisted of an emotion recognition task in non-immersive virtual reality (VR) using DVFs of the six basic emotions and a neutral expression. Patients with depression performed worse in facial affect recognition than healthy controls. Age of onset was negatively correlated with emotion recognition, whereas no correlation was observed for duration of illness or number of lifetime hospitalizations. In the depression group, emotion recognition was not correlated with degree of psychopathology, excessive rumination, degree of functioning, or quality of life. Hence, it is important to improve and validate VR tools for emotion recognition to achieve greater methodological homogeneity across studies and to establish more conclusive results.
Affiliation(s)
- Marta Monferrer
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Arturo S García
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- Jorge J Ricarte
- Departamento de Psicología, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- María J Montes
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- Antonio Fernández-Caballero
- Departamento de Sistemas Informáticos, Universidad de Castilla-La Mancha, 02071, Albacete, Spain
- Neurocognition and Emotion Unit, Instituto de Investigación en Informática de Albacete, 02071, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
- Patricia Fernández-Sotos
- Servicio de Salud de Castilla-La Mancha, Complejo Hospitalario Universitario de Albacete, Servicio de Salud Mental, 02004, Albacete, Spain
- CIBERSAM-ISCIII (Biomedical Research Networking Centre in Mental Health, Instituto de Salud Carlos III), 28016, Madrid, Spain
4.
Schmid I, Witkower Z, Götz FM, Stieger S. Registered report: Social face evaluation: ethnicity-specific differences in the judgement of trustworthiness of faces and facial parts. Sci Rep 2022; 12:18311. [PMID: 36316450] [PMCID: PMC9622746] [DOI: 10.1038/s41598-022-22709-9]
Abstract
Social face evaluation is a common and consequential element of everyday life based on the judgement of trustworthiness. However, the particular facial regions that guide such trustworthiness judgements are largely unknown. It is also unclear whether different facial regions are consistently utilized to guide judgements for different ethnic groups, and whether previous exposure to specific ethnicities in one's social environment influences trustworthiness judgements made from faces or facial regions. This registered report addressed these questions through a global online survey study that recruited Asian, Black, Latino, and White raters (N = 4580). Raters were shown full faces and specific parts of the face for an ethnically diverse, sex-balanced set of 32 targets and rated targets' trustworthiness. Multilevel modelling showed that in forming trustworthiness judgements, raters relied most strongly on the eyes (with no substantial information loss vis-à-vis full faces). Corroborating ingroup-outgroup effects, raters rated faces and facial parts of targets with whom they shared their ethnicity, sex, or eye color as significantly more trustworthy. Exposure to ethnic groups in raters' social environment predicted trustworthiness ratings of other ethnic groups in nuanced ways. That is, raters from the ambient ethnic majority provided slightly higher trustworthiness ratings for stimuli of their own ethnicity compared to minority ethnicities. In contrast, raters from an ambient ethnic minority (e.g., immigrants) provided substantially lower trustworthiness ratings for stimuli of the ethnic majority. Taken together, the current study provides a new window into the psychological processes underlying social face evaluation and its cultural generalizability. PROTOCOL REGISTRATION: The stage 1 protocol for this Registered Report was accepted in principle on 7 January 2022. The protocol, as accepted by the journal, can be found at: https://doi.org/10.6084/m9.figshare.18319244.
Affiliation(s)
- Irina Schmid
- Department of Psychology and Psychodynamics, Karl Landsteiner University of Health Sciences, Krems an der Donau, Austria
- Zachary Witkower
- Department of Psychology, University of Toronto, Toronto, Canada
- Friedrich M. Götz
- Department of Psychology, University of British Columbia, Vancouver, Canada
- Institute of Personality and Social Research, University of California, Berkeley, USA
- Stefan Stieger
- Department of Psychology and Psychodynamics, Karl Landsteiner University of Health Sciences, Krems an der Donau, Austria
5.
The cultural learning account of first impressions. Trends Cogn Sci 2022; 26:656-668. [PMID: 35697651] [DOI: 10.1016/j.tics.2022.05.007]
Abstract
Humans spontaneously attribute character traits to strangers based on their facial appearance. Although these 'first impressions' typically have no basis in reality, some authors have assumed that they have an innate origin. By contrast, the Trait Inference Mapping (TIM) account proposes that first impressions are products of culturally acquired associative mappings that allow activation to spread from representations of facial appearance to representations of trait profiles. According to TIM, cultural instruments, including propaganda, illustrated storybooks, art and iconography, ritual, film, and TV, expose many individuals within a community to common sources of correlated face-trait experience, yielding first impressions that are shared by many, but typically inaccurate. Here, we review emerging empirical findings, many of which accord with TIM, and argue that future work must distinguish first impressions based on invariant facial features (e.g., shape) from those based on facial behaviours (e.g., expressions).