1. Walsh E, Moreira C, Longo MR. Opposite size illusions for inverted faces and letters. Cognition 2024; 245:105733. PMID: 38281395. DOI: 10.1016/j.cognition.2024.105733.
Abstract
Words are the primary means by which we communicate meaning and ideas, while faces provide important social cues. Studying visual illusions involving faces and words can elucidate the hierarchical processing of information, as different regions of the brain are specialised for face recognition and word processing. A size illusion has previously been demonstrated for faces, whereby an inverted face is perceived as larger than the same stimulus upright. Here, two experiments replicate the face size illusion and investigate whether the illusion is also present for individual letters (Experiment 1), and for visual words and pseudowords (Experiment 2). Results confirm a robust size illusion for faces. Letters, words, pseudowords, and unfamiliar letters all show a reverse size illusion, as we previously demonstrated for human bodies. Overall, results indicate the illusion occurs in early perceptual stages upstream of semantic processing. Results are consistent with the idea of a general-purpose mechanism that encodes curvilinear shapes found in both scripts and our environment. Word and face perception rely on specialised, independent cognitive processes. The underestimation of the size of upright stimuli is specific to faces. Opposite size illusions may reflect differences in how size information is encoded and represented in stimulus-specialised neural networks, resulting in contrasting perceptual effects. Though words and faces differ visually, there is both symmetry and asymmetry in how the brain 'reads' them.
Affiliation(s)
- Eamonn Walsh
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King's College London, UK; Cultural and Social Neuroscience Research Group, Institute of Psychiatry, Psychology & Neuroscience, King's College London, UK.
- Carolina Moreira
- Department of Psychological Sciences, Birkbeck, University of London, UK
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, UK
2. Walsh E, Whitby J, Chen YY, Longo MR. No influence of emotional expression on size underestimation of upright faces. PLoS One 2024; 19:e0293920. PMID: 38300951. PMCID: PMC10833517. DOI: 10.1371/journal.pone.0293920.
Abstract
Faces are a primary means of conveying social information between humans. One important factor modulating the perception of human faces is emotional expression. Face inversion also affects perception, including judgments of emotional expression, possibly through the disruption of configural processing. One intriguing inversion effect is an illusion whereby faces appear to be physically smaller when upright than when inverted. This illusion appears to be highly selective for faces. In this study, we investigated whether the emotional expression of a face (neutral, happy, afraid, or angry) modulates the magnitude of this size illusion. Results showed that for all four expressions, there was a clear bias for inverted stimuli to be judged as larger than upright ones. This demonstrates that emotional expression has no influence on the size underestimation of upright faces, a surprising result given that recognition of different emotional expressions is known to be affected unevenly by inversion. Results are discussed in light of recent neuroimaging research that used population receptive field (pRF) mapping to investigate the neural mechanisms underlying face perception, which may provide an explanation for how an upright face appears smaller than an inverted one. Elucidation of this effect would lead to a greater understanding of how humans communicate.
Affiliation(s)
- Eamonn Walsh
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Cultural and Social Neuroscience Research Group, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Jack Whitby
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Yen-Ya Chen
- Department of Basic & Clinical Neuroscience, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom
- Matthew R. Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
3. Ambroziak KB, Bofill MA, Azañón E, Longo MR. Perceptual aftereffects of adiposity transfer from hands to whole bodies. Exp Brain Res 2023; 241:2371-2379. PMID: 37620437. DOI: 10.1007/s00221-023-06686-7.
Abstract
Adaptation aftereffects for features such as identity and gender have been shown to transfer between faces and bodies, and between faces and body parts (i.e., hands). However, no studies have investigated the transfer of adaptation aftereffects between whole bodies and body parts. The present study investigated whether visual adaptation aftereffects transfer between hands and whole bodies in the context of adiposity judgements (i.e., how thin or fat a body is). On each trial, participants had to decide whether the body they saw was thinner or fatter than average. Participants performed the task before and after exposure to a thin or fat hand. Consistent with body adaptation studies, after exposure to a slim hand participants judged subsequently presented bodies to be fatter than after adaptation to a fat hand. These results suggest that there may be links between visual representations of body adiposity for whole bodies and body parts.
Affiliation(s)
- Klaudia B Ambroziak
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK.
- Marina Araujo Bofill
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK
- Elena Azañón
- Institute of Psychology, Otto von Guericke University, Universitätsplatz 2, 39016, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Universitätsplatz 2, 39106, Magdeburg, Germany
- Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Brenneckestraße 6, 39118, Magdeburg, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London, WC1E 7HX, UK.
4. Emotion is perceived accurately from isolated body parts, especially hands. Cognition 2023; 230:105260. PMID: 36058103. DOI: 10.1016/j.cognition.2022.105260.
Abstract
Body posture and configuration provide important visual cues about the emotional states of other people. We know that bodily form is processed holistically; however, emotion recognition may depend on different mechanisms, and certain body parts, such as the hands, may be especially important for perceiving emotion. This study therefore compared participants' emotion recognition performance when shown images of full bodies or of isolated hands, arms, heads, and torsos. Across three experiments, emotion recognition accuracy was above chance for all body parts. While emotions were recognized most accurately from full bodies, recognition performance from the hands was more accurate than for other body parts. Representational similarity analysis further showed that the pattern of errors for the hands was related to that for full bodies. Performance was reduced when stimuli were inverted, showing a clear body inversion effect. The high performance for hands was not due only to the fact that there are two hands, as performance remained well above chance even when just one hand was shown. These results demonstrate that emotions can be decoded from body parts and that certain features, such as the hands, are more important to emotion perception than others.
Statement of relevance: Successful social interaction relies on accurately perceiving emotional information from others. Bodies provide an abundance of emotion cues; however, the way in which emotional bodies and body parts are perceived is unclear. We investigated this perceptual process by comparing emotion recognition for body parts with that for full bodies. Crucially, we found that while emotions were most accurately recognized from full bodies, emotions were also classified accurately when images of isolated hands, arms, heads, and torsos were seen. Of the body parts shown, emotion recognition from the hands was most accurate. Furthermore, shared patterns of emotion classification for hands and full bodies suggest that emotion recognition mechanisms are shared for full bodies and body parts. That the hands are key to emotion perception is important evidence in its own right. It could also be applied to interventions for individuals who find it difficult to read emotions from faces and bodies.
5. Zhang Y, Wang L, Jiang Y. My own face looks larger than yours: A self-induced illusory size perception. Cognition 2021; 212:104718. PMID: 33839543. DOI: 10.1016/j.cognition.2021.104718.
Abstract
Size perception of visual objects is highly context dependent. Here we report a novel perceptual size illusion: the self-face, a unique and distinctive self-referential stimulus, is perceived as enlarged. Using a size discrimination paradigm, we found that the self-face was perceived as significantly larger than another person's face of the same size. This size overestimation effect was not due to the familiarity of the self-face, since it could still be observed when the self-face was directly compared with a famous face. More crucially, this illusory effect extended to a novel cartoon face that was transiently associated with one's own face, and it could also exert further contextual influences on the visual size perception of other objects. These findings together highlight the role of self-awareness in visual size perception and point to a special mechanism of size perception tuned to self-referential information.
Affiliation(s)
- Ying Zhang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, 19A Yuquan Road, Beijing 100049, China; Chinese Institute for Brain Research, 26 Science Park Road, Beijing 102206, China
- Li Wang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, 19A Yuquan Road, Beijing 100049, China; Chinese Institute for Brain Research, 26 Science Park Road, Beijing 102206, China.
- Yi Jiang
- State Key Laboratory of Brain and Cognitive Science, CAS Center for Excellence in Brain Science and Intelligence Technology, Institute of Psychology, Chinese Academy of Sciences, 16 Lincui Road, Beijing 100101, China; Department of Psychology, University of Chinese Academy of Sciences, 19A Yuquan Road, Beijing 100049, China; Chinese Institute for Brain Research, 26 Science Park Road, Beijing 102206, China.