1
Myga KA, Azañón E, Ambroziak KB, Ferrè ER, Longo MR. Haptic experience of bodies alters body perception. Perception 2024; 53:716-729. [PMID: 39324272] [DOI: 10.1177/03010066241270627]
Abstract
Research on media's effects on body perception has mainly focused on the role of vision of extreme body types. However, haptics is a major part of the way children experience bodies. Playing with unrealistically thin dolls has been linked to the emergence of body image concerns, but the perceptual mechanisms remain unknown. We explored the effects of haptic experience of extreme body types on body perception using adaptation aftereffects. Blindfolded participants judged whether doll-like stimuli explored haptically were thinner or fatter than the average body, before and after adaptation to an underweight or overweight doll. In a second experiment, participants underwent a traditional visual adaptation paradigm with extreme bodies, using stimuli matched to those in Experiment 1. In both modalities, after adaptation to an underweight body, test bodies were judged as fatter; adaptation to an overweight body produced the opposite effect. For the first time, we show adiposity aftereffects in the haptic modality, analogous to those established in vision, using stimuli matched across the visual and haptic paradigms.
Affiliation(s)
- Kasia A Myga
- Department of Psychological Sciences, Birkbeck, University of London, London, UK; Department of Neurology, Otto-Von-Guericke University, Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Elena Azañón
- Department of Neurology, Otto-Von-Guericke University, Magdeburg, Germany; Department of Behavioral Neurology, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Magdeburg, Germany; Center for Intervention and Research on Adaptive and Maladaptive Brain Circuits Underlying Mental Health (C-I-R-C), Jena-Magdeburg-Halle, Germany
- Matthew R Longo
- Department of Psychological Sciences, Birkbeck, University of London, London, UK
2
Chennaz L, Mascle C, Baltenneck N, Baudouin JY, Picard D, Gentaz E, Valente D. Recognition of facial expressions of emotions in tactile drawings by blind children, children with low vision and sighted children. Acta Psychol (Amst) 2024; 247:104330. [PMID: 38852319] [DOI: 10.1016/j.actpsy.2024.104330]
Abstract
In the context of blindness, studies on the recognition of facial expressions of emotions by touch are essential for defining compensatory touch abilities and for creating adapted emotion tools. This study is the first to examine the effect of visual experience on the recognition of tactile drawings of facial expressions of emotions by children with different visual experiences. To this end, we compared recognition rates of tactile drawings of emotions between blind children, children with low vision and sighted children aged 6-12 years. Results revealed no effect of visual experience on recognition rates. However, an effect of emotions and an interaction between emotions and visual experience were found. Indeed, while all children had a low average recognition rate, the drawings of fear, anger and disgust were particularly poorly recognized. Moreover, sighted children were significantly better at recognizing the drawings of surprise and sadness than the blind children, who showed high recognition rates only for joy. The results of this study support the importance of developing emotion tools that can be understood by children with different visual experiences.
Affiliation(s)
- Lola Chennaz
- Laboratory of Sensory-motor Affective and Social Development (SMAS), Faculty of Psychology and Educational Sciences (FAPSE), University of Geneva, Switzerland.
- Carolane Mascle
- Inter-university Laboratory for Education and Communication Sciences (LISEC), University of Strasbourg, France.
- Nicolas Baltenneck
- Laboratory of Development, Individual, Process, Disability, Education (UR DIPHE), University Lumière Lyon 2, France.
- Jean-Yves Baudouin
- Laboratory of Development, Individual, Process, Disability, Education (UR DIPHE), University Lumière Lyon 2, France.
- Edouard Gentaz
- Laboratory of Sensory-motor Affective and Social Development (SMAS), Faculty of Psychology and Educational Sciences (FAPSE), University of Geneva, Switzerland; Swiss Center for Affective Sciences, University of Geneva, Switzerland.
- Dannyelle Valente
- Laboratory of Sensory-motor Affective and Social Development (SMAS), Faculty of Psychology and Educational Sciences (FAPSE), University of Geneva, Switzerland; Laboratory of Development, Individual, Process, Disability, Education (UR DIPHE), University Lumière Lyon 2, France; Swiss Center for Affective Sciences, University of Geneva, Switzerland.
3
Fahey S, Santana C, Kitada R, Zheng Z. Affective judgement of social touch on a hand associated with hand embodiment. Q J Exp Psychol (Hove) 2019; 72:2408-2422. [DOI: 10.1177/1747021819842785]
Abstract
Social touch constitutes a critical component of human interactions. A gentle tap on the hand, for instance, can sometimes create emotional bonding and reduce interpersonal distance in social interactions. Evidence of tactile empathy suggests that touch can be experienced through both physical sensation and observation, yet vicarious perception of observed touch on an object as a function of the object's conceptual representation (e.g., Is this object identified as mine? Does this object feel like part of me?) remains less explored. Here we examined the affective judgement of social touch when the illusory sense of ownership over a dummy hand was manipulated through the rubber-hand illusion. When the same social touch was performed on either the real or the dummy hand, we found a similar sense of perceived pleasantness between the felt and observed touch, but only when the dummy hand was embodied; when it was not, the perceived pleasantness of the observed touch was lower (an "embodiment effect"; Experiment 1). In addition, we found that the embodiment effect associated with the observed touch was insensitive to the way in which embodiment was manipulated (Experiment 2), and that this effect was specific to social but not neutral touch (Experiment 3). Taken together, our findings suggest a role of embodiment in the affective component of observed social touch and contribute to our understanding of tactile empathy for objects.
Affiliation(s)
- Samira Fahey
- Social Sciences Department, Lasell College, Newton, MA, USA
- Ryo Kitada
- School of Social Sciences, Nanyang Technological University, Singapore
- Zane Zheng
- Social Sciences Department, Lasell College, Newton, MA, USA
- RoseMary Fuss Center for Research on Aging, Lasell College, Newton, MA, USA
4
Brain networks require a network-conscious psychopathological approach. Behav Brain Sci 2019; 42:e20. [PMID: 30940218] [DOI: 10.1017/s0140525x18001115]
Abstract
In experimental psychology and neuroscience, technological advances and multisensory research have contributed to the gradual dismissal of one version of reductionism. Empirical results no longer support a brain model in which distinct "modules" perform discrete functions, but rather a brain of partially overlapping networks. A similarly changed brain model is extending to psychopathology and clinical psychology, and partly accounts for the problems of reductionism.
5
Abstract
Facial expressions of emotion are nonverbal behaviors that allow us to interact efficiently in social life and respond to events affecting our welfare. This article reviews 21 studies, published between 1932 and 2015, examining the production of facial expressions of emotion by blind people. It particularly discusses the impact of visual experience on the development of this behavior from birth to adulthood. After a discussion of three methodological considerations, the review of studies reveals that blind subjects demonstrate differing capacities for producing spontaneous expressions and voluntarily posed expressions. Seventeen studies provided evidence that blind and sighted individuals spontaneously produce the same pattern of facial expressions, even if some variations can be found, reflecting facial and body movements specific to blindness or differences in intensity and control of emotions in some specific contexts. This suggests that lack of visual experience seems not to have a major impact when this behavior is generated spontaneously in real emotional contexts. In contrast, eight studies examining voluntary expressions indicate that blind individuals have difficulty posing emotional expressions. The opportunity for prior visual observation seems to affect performance in this case. Finally, we discuss three new directions for research to provide additional strong evidence for the debate regarding the innate or the culture-constant learning character of the production of emotional facial expressions by blind individuals: the link between perception and production of facial expressions, the impact of display rules in the absence of vision, and the role of other channels in expression of emotions in the context of blindness.
6
Asai T. Know thy agency in predictive coding: Meta-monitoring over forward modeling. Conscious Cogn 2017; 51:82-99. [PMID: 28327348] [DOI: 10.1016/j.concog.2017.03.001]
Abstract
Though the computation of agency is thought to be based on prediction error, it is important for us to grasp the reliability of that detected error. The current study shows that we have a meta-monitoring ability over our own forward model, whereby the accuracy of motor prediction, and therefore of the felt agency, is implicitly evaluated. Healthy participants (N = 105) performed a simple motor control task and received SELF or OTHER visual feedback. The relationship between accuracy and confidence was examined in a mismatch detection task and in a self-other attribution task. The results suggest an accuracy-confidence correlation in both tasks, indicating a meta-monitoring ability over such decisions. Furthermore, a statistically identified group with low accuracy and low confidence was characterized by higher schizotypy. Finally, what we can know about our own forward model, and how we can know it, is discussed.
Affiliation(s)
- Tomohisa Asai
- NTT Communication Science Laboratories, Human Information Science Laboratory, Kanagawa, Japan.
7
Stanley JT, Isaacowitz DM. Caring more and knowing more reduces age-related differences in emotion perception. Psychol Aging 2016; 30:383-395. [PMID: 26030775] [DOI: 10.1037/pag0000028]
Abstract
Traditional emotion perception tasks show that older adults are less accurate than are young adults at recognizing facial expressions of emotion. Recently, we proposed that socioemotional factors might explain why older adults seem impaired in lab tasks but less so in everyday life (Isaacowitz & Stanley, 2011). Thus, in the present research we empirically tested whether socioemotional factors such as motivation and familiarity can alter this pattern of age effects. In 1 task, accountability instructions eliminated age differences in the traditional emotion perception task. Using a novel emotion perception paradigm featuring spontaneous dynamic facial expressions of a familiar romantic partner versus a same-age stranger, we found that age differences in emotion perception accuracy were attenuated in the familiar partner condition, relative to the stranger condition. Taken together, the results suggest that both overall accuracy as well as specific patterns of age effects differ appreciably between traditional emotion perception tasks and emotion perception within a socioemotional context.
8
Abstract
The idea that faces are represented within a structured face space (Valentine, Quarterly Journal of Experimental Psychology, 43, 161-204, 1991) has gained considerable experimental support from both physiological and perceptual studies. Recent work has also shown that faces can even be recognized haptically, that is, from touch alone. Although some evidence favors congruent processing strategies in the visual and haptic processing of faces, the question of how similar the two modalities are in terms of face processing remains open. Here, this question was addressed by asking whether there is evidence for a haptic face space, and if so, how it compares to visual face space. For this, a physical face space was created, consisting of six laser-scanned individual faces, their morphed average, 50%-morphs between two individual faces, as well as 50%-morphs of the individual faces with the average, resulting in a set of 19 faces. Participants then rated either the visual or haptic pairwise similarity of the tangible 3-D face shapes. Multidimensional scaling analyses showed that both modalities extracted perceptual spaces that conformed to critical predictions of the face space framework, hence providing support for similar processing of complex face shapes in haptics and vision. Despite the overall similarities, however, systematic differences also emerged between the visual and haptic data. These differences are discussed in the context of face processing and complex-shape processing in vision and haptics.
9
Kawamichi H, Kitada R, Yoshihara K, Takahashi HK, Sadato N. Interpersonal touch suppresses visual processing of aversive stimuli. Front Hum Neurosci 2015; 9:164. [PMID: 25904856] [PMCID: PMC4389358] [DOI: 10.3389/fnhum.2015.00164]
Abstract
Social contact is essential for survival in human society. A previous study demonstrated that interpersonal contact alleviates pain-related distress by suppressing the activity of its underlying neural network. One explanation for this is that attention is shifted from the cause of distress to interpersonal contact. To test this hypothesis, we conducted a functional MRI (fMRI) study wherein eight pairs of close female friends rated the aversiveness of aversive and non-aversive visual stimuli under two conditions: joining hands either with a rubber model (rubber-hand condition) or with a close friend (human-hand condition). Subsequently, participants rated the overall comfortableness of each condition. Ratings collected after fMRI indicated that participants experienced greater comfortableness during the human-hand compared to the rubber-hand condition, whereas aversiveness ratings during fMRI were comparable across conditions. The fMRI results showed that the two conditions commonly produced aversive-related activation in both sides of the visual cortex (including V1, V2, and V5). An interaction between aversiveness and hand type showed rubber-hand-specific activation for (aversive > non-aversive) in other visual areas (including V1, V2, V3, and V4v). The effect of interpersonal contact on the processing of aversive stimuli was negatively correlated with the increment of attentional focus on aversiveness measured by a pain-catastrophizing scale. These results suggest that interpersonal touch suppresses the processing of aversive visual stimuli in the occipital cortex. This effect covaried with insensitivity to aversiveness, such that aversiveness-insensitive individuals might require less attentional capture by aversive-stimulus processing. As joining hands did not influence the subjective ratings of aversiveness, interpersonal touch may operate by redirecting excessive attention away from the aversive characteristics of the stimuli.
Affiliation(s)
- Hiroaki Kawamichi
- Division of Cerebral Integration, Department of Cerebral Research, National Institute for Physiological Sciences, Okazaki, Japan; Graduate School of Human Health Sciences, Tokyo Metropolitan University, Tokyo, Japan; School of Medicine, Faculty of Medicine, Gunma University, Maebashi, Japan
- Ryo Kitada
- Division of Cerebral Integration, Department of Cerebral Research, National Institute for Physiological Sciences, Okazaki, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Hayama, Japan
- Kazufumi Yoshihara
- Department of Psychosomatic Medicine, Graduate School of Medical Sciences, Kyushu University, Fukuoka, Japan
- Haruka K Takahashi
- Division of Cerebral Integration, Department of Cerebral Research, National Institute for Physiological Sciences, Okazaki, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Hayama, Japan
- Norihiro Sadato
- Division of Cerebral Integration, Department of Cerebral Research, National Institute for Physiological Sciences, Okazaki, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Hayama, Japan
10
Matsumiya K. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face. Psychol Sci 2013; 24:2088-98. [PMID: 24002886] [DOI: 10.1177/0956797613486981]
Abstract
Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imaging a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on shared representation underlying cross-modal interactions.
11
Kitada R, Okamoto Y, Sasaki AT, Kochiyama T, Miyahara M, Lederman SJ, Sadato N. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind. Front Hum Neurosci 2013; 7:7. [PMID: 23372547] [PMCID: PMC3556569] [DOI: 10.3389/fnhum.2013.00007]
Abstract
Face perception is critical for social communication. Given its fundamental importance over the course of evolution, innate neural mechanisms may anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience.
Affiliation(s)
- Ryo Kitada
- Department of Physiological Sciences, The Graduate University for Advanced Studies (Sokendai), Okazaki, Japan
- Division of Cerebral Integration, National Institute for Physiological Sciences, Okazaki, Japan
- Yuko Okamoto
- Department of Physiological Sciences, The Graduate University for Advanced Studies (Sokendai), Okazaki, Japan
- Division of Cerebral Integration, National Institute for Physiological Sciences, Okazaki, Japan
- Akihiro T. Sasaki
- Division of Cerebral Integration, National Institute for Physiological Sciences, Okazaki, Japan
- Takanori Kochiyama
- The Hakubi Project, Primate Research Institute, Kyoto University, Kyoto, Japan
- Motohide Miyahara
- School of Physical Education, University of Otago, Dunedin, New Zealand
- Norihiro Sadato
- Department of Physiological Sciences, The Graduate University for Advanced Studies (Sokendai), Okazaki, Japan
- Division of Cerebral Integration, National Institute for Physiological Sciences, Okazaki, Japan
- Biomedical Imaging Research Center, University of Fukui, Eiheiji, Japan
12
Krumhuber EG, Kappas A, Manstead ASR. Effects of dynamic aspects of facial expressions: a review. Emot Rev 2013. [DOI: 10.1177/1754073912451349]
Abstract
A key feature of facial behavior is its dynamic quality. However, most previous research has been limited to the use of static images of prototypical expressive patterns. This article explores the role of facial dynamics in the perception of emotions, reviewing relevant empirical evidence demonstrating that dynamic information improves coherence in the identification of affect (particularly for degraded and subtle stimuli), leads to higher emotion judgments (i.e., intensity and arousal), and helps to differentiate between genuine and fake expressions. The findings underline that using static expressions not only poses problems of ecological validity, but also limits our understanding of what facial activity does. Implications for future research on facial activity, particularly for social neuroscience and affective computing, are discussed.
Affiliation(s)
- Arvid Kappas
- School of Humanities and Social Sciences, Jacobs University Bremen, Germany
13
Abstract
Face aftereffects (FAEs) are generally thought of as being a visual phenomenon. However, recent studies have shown that people can haptically recognize a face. Here, I report a haptic, rather than visual, FAE. By using three-dimensional facemasks, I found that haptic exploration of the facial expression of the facemask causes a subsequently touched neutral facemask to be perceived as having the opposite facial expression. The results thus suggest that FAEs can also occur in haptic perception of faces.
Affiliation(s)
- Kazumichi Matsumiya
- Research Institute of Electrical Communication, Tohoku University, 2-1-1, Katahira, Aoba-ku, Sendai 980-8577, Japan
14
Fernandes AM, Albuquerque PB. Tactual perception: a review of experimental variables and procedures. Cogn Process 2012; 13:285-301. [PMID: 22669262] [DOI: 10.1007/s10339-012-0443-2]
Abstract
This paper reviews the literature on tactual perception. Throughout this review, we will highlight some of the most relevant aspects in the touch literature: type of stimuli; type of participants; type of tactile exploration; and finally, the interaction between touch and other senses. Regarding type of stimuli, we will analyse studies with abstract stimuli such as vibrations, with two- and three-dimensional stimuli, and also concrete stimuli, considering the relation between familiar and unfamiliar stimuli and the haptic perception of faces. Under the "type of participants" topic, we separated studies with blind participants, studies with children and adults, and also performed an overview of sex differences in performance. The type of tactile exploration is explored considering conditions of active and passive touch, the relevance of movement in touch and the relation between haptic exploration and time. Finally, interactions between touch and vision, touch and smell and touch and taste are explored in the last topic. The review ends with an overall conclusion on the state of the art for the tactual perception literature. With this work, we intend to present an organised overview of the main variables in touch experiments, compiling aspects reported in the tactual literature, and providing both a summary of previous findings and a guide to the design of future work on tactual perception and memory.
15
Klatzky RL, Lederman SJ. Haptic object perception: spatial dimensionality and relation to vision. Philos Trans R Soc Lond B Biol Sci 2012; 366:3097-105. [PMID: 21969691] [DOI: 10.1098/rstb.2011.0153]
Abstract
Enabled by the remarkable dexterity of the human hand, specialized haptic exploration is a hallmark of object perception by touch. Haptic exploration normally takes place in a spatial world that is three-dimensional; nevertheless, stimuli of reduced spatial dimensionality are also used to display spatial information. This paper examines the consequences of full (three-dimensional) versus reduced (two-dimensional) spatial dimensionality for object processing by touch, particularly in comparison with vision. We begin with perceptual recognition of common human-made artefacts, then extend our discussion of spatial dimensionality in touch and vision to include faces, drawing from research on haptic recognition of facial identity and emotional expressions. Faces have often been characterized as constituting a specialized input for human perception. We find that contrary to vision, haptic processing of common objects is impaired by reduced spatial dimensionality, whereas haptic face processing is not. We interpret these results in terms of fundamental differences in object perception across the modalities, particularly the special role of manual exploration in extracting a three-dimensional structure.
Affiliation(s)
- Roberta L Klatzky
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213, USA.
16
Marshall AD, Sippel LM, Belleau EL. Negatively biased emotion perception in depression as a contributing factor to psychological aggression perpetration: a preliminary study. J Psychol 2012; 145:521-35. [PMID: 22208133] [DOI: 10.1080/00223980.2011.599822]
Abstract
Based on research linking depressive symptoms and intimate partner aggression perpetration with negatively biased perception of social stimuli, the present authors examined biased perception of emotional expressions as a mechanism in the frequently observed relationship between depression and psychological aggression perpetration. In all, 30 university students made valence ratings (negative to positive) of emotional facial expressions and completed measures of depressive symptoms and psychological aggression perpetration. As expected, depressive symptoms were positively associated with psychological aggression perpetration in an individual's current relationship, and this relationship was mediated by ratings of negative emotional expressions. These findings suggest that negatively biased perception of emotional expressions within the context of elevated depressive symptoms may represent an early stage of information processing that leads to aggressive relationship behaviors.
Affiliation(s)
- Amy D Marshall
- Department of Psychology, 415 Moore Building, The Pennsylvania State University, University Park, PA 16802, USA.
17
Chiller-Glaus SD, Schwaninger A, Hofer F, Kleiner M, Knappmeyer B. Recognition of emotion in moving and static composite faces. Swiss J Psychol 2011. [DOI: 10.1024/1421-0185/a000061]
Abstract
This paper investigates whether the greater accuracy of emotion identification for dynamic versus static expressions, as noted in previous research, can be explained through heightened levels of either component or configural processing. Using a paradigm by Young, Hellawell, and Hay (1987), we tested recognition performance of aligned and misaligned composite faces with six basic emotions (happiness, fear, disgust, surprise, anger, sadness). Stimuli were created using 3D computer graphics and were shown as static peak expressions (static condition) and 7 s video sequences (dynamic condition). The results revealed that, overall, moving stimuli were better recognized than static faces, although no interaction between motion and other factors was found. For happiness, sadness, and surprise, misaligned composites were better recognized than aligned composites, suggesting that aligned composites fuse to form a single expression, while the two halves of misaligned composites are perceived as two separate emotions. For anger, disgust, and fear, this was not the case. These results indicate that emotions are perceived on the basis of both configural and component-based information, with specific activation patterns for separate emotions, and that motion has a quality of its own and does not increase configural or component-based recognition separately.
Affiliation(s)
- Sarah Dagmar Chiller-Glaus
- Department of Psychology, University of Zurich, Switzerland
- Swiss University of Distance Education, Brig, Switzerland
- Adrian Schwaninger
- School of Applied Psychology, Institute Humans in Complex Systems, University of Applied Sciences Northwestern Switzerland, Olten, Switzerland, and Department of Informatics, University of Zurich, Switzerland
- Mario Kleiner
- Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Barbara Knappmeyer
- Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Center for Neural Science, New York University, USA
18
Abstract
One approach to gauging the complexity of the computational problem underlying haptic perception is to determine the number of dimensions needed to describe it. In vision, this number can be estimated to be seven. This observation raises the question of how many dimensions are needed to describe touch. In general the number is infinite; it can be estimated only for certain simplified representations of mechanical interactions. Organisms must therefore be sensitive to considerably reduced subsets of all possible measurements. These reductions are discussed by considering the sensing apparatuses of some animals and the underlying mechanisms of two haptic illusions.
Affiliation(s)
- Vincent Hayward
- UPMC Université Paris 06, Institut des Systèmes Intelligents et de Robotique, 4 Place Jussieu, 75005 Paris, France.
19
Isaacowitz DM, Stanley JT. Bringing an Ecological Perspective to the Study of Aging and Recognition of Emotional Facial Expressions: Past, Current, and Future Methods. Journal of Nonverbal Behavior 2011; 35:261-278. [PMID: 22125354] [DOI: 10.1007/s10919-011-0113-6]
Abstract
Older adults perform worse on traditional tests of emotion recognition accuracy than do young adults. In this paper, we review descriptive research to date on age differences in emotion recognition from facial expressions, as well as the primary theoretical frameworks that have been offered to explain these patterns. We propose that this is an area of inquiry that would benefit from an ecological approach in which contextual elements are more explicitly considered and reflected in experimental methods. Use of dynamic displays and examination of specific cues to accuracy, for example, may reveal more nuanced age-related patterns and may suggest heretofore unexplored underlying mechanisms.
20
Irrelevant visual faces influence haptic identification of facial expressions of emotion. Atten Percept Psychophys 2010; 73:521-30. [DOI: 10.3758/s13414-010-0038-x]
21
Kitada R, Dijkerman HC, Soo G, Lederman SJ. Representing human hands haptically or visually from first-person versus third-person perspectives. Perception 2010; 39:236-54. [PMID: 20402245] [DOI: 10.1068/p6535]
Abstract
Humans can recognise human body parts haptically as well as visually. We employed a mental-rotation task to determine whether participants could adopt a third-person perspective when judging the laterality of life-like human hands. Female participants adopted either a first-person or a third-person perspective using vision (experiment 1) or haptics (experiment 2), with hands presented at various orientations within a horizontal plane. In the first-person perspective task, most participants responded more slowly as hand orientation increasingly deviated from the participant's upright orientation, regardless of modality. In the visual third-person perspective task, most participants responded more slowly as hand orientation increasingly deviated from the experimenter's upright orientation; in contrast, less than half of the participants produced this same inverted U-shaped response-time function haptically. In experiment 3, participants were explicitly instructed to adopt a third-person perspective haptically by mentally rotating the rubber hand to the experimenter's upright orientation. Most participants produced an inverted U-shaped function. Collectively, these results suggest that humans can accurately assume a third-person perspective when hands are explored haptically or visually. With less explicit instructions, however, the canonical orientation for hand representation may be more strongly influenced haptically than visually by body-based heuristics, and less easily modified by perspective instructions.
Affiliation(s)
- Ryo Kitada
- Division of Cerebral Integration, National Institute for Physiological Sciences, Okazaki, 444-8585, Japan.
22
Tan HZ, Reed CM, Durlach NI. Optimum Information Transfer Rates for Communication through Haptic and Other Sensory Modalities. IEEE Transactions on Haptics 2010; 3:98-108. [PMID: 27788117] [DOI: 10.1109/toh.2009.46]
Abstract
This paper is concerned with investigating the factors that contribute to optimizing information transfer (IT) rate in humans. With an increasing interest in designing complex haptic signals for a wide variety of applications, there is a need for a better understanding of how information can be displayed in an optimal way. Based on the results of several early studies from the 1950s, a general "rule of thumb" has arisen in the literature which suggests that IT rate is dependent primarily on the stimulus delivery rate and is optimized for presentation rates of 2-3 items/s. Thus, the key to maximizing IT rate is to maximize the information in the stimulus set. Recent data obtained with multidimensional tactual signals, however, appear to contradict these conclusions. In particular, these current results suggest that optimal delivery rate varies with stimulus information to yield a constant peak IT rate that depends on the degree of familiarity and training with a particular stimulus set. We discuss factors that may be responsible for the discrepancies in results across studies including procedural differences, training issues, and stimulus-response compatibility. These factors should be taken into account when designing haptic signals to yield optimal IT rates for communication devices.
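The IT-rate relationship the abstract describes can be made concrete: IT rate (bits/s) is the delivery rate (items/s) multiplied by the information transferred per item, where the latter is conventionally estimated as the mutual information of the stimulus-response confusion matrix. The sketch below is not the authors' code; it is a minimal illustration of that standard estimate, with function names of our own choosing.

```python
import math

def static_it_bits(confusions):
    """Estimate information transfer per item (bits) from a
    stimulus-response confusion matrix of trial counts, using the
    standard mutual-information formula."""
    total = sum(sum(row) for row in confusions)
    row_sums = [sum(row) for row in confusions]
    n_cols = len(confusions[0])
    col_sums = [sum(row[j] for row in confusions) for j in range(n_cols)]
    it = 0.0
    for i, row in enumerate(confusions):
        for j, nij in enumerate(row):
            if nij:  # skip empty cells (0 * log 0 -> 0)
                it += (nij / total) * math.log2(
                    nij * total / (row_sums[i] * col_sums[j]))
    return it

def it_rate(confusions, items_per_second):
    """IT rate (bits/s) = delivery rate (items/s) x IT per item (bits)."""
    return items_per_second * static_it_bits(confusions)

# Perfect identification of 4 equiprobable stimuli carries 2 bits/item;
# at a delivery rate of 2.5 items/s this yields 5 bits/s.
perfect = [[25, 0, 0, 0],
           [0, 25, 0, 0],
           [0, 0, 25, 0],
           [0, 0, 0, 25]]
print(it_rate(perfect, 2.5))  # 5.0
```

On this view, the paper's point is that raising either factor alone does not help indefinitely: with a richer stimulus set, the delivery rate at which the product peaks shifts, so the two must be optimized jointly.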
23
Kitada R, Johnsrude IS, Kochiyama T, Lederman SJ. Brain networks involved in haptic and visual identification of facial expressions of emotion: An fMRI study. Neuroimage 2010; 49:1677-89. [DOI: 10.1016/j.neuroimage.2009.09.014]
24
Abramowicz A, Klatzky RL, Lederman SJ. Learning and Generalization in Haptic Classification of 2-D Raised-Line Drawings of Facial Expressions of Emotion by Sighted and Adventitiously Blind Observers. Perception 2010; 39:1261-75. [DOI: 10.1068/p6686]
Abstract
Sighted blindfolded individuals can successfully classify basic facial expressions of emotion (FEEs) by manually exploring simple 2-D raised-line drawings (Lederman et al 2008, IEEE Transactions on Haptics 1:27–38). The effect of training on classification accuracy was assessed by sixty sighted blindfolded participants (experiment 1) and by three adventitiously blind participants (experiment 2). We further investigated whether the underlying learning process(es) constituted token-specific learning and/or generalization. A hybrid learning paradigm comprising pre/post and old/new test comparisons was used. For both participant groups, classification accuracy for old (ie trained) drawings markedly increased over study trials (mean improvement = 76%, and 88%, respectively). Additionally, RT decreased by a mean of 30% for the sighted, and 31% for the adventitiously blind. Learning was mostly token-specific, but some generalization was also observed for both groups. The sighted classified novel drawings of all six FEEs faster with training (mean RT decrease = 20%). Accuracy also improved significantly (mean improvement = 20%), but this improvement was restricted to two FEEs (anger and sadness). Two of three adventitiously blind participants classified new drawings more accurately (mean improvement = 30%); however, RTs for this group did not reflect generalization. Based on a limited number of blind subjects, our results tentatively suggest that adventitiously blind individuals learn to haptically classify FEEs as well as, or even better than, sighted persons.
Affiliation(s)
- Roberta L Klatzky
- Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA
25
Lederman SJ, Klatzky RL, Rennert-May E, Lee JH, Ng K, Hamilton C. Haptic Processing of Facial Expressions of Emotion in 2D Raised-Line Drawings. IEEE Transactions on Haptics 2008; 1:27-38. [PMID: 27788083] [DOI: 10.1109/toh.2008.3]
Abstract
Participants haptically (vs. visually) classified universal facial expressions of emotion (FEEs) depicted in simple 2D raised-line displays. Experiments 1 and 2 established that haptic classification was well above chance; face-inversion effects further indicated that the upright orientation was privileged. Experiment 2 added a third condition in which the normal configuration of the upright features was spatially scrambled. Results confirmed that configural processing played a critical role, since upright FEEs were classified more accurately and confidently than either scrambled or inverted FEEs, which did not differ. Because accuracy in both scrambled and inverted conditions was above chance, feature processing also played a role, as confirmed by commonalities across confusions for upright, inverted, and scrambled faces. Experiment 3 required participants to visually and haptically assign emotional valence (positive/negative) and magnitude to upright and inverted 2-D FEE displays. While emotional magnitude could be assigned using either modality, haptic presentation led to more variable valence judgments. We also documented a new face-inversion effect for emotional valence visually, but not haptically. These results suggest emotions can be interpreted from 2-D displays presented haptically as well as visually; however, emotional impact is judged more reliably by vision than by touch. Potential applications of this work are also considered.