101
Ross P, Atkinson AP. Expanding Simulation Models of Emotional Understanding: The Case for Different Modalities, Body-State Simulation Prominence, and Developmental Trajectories. Front Psychol 2020; 11:309. [PMID: 32194476] [PMCID: PMC7063097] [DOI: 10.3389/fpsyg.2020.00309]
Abstract
Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, Durham, United Kingdom
102
Halovic S, Kroos C, Stevens C. Adaptation aftereffects influence the perception of specific emotions from walking gait. Acta Psychol (Amst) 2020; 204:103026. [PMID: 32087419] [DOI: 10.1016/j.actpsy.2020.103026]
Abstract
We investigated the existence and nature of adaptation aftereffects on the visual perception of basic emotions displayed through walking gait. Stimuli were previously validated gender-ambiguous point-light walker models displaying various basic emotions (happy, sad, anger and fear). Results indicated that both facilitative and inhibitive aftereffects influenced the perception of all displayed emotions. Facilitative aftereffects were found between theoretically opposite emotions (i.e. happy/sad and anger/fear). Evidence suggested that low-level and high-level visual processes contributed to both stimulus aftereffect and conceptual aftereffect mechanisms. Significant aftereffects were more frequently evident for the time required to identify the displayed emotion than for emotion identification rates. The perception of basic emotions from walking gait is influenced by a number of different perceptual mechanisms which shift the categorical boundaries of each emotion as a result of perceptual experience.
103
Cheshin A. The Impact of Non-normative Displays of Emotion in the Workplace: How Inappropriateness Shapes the Interpersonal Outcomes of Emotional Displays. Front Psychol 2020; 11:6. [PMID: 32116884] [PMCID: PMC7033655] [DOI: 10.3389/fpsyg.2020.00006]
Abstract
When it comes to evaluating emotions as either “good” or “bad,” everyday beliefs regarding emotions rely mostly on their hedonic features—does the emotion feel good to the person experiencing the emotion? However, emotions are not only felt inwardly; they are also displayed outwardly, and others’ responses to an emotional display can produce asymmetric outcomes (i.e., even emotions that feel good to the displayer can lead to negative outcomes for the displayer and others). Focusing on organizational settings, this manuscript reviews the literature on the outcomes of emotional expressions and argues that the evidence points to perceived (in)appropriateness of emotional displays as key to their consequences: emotional displays that are deemed inappropriate generate disadvantageous outcomes for the displayer, and at times also the organization. Drawing on relevant theoretical models [Emotions as Social Information (EASI) theory, the Dual Threshold Model of Anger, and Asymmetrical Outcomes of Emotions], the paper highlights three broad and interrelated reasons why emotion displays could be deemed unfitting and inappropriate: (1) characteristics of the displayer (e.g., status, gender); (2) characteristics of the display (e.g., intensity, mode); and (3) characteristics of the context (e.g., national or organizational culture, topic of interaction). The review focuses on three different emotions—anger, sadness, and happiness—which differ in their valence based on how they feel to the displayer, but can yield different interpersonal outcomes. In conclusion, the paper argues that inappropriateness must be judged separately from whether an emotional display is civil (i.e., polite and courteous) or uncivil (i.e., rude, discourteous, and offensive). Testable propositions are presented, as well as suggested future research directions.
Affiliation(s)
- Arik Cheshin
- Department of Human Services, University of Haifa, Haifa, Israel
104
Isernia S, Sokolov AN, Fallgatter AJ, Pavlova MA. Untangling the Ties Between Social Cognition and Body Motion: Gender Impact. Front Psychol 2020; 11:128. [PMID: 32116932] [PMCID: PMC7016199] [DOI: 10.3389/fpsyg.2020.00128]
Abstract
We proved the viability of the general hypothesis that biological motion (BM) processing serves as a hallmark of social cognition. We assumed that BM processing and inferring emotions through BM (body language reading) are firmly linked and examined whether this tie is gender-specific. Healthy females and males completed two tasks with the same set of point-light BM displays portraying angry and neutral locomotion of female and male actors. For one task, perceivers had to indicate actor gender, while for the other, they had to infer the emotional content of locomotion. Thus, with identical visual input, we directed task demands either to BM processing or to inferring emotion. This design allows a direct comparison between sensitivity to BM and recognition of emotions conveyed by the same BM. In addition, perceivers were administered a set of photographs from the Reading the Mind in the Eyes Test (RMET), with which they identified either emotional state or actor gender. Although there were no gender differences in performance on the BM tasks, in males a tight link emerged between recognition accuracy for emotion and for gender through BM. In females only, body language reading (both accuracy and response time) was associated with performance on the RMET. The outcome underscores gender-specific modes in visual social cognition and motivates investigation of body language reading in a wide range of neuropsychiatric disorders.
Affiliation(s)
- Sara Isernia
- Department of Psychiatry and Psychotherapy, Medical School and University Hospital, Eberhard Karls University of Tübingen, Tübingen, Germany
- Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy
- CADITeR, IRCCS Fondazione Don Carlo Gnocchi ONLUS, Milan, Italy
- Alexander N. Sokolov
- Department of Psychiatry and Psychotherapy, Medical School and University Hospital, Eberhard Karls University of Tübingen, Tübingen, Germany
- Andreas J. Fallgatter
- Department of Psychiatry and Psychotherapy, Medical School and University Hospital, Eberhard Karls University of Tübingen, Tübingen, Germany
- Marina A. Pavlova
- Department of Psychiatry and Psychotherapy, Medical School and University Hospital, Eberhard Karls University of Tübingen, Tübingen, Germany
105
Prediction of action outcome: Effects of available information about body structure. Atten Percept Psychophys 2019; 82:2076-2084. [PMID: 31797178] [DOI: 10.3758/s13414-019-01883-5]
Abstract
Correctly perceiving the movements of opponents is essential in everyday life as well as in many sports. Several studies have shown better prediction performance for detailed stimuli compared to point-light displays (PLDs). However, it remains unclear whether differences in prediction performance result from explicit information about articulation or from information about body shape. We therefore presented three types of stimuli (PLDs, stick figures, and skinned avatars) with different amounts of available information about soccer players' run-ups. Stimulus presentation was faded out at ball contact. Participants had to react to the perceived shot direction with a full-body movement. Results showed no differences between presentation modes for time to virtual ball contact. However, prediction performance was significantly better for avatars and stick figures than for PLDs, but did not differ between avatars and stick figures, suggesting that explicit information about the articulation of the major joints is the main driver of better prediction performance and plays a larger role than detailed information about body shape. We also tracked eye movements and found that gaze behavior for avatars differed from that for PLDs and stick figures, with no significant differences between PLDs and stick figures. This effect was due to more and longer fixations on the head when avatars were presented.
106
Ross P, Flack T. Removing Hand Form Information Specifically Impairs Emotion Recognition for Fearful and Angry Body Stimuli. Perception 2019; 49:98-112. [PMID: 31801026] [DOI: 10.1177/0301006619893229]
Abstract
Emotion perception research has largely been dominated by work on facial expressions, but emotion is also strongly conveyed from the body. Research exploring emotion recognition from the body tends to refer to “the body” as a whole entity. However, the body is made up of different components (hands, arms, trunk, etc.), all of which could be differentially contributing to emotion recognition. We know that the hands can help to convey actions and, in particular, are important for social communication through gestures, but we currently do not know to what extent the hands influence emotion recognition from the body. Here, 93 adults viewed static emotional body stimuli with either the hands, arms, or both components removed and completed a forced-choice emotion recognition task. Removing the hands significantly reduced recognition accuracy for fear and anger but made no significant difference to the recognition of happiness and sadness. Removing the arms had no effect on emotion recognition accuracy compared with the full-body stimuli. These results suggest the hands may play a key role in the recognition of emotions from the body.
Affiliation(s)
- Paddy Ross
- Department of Psychology, Durham University, UK
- Tessa Flack
- School of Psychology, University of Lincoln, UK
107
Gu Q, Li W, Lu X, Chen H, Shen M, Gao Z. Agent identity drives adaptive encoding of biological motion into working memory. J Vis 2019; 19:6. [PMID: 31826251] [DOI: 10.1167/19.14.6]
Abstract
To engage in normal social interactions, we have to encode human biological motions (BMs, e.g., walking and jumping), one of the most salient and biologically significant types of kinetic information encountered in everyday life, into working memory (WM). Critically, each BM in real life is produced by a distinct person, carrying a dynamic motion signature (i.e., identity). Whether agent identity influences the WM processing of BMs remains unknown. Here, we addressed this question by examining whether memorizing BMs with different identities promoted the WM processing of task-irrelevant clothing colors. Two opposing hypotheses were tested: (a) WM stores only the target action (element-based hypothesis), and (b) WM stores both the action and the irrelevant clothing color (event-based hypothesis), interpreting each BM as an event. We required participants to memorize actions performed either by one agent or by distinct agents, while ignoring clothing colors. We then examined whether the irrelevant color was also stored in WM by probing a distracting effect: if the color was extracted into WM, a change of the irrelevant color in the probe would lead to a significant distracting effect on action performance. We found that WM encoding of BMs was adaptive: once the memorized actions had different identities, WM adopted an event-based encoding mode regardless of memory load and probe identity (Experiment 1, different-identity group of Experiment 2, and Experiment 3). However, WM used an element-based encoding mode when the memorized actions shared the same identity (same-identity group of Experiment 2) or were inverted (Experiment 4). Overall, these findings imply that agent identity information has a significant effect on the WM processing of BMs.
Affiliation(s)
- Quan Gu
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Wenmin Li
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Xiqian Lu
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Hui Chen
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Mowei Shen
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Zaifeng Gao
- Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
108
Riddell H, Li L, Lappe M. Heading perception from optic flow in the presence of biological motion. J Vis 2019; 19:25. [PMID: 31868898] [DOI: 10.1167/19.14.25]
Abstract
We investigated whether biological motion biases heading estimation from optic flow in a similar manner to nonbiological moving objects. In two experiments, observers judged their heading from displays depicting linear translation over a random-dot ground with normal point light walkers, spatially scrambled point light walkers, or laterally moving objects composed of random dots. In Experiment 1, we found that both types of walkers biased heading estimates similarly to moving objects when they obscured the focus of expansion of the background flow. In Experiment 2, we also found that walkers biased heading estimates when they did not obscure the focus of expansion. These results show that both regular and scrambled biological motion affect heading estimation in a similar manner to simple moving objects, and suggest that biological motion is not preferentially processed for the perception of self-motion.
Affiliation(s)
- Hugh Riddell
- Institute for Psychology and Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany
- Li Li
- Faculty of Arts and Science, NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, Shanghai, China
- Markus Lappe
- Institute for Psychology and Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Muenster, Germany
109
Bek J, Poliakoff E, Lander K. Measuring emotion recognition by people with Parkinson's disease using eye-tracking with dynamic facial expressions. J Neurosci Methods 2019; 331:108524. [PMID: 31747554] [DOI: 10.1016/j.jneumeth.2019.108524]
Abstract
BACKGROUND: Motion is an important cue to emotion recognition, and it has been suggested that we recognize emotions via internal simulation of others' expressions. There is a reduction of facial expression in Parkinson's disease (PD), which may influence the ability to use motion to recognise emotions in others. However, the majority of previous work in PD has used only static expressions. Moreover, few studies have used eye-tracking to explore emotion processing in PD.
NEW METHOD: We measured accuracy and eye movements in people with PD and healthy controls when identifying emotions from both static and dynamic facial expressions.
RESULTS: The groups did not differ overall in emotion recognition accuracy, but motion significantly increased recognition only in the control group. Participants made fewer and longer fixations when viewing dynamic expressions, and interest area analysis revealed increased gaze to the mouth region and decreased gaze to the eyes for dynamic stimuli, although the latter was specific to the control group.
COMPARISON WITH EXISTING METHODS: Ours is the first study to directly compare recognition of static and dynamic emotional expressions in PD using eye-tracking, revealing subtle differences between groups that may otherwise be undetected.
CONCLUSIONS: It is feasible and informative to use eye-tracking with dynamic expressions to investigate emotion recognition in PD. Our findings suggest that people with PD may differ from healthy older adults in how they utilise motion during facial emotion recognition. Nonetheless, gaze patterns indicate some effects of motion on emotional processing, highlighting the need for further investigation in this area.
Affiliation(s)
- Judith Bek
- Division of Neuroscience and Experimental Psychology, School of Biological Sciences, Faculty of Biology Medicine and Health, Manchester Academic Health Science Centre, University of Manchester, UK.
- Ellen Poliakoff
- Division of Neuroscience and Experimental Psychology, School of Biological Sciences, Faculty of Biology Medicine and Health, Manchester Academic Health Science Centre, University of Manchester, UK.
- Karen Lander
- Division of Neuroscience and Experimental Psychology, School of Biological Sciences, Faculty of Biology Medicine and Health, Manchester Academic Health Science Centre, University of Manchester, UK.
110
Freeman JB, Stolier RM, Brooks JA. Dynamic interactive theory as a domain-general account of social perception. Advances in Experimental Social Psychology 2019; 61:237-287. [PMID: 34326560] [PMCID: PMC8317542] [DOI: 10.1016/bs.aesp.2019.09.005]
Abstract
The perception of social categories, emotions, and personality traits from others' faces each have been studied extensively but in relative isolation. We synthesize emerging findings suggesting that, in each of these domains of social perception, both a variety of bottom-up facial features and top-down social cognitive processes play a part in driving initial perceptions. Among such top-down processes, social-conceptual knowledge in particular can have a fundamental structuring role in how we perceive others' faces. Extending the Dynamic Interactive framework (Freeman & Ambady, 2011), we outline a perspective whereby the perception of social categories, emotions, and traits from faces can all be conceived as emerging from an integrated system relying on domain-general cognitive properties. Such an account of social perception would envision perceptions to be a rapid, but gradual, process of negotiation between the variety of visual cues inherent to a person and the social cognitive knowledge an individual perceiver brings to the perceptual process. We describe growing evidence in support of this perspective as well as its theoretical implications for social psychology.
111
Geiger A, Bente G, Lammers S, Tepest R, Roth D, Bzdok D, Vogeley K. Distinct functional roles of the mirror neuron system and the mentalizing system. Neuroimage 2019; 202:116102. [DOI: 10.1016/j.neuroimage.2019.116102]
112
Ogren M, Kaplan B, Peng Y, Johnson KL, Johnson SP. Motion or emotion: Infants discriminate emotional biological motion based on low-level visual information. Infant Behav Dev 2019; 57:101324. [PMID: 31112859] [PMCID: PMC6859203] [DOI: 10.1016/j.infbeh.2019.04.006]
Abstract
Infants' ability to discriminate emotional facial expressions and tones of voice is well-established, yet little is known about infant discrimination of emotional body movements. Here, we asked if 10-20-month-old infants rely on high-level emotional cues or low-level motion related cues when discriminating between emotional point-light displays (PLDs). In Study 1, infants viewed 18 pairs of angry, happy, sad, or neutral PLDs. Infants looked more at angry vs. neutral, happy vs. neutral, and neutral vs. sad. Motion analyses revealed that infants preferred the PLD with more total body movement in each pairing. Study 2, in which infants viewed inverted versions of the same pairings, yielded similar findings except for sad-neutral. Study 3 directly paired all three emotional stimuli in both orientations. The angry and happy stimuli did not significantly differ in terms of total motion, but both had more motion than the sad stimuli. Infants looked more at angry vs. sad, more at happy vs. sad, and about equally to angry vs. happy in both orientations. Again, therefore, infants preferred PLDs with more total body movement. Overall, the results indicate that a low-level motion preference may drive infants' discrimination of emotional human walking motions.
Affiliation(s)
- Marissa Ogren
- Department of Psychology, University of California, Los Angeles, United States.
- Brianna Kaplan
- Department of Psychology, New York University, United States
- Yujia Peng
- Department of Psychology, University of California, Los Angeles, United States
- Kerri L Johnson
- Department of Psychology, University of California, Los Angeles, United States
- Scott P Johnson
- Department of Psychology, University of California, Los Angeles, United States
113
Ye T, Li P, Zhang Q, Gu Q, Lu X, Gao Z, Shen M. Relation Between Working Memory Capacity of Biological Movements and Fluid Intelligence. Front Psychol 2019; 10:2313. [PMID: 31749726] [PMCID: PMC6842976] [DOI: 10.3389/fpsyg.2019.02313]
Abstract
Studies have revealed that there is an independent buffer for holding biological movements (BM) in working memory (WM), and that this BM-WM has a unique link to our social ability. However, it remains unknown whether the BM-WM also correlates with our cognitive abilities, such as fluid intelligence (Gf). Since BM processing has been considered a hallmark of social cognition, which is distinct from canonical cognitive abilities in many ways, it has been hypothesized that only canonical object-WM (e.g., memorizing color patches), but not BM-WM, has an intimate relation with Gf. We tested this prediction by measuring the relationship between the WM capacity of BM and Gf. Using two Gf measurements, we consistently found moderate correlations between BM-WM capacity and scores on both Raven's Advanced Progressive Matrices (RAPM) and the Cattell Culture Fair Intelligence Test (CCFIT). This result revealed, for the first time, a close relation between WM and Gf for a social stimulus, and challenges the double-dissociation hypothesis of distinct functions for different WM buffers.
Affiliation(s)
- Tian Ye
- Department of Psychology, Zhejiang University, Hangzhou, China
- Peng Li
- School of Education and Management, Yunnan Normal University, Kunming, China
- Qiong Zhang
- Department of Psychology, Zhejiang University, Hangzhou, China
- Quan Gu
- Department of Psychology, Zhejiang University, Hangzhou, China
- Xiqian Lu
- Department of Psychology, Zhejiang University, Hangzhou, China
- Zaifeng Gao
- Department of Psychology, Zhejiang University, Hangzhou, China
- Mowei Shen
- Department of Psychology, Zhejiang University, Hangzhou, China
114
Federman D, Maltz Schwartz R, Amital H. Extraversion in women with fibromyalgia as a predictor of better prognosis: an intervention model in dance movement therapy. Body Movement and Dance in Psychotherapy 2019. [DOI: 10.1080/17432979.2019.1672790]
Affiliation(s)
- Dita Federman
- School of Creative Art Therapies, Faculty of Social Welfare & Health Sciences, Emili Sagol CAT Research Center, University of Haifa, Haifa, Israel
- Ravit Maltz Schwartz
- Graduate School of Creative Arts Therapies, Faculty of Social Welfare & Health Sciences, University of Haifa, Haifa, Israel
- Howard Amital
- Department of Medicine ‘B’, and Autoimmunity Center, Sheba Medical Center, Tel-Hashomer, Sackler Faculty of Medicine, Tel-Aviv University, Tel-Aviv, Israel
115
Filntisis PP, Efthymiou N, Koutras P, Potamianos G, Maragos P. Fusing Body Posture With Facial Expressions for Joint Recognition of Affect in Child–Robot Interaction. IEEE Robot Autom Lett 2019. [DOI: 10.1109/lra.2019.2930434]
116
Lammers S, Bente G, Tepest R, Jording M, Roth D, Vogeley K. Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies. Front Robot AI 2019; 6:94. [PMID: 33501109] [PMCID: PMC7805965] [DOI: 10.3389/frobt.2019.00094]
Abstract
Others' movements inform us about their current activities as well as their intentions and emotions. Research on the distinct mechanisms underlying action recognition and emotion inferences has been limited due to a lack of suitable comparative stimulus material. Problematic confounds can derive from low-level physical features (e.g., luminance), as well as from higher-level psychological features (e.g., stimulus difficulty). Here we present a standardized stimulus dataset that allows both action and emotion recognition to be addressed with identical stimuli. The stimulus set consists of 792 computer animations of a neutral avatar based on full-body motion capture protocols. Motion capture was performed on 22 human volunteers, instructed to perform six everyday activities (mopping, sweeping, painting with a roller, painting with a brush, wiping, sanding) in three different moods (angry, happy, sad). Five-second clips of each motion protocol were rendered into AVI files using two virtual camera perspectives for each clip. In contrast to video stimuli, the computer animations made it possible to standardize the physical appearance of the avatar and to control lighting and coloring conditions, thus reducing the stimulus variation to mere movement. To control for low-level optical features of the stimuli, we developed and applied a set of MATLAB routines extracting basic physical features of the stimuli, including average background-foreground proportion and frame-by-frame pixel change dynamics. This information was used to identify outliers and to homogenize the stimuli across action and emotion categories. This led to a smaller stimulus subset (n = 83 animations within the 792-clip database) that contained only two different actions (mopping, sweeping) and two different moods (angry, happy).
To further homogenize this stimulus subset with regard to psychological criteria, we conducted an online observer study (N = 112 participants) to assess the recognition rates for actions and moods, which led to a final sub-selection of 32 clips (eight per category) within the database. The ACASS database and its subsets provide unique opportunities for research applications in social psychology, social neuroscience, and applied clinical studies on communication disorders. All 792 AVI files, selected subsets, MATLAB code, annotations, and motion capture data (FBX files) are available online.
Affiliation(s)
- Sebastian Lammers
- Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany; Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany
| | - Gary Bente
- Department of Communication, Michigan State University, East Lansing, MI, United States
| | - Ralf Tepest
- Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany
| | - Mathis Jording
- Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany
| | - Daniel Roth
- Human-Computer Interaction, Institute for Computer Science, University of Würzburg, Würzburg, Germany
| | - Kai Vogeley
- Department of Psychiatry, Faculty of Medicine and University Hospital Cologne, University of Cologne, Cologne, Germany.,Cognitive Neuroscience (INM-3), Institute of Neuroscience and Medicine, Research Center Jülich, Jülich, Germany
| |
117
Canderan C, Maieron M, Fabbro F, Tomasino B. Understanding Body Language Does Not Require Matching the Body's Egocentric Map to Body Posture: A Brain Activation fMRI Study. Percept Mot Skills 2019; 127:8-35. [PMID: 31537169] [DOI: 10.1177/0031512519876741]
Abstract
Body language (BL) is a type of nonverbal communication in which the body communicates the message. We contrasted participants' cognitive processing of body representations or meanings versus body positions. Participants (N = 20) were shown pictures depicting body postures and were instructed to focus either on their meaning (BL) or on the position of one body part relative to another (body structural description [BSD]). We examined activation in brain areas related to the two types of body representation (body schema and BSD) as modulated by the two tasks. We presumed that if understanding BL triggers embodiment of body posture, a matching procedure should occur between the observed posture and the egocentric map coding the position of one's body segments in space and time. We found that BL (vs. BSD) differentially activated the angular gyrus bilaterally, the anterior middle temporal gyrus, the temporal pole, the right superior temporal gyrus, the inferior frontal gyrus, the superior medial gyrus, and the left superior frontal gyrus. BSD (vs. BL) differentially activated the superior parietal lobule (Area 7A) bilaterally, the posterior inferior temporal gyrus, the middle frontal gyrus, and the left precentral gyrus. Sensorimotor areas were differentially activated by BSD when compared with BL. Inclusive masking showed significant voxels in the superior colliculus and pulvinar, the fusiform gyrus, the inferior temporal gyrus, the superior temporal gyrus, the intraparietal sulcus bilaterally, the inferior frontal gyrus bilaterally, and the precentral gyrus. These results indicate common brain networks for processing BL and BSD, with some areas showing differentially stronger or weaker engagement in one task or the other; the precuneus, the superior parietal lobule, the intraparietal sulcus, and sensorimotor areas were most strongly engaged by the BSD task. In contrast, the parietal operculum, an area related to the body schema, a representation crucial for the embodiment of body postures, was not activated in the inclusive masking or in the differential contrasts.
Affiliation(s)
- Cinzia Canderan: Scientific Institute, IRCCS "E. Medea," San Vito al Tagliamento (PN), Italy
- Marta Maieron: Fisica Medica, Azienda Sanitaria Universitaria Integrata di Udine, Italy
- Franco Fabbro: Dipartimento di Area Medica, Università di Udine, Italy
- Barbara Tomasino: Scientific Institute, IRCCS "E. Medea," San Vito al Tagliamento (PN), Italy
118
Hsiung E, Chien SH, Chu Y, Ho MW. Adults with autism are less proficient in identifying biological motion actions portrayed with point-light displays. J Intellect Disabil Res 2019; 63:1111-1124. [PMID: 31020725] [PMCID: PMC6850387] [DOI: 10.1111/jir.12623]
Abstract
BACKGROUND: Whether individuals with autism spectrum disorder (ASD) have impairments in biological motion perception has been debated. The present study examined the ability to identify point-light-displayed (PLD) human actions in neurotypical (NT) adults and adults with ASD.
METHOD: Twenty-seven adults with ASD (mean age = 28.36) and 30 NT adults (mean age = 22.45) were tested. Both groups viewed 10 different biological motion actions involving contact with an object/tool and 10 without such contact. Each action was presented twice, and participants' naming responses and reaction times were recorded.
RESULTS: The ASD group had a significantly lower total number of correct items (M = 29.30 ± 5.08 out of 40) and a longer response time (M = 4550 ± 1442 ms) than the NT group (M = 32.77 ± 2.78; M = 3556 ± 1148 ms). Both groups were better at naming the actions without objects (ASD group: 17.33 ± 2.30, NT group: 18.67 ± 1.30) than those with objects (ASD group: 11.96 ± 3.57, NT group: 14.10 ± 1.97). Correlation analyses showed that individuals with higher Autism-Spectrum Quotient scores tended to make more errors and respond more slowly.
CONCLUSION: Adults with ASD were able to identify PLD biological motion actions well above chance; however, they were less proficient than NT adults in terms of accuracy and speed, regardless of action type.
Affiliation(s)
- E.-Y. Hsiung: Graduate Institute of Biomedical Sciences, China Medical University, Taichung, Taiwan
- S. H.-L. Chien: Graduate Institute of Biomedical Sciences, China Medical University, Taichung, Taiwan; Graduate Institute of Neural and Cognitive Sciences, China Medical University, Taichung, Taiwan
- Y.-H. Chu: Department of Physical Therapy, China Medical University, Taichung, Taiwan
- M. W.-R. Ho: Graduate Institute of Biomedical Sciences, China Medical University, Taichung, Taiwan
119
Biological motion and animacy belief induce similar effects on involuntary shifts of attention. Atten Percept Psychophys 2019; 82:1099-1111. [PMID: 31414364] [DOI: 10.3758/s13414-019-01843-z]
Abstract
Biological motion is salient to the human visual and motor systems and may be intrinsic to the perception of animacy. Evidence for the salience of visual stimuli moving with trajectories consistent with biological motion comes from studies showing that such stimuli can trigger shifts of attention in the direction of that motion. The present study was conducted to determine whether or not top-down beliefs about animacy can modify the salience of a nonbiologically moving stimulus to the visuomotor system. A nonpredictive cuing task was used in which a white dot moved from a central location toward a left- or right-sided target placeholder. The target randomly appeared at either location 200, 600, or 1,300 ms after the motion onset. Five groups of participants experienced different stimulus conditions: (1) biological motion, (2) inverted biological motion, (3) nonbiological motion, (4) animacy belief (paired with nonbiological motion), and (5) computer-generated belief (paired with nonbiological motion). Analysis of response times revealed that the motion in the biological motion and animacy belief groups, but not in the inverted and nonbiological motion groups, affected processing of the target information. These findings indicate that biological motion is salient to the visual system and that top-down beliefs regarding the animacy of the stimulus can tune the visual and motor systems to increase the salience of nonbiological motion.
120
Lee KS, Chang DHF. Biological motion perception is differentially predicted by Autistic trait domains. Sci Rep 2019; 9:11029. [PMID: 31363154] [PMCID: PMC6667460] [DOI: 10.1038/s41598-019-47377-0]
Abstract
We tested the relationship between biological motion perception and the Autism-Spectrum Quotient (AQ). In three experiments, we indexed observers' performance on a classic left-right discrimination task in which participants reported the facing direction of walkers containing solely structural or kinematic information, a motion discrimination task in which participants indicated the apparent motion of a (non-biological) random-dot stimulus, and a novel naturalness discrimination task. In the naturalness discrimination task, we systematically manipulated the degree of natural acceleration contained in the stimulus by parametrically morphing between a fully veridical stimulus and one from which acceleration was removed. Participants were asked to discriminate the more natural (i.e., acceleration-containing) stimulus from the constant-velocity stimulus. Although we found no reliable associations of either overall AQ scores or subdomain scores with performance on the direction-related tasks, we found a robust association between performance on the biological motion naturalness task and attention-switching subdomain scores. Our findings suggest that understanding the relationship between the Autism Spectrum and perception is a far more intricate problem than previously suggested. While it has been shown that the AQ can be used as a proxy to tap into perceptual endophenotypes in Autism, the eventual diagnostic value of a perceptual task depends on the task's consideration of biological content and demands.
Affiliation(s)
- Ka Shu Lee: Department of Psychology, The University of Hong Kong, Hong Kong, China
- Dorita H F Chang: Department of Psychology, The University of Hong Kong, Hong Kong, China; State Key Laboratory of Brain and Cognitive Sciences, The University of Hong Kong, Hong Kong, China
121
Reynolds RM, Novotny E, Lee J, Roth D, Bente G. Ambiguous Bodies: The Role of Displayed Arousal in Emotion [Mis]Perception. J Nonverbal Behav 2019. [DOI: 10.1007/s10919-019-00312-3]
122
Liu Y, Lu X, Wu F, Shen M, Gao Z. Biological motion is stored independently from bound representation in working memory. Vis Cogn 2019. [DOI: 10.1080/13506285.2019.1638479]
Affiliation(s)
- Yang Liu: Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Xiqian Lu: Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Fan Wu: Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Mowei Shen: Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
- Zaifeng Gao: Department of Psychology and Behavioral Sciences, Zhejiang University, Hangzhou, People's Republic of China
123
Melzer A, Shafir T, Tsachor RP. How Do We Recognize Emotion From Movement? Specific Motor Components Contribute to the Recognition of Each Emotion. Front Psychol 2019; 10:1389. [PMID: 31333524] [PMCID: PMC6617736] [DOI: 10.3389/fpsyg.2019.01389]
Abstract
Are there movement features that are recognized as expressing each basic emotion by most people, and what are they? In our previous study we identified sets of Laban movement components that, when moved, elicited the basic emotions of anger, sadness, fear, and happiness. Our current study aimed to investigate whether movements composed from those sets would be recognized as expressing those emotions, regardless of any instruction to the mover to portray emotion. Our stimuli included 113 video clips of five Certified Laban Movement Analysts (CMAs) moving combinations of two to four movement components from each set associated with only one emotion: happiness, sadness, fear, or anger. Each three-second clip showed one CMA moving a single combination. The CMAs moved only the combination's required components. Sixty-two physically and mentally healthy men (n = 31) and women (n = 31), ages 19–48, watched the clips and rated the perceived emotion and its intensity. To confirm participants' ability to recognize emotions from movement and to compare our stimuli to existing validated emotional expression stimuli, participants rated 50 additional clips of bodily motor expressions of these same emotions validated by Atkinson et al. (2004). Results showed that for both stimulus types, all emotions were recognized far above chance level. Comparing recognition accuracy of the two clip types revealed better recognition of anger, fear, and neutral emotion from Atkinson's clips of actors expressing emotions, and similar levels of recognition accuracy for happiness and sadness. Further analysis was performed to determine the contribution of specific movement components to the recognition of the studied emotions.
Our results indicated that these specific Laban motor components not only enhance the feeling of the associated emotions when moved, but also contribute to recognition of the associated emotions when observed, even when the mover was not instructed to portray emotion, indicating that the presence of these movement components alone is sufficient for emotion recognition. This research-based knowledge regarding the relationship between Laban motor components and bodily emotional expressions can be used by dance-movement and drama therapists to better understand clients' emotional movements, to create appropriate interventions, and to enhance communication with other practitioners regarding bodily emotional expression.
Affiliation(s)
- Ayelet Melzer: Faculty of Social Welfare and Health Sciences, The Graduate School of Creative Arts Therapies, University of Haifa, Haifa, Israel
- Tal Shafir: The Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Haifa, Israel
124
Pollux PM, Craddock M, Guo K. Gaze patterns in viewing static and dynamic body expressions. Acta Psychol (Amst) 2019; 198:102862. [PMID: 31226535] [DOI: 10.1016/j.actpsy.2019.05.014]
Abstract
Evidence for the importance of bodily cues for emotion recognition has grown over the last two decades. Despite this growing literature, it remains underspecified how observers view whole bodies during body expression recognition. Here we investigate to what extent body-viewing is face- and context-specific when participants categorize whole-body expressions in static (Experiment 1) and dynamic displays (Experiment 2). Eye-movement recordings showed that observers viewed the face exclusively when it was visible in dynamic displays, whereas viewing was distributed over head, torso, and arms in static displays and in dynamic displays in which faces were not visible. The strong face bias for dynamic face-visible expressions suggests that viewing of the body responds flexibly to the informativeness of facial cues for emotion categorisation. However, when facial expressions are static or not visible, observers adopt a viewing strategy that includes all upper-body regions. This viewing strategy is further influenced by subtle viewing biases directed towards emotion-specific body postures and movements, optimising recruitment of diagnostic information for emotion categorisation.
125
The Relationship Between Caregiver Burden and Emotion Recognition Deficits in Persons With MCI and Early AD. Alzheimer Dis Assoc Disord 2019; 33:266-271. [DOI: 10.1097/wad.0000000000000323]
126
Lindor ER, van Boxtel JJ, Rinehart NJ, Fielding J. Motor difficulties are associated with impaired perception of interactive human movement in autism spectrum disorder: A pilot study. J Clin Exp Neuropsychol 2019; 41:856-874. [DOI: 10.1080/13803395.2019.1634181]
Affiliation(s)
- Ebony R. Lindor: School of Psychological Sciences and Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Victoria, Australia; Deakin Child Study Centre, School of Psychology, Faculty of Health, Deakin University, Geelong, Victoria, Australia
- Jeroen J.A. van Boxtel: School of Psychological Sciences and Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Victoria, Australia; School of Psychology, Faculty of Health, University of Canberra, Canberra, Australia
- Nicole J. Rinehart: School of Psychological Sciences and Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Victoria, Australia; Deakin Child Study Centre, School of Psychology, Faculty of Health, Deakin University, Geelong, Victoria, Australia
- Joanne Fielding: School of Psychological Sciences and Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Victoria, Australia; Department of Neuroscience, Central Clinical School, Monash University, Melbourne, Australia
127
Addabbo M, Vacaru SV, Meyer M, Hunnius S. 'Something in the way you move': Infants are sensitive to emotions conveyed in action kinematics. Dev Sci 2019; 23:e12873. [PMID: 31144771] [DOI: 10.1111/desc.12873]
Abstract
Body movements, as well as faces, communicate emotions. Research in adults has shown that the perception of action kinematics plays a crucial role in understanding others' emotional experiences. Still, little is known about infants' sensitivity to bodily emotional expressions, since most research in infancy has focused on faces. While there is initial evidence that infants can recognize emotions conveyed in whole-body postures, it remains an open question whether they can extract emotional information from action kinematics. We measured electromyographic (EMG) activity over the muscles involved in happy (zygomaticus major, ZM), angry (corrugator supercilii, CS), and fearful (frontalis, F) facial expressions while 11-month-old infants observed the same action performed with either happy or angry kinematics. Results demonstrate that infants responded to angry and happy kinematics with matching facial reactions. In particular, ZM activity increased and CS activity decreased in response to happy kinematics, and vice versa for angry kinematics. Our results show for the first time that infants can rely on kinematic information to pick up on the emotional content of an action. Thus, from very early in life, action kinematics represent a fundamental and powerful source of information for revealing others' emotional states.
Affiliation(s)
- Margaret Addabbo: Department of Psychology, University of Milano-Bicocca, Milano, Italy
- Stefania V Vacaru: Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands
- Marlene Meyer: Department of Psychology, University of Chicago, Chicago, Illinois
- Sabine Hunnius: Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands
128
Siqi-Liu A, Harris AM, Atkinson AP, Reed CL. Dissociable processing of emotional and neutral body movements revealed by μ-alpha and beta rhythms. Soc Cogn Affect Neurosci 2019; 13:1269-1279. [PMID: 30351422] [PMCID: PMC6277737] [DOI: 10.1093/scan/nsy094]
Abstract
Both when actions are executed and observed, electroencephalography (EEG) has shown reduced alpha-band (8–12 Hz) oscillations over sensorimotor cortex. This ‘μ-alpha’ suppression is thought to reflect mental simulation of action, which has been argued to support internal representation of others’ emotional states. Despite the proposed role of simulation in emotion perception, little is known about the effect of emotional content on μ-suppression. We recorded high-density EEG while participants viewed point-light displays of emotional vs neutral body movements in ‘coherent’ biologically plausible and ‘scrambled’ configurations. Although coherent relative to scrambled stimuli elicited μ-alpha suppression, the comparison of emotional and neutral movement, controlling for basic visual input, revealed suppression effects in both alpha and beta bands. Whereas alpha-band activity reflected reduced power for emotional stimuli in central and occipital sensors, beta power at frontocentral sites was driven by enhancement for neutral relative to emotional actions. A median-split by autism-spectrum quotient score revealed weaker μ-alpha suppression and beta enhancement in participants with autistic tendencies, suggesting that sensorimotor simulation may be differentially engaged depending on social capabilities. Consistent with theories of embodied emotion, these data support a link between simulation and social perception while more firmly connecting emotional processing to the activity of sensorimotor systems.
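The μ-alpha suppression measure described above reduces, at its core, to comparing 8-12 Hz spectral power across conditions. A minimal sketch of such a band-power comparison, not the authors' analysis pipeline: the synthetic signals, the sampling rate, and the `band_power` helper are illustrative, using Welch's method from SciPy.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Average Welch PSD within the [lo, hi] Hz band."""
    freqs, psd = welch(x, fs=fs, nperseg=fs)   # nperseg=fs gives 1 Hz resolution
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

fs = 250                                       # sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic sensorimotor traces: a baseline with a strong 10 Hz rhythm,
# and an 'action observation' trace with the same rhythm attenuated.
baseline = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
observation = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)

# A ratio below 1 indicates mu-alpha suppression during observation.
suppression_ratio = band_power(observation, fs, 8, 12) / band_power(baseline, fs, 8, 12)
```

In practice such ratios would be computed per sensor and per condition (coherent vs. scrambled, emotional vs. neutral) before statistical comparison.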
Affiliation(s)
- Audrey Siqi-Liu: Department of Psychology, Claremont McKenna College, Claremont, CA, USA
- Alison M Harris: Department of Psychology, Claremont McKenna College, Claremont, CA, USA
- Catherine L Reed: Department of Psychology, Claremont McKenna College, Claremont, CA, USA
129
Darke H, Cropper SJ, Carter O. A Novel Dynamic Morphed Stimuli Set to Assess Sensitivity to Identity and Emotion Attributes in Faces. Front Psychol 2019; 10:757. [PMID: 31024397] [PMCID: PMC6465610] [DOI: 10.3389/fpsyg.2019.00757]
Abstract
Face-based tasks are used ubiquitously in the study of human perception and cognition. Video-based (dynamic) face stimuli are increasingly utilized by researchers because they have higher ecological validity than static images. However, few ready-to-use dynamic stimulus sets currently available to researchers include non-emotional and non-face control stimuli. This paper outlines the development of three original dynamic stimulus sets: a set of emotional faces (fear and disgust), a set of non-emotional faces, and a set of car animations. Morphing software was employed to vary the intensity of the expression shown and to vary the similarity between actors. Manipulating these dimensions permits the creation of tasks of varying difficulty that can be optimized to detect more subtle differences in face-processing ability. Using these new stimuli, two preliminary experiments were conducted to evaluate different aspects of facial identity recognition, emotion recognition, and non-face object discrimination. Results suggest that these five different tasks successfully avoided floor and ceiling effects in a healthy sample. A second experiment found that dynamic versions of the emotional stimuli were recognized more accurately than static versions, for both labeling and discrimination paradigms. This indicates that, like previous emotion-only stimulus sets, dynamic stimuli confer an advantage over image-based stimuli. These stimuli therefore provide a useful resource for researchers looking to investigate both emotional and non-emotional face-processing using dynamic stimuli. Moreover, the stimuli vary across crucial dimensions (i.e., face similarity and intensity of emotion), which allows researchers to modify task difficulty as required.
Affiliation(s)
- Hayley Darke: Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Simon J Cropper: Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia
- Olivia Carter: Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia
130
Tsachor RP, Shafir T. How Shall I Count the Ways? A Method for Quantifying the Qualitative Aspects of Unscripted Movement With Laban Movement Analysis. Front Psychol 2019; 10:572. [PMID: 31001158] [PMCID: PMC6455080] [DOI: 10.3389/fpsyg.2019.00572]
Abstract
There is significant clinical evidence showing that the creative and expressive movement processes involved in dance/movement therapy (DMT) enhance psycho-social well-being. Yet because movement is a complex phenomenon, statistically validating which aspects of movement change during interventions, or which lead to significant positive therapeutic outcomes, is challenging: movement has multiple, overlapping variables appearing in unique patterns in different individuals and situations. One factor contributing to the therapeutic effects of DMT is movement's effect on clients' emotional states. Our previous study identified sets of movement variables which, when executed, enhanced specific emotions. In this paper, we describe how we selected movement variables for statistical analysis in that study, using a multi-stage methodology to identify, reduce, code, and quantify the multitude of variables present in unscripted movement. We suggest a set of procedures for using Laban Movement Analysis (LMA)-described movement variables as research data. Our study used LMA, an internationally accepted comprehensive system for movement analysis and a primary DMT clinical assessment tool for describing movement. We began with Davis's (1970) three-step protocol for analyzing movement patterns and identifying the most important variables: (1) we repeatedly observed video samples of validated (Atkinson et al., 2004) emotional expressions to identify prevalent movement variables, eliminating variables that appeared minimally or were absent; (2) we used the criteria of repetition, frequency, duration, and emphasis to eliminate additional variables; (3) for each emotion, we analyzed variations in motor expression to discover how variables cluster: first, by observing ten movement samples of each emotion to identify variables common to all samples; second, by qualitative analysis of the two best-recognized samples to determine whether phrasing, duration, or the relationship among variables was significant. We added three new steps to this protocol: (4) we created Motifs (LMA symbols) combining the movement variables extracted in steps 1-3; (5) we asked participants in a pilot study to move these combinations and to quantify their emotional experience, and based on the results we eliminated further variables; (6) we quantified the remaining variables' prevalence in each Motif for the statistical analysis that examined which variables enhanced each emotion. We posit that our method successfully quantified unscripted movement data for statistical analysis.
Affiliation(s)
- Tal Shafir: The Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Haifa, Israel; Department of Psychiatry, University of Michigan, Ann Arbor, MI, United States
131
Wilson S, Gos C. Perceiving Social Cohesion: Movement Synchrony and Task Demands Both Matter. Perception 2019; 48:316-329. [PMID: 30871427] [DOI: 10.1177/0301006619837878]
Abstract
Previous research has shown that interpersonal synchrony is associated with a number of prosocial effects. We investigated the respective roles of behavioural synchrony and perceived task demands on perceptions of cohesion by performing two experiments in which participants viewed pairs of point-light figures engaging in four coordinated behaviours. Behaviours were seen twice, once in perfect in-phase synchrony and once with synchrony manipulated (phase shift: 180° in Experiment 1 and 45°, 90°, 270°, and 315° in Experiment 2). Dyads were rated on perceived exertion and perceived social cohesion. Results indicate that in-phase synchrony is associated with higher levels of perceived cohesion and that perceived exertion is a good predictor of cohesion ratings. Two interactions suggest that the effect is not purely perceptual and that participants observing coordinated movement also make inferences about the intentions of those observed. Results are discussed and future directions suggested.
Affiliation(s)
- Stuart Wilson: Division of Psychology, Sociology & Education, Queen Margaret University, Edinburgh, UK
- Caroline Gos: Division of Psychology, Sociology & Education, Queen Margaret University, Edinburgh, UK
132
Dockendorff M, Sebanz N, Knoblich G. Deviations from optimality should be an integral part of a working definition of SMC. Phys Life Rev 2019; 28:22-23. [DOI: 10.1016/j.plrev.2019.01.010]
133
Swarbrick D, Bosnyak D, Livingstone SR, Bansal J, Marsh-Rollo S, Woolhouse MH, Trainor LJ. How Live Music Moves Us: Head Movement Differences in Audiences to Live Versus Recorded Music. Front Psychol 2019; 9:2682. [PMID: 30687158] [PMCID: PMC6336707] [DOI: 10.3389/fpsyg.2018.02682]
Abstract
A live music concert is a pleasurable social event that is among the most visceral and memorable forms of musical engagement. But what inspires listeners to attend concerts, sometimes at great expense, when they could listen to recordings at home? An iconic aspect of popular concerts is engaging with other audience members through moving to the music. Head movements, in particular, reflect emotion and have social consequences when experienced with others. Previous studies have explored the affiliative social engagement experienced among people moving together to music. But live concerts have other features that might also be important, such as that during a live performance the music unfolds in a unique and not predetermined way, potentially increasing anticipation and feelings of involvement for the audience. Being in the same space as the musicians might also be exciting. Here we controlled for simply being in an audience to examine whether factors inherent to live performance contribute to the concert experience. We used motion capture to compare head movement responses at a live album release concert featuring Canadian rock star Ian Fletcher Thornley, and at a concert without the performers where the same songs were played from the recorded album. We also examined effects of a prior connection with the performers by comparing fans and neutral-listeners, while controlling for familiarity with the songs, as the album had not yet been released. Head movements were faster during the live concert than the album-playback concert. Self-reported fans moved faster and exhibited greater levels of rhythmic entrainment than neutral-listeners. These results indicate that live music engages listeners to a greater extent than pre-recorded music and that a pre-existing admiration for the performers also leads to higher engagement.
Affiliation(s)
- Dana Swarbrick
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada
- Dan Bosnyak
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada
- Steven R Livingstone
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada
- Jotthi Bansal
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada; Digital Music Lab, School of the Arts, McMaster University, Hamilton, ON, Canada
- Susan Marsh-Rollo
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada
- Matthew H Woolhouse
- McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada; Digital Music Lab, School of the Arts, McMaster University, Hamilton, ON, Canada
- Laurel J Trainor
- Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, ON, Canada; McMaster Institute for Music and the Mind, McMaster University, Hamilton, ON, Canada; Rotman Research Institute, Baycrest Hospital, Toronto, ON, Canada

134
Betz N, Hoemann K, Barrett LF. Words are a context for mental inference. Emotion 2019; 19:1463-1477. [PMID: 30628815] [DOI: 10.1037/emo0000510]
Abstract
Accumulating evidence indicates that context has an important impact on inferring emotion from facial configurations. In this paper, we report on three studies examining whether words referring to mental states contribute to mental inference in images from the Reading the Mind in the Eyes Test (Baron-Cohen et al., 2001; Study 1), in static emoji (Study 2), and in animated emoji (Study 3). Across all three studies, we predicted and found that perceivers were more likely to infer mental states when relevant words were embedded in the experimental context (i.e., in a forced-choice task) than when those words were absent (i.e., in a free-labeling task). We discuss the implications of these findings for the widespread conclusion that faces or parts of faces "display" emotions or other mental states, as well as for psychology's continued reliance on forced-choice methods.
135
Ferrari C, Papagno C, Todorov A, Cattaneo Z. Differences in Emotion Recognition From Body and Face Cues Between Deaf and Hearing Individuals. Multisens Res 2019; 32:499-519. [PMID: 31117046] [DOI: 10.1163/22134808-20191353]
Abstract
Deaf individuals may compensate for the lack of auditory input by showing enhanced capacities in certain visual tasks. Here we assessed whether this also applies to the recognition of emotions expressed by bodily and facial cues. In Experiment 1, we compared deaf participants and hearing controls in a task measuring recognition of the six basic emotions expressed by actors in a series of video clips in which either the face, the body, or both the face and body were visible. In Experiment 2, we measured the weight of body and face cues in conveying emotional information when intense genuine emotions are expressed, a situation in which facial expressions alone may have ambiguous valence. We found that deaf individuals were better at identifying disgust and fear from body cues (Experiment 1) and at integrating face and body cues in the case of intense genuine negative emotions (Experiment 2). Our findings support the capacity of deaf individuals to compensate for the lack of auditory input by enhancing perceptual and attentional capacities in the spared modalities, and show that this capacity extends to the affective domain.
Affiliation(s)
- Chiara Ferrari
- Department of Psychology, University of Milano-Bicocca, Milan 20126, Italy
- Costanza Papagno
- Department of Psychology, University of Milano-Bicocca, Milan 20126, Italy; CeRiN and CIMeC, University of Trento, Rovereto 38068, Italy
- Zaira Cattaneo
- Department of Psychology, University of Milano-Bicocca, Milan 20126, Italy; IRCCS Mondino Foundation, Pavia, Italy

136
Sliwa J, Takahashi D, Shepherd S. Mécanismes neuronaux pour la communication chez les primates [Neural mechanisms for communication in primates]. Revue de Primatologie 2018. [DOI: 10.4000/primatologie.2950]
137
Christensen JF, Lambrechts A, Tsakiris M. The Warburg Dance Movement Library - The WADAMO Library: A Validation Study. Perception 2018; 48:26-57. [PMID: 30558474] [DOI: 10.1177/0301006618816631]
Abstract
The Warburg Dance Movement Library is a validated set of 234 video clips of dance movements for empirical research in the cognitive science and neuroscience of action perception, affect perception, and neuroaesthetics. The library contains pairs of video clips of dance movement sequences. Of each pair, one version of the movement sequence is emotionally expressive (Clip a), while the other version of the same sequence (Clip b) is not expressive but is as technically correct as the expressive version. We sought to complement previous dance video stimulus libraries. Facial information, colour, and music have been removed, and each clip has been faded in and out. We equalised stimulus length (6 seconds; 8 counts in dance theory), the dancers' clothing, and the video background; included both male and female dancers; and controlled for technical correctness of movement execution. The library contains both contemporary and ballet movements. Two online surveys (N = 160) confirmed the classification into the two categories of expressivity. Four additional online surveys (N = 80) provided beauty and liking ratings for each clip. A correlation matrix illustrates all variables of this norming study (technical correctness, expressivity, beauty, liking, luminance, motion energy).
Affiliation(s)
- Anna Lambrechts
- Autism Research Group, Department of Psychology, City, University of London, UK
- Manos Tsakiris
- The Warburg Institute, University of London, UK; Department of Psychology, Royal Holloway, University of London, UK

138
Hortensius R, Hekele F, Cross ES. The Perception of Emotion in Artificial Agents. IEEE Trans Cogn Dev Syst 2018. [DOI: 10.1109/tcds.2018.2826921]
139
Bracci S, Caramazza A, Peelen MV. View-invariant representation of hand postures in the human lateral occipitotemporal cortex. Neuroimage 2018; 181:446-452. [DOI: 10.1016/j.neuroimage.2018.07.001]
140
Lee JM, Baek J, Ju DY. Anthropomorphic Design: Emotional Perception for Deformable Object. Front Psychol 2018; 9:1829. [PMID: 30333773] [PMCID: PMC6175972] [DOI: 10.3389/fpsyg.2018.01829]
Abstract
Despite the increasing number of studies on user experience (UX) and user interfaces (UI), few studies have examined emotional interaction between humans and deformable objects. In the current study, we investigated how the anthropomorphic design of a flexible display interacts with emotion. For 101 unique 3D images in which an object was bent at different axes, 281 participants were asked to report in an online survey how strongly the object evoked five elemental emotions (happiness, disgust, anger, fear, and sadness). People rated the object's shape using three emotional categories: happiness, disgust-anger, and sadness-fear. We also found that a combination of axis of bending (horizontal or diagonal) and convexity (bending convexly or concavely) predicted emotional valence, underpinning the anthropomorphic design of flexible displays. Our findings provide empirical evidence that axis of bending and convexity can be important antecedents of emotional interaction with flexible objects, triggering at least three types of emotion in users.
Affiliation(s)
- Jung Min Lee
- Technology and Design Research Center, Yonsei Institute of Convergence Technology, Yonsei University, Incheon, South Korea
- Jongsoo Baek
- Yonsei Institute of Convergence Technology, Yonsei University, Incheon, South Korea
- Da Young Ju
- Technology and Design Research Center, Yonsei Institute of Convergence Technology, Yonsei University, Incheon, South Korea

141
Bläsing BE, Sauzet O. My Action, My Self: Recognition of Self-Created but Visually Unfamiliar Dance-Like Actions From Point-Light Displays. Front Psychol 2018; 9:1909. [PMID: 30459668] [PMCID: PMC6232674] [DOI: 10.3389/fpsyg.2018.01909]
Abstract
Previous research has shown that motor experience of an action can facilitate the visual recognition of that action, even in the absence of visual experience. We conducted an experiment in which participants were presented with point-light displays of dance-like actions that had been recorded with the same group of participants during a previous session. The stimuli had been produced such that each participant experienced a subset of phrases only as an observer, learnt two phrases from observation, and created one phrase while blindfolded. The clips presented in the recognition task showed movements that were either unfamiliar, only visually familiar, familiar from observational learning and execution, or self-created while blindfolded (and hence not visually familiar). Participants assigned all types of movements correctly to the respective categories, showing that all three ways of experiencing the movement (observed, learnt through observation and practice, and created blindfolded) resulted in an encoding adequate for recognition. Observed movements showed the lowest level of recognition accuracy, whereas the accuracy of assigning blindfolded self-created movements was on the same level as for unfamiliar and learnt movements. Self-recognition was modulated by action recognition: participants were more likely to identify themselves as the actor in clips they had assigned to the category "created" than in clips they had assigned to the category "learnt", supporting the idea of an influence of agency on self-recognition.
Affiliation(s)
- Bettina E. Bläsing
- Neurocognition and Action - Biomechanics Research Group, Faculty of Psychology and Sport Science, Bielefeld University, Bielefeld, Germany; Center of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Odile Sauzet
- Bielefeld School of Public Health/AG 3 Epidemiology & International Public Health, Bielefeld University, Bielefeld, Germany; StatBeCe, Center for Statistics, Bielefeld University, Bielefeld, Germany

142
Carreno-Medrano P, Gibet S, Marteau PF. Perceptual Validation for the Generation of Expressive Movements from End-Effector Trajectories. ACM Trans Interact Intell Syst 2018. [DOI: 10.1145/3150976]
Abstract
Endowing animated virtual characters with emotionally expressive behaviors is paramount to improving the quality of the interactions between humans and virtual characters. Full-body motion, in particular, with its subtle kinematic variations, represents an effective way of conveying emotionally expressive content. However, before synthesizing expressive full-body movements, it is necessary to identify and understand what qualities of human motion are salient to the perception of emotions and how these qualities can be exploited to generate novel and equally expressive full-body movements. Based on previous studies, we argue that it is possible to perceive and generate expressive full-body movements from a limited set of joint trajectories, including end-effector trajectories and additional constraints such as pelvis and elbow trajectories. Hence, these selected trajectories define a significant and reduced motion space, which is adequate for the characterization of the expressive qualities of human motion and suitable for both the analysis and the generation of emotionally expressive full-body movements. The purpose and main contribution of this work is the methodological framework we defined and used to assess the validity and applicability of the selected trajectories for the perception and generation of expressive full-body movements. This framework consists of the creation of a motion capture database of expressive theatrical movements, the development of a motion synthesis system based on re-played or re-sampled trajectories and inverse kinematics, and two perceptual studies.
143
Bachmann J, Munzert J, Krüger B. Neural Underpinnings of the Perception of Emotional States Derived From Biological Human Motion: A Review of Neuroimaging Research. Front Psychol 2018; 9:1763. [PMID: 30298036] [PMCID: PMC6160569] [DOI: 10.3389/fpsyg.2018.01763]
Abstract
Research on the perception of biological human motion shows that people are able to infer emotional states by observing body movements. This article reviews the methodology applied in fMRI research on the neural representation of such emotion perception. Specifically, we ask how different stimulus qualities of bodily expressions, individual emotional valence, and task instructions may affect the neural representation of an emotional scene. The review demonstrates the involvement of a variety of brain areas, thereby indicating how well the human brain is adjusted to navigate in multiple social situations. All stimulus categories (i.e., full-light body displays, point-light displays, and avatars) can induce an emotional percept and are associated with increased activation in an extensive neural network. This network seems to be organized around areas belonging to the so-called action observation network (PMC, IFG, and IPL) and the mentalizing network (TPJ, TP, dmPFC, and lOFC) as well as areas processing body form and motion (e.g., EBA, FBA, and pSTS). Furthermore, emotion-processing brain sites such as the amygdala and the hypothalamus seem to play an important role during the observation of emotional body expressions. Whereas most brain regions clearly display an increased response to emotional body movements in general, some structures respond selectively to negative valence. Moreover, neural activation seems to depend on task characteristics, indicating that certain structures are activated even when attention is shifted away from emotional body movements.
Affiliation(s)
- Julia Bachmann
- Neuromotor Behavior Laboratory, Department of Psychology and Sport Science, Justus-Liebig-University Giessen, Giessen, Germany
- Jörn Munzert
- Neuromotor Behavior Laboratory, Department of Psychology and Sport Science, Justus-Liebig-University Giessen, Giessen, Germany
- Britta Krüger
- Neuromotor Behavior Laboratory, Department of Psychology and Sport Science, Justus-Liebig-University Giessen, Giessen, Germany

144
Social Cognition through the Lens of Cognitive and Clinical Neuroscience. Biomed Res Int 2018; 2018:4283427. [PMID: 30302338] [PMCID: PMC6158937] [DOI: 10.1155/2018/4283427]
Abstract
Social cognition refers to a set of processes, ranging from perception to decision-making, underlying the ability to decode others' intentions and behaviors and to plan actions fitting with social and moral, besides individual and economic, considerations. Its centrality in everyday life reflects the neural complexity of social processing and the ubiquity of social cognitive deficits in different pathological conditions. Social cognitive processes can be clustered in three domains associated with (a) perceptual processing of social information such as faces and emotional expressions (social perception), (b) grasping others' cognitive or affective states (social understanding), and (c) planning behaviors taking into consideration others', in addition to one's own, goals (social decision-making). We review these domains through the lens of cognitive neuroscience, i.e., in terms of the brain areas mediating the role of such processes in the ability to make sense of others' behavior and plan socially appropriate actions. The increasing evidence on the "social brain" obtained from healthy young individuals nowadays constitutes the baseline for detecting changes in social cognitive skills associated with physiological aging or pathological conditions. In the latter case, impairments in one or more of the abovementioned domains represent a prominent concern, or even a core facet, of neurological (e.g., acquired brain injury or neurodegenerative diseases), psychiatric (e.g., schizophrenia), and developmental (e.g., autism) disorders. To pave the way for the other papers of this issue, addressing the social cognitive deficits associated with severe acquired brain injury, we briefly discuss the available evidence on the status of social cognition in normal aging and its breakdown in neurodegenerative disorders. Although the assessment and treatment of such impairments is a relatively novel sector in neurorehabilitation, the evidence summarized here strongly suggests that the development of remediation procedures for social cognitive skills will represent a future field of translational research in clinical neuroscience.
145
Abstract
The study of biological point-light displays (PLDs) has fascinated researchers for more than 40 years. However, the mechanisms underlying PLD perception remain unclear, partly due to difficulties with precisely controlling and transforming PLD sequences. Furthermore, little agreement exists regarding how transformations are performed. This article introduces a new free-access program called PLAViMoP (Point-Light Display Visualization and Modification Platform) and presents the algorithms for PLD transformations currently included in the software. PLAViMoP fulfills two objectives. First, it standardizes and makes clear many classical spatial and kinematic transformations described in the PLD literature. Second, given its optimized interface, PLAViMoP makes these transformations easy and fast to achieve. Overall, PLAViMoP could directly help scientists avoid technical difficulties and make possible the use of PLDs for nonacademic applications.
146
Mousas C, Anastasiou D, Spantidi O. The effects of appearance and motion of virtual characters on emotional reactivity. Comput Human Behav 2018. [DOI: 10.1016/j.chb.2018.04.036]
147
Jimenez AM, Lee J, Reavis EA, Wynn JK, Green MF. Aberrant patterns of neural activity when perceiving emotion from biological motion in schizophrenia. Neuroimage Clin 2018; 20:380-387. [PMID: 30128276] [PMCID: PMC6095949] [DOI: 10.1016/j.nicl.2018.08.014]
Abstract
Social perceptual deficits in schizophrenia are well established. Recent work suggests that the ability to extract social information from bodily cues is reduced in patients. However, little is known about the neurobiological mechanisms underlying this deficit. In the current study, 20 schizophrenia patients and 16 controls completed two tasks using point-light animations during fMRI: a basic biological motion task and an emotion in biological motion task. The basic biological motion task was used to localize activity in posterior superior temporal sulcus (pSTS), a critical region for biological motion perception. During the emotion in biological motion task, participants viewed brief videos depicting happiness, fear, anger, or neutral emotions and were asked to decide which emotion was portrayed. Activity in pSTS and amygdala was interrogated during this task. Results indicated that patients showed overall reduced activation compared to controls in pSTS and at a trend level in amygdala across emotions, despite similar task performance. Further, a functional connectivity analysis revealed that controls, but not patients, showed significant positive connectivity between pSTS and left frontal regions as well as bilateral angular gyrus during the emotion in biological motion task. These findings indicate that schizophrenia patients show aberrant neural activity and functional connectivity when extracting complex social information from simple motion stimuli, which may contribute to social perception deficits in this disorder.
Affiliation(s)
- Amy M Jimenez
- Desert Pacific MIRECC, VA Greater Los Angeles Healthcare System, 11301 Wilshire Blvd., Los Angeles, CA 90073, USA; Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, 405 Hilgard Ave., Los Angeles, CA 90095, USA
- Junghee Lee
- Desert Pacific MIRECC, VA Greater Los Angeles Healthcare System, 11301 Wilshire Blvd., Los Angeles, CA 90073, USA; Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, 405 Hilgard Ave., Los Angeles, CA 90095, USA
- Eric A Reavis
- Desert Pacific MIRECC, VA Greater Los Angeles Healthcare System, 11301 Wilshire Blvd., Los Angeles, CA 90073, USA; Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, 405 Hilgard Ave., Los Angeles, CA 90095, USA
- Jonathan K Wynn
- Desert Pacific MIRECC, VA Greater Los Angeles Healthcare System, 11301 Wilshire Blvd., Los Angeles, CA 90073, USA; Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, 405 Hilgard Ave., Los Angeles, CA 90095, USA
- Michael F Green
- Desert Pacific MIRECC, VA Greater Los Angeles Healthcare System, 11301 Wilshire Blvd., Los Angeles, CA 90073, USA; Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, 405 Hilgard Ave., Los Angeles, CA 90095, USA

148
Kawai Y, Nagai Y, Asada M. Prediction Error in the PMd As a Criterion for Biological Motion Discrimination: A Computational Account. IEEE Trans Cogn Dev Syst 2018. [DOI: 10.1109/tcds.2017.2668446]
149
Mortillaro M, Dukes D. Jumping for Joy: The Importance of the Body and of Dynamics in the Expression and Recognition of Positive Emotions. Front Psychol 2018; 9:763. [PMID: 29867704] [PMCID: PMC5962906] [DOI: 10.3389/fpsyg.2018.00763]
Abstract
The majority of research on emotion expression has focused on static facial prototypes of a few selected, mostly negative emotions. Implicitly, most researchers seem to have considered all positive emotions as sharing one common signal (namely, the smile), and consequently as being largely indistinguishable from each other in terms of expression. Recently, a new wave of studies has started to challenge the traditional assumption by considering the role of multiple modalities and the dynamics in the expression and recognition of positive emotions. Based on these recent studies, we suggest that positive emotions are better expressed and correctly perceived when (a) they are communicated simultaneously through the face and body and (b) perceivers have access to dynamic stimuli. Notably, we argue that this improvement is comparatively more important for positive emotions than for negative emotions. Our view is that the misperception of positive emotions has fewer immediate and potentially life-threatening consequences than the misperception of negative emotions; therefore, from an evolutionary perspective, there was only limited benefit in the development of clear, quick signals that allow observers to draw fine distinctions between them. Consequently, we suggest that the successful communication of positive emotions requires a stronger signal than that of negative emotions, and that this signal is provided by the use of the body and the way those movements unfold. We hope our contribution to this growing field provides a new direction and a theoretical grounding for the many lines of empirical research on the expression and recognition of positive emotions.
Affiliation(s)
- Marcello Mortillaro
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland
- Daniel Dukes
- Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland; Psychology Research Institute, University of Amsterdam, Amsterdam, Netherlands

150
Witkower Z, Tracy JL. Bodily Communication of Emotion: Evidence for Extrafacial Behavioral Expressions and Available Coding Systems. Emotion Review 2018. [DOI: 10.1177/1754073917749880]
Abstract
Although scientists dating back to Darwin have noted the importance of the body in communicating emotion, current research on emotion communication tends to emphasize the face. In this article we review the evidence for bodily expressions of emotions—that is, the handful of emotions that are displayed and recognized from certain bodily behaviors (i.e., pride, joy, sadness, shame, embarrassment, anger, fear, and disgust). We also review the previously developed coding systems available for identifying emotions from bodily behaviors. Although no extant coding system provides an exhaustive list of bodily behaviors known to communicate a panoply of emotions, our review provides the foundation for developing such a system.
Affiliation(s)
- Zachary Witkower
- Department of Psychology, University of British Columbia, Canada
- Jessica L. Tracy
- Department of Psychology, University of British Columbia, Canada