1
Kramer M, Hirsch D, Sacic A, Sader A, Willms J, Juckel G, Mavrogiorgou P. AI-enhanced analysis of naturalistic social interactions characterizes interaffective impairments in schizophrenia. J Psychiatr Res 2024; 178:210-218. [PMID: 39153454] [DOI: 10.1016/j.jpsychires.2024.08.013]
Abstract
Social deficits in schizophrenia have been attributed to an impaired attunement to mutual interaction, or "interaffectivity". While impairments in emotion recognition and facial expressivity in schizophrenia have been consistently reported, findings on mimicry and social synchrony are inconsistent, and previous studies have often lacked ecological validity. To investigate interaffective behavior in dyadic interactions in a real-world-like setting, 20 individuals with schizophrenia and 20 without mental disorder played a cooperative board game with a previously unacquainted healthy control participant. Facial expression analysis was conducted using Affectiva Emotion AI in iMotions 9.3. The contingency and state space distribution of emotional facial expressions were assessed using Mangold INTERACT. Psychotic symptoms, subjective stress, affectivity and game experience were evaluated through questionnaires. Due to a considerable between-group age difference, age-adjusted ANCOVA was performed. Overall, despite an unchanged subjective experience of the social interaction, individuals with schizophrenia exhibited reduced responsiveness to positive affective stimuli. Subjective game experience did not differ between groups. Descriptively, facial expressions in schizophrenia were generally more negative, with increased sadness and decreased joy. Facial mimicry of joyful expressions was specifically impaired in schizophrenia, and this impairment correlated with blunted affect as measured by the SANS. Dyadic interactions involving persons with schizophrenia were less attracted toward mutual joyful affective states. Only in analyses unadjusted for age did individuals with schizophrenia show more angry and sad expressions in the absence of emotional stimuli from their interaction partner. These impairments in interaffective processes may contribute to social dysfunction in schizophrenia and provide new avenues for future research.
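For readers who want to reproduce this kind of group comparison, a minimal sketch of an age-adjusted ANCOVA is shown below; the data are synthetic and the column names (group, age, joy_mimicry) are hypothetical placeholders, not variables taken from the study.

```python
# Hedged sketch of an age-adjusted ANCOVA (group effect on a facial-expression
# outcome, controlling for age). The data are synthetic; column names are
# hypothetical and do not come from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 40                                             # toy sample: 20 per group
df = pd.DataFrame({
    "group": ["schizophrenia"] * 20 + ["control"] * 20,
    "age": rng.integers(20, 60, size=n),
    "joy_mimicry": rng.normal(size=n),             # hypothetical outcome score
})

model = smf.ols("joy_mimicry ~ C(group) + age", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))             # Type-II table: group effect adjusted for age
```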
Affiliation(s)
- Marco Kramer: Dept. of Psychiatry, LWL University Hospital, Ruhr University Bochum, Germany
- Dustin Hirsch: Dept. of Psychiatry, LWL University Hospital, Ruhr University Bochum, Germany
- Anesa Sacic: Dept. of Psychiatry, LWL University Hospital, Ruhr University Bochum, Germany
- Alice Sader: Dept. of Psychiatry, LWL University Hospital, Ruhr University Bochum, Germany
- Julien Willms: Dept. of Psychiatry, LWL University Hospital, Ruhr University Bochum, Germany
- Georg Juckel: Dept. of Psychiatry, LWL University Hospital, Ruhr University Bochum, Germany
2
Gury P, Moulin M, Laroye R, Trachino M, Montazel M, Narme P, Ehrlé N. Happy facial emotional congruence in patients with relapsing-remitting multiple sclerosis. J Clin Exp Neuropsychol 2024; 46:644-654. [PMID: 39140395] [DOI: 10.1080/13803395.2024.2391362]
Abstract
BACKGROUND Emotion categorization has often been studied in the relapsing-remitting form of multiple sclerosis (RR-MS), suggesting an impairment in the recognition of emotions. The production of facial emotional expressions in RR-MS has not been considered, despite its importance in non-verbal communication. METHOD Twenty-five RR-MS patients and twenty-five matched controls completed an emotional categorization task during which their faces were filmed. The stimuli were dynamic (sound or visual), expressed by adults (women or men), and conveyed either happy (laughing or smiling) or negative emotion. Two independent blinded raters quantified the happy facial expressions produced. The categorization task served as a measure of emotion recognition, while the happy facial expressions produced served as a measure of emotion production. RESULTS The main analysis indicated selectively impaired categorization of happy stimuli in RR-MS patients, whereas their happy facial expressions were not statistically different from those of the control group. More specifically, this group effect was found for smiles (and not laughter) and for happy stimuli produced by men. Analysis of individual patient profiles suggested that 77% of patients with impaired judgments produced normal facial expressions, suggesting a high prevalence of this dissociation. Only 8% of our sample showed the reverse dissociation, with happy facial expressions significantly different from those of the control group and normal emotional judgments. CONCLUSION These results corroborated the high prevalence of emotional categorization impairment in RR-MS, although not for negative stimuli, which can probably be explained by the methodological specificities of the present work. The unusual impairment found for happy stimuli (for both emotional categorization and facial congruence) may be linked to the intensity of the perceived happy expressions rather than to their emotional valence. Our results also indicated a largely preserved production of facial emotions, which may be used in the future sociocognitive care of RR-MS patients with impaired emotional judgments.
Affiliation(s)
- Pauline Gury: Neurology Department, Maison-Blanche Hospital, Reims, France; Laboratoire Mémoire Cerveau et Cognition (UR 7536), Université Paris Cité, Boulogne-Billancourt, France
- Marine Trachino: Neurology Department, Maison-Blanche Hospital, Reims, France
- Marine Montazel: Neurology Department, Maison-Blanche Hospital, Reims, France
- Pauline Narme: Laboratoire Mémoire Cerveau et Cognition (UR 7536), Université Paris Cité, Boulogne-Billancourt, France
- Nathalie Ehrlé: Neurology Department, Maison-Blanche Hospital, Reims, France; Laboratoire Mémoire Cerveau et Cognition (UR 7536), Université Paris Cité, Boulogne-Billancourt, France
3
Olszanowski M, Tołopiło A. "Anger? No, thank you. I don't mimic it": how contextual modulation of facial display meaning impacts emotional mimicry. Cogn Emot 2024; 38:530-548. [PMID: 38303660] [DOI: 10.1080/02699931.2024.2310759]
Abstract
Research indicates that emotional mimicry predominantly occurs in response to affiliative displays, such as happiness, while the mimicry of antagonistic displays, like anger, is seldom observed in social contexts. However, contextual factors, including the identity of the displayer (e.g. social similarity with the observer) and whose action triggered the emotional reaction (i.e. to whom the display is directed), can modulate the meaning of the display. In two experiments, participants observed happiness, sadness, and anger expressed by individuals with similar or different social attitudes in response to actions from either the participant or another person. Results demonstrated that the three manipulated factors - the displayer's social similarity, whose action caused the emotional display, and the type of emotional display - affected participants' perception of the display. In turn, mimicry was predominantly observed in response to happiness (Experiments 1 and 2), to a lesser extent to sadness (Experiment 1), and not to anger. Furthermore, participants mimicked individuals who were more socially similar (Experiment 1), whereas whose action had caused the emotional reaction did not influence mimicry. The findings suggest that when the context mitigates the meaning of negative or antagonistic facial displays, it does not necessarily increase the inclination to mimic them.
Affiliation(s)
- Michal Olszanowski: Center for Research on Biological Basis of Social Behavior, SWPS University in Warsaw, Warsaw, Poland
- Aleksandra Tołopiło: Center for Research on Biological Basis of Social Behavior, SWPS University in Warsaw, Warsaw, Poland
4
Efthimiou TN, Hernandez MP, Elsenaar A, Mehu M, Korb S. Application of facial neuromuscular electrical stimulation (fNMES) in psychophysiological research: Practical recommendations based on a systematic review of the literature. Behav Res Methods 2024; 56:2941-2976. [PMID: 37864116] [PMCID: PMC11133044] [DOI: 10.3758/s13428-023-02262-7]
Abstract
Facial neuromuscular electrical stimulation (fNMES), which allows for the non-invasive and physiologically sound activation of facial muscles, has great potential for investigating fundamental questions in psychology and neuroscience, such as the role of proprioceptive facial feedback in emotion induction and emotion recognition, and may also serve clinical applications, such as alleviating symptoms of depression. However, despite illustrious origins in the 19th-century work of Duchenne de Boulogne, the practical application of fNMES remains largely unknown to today's researchers in psychology. In addition, published studies vary dramatically in the stimulation parameters used, such as stimulation frequency, amplitude, duration, and electrode size, and in the way these parameters are reported. Because fNMES parameters impact the comfort and safety of volunteers, as well as its physiological (and psychological) effects, it is of paramount importance to establish recommendations of good practice and to ensure studies can be better compared and integrated. Here, we provide an introduction to fNMES, systematically review the existing literature with a focus on the stimulation parameters used, and offer recommendations on how to safely and reliably deliver fNMES and on how to report the parameters to allow better cross-study comparison. In addition, we provide a free webpage to easily visualise fNMES parameters and verify their safety based on current density. As an example of a potential application, we focus on the use of fNMES for the investigation of the facial feedback hypothesis.
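As a worked illustration of the safety check described above, the sketch below computes the average current density under an electrode from a planned amplitude and electrode size; the 2 mA/cm^2 ceiling is an assumed example value for illustration, not a recommendation taken from the review.

```python
# Hedged sketch: verify that a planned fNMES setting stays below a chosen
# current-density limit. The threshold and the example values are assumptions
# for illustration, not parameters taken from the review.
def current_density_ma_per_cm2(current_ma: float, electrode_area_cm2: float) -> float:
    """Average current density under one electrode (mA/cm^2)."""
    return current_ma / electrode_area_cm2

amplitude_ma = 5.0          # hypothetical stimulation amplitude
electrode_area = 2.0 * 2.0  # hypothetical 2 cm x 2 cm electrode

density = current_density_ma_per_cm2(amplitude_ma, electrode_area)
print(f"Current density: {density:.2f} mA/cm^2")
if density > 2.0:           # assumed safety ceiling for this sketch
    print("Above the assumed limit - reduce amplitude or use larger electrodes.")
```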
Affiliation(s)
- Arthur Elsenaar: ArtScience Interfaculty, Royal Academy of Art, Royal Conservatory, The Hague, Netherlands
- Marc Mehu: Department of Psychology, Webster Vienna Private University, Vienna, Austria
- Sebastian Korb: Department of Psychology, University of Essex, Colchester, UK; Department of Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna, Austria
5
Westermann JF, Schäfer R, Nordmann M, Richter P, Müller T, Franz M. Measuring facial mimicry: Affdex vs. EMG. PLoS One 2024; 19:e0290569. [PMID: 38165847] [PMCID: PMC10760767] [DOI: 10.1371/journal.pone.0290569]
Abstract
Facial mimicry is the automatic imitation of the facial affect expressions of others. It serves as an important component of interpersonal communication and affective co-experience. Facial mimicry has so far been measured with electromyography (EMG), which requires a complex measuring apparatus. Recently, software for measuring facial expressions has become available, but it is still unclear how well it is suited to measuring facial mimicry. This study investigates the comparability of the automated facial coding software Affdex with EMG for measuring facial mimicry. For this purpose, facial mimicry was induced in 33 subjects by presenting naturalistic affect-expressive video sequences (anger, joy). The responses of the subjects were measured simultaneously by facial EMG (corrugator supercilii and zygomaticus major muscles) and by Affdex (action units lip corner puller and brow lowerer, and the affects joy and anger). Subsequently, the correlations between the measurement results of EMG and Affdex were calculated. After presentation of the joy stimulus, there was an increase in zygomaticus muscle activity (EMG) about 400 ms after stimulus onset and an increase in joy and lip corner puller activity (Affdex) about 1200 ms after stimulus onset. The joy and lip corner puller activity detected by Affdex correlated significantly with the EMG activity. After presentation of the anger stimulus, corrugator muscle activity (EMG) also increased approximately 400 ms after stimulus onset, whereas anger and brow lowerer activity (Affdex) showed no response. During the entire measurement interval, anger activity and brow lowerer activity (Affdex) did not correlate with corrugator muscle activity (EMG). Using Affdex, the facial mimicry response to a joy stimulus can be measured, but it is detected approximately 800 ms later than with EMG. Thus, electromyography remains the tool of choice for studying subtle mimic processes like facial mimicry.
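A minimal sketch of how such a latency difference can be quantified is shown below: it estimates the lag between an EMG envelope and an Affdex action-unit trace by cross-correlation. The signals, sampling rate, and variable names are synthetic assumptions for illustration, not the study's actual processing pipeline.

```python
# Hedged sketch: estimate the latency between an EMG envelope and an Affdex
# action-unit trace by cross-correlation. Signals and the 30 Hz rate are toy
# assumptions; a positive lag means the Affdex trace trails the EMG envelope.
import numpy as np

def estimate_lag_seconds(emg_env: np.ndarray, affdex_au: np.ndarray, fs: float) -> float:
    """Lag (s) of the Affdex trace relative to the EMG envelope (equal-length signals)."""
    emg = (emg_env - emg_env.mean()) / emg_env.std()
    au = (affdex_au - affdex_au.mean()) / affdex_au.std()
    xcorr = np.correlate(au, emg, mode="full")     # peak index marks the best alignment
    lags = np.arange(-len(emg) + 1, len(emg))
    return lags[np.argmax(xcorr)] / fs

# Example with synthetic data: the AU trace is the EMG envelope shifted by 0.8 s.
fs = 30.0
t = np.arange(0, 20, 1 / fs)
emg_env = np.exp(-((t - 5.0) ** 2))                # toy burst of zygomaticus activity
affdex_au = np.roll(emg_env, int(0.8 * fs))        # toy "lip corner puller" trace
print(f"Estimated lag: {estimate_lag_seconds(emg_env, affdex_au, fs):.2f} s")
```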
Affiliation(s)
- Jan-Frederik Westermann: Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany
- Ralf Schäfer: Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany
- Marc Nordmann: Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany
- Peter Richter: Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany
- Tobias Müller: Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany
- Matthias Franz: Medical Faculty, Clinical Institute for Psychosomatic Medicine and Psychotherapy, University Hospital of the Heinrich-Heine-University, Düsseldorf, Germany
6
Merrill J, Ackermann TI, Czepiel A. Effects of disliked music on psychophysiology. Sci Rep 2023; 13:20641. [PMID: 38001083] [PMCID: PMC10674009] [DOI: 10.1038/s41598-023-46963-7]
Abstract
While previous research has shown the positive effects of listening to one's favorite music, the negative effects of one's most disliked music have not gained much attention. In the current study, participants listened to three self-selected disliked musical pieces which evoked highly unpleasant feelings. As a contrast, three musical pieces were individually selected for each participant based on neutral liking ratings they had provided for other participants' disliked music. During music listening, real-time ratings of subjective (dis)pleasure and simultaneous recordings of peripheral measures were obtained. Results showed that compared to neutral music, listening to disliked music evoked physiological reactions reflecting higher arousal (heart rate, skin conductance response, body temperature), disgust (levator labii muscle), anger (corrugator supercilii muscle), and distress and grimacing (zygomaticus major muscle). The differences between conditions were most prominent during "very unpleasant" real-time ratings, showing peak responses to the disliked music. Hence, disliked music has a strenuous effect, as shown in strong physiological arousal responses and facial expression, reflecting the listener's attitude toward the music.
Affiliation(s)
- Julia Merrill: Max Planck Institute for Empirical Aesthetics, Grüneburgweg 14, 60322, Frankfurt am Main, Germany; Institute of Music, University of Kassel, Kassel, Germany
- Taren-Ida Ackermann: Max Planck Institute for Empirical Aesthetics, Grüneburgweg 14, 60322, Frankfurt am Main, Germany
- Anna Czepiel: Max Planck Institute for Empirical Aesthetics, Grüneburgweg 14, 60322, Frankfurt am Main, Germany
7
Guntinas-Lichius O, Trentzsch V, Mueller N, Heinrich M, Kuttenreich AM, Dobel C, Volk GF, Graßme R, Anders C. High-resolution surface electromyographic activities of facial muscles during the six basic emotional expressions in healthy adults: a prospective observational study. Sci Rep 2023; 13:19214. [PMID: 37932337] [PMCID: PMC10628297] [DOI: 10.1038/s41598-023-45779-9]
Abstract
High-resolution facial surface electromyography (HR-sEMG) is suited to discriminate between different facial movements. Whether HR-sEMG also allows discrimination among the six basic emotional facial expressions is unclear. Thirty-six healthy participants (53% female, 18-67 years) were included for four sessions. Electromyograms were recorded from both sides of the face using a muscle-position-oriented electrode application (Fridlund scheme) and a landmark-oriented, muscle-unrelated symmetrical electrode arrangement (Kuramoto scheme) simultaneously on the face. In each session, participants expressed the six basic emotions in response to standardized facial images expressing the corresponding emotions. This was repeated once on the same day. Both sessions were repeated two weeks later to assess repetition effects. HR-sEMG characteristics showed systematic regional distribution patterns of emotional muscle activation for both schemes with very low interindividual variability. Statistical discrimination between the different HR-sEMG patterns was good for both schemes for most but not all basic emotions (ranging from p > 0.05 to mostly p < 0.001) when using HR-sEMG of the entire face. When using information only from the lower face, the Kuramoto scheme allowed a more reliable discrimination of all six emotions (all p < 0.001). A landmark-oriented HR-sEMG recording thus allows specific discrimination of facial muscle activity patterns during basic emotional expressions.
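To make the idea of pattern discrimination concrete, the sketch below cross-validates a linear classifier on per-channel RMS features of multi-channel facial sEMG. This is an illustrative stand-in only: the array shapes, labels, and classifier choice are assumptions, and the study itself relied on statistical comparisons of activation patterns rather than this pipeline.

```python
# Hedged sketch: discriminate six emotion labels from multi-channel facial sEMG
# using per-channel RMS features and a linear classifier. All data are random
# stand-ins, so accuracy should hover around chance (~1/6).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 240, 16, 2000     # hypothetical recording layout
emg = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 6, size=n_trials)          # toy labels for the six basic emotions

rms = np.sqrt((emg ** 2).mean(axis=2))              # one RMS value per trial and channel
scores = cross_val_score(LinearDiscriminantAnalysis(), rms, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```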
Affiliation(s)
- Orlando Guntinas-Lichius: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Facial-Nerve-Center Jena, Jena University Hospital, Jena, Germany; Center for Rare Diseases, Jena University Hospital, Jena, Germany
- Vanessa Trentzsch: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Nadiya Mueller: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
- Martin Heinrich: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Facial-Nerve-Center Jena, Jena University Hospital, Jena, Germany; Center for Rare Diseases, Jena University Hospital, Jena, Germany
- Anna-Maria Kuttenreich: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Facial-Nerve-Center Jena, Jena University Hospital, Jena, Germany; Center for Rare Diseases, Jena University Hospital, Jena, Germany
- Christian Dobel: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Facial-Nerve-Center Jena, Jena University Hospital, Jena, Germany
- Gerd Fabian Volk: Department of Otorhinolaryngology, Jena University Hospital, Friedrich-Schiller-University Jena, Am Klinikum 1, 07747, Jena, Germany; Facial-Nerve-Center Jena, Jena University Hospital, Jena, Germany; Center for Rare Diseases, Jena University Hospital, Jena, Germany
- Roland Graßme: Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany; Department of Prevention, Biomechanics, German Social Accident Insurance Institution for the Foodstuffs and Catering Industry, Erfurt, Germany
- Christoph Anders: Division Motor Research, Pathophysiology and Biomechanics, Department of Trauma, Hand and Reconstructive Surgery, Jena University Hospital, Friedrich-Schiller-University Jena, Jena, Germany
8
Czepiel A, Fink LK, Seibert C, Scharinger M, Kotz SA. Aesthetic and physiological effects of naturalistic multimodal music listening. Cognition 2023; 239:105537. [PMID: 37487303] [DOI: 10.1016/j.cognition.2023.105537]
Abstract
Compared to audio-only (AO) conditions, audiovisual (AV) information can enhance the aesthetic experience of a music performance. However, such beneficial multimodal effects have yet to be studied in naturalistic music performance settings. Further, peripheral physiological correlates of aesthetic experiences are not well understood. Here, participants were invited to a concert hall for piano performances of Bach, Messiaen, and Beethoven, which were presented in two conditions: AV and AO. They rated their aesthetic experience (AE) after each piece (Experiments 1 and 2), while peripheral signals (cardiorespiratory measures, skin conductance, and facial muscle activity) were continuously measured (Experiment 2). Factor scores of AE were significantly higher in the AV condition in both experiments. The LF/HF ratio, a heart rhythm measure that represents activation of the sympathetic nervous system, was higher in the AO condition, suggesting increased arousal, likely caused by less predictable sound onsets in the AO condition. We present partial evidence that breathing was faster and facial muscle activity was higher in the AV condition, suggesting that observing a performer's movements likely enhances motor mimicry in these more voluntary peripheral measures. Further, zygomaticus ('smiling') muscle activity was a significant predictor of AE. Thus, we suggest that physiological measures are related to AE, but at different levels: the more involuntary measures (i.e., heart rhythms) may reflect more sensory aspects, while the more voluntary measures (i.e., muscular control of breathing and facial responses) may reflect the liking aspect of an AE. In summary, we replicate and extend previous findings that AV information enhances AE in a naturalistic music performance setting. We further show that a combination of self-report and peripheral measures benefits a meaningful assessment of AE in naturalistic music performance settings.
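Since the LF/HF ratio drives one of the key results above, a hedged sketch of its computation from RR intervals is given below; the band limits follow common HRV conventions (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz) and the input series is synthetic, so none of the numbers correspond to the study's data.

```python
# Hedged sketch: LF/HF ratio from RR intervals via a Welch spectrum of the
# interpolated tachogram. Band limits follow common HRV conventions; the
# RR series below is synthetic.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_ratio(rr_s: np.ndarray, fs_resample: float = 4.0) -> float:
    t = np.cumsum(rr_s)                              # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_even = interp1d(t, rr_s, kind="cubic")(grid)  # evenly sampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs_resample,
                   nperseg=min(256, len(rr_even)))
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum()         # low-frequency power
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum()         # high-frequency power
    return lf / hf

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(300)) + 0.01 * rng.standard_normal(300)
print(f"LF/HF: {lf_hf_ratio(rr):.2f}")
```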
Affiliation(s)
- Anna Czepiel: Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Department of Neuropsychology and Psychopharmacology, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
- Lauren K Fink: Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Max Planck-NYU Center for Language, Music, and Emotion, Frankfurt am Main, Germany
- Christoph Seibert: Institute for Music Informatics and Musicology, University of Music Karlsruhe, Karlsruhe, Germany
- Mathias Scharinger: Research Group Phonetics, Department of German Linguistics, University of Marburg, Marburg, Germany; Department of Language and Literature, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Sonja A Kotz: Department of Neuropsychology and Psychopharmacology, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands; Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
9
Mahadevan AS, Cornblath EJ, Lydon-Staley DM, Zhou D, Parkes L, Larsen B, Adebimpe A, Kahn AE, Gur RC, Gur RE, Satterthwaite TD, Wolf DH, Bassett DS. Alprazolam modulates persistence energy during emotion processing in first-degree relatives of individuals with schizophrenia: a network control study. Mol Psychiatry 2023; 28:3314-3323. [PMID: 37353585] [PMCID: PMC10618098] [DOI: 10.1038/s41380-023-02121-z]
Abstract
Schizophrenia is marked by deficits in facial affect processing associated with abnormalities in GABAergic circuitry, deficits also found in first-degree relatives. Facial affect processing involves a distributed network of brain regions including limbic regions like the amygdala and visual processing areas like the fusiform cortex. Pharmacological modulation of GABAergic circuitry using benzodiazepines like alprazolam can be useful for studying this facial affect processing network and associated GABAergic abnormalities in schizophrenia. Here, we use pharmacological modulation and computational modeling to study the contribution of GABAergic abnormalities toward emotion processing deficits in schizophrenia. Specifically, we apply principles from network control theory to model persistence energy - the control energy required to maintain brain activation states - during emotion identification and recall tasks, with and without administration of alprazolam, in a sample of first-degree relatives and healthy controls. Here, persistence energy quantifies the magnitude of theoretical external inputs during the task. We find that alprazolam increases persistence energy in relatives but not in controls during threatening face processing, suggesting a compensatory mechanism given the relative absence of behavioral abnormalities in this sample of unaffected relatives. Further, we demonstrate that regions in the fusiform and occipital cortices are important for facilitating state transitions during facial affect processing. Finally, we uncover spatial relationships between regional variation in differential control energy (alprazolam versus placebo) and both the serotonin and dopamine neurotransmitter systems, indicating that alprazolam may exert its effects by altering neuromodulatory systems. Together, these findings provide a new perspective on the distributed emotion processing network and the effect of GABAergic modulation on this network, in addition to identifying an association between schizophrenia risk and abnormal GABAergic effects on persistence energy during threat processing.
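The persistence-energy idea can be illustrated with a generic minimum-control-energy calculation from linear network control theory, sketched below for a toy system; the dynamics matrix, horizon, and the identification of persistence energy with the energy needed to return to the initial state are simplifying assumptions, not the authors' exact formulation.

```python
# Hedged sketch of a minimum-control-energy calculation in the spirit of network
# control theory. "Persistence energy" is approximated here as the energy needed
# to bring the system back to its initial activation state (x_T = x_0) under
# linear dynamics x' = Ax + Bu; all matrices are random stand-ins, not study data.
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

def min_control_energy(A, B, x0, xT, T=1.0):
    """Minimum energy to steer x0 -> xT in time T for x' = Ax + Bu."""
    gramian, _ = quad_vec(lambda t: expm(A * t) @ B @ B.T @ expm(A.T * t), 0.0, T)
    residual = xT - expm(A * T) @ x0            # part of xT not reached by free dynamics
    return float(residual @ np.linalg.solve(gramian, residual))

rng = np.random.default_rng(1)
n = 20                                          # toy parcellation size
A = rng.standard_normal((n, n)) / np.sqrt(n)
A = A - np.eye(n) * (np.max(np.real(np.linalg.eigvals(A))) + 1.0)   # stabilize dynamics
B = np.eye(n)                                   # control input at every region
x0 = rng.standard_normal(n)
print(f"Persistence energy (toy): {min_control_energy(A, B, x0, x0):.3f}")
```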
Affiliation(s)
- Arun S Mahadevan: Department of Bioengineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Eli J Cornblath: Department of Bioengineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- David M Lydon-Staley: Department of Bioengineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA; Annenberg School for Communication, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Dale Zhou: Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Linden Parkes: Department of Bioengineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Bart Larsen: Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Azeez Adebimpe: Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Ari E Kahn: Department of Bioengineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Ruben C Gur: Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Neurology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Radiology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Raquel E Gur: Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Neurology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Radiology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Theodore D Satterthwaite: Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Daniel H Wolf: Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA
- Dani S Bassett: Department of Bioengineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Neurology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Electrical & Systems Engineering, School of Engineering & Applied Science, University of Pennsylvania, Philadelphia, PA, 19104, USA; Department of Physics & Astronomy, College of Arts & Sciences, University of Pennsylvania, Philadelphia, PA, 19104, USA; Santa Fe Institute, 1399 Hyde Park Rd, Santa Fe, NM, 87501, USA; Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, 19104, USA
10
Silva Neto JAD, Afonso SLA, Souza WCD. A Utilização da Imitação Facial em Tarefas de Reconhecimento de Expressões Emocionais. Psicologia: Ciência e Profissão 2023. [DOI: 10.1590/1982-3703003249386]
Abstract
Facial mimicry is an involuntary behavior that facilitates the transmission of relevant non-verbal information in different social contexts. This study aimed to analyze the ability to recognize emotional expressions while the observer tenses their own face or imitates the target face. The hypothesis was that individuals who tense their own face would be less likely to respond correctly in emotion recognition tasks, whereas those who imitate the expression would be more likely to respond correctly in the same tasks. The sample consisted of 30 participants, divided into two experimental groups: the Imitation Group (GI) and the Noise Group (GR), each with 18 female and 12 male participants. The experiment consisted of presenting photos of actors facially expressing a basic emotion for 10 seconds. During this period, participants were instructed to observe or to intervene facially, imitating or tensing their own face (according to their assigned group, Imitation or Noise). After the 10 seconds of following the instruction (observe, imitate, or interfere), the participant had to indicate, from the options joy, sadness, disgust, anger, surprise, and fear, the emotion corresponding to the image. The results showed significant differences when comparing the tasks of tensing versus imitating the target face, suggesting that altering the observer's own face can influence performance on a facial emotion recognition task.
11
Kleiser R, Raffelsberger T, Trenkler J, Meckel S, Seitz R. What influence do face masks have on reading emotions in faces? Neuroimage: Reports 2022. [DOI: 10.1016/j.ynirp.2022.100141]
12
Davis JD, Coulson S, Blaison C, Hess U, Winkielman P. Mimicry of partially occluded emotional faces: do we mimic what we see or what we know? Cogn Emot 2022; 36:1555-1575. [PMID: 36300446] [DOI: 10.1080/02699931.2022.2135490]
Abstract
Facial electromyography (EMG) was used to investigate patterns of facial mimicry in response to partial facial expressions in two contexts that differ in how naturalistic and socially significant the faces are. Experiment 1 presented participants with either the upper or lower half of facial expressions and used a forced-choice emotion categorisation task. This task emphasises cognition at the expense of ecological and social validity. Experiment 2 presented whole heads in which expressions were partially occluded by clothing, and the emotion recognition task was more open-ended. This context has greater social validity. We found mimicry in both experiments; however, mimicry differed in which emotions were mimicked and in the extent to which it involved muscle sites that were not observed. In the more cognitive context, there was relatively more motor matching (i.e. mimicking only what was seen). In the more socially valid context, participants were less likely to mimic only what they saw - and instead mimicked what they knew. Additionally, participants mimicked anger in the cognitive context but not in the social context. These findings suggest that mimicry involves multiple mechanisms and that the more social the context, the more likely it is to reflect a mechanism of social regulation.
Affiliation(s)
- Joshua D Davis: Cognitive Science Department, University of California, San Diego, San Diego, USA; Social and Behavioral Sciences Department, Southwestern College, Chula Vista, CA, USA
- Seana Coulson: Cognitive Science Department, University of California, San Diego, San Diego, USA
- Ursula Hess: Psychology Department, Humboldt University, Berlin, Germany
- Piotr Winkielman: Psychology Department, University of California, San Diego, San Diego, USA; Psychology Department, SWPS University of Social Sciences and Humanities, Warsaw, Poland
13
Mauersberger H, Kastendieck T, Hess U. I looked at you, you looked at me, I smiled at you, you smiled at me—The impact of eye contact on emotional mimicry. Front Psychol 2022; 13:970954. [PMID: 36248540] [PMCID: PMC9556997] [DOI: 10.3389/fpsyg.2022.970954]
Abstract
Eye contact is an essential element of human interaction, and direct eye gaze has been shown to affect a range of attentional and cognitive processes. Specifically, direct eye contact evokes a positive affective reaction. As such, it has been proposed that obstructed eye contact reduces emotional mimicry (i.e., the imitation of our counterpart's emotions). So far, emotional mimicry research has used averted-gaze faces or unnaturally covered eyes (with black censor bars) to analyze the effect of eye contact on emotional mimicry. However, averted gaze can also signal disinterest/disengagement, and censor bars obscure eye-adjacent areas as well and hence impede emotion recognition. In the present study (N = 44), we used a more ecologically valid approach by showing photos of actors who expressed either happiness, sadness, anger, or disgust while wearing either mirroring sunglasses that obstruct eye contact or clear glasses. The glasses covered only the direct eye region but not the brows, nose ridge, and cheeks. Our results confirm that participants were equally accurate in recognizing the emotions of their counterparts in both conditions (sunglasses vs. glasses). Further, in line with our hypotheses, participants felt closer to the targets and mimicked affiliative emotions more intensely when their counterparts wore glasses instead of sunglasses. For antagonistic emotions, we found the opposite pattern: disgust mimicry, which was interpreted as an affective reaction rather than genuine mimicry, was found only in the sunglasses condition. It may be that obstructed eye contact increased the negative impression of disgusted facial expressions and hence the negative feelings disgust faces evoked. The present study provides further evidence for the notion that eye contact is an important prerequisite for emotional mimicry and hence for smooth and satisfying social interactions.
14
Investigating the Relationship between Facial Mimicry and Empathy. Behav Sci (Basel) 2022; 12:bs12080250. [PMID: 35892350] [PMCID: PMC9330546] [DOI: 10.3390/bs12080250]
Abstract
Facial expressions play a key role in interpersonal communication when it comes to negotiating our emotions and intentions, as well as interpreting those of others. Research has shown that we can connect to other people better when we exhibit signs of empathy and facial mimicry. However, the relationship between empathy and facial mimicry is still debated. Among the factors contributing to the difference in results across existing studies are the use of different instruments for measuring both empathy and facial mimicry, as well as the frequent neglect of differences across demographic groups. This study first looks at differences in empathetic ability across demographic groups based on gender, ethnicity, and age. Empathetic ability is measured with the Empathy Quotient, capturing a balanced representation of both emotional and cognitive empathy. Using statistical and machine learning methods, the study then investigates the correlation between empathetic ability and the facial mimicry of subjects in response to images portraying different emotions displayed on a computer screen. Unlike existing studies that measure facial mimicry using electromyography, this study employs a technology that detects facial expressions from video capture using deep learning. This choice was made in the context of increased online communication during and after the COVID-19 pandemic. The results of this study confirm the previously reported difference in empathetic ability between females and males. However, no significant difference in empathetic ability was found across age and ethnic groups. Furthermore, no strong correlation was found between empathy and facial reactions to faces portraying different emotions shown on a computer screen. Overall, the results of this study can be used to inform the design of online communication technologies and of tools for training empathy in team leaders, educators, and social and healthcare providers.
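A minimal sketch of the two analyses mentioned above (a gender comparison of Empathy Quotient scores and a correlation between EQ and a mimicry index) is given below; the data are synthetic and the nonparametric test choices are assumptions, not necessarily the statistics used in the paper.

```python
# Hedged sketch: gender comparison of Empathy Quotient (EQ) scores and a
# correlation between EQ and a facial-mimicry index. All values are synthetic.
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

rng = np.random.default_rng(0)
eq_female = rng.normal(48, 10, size=60)            # toy EQ scores
eq_male = rng.normal(42, 10, size=60)
mimicry_index = rng.normal(size=120)               # toy mimicry measure, one value per person

u_stat, p_gender = mannwhitneyu(eq_female, eq_male, alternative="two-sided")
print(f"EQ by gender: U = {u_stat:.1f}, p = {p_gender:.4f}")

rho, p_corr = spearmanr(np.concatenate([eq_female, eq_male]), mimicry_index)
print(f"EQ vs. facial mimicry: rho = {rho:.2f}, p = {p_corr:.3f}")
```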
15
Abstract
Social resemblance, like group membership or similar attitudes, increases the mimicry of the observed emotional facial display. In this study, we investigate whether facial self-resemblance (manipulated by computer morphing) modulates emotional mimicry in a similar manner. Participants watched dynamic expressions of faces that either did or did not resemble their own, while their facial muscle activity was measured using EMG. Additionally, after each presentation, respondents completed social evaluations of the faces they saw. The results show that self-resemblance evokes convergent facial reactions. More specifically, participants mimicked the happiness and, to a lesser extent, the anger of self-resembling faces. In turn, the happiness of non-resembling faces was less likely mimicked than in the case of self-resembling faces, while anger evoked a more divergent, smile-like response. Finally, we found that social evaluations were in general increased by happiness displays, but not influenced by resemblance. Overall, the study demonstrates an interesting and novel phenomenon, particularly that mimicry can be modified by relatively subtle cues of physical resemblance.
16
Zhang S, Liu X, Yang X, Shu Y, Liu N, Zhang D, Liu YJ. The Influence of Key Facial Features on Recognition of Emotion in Cartoon Faces. Front Psychol 2021; 12:687974. [PMID: 34447333] [PMCID: PMC8382696] [DOI: 10.3389/fpsyg.2021.687974]
Abstract
Cartoon faces are widely used in social media, animation production, and social robots because of their attractive ability to convey different emotional information. Despite their popular applications, the mechanisms of recognizing emotional expressions in cartoon faces are still unclear. Therefore, three experiments were conducted in this study to systematically explore a recognition process for emotional cartoon expressions (happy, sad, and neutral) and to examine the influence of key facial features (mouth, eyes, and eyebrows) on emotion recognition. Across the experiments, three presentation conditions were employed: (1) a full face; (2) individual feature only (with two other features concealed); and (3) one feature concealed with two other features presented. The cartoon face images used in this study were converted from a set of real faces acted by Chinese posers, and the observers were Chinese. The results show that happy cartoon expressions were recognized more accurately than neutral and sad expressions, which was consistent with the happiness recognition advantage revealed in real face studies. Compared with real facial expressions, sad cartoon expressions were perceived as sadder, and happy cartoon expressions were perceived as less happy, regardless of whether full-face or single facial features were viewed. For cartoon faces, the mouth was demonstrated to be a feature that is sufficient and necessary for the recognition of happiness, and the eyebrows were sufficient and necessary for the recognition of sadness. This study helps to clarify the perception mechanism underlying emotion recognition in cartoon faces and sheds some light on directions for future research on intelligent human-computer interactions.
Affiliation(s)
- Shu Zhang: Department of Computer Science and Technology, Tsinghua University, Beijing, China; Beijing National Research Center for Information Science and Technology, Beijing, China
- Xinge Liu: Department of Computer Science and Technology, Tsinghua University, Beijing, China; Beijing National Research Center for Information Science and Technology, Beijing, China
- Xuan Yang: Department of Computer Science and Technology, Tsinghua University, Beijing, China; Beijing National Research Center for Information Science and Technology, Beijing, China
- Yezhi Shu: Department of Computer Science and Technology, Tsinghua University, Beijing, China
- Niqi Liu: Department of Computer Science and Technology, Tsinghua University, Beijing, China
- Dan Zhang: Department of Psychology, Tsinghua University, Beijing, China; Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing, China
- Yong-Jin Liu: Department of Computer Science and Technology, Tsinghua University, Beijing, China; Beijing National Research Center for Information Science and Technology, Beijing, China; Key Laboratory of Pervasive Computing, Ministry of Education, Beijing, China
17
Kastendieck T, Mauersberger H, Blaison C, Ghalib J, Hess U. Laughing at funerals and frowning at weddings: Top-down influences of context-driven social judgments on emotional mimicry. Acta Psychol (Amst) 2021; 212:103195. [PMID: 33137612] [DOI: 10.1016/j.actpsy.2020.103195]
Abstract
This research aimed to assess top-down effects of social judgments on (facial) emotional mimicry. Based on the mimicry as social regulator model (Hess & Fischer, 2013) and the notion that people can use emotion expressions as cues to an expresser's traits (Hareli & Hess, 2010), we predicted that participants judge expressers who show affectively deviant expressions more negatively, feel less close to them and, thus, show reduced mimicry. Participants saw smiles and sad expressions embedded in either a wedding or funeral scene (or neutral control). In Study 1, affectively deviant expressions were rated as inappropriate and led to less self-reported interpersonal closeness to the expresser. In Study 2, both happiness and sadness mimicry were affected by the normativeness of the expression. However, the specific effect varied. Participants mimicked both deviant and normative happy expressions only when they felt close to the expresser. However, in the case of deviant expressions, closeness was lower. When participants did not feel close to the expresser, their expression was neutral, that is, they did not mimic. Sadness was only mimicked when appropriate to the context, that is, when deemed a legitimate response and a valid appeal for help, regardless of closeness. In this sense, facial mimicry of sadness expression can be considered an empathic reaction. In sum, the present research shows strong evidence for a top-down effect of social judgments on mimicry. It further suggests that this effect differed as a function of emotion expression and the meaning and social appeal conveyed by that expression.
18
Kuang B, Li X, Li X, Lin M, Liu S, Hu P. The effect of eye gaze direction on emotional mimicry: A multimodal study with electromyography and electroencephalography. Neuroimage 2020; 226:117604. [PMID: 33278584] [DOI: 10.1016/j.neuroimage.2020.117604]
Abstract
Emotional mimicry plays an important role in social interaction and is influenced by social context, especially eye gaze direction. However, the neural mechanism underlying the effect of eye gaze direction on emotional mimicry is unclear. Here, we explored how eye gaze direction influenced emotional mimicry with a combination of electromyography (EMG) and electroencephalography (EEG) techniques, which may provide a more comprehensive measure. To do this, we recorded facial EMG and scalp EEG signals simultaneously while participants observed emotional faces (happy vs. angry) with direct or averted gaze. Then, we split the EEG trials into two mimicry intensity categories (high mimicry intensity, HMI vs. low mimicry intensity, LMI) according to EMG activity. The ERP difference between HMI and LMI EEG trials revealed four ERP components (P50, P150, N200 and P300), and the effect of eye gaze direction on emotional mimicry was prominent on P300 at P7 and P8. Moreover, we also observed differences in the effect of eye gaze direction on mimicry of happy faces and angry faces, which were found on P300 at P7, as well as P150 at P7 and N200 at P7 and Pz. In short, the present study isolated the neural signals of emotional mimicry with a new multimodal method, and provided empirical neural evidence that eye gaze direction affected emotional mimicry.
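The trial-splitting step described above can be sketched as a simple median split on trial-wise EMG activity followed by an ERP difference wave; the array shapes, the median-split criterion, and the synthetic data below are assumptions for illustration only, not the study's actual preprocessing.

```python
# Hedged sketch: split EEG trials into high- vs. low-mimicry-intensity categories
# by a median split on trial-wise EMG activity, then form the ERP difference wave.
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_channels, n_times = 120, 32, 500     # toy epoched EEG layout
eeg = rng.standard_normal((n_trials, n_channels, n_times))
emg_intensity = rng.random(n_trials)             # e.g., mean zygomaticus activity per trial

high = emg_intensity >= np.median(emg_intensity) # HMI trials
erp_hmi = eeg[high].mean(axis=0)                 # average over high-intensity trials
erp_lmi = eeg[~high].mean(axis=0)                # average over low-intensity trials
difference_wave = erp_hmi - erp_lmi              # basis for component comparisons (e.g., P300)
print(difference_wave.shape)                     # (n_channels, n_times)
```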
Affiliation(s)
- Beibei Kuang: International Studies College, National University of Defense Technology, Nanjing, China; Department of Psychology, Renmin University of China, Room 1005, D Block, Huixian Building, 59 Zhongguancun St., Haidian Dist., Beijing, 100872, China
- Xueting Li: Department of Psychology, Renmin University of China, Room 1005, D Block, Huixian Building, 59 Zhongguancun St., Haidian Dist., Beijing, 100872, China
- Xintong Li: Department of Psychology, Renmin University of China, Room 1005, D Block, Huixian Building, 59 Zhongguancun St., Haidian Dist., Beijing, 100872, China
- Mingxiao Lin: Department of Psychology, Renmin University of China, Room 1005, D Block, Huixian Building, 59 Zhongguancun St., Haidian Dist., Beijing, 100872, China
- Shanrou Liu: Department of Psychology, Renmin University of China, Room 1005, D Block, Huixian Building, 59 Zhongguancun St., Haidian Dist., Beijing, 100872, China
- Ping Hu: Department of Psychology, Renmin University of China, Room 1005, D Block, Huixian Building, 59 Zhongguancun St., Haidian Dist., Beijing, 100872, China
19
Schneider JN, Brick TR, Dziobek I. Distance to the Neutral Face Predicts Arousal Ratings of Dynamic Facial Expressions in Individuals With and Without Autism Spectrum Disorder. Front Psychol 2020; 11:577494. [PMID: 33329224] [PMCID: PMC7729191] [DOI: 10.3389/fpsyg.2020.577494]
Abstract
Arousal is one of the dimensions of core affect and is frequently used to describe experienced or observed emotional states. While arousal ratings of facial expressions are collected in many studies, it is not well understood how arousal is displayed in or interpreted from facial expressions. In the context of socioemotional disorders such as Autism Spectrum Disorder, this poses the question of a differential use of facial information for arousal perception. In this study, we demonstrate how automated face-tracking tools can be used to extract predictors of arousal judgments. We find moderate to strong correlations among all measures of static information on one hand and all measures of dynamic information on the other. Based on these results, we tested two measures, average distance to the neutral face and average facial movement speed, within and between neurotypical individuals (N = 401) and individuals with autism (N = 19). Distance to the neutral face was predictive of arousal in both groups. Lower mean arousal ratings were found for the autistic group, but no difference between groups was found in the correlation between the measures and arousal ratings. Results were replicated in a group with high autistic traits. The findings suggest a qualitatively similar perception of arousal for individuals with and without autism. No correlations between valence ratings and any of the measures could be found, emphasizing the specificity of the tested measures. Distance and speed predictors share variability, and thus speed should not be discarded as a predictor of arousal ratings.
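The two face-tracking predictors named above can be sketched directly from a landmark time series, as below; the landmark array, frame rate, and the choice of the first frame as the neutral face are synthetic assumptions rather than the study's actual preprocessing.

```python
# Hedged sketch of the two predictors: average Euclidean distance of each frame's
# landmarks to a neutral-face frame, and average landmark movement speed.
import numpy as np

rng = np.random.default_rng(3)
n_frames, n_landmarks = 150, 68                  # toy 5-second clip at 30 fps, 68 landmarks
fps = 30.0
landmarks = rng.standard_normal((n_frames, n_landmarks, 2))
neutral = landmarks[0]                           # assume the first frame is the neutral face

# Mean distance to the neutral face, averaged over landmarks and frames.
dist_to_neutral = np.linalg.norm(landmarks - neutral, axis=2).mean()

# Mean movement speed: frame-to-frame landmark displacement scaled by the frame rate.
speed = np.linalg.norm(np.diff(landmarks, axis=0), axis=2).mean() * fps

print(f"Average distance to neutral: {dist_to_neutral:.2f}")
print(f"Average movement speed: {speed:.2f} units/s")
```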
Affiliation(s)
- Jan N. Schneider: Institut für Informatik und Computational Science, Universität Potsdam, Potsdam, Germany; Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Timothy R. Brick: Human Development and Family Studies and Institute for CyberScience, The Pennsylvania State University, State College, PA, United States
- Isabel Dziobek: Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany