1. Chen T, Helminen TM, Linnunsalo S, Hietanen JK. Autonomic and facial electromyographic responses to watching eyes. Iperception 2024; 15:20416695231226059. [PMID: 38268784] [PMCID: PMC10807318] [DOI: 10.1177/20416695231226059]
Abstract
We measured participants' psychophysiological responses and gaze behavior while they viewed a stimulus person's direct and averted gaze in three conditions manipulating the participants' experience of being watched. Skin conductance responses and heart rate deceleration responses were greater to direct than to averted gaze only in the condition in which participants had the experience of being watched by the other individual. In contrast, gaze direction had no effect on these responses when participants were led to believe that the other individual could not watch them, or when the stimulus person was presented in a pre-recorded video. Importantly, the eye-tracking measures showed no differences in participants' looking behavior between these presentation conditions. The facial electromyography results suggested that direct gaze elicited greater zygomatic and periocular responses than averted gaze did, independent of the presentation condition. We concluded that the autonomic responses indexing affective arousal and attention orienting to eye contact are driven by the experience of being watched. In contrast, the facial responses seem to reflect automatized affiliative responses that can be elicited even in conditions in which seeing another's direct gaze does not signal that the self is being watched.
Affiliation(s)
- Tingji Chen
- Department of Psychology, School of Education, Soochow University, Suzhou, Jiangsu, China
- Terhi M Helminen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Samuli Linnunsalo
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Jari K Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
2. Being watched by a humanoid robot and a human: Effects on affect-related psychophysiological responses. Biol Psychol 2022; 175:108451. [DOI: 10.1016/j.biopsycho.2022.108451]
3. Mauersberger H, Kastendieck T, Hess U. I looked at you, you looked at me, I smiled at you, you smiled at me—The impact of eye contact on emotional mimicry. Front Psychol 2022; 13:970954. [PMID: 36248540] [PMCID: PMC9556997] [DOI: 10.3389/fpsyg.2022.970954]
Abstract
Eye contact is an essential element of human interaction, and direct eye gaze has been shown to affect a range of attentional and cognitive processes. Specifically, direct eye contact evokes a positive affective reaction. As such, it has been proposed that obstructed eye contact reduces emotional mimicry (i.e., the imitation of our counterpart's emotions). So far, emotional mimicry research has used averted-gaze faces or unnaturally covered eyes (with black censor bars) to analyze the effect of eye contact on emotional mimicry. However, averted gaze can also signal disinterest/disengagement, and censor bars obscure eye-adjacent areas as well, thereby impeding emotion recognition. In the present study (N = 44), we used a more ecologically valid approach by showing photos of actors who expressed happiness, sadness, anger, or disgust while wearing either mirrored sunglasses that obstructed eye contact or clear glasses. The glasses covered only the direct eye region, not the brows, nose ridge, or cheeks. Our results confirm that participants were equally accurate in recognizing the emotions of their counterparts in both conditions (sunglasses vs. glasses). Further, in line with our hypotheses, participants felt closer to the targets and mimicked affiliative emotions more intensely when their counterparts wore glasses instead of sunglasses. For antagonistic emotions, we found the opposite pattern: disgust mimicry, which was interpreted as an affective reaction rather than genuine mimicry, was found only in the sunglasses condition. It may be that obstructed eye contact increased the negative impression of disgusted facial expressions and hence the negative feelings disgust faces evoked. The present study provides further evidence for the notion that eye contact is an important prerequisite for emotional mimicry and hence for smooth and satisfying social interactions.
4. Hsu CT, Sato W, Kochiyama T, Nakai R, Asano K, Abe N, Yoshikawa S. Enhanced mirror neuron network activity and effective connectivity during live interaction among female subjects. Neuroimage 2022; 263:119655. [PMID: 36182055] [DOI: 10.1016/j.neuroimage.2022.119655]
Abstract
Facial expressions are indispensable in daily human communication. Previous neuroimaging studies of facial expression processing have presented pre-recorded stimuli and lacked live face-to-face interaction. Our paradigm alternated between presenting models' real-time performances and pre-recorded videos of dynamic facial expressions to participants. Simultaneous functional magnetic resonance imaging (fMRI) and facial electromyography recordings, as well as post-scan valence and arousal ratings, were acquired from 44 female participants. Live facial expressions enhanced the subjective valence and arousal ratings as well as facial muscular responses. Live performances showed greater engagement of the right posterior superior temporal sulcus (pSTS), right inferior frontal gyrus (IFG), right amygdala, and right fusiform gyrus, and modulated the effective connectivity within the right mirror neuron system (IFG, pSTS, and right inferior parietal lobule). A support vector machine algorithm could classify multivoxel activation patterns in brain regions involved in dynamic facial expression processing in the mentalizing networks (anterior and posterior cingulate cortex). These results indicate that live social interaction modulates the activity and connectivity of the right mirror neuron system and enhances spontaneous mimicry, further facilitating emotional contagion.
Affiliation(s)
- Chun-Ting Hsu
- Psychological Process Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
- Wataru Sato
- Psychological Process Research Team, Guardian Robot Project, RIKEN, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
- Takanori Kochiyama
- Brain Activity Imaging Center, ATR-Promotions, Inc., 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
- Ryusuke Nakai
- Institute for the Future of Human Society, Kyoto University, 46 Yoshidashimoadachi-cho, Sakyo-ku, Kyoto 606-8501, Japan
- Kohei Asano
- Institute for the Future of Human Society, Kyoto University, 46 Yoshidashimoadachi-cho, Sakyo-ku, Kyoto 606-8501, Japan; Department of Children Education, Osaka University of Comprehensive Children Education, 6-chome-4-26 Yuzato, Higashisumiyoshi Ward, Osaka 546-0013, Japan
- Nobuhito Abe
- Institute for the Future of Human Society, Kyoto University, 46 Yoshidashimoadachi-cho, Sakyo-ku, Kyoto 606-8501, Japan
- Sakiko Yoshikawa
- Institute of Philosophy and Human Values, Kyoto University of the Arts, 2-116 Uryuyama Kitashirakawa, Sakyo, Kyoto 606-8271, Japan
5. Social signalling as a framework for second-person neuroscience. Psychon Bull Rev 2022; 29:2083-2095. [PMID: 35650463] [DOI: 10.3758/s13423-022-02103-2]
Abstract
Despite the recent increase in second-person neuroscience research, it is still hard to understand which neurocognitive mechanisms underlie real-time social behaviours. Here, we propose that social signalling can help us understand social interactions both at the single- and two-brain level in terms of social signal exchanges between senders and receivers. First, we show how subtle manipulations of being watched provide an important tool to dissect meaningful social signals. We then focus on how social signalling can help us build testable hypotheses for second-person neuroscience with the example of imitation and gaze behaviour. Finally, we suggest that linking neural activity to specific social signals will be key to fully understand the neurocognitive systems engaged during face-to-face interactions.
6. Jeganathan J, Breakspear M. An active inference perspective on the negative symptoms of schizophrenia. Lancet Psychiatry 2021; 8:732-738. [PMID: 33865502] [DOI: 10.1016/s2215-0366(20)30527-7]
Abstract
Predictive coding has played a transformative role in the study of psychosis, casting delusions and hallucinations as statistical inference in a system with abnormal precision. However, the negative symptoms of schizophrenia, such as affective blunting, avolition, and asociality, remain poorly understood. We propose a computational framework for emotional expression based on active inference, namely that affective behaviours such as smiling are driven by predictions about the social consequences of smiling. Just as delusions and hallucinations can be explained by predictive uncertainty in sensory circuits, negative symptoms arise naturally from uncertainty in social prediction circuits. This perspective draws on computational principles to explain blunted facial expressiveness and apathy-anhedonia in schizophrenia. Its phenomenological consequences also shed light on the content of paranoid delusions and the indistinctness of self-other boundaries. Close links are highlighted between social prediction, facial affect mirroring, and the fledgling study of interoception. Advances in automated analysis of facial expressions and acoustic speech patterns will allow empirical testing of these computational models of the negative symptoms of schizophrenia.
Affiliation(s)
- Jayson Jeganathan
- School of Psychology, College of Engineering, Science, and the Environment, The University of Newcastle, Newcastle, NSW, Australia; Hunter Medical Research Institute, Newcastle, NSW, Australia
- Michael Breakspear
- School of Psychology, College of Engineering, Science, and the Environment, The University of Newcastle, Newcastle, NSW, Australia; School of Medicine and Public Health, College of Health and Medicine, The University of Newcastle, Newcastle, NSW, Australia; Hunter Medical Research Institute, Newcastle, NSW, Australia
7. Hietanen JK, Peltola MJ. The eye contact smile: The effects of sending and receiving a direct gaze. Visual Cognition 2021. [DOI: 10.1080/13506285.2021.1915904]
Affiliation(s)
- Jari K. Hietanen
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
- Mikko J. Peltola
- Human Information Processing Laboratory, Faculty of Social Sciences/Psychology, Tampere University, Tampere, Finland
8. Soundirarajan M, Aghasian E, Krejcar O, Namazi H. Complexity-based analysis of the coupling between facial muscle and brain activities. Biomed Signal Process Control 2021. [DOI: 10.1016/j.bspc.2021.102511]
9. Pictures of social interaction prompt a sustained increase of the smile expression and induce sociability. Sci Rep 2021; 11:5518. [PMID: 33750836] [PMCID: PMC7943771] [DOI: 10.1038/s41598-021-84880-9]
Abstract
Viewing pictures of social interaction can facilitate approach behaviors. We conducted two studies to investigate whether social interaction cues, empathy, and/or social touch modulate facial electromyographic (EMG) reactivity (in the zygomaticus major and corrugator supercilii muscles) and mood states. We presented bonding pictures (depicting social interaction) and control pictures (without social interaction) while continuously recording zygomatic and corrugator EMG activity. In both studies, picture blocks were paired by valence and arousal, and all participants were college students. In study 1, participants (n = 80, 47 women) read relevant priming texts immediately before viewing each block of 14 pictures. In study 2, participants (n = 82, 63 women) did not read priming texts before each block of 28 pictures. In both studies, participants also completed mood state questionnaires to assess sociability and altruistic behavior, and empathy and social touch frequency were assessed by self-report questionnaires. In both studies, bonding pictures increased zygomatic activity and the self-reported feeling of sociability compared to control pictures. In study 2 only, bonding pictures decreased median corrugator activity compared to control pictures. We concluded that social interaction cues effectively increased sociability and prompted a sustained smile expression regardless of priming texts.
10. Electromyographic evidence of reduced emotion mimicry in individuals with a history of non-suicidal self-injury. PLoS One 2020; 15:e0243860. [PMID: 33370320] [PMCID: PMC7769269] [DOI: 10.1371/journal.pone.0243860]
Abstract
Engaging in facial emotion mimicry during social interactions encourages empathy and functions as a catalyst for interpersonal bonding. Decreased reflexive mirroring of facial expressions has been observed in individuals with different non-psychotic disorders, relative to healthy controls. Given reports of interpersonal relationship difficulties experienced by those who engage in non-suicidal self-injury (NSSI), it is of interest to explore facial emotion mimicry in individuals with a history of this behaviour (HNSSI). Among other things, this will enable us to better understand their emotion regulation and social interaction challenges. Surface facial electromyography (fEMG) was used to record the reflexive facial mimicry of 30 HNSSI and 30 controls while they passively observed a series of dynamic facial stimuli showing various facial expressions of emotion. Beginning with a neutral expression, the stimuli quickly morphed to one of 6 prototypic emotional expressions (anger, fear, surprise, disgust, happiness, or sadness). Mimicry was assessed by affixing surface electrodes to facial muscles known to exhibit a high degree of electrical activity in response to positive and negative emotions: the corrugator supercilii and the zygomaticus major. HNSSI participants, relative to controls, exhibited significantly less electrical activity in the corrugator muscle in response to viewing angry stimuli, and significantly less of an expected relaxation in muscle activity in response to viewing happy stimuli. Mirroring these results, greater endorsement of social influence as a motivator for engaging in NSSI was associated with less mimicry, and greater endorsement of emotion regulation as a motivator was associated with greater incongruent muscle response when viewing happy faces. These findings lend support to the theory that social interaction difficulties in HNSSI might be related to implicit violations of expected social rules exhibited through facial mimicry nonconformity.
11. Kiilavuori H, Sariola V, Peltola MJ, Hietanen JK. Making eye contact with a robot: Psychophysiological responses to eye contact with a human and with a humanoid robot. Biol Psychol 2020; 158:107989. [PMID: 33217486] [DOI: 10.1016/j.biopsycho.2020.107989]
Abstract
Previous research has shown that eye contact in human-human interaction elicits increased affective and attention-related psychophysiological responses. In the present study, we investigated whether eye contact with a humanoid robot would elicit these responses. Participants faced a humanoid robot (NAO) or a human partner, both physically present and looking at or away from the participant. The results showed that in both the human-robot and human-human conditions, eye contact versus averted gaze elicited greater skin conductance responses indexing autonomic arousal, greater facial zygomatic muscle responses (and smaller corrugator responses) associated with positive affect, and greater heart rate deceleration responses indexing attention allocation. For the skin conductance and zygomatic responses, the human model's gaze direction had a greater effect than the robot's gaze direction. In conclusion, eye contact elicits automatic affective and attentional reactions both when shared with a humanoid robot and when shared with another human.
Affiliation(s)
- Helena Kiilavuori
- Human Information Processing Laboratory, Psychology, Faculty of Social Sciences, FI-33014 Tampere University, Finland
- Veikko Sariola
- Faculty of Medicine and Health Technology, Korkeakoulunkatu 3, FI-33720 Tampere University, Finland
- Mikko J Peltola
- Human Information Processing Laboratory, Psychology, Faculty of Social Sciences, FI-33014 Tampere University, Finland
- Jari K Hietanen
- Human Information Processing Laboratory, Psychology, Faculty of Social Sciences, FI-33014 Tampere University, Finland
12. Enhanced emotional and motor responses to live versus videotaped dynamic facial expressions. Sci Rep 2020; 10:16825. [PMID: 33033355] [PMCID: PMC7544832] [DOI: 10.1038/s41598-020-73826-2]
Abstract
Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry. However, such highly controlled experimental procedures may lack the vividness of real-life social interactions. This study incorporated a live image relay system that delivered models' real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions, or their prerecorded videos, to participants. We measured subjective ratings of valence and arousal and facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles. For the positive emotion conditions, subjective ratings showed that live facial expressions were rated as higher in valence and more arousing than the corresponding videos. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models' positive facial expressions. The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.
13. Cañigueral R, Ward JA, Hamilton AFDC. Effects of being watched on eye gaze and facial displays of typical and autistic individuals during conversation. Autism 2020; 25:210-226. [PMID: 32854524] [PMCID: PMC7812513] [DOI: 10.1177/1362361320951691]
Abstract
Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, we investigate how the eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and by the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video, video call, and face-to-face. Typical participants gazed less at the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial motion patterns in autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial displays as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.
Affiliation(s)
- Jamie A Ward
- University College London, UK; Goldsmiths, University of London, UK