1. Chiang KW, Tan CH, Hong WP, Yu RL. Disgust-specific impairment of facial emotion recognition in Parkinson's disease patients with mild cognitive impairment. Soc Cogn Affect Neurosci 2024;19:nsae073. [PMID: 39417289] [PMCID: PMC11561469] [DOI: 10.1093/scan/nsae073]
Abstract
This study investigated the association between cognitive function and facial emotion recognition (FER) in patients with Parkinson's disease (PD) and mild cognitive impairment (PD-MCI). We enrolled 126 participants from Taiwan, including 63 patients with idiopathic PD and 63 matched healthy controls. The PD group was divided into two groups: those with normal cognitive function (PD-NC) and those with MCI (PD-MCI). Participants underwent a modality emotion recognition test and comprehensive cognitive assessment. Our findings reveal that patients with PD-MCI exhibit significantly impaired FER, especially in recognizing "disgust," compared with patients with PD-NC and healthy adults (P = .001). This deficit correlates with executive function, attention, memory, and visuospatial abilities. Attention mediates the relationship between executive function and "disgust" FER. The findings highlight how patients with PD-MCI are specifically challenged when recognizing "disgust" and suggest that cognitive training focusing on cognitive flexibility and attention may improve their FER abilities. This study contributes to our understanding of the nuanced relationship between cognitive dysfunction and FER in patients with PD-MCI, emphasizing the need for targeted interventions.
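The mediation result above (attention mediating the link between executive function and "disgust" FER) corresponds to a standard indirect-effect analysis. The following is a minimal sketch with simulated data; the variable names (exec_fn, attention, disgust_fer) and the percentile-bootstrap approach are illustrative assumptions, not the authors' actual pipeline.

```python
# Bootstrap indirect-effect sketch: does attention mediate the effect of
# executive function on "disgust" FER? Simulated data, hypothetical names.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 63                                               # e.g., the PD sample size
exec_fn = rng.normal(size=n)
attention = 0.5 * exec_fn + rng.normal(size=n)       # path a (simulated)
disgust_fer = 0.4 * attention + rng.normal(size=n)   # path b (simulated)

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]     # x -> m
    xm = sm.add_constant(np.column_stack([m, x]))
    b = sm.OLS(y, xm).fit().params[1]                     # m -> y, controlling x
    return a * b

boot = []
for _ in range(2000):                                # percentile bootstrap CI
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(exec_fn[idx], attention[idx], disgust_fer[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")  # excludes 0 => mediation
```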
Affiliation(s)
- Ke-Wei Chiang
  - Institute of Behavioral Medicine, College of Medicine, National Cheng Kung University, Tainan 701, Taiwan
  - Department of Psychiatry, China Medical University Hospital, China Medical University, Taichung 404327, Taiwan
- Chun-Hsiang Tan
  - Department of Neurology, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung 807378, Taiwan
  - Graduate Institute of Clinical Medicine, College of Medicine, Kaohsiung Medical University, Kaohsiung 807378, Taiwan
- Wei-Pin Hong
  - Department of Neurology, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan 701401, Taiwan
- Rwei-Ling Yu
  - Institute of Behavioral Medicine, College of Medicine, National Cheng Kung University, Tainan 701, Taiwan
  - Office of Strategic Planning, National Cheng Kung University, Tainan 701401, Taiwan
2. Jankowski M, Goroncy A. Anatomical variants of acne differ in their impact on social perception. J Eur Acad Dermatol Venereol 2024;38:1628-1636. [PMID: 38379351] [DOI: 10.1111/jdv.19798]
Abstract
BACKGROUND Acne negatively affects quality of life; however, quality-of-life scores correlate poorly with disease-severity scores. Previous research demonstrated the existence of facial areas in which skin lesions have a greater impact on gaze patterns. We therefore hypothesized that anatomical variants of acne may be perceived differently. OBJECTIVES The aim was to investigate the effect of anatomical variants of acne on natural gaze patterns and the resulting impact on social perception of acne patients. METHODS We tracked the eye movements of participants viewing neutral and emotional faces with acne. Images were rated for acne-related visual disturbance, and emotional faces were rated for valence intensity. Respondents of an online survey were asked to rate their perception of the pictured individuals' personality traits. RESULTS All faces with acne were perceived as less attractive and received poorer personality judgements, with mid-facial acne showing the smallest deviation from healthy faces. T-zone and mixed acne differed least from each other in respondents' gaze-behaviour patterns, and there was no significant difference in respondents' grading of their visual disturbance or in ratings for attractiveness, success, and trustworthiness. U-zone adult female acne was rated as the most visually disturbing and received the lowest scores for attractiveness. Happy faces with adult female acne were rated as less happy compared with other acne variants and clear-skin faces. CONCLUSIONS Anatomical variants of acne have distinct impacts on gaze patterns and social perception. Adult female acne has the strongest negative effect on recognition of positive emotions in affected individuals, on attractiveness ratings, and on the forming of social impressions. If perioral acne lesions are absent, frontal lesions determine the impact of acne on social perception irrespective of the presence of mid-facial lesions. This perceptual hierarchy should be taken into consideration when deciding treatment goals in acne patients, prioritizing remission in the perioral and frontal areas.
Affiliation(s)
- Marek Jankowski
  - Department of Dermatology and Venereology, Faculty of Medicine in Bydgoszcz, Nicolaus Copernicus University, Bydgoszcz, Poland
- Agnieszka Goroncy
  - Department of Mathematical Statistics and Data Mining, Faculty of Mathematics and Computer Science, Nicolaus Copernicus University in Torun, Torun, Poland
3. Huang K, Tian Z, Zhang Q, Yang H, Wen S, Feng J, Tang W, Wang Q, Feng L. Reduced eye gaze fixation during emotion recognition among patients with temporal lobe epilepsy. J Neurol 2024;271:2560-2572. [PMID: 38289536] [DOI: 10.1007/s00415-024-12202-w]
Abstract
OBJECTIVES To investigate facial scan patterns during emotion recognition (ER) in temporal lobe epilepsy (TLE) using the dynamic facial expression task and The Awareness of Social Inference Test (TASIT) with eye-tracking (ET) technology, and to identify ET indicators that accurately capture the ER process, as a supplement to existing ER assessment tools. METHOD Ninety-six patients with TLE and 88 healthy controls (HCs) were recruited. All participants completed the dynamic facial expression task and the TASIT while their eye movements were recorded, and identified the displayed emotion (anger, disgust, happiness, or sadness). ER accuracy was recorded, and first fixation time, first fixation duration, dwell time, and fixation count were analyzed. RESULTS Patients with TLE exhibited ER impairment, especially for disgust (Z = -3.391; p = 0.001) and sadness (Z = -3.145; p = 0.002). They fixated less on the face, as evidenced by a reduced fixation count for the face (Z = -2.549; p = 0.011) and a significant decrease in the fixation count rate (Z = -1.993; p = 0.046). During the dynamic facial expression task, patients with TLE focused less on the eyes, as evidenced by decreased first fixation duration (Z = -4.322; p < 0.001), dwell time (Z = -4.083; p < 0.001), and fixation count (Z = -3.699; p < 0.001) for the eyes. CONCLUSION Patients with TLE showed ER impairment, especially for negative emotions, which may be attributable to their reduced fixation on the eyes during ER; their increased fixation on the mouth could be a compensatory strategy to improve ER performance. Eye-tracking technology can provide process indicators of ER and is a valuable supplement to traditional ER assessment tasks.
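The ET indices reported above (first fixation time, first fixation duration, dwell time, fixation count) are per-AOI summaries of a fixation table. A sketch of how such metrics can be computed, assuming hypothetical column names (onset, duration, aoi) rather than the authors' actual processing code:

```python
# Per-AOI eye-tracking metrics from a fixation table.
# Column names (onset, duration, aoi) are assumptions for illustration.
import pandas as pd

fixations = pd.DataFrame({
    "onset":    [120, 310, 650, 900, 1400],   # ms from stimulus onset
    "duration": [180, 220, 150, 300, 250],    # ms
    "aoi":      ["eyes", "mouth", "eyes", "nose", "eyes"],
})

def aoi_metrics(df: pd.DataFrame) -> pd.DataFrame:
    grouped = df.sort_values("onset").groupby("aoi")
    return pd.DataFrame({
        "first_fixation_time":     grouped["onset"].first(),   # latency to 1st fixation
        "first_fixation_duration": grouped["duration"].first(),
        "dwell_time":              grouped["duration"].sum(),  # total time in AOI
        "fixation_count":          grouped["duration"].count(),
    })

print(aoi_metrics(fixations))
```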
Affiliation(s)
- Kailing Huang
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
- Ziwei Tian
  - Key Laboratory of Biomedical Spectroscopy of Xi'an, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an, 710119, China
  - Key Laboratory of Spectral Imaging Technology, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an, 710119, China
  - University of Chinese Academy of Sciences, Beijing, 101400, China
- Qiong Zhang
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
- Haojun Yang
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
- Shirui Wen
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
- Jie Feng
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
- Weiting Tang
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
- Quan Wang
  - Key Laboratory of Biomedical Spectroscopy of Xi'an, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an, 710119, China
  - Key Laboratory of Spectral Imaging Technology, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an, 710119, China
- Li Feng
  - Department of Neurology, Xiangya Hospital, Central South University, Changsha, 410008, People's Republic of China
  - National Clinical Research Center for Geriatric Disorders, Xiangya Hospital, Central South University, Changsha, Hunan, 410008, People's Republic of China
  - Department of Neurology, Xiangya Hospital, Central South University (Jiangxi Branch), Nanchang, 330000, Jiangxi, China
4. Urtado MB, Rodrigues RD, Fukusima SS. Visual field restriction in the recognition of basic facial expressions: A combined eye tracking and gaze contingency study. Behav Sci (Basel) 2024;14:355. [PMID: 38785846] [PMCID: PMC11117586] [DOI: 10.3390/bs14050355]
Abstract
Uncertainties and discrepant results in identifying the areas crucial for emotional facial expression recognition may stem from the eye-tracking data analysis methods used. Many studies employ analysis parameters that prioritize the foveal visual angle, ignoring the potential influence of simultaneous parafoveal and peripheral information. To explore the possible causes of these discrepancies, we investigated the role of the visual field aperture in emotional facial expression recognition, with 163 volunteers randomly assigned to three groups: no visual restriction (NVR), parafoveal and foveal vision (PFFV), and foveal vision (FV). Employing eye tracking and gaze contingency, we collected visual inspection and judgment data over 30 frontal face images, equally distributed among five emotions. Raw eye-tracking data underwent Eye Movements Metrics and Visualizations (EyeMMV) processing. Visual inspection time, number of fixations, and fixation duration all increased with visual field restriction. Accuracy, however, differed significantly between the NVR and FV groups and between the PFFV and FV groups, but not between NVR and PFFV. The findings underscore the impact of specific visual field areas on facial expression recognition, highlighting the importance of parafoveal vision. The results suggest that eye-tracking data analysis methods should incorporate projection angles extending to at least the parafoveal level.
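Implementing gaze-contingent foveal/parafoveal apertures requires converting visual angles (roughly 2 degrees for foveal and 5 degrees for parafoveal vision) into screen pixels for the viewing geometry. A sketch of that conversion follows; the display parameters are illustrative assumptions, as the abstract does not restate the study's actual setup:

```python
# Convert a visual angle (degrees) into an aperture size in pixels for a
# gaze-contingent window. Display geometry values are illustrative assumptions.
import math

def deg_to_px(angle_deg: float, distance_cm: float,
              screen_width_cm: float, screen_width_px: int) -> float:
    """Pixels subtended by angle_deg at the given viewing distance."""
    size_cm = 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)
    return size_cm * screen_width_px / screen_width_cm

# e.g., 57 cm viewing distance, 53 cm wide screen rendered at 1920 px
for label, angle in [("foveal (~2 deg)", 2.0), ("parafoveal (~5 deg)", 5.0)]:
    print(label, round(deg_to_px(angle, 57, 53, 1920)), "px")
```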
Affiliation(s)
- Melina Boratto Urtado
  - Faculty of Philosophy, Sciences and Letters at Ribeirão Preto, University of São Paulo, Ribeirão Preto 14040-901, Brazil
- Sergio Sheiji Fukusima
  - Faculty of Philosophy, Sciences and Letters at Ribeirão Preto, University of São Paulo, Ribeirão Preto 14040-901, Brazil
5. Richoz AR, Stacchi L, Schaller P, Lao J, Papinutto M, Ticcinelli V, Caldara R. Recognizing facial expressions of emotion amid noise: A dynamic advantage. J Vis 2024;24:7. [PMID: 38197738] [PMCID: PMC10790674] [DOI: 10.1167/jov.24.1.7]
Abstract
Humans communicate internal states through complex facial movements shaped by biological and evolutionary constraints. Although real-life social interactions are flooded with dynamic signals, current knowledge on facial expression recognition mainly arises from studies using static face images. This experimental bias might stem from previous studies consistently reporting that young adults benefit only minimally from the richer dynamic over static information, whereas children, the elderly, and clinical populations benefit very strongly (Richoz, Jack, Garrod, Schyns, & Caldara, 2015; Richoz, Jack, Garrod, Schyns, & Caldara, 2018b). These observations point to a near-optimal facial expression decoding system in young adults, almost insensitive to the advantage of dynamic over static cues. Surprisingly, no study has yet tested the idea that such evidence might be rooted in a ceiling effect. To this end, we asked 70 healthy young adults to perform static and dynamic facial expression recognition of the six basic expressions while parametrically and randomly varying the low-level normalized phase and contrast signal (0%-100%) of the faces. As predicted, when 100% of the face signal was presented, static and dynamic expressions were recognized with equal efficiency, with the exception of those with the most informative dynamics (i.e., happiness and surprise). However, when less signal was available, all dynamic expressions were better recognized than their static counterparts (the advantage peaking at ~20% signal). Our data show that facial movements increase our ability to efficiently identify the emotional states of others under the suboptimal visual conditions that can occur in everyday life. Dynamic signals are more effective and sensitive than static ones for decoding all facial expressions of emotion for all human observers.
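The phase/contrast signal manipulation can be approximated with a standard phase-coherence technique: blend the face's Fourier phase with random phase while preserving its amplitude spectrum, then scale contrast. The sketch below illustrates the general method only and is not necessarily the authors' exact stimulus-generation code:

```python
# Vary a face image's "signal" by mixing its Fourier phase with random phase
# (amplitude spectrum preserved), then scaling contrast accordingly.
# Generic phase-coherence sketch; not the authors' exact pipeline.
import numpy as np

def degrade_signal(img: np.ndarray, signal: float, seed: int = 0) -> np.ndarray:
    """signal in [0, 1]: 1.0 returns the original image, 0.0 pure phase noise."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.fft2(img)
    amplitude, phase = np.abs(spectrum), np.angle(spectrum)
    noise_phase = np.angle(np.fft.fft2(rng.normal(size=img.shape)))
    # Circular interpolation between the original and the random phase.
    blended = np.angle(signal * np.exp(1j * phase)
                       + (1 - signal) * np.exp(1j * noise_phase))
    out = np.real(np.fft.ifft2(amplitude * np.exp(1j * blended)))
    return img.mean() + signal * (out - out.mean())   # contrast scaling

face = np.random.rand(256, 256)        # stand-in for a normalized face image
stimulus = degrade_signal(face, 0.2)   # ~20% signal, where the dynamic gain peaked
```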
Affiliation(s)
- Anne-Raphaëlle Richoz, Lisa Stacchi, Pauline Schaller, Junpeng Lao, Michael Papinutto, Valentina Ticcinelli, Roberto Caldara
  - Eye and Brain Mapping Laboratory (iBMLab), Department of Psychology, University of Fribourg, Fribourg, Switzerland
6. Woodward SH, Jamison AL, Khan C, Gala S, Bhowmick C, Villasenor D, Tamayo G, Puckett M, Parker KJ. Reading the mind in the eyes in PTSD: Limited moderation by the presence of a service dog. J Psychiatr Res 2022;155:320-330. [PMID: 36174367] [DOI: 10.1016/j.jpsychires.2022.09.012]
Abstract
Persons with posttraumatic stress disorder (PTSD) frequently experience relationship failures in family and occupational domains, resulting in the loss of social supports. Prior research has implicated impairments in social cognition. The Reading the Mind in the Eyes Test (RMET) measures a key component of social cognition, the ability to infer the internal states of other persons from features of the eye region of the face; however, studies administering this popular test to persons with PTSD have yielded mixed results. This study assessed RMET performance in 47 male U.S. military Veterans with chronic, severe PTSD. Employing a within-subjects design that avoided selection biases, it aimed specifically to determine whether components of RMET performance, including accuracy, response latency, and stimulus dwell time, were improved by the company of a service dog, an intervention that has improved social function in other populations. RMET accuracies and response latencies in this PTSD sample were in the normal range. The presence of a familiar service dog did not improve RMET accuracy, reduce response latencies, or increase dwell times. Dog presence did, however, increase the speed of visual scanning, perhaps consistent with reduced social fear.
Affiliation(s)
- Steven H Woodward
  - National Center for PTSD, Dissemination and Training Division, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Andrea L Jamison
  - National Center for PTSD, Dissemination and Training Division, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Christina Khan
  - Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Rd., Stanford, CA, 94305-5485, USA
- Sasha Gala
  - National Center for PTSD, Dissemination and Training Division, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Chloe Bhowmick
  - National Center for PTSD, Dissemination and Training Division, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Diana Villasenor
  - National Center for PTSD, Dissemination and Training Division, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Gisselle Tamayo
  - National Center for PTSD, Dissemination and Training Division, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Melissa Puckett
  - Trauma Recovery Programs and Recreation Service, VA Palo Alto Healthcare System, 3801 Miranda Ave, Palo Alto, CA, 94304, USA
- Karen J Parker
  - Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, 401 Quarry Rd., Stanford, CA, 94305-5485, USA
  - Department of Comparative Medicine, Stanford University School of Medicine, 300 Pasteur Drive, Stanford, CA, 94305-5342, USA
7. Vicente-Querol MA, Fernandez-Caballero A, Molina JP, Gonzalez-Gualda LM, Fernandez-Sotos P, Garcia AS. Facial affect recognition in immersive virtual reality: Where is the participant looking? Int J Neural Syst 2022;32:2250029. [DOI: 10.1142/s0129065722500290]
8. Duran N, Atkinson AP. Foveal processing of emotion-informative facial features. PLoS One 2021;16:e0260814. [PMID: 34855898] [PMCID: PMC8638924] [DOI: 10.1371/journal.pone.0260814]
Abstract
Certain facial features provide useful information for the recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations that would ensure foveation of specific features. Foveating the mouth of fearful, surprised, and disgusted expressions improved emotion recognition compared to foveating an eye, a cheek, or the central brow. Foveating the brow led to equivocal results for anger recognition across the two experiments, which might be due to the different combinations of emotions used. There was no consistent evidence that reflexive first saccades targeted emotion-relevant features; instead, they targeted the feature closest to the initial fixation. In a third experiment, angry, fearful, surprised, and disgusted expressions were presented for 5 seconds. The duration of task-related fixations in the eyes, brow, nose, and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth correlated positively with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features contributes functionally to emotion recognition, but such features are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.
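The reported association between mouth-looking and recognition accuracy amounts to a per-participant correlation between fixation preference and performance. A sketch with simulated, hypothetical data:

```python
# Correlate participants' preference to fixate the mouth with their
# disgust-recognition accuracy. All data below are simulated/hypothetical.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 40                                                 # hypothetical sample size
mouth_dwell_prop = rng.uniform(0.1, 0.6, n)            # share of dwell time on mouth
disgust_acc = np.clip(0.5 + 0.4 * mouth_dwell_prop
                      + rng.normal(0, 0.08, n), 0, 1)  # proportion correct

r, p = pearsonr(mouth_dwell_prop, disgust_acc)
print(f"r = {r:.2f}, p = {p:.4f}")   # a positive r mirrors the reported pattern
```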
Affiliation(s)
- Nazire Duran
  - Department of Psychology, Durham University, Durham, United Kingdom
- Anthony P. Atkinson
  - Department of Psychology, Durham University, Durham, United Kingdom
9. Goshvarpour A, Goshvarpour A. Human emotion recognition using polar-based lagged Poincaré plot indices of eye-blinking data. Int J Comput Intell Appl 2021. [DOI: 10.1142/s1469026821500231]
Abstract
Emotion recognition using bio-signals is currently a hot and challenging topic in human-computer interfaces, robotics, and affective computing. A broad literature has analyzed the internal and external behaviors of subjects confronting emotional events or stimuli. Eye movements, as an external behavior, are frequently used in multi-modal emotion recognition systems. However, classic statistical features of the signal have generally been assessed, while evaluation of its dynamics has so far been neglected. Here, for the first time, the dynamics of single-modal eye-blinking data are characterized. Novel polar-based indices of the lagged Poincaré plot were introduced. The optimum lag was estimated using mutual information. After reconstruction of the plot, the polar coordinates of all points were characterized using statistical measures. Support vector machine (SVM), decision tree, and naïve Bayes classifiers were implemented to complete the classification process. The highest accuracy of 100%, with an average accuracy of 84.17%, was achieved for fear/sad discrimination using SVM. The suggested framework provided outstanding performance in terms of recognition rates, simplicity of methodology, and low computational cost. Our results also show that eye-blinking data possess potential for emotion recognition, especially in classifying fear.
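The core construction, a lagged Poincaré plot of the blink signal whose points are then summarized in polar coordinates, can be sketched as follows. The lag choice and the specific statistics below are generic illustrations; the paper's exact index definitions may differ:

```python
# Lagged Poincare plot of an eye-blink signal, summarized by polar indices.
# Generic sketch; the paper's exact index definitions may differ.
import numpy as np

def lagged_poincare_polar(x: np.ndarray, lag: int) -> dict:
    """Points (x[t], x[t+lag]) converted to polar coordinates, then summarized."""
    a, b = x[:-lag], x[lag:]
    r = np.hypot(a, b)             # radial distance of each plot point
    theta = np.arctan2(b, a)       # angular position of each plot point
    return {"r_mean": r.mean(), "r_std": r.std(),
            "theta_mean": theta.mean(), "theta_std": theta.std()}

blinks = np.random.default_rng(2).normal(size=500)   # stand-in blink signal
print(lagged_poincare_polar(blinks, lag=3))          # lag: e.g., first minimum of
                                                     # the auto mutual information
```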
Affiliation(s)
- Atefeh Goshvarpour
  - Department of Biomedical Engineering, Faculty of Electrical Engineering, Sahand University of Technology, Tabriz, Iran
- Ateke Goshvarpour
  - Department of Biomedical Engineering, Imam Reza International University, Mashhad, Razavi Khorasan, Iran