1
Jia SJ, Jing JQ, Yang CJ. A Review on Autism Spectrum Disorder Screening by Artificial Intelligence Methods. J Autism Dev Disord 2024. [PMID: 38842671] [DOI: 10.1007/s10803-024-06429-9]
Abstract
PURPOSE With the increasing prevalence of autism spectrum disorder (ASD), the importance of early screening and diagnosis has been the subject of considerable discussion. Given the subtle differences between children with ASD and typically developing children during the early stages of development, it is imperative to investigate automatic recognition methods powered by artificial intelligence. We aim to summarize the research on this topic and identify the markers that can be used for recognition. METHODS We searched papers published in the Web of Science, PubMed, Scopus, Medline, SpringerLink, Wiley Online Library, and EBSCO databases from 1 January 2013 to 13 November 2023; 43 articles were included. RESULTS These articles divided recognition markers into five main categories: gaze behaviors, facial expressions, motor movements, voice features, and task performance. Based on these markers, the accuracy of artificial intelligence screening ranged from 62.13% to 100%, sensitivity from 69.67% to 100%, and specificity from 54% to 100%. CONCLUSION Artificial intelligence recognition therefore holds promise as a tool for identifying children with ASD. However, screening models still need continual enhancement, and accuracy can be improved through multimodal screening, thereby facilitating timely intervention and treatment.
Affiliation(s)
- Si-Jia Jia
  - Faculty of Education, East China Normal University, Shanghai, China
- Jia-Qi Jing
  - Faculty of Education, East China Normal University, Shanghai, China
- Chang-Jiang Yang
  - Faculty of Education, East China Normal University, Shanghai, China
  - China Research Institute of Care and Education of Infants and Young, Shanghai, China
2
Hsu CT, Sato W. Electromyographic Validation of Spontaneous Facial Mimicry Detection Using Automated Facial Action Coding. Sensors (Basel) 2023; 23:9076. [PMID: 38005462] [PMCID: PMC10675524] [DOI: 10.3390/s23229076]
Abstract
Although electromyography (EMG) remains the standard, researchers have begun using automated facial action coding system (FACS) software to evaluate spontaneous facial mimicry despite the lack of evidence of its validity. Using facial EMG of the zygomaticus major (ZM) as the standard, we confirmed the detection of spontaneous facial mimicry in action unit 12 (AU12, lip corner puller) via automated FACS. Participants were alternately presented with real-time model performances and prerecorded videos of dynamic facial expressions, while simultaneous ZM signals and frontal facial videos were acquired. AU12 was estimated from the facial videos using FaceReader, Py-Feat, and OpenFace. The automated FACS was less sensitive and less accurate than facial EMG, but AU12 mimicking responses were significantly correlated with ZM responses. All three software programs detected enhanced facial mimicry during live performances. The AU12 time series showed a roughly 100 to 300 ms latency relative to the ZM. Our results suggest that while automated FACS cannot replace facial EMG in mimicry detection, it may be useful when large effect sizes are expected. Researchers should be cautious with automated FACS outputs, especially when studying clinical populations. In addition, developers should consider EMG validation of AU estimation as a benchmark.
Affiliation(s)
- Chun-Ting Hsu
  - Psychological Process Research Team, Guardian Robot Project, RIKEN, Soraku-gun, Kyoto 619-0288, Japan
- Wataru Sato
  - Psychological Process Research Team, Guardian Robot Project, RIKEN, Soraku-gun, Kyoto 619-0288, Japan
3
Folz J, Akdağ R, Nikolić M, van Steenbergen H, Kret ME. Facial mimicry and metacognitive judgments in emotion recognition are distinctly modulated by social anxiety and autistic traits. Sci Rep 2023; 13:9730. [PMID: 37322077] [PMCID: PMC10272184] [DOI: 10.1038/s41598-023-35773-6]
Abstract
Facial mimicry as well as the accurate assessment of one's performance when judging others' emotional expressions have been suggested to inform successful emotion recognition. Differences in the integration of these two information sources might explain alterations in the perception of others' emotions in individuals with Social Anxiety Disorder and individuals on the autism spectrum. Using a non-clinical sample (N = 57), we examined the role of social anxiety and autistic traits in the link between facial mimicry, or confidence in one's performance, and emotion recognition. While participants were presented with videos of spontaneous emotional facial expressions, we measured their facial muscle activity, asked them to label the expressions and indicate their confidence in accurately labelling the expressions. Our results showed that confidence in emotion recognition was lower with higher social anxiety traits even though actual recognition was not related to social anxiety traits. Higher autistic traits, in contrast, were associated with worse recognition, and a weakened link between facial mimicry and performance. Consequently, high social anxiety traits might not affect emotion recognition itself, but the top-down evaluation of own abilities in emotion recognition contexts. High autistic traits, in contrast, may be related to lower integration of sensorimotor simulations, which promote emotion recognition.
Affiliation(s)
- Julia Folz
  - Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, The Netherlands
  - Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, The Netherlands
- Rüya Akdağ
  - Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, The Netherlands
  - Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, The Netherlands
- Milica Nikolić
  - Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, The Netherlands
  - Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, The Netherlands
  - Research Institute of Child Development and Education, University of Amsterdam, Amsterdam, The Netherlands
- Henk van Steenbergen
  - Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, The Netherlands
  - Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, The Netherlands
- Mariska E Kret
  - Department of Cognitive Psychology, Institute of Psychology, Leiden University, Leiden, The Netherlands
  - Leiden Institute for Brain and Cognition (LIBC), Leiden University, Leiden, The Netherlands
4
Hedlund Å. Autistic nurses: do they exist? Br J Nurs 2023; 32:210-214. [PMID: 36828568] [DOI: 10.12968/bjon.2023.32.4.210]
Abstract
Autism spectrum disorder is an increasingly common diagnosis on a global scale. Despite limitations related to the diagnosis, many people with autism are active in the workforce, often within the health care sector, and it is reasonable to assume that some of them are nurses. There are very few examples of nurses with autism in the literature, mostly in non-scientific contexts, and these mention both autism-related strengths and limitations at work. A conclusion is that research about nurses with autism is almost non-existent, and it is high time to conduct explorative research in this area. If employers are given the knowledge and the ability to support the needs of nurses with autism, it is likely to benefit the health of the individual nurse, the psychosocial working climate and patient safety.
Affiliation(s)
- Åsa Hedlund
  - PhD student, Department of Caring Sciences, University of Gävle, Sweden
5
Quinde-Zlibut J, Munshi A, Biswas G, Cascio CJ. Identifying and describing subtypes of spontaneous empathic facial expression production in autistic adults. J Neurodev Disord 2022; 14:43. [PMID: 35915404] [PMCID: PMC9342940] [DOI: 10.1186/s11689-022-09451-z]
Abstract
BACKGROUND It is unclear whether atypical patterns of facial expression production metrics in autism reflect the dynamic and nuanced nature of facial expressions across people or a true diagnostic difference. Furthermore, the heterogeneity observed across autism symptomatology suggests a need for more adaptive and personalized social skills programs. Towards this goal, it would be useful to have a more concrete and empirical understanding of the different expressiveness profiles within the autistic population and how they differ from neurotypicals. METHODS We used automated facial coding and an unsupervised clustering approach to limit inter-individual variability in facial expression production that may have otherwise obscured group differences in previous studies, allowing an "apples-to-apples" comparison between autistic and neurotypical adults. Specifically, we applied k-means clustering to identify subtypes of facial expressiveness in an autism group (N = 27) and a neurotypical control group (N = 57) separately. The two most stable clusters from these analyses were then further characterized and compared based on their expressiveness and emotive congruence to emotionally charged stimuli. RESULTS Our main finding was that a subset of autistic adults in our sample show heightened spontaneous facial expressions irrespective of image valence. We did not find evidence for greater incongruous (i.e., inappropriate) facial expressions in autism. Finally, we found a negative trend between expressiveness and emotion recognition within the autism group. CONCLUSION The results from our previous study on self-reported empathy and current expressivity findings point to a higher degree of facial expressions recruited for emotional resonance in autism that may not always be adaptive (e.g., experiencing similar emotional resonance regardless of valence). 
These findings also build on previous work indicating that facial expression intensity is not diminished in autism and suggest the need for intervention programs to focus on emotion recognition and social skills in the context of both negative and positive emotions.
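The clustering step described above (k-means on automated facial coding features, then comparing the most stable clusters) can be sketched with a minimal Lloyd's-algorithm implementation on synthetic data. The two "expressiveness" features and the group sizes below are hypothetical illustrations, not values from the study:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's k-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids, keeping the old one if a cluster empties
        centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return centroids, labels

rng = np.random.default_rng(1)
# hypothetical per-participant features: mean expression intensity
# for positive vs. negative images
typical = rng.normal([0.2, 0.2], 0.05, size=(20, 2))     # lower expressiveness
heightened = rng.normal([0.8, 0.8], 0.05, size=(10, 2))  # heightened, valence-independent
X = np.vstack([typical, heightened])

_, labels = kmeans(X, k=2)
print(sorted(np.bincount(labels).tolist()))  # two subtypes, sizes [10, 20]
```

In practice, cluster stability would be assessed by re-running with different seeds or resampled participants and checking that the same partition recurs, as the abstract's "most stable clusters" phrasing implies.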
Affiliation(s)
- Jennifer Quinde-Zlibut
  - Graduate Program in Neuroscience, Vanderbilt University, Nashville, USA
  - Frist Center for Autism and Innovation, Vanderbilt University, Nashville, USA
- Anabil Munshi
  - Institute for Software Integrated Systems, Vanderbilt University, Nashville, USA
- Gautam Biswas
  - Institute for Software Integrated Systems, Vanderbilt University, Nashville, USA
- Carissa J. Cascio
  - Department of Psychiatry and Behavioral Sciences, Vanderbilt University Medical Center, Nashville, USA
6
Zhang Y, Li D, Yang T, Chen C, Li H, Zhu C. Characteristics of emotional gaze on threatening faces in children with autism spectrum disorders. Front Psychiatry 2022; 13:920821. [PMID: 36072450] [PMCID: PMC9441573] [DOI: 10.3389/fpsyt.2022.920821]
Abstract
Most evidence suggests that individuals with autism spectrum disorder (ASD) show gaze avoidance when looking at the eyes compared with typically developing (TD) individuals. Children with ASD magnify their fears when receiving threatening stimuli, resulting in a reduced duration of eye contact. Few studies have explored the gaze characteristics of children with ASD by dividing emotional faces into threatening and non-threatening pairs. In addition, although dynamic videos are more helpful for understanding the gaze characteristics of children with ASD, the experimental stimuli in some previous studies were still emotional pictures. We explored the viewing of dynamic threatening and non-threatening faces by children with ASD in different areas of interest (AOIs). In this study, children aged 6-10 years with and without ASD viewed faces with threatening (fearful and angry) and non-threatening (sad and happy) expressions while their eye movements were recorded. The results showed that when confronted with threatening faces, children with ASD, but not TD children, showed substantial eye avoidance, together with non-specific reductions in fixation time on the mouth and significantly less time gazing at the mouth across all emotions; this was not observed for non-threatening faces. No correlations were found between symptom severity and the characteristics of gaze at the eyes and mouth in children with ASD. These results further enhance the understanding of the gaze characteristics of children with ASD on threatening and non-threatening faces and may provide additional evidence for improving their social interactions.
Affiliation(s)
- Yifan Zhang
  - The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
- Dandan Li
  - The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, China
  - Department of Neurology, First Affiliated Hospital, Anhui Medical University, Hefei, China
- Tingting Yang
  - The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
- Chuanao Chen
  - Anhui Province Hefei Kang Hua Rehabilitation Hospital, Hefei, China
- Hong Li
  - Anhui Hospital Affiliated to the Pediatric Hospital of Fudan University, Hefei, China
- Chunyan Zhu
  - The School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, China
7
Volovik DD, Omelchenko MA, Ivanova AM. Emotional Response to Humour Perception and Gelotophobia Among Healthy Individuals and Patients with Schizophrenia and Depression, with Signs of a High Clinical Risk of Psychosis. Consortium Psychiatricum 2021; 2:8-17. [PMID: 38601097] [PMCID: PMC11003346] [DOI: 10.17816/cp65]
Abstract
Introduction Investigating early changes in the emotional sphere over the course of schizophrenia is a promising direction in clinical psychology and psychiatry. The intactness of positive emotions, in particular humour perception, may be an important resource for patients. At the same time, humour perception is sensitive to pathological conditions such as the fear of being laughed at, known as gelotophobia. Those with gelotophobia perceive laughter as dangerous rather than pleasant, and can hardly distinguish between teasing and ridicule. Gelotophobia has been confirmed to be elevated among people with mental disorders; nonetheless, most knowledge about the fear of being laughed at comes from non-clinical samples. Objectives The aim of the study was to provide more clinical data on gelotophobia manifestations associated with schizophrenia spectrum disorders; the emotional responses and facial expressions of patients with gelotophobia were studied, in particular regarding their perception of humour, including during the early stages of disorders, in comparison with healthy individuals. Methods Thirty controls (n=30) and 32 patients (n=32) with schizophrenia or with depression with signs of a high clinical risk of psychosis took part. Two short videos, one comic and one neutral, were shown to the participants while their facial expressions were videotaped; each video was followed by a self-reported measure of emotional response. Participants also completed the State-Trait Anxiety Inventory, the PhoPhiKat-30 and the Toronto Alexithymia Scale. Results Gelotophobia was significantly higher in the clinical group. It correlated with a lower frequency of grins among the patients during the comic video, while this was not the case in the control group. Gelotophobia was related to state and trait anxiety in both groups, but only in the clinical group did state anxiety increase after watching the comic video. Gelotophobia correlated with alexithymia and was twice as high among the patients as among the controls. Conclusion Gelotophobia thus has not only quantitative but also qualitative specifics in patients with schizophrenia, and in those with depression with signs of a high clinical risk of psychosis, compared with healthy controls.
8
Van der Donck S, Vettori S, Dzhelyova M, Mahdi SS, Claes P, Steyaert J, Boets B. Investigating automatic emotion processing in boys with autism via eye tracking and facial mimicry recordings. Autism Res 2021; 14:1404-1420. [PMID: 33704930] [DOI: 10.1002/aur.2490]
Abstract
Difficulties in automatic emotion processing in individuals with autism spectrum disorder (ASD) might remain concealed in behavioral studies due to compensatory strategies. To gain more insight into the mechanisms underlying facial emotion recognition, we recorded eye-tracking and facial mimicry data from 20 school-aged boys with ASD and 20 matched typically developing controls while they performed an explicit emotion recognition task. Proportional looking times to specific face regions (eyes, nose, and mouth) and face exploration dynamics were analyzed. In addition, facial mimicry was assessed. Boys with ASD and controls were equally capable of recognizing expressions and did not differ in proportional looking times or in the number and duration of fixations. Yet, specific facial expressions elicited particular gaze patterns, especially within the control group. Both groups showed similar face scanning dynamics, although boys with ASD demonstrated smaller saccadic amplitudes. Regarding facial mimicry, we found no emotion-specific facial responses and no group differences in the responses to the displayed facial expressions. Our results indicate that boys with and without ASD employ similar eye gaze strategies to recognize facial expressions. Smaller saccadic amplitudes in boys with ASD might indicate a less exploratory face processing strategy. Yet, this slightly more persistent visual scanning behavior in boys with ASD does not imply less efficient processing of emotional information, given the similar behavioral performance. Results on the facial mimicry data indicate similar facial responses to emotional faces in boys with and without ASD. LAY SUMMARY: We investigated (i) whether boys with and without autism apply different face exploration strategies when recognizing facial expressions and (ii) whether they mimic the displayed facial expression to a similar extent.
We found that boys with and without ASD recognize facial expressions equally well, and that both groups show similar facial reactions to the displayed facial emotions. Yet, boys with ASD visually explored the faces slightly less than the boys without ASD.
Affiliation(s)
- Stephanie Van der Donck
  - Center for Developmental Psychiatry, Department of Neurosciences, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Sofie Vettori
  - Center for Developmental Psychiatry, Department of Neurosciences, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Milena Dzhelyova
  - Institute of Research in Psychological Sciences, Institute of Neuroscience, Université de Louvain, Louvain-La-Neuve, Belgium
- Soha Sadat Mahdi
  - Medical Imaging Research Center, MIRC, Leuven, Belgium
  - Department of Electrical Engineering (ESAT/PSI), KU Leuven, Leuven, Belgium
- Peter Claes
  - Medical Imaging Research Center, MIRC, Leuven, Belgium
  - Department of Electrical Engineering (ESAT/PSI), KU Leuven, Leuven, Belgium
  - Department of Human Genetics, KU Leuven, Leuven, Belgium
- Jean Steyaert
  - Center for Developmental Psychiatry, Department of Neurosciences, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
- Bart Boets
  - Center for Developmental Psychiatry, Department of Neurosciences, KU Leuven, Leuven, Belgium
  - Leuven Autism Research (LAuRes), KU Leuven, Leuven, Belgium
9
Briot K, Pizano A, Bouvard M, Amestoy A. New Technologies as Promising Tools for Assessing Facial Emotion Expressions Impairments in ASD: A Systematic Review. Front Psychiatry 2021; 12:634756. [PMID: 34025469] [PMCID: PMC8131507] [DOI: 10.3389/fpsyt.2021.634756]
Abstract
The abilities to recognize and express emotions from facial expressions are essential for successful social interactions. Facial Emotion Recognition (FER) and Facial Emotion Expressions (FEEs), both of which seem to be impaired in Autism Spectrum Disorder (ASD) and contribute to socio-communicative difficulties, form part of the diagnostic criteria for ASD. Only a few studies have focused on FEE processing, and the rare behavioral studies of FEEs in ASD have yielded mixed results. Here, we review studies comparing the production of FEEs between participants with ASD and non-ASD control subjects, with a particular focus on the use of automatic facial expression analysis software. A systematic literature search in accordance with the PRISMA statement identified 20 reports published up to August 2020 concerning the use of new technologies to evaluate both spontaneous and voluntary FEEs in participants with ASD. Overall, the results highlight the importance of considering socio-demographic factors and psychiatric co-morbidities, which may explain the previous inconsistent findings, particularly regarding quantitative data on spontaneous facial expressions. There is also reported evidence for an inadequacy of FEEs in individuals with ASD relative to the expected emotion, with lower quality and coordination of facial muscle movements. Spatial and kinematic approaches to characterizing the synchrony, symmetry and complexity of facial muscle movements thus offer clues for identifying and exploring promising new diagnostic targets. These findings support the hypothesis that there may be mismatches between mental representations and the production of FEEs themselves in ASD. Such considerations are in line with a Facial Feedback Hypothesis deficit in ASD as part of the Broken Mirror Theory, with the results suggesting impairments of the neural sensory-motor systems involved in processing emotional information and ensuring embodied representations of emotions, which are the basis of human empathy. In conclusion, new technologies are promising tools for evaluating the production of FEEs in individuals with ASD, and controlled studies involving larger samples of patients, with possible confounding factors taken into account, should be conducted in order to better understand and address the difficulties in global emotional processing in ASD.
Affiliation(s)
- Kellen Briot
  - Medical Sciences Department, University of Bordeaux, Bordeaux, France
  - Pôle Universitaire de Psychiatrie de l'Enfant et de l'Adolescent, Centre Hospitalier Charles-Perrens, Bordeaux, France
  - Aquitaine Institute for Cognitive and Integrative Neuroscience (INCIA), UMR 5287, CNRS, Bordeaux, France
- Adrien Pizano
  - Medical Sciences Department, University of Bordeaux, Bordeaux, France
  - Pôle Universitaire de Psychiatrie de l'Enfant et de l'Adolescent, Centre Hospitalier Charles-Perrens, Bordeaux, France
  - Aquitaine Institute for Cognitive and Integrative Neuroscience (INCIA), UMR 5287, CNRS, Bordeaux, France
- Manuel Bouvard
  - Medical Sciences Department, University of Bordeaux, Bordeaux, France
  - Pôle Universitaire de Psychiatrie de l'Enfant et de l'Adolescent, Centre Hospitalier Charles-Perrens, Bordeaux, France
  - Aquitaine Institute for Cognitive and Integrative Neuroscience (INCIA), UMR 5287, CNRS, Bordeaux, France
- Anouck Amestoy
  - Medical Sciences Department, University of Bordeaux, Bordeaux, France
  - Pôle Universitaire de Psychiatrie de l'Enfant et de l'Adolescent, Centre Hospitalier Charles-Perrens, Bordeaux, France
  - Aquitaine Institute for Cognitive and Integrative Neuroscience (INCIA), UMR 5287, CNRS, Bordeaux, France
10
Yang T, Li D, Zhang Y, Zhang L, Li H, Ji GJ, Yang Z, Zhang L, Zhu C, Wang K. Eye Avoidance of Threatening Facial Expressions in Parents of Children with ASD. Neuropsychiatr Dis Treat 2021; 17:1869-1879. [PMID: 34140771] [PMCID: PMC8203098] [DOI: 10.2147/ndt.s300491]
Abstract
OBJECTIVE Previous research found that autism spectrum disorder (ASD) is characterized by eye avoidance of threatening facial expressions. However, it remains unclear whether these abnormalities are present in parents of children with ASD. Our study aimed to investigate the gaze patterns of parents of children with ASD when viewing threatening facial expressions. METHODS Thirty-four parents of children with ASD and 35 parents of typically developing (TD) children participated in our study. We investigated participants' total fixation time when they viewed videos of different facial expressions (eg, happy, fearful, angry, sad) and examined changes in fixation duration over time. RESULTS We observed the following: a) the total fixation time of the parents of children with ASD on the eyes of fearful faces was significantly shorter than that of the control group, and the difference lasted for five seconds (four to six seconds, eight to nine seconds) throughout the process; and b) the parents of children with ASD avoided the eyes of angry faces at around five seconds after stimulus onset. CONCLUSION We conclude that parents of children with ASD tended to avoid the eyes of threatening facial expressions while viewing dynamic emotion videos.
Affiliation(s)
- Tingting Yang
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
- Dandan Li
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
  - Department of Neurology, the First Affiliated Hospital of Anhui Medical University, Hefei, 230022, People's Republic of China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, 230022, People's Republic of China
- Yifan Zhang
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
- Long Zhang
  - Department of Neurology, the First Affiliated Hospital of Anhui Medical University, Hefei, 230022, People's Republic of China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, 230022, People's Republic of China
- Hong Li
  - Department of Neurological Rehabilitation of Children, Anhui Provincial Children's Hospital, Hefei, 230022, People's Republic of China
- Gong-Jun Ji
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
  - Department of Neurology, the First Affiliated Hospital of Anhui Medical University, Hefei, 230022, People's Republic of China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, 230022, People's Republic of China
- Zhenhai Yang
  - Department of Rehabilitation Therapy, The First Clinical Medical College of Anhui Medical University, Hefei, 230022, People's Republic of China
- Lei Zhang
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, 230022, People's Republic of China
- Chunyan Zhu
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, 230022, People's Republic of China
- Kai Wang
  - School of Mental Health and Psychological Sciences, Anhui Medical University, Hefei, 230022, People's Republic of China
  - Department of Neurology, the First Affiliated Hospital of Anhui Medical University, Hefei, 230022, People's Republic of China
  - Anhui Province Key Laboratory of Cognition and Neuropsychiatric Disorders, Hefei, 230022, People's Republic of China
  - Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, 230022, People's Republic of China
11
Vandewouw MM, Choi E, Hammill C, Arnold P, Schachar R, Lerch JP, Anagnostou E, Taylor MJ. Emotional face processing across neurodevelopmental disorders: a dynamic faces study in children with autism spectrum disorder, attention deficit hyperactivity disorder and obsessive-compulsive disorder. Transl Psychiatry 2020; 10:375. [PMID: 33139709] [PMCID: PMC7608673] [DOI: 10.1038/s41398-020-01063-2]
Abstract
Autism spectrum disorder (ASD) is classically associated with poor face processing skills, yet evidence suggests that those with obsessive-compulsive disorder (OCD) and attention deficit hyperactivity disorder (ADHD) also have difficulties understanding emotions. We determined the neural underpinnings of dynamic emotional face processing across these three clinical paediatric groups, including developmental trajectories, compared with typically developing (TD) controls. We studied 279 children, 5-19 years of age; 57 were excluded due to excessive motion in fMRI, leaving 222: 87 with ASD, 44 with ADHD, 42 with OCD and 49 TD. Groups were sex- and age-matched. Dynamic faces (happy, angry) and dynamic flowers were presented in 18 pseudo-randomized blocks while fMRI data were collected with a 3T MRI. Group-by-age interactions and group difference contrasts were analysed for faces vs. flowers and between happy and angry faces. TD children demonstrated different activity patterns across the four contrasts; these patterns were more limited and distinct for the NDD groups. Processing happy and angry faces compared with flowers yielded similar activation in occipital regions in the NDD groups compared with TD children. Processing happy compared with angry faces showed an age-by-group interaction in the superior frontal gyrus, increasing with age for ASD and OCD and decreasing for TD. Children with ASD, ADHD and OCD differentiated less between dynamic faces and dynamic flowers, with most of the effects seen in occipital and temporal regions, suggesting that the emotional difficulties shared across NDDs may be partly attributed to shared atypical visual information processing.
Collapse
Affiliation(s)
- Marlee M Vandewouw
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
| | - EunJung Choi
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Bloorview Research Institute, University of Toronto, 150 Kilgour Road, Toronto, Canada
| | - Christopher Hammill
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
| | - Paul Arnold
- Mathison Centre for Mental Health Research & Education, Hotchkiss Brain Institute, Cumming School of Medicine, University of Calgary, Alberta, Canada
| | - Russell Schachar
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Department of Psychiatry, Hospital for Sick Children, Toronto, Canada
| | - Jason P Lerch
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Wellcome Centre for Integrative Neuroimaging, FMRIB, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, UK
- Department of Medical Biophysics, University of Toronto, Toronto, Canada
| | - Evdokia Anagnostou
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada
- Bloorview Research Institute, University of Toronto, 150 Kilgour Road, Toronto, Canada
| | - Margot J Taylor
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Canada.
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Canada.
- Department of Psychology, University of Toronto, Toronto, Canada.
- Department of Medical Imaging, University of Toronto, Toronto, Canada.
| |
Collapse
|
12
|
Vandewouw MM, Choi EJ, Hammill C, Lerch JP, Anagnostou E, Taylor MJ. Changing Faces: Dynamic Emotional Face Processing in Autism Spectrum Disorder Across Childhood and Adulthood. Biol Psychiatry Cogn Neurosci Neuroimaging 2020; 6:825-836. [PMID: 33279458 DOI: 10.1016/j.bpsc.2020.09.006] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/17/2020] [Revised: 08/17/2020] [Accepted: 09/04/2020] [Indexed: 11/19/2022]
Abstract
BACKGROUND Autism spectrum disorder (ASD) is classically associated with poor emotional face processing. Few studies, however, have used more ecologically valid dynamic stimuli. We contrasted functional magnetic resonance imaging measures of dynamic emotional face processing in ASD and typically developing (TD) cohorts across a wide age range to determine whether processing and age-related trajectories differed between participants with and without ASD. METHODS Functional magnetic resonance imaging data collected from 200 participants (5-42 years old; 107 in the ASD cohort, 93 in the TD cohort) during the presentation of dynamic emotional faces (neutral-to-happy, neutral-to-angry) and dynamic flowers (closed-to-open) were analyzed. Group differences and group-by-age interactions in the faces versus flowers and between-emotion contrasts were investigated. RESULTS Differences in activation between dynamic faces and flowers in occipital regions, including the fusiform gyri, were reduced in the ASD group. Contrasting the two emotions, ASD compared with TD participants showed increased engagement of the precentral, postcentral, and superior temporal gyri to happy faces and increased activation to angry faces occipitally. Emotion processing regions, such as the insula, temporal pole, and frontal regions, showed increased recruitment with age to happy faces compared with both angry faces and flowers in the TD group, but decreased recruitment with age in the ASD group. CONCLUSIONS Using dynamic stimuli, we demonstrated that participants with ASD processed faces similarly to nonface stimuli, and age-related atypicalities were more pronounced to happy faces in participants with ASD. We demonstrated emotion-specific atypicalities in a large group of participants with ASD that underscore persistent difficulties from childhood into mid-adulthood.
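The group-by-age interaction analysis that both of these fMRI studies report can be illustrated with a minimal least-squares sketch. This is synthetic toy data, not the authors' imaging pipeline; the function and variable names are illustrative assumptions. The interaction coefficient captures exactly the pattern described in the abstract: recruitment increasing with age in one group while decreasing in the other.

```python
import numpy as np

def interaction_slope(age, group, y):
    """Fit y = b0 + b1*group + b2*age + b3*(group*age) by ordinary
    least squares and return b3, the group-by-age interaction term.
    A nonzero b3 means the age trajectory differs between groups."""
    X = np.column_stack([np.ones_like(age), group, age, group * age])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[3]

# Toy data: activation rises with age in the TD group (slope +0.02)
# and falls with age in the ASD group (slope -0.02).
rng = np.random.default_rng(0)
age = np.tile(np.linspace(5.0, 42.0, 50), 2)
group = np.repeat([0.0, 1.0], 50)  # 0 = TD, 1 = ASD (coding is arbitrary)
y = 0.02 * age * (1 - group) - 0.02 * age * group + rng.normal(0, 0.01, 100)
b3 = interaction_slope(age, group, y)  # close to -0.04: opposite slopes
```

In a real voxel-wise analysis this fit is repeated at every voxel with motion and other nuisance regressors included, but the interaction term being tested has this same form.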
Collapse
Affiliation(s)
- Marlee M Vandewouw
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Ontario, Canada; Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Ontario, Canada; Autism Research Center, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, Ontario, Canada; Institute of Biomedical Engineering, University of Toronto, Toronto, Ontario, Canada.
| | - Eun Jung Choi
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Ontario, Canada; Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Ontario, Canada; Autism Research Center, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, Ontario, Canada
| | - Christopher Hammill
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Ontario, Canada
| | - Jason P Lerch
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Ontario, Canada; Department of Medical Biophysics, University of Toronto, Toronto, Ontario, Canada; Wellcome Centre for Integrative Neuroimaging, Oxford Centre for Functional MRI of the Brain, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
| | - Evdokia Anagnostou
- Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Ontario, Canada; Autism Research Center, Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, Ontario, Canada; Institute of Medical Science, University of Toronto, Toronto, Ontario, Canada
| | - Margot J Taylor
- Department of Diagnostic Imaging, Hospital for Sick Children, Toronto, Ontario, Canada; Program in Neurosciences & Mental Health, Hospital for Sick Children, Toronto, Ontario, Canada; Department of Psychology, University of Toronto, Toronto, Ontario, Canada; Department of Medical Imaging, University of Toronto, Toronto, Ontario, Canada
| |
Collapse
|
13
|
Computational Analysis of Deep Visual Data for Quantifying Facial Expression Production. Appl Sci (Basel) 2019. [DOI: 10.3390/app9214542] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
The computational analysis of facial expressions is an emerging research topic that could overcome the limitations of human perception and provide quick, objective outcomes in the assessment of neurodevelopmental disorders (e.g., Autism Spectrum Disorder, ASD). Unfortunately, there have been only a few attempts to quantify facial expression production; most of the scientific literature addresses the easier task of recognizing whether a facial expression is present or not. Some attempts to address this challenging task exist, but they do not provide a comprehensive study comparing human and automatic outcomes in quantifying children’s ability to produce basic emotions. Furthermore, these works do not exploit the latest advances in computer vision and machine learning, and they generally focus on a group of individuals that is homogeneous in terms of cognitive capabilities. To fill this gap, this paper integrates advanced computer vision and machine learning strategies into a framework for computationally analysing how both ASD and typically developing children produce facial expressions. The framework locates and tracks a set of landmarks (virtual electromyography sensors) to monitor the facial muscle movements involved in facial expression production. The outputs of these virtual sensors are then fused to model each individual’s ability to produce facial expressions. The computational outcomes were correlated with evaluations provided by psychologists, and the evidence shows that the proposed framework can be effectively exploited to analyse in depth the emotional competence of children with ASD in producing facial expressions.
Collapse
|