1
Zhou Y, Yao X, Han W, Li Y, Xue J, Li Z. Measurement of neuropsychiatric symptoms in the older adults with mild cognitive impairment based on speech and facial expressions: a cross-sectional observational study. Aging Ment Health 2024; 28:828-837. [PMID: 37970813 DOI: 10.1080/13607863.2023.2280913] [Received: 01/13/2023] [Accepted: 10/31/2023] [Indexed: 11/19/2023]
Abstract
OBJECTIVES To examine the association of speech and facial features with depression, anxiety, and apathy in older adults with mild cognitive impairment (MCI). METHODS Speech and facial expressions of 319 MCI patients were digitally recorded using audio and video recording software. The three most common neuropsychiatric symptoms (NPS) were evaluated with the Patient Health Questionnaire, the Generalized Anxiety Disorder scale, and the Apathy Evaluation Scale, respectively. Speech and facial features were extracted using open-source data analysis toolkits, and machine learning techniques were used to validate the diagnostic power of the extracted features. RESULTS Different speech and facial features were associated with specific NPS: depression with spectral and temporal features; anxiety and apathy with frequency, energy, spectral, and temporal features. Additionally, depression was associated with facial action units (AUs) 10, 12, 15, 17, and 25; anxiety with AUs 10, 15, 17, 25, 26, and 45; and apathy with AUs 5, 26, and 45. Significant differences in speech and facial features were observed between males and females. Based on machine learning models, the highest accuracies for detecting depression, anxiety, and apathy reached 95.8%, 96.1%, and 83.3% for males, and 87.8%, 88.2%, and 88.6% for females, respectively. CONCLUSION Depression, anxiety, and apathy were characterized by distinct speech and facial features. The machine learning models developed in this study demonstrated good classification performance in detecting depression, anxiety, and apathy, and a combination of audio and video may provide objective methods for the precise classification of these symptoms.
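The sex-stratified setup this abstract describes (separate models and accuracies per sex) can be sketched with a toy classifier. Everything below is invented for illustration: the feature values are synthetic, and a nearest-centroid model stands in for the study's unspecified machine learning techniques.

```python
# Toy sketch of sex-stratified symptom detection on extracted
# speech/facial feature vectors. Synthetic data; not the study's code.
import random

random.seed(0)

def nearest_centroid_accuracy(train, test):
    """Fit per-class centroids on `train`, score accuracy on `test`.
    Each sample is (feature_vector, label)."""
    sums, counts = {}, {}
    for x, y in train:
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    centroids = {y: [v / counts[y] for v in s] for y, s in sums.items()}

    def predict(x):
        # Assign to the closest class centroid (squared Euclidean distance).
        return min(centroids,
                   key=lambda y: sum((a - b) ** 2 for a, b in zip(x, centroids[y])))

    return sum(predict(x) == y for x, y in test) / len(test)

def make_samples(n, offset):
    """Synthetic two-class (symptom present / absent) feature vectors,
    deliberately well separated so the toy model classifies cleanly."""
    out = []
    for _ in range(n):
        label = random.choice([0, 1])
        base = offset + 3.0 * label
        out.append(([base + random.gauss(0, 0.3) for _ in range(4)], label))
    return out

results = {}
for sex, offset in [("male", 0.0), ("female", 1.0)]:  # one model per stratum
    data = make_samples(80, offset)
    results[sex] = nearest_centroid_accuracy(data[:60], data[60:])

print(results)
```

Because the synthetic classes are widely separated, the toy model scores perfectly; the point is only the structure of training and evaluating per stratum.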
Affiliation(s)
- Ying Zhou, School of Nursing, Shanghai Jiao Tong University, Shanghai, China
- Xiuyu Yao, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Wei Han, Department of Epidemiology and Biostatistics, Institute of Basic Medical Sciences, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Yingxin Li, Institute of Biomedical Engineering, Chinese Academy of Medical Sciences & Peking Union Medical College, Tianjin, China
- Jiajun Xue, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Zheng Li, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
2
Zhou Y, Han W, Yao X, Xue J, Li Z, Li Y. Developing a machine learning model for detecting depression, anxiety, and apathy in older adults with mild cognitive impairment using speech and facial expressions: A cross-sectional observational study. Int J Nurs Stud 2023; 146:104562. [PMID: 37531702 DOI: 10.1016/j.ijnurstu.2023.104562] [Received: 01/04/2023] [Revised: 06/23/2023] [Accepted: 07/01/2023] [Indexed: 08/04/2023]
Abstract
BACKGROUND Depression, anxiety, and apathy are highly prevalent in older people with preclinical dementia and mild cognitive impairment. These symptoms have also proven valuable in predicting the progression from mild cognitive impairment to dementia, enabling timely diagnosis and treatment. However, objective and reliable indicators to detect and distinguish depression, anxiety, and apathy are relatively scarce. OBJECTIVE This study aimed to develop a machine learning model to detect and distinguish depression, anxiety, and apathy based on speech and facial expressions. DESIGN An observational, cross-sectional study design. SETTING(S) The memory outpatient department of a tertiary hospital. PARTICIPANTS 319 older adults diagnosed with mild cognitive impairment. METHODS Depression, anxiety, and apathy were evaluated with the Patient Health Questionnaire, the Generalized Anxiety Disorder scale, and the Apathy Evaluation Scale, respectively. Speech and facial expressions of older adults with mild cognitive impairment were digitally captured using audio and video recording software. Open-source data analysis toolkits were used to extract speech, facial, and text features. Multiclass classification was used to develop classification models, and Shapley additive explanations were used to explain the contribution of each feature within the model. RESULTS The random forest method was used to develop a multiclass emotion classification model, which performed well with a weighted-average F1 score of 96.6%. The model also demonstrated high accuracy, precision, and recall of 87.4%, 86.6%, and 87.6%, respectively. CONCLUSIONS The machine learning model developed in this study demonstrated strong classification performance in detecting and differentiating depression, anxiety, and apathy. This innovative approach combines text, audio, and video to provide objective methods for the precise classification and remote monitoring of these symptoms in nursing practice.
REGISTRATION This study was registered at the Chinese Clinical Trial Registry (registration number: ChiCTR1900023892; registration date: June 19th, 2019).
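The weighted-average F1 score this abstract reports aggregates per-class F1 by class support. A minimal, self-contained illustration of that metric (the labels and predictions below are invented, not the study's data):

```python
# Weighted-average F1: each class's F1 is weighted by its share of the
# true labels. Invented four-class example mirroring the study's groups.
from collections import Counter

def weighted_f1(y_true, y_pred):
    classes = sorted(set(y_true))
    support = Counter(y_true)
    total = 0.0
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        total += support[c] / len(y_true) * f1  # weight by class support
    return total

# Four classes as in the study: depression, anxiety, apathy, none.
y_true = ["dep", "dep", "anx", "anx", "apa", "apa", "none", "none"]
y_pred = ["dep", "dep", "anx", "apa", "apa", "apa", "none", "none"]
score = weighted_f1(y_true, y_pred)
print(round(score, 3))
```

With one anxiety sample misclassified as apathy, the per-class F1 scores (2/3, 4/5, 1, 1) combine with equal supports to a weighted average of 13/15, about 0.867.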
Affiliation(s)
- Ying Zhou, School of Nursing, Shanghai Jiao Tong University, Shanghai, China
- Wei Han, Department of Epidemiology and Biostatistics, Institute of Basic Medical Sciences, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Xiuyu Yao, School of Nursing, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Jiajun Xue, School of Nursing, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Zheng Li, School of Nursing, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Yingxin Li, Institute of Biomedical Engineering, Chinese Academy of Medical Sciences and Peking Union Medical College, Tianjin, China
3
Alfalahi H, Dias SB, Khandoker AH, Chaudhuri KR, Hadjileontiadis LJ. A scoping review of neurodegenerative manifestations in explainable digital phenotyping. NPJ Parkinsons Dis 2023; 9:49. [PMID: 36997573 PMCID: PMC10063633 DOI: 10.1038/s41531-023-00494-0] [Received: 09/20/2022] [Accepted: 03/16/2023] [Indexed: 04/03/2023]
Abstract
Neurologists nowadays no longer view neurodegenerative diseases, like Parkinson's and Alzheimer's disease, as single entities, but rather as a spectrum of multifaceted symptoms with heterogeneous progression courses and treatment responses. The definition of the naturalistic behavioral repertoire of early neurodegenerative manifestations is still elusive, impeding early diagnosis and intervention. Central to this view is the role of artificial intelligence (AI) in reinforcing the depth of phenotypic information, thereby supporting the paradigm shift to precision medicine and personalized healthcare. This suggestion advocates the definition of disease subtypes in a new biomarker-supported nosology framework, yet without empirical consensus on standardization, reliability and interpretability. Although the well-defined neurodegenerative processes, linked to a triad of motor and non-motor preclinical symptoms, are detected by clinical intuition, we undertake an unbiased data-driven approach to identify different patterns of neuropathology distribution based on the naturalistic behavior data inherent to populations in-the-wild. We appraise the role of remote technologies in the definition of digital phenotyping specific to brain-, body- and social-level neurodegenerative subtle symptoms, emphasizing inter- and intra-patient variability powered by deep learning. As such, the present review endeavors to exploit digital technologies and AI to create disease-specific phenotypic explanations, facilitating the understanding of neurodegenerative diseases as "bio-psycho-social" conditions. Not only does this translational effort within explainable digital phenotyping foster the understanding of disease-induced traits, but it also enhances diagnostic and, eventually, treatment personalization.
Affiliation(s)
- Hessa Alfalahi, Department of Biomedical Engineering, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates; Healthcare Engineering Innovation Center (HEIC), Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates
- Sofia B Dias, Department of Biomedical Engineering, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates; Healthcare Engineering Innovation Center (HEIC), Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates; CIPER, Faculdade de Motricidade Humana, University of Lisbon, Lisbon, Portugal
- Ahsan H Khandoker, Department of Biomedical Engineering, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates; Healthcare Engineering Innovation Center (HEIC), Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates
- Kallol Ray Chaudhuri, Parkinson Foundation, International Center of Excellence, King's College London, Denmark Hill, London, UK; Department of Basic and Clinical Neurosciences, Institute of Psychiatry, Psychology and Neuroscience, King's College London, De Crespigny Park, London, UK
- Leontios J Hadjileontiadis, Department of Biomedical Engineering, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates; Healthcare Engineering Innovation Center (HEIC), Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates; Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki, Greece
4
Zhou Y, Yao X, Han W, Wang Y, Li Z, Li Y. Distinguishing apathy and depression in older adults with mild cognitive impairment using text, audio, and video based on multiclass classification and shapely additive explanations. Int J Geriatr Psychiatry 2022; 37. [PMID: 36284449 DOI: 10.1002/gps.5827] [Received: 07/07/2022] [Accepted: 09/29/2022] [Indexed: 11/06/2022]
Abstract
OBJECTIVES This study aimed to develop a classification model to detect and distinguish apathy and depression based on text, audio, and video features, and to use the Shapley additive explanations (SHAP) toolkit to increase model interpretability. METHODS Subjective scales and objective experiments were administered to 319 mild cognitive impairment (MCI) patients to measure apathy and depression. The MCI patients were classified into four groups: depression only, apathy only, depressed-apathetic, and normal. Speech, facial, and text features were extracted using open-source data analysis toolkits. Multiclass classification and the SHAP toolkit were used to develop a classification model and explain the contribution of specific features. RESULTS The macro-averaged F1 score and accuracy of the overall model were 0.91 and 0.90, respectively. The accuracies for the apathetic, depressed, depressed-apathetic, and normal groups were 0.98, 0.88, 0.93, and 0.82, respectively. The SHAP toolkit identified speech features (Mel-frequency cepstral coefficient (MFCC) 4, spectral slopes, F0, F1), facial features (action units (AU) 14, 26, 28, 45), and a text feature (text 6 semantic) associated with apathy, while speech features (spectral slopes, shimmer, F0) and facial features (AU 2, 6, 7, 10, 14, 26, 45) were associated with depression. Apart from the shared features above, additional speech (MFCC 2, loudness) and facial (AU 9) features were observed in the depressed-apathetic group. CONCLUSIONS Apathy and depression shared some verbal and facial features while also exhibiting distinct ones. A combination of text, audio, and video could be used to improve the early detection and differential diagnosis of apathy and depression in MCI patients.
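SHAP attributes a model's prediction to its input features. For a handful of features the underlying Shapley value can be computed exactly by enumerating feature coalitions, which is what real SHAP toolkits approximate efficiently. The linear "model" and feature values below are invented for illustration:

```python
# Exact Shapley values by subset enumeration. Features absent from a
# coalition are replaced by their baseline value. Toy model; not SHAP
# library code.
from itertools import combinations
from math import factorial

def shapley_values(model, x, baseline):
    n = len(x)
    values = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for subset in combinations(others, k):
                # Shapley weight for a coalition of size k (excluding i).
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                with_i = [x[j] if j in subset or j == i else baseline[j] for j in range(n)]
                without_i = [x[j] if j in subset else baseline[j] for j in range(n)]
                values[i] += weight * (model(with_i) - model(without_i))
    return values

# Linear model over three hypothetical features (say, an MFCC, an AU
# intensity, and a text score). For a linear model the Shapley value of
# feature i reduces to w_i * (x_i - baseline_i).
w = [2.0, -1.0, 0.5]
model = lambda z: sum(wi * zi for wi, zi in zip(w, z))
x, baseline = [1.0, 2.0, 4.0], [0.0, 0.0, 0.0]
phi = shapley_values(model, x, baseline)
print([round(v, 6) for v in phi])
```

The "efficiency" property holds by construction: the attributions sum to `model(x) - model(baseline)`, which is what makes SHAP outputs read as an additive decomposition of a single prediction.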
Affiliation(s)
- Ying Zhou, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Xiuyu Yao, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Wei Han, Institute of Basic Medical Sciences, Chinese Academy of Medical Sciences, School of Basic Medicine Peking Union Medical College, Beijing, China
- Yidan Wang, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Zheng Li, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, China
- Yingxin Li, Institute of Biomedical Engineering, Chinese Academy of Medical Sciences & Peking Union Medical College, Tianjin, China
5
Changes in Computer-Analyzed Facial Expressions with Age. Sensors 2021; 21:4858. [PMID: 34300600 PMCID: PMC8309819 DOI: 10.3390/s21144858] [Received: 04/17/2021] [Revised: 07/13/2021] [Accepted: 07/15/2021] [Indexed: 11/17/2022]
Abstract
Facial expressions are well known to change with age, but the quantitative properties of facial aging remain unclear. In the present study, we investigated differences in the intensity of facial expressions between older (n = 56) and younger (n = 113) adults. In laboratory experiments, posed facial expressions were elicited from the participants using six basic emotions and neutral facial expression stimuli, and the intensities of their expressions were analyzed with a computer vision tool, the OpenFace software. Our results showed that the older adults produced stronger expressions for some negative emotions and for neutral faces. Furthermore, when making facial expressions, older adults used more facial muscles than younger adults across emotions. These results may help in understanding the characteristics of facial expressions in aging and can provide empirical evidence for other fields regarding facial recognition.
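The kind of comparison this abstract describes, group differences in action-unit (AU) intensity, can be sketched as below. The per-frame rows are invented stand-ins; in real use they would be loaded from OpenFace's CSV output (which reports AU intensities in columns such as AU01_r), and the 0.5 "active" threshold is an arbitrary choice for the sketch:

```python
# Compare mean AU intensities between age groups and count how many AUs
# each group "uses" above a threshold. All numbers are invented.
AUS = ["AU01_r", "AU04_r", "AU12_r", "AU15_r"]

# Invented per-frame intensity rows, two frames per group.
frames = {
    "older":   [[1.2, 0.9, 0.4, 1.1], [1.0, 1.1, 0.6, 0.9]],
    "younger": [[0.8, 0.2, 0.5, 0.1], [0.6, 0.3, 0.7, 0.2]],
}

def mean_intensity(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(AUS))]

summary = {}
for group, rows in frames.items():
    means = mean_intensity(rows)
    summary[group] = {
        "means": dict(zip(AUS, means)),
        # Rough proxy for "more facial muscles used": AUs whose mean
        # intensity exceeds an arbitrary 0.5 threshold.
        "active_aus": sum(m > 0.5 for m in means),
    }

print(summary["older"]["active_aus"], summary["younger"]["active_aus"])
```

With these invented rows the older group activates more AUs than the younger group, mirroring the direction of the abstract's finding without reproducing its data.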
6
Manera V, Galperti G, Rovini E, Zeghari R, Mancioppi G, Fiorini L, Gros A, Mouton A, Robert P, Cavallo F. Grasping Social Apathy: The Role of Reach-To-Grasp Action Kinematics for the Assessment of Social Apathy in Mild Neurocognitive Disorders. J Alzheimers Dis 2021; 81:569-582. [PMID: 33814424 DOI: 10.3233/jad-200966] [Indexed: 11/15/2022]
Abstract
BACKGROUND Social apathy, a reduction of initiative in proposing or engaging in social activities or interactions, is common in mild neurocognitive disorders (MND). Current apathy assessment relies on self-reports or clinical scales, but growing attention is devoted to defining more objective, measurable, and non-invasive apathy proxies. OBJECTIVE In the present study we investigated the value of recording action kinematics in a social reach-to-grasp task for the assessment of social apathy. METHODS Thirty participants took part in the study: 11 healthy controls (HC; 6 females, mean age = 68.3±10.5 years) and 19 subjects with MND (13 females, mean age = 75.7±6.3 years). Based on the Diagnostic Criteria for Apathy, MND subjects were classified as socially apathetic (A-MND, N = 9) versus non-apathetic (NA-MND, N = 10). SensRing, a ring-shaped wearable sensor, was placed on their index finger, and subjects were asked to reach and grasp a can to place it into a cup (individual condition) or pass it to a partner (social condition). RESULTS In the reach-to-grasp phase of the action, HC and NA-MND showed different acceleration and velocity profiles in the social versus the individual condition; no such differences were found for A-MND. CONCLUSION Previous studies showed the value of recording patients' level of weekly motor activity for apathy assessment. Here we showed that a 10-minute reach-to-grasp task may provide information to differentiate socially apathetic from non-apathetic subjects with MND, providing a tool easily usable in clinical practice. Future studies with a larger sample are needed to better characterize these findings.
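Deriving velocity profiles from a wearable inertial sensor, as in the SensRing task, amounts to integrating the acceleration stream. The sketch below uses invented acceleration samples and an assumed 100 Hz sampling rate; real processing would also handle gravity compensation and integration drift, which are omitted here:

```python
# Velocity profile from acceleration samples via cumulative trapezoidal
# integration, then peak velocity per condition. Invented data.
DT = 0.01  # assumed 100 Hz sampling interval (s)

def velocity_profile(accel):
    """Integrate acceleration (m/s^2) into velocity (m/s), starting at rest."""
    v, out = 0.0, [0.0]
    for a0, a1 in zip(accel, accel[1:]):
        v += 0.5 * (a0 + a1) * DT  # trapezoidal step
        out.append(v)
    return out

# Invented bell-shaped acceleration traces (speed up, then brake),
# one per experimental condition.
individual = [0, 2, 4, 2, 0, -2, -4, -2, 0]
social     = [0, 1, 2, 1, 0, -1, -2, -1, 0]

peaks = {
    "individual": max(velocity_profile(individual)),
    "social": max(velocity_profile(social)),
}
print(peaks)
```

Peak velocity, time to peak, and similar profile summaries are the kinds of kinematic summaries that condition-contrast analyses like the one in this study compare between the social and individual conditions.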
Affiliation(s)
- Valeria Manera, CoBTeK Laboratory, Université Cote d'Azur, Nice, France; IA Association, Nice, France
- Guenda Galperti, BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera, Italy; Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy
- Erika Rovini, BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera, Italy; Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy
- Radia Zeghari, CoBTeK Laboratory, Université Cote d'Azur, Nice, France; IA Association, Nice, France
- Gianmaria Mancioppi, BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera, Italy; Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy
- Laura Fiorini, Department of Industrial Engineering, University of Florence, Florence, Italy
- Auriane Gros, CoBTeK Laboratory, Université Cote d'Azur, Nice, France; IA Association, Nice, France; Department of Speech Therapy, Université Cote d'Azur, Nice, France
- Aurélie Mouton, CoBTeK Laboratory, Université Cote d'Azur, Nice, France; IA Association, Nice, France; Centre Mémoire de Ressources et de Recherche, Nice University Hospital, Nice, France
- Philippe Robert, CoBTeK Laboratory, Université Cote d'Azur, Nice, France; IA Association, Nice, France; Centre Mémoire de Ressources et de Recherche, Nice University Hospital, Nice, France
- Filippo Cavallo, Department of Industrial Engineering, University of Florence, Florence, Italy; BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera, Italy; Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy