1
Kong HH, Shin K, Yang DS, Gu HY, Joo HS, Shon HC. Digital assessment of walking ability: Validity and reliability of the automated figure-of-eight walk test in older adults. PLoS One 2025; 20:e0316612. PMID: 39928640; PMCID: PMC11809801; DOI: 10.1371/journal.pone.0316612.
Abstract
BACKGROUND The Figure-of-Eight Walk Test (F8WT) can assess straight- and curved-path walking ability, but the validity and reliability of automated measurement of the F8WT using a digital device have not yet been studied. The aim of this study was to verify the validity (method comparison) and test-retest reliability of the automated F8WT (aF8WT), measured by a digital device based on image analysis, by comparing the results of the aF8WT with those of the manual F8WT (mF8WT). METHODS Community-dwelling older adults underwent the mF8WT, performed by a physiotherapist, and the aF8WT, performed with the Digital Senior Fitness Test system. To assess test-retest reliability, the aF8WT was administered again to a randomly selected subgroup of participants one week after the baseline test. The intraclass correlation coefficient (ICC) and Pearson's correlation analysis were used to quantify the agreement and the correlation, respectively, between the mF8WT and aF8WT results. The 95% confidence interval (CI) of the limits of agreement (LoA) was obtained using Bland-Altman analysis. RESULTS The analysis included 83 participants (mean age 71.6 ± 4.7 years). The participants' mF8WT and aF8WT results were 29.1 ± 4.9 and 29.8 ± 4.9 seconds, respectively. Pearson's correlation analysis showed a very strong correlation between the mF8WT and aF8WT results (r = 0.91, p < 0.001), and the ICC between the mF8WT and aF8WT results was 0.95 (0.91-0.97), indicating excellent agreement. In the Bland-Altman analysis, the mean difference was -0.7 seconds (95% LoA: -4.8 to 3.3 seconds). In the test-retest reliability analysis of the aF8WT, participants' results were 30.9 ± 4.7 seconds (baseline) and 29.6 ± 4.9 seconds (retest), with an ICC of 0.94 (0.81-0.98, p < 0.001), indicating excellent reliability. CONCLUSION Automated measurement of the F8WT using a digital device showed excellent validity and reliability. The aF8WT can be used to assess and monitor the walking ability of community-dwelling older adults.
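The agreement statistics reported here (Pearson's r, ICC, Bland-Altman limits of agreement) follow standard method-comparison practice. The following is a minimal illustrative sketch, using hypothetical timing data rather than the study's measurements, of how the Pearson correlation and the Bland-Altman bias and limits of agreement can be computed:

```python
# Illustrative sketch (not the authors' code): agreement analysis between a
# manual and an automated timed test, using hypothetical arrays of seconds.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
manual = rng.normal(29.1, 4.9, size=83)             # hypothetical mF8WT times (s)
automated = manual + rng.normal(0.7, 2.0, size=83)  # hypothetical aF8WT times (s)

# Pearson correlation between the two measurement methods
r, p = pearsonr(manual, automated)

# Bland-Altman analysis: mean difference (bias) and 95% limits of agreement
diff = automated - manual
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)
print(f"r={r:.2f} (p={p:.3g}), bias={bias:.1f} s, LoA=({loa_low:.1f}, {loa_high:.1f}) s")
```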
Affiliation(s)
- Hyun-Ho Kong: Department of Rehabilitation Medicine, Chungbuk National University Hospital, Cheongju, Republic of Korea; Department of Rehabilitation Medicine, Chungbuk National University College of Medicine, Cheongju, Republic of Korea
- Kwangsoo Shin: Graduate School of Public Health and Healthcare Management, Songeui Medical Campus, The Catholic University of Korea, Seoul, Republic of Korea
- Dong-Seok Yang: Technology Strategy Center, Neofect, Seongnam, Republic of Korea
- Hye-Young Gu: Department of Rehabilitation Medicine, Chungbuk National University Hospital, Cheongju, Republic of Korea
- Hyeon-Seong Joo: Department of Physical Therapy, Daejeon University, Daejeon, Republic of Korea
- Hyun-Chul Shon: Department of Orthopaedic Surgery, Chungbuk National University Hospital, Cheongju, Republic of Korea; Department of Orthopaedic Surgery, Chungbuk National University College of Medicine, Cheongju, Republic of Korea
2
Halvorsen K, Peng W, Olsson F, Åberg AC. Two-step deep-learning identification of heel keypoints from video-recorded gait. Med Biol Eng Comput 2025; 63:229-237. PMID: 39292381; PMCID: PMC11695559; DOI: 10.1007/s11517-024-03189-7.
Abstract
Accurate and fast extraction of step parameters from video recordings of gait allows richer information to be obtained from clinical tests such as the Timed Up and Go. Current deep-learning methods are promising but lack the accuracy required for many clinical use cases. Extracting step parameters often depends on landmarks (keypoints) extracted on the feet. We hypothesize that such keypoints can be determined from video recordings with an accuracy relevant for clinical practice by combining an existing general-purpose pose estimation method (OpenPose) with custom convolutional neural networks (convnets) specifically trained to identify keypoints on the heel. The combined method finds keypoints on the posterior and lateral aspects of the heel in side-view and frontal-view images, from which step length and step width can be determined for calibrated cameras. Six candidate convnets were evaluated, combining three standard feature-extraction architectures (backbones) with two networks for predicting keypoints on the heel (head networks). Using transfer learning, the backbone networks were pre-trained on the ImageNet dataset, and the combined networks (backbone + head) were fine-tuned on data from 184 trials of older, unimpaired adults. The data were recorded at three locations and consisted of 193,000 side-view images and 110,000 frontal-view images. We evaluated the six models using the absolute distance on the floor between predicted keypoints and manually labelled keypoints. For the best-performing convnet, the median error was 0.55 cm and the 75th percentile was below 1.26 cm for data from the side-view camera. The predictions are overall accurate but show some outliers. The results indicate potential for future clinical use by automating a key step in markerless gait parameter extraction.
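As a rough illustration of the backbone-plus-head design described above (a sketch under assumptions, not the paper's architecture or training code), a pretrained feature extractor can be combined with a small regression head that outputs one 2D heel keypoint per image:

```python
# Illustrative sketch (assumptions, not the paper's code): an ImageNet-pretrained
# backbone fine-tuned with a small head that regresses one 2D heel keypoint.
import torch
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights

class HeelKeypointNet(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1)  # ImageNet pre-training
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop the classifier
        self.head = nn.Linear(512, 2)  # predict (x, y) of the heel keypoint

    def forward(self, x):
        f = self.features(x).flatten(1)
        return self.head(f)

model = HeelKeypointNet()
out = model(torch.randn(1, 3, 224, 224))  # dummy image -> predicted keypoint
print(out.shape)  # torch.Size([1, 2])
```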
Affiliation(s)
- Wei Peng: Department of Public Health and Caring Sciences, Uppsala University, Uppsala, Sweden
- Anna Cristina Åberg: School of Health and Welfare, Dalarna University, Falun, Sweden; Department of Public Health and Caring Sciences, Uppsala University, Uppsala, Sweden
3
Kim JH, Hong H, Lee K, Jeong Y, Ryu H, Kim H, Jang SH, Park HK, Han JY, Park HJ, Bae H, Oh BM, Kim WS, Lee SY, Lee SU. AI in evaluating ambulation of stroke patients: severity classification with video and functional ambulation category scale. Top Stroke Rehabil 2024:1-9. PMID: 38841903; DOI: 10.1080/10749357.2024.2359342.
Abstract
BACKGROUND Evaluating gait function and classifying severity in stroke patients are important for setting rehabilitation goals and exercise levels. Physicians often evaluate patients' walking ability qualitatively through visual gait analysis using the naked eye, video images, or standardized assessment tools. Gait evaluation through observation relies on the physician's empirical judgment and can therefore be subjective, so research establishing a basis for more objective judgment is needed. OBJECTIVE To verify a deep learning model that classifies gait image data of stroke patients according to the Functional Ambulation Category (FAC) scale. METHODS Gait video data from 203 stroke patients and 182 healthy individuals recruited from six medical institutions were collected to train a deep learning model for classifying gait severity in stroke patients. The recorded videos were processed using OpenPose. The dataset was randomly split into 80% for training and 20% for testing. RESULTS The deep learning model attained a training accuracy of 0.981 and a test accuracy of 0.903. Area under the curve (AUC) values were 0.93, 0.95, and 0.96 for discriminating the mild, moderate, and severe stroke groups, respectively. CONCLUSION These results confirm the potential of vision-based human pose estimation not only for developing gait parameter models but also for developing models that classify severity according to the FAC criteria used by physicians. Developing an AI-based severity classification model requires a large amount and variety of data, and data collected in non-standardized real-world environments, rather than only in laboratories, can also be used meaningfully.
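As a simplified, hypothetical sketch of this kind of evaluation pipeline (not the study's deep learning model; a classical classifier on pose-derived features stands in for it), the 80/20 split, test accuracy, and per-class one-vs-rest AUC could be computed as follows:

```python
# Illustrative sketch (hypothetical features and labels, not the study's model):
# classify gait severity from pose-derived features with an 80/20 split and
# report accuracy and one-vs-rest AUC.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(385, 20))     # hypothetical pose-derived gait features
y = rng.integers(0, 3, size=385)   # hypothetical labels: 0 mild, 1 moderate, 2 severe

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
auc = roc_auc_score(y_te, clf.predict_proba(X_te), multi_class="ovr")  # macro one-vs-rest AUC
print(f"test accuracy={acc:.3f}, macro AUC={auc:.3f}")
```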
Affiliation(s)
- Jeong-Hyun Kim: Department of Rehabilitation Medicine, Seoul Metropolitan Government Boramae Medical Center, Seoul, South Korea
- Hyeon Hong: Department of Rehabilitation Medicine, Seoul Metropolitan Government Boramae Medical Center, Seoul, South Korea
- Kyuwon Lee: Department of Rehabilitation Medicine, Seoul Metropolitan Government Boramae Medical Center, Seoul, South Korea
- Yeji Jeong: Department of Rehabilitation Medicine, Seoul Metropolitan Government Boramae Medical Center, Seoul, South Korea
- Hokyoung Ryu: Department of Graduate School of Technology and Innovation Management, Hanyang University, Seoul, South Korea
- Hyundo Kim: Department of Intelligence Computing, Hanyang University, Seoul, South Korea
- Seong-Ho Jang: Department of Rehabilitation Medicine, Hanyang University, Guri Hospital, Gyeonggi-do, South Korea
- Hyeng-Kyu Park: Department of Physical & Rehabilitation Medicine, Regional Cardiocerebrovascular Center, Center for Aging and Geriatrics, Chonnam National University Medical School & Hospital, Gwangju, South Korea
- Jae-Young Han: Department of Physical & Rehabilitation Medicine, Regional Cardiocerebrovascular Center, Center for Aging and Geriatrics, Chonnam National University Medical School & Hospital, Gwangju, South Korea
- Hye Jung Park: Department of Rehabilitation Medicine, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, South Korea
- Hasuk Bae: Department of Rehabilitation Medicine, Ewha Woman's University, Seoul, South Korea
- Byung-Mo Oh: Department of Rehabilitation, Seoul National University Hospital, Seoul, South Korea
- Won-Seok Kim: Department of Rehabilitation Medicine, Seoul National University College of Medicine, Seoul, South Korea
- Sang Yoon Lee: Department of Rehabilitation Medicine, Seoul National University College of Medicine, SMG-SNU Boramae Medical Center, Seoul, South Korea
- Shi-Uk Lee: Department of Rehabilitation Medicine, Seoul Metropolitan Government Boramae Medical Center, Seoul, South Korea; Department of Physical Medicine & Rehabilitation, College of Medicine, Seoul National University, Seoul, South Korea
4
Lorenzo-García P, Cavero-Redondo I, Núñez de Arenas-Arroyo S, Guzmán-Pavón MJ, Priego-Jiménez S, Álvarez-Bueno C. Effects of physical exercise interventions on balance, postural stability and general mobility in Parkinson's disease: a network meta-analysis. J Rehabil Med 2024; 56:jrm10329. PMID: 38298133; PMCID: PMC10847976; DOI: 10.2340/jrm.v56.10329.
Abstract
OBJECTIVE To assess which type of physical exercise intervention has the most beneficial effects on balance, postural stability and general mobility in patients with Parkinson's disease. These parameters were assessed using the Activities-specific Balance Confidence (ABC) scale, Berg Balance Scale (BBS), Mini-Balance Evaluation Systems Test (MiniBESTest) and Timed Up and Go Test (TUG). DESIGN Network meta-analysis. METHODS The PubMed, Cochrane Central Register of Controlled Trials, and Web of Science databases were searched up to August 2022 to identify randomized controlled trials on the effects of physical exercise interventions on balance, postural stability, and general mobility. The network meta-analysis included pairwise and indirect comparisons of results on the ABC scale, BBS, MiniBESTest, and TUG across 8 categories of physical exercise. RESULTS Eighty-six studies with a total of 4,693 patients were included. For the ABC scale, the indirect comparison showed that the highest effect size was observed for balance vs sensorimotor training not including endurance interventions (0.62; 95% confidence interval (95% CI) 0.06, 1.17). The highest effect sizes for the BBS were observed for alternative exercises (1.21; 95% CI 0.62, 1.81), body-weight-supported (BWS) interventions (1.31; 95% CI 0.57, 2.05), dance (1.18; 95% CI 0.33, 2.03) and sensorimotor training including endurance interventions (1.10; 95% CI 0.46, 1.75) vs control groups. Indirect comparisons showed that the highest effect sizes for the MiniBESTest were observed for balance (0.75; 95% CI 0.46, 1.04) and resistance (0.58; 95% CI 0.10, 1.07) training vs control groups. For the TUG, comparisons showed significant effect sizes for alternative exercises (-0.54; 95% CI -0.82, -0.26), balance (-0.42; 95% CI -0.75, -0.08), resistance (-0.60; 95% CI -0.89, -0.31), and sensorimotor training including endurance interventions (-0.61; 95% CI -0.95, -0.27) vs control comparisons. CONCLUSION Balance interventions improve balance, postural stability, and general mobility in people with Parkinson's disease. Moreover, alternative exercises, dance, BWS interventions, resistance training, and sensorimotor training, both including and not including endurance interventions, are also effective.
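The effect sizes above are standardized mean differences pooled across trials. As a small illustration of the underlying effect-size computation only (hypothetical group summaries, not data from the included trials), Hedges' g and an approximate 95% CI can be obtained as follows:

```python
# Illustrative sketch (hypothetical group summaries): a standardized mean
# difference (Hedges' g) with an approximate 95% CI, the kind of effect size
# reported in the network meta-analysis above.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))  # pooled SD
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor
    g = j * d
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))  # approximate SE
    return g, g - 1.96 * se, g + 1.96 * se

# Hypothetical BBS scores for an exercise group vs a control group
print(hedges_g(m1=48.0, sd1=5.0, n1=30, m2=44.0, sd2=6.0, n2=30))
```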
Affiliation(s)
- Iván Cavero-Redondo: Universidad de Castilla La Mancha, Health and Social Research Center, Cuenca, Spain; Facultad de Ciencias de la Salud, Universidad Autónoma de Chile, Talca, Chile
- Celia Álvarez-Bueno: Universidad de Castilla La Mancha, Health and Social Research Center, Cuenca, Spain; Universidad Politécnica y Artística del Paraguay, Asunción, Paraguay
5
Hu R, Diao Y, Wang Y, Li G, He R, Ning Y, Lou N, Li G, Zhao G. Effective evaluation of HGcnMLP method for markerless 3D pose estimation of musculoskeletal diseases patients based on smartphone monocular video. Front Bioeng Biotechnol 2024; 11:1335251. PMID: 38264579; PMCID: PMC10803458; DOI: 10.3389/fbioe.2023.1335251.
Abstract
Markerless pose estimation based on computer vision provides a simpler and cheaper alternative to marker-based human motion capture, with great potential for clinical diagnosis and remote rehabilitation assessment. Currently, markerless 3D pose estimation is mainly based on multi-view technology, while the more promising single-view technology suffers from low accuracy and reliability, which seriously limits clinical application. This study proposes a high-resolution graph convolutional multilayer perception (HGcnMLP) framework for human 3D pose estimation from smartphone monocular videos and uses it to estimate gait spatiotemporal, knee-angle, and center-of-mass (COM) velocity parameters in 15 healthy adults and 12 patients with musculoskeletal disorders (sarcopenia and osteoarthritis), comparing the results with the VICON gold-standard system. The results show that most of the calculated parameters have excellent reliability (VICON, ICC (2, k): 0.853-0.982; Phone, ICC (2, k): 0.839-0.975) and validity (Pearson r: 0.808-0.978, p < 0.05). In addition, the proposed system can evaluate human gait balance ability, and the K-means++ clustering algorithm successfully grouped patients into different recovery-level groups. This study verifies the potential of single-smartphone video for 3D human pose estimation in rehabilitation-assistive diagnosis and balance-level recognition, and it is an effective attempt at clinical application of emerging computer vision technology. In the future, a corresponding smartphone application could provide a low-cost, effective, and simple new tool for remote monitoring and rehabilitation assessment of patients.
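One of the downstream steps described here is grouping patients by recovery level with K-means++. A minimal sketch of that clustering step, assuming a hypothetical matrix of gait parameters rather than the study's data, could look like this:

```python
# Illustrative sketch (hypothetical gait parameters): K-means++ clustering of
# patients into recovery-level groups, as in the grouping step described above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
gait = rng.normal(size=(27, 4))  # hypothetical features: speed, step length, knee ROM, COM velocity

X = StandardScaler().fit_transform(gait)  # put features on a common scale
km = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=0).fit(X)
print(km.labels_)  # assigned recovery-level group per subject
```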
Affiliation(s)
- Rui Hu: CAS Key Laboratory of Human-Machine Intelligence-Synergy Systems, Research Center for Neural Engineering, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Yanan Diao: CAS Key Laboratory of Human-Machine Intelligence-Synergy Systems, Research Center for Neural Engineering, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Yingchi Wang: CAS Key Laboratory of Human-Machine Intelligence-Synergy Systems, Research Center for Neural Engineering, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Gaoqiang Li: Department of Orthopedic and Rehabilitation Center, University of Hong Kong–Shenzhen Hospital, Shenzhen, China
- Rong He: Department of Orthopedic and Rehabilitation Center, University of Hong Kong–Shenzhen Hospital, Shenzhen, China
- Yunkun Ning: CAS Key Laboratory of Human-Machine Intelligence-Synergy Systems, Research Center for Neural Engineering, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Nan Lou: Department of Orthopedic and Rehabilitation Center, University of Hong Kong–Shenzhen Hospital, Shenzhen, China
- Guanglin Li: CAS Key Laboratory of Human-Machine Intelligence-Synergy Systems, Research Center for Neural Engineering, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Guoru Zhao: CAS Key Laboratory of Human-Machine Intelligence-Synergy Systems, Research Center for Neural Engineering, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
6
Åberg AC, Petersson JR, Giedraitis V, McKee KJ, Rosendahl E, Halvorsen K, Berglund L. Prediction of conversion to dementia disorders based on timed up and go dual-task test verbal and motor outcomes: a five-year prospective memory-clinic-based study. BMC Geriatr 2023; 23:535. PMID: 37660032; PMCID: PMC10475186; DOI: 10.1186/s12877-023-04262-w.
Abstract
BACKGROUND While assessment tools can increase the detection of cognitive impairment, there is currently insufficient evidence regarding clinical outcomes based on screening for cognitive impairment in older adults. METHODS The purpose of the study was to investigate whether Timed Up and Go dual-task test (TUGdt) results, based on TUG combined with two different verbal tasks (naming different animals, TUGdt-NA, and reciting months in reverse order, TUGdt-MB), predicted dementia incidence over a period of five years among patients (N = 186; mean age 70.7 years; 45.7% female) diagnosed with subjective cognitive impairment (SCI) or mild cognitive impairment (MCI) following assessment at two memory clinics. Associations between TUG parameters and dementia incidence were examined in Cox regression models. RESULTS During follow-up (median (range) 3.7 (0.1-6.1) years), 98 participants converted to dementia. Novel findings indicated that the TUGdt parameter words/time, after adjustment for age, gender, and education, can be used to predict conversion to dementia in participants with SCI or MCI over a period of five years. Among the TUG-related parameters investigated, words/time showed the best predictive capacity, while the time scores of TUG and TUGdt as well as TUGdt cost did not produce significant predictive results. Results further showed that the step parameter step length during TUGdt predicted conversion to dementia before adjustment for age, gender, and education. Optimal TUGdt cutoffs for predicting dementia at 2- and 4-year follow-up based on words/time were calculated. The sensitivity of the TUGdt cutoffs was high at 2-year follow-up (TUGdt-NA words/time, 0.79; TUGdt-MB words/time, 0.71), decreasing to 0.64 and 0.65, respectively, at 4-year follow-up. CONCLUSIONS TUGdt words/time parameters have potential as cost-efficient tools for conversion-to-dementia risk assessment, useful for research and clinical purposes. These parameters may help bridge the current gap in evidence for such clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT05893524 (https://www.clinicaltrials.gov/study/NCT05893524).
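The prediction analysis described above uses Cox proportional hazards regression adjusted for age, gender, and education. A minimal, hypothetical sketch of such a model (simulated data and the lifelines package stand in for the study's dataset and statistical software):

```python
# Illustrative sketch (hypothetical data, not the study's dataset): a Cox model
# relating a TUG dual-task words/time predictor to dementia conversion,
# adjusted for age, sex, and education.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 186
df = pd.DataFrame({
    "years_followed": rng.uniform(0.1, 6.1, n),   # follow-up time
    "converted": rng.integers(0, 2, n),           # 1 = converted to dementia
    "words_per_time": rng.normal(0.8, 0.3, n),    # hypothetical TUGdt words/time
    "age": rng.normal(70.7, 4.0, n),
    "female": rng.integers(0, 2, n),
    "education_years": rng.normal(12, 3, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="converted")
cph.print_summary()  # hazard ratios with 95% CIs for each covariate
```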
Affiliation(s)
- Anna Cristina Åberg: School of Health and Welfare, Dalarna University, 791 88 Falun, Sweden; Department of Public Health and Caring Sciences, Geriatrics, Uppsala University, Box 564, 52 37 Uppsala, Sweden
- Johanna R Petersson: School of Health and Welfare, Dalarna University, 791 88 Falun, Sweden; Department of Public Health and Caring Sciences, Geriatrics, Uppsala University, Box 564, 52 37 Uppsala, Sweden
- Vilmantas Giedraitis: School of Health and Welfare, Dalarna University, 791 88 Falun, Sweden; Department of Public Health and Caring Sciences, Geriatrics, Uppsala University, Box 564, 52 37 Uppsala, Sweden
- Kevin J McKee: School of Health and Welfare, Dalarna University, 791 88 Falun, Sweden
- Erik Rosendahl: Department of Community Medicine and Rehabilitation, Physiotherapy, Umeå University, 901 87 Umeå, Sweden
- Kjartan Halvorsen: School of Health and Welfare, Dalarna University, 791 88 Falun, Sweden; Department of Mechatronics, School of Engineering and Sciences, Campus Estado de Mexico, Tecnologico de Monterrey, Carretera Lago de Guadalupe Km 3.5, 52926 Atizapan, Estado de Mexico, Mexico
- Lars Berglund: School of Health and Welfare, Dalarna University, 791 88 Falun, Sweden; Department of Public Health and Caring Sciences, Geriatrics, Uppsala University, Box 564, 52 37 Uppsala, Sweden
7
Timed up & go quantification algorithm using IMU and sEMG signal. Biomed Signal Process Control 2023. DOI: 10.1016/j.bspc.2022.104309.
8
Martini E, Boldo M, Aldegheri S, Valè N, Filippetti M, Smania N, Bertucco M, Picelli A, Bombieri N. Enabling gait analysis in the telemedicine practice through portable and accurate 3D human pose estimation. Comput Methods Programs Biomed 2022; 225:107016. PMID: 35907374; DOI: 10.1016/j.cmpb.2022.107016.
Abstract
Human pose estimation (HPE) through deep learning-based software is a trending approach for markerless motion analysis. Thanks to the accuracy of state-of-the-art technology, HPE could enable gait analysis in telemedicine practice. On the other hand, delivering such a service at a distance requires the system to satisfy multiple constraints at the same time, including accuracy, portability, real-time operation, and privacy compliance. Existing solutions either guarantee accuracy and real-time operation (e.g., the widespread OpenPose software on well-equipped computing platforms) or portability and data privacy (e.g., light convolutional neural networks on mobile phones). We propose a portable and low-cost platform that implements real-time and accurate 3D HPE through embedded software on a low-power off-the-shelf computing device that guarantees privacy by default and by design. We present an extended evaluation of both the accuracy and the performance of the proposed solution, conducted with a marker-based motion capture system (Vicon) as ground truth. The results show that the platform achieves real-time performance and high accuracy, with a deviation below the error tolerance when compared to the marker-based motion capture system (e.g., less than a 5° error on the estimated knee flexion difference over the entire gait cycle, and correlation 0.91 < ρ < 0.99). We provide a proof-of-concept study showing that such portable technology, considering the limited discrepancies with respect to the marker-based motion capture system and its working tolerance, could be used for gait analysis at a distance without leading to different clinical interpretations.
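A central quantity in this comparison is the knee flexion angle derived from estimated 3D joint positions and its correlation with the marker-based reference. A small illustrative sketch (hypothetical keypoints, not the platform's implementation) of that computation:

```python
# Illustrative sketch (hypothetical keypoints): knee angle from 3D hip, knee,
# and ankle positions, and agreement with a reference system via Pearson's r.
import numpy as np
from scipy.stats import pearsonr

def knee_angle(hip, knee, ankle):
    """Angle (degrees) between the thigh and shank vectors at the knee."""
    thigh, shank = hip - knee, ankle - knee
    cosang = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

rng = np.random.default_rng(0)
estimated = np.array([knee_angle(rng.normal(size=3), rng.normal(size=3), rng.normal(size=3))
                      for _ in range(100)])            # hypothetical estimated angles per frame
reference = estimated + rng.normal(0, 2, 100)          # hypothetical marker-based reference
r, _ = pearsonr(estimated, reference)
print(f"correlation with reference: {r:.2f}")
```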
Affiliation(s)
- Enrico Martini: Department of Computer Science, University of Verona, Italy
- Michele Boldo: Department of Computer Science, University of Verona, Italy
- Nicola Valè: Neuromotor and Cognitive Rehabilitation Research Center (CRRNC), Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- Mirko Filippetti: Neuromotor and Cognitive Rehabilitation Research Center (CRRNC), Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- Nicola Smania: Neuromotor and Cognitive Rehabilitation Research Center (CRRNC), Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- Matteo Bertucco: Neuromotor and Cognitive Rehabilitation Research Center (CRRNC), Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- Alessandro Picelli: Neuromotor and Cognitive Rehabilitation Research Center (CRRNC), Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- Nicola Bombieri: Department of Computer Science, University of Verona, Italy
9
Li W, Chen X, Zhang J, Lu J, Zhang C, Bai H, Liang J, Wang J, Du H, Xue G, Ling Y, Ren K, Zou W, Chen C, Li M, Chen Z, Zou H. Recognition of freezing of gait in Parkinson's disease based on machine vision. Front Aging Neurosci 2022; 14:921081. PMID: 35912091; PMCID: PMC9329960; DOI: 10.3389/fnagi.2022.921081.
Abstract
BACKGROUND Freezing of gait (FOG) is a common clinical manifestation of Parkinson's disease (PD), mostly occurring in the intermediate and advanced stages. FOG is likely to cause patients to fall, resulting in fractures, disability, and even death. Currently, the pathogenesis of FOG is unclear, and FOG detection and screening methods have various defects, including subjectivity, inconvenience, and high cost. With limited public healthcare and transportation resources during the COVID-19 pandemic, PD patients who need diagnosis and treatment face even greater inconvenience. OBJECTIVE To establish a method that automatically recognizes FOG in PD patients from videos taken with a mobile phone; such a method would be time-saving, labor-saving, and low-cost for daily use and may overcome the above defects. In the future, PD patients could undergo FOG assessment at any time at home rather than in the hospital. METHODS In this study, motion features were extracted from timed up and go (TUG) test and narrow TUG (Narrow) test videos of 50 FOG-PD subjects through a machine learning method; a motion recognition model to distinguish between the walking and turning stages and models to recognize FOG within each of these stages were then constructed using the XGBoost algorithm. Finally, these three models were combined to form a multi-stage FOG recognition model. RESULTS Using the leave-one-subject-out (LOSO) method to evaluate model performance, the multi-stage FOG recognition model achieved a sensitivity of 87.5% and a specificity of 79.82%. CONCLUSION A method for remote recognition of FOG in PD patients based on mobile-phone video is presented. The method is convenient, has high recognition accuracy, and can be used to rapidly evaluate FOG in the home environment, remotely manage FOG-PD, or screen patients in large communities.
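The evaluation described above trains XGBoost models and scores them with leave-one-subject-out cross-validation. The following is a self-contained sketch (hypothetical features and labels, not the study's data or its full multi-stage pipeline) of a single FOG classifier evaluated this way:

```python
# Illustrative sketch (hypothetical features): an XGBoost FOG classifier
# evaluated with leave-one-subject-out (LOSO) cross-validation, reporting
# sensitivity and specificity.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import confusion_matrix
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))             # hypothetical per-window motion features
y = rng.integers(0, 2, size=500)           # 1 = FOG episode, 0 = no FOG
subjects = rng.integers(0, 50, size=500)   # subject ID for each window

y_true, y_pred = [], []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = XGBClassifier(eval_metric="logloss").fit(X[train_idx], y[train_idx])
    y_true.extend(y[test_idx])
    y_pred.extend(clf.predict(X[test_idx]))

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"sensitivity={tp / (tp + fn):.3f}, specificity={tn / (tn + fp):.3f}")
```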
Affiliation(s)
- Wendan Li: Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China; Graduate School, Guangzhou University of Chinese Medicine, Guangzhou, China
- Jintao Zhang: Department of Neurology, The 960th Hospital of PLA, Taian, China
- Jianjun Lu: Department of Neurosurgery, Guangdong Second Provincial General Hospital, Guangzhou, China
- Chencheng Zhang: Department of Neurosurgery, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Hongmin Bai: Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China
- Junchao Liang: Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China
- Jiajia Wang: Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China
- Hanqiang Du: Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China
- Gaici Xue: Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China
- Yun Ling: GYENNO SCIENCE Co., LTD., Shenzhen, China
- Kang Ren: GYENNO SCIENCE Co., LTD., Shenzhen, China
- Cheng Chen: GYENNO SCIENCE Co., LTD., Shenzhen, China
- Mengyan Li (corresponding author): Department of Neurology, Guangzhou First People's Hospital, Guangzhou, China
- Zhonglue Chen (corresponding author): GYENNO SCIENCE Co., LTD., Shenzhen, China; HUST-GYENNO CNS Intelligent Digital Medicine Technology Center, Wuhan, China
- Haiqiang Zou (corresponding author): Department of Neurosurgery, General Hospital of Southern Theater Command of PLA, Guangzhou, China; Branch of National Clinical Research Center for Geriatric Diseases, Chinese PLA General Hospital, Guangzhou, China