1
Yang S, Garg NP, Gao R, Yuan M, Noronha B, Ang WT, Accoto D. Learning-Based Motion-Intention Prediction for End-Point Control of Upper-Limb-Assistive Robots. Sensors (Basel, Switzerland) 2023; 23:2998. [PMID: 36991709] [PMCID: PMC10056111] [DOI: 10.3390/s23062998] [Received: 02/14/2023] [Revised: 03/04/2023] [Accepted: 03/08/2023]
Abstract
The lack of intuitive and active human-robot interaction makes upper-limb-assistive devices difficult to use. In this paper, we propose a novel learning-based controller that intuitively uses onset motion to predict the desired end-point position for an assistive robot. A multi-modal sensing system comprising inertial measurement units (IMUs), electromyography (EMG) sensors, and mechanomyography (MMG) sensors was implemented and used to acquire kinematic and physiological signals during reaching and placing tasks performed by five healthy subjects. The onset-motion data of each motion trial were extracted and fed into traditional regression models and deep learning models for training and testing. The models predict the position of the hand in planar space, which serves as the reference position for low-level position controllers. The results show that the IMU sensor with the proposed prediction model is sufficient for motion-intention detection, providing almost the same prediction performance as adding EMG or MMG. Additionally, recurrent neural network (RNN)-based models can predict target positions from a short onset-time window for reaching motions and are suitable for predicting targets over a longer horizon for placing tasks. The detailed analysis in this study can improve the usability of assistive and rehabilitation robots.
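The onset-window-to-end-point mapping described in this abstract can be sketched as a simple recurrent network whose final hidden state summarises the onset motion. The network size, sampling rate, feature layout, and (untrained, random) weights below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def predict_endpoint(onset_window, Wx, Wh, Wo, bh, bo):
    """Map an onset-motion window of IMU features to a planar (x, y)
    end-point estimate with an Elman-style RNN: the hidden state is
    updated per time step, and a linear read-out emits the target."""
    h = np.zeros(Wh.shape[0])
    for x_t in onset_window:          # one IMU feature vector per time step
        h = np.tanh(Wx @ x_t + Wh @ h + bh)
    return Wo @ h + bo                # (x, y) reference for the low-level controller

# Toy dimensions: 6 IMU channels (3-axis accel + gyro), 16 hidden units.
rng = np.random.default_rng(0)
Wx = 0.1 * rng.standard_normal((16, 6))
Wh = 0.1 * rng.standard_normal((16, 16))
Wo = 0.1 * rng.standard_normal((2, 16))
bh, bo = np.zeros(16), np.zeros(2)

window = rng.standard_normal((30, 6))   # ~0.3 s of onset motion at 100 Hz (assumed)
target_xy = predict_endpoint(window, Wx, Wh, Wo, bh, bo)
print(target_xy.shape)                  # (2,)
```

In a trained system the weights would be fitted on recorded reaching/placing trials; the point of the sketch is only the data flow from a short onset window to a single planar target.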
Affiliation(s)
- Sibo Yang
- School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore 639798, Singapore
- Neha P. Garg
- Rehabilitation Research Institute of Singapore (RRIS), Nanyang Technological University, Singapore 308232, Singapore
- Ruobin Gao
- School of Computer Science and Engineering, Nanyang Technological University, Singapore 639798, Singapore
- Meng Yuan
- Rehabilitation Research Institute of Singapore (RRIS), Nanyang Technological University, Singapore 308232, Singapore
- Bernardo Noronha
- School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore 639798, Singapore
- Wei Tech Ang
- School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore 639798, Singapore
- Rehabilitation Research Institute of Singapore (RRIS), Nanyang Technological University, Singapore 308232, Singapore
- Dino Accoto
- Department of Mechanical Engineering, Robotics, Automation and Mechatronics Division, KU Leuven, 3590 Diepenbeek, Belgium
2
Choffin Z, Jeong N, Callihan M, Sazonov E, Jeong S. Lower Body Joint Angle Prediction Using Machine Learning and Applied Biomechanical Inverse Dynamics. Sensors (Basel, Switzerland) 2022; 23:228. [PMID: 36616825] [PMCID: PMC9824079] [DOI: 10.3390/s23010228] [Received: 10/21/2022] [Revised: 11/21/2022] [Accepted: 12/23/2022]
Abstract
Extreme angles in lower-body joints may adversely increase the risk of joint injury. These injuries are common in the workplace and cause persistent pain and significant financial losses to people and companies. The purpose of this study was to predict lower-body joint angles, from the ankle to the lumbosacral joint (L5S1), by measuring plantar pressures in shoes. Joint-angle prediction was aided by a designed footwear sensor consisting of six force-sensing resistors (FSRs) and a microcontroller fitted with a Bluetooth LE module. An Xsens motion-capture system measuring 3D joint angles was utilized as ground-truth validation. Thirty-seven human subjects were tested squatting in an IRB-approved study. A Gaussian process regression (GPR) algorithm was used to create a progressive model that predicted the angles of the ankle, knee, hip, and L5S1. The footwear sensor showed a promising root mean square error (RMSE) for each joint; the L5S1 angle was predicted with an RMSE of 0.21° for the X-axis and 0.22° for the Y-axis. This result confirmed that the proposed plantar sensor system has the capability to predict and monitor lower-body joint angles for potential injury prevention and training of occupational workers.
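The GPR prediction step in this abstract amounts to kernel regression from a plantar-pressure feature vector to a joint angle. The sketch below is a minimal posterior-mean GPR with an RBF kernel on entirely synthetic data; the feature dimensionality, length scale, and angle relationship are invented for illustration and are not the study's fitted model:

```python
import numpy as np

def gpr_predict(X_train, y_train, X_test, length=0.3, noise=1e-3):
    """Gaussian process regression posterior mean with an RBF kernel:
    predicts a joint angle from a plantar-pressure feature vector."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    # K + noise*I keeps the solve numerically stable (observation jitter).
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf(X_test, X_train) @ alpha

# Toy data: 6 FSR readings -> one L5S1 angle in degrees (values invented).
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(40, 6))   # normalised plantar pressures
y = 30.0 * X[:, 0] - 10.0 * X[:, 3]       # synthetic angle relationship
pred = gpr_predict(X, y, X[:5])
print(pred.shape)                         # (5,)
```

A production model would tune the length scale and noise level (e.g. by marginal likelihood) and validate against motion-capture ground truth, as the study does with the Xsens system.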
Affiliation(s)
- Zachary Choffin
- Department of Electrical and Computer Engineering, The University of Alabama, Tuscaloosa, AL 35487, USA
- Nathan Jeong
- Department of Electrical and Computer Engineering, The University of Alabama, Tuscaloosa, AL 35487, USA
- Michael Callihan
- Capstone College of Nursing, University of Alabama, Tuscaloosa, AL 35487, USA
- Edward Sazonov
- Department of Electrical and Computer Engineering, The University of Alabama, Tuscaloosa, AL 35487, USA
- Seongcheol Jeong
- Department of Electrical Engineering, Pohang University of Science and Technology, Pohang 37673, Republic of Korea
3
Bao T, Xie SQ, Yang P, Zhou P, Zhang ZQ. Towards Robust, Adaptive and Reliable Upper-limb Motion Estimation Using Machine Learning and Deep Learning--A Survey in Myoelectric Control. IEEE J Biomed Health Inform 2022; 26:3822-3835. [PMID: 35294368] [DOI: 10.1109/jbhi.2022.3159792]
Abstract
To develop multi-functional human-machine interfaces that can help disabled people reconstruct lost upper-limb functions, machine learning (ML) and deep learning (DL) techniques have been widely implemented to decode human movement intentions from surface electromyography (sEMG) signals. However, due to the high complexity of upper-limb movements and the inherently non-stationary characteristics of sEMG, the usability of ML/DL-based control schemes is still greatly limited in practical scenarios. To this end, tremendous efforts have been made to improve model robustness, adaptation, and reliability. In this article, we provide a systematic review of recent achievements in three main categories: multi-modal sensing fusion to gain additional information about the user, transfer learning (TL) methods to eliminate the impact of domain shift on estimation models, and post-processing approaches to obtain more reliable outcomes. Special attention is given to fusion strategies, deep TL frameworks, and confidence estimation. Research challenges and emerging opportunities, with respect to hardware development, public resources, and decoding strategies, are also analysed to provide perspectives for future developments.
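One concrete example of the post-processing/confidence-estimation category this survey highlights is rejection of low-confidence decisions: the decoded class is passed through only when the classifier's confidence clears a threshold, otherwise a no-movement "reject" decision is emitted. The function name, threshold, and class layout below are illustrative assumptions, not a specific method from the survey:

```python
import numpy as np

def confidence_gate(probs, threshold=0.8, reject=-1):
    """Post-processing for myoelectric control: accept the decoded class
    only when the maximum class probability clears a threshold, otherwise
    return a 'reject' decision instead of issuing a risky command."""
    probs = np.asarray(probs, dtype=float)
    labels = probs.argmax(axis=1)   # most likely class per frame
    conf = probs.max(axis=1)        # its probability = decision confidence
    return np.where(conf >= threshold, labels, reject)

# Three frames of per-class probabilities (e.g. hand open / close / rest).
frames = [[0.90, 0.05, 0.05],   # confident -> class 0
          [0.40, 0.35, 0.25],   # ambiguous -> rejected
          [0.10, 0.85, 0.05]]   # confident -> class 1
print(confidence_gate(frames).tolist())  # [0, -1, 1]
```

The design trade-off is between false activations (threshold too low) and sluggish, frequently rejected control (threshold too high), which is why the survey treats confidence estimation as its own research direction.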
4
Martinez-Hernandez U, Metcalfe B, Assaf T, Jabban L, Male J, Zhang D. Wearable Assistive Robotics: A Perspective on Current Challenges and Future Trends. Sensors (Basel, Switzerland) 2021; 21:6751. [PMID: 34695964] [PMCID: PMC8539021] [DOI: 10.3390/s21206751] [Received: 08/30/2021] [Revised: 09/30/2021] [Accepted: 10/06/2021]
Abstract
Wearable assistive robotics is an emerging technology with the potential to assist humans with sensorimotor impairments in performing daily activities. This assistance enables individuals to be physically and socially active, perform activities independently, and recover quality of life. These benefits to society have motivated the study of several robotic approaches, developing systems ranging from rigid to soft robots, with single and multimodal sensing, heuristic and machine learning methods, and manual to autonomous control, for assistance of the upper and lower limbs. Because this type of wearable robotic technology is in direct contact and interaction with the body, it needs to comply with a variety of requirements to make the system and its assistance efficient, safe, and usable on a daily basis by the individual. This paper presents a brief review of the progress achieved in recent years, together with the current challenges and trends for the design and deployment of wearable assistive robotics, including clinical and user needs, material and sensing technology, machine learning methods for perception and control, adaptability and acceptability, datasets and standards, and translation from the lab to the real world.
Affiliation(s)
- Uriel Martinez-Hernandez
- Multimodal Inte-R-Action Lab, University of Bath, Bath BA2 7AY, UK
- Centre for Autonomous Robotics (CENTAUR), University of Bath, Bath BA2 7AY, UK
- Centre for Biosensors, Bioelectronics and Biodevices (C3Bio), University of Bath, Bath BA2 7AY, UK
- Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK
- Benjamin Metcalfe
- Centre for Autonomous Robotics (CENTAUR), University of Bath, Bath BA2 7AY, UK
- Centre for Biosensors, Bioelectronics and Biodevices (C3Bio), University of Bath, Bath BA2 7AY, UK
- Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK
- Tareq Assaf
- Centre for Autonomous Robotics (CENTAUR), University of Bath, Bath BA2 7AY, UK
- Centre for Biosensors, Bioelectronics and Biodevices (C3Bio), University of Bath, Bath BA2 7AY, UK
- Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK
- Leen Jabban
- Centre for Biosensors, Bioelectronics and Biodevices (C3Bio), University of Bath, Bath BA2 7AY, UK
- Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK
- James Male
- Multimodal Inte-R-Action Lab, University of Bath, Bath BA2 7AY, UK
- Centre for Autonomous Robotics (CENTAUR), University of Bath, Bath BA2 7AY, UK
- Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK
- Dingguo Zhang
- Centre for Autonomous Robotics (CENTAUR), University of Bath, Bath BA2 7AY, UK
- Centre for Biosensors, Bioelectronics and Biodevices (C3Bio), University of Bath, Bath BA2 7AY, UK
- Department of Electronics and Electrical Engineering, University of Bath, Bath BA2 7AY, UK