1
González-Graniel E, Mercado-Gutierrez JA, Martínez-Díaz S, Castro-Liera I, Santillan-Mendez IM, Yanez-Suarez O, Quiñones-Uriostegui I, Rodríguez-Reyes G. Sensing and Control Strategies Used in FES Systems Aimed at Assistance and Rehabilitation of Foot Drop: A Systematic Literature Review. J Pers Med 2024; 14:874. [PMID: 39202064] [PMCID: PMC11355777] [DOI: 10.3390/jpm14080874] [Received: 06/30/2024] [Revised: 08/11/2024] [Accepted: 08/12/2024]
Abstract
Functional electrical stimulation (FES) is a rehabilitation and assistive technique used for stroke survivors. FES systems mainly consist of sensors, a control algorithm, and a stimulation unit. However, there is a critical need to reassess sensing and control techniques in FES systems to enhance their efficiency. This systematic literature review (SLR) was carried out following the PRISMA 2020 statement. Four databases (PubMed, Scopus, Web of Science, Wiley Online Library) from 2010 to 2024 were searched using terms related to sensing and control strategies in FES systems. A total of 322 articles were chosen in the first stage, while only 60 of them remained after the final filtering stage. This systematic review mainly focused on sensor techniques and control strategies to deliver FES. The most commonly used sensors reported were inertial measurement units (IMUs), 45% (27); biopotential electrodes, 36.7% (22); vision-based systems, 18.3% (11); and switches, 18.3% (11). The control strategy most reported is closed-loop; however, most current commercial FES systems employ open-loop strategies due to their simplicity. Three main factors were identified that should be considered when choosing a sensor for gait-oriented FES systems: wearability, accuracy, and affordability. We believe that the combination of computer vision systems with artificial intelligence-based control algorithms can contribute to the development of minimally invasive and personalized FES systems for the gait rehabilitation of patients with FDS.
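The closed-loop strategy this review contrasts with open-loop delivery can be pictured as a sensor-driven trigger. The sketch below is purely illustrative and not taken from any reviewed system: it assumes a hypothetical shank-mounted IMU gyroscope signal and hypothetical thresholds to switch stimulation on at swing onset and off at heel strike.

```python
# Illustrative sketch (not from the review): a minimal closed-loop FES
# trigger for foot drop, using an IMU gyroscope signal to detect the
# swing phase of gait. Signal, thresholds, and units are hypothetical.

def fes_trigger(gyro_pitch_dps, on_threshold=50.0, off_threshold=-20.0):
    """Return a per-sample stimulation command (True = stimulate).

    A two-state machine: stimulation turns on when the shank pitch
    angular velocity (deg/s) exceeds `on_threshold` (toe-off into
    swing) and turns off when it falls below `off_threshold`
    (heel strike).
    """
    stimulating = False
    commands = []
    for sample in gyro_pitch_dps:
        if not stimulating and sample > on_threshold:
            stimulating = True   # swing phase detected -> stimulate
        elif stimulating and sample < off_threshold:
            stimulating = False  # heel strike detected -> stop
        commands.append(stimulating)
    return commands

# One simulated gait cycle: rest, swing (positive velocity), heel strike.
signal = [0, 10, 60, 80, 70, 30, -30, -10, 0]
print(fes_trigger(signal))
```

An open-loop system, by contrast, would replace the sensor-driven state machine with a fixed stimulation timing pattern, which is simpler but cannot adapt to stride-to-stride variability.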
Affiliation(s)
- Estefanía González-Graniel
- División de Estudios de Posgrado e Investigación, TecNM-Instituto Tecnológico de la Paz, La Paz 28080, Mexico; (E.G.-G.); (I.C.-L.); (I.M.S.-M.)
- Jorge A. Mercado-Gutierrez
- Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra, Mexico City 14389, Mexico; (J.A.M.-G.); (I.Q.-U.)
- Saúl Martínez-Díaz
- División de Estudios de Posgrado e Investigación, TecNM-Instituto Tecnológico de la Paz, La Paz 28080, Mexico; (E.G.-G.); (I.C.-L.); (I.M.S.-M.)
- Iliana Castro-Liera
- División de Estudios de Posgrado e Investigación, TecNM-Instituto Tecnológico de la Paz, La Paz 28080, Mexico; (E.G.-G.); (I.C.-L.); (I.M.S.-M.)
- Israel M. Santillan-Mendez
- División de Estudios de Posgrado e Investigación, TecNM-Instituto Tecnológico de la Paz, La Paz 28080, Mexico; (E.G.-G.); (I.C.-L.); (I.M.S.-M.)
- Oscar Yanez-Suarez
- Electrical Engineering Department, Universidad Autónoma Metropolitana—Unidad Iztapalapa, Mexico City 09340, Mexico
- Ivett Quiñones-Uriostegui
- Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra, Mexico City 14389, Mexico; (J.A.M.-G.); (I.Q.-U.)
- Gerardo Rodríguez-Reyes
- Instituto Nacional de Rehabilitación Luis Guillermo Ibarra Ibarra, Mexico City 14389, Mexico; (J.A.M.-G.); (I.Q.-U.)
2
Mouchoux J, Carisi S, Dosen S, Farina D, Schilling AF, Markovic M. Artificial Perception and Semiautonomous Control in Myoelectric Hand Prostheses Increases Performance and Decreases Effort. IEEE Trans Robot 2021. [DOI: 10.1109/tro.2020.3047013]
3
HANDS: a multimodal dataset for modeling toward human grasp intent inference in prosthetic hands. Intell Serv Robot 2019; 13:179-185. [PMID: 33312264] [DOI: 10.1007/s11370-019-00293-8]
Abstract
Upper limb and hand functionality is critical to many activities of daily living, and amputation can lead to significant loss of function. From this perspective, advanced prosthetic hands of the future are anticipated to benefit from improved shared control between the robotic hand and its human user, and more importantly from an improved capability to infer human intent from multimodal sensor data, giving the robotic hand perceptual awareness of its operational context. Such multimodal sensor data may include environment sensors such as vision, as well as human physiology and behavior sensors such as electromyography (EMG) and inertial measurement units (IMUs). A fusion methodology for environmental state and human intent estimation can combine these sources of evidence to support prosthetic hand motion planning and control. In this paper, we present a dataset of this type, gathered in anticipation of cameras being built into prosthetic hands, where computer vision methods will need to assess this hand-view visual evidence in order to estimate human intent. Specifically, paired images from the human eye-view and the hand-view of various objects placed at different orientations were captured at the initial state of grasping trials, followed by paired video, EMG, and IMU recordings from the arm of the human during a grasp, lift, put-down, and retract trial structure. For each trial, based on eye-view images of the scene showing the hand and object on a table, multiple human annotators were asked to sort, in decreasing order of preference, five grasp types appropriate for the object in its given configuration relative to the hand. The potential utility of the paired eye-view and hand-view images was illustrated by training a convolutional neural network to process hand-view images and predict the eye-view labels assigned by humans.
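The per-trial grasp-type rankings described above come from multiple annotators, so any labeling pipeline must aggregate them into one consensus order. The sketch below is a hypothetical illustration of one standard way to do that (a Borda count); the grasp names and rankings are invented, not taken from the dataset.

```python
# Illustrative sketch (not part of the dataset's tooling): aggregating
# per-annotator grasp-type rankings into a consensus order via a Borda
# count. Grasp names and example rankings are hypothetical.

def consensus_ranking(rankings):
    """Each ranking lists grasp types in decreasing preference.

    A grasp earns (n - position) points per annotator; the consensus
    sorts grasps by total points, highest first (ties broken by name).
    """
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for pos, grasp in enumerate(ranking):
            scores[grasp] = scores.get(grasp, 0) + (n - pos)
    return sorted(scores, key=lambda g: (-scores[g], g))

annotators = [
    ["power", "tripod", "lateral", "precision", "lumbrical"],
    ["power", "lateral", "tripod", "precision", "lumbrical"],
    ["tripod", "power", "lateral", "lumbrical", "precision"],
]
print(consensus_ranking(annotators))
# ['power', 'tripod', 'lateral', 'precision', 'lumbrical']
```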
4
Likitlersuang J, Koh R, Gong X, Jovanovic L, Bolivar-Tellería I, Myers M, Zariffa J, Márquez-Chin C. EEG-Controlled Functional Electrical Stimulation Therapy With Automated Grasp Selection: A Proof-of-Concept Study. Top Spinal Cord Inj Rehabil 2018; 24:265-274. [PMID: 29997429] [DOI: 10.1310/sci2403-265]
Abstract
Background: Functional electrical stimulation therapy (FEST) is a promising intervention for the restoration of upper extremity function after cervical spinal cord injury (SCI). Objectives: This study describes and evaluates a novel FEST system designed to incorporate voluntary movement attempts and massed practice of functional grasp through the use of brain-computer interface (BCI) and computer vision (CV) modules. Methods: An EEG-based BCI relying on a single electrode was used to detect movement initiation attempts. A CV system identified the target object and selected the appropriate grasp type. The required grasp type and trigger command were sent to an FES stimulator, which produced one of four multichannel muscle stimulation patterns (precision, lateral, palmar, or lumbrical grasp). The system was evaluated with five neurologically intact participants and one participant with complete cervical SCI. Results: An integrated BCI-CV-FES system was demonstrated. The overall classification accuracy of the CV module was 90.8%, when selecting out of a set of eight objects. The average latency for the BCI module to trigger the movement across all participants was 5.9 ± 1.5 seconds. For the participant with SCI alone, the CV accuracy was 87.5% and the BCI latency was 5.3 ± 9.4 seconds. Conclusion: BCI and CV methods can be integrated into an FEST system without the need for costly resources or lengthy setup times. The result is a clinically relevant system designed to promote voluntary movement attempts and more repetitions of varied functional grasps during FEST.
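The control flow described in this abstract (CV identifies the object and chooses a grasp, the BCI supplies the trigger, the stimulator delivers the pattern) can be sketched as below. The object-to-grasp mapping and pattern names here are hypothetical placeholders, not the study's actual configuration.

```python
# Illustrative sketch of a BCI-CV-FES control flow like the one the
# abstract describes. The object-to-grasp mapping and pattern labels
# are hypothetical, not the study's actual configuration.

GRASP_FOR_OBJECT = {      # CV module output -> grasp type (hypothetical)
    "pen": "precision",
    "key": "lateral",
    "bottle": "palmar",
    "handle": "lumbrical",
}

def select_stimulation(object_label, bci_triggered):
    """Return the multichannel stimulation pattern to deliver, or None.

    The CV module supplies `object_label`; the single-electrode BCI
    supplies `bci_triggered` when a movement attempt is detected.
    """
    if not bci_triggered:
        return None                  # wait for a voluntary movement attempt
    grasp = GRASP_FOR_OBJECT.get(object_label)
    if grasp is None:
        return None                  # unrecognized object: do nothing
    return f"{grasp}-grasp-pattern"  # command sent to the FES stimulator

print(select_stimulation("key", bci_triggered=True))   # lateral-grasp-pattern
print(select_stimulation("key", bci_triggered=False))  # None
```

Gating stimulation on the BCI trigger, rather than on object detection alone, is what keeps the therapy tied to the patient's voluntary movement attempts.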
Affiliation(s)
- Jirapat Likitlersuang
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada
- Ryan Koh
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada
- Xinyi Gong
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Division of Engineering Science, University of Toronto, Toronto, Canada
- Lazar Jovanovic
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada
- Isabel Bolivar-Tellería
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada
- Matthew Myers
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada
- José Zariffa
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada; Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada
- César Márquez-Chin
- Toronto Rehabilitation Institute - University Health Network, Toronto, Canada
5
Simonsen D, Spaich EG, Hansen J, Andersen OK. Design and Test of a Closed-Loop FES System for Supporting Function of the Hemiparetic Hand Based on Automatic Detection Using the Microsoft Kinect Sensor. IEEE Trans Neural Syst Rehabil Eng 2017; 25:1249-1256. [DOI: 10.1109/tnsre.2016.2622160]
6
Kinect-Based Sliding Mode Control for Lynxmotion Robotic Arm. Adv Hum Comput Interact 2016. [DOI: 10.1155/2016/7921295]
Abstract
Recently, manipulator robot technology has developed rapidly and has had a positive impact on human life, offering greater efficiency and high performance for several human tasks. In practice, published efforts in this area focus on implementing control algorithms with preprogrammed desired trajectories (passive robots) or on trajectory generation based on feedback sensors (active robots). Gesture-based robot control, however, is another control channel that has not been widely discussed. This paper focuses on the implementation of a Kinect-based real-time interactive control system. A human-machine interface (HMI) developed in the LabVIEW integrated development environment (IDE) allows the user to control a Lynxmotion robotic arm in real time. The Kinect software development kit (SDK) provides a tool to track the human body skeleton and abstract it into 3-dimensional coordinates; the Kinect sensor is therefore integrated into the control system to detect the coordinates of the user's joints. The Lynxmotion dynamics are incorporated into a real-time sliding mode control algorithm. Experimental results are presented to test the effectiveness of the system, and they verify its tracking ability, stability, and robustness.
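A typical first step in this kind of gesture-based teleoperation is converting the 3-D joint coordinates that skeleton tracking exposes into joint angles for the robot. The sketch below is a hypothetical illustration of that geometry step only (it is not the paper's code, and the coordinates are invented): the elbow angle is computed from shoulder, elbow, and wrist positions.

```python
# Illustrative sketch (hypothetical, not the paper's code): converting
# tracked 3-D skeleton joints into a joint angle that could be mapped
# onto a robot arm's elbow servo.

import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3-D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))  # guard rounding drift
    return math.degrees(math.acos(cos_t))

# An L-shaped arm: shoulder above the elbow, wrist out to the side,
# giving a 90-degree elbow angle.
shoulder, elbow, wrist = (0.0, 1.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
print(round(joint_angle(shoulder, elbow, wrist), 6))  # 90.0
```

The full controller would then feed such angles, together with the arm's dynamic model, into the sliding mode control law rather than commanding the servos directly.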