1
Mohammadi M, Cardoso ASS, Andreasen Struijk LNS. Using workspace restrictiveness for adaptive velocity adjustment of assistive robots and upper limb exoskeletons. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. [PMID: 38082906] [DOI: 10.1109/embc40787.2023.10341183]
Abstract
Individuals with severe disabilities can benefit from assistive robotic systems (ARS) for performing activities of daily living. However, few control interfaces are available for individuals who cannot use their hands, and most of these interfaces require high effort to perform even simple tasks. Autonomous and intelligent control strategies have therefore been proposed to assist with the control of complex tasks. In this paper, we present an autonomous and adaptive method for adjusting an assistive robot's velocity across different regions of its workspace, reducing the robot velocity where fine control is required. Two participants controlled a JACO assistive robot to grasp and lift a bottle with and without the velocity adjustment method. The task was performed 9.1% faster with velocity adjustment. Furthermore, analysis of the robot trajectory showed that the method recognized highly restrictive regions and reduced the robot end-effector velocity accordingly. Clinical relevance: The autonomous velocity adjustment method can ease the control of ARSs and improve their usability, leading to a higher quality of life for individuals with severe disabilities who can benefit from ARSs.
2
|
Readioff R, Siddiqui ZK, Stewart C, Fulbrook L, O'Connor RJ, Chadwick EK. Use and evaluation of assistive technologies for upper limb function in tetraplegia. J Spinal Cord Med 2022; 45:809-820. [PMID: 33606599] [PMCID: PMC9662059] [DOI: 10.1080/10790268.2021.1878342]
Abstract
CONTEXT More than half of all spinal cord injuries (SCI) occur at the cervical level leading to loss of upper limb function, restricted activity and reduced independence. Several technologies have been developed to assist with upper limb functions in the SCI population. OBJECTIVE There is no clear clinical consensus on the effectiveness of the current assistive technologies for the cervical SCI population, hence this study reviews the literature in the years between 1999 and 2019. METHODS A systematic review was performed on the state-of-the-art assistive technology that supports and improves the function of impaired upper limbs in cervical SCI populations. Combinations of terms, covering assistive technology, SCI, and upper limb, were used in the search, which resulted in a total of 1770 articles. Data extractions were performed on the selected studies which involved summarizing details on the assistive technologies, characteristics of study participants, outcome measures, and improved upper limb functions when using the device. RESULTS A total of 24 articles were found and grouped into five categories, including neuroprostheses (invasive and non-invasive), orthotic devices, hybrid systems, robots, and arm supports. Only a few selected studies comprehensively reported characteristics of the participants. There was a wide range of outcome measures and all studies reported improvements in upper limb function with the devices. CONCLUSIONS This study highlighted that assistive technologies can improve functions of the upper limbs in SCI patients. It was challenging to draw generalizable conclusions because of factors, such as heterogeneity of recruited participants, a wide range of outcome measures, and the different technologies employed.
Affiliation(s)
- Rosti Readioff
- School of Pharmacy and Bioengineering, Keele University, Stoke-on-Trent, UK. Correspondence to: Rosti Readioff, Institute of Medical and Biological Engineering, School of Mechanical Engineering, University of Leeds, Leeds LS2 9JT, UK
- Zaha Kamran Siddiqui
- Academic Department of Rehabilitation Medicine, Faculty of Medicine and Health, University of Leeds, Leeds, UK
- Caroline Stewart
- School of Pharmacy and Bioengineering, Keele University, Stoke-on-Trent, UK; The Orthotic Research and Locomotor Assessment Unit (ORLAU), the Robert Jones and Agnes Hunt Orthopaedic Hospital, NHS Foundation Trust, Oswestry, UK
- Louisa Fulbrook
- The Orthotic Research and Locomotor Assessment Unit (ORLAU), the Robert Jones and Agnes Hunt Orthopaedic Hospital, NHS Foundation Trust, Oswestry, UK
- Rory J. O'Connor
- Academic Department of Rehabilitation Medicine, Faculty of Medicine and Health, University of Leeds, Leeds, UK
3
Thøgersen MB, Mohammadi M, Gull MA, Bengtson SH, Kobbelgaard FV, Bentsen B, Khan BYA, Severinsen KE, Bai S, Bak T, Moeslund TB, Kanstrup AM, Andreasen Struijk LNS. User Based Development and Test of the EXOTIC Exoskeleton: Empowering Individuals with Tetraplegia Using a Compact, Versatile, 5-DoF Upper Limb Exoskeleton Controlled through Intelligent Semi-Automated Shared Tongue Control. Sensors (Basel) 2022; 22:6919. [PMID: 36146260] [PMCID: PMC9502221] [DOI: 10.3390/s22186919]
Abstract
This paper presents the EXOTIC, a novel assistive upper limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons mainly focuses on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority, even though it poses important technical requirements. We considered all sources of design requirements, from basic technical functions to real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable, easy-to-don-and-doff exoskeleton capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer vision guidance system that can be enabled by the user when relevant; (3) a tongue control interface allowing full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests, the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia who required a ventilator. The users confirmed the usability of the EXOTIC.
Affiliation(s)
- Mikkel Berg Thøgersen
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Mostafa Mohammadi
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Muhammad Ahsan Gull
- Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Stefan Hein Bengtson
- Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Bo Bentsen
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Benjamin Yamin Ali Khan
- Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Kåre Eg Severinsen
- Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Shaoping Bai
- Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Thomas Bak
- Department of Electronic Systems, Aalborg University, 9220 Aalborg, Denmark
- Thomas Baltzer Moeslund
- Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Lotte N. S. Andreasen Struijk
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
4
Rulik I, Sunny MSH, Sanjuan De Caro JD, Zarif MII, Brahmi B, Ahamed SI, Schultz K, Wang I, Leheng T, Longxiang JP, Rahman MH. Control of a Wheelchair-Mounted 6DOF Assistive Robot With Chin and Finger Joysticks. Front Robot AI 2022; 9:885610. [PMID: 35937617] [PMCID: PMC9354078] [DOI: 10.3389/frobt.2022.885610]
Abstract
Throughout the last decade, many assistive robots for people with disabilities have been developed; however, researchers have not fully utilized these robotic technologies to entirely create independent living conditions for people with disabilities, particularly in relation to activities of daily living (ADLs). An assistive system can help satisfy the demands of regular ADLs for people with disabilities. With an increasing shortage of caregivers and a growing number of individuals with impairments and the elderly, assistive robots can help meet future healthcare demands. One of the critical aspects of designing these assistive devices is to improve functional independence while providing an excellent human–machine interface. People with limited upper limb function due to stroke, spinal cord injury, cerebral palsy, amyotrophic lateral sclerosis, and other conditions find the controls of assistive devices such as power wheelchairs difficult to use. Thus, the objective of this research was to design a multimodal control method for robotic self-assistance that could assist individuals with disabilities in performing self-care tasks on a daily basis. In this research, a control framework for two interchangeable operating modes with a finger joystick and a chin joystick is developed, where the joysticks seamlessly control a wheelchair and a wheelchair-mounted robotic arm. Custom circuitry was developed to complete the control architecture. A user study was conducted to test the robotic system. Ten healthy individuals agreed to perform three tasks using both (chin and finger) joysticks for a total of six tasks with 10 repetitions each. The control method has been tested rigorously, maneuvering the robot at different velocities and under varying payload (1–3.5 lb) conditions. The absolute position accuracy was experimentally found to be approximately 5 mm. The round-trip delay we observed between the commands while controlling the xArm was 4 ms. Tests performed showed that the proposed control system allowed individuals to perform some ADLs, such as picking up and placing items, with a completion time of less than 1 min for each task and a 100% success rate.
Affiliation(s)
- Ivan Rulik
- Department of Computer Sciences, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
- Correspondence: Ivan Rulik
- Md Samiul Haque Sunny
- Department of Computer Sciences, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
- Brahim Brahmi
- Electrical Engineering Department, Collège Ahuntsic, Montreal, QC, Canada
- Katie Schultz
- Assistive Technology Program, Clement J. Zablocki VA Medical Center, Milwaukee, WI, United States
- Inga Wang
- Department of Rehabilitation Sciences & Technology, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
- Tony Leheng
- UFACTORY Technology Co., Ltd., Shenzhen, China
- Mohammad H. Rahman
- Department of Mechanical Engineering, University of Wisconsin-Milwaukee, Milwaukee, WI, United States
5
Manual 3D Control of an Assistive Robotic Manipulator Using Alpha Rhythms and an Auditory Menu: A Proof-of-Concept. Signals 2022. [DOI: 10.3390/signals3020024]
Abstract
Brain–Computer Interfaces (BCIs) have been regarded as potential tools for individuals with severe motor disabilities, such as those with amyotrophic lateral sclerosis, that render interfaces that rely on movement unusable. This study aims to develop a dependent BCI system for manual end-point control of a robotic arm. A proof-of-concept system was devised using parieto-occipital alpha wave modulation and a cyclic menu with auditory cues. Users choose a movement to be executed and asynchronously stop said action when necessary. Tolerance intervals allowed users to cancel or confirm actions. Eight able-bodied subjects used the system to perform a pick-and-place task. To investigate the potential learning effects, the experiment was conducted twice over the course of two consecutive days. Subjects obtained satisfactory completion rates (84.0 ± 15.0% and 74.4 ± 34.5% for the first and second day, respectively) and high path efficiency (88.9 ± 11.7% and 92.2 ± 9.6%). Subjects took on average 439.7 ± 203.3 s to complete each task, but the robot was only in motion 10% of the time. There was no significant difference in performance between both days. The developed control scheme provided users with intuitive control, but a considerable amount of time is spent waiting for the right target (auditory cue). Implementing other brain signals may increase its speed.
6
Development of a Vision-Guided Shared-Control System for Assistive Robotic Manipulators. Sensors (Basel) 2022; 22:4351. [PMID: 35746131] [PMCID: PMC9228253] [DOI: 10.3390/s22124351]
Abstract
Assistive robotic manipulators (ARMs) provide a potential solution to mitigating the difficulties and lost independence associated with manipulation deficits in individuals with upper-limb impairments. However, achieving efficient control of an ARM can be a challenge due to the multiple degrees of freedom (DoFs) of an ARM that need to be controlled. This study describes the development of a vision-guided shared-control (VGS) system and how it is applied to a multi-step drinking task. The VGS control allows the user to control the gross motion of the ARM via teleoperation and commands the ARM to autonomously perform fine manipulation. A bench-top test of the autonomous actions showed that success rates for different subtasks ranged from 80% to 100%. An evaluation with three test pilots showed that the overall task performance, in terms of success rate, task completion time, and joystick mode-switch frequency, was better with VGS than with teleoperation. Similar trends were observed with a case participant with a spinal cord injury. While his performance was better and he perceived a smaller workload with VGS, his perceived usability for VGS and teleoperation was similar. More work is needed to further improve and test VGS on participants with disabilities.
7
Computer Vision-Based Adaptive Semi-Autonomous Control of an Upper Limb Exoskeleton for Individuals with Tetraplegia. Appl Sci (Basel) 2022. [DOI: 10.3390/app12094374]
Abstract
We propose the use of computer vision for adaptive semi-autonomous control of an upper limb exoskeleton to assist users with severe tetraplegia and increase their independence and quality of life. A tongue-based interface was used together with the semi-autonomous control, so that individuals with complete tetraplegia were able to use the system despite being paralyzed from the neck down. The semi-autonomous control uses computer vision to detect nearby objects and estimate how to grasp them, assisting the user in controlling the exoskeleton. Three control schemes were tested: non-autonomous control (i.e., manual control using the tongue), semi-autonomous control with a fixed level of autonomy, and semi-autonomous control with a confidence-based adaptive level of autonomy. Studies were carried out with participants with and without tetraplegia. The control schemes were evaluated both in terms of performance, such as the time and number of commands needed to complete a given task, and ratings from the users. The studies showed a clear and significant improvement in both performance and user ratings when using either of the semi-autonomous control schemes. The adaptive semi-autonomous control outperformed the fixed version in some scenarios, namely in the more complex tasks and with users with more training in using the system.
8
Kaeseler RL, Johansson TW, Struijk LNSA, Jochumsen M. Feature and classification analysis for detection and classification of tongue movements from single-trial pre-movement EEG. IEEE Trans Neural Syst Rehabil Eng 2022; 30:678-687. [PMID: 35290187] [DOI: 10.1109/tnsre.2022.3157959]
Abstract
Individuals with severe tetraplegia can benefit from brain-computer interfaces (BCIs). While most movement-related BCI systems focus on right/left hand and/or foot movements, very few studies have considered tongue movements to construct a multiclass BCI. The aim of this study was to decode four movement directions of the tongue (left, right, up, and down) from single-trial pre-movement EEG and to provide a feature and classifier investigation. In offline analyses (from ten healthy participants), detection and classification were performed using temporal, spectral, entropy, and template features, classified using linear discriminant analysis, support vector machine, random forest, or multilayer perceptron classifiers. Besides the 4-class classification scenario, all possible 3- and 2-class scenarios were tested to find the most discriminable movement types. Linear discriminant analysis achieved, on average, higher classification accuracies for both movement detection and classification. The right and down tongue movements provided the highest and lowest detection accuracy (95.3±4.3% and 91.7±4.8%), respectively. The 4-class classification achieved an accuracy of 62.6±7.2%, while the best 3-class classification (using left, right, and up movements) and 2-class classification (using left and right movements) achieved accuracies of 75.6±8.4% and 87.7±8.0%, respectively. Using only a combination of the temporal and template feature groups provided further classification accuracy improvements, presumably because these feature groups utilize the movement-related cortical potentials, which are noticeably different on the left versus right brain hemisphere for the different movements. This study shows that the cortical representation of the tongue is useful for extracting control signals for multi-class movement detection BCIs.
9
Mohammadi M, Knoche H, Thøgersen M, Bengtson SH, Gull MA, Bentsen B, Gaihede M, Severinsen KE, Andreasen Struijk LNS. Eyes-Free Tongue Gesture and Tongue Joystick Control of a Five DOF Upper-Limb Exoskeleton for Severely Disabled Individuals. Front Neurosci 2022; 15:739279. [PMID: 34975367] [PMCID: PMC8718615] [DOI: 10.3389/fnins.2021.739279]
Abstract
Spinal cord injury can leave the affected individual severely disabled with a low level of independence and quality of life. Assistive upper-limb exoskeletons are one of the solutions that can enable an individual with tetraplegia (paralysis in both arms and legs) to perform simple activities of daily living by mobilizing the arm. Providing an efficient user interface that offers full continuous control of such a device, safely and intuitively, with multiple degrees of freedom (DOFs) still remains a challenge. In this study, a control interface for an assistive upper-limb exoskeleton with five DOFs, based on an intraoral tongue-computer interface (ITCI) for individuals with tetraplegia, was proposed. Furthermore, we evaluated eyes-free use of the ITCI for the first time and compared two tongue-operated control methods, one based on tongue gestures and the other based on dynamic virtual buttons and joystick-like control. Ten able-bodied participants tongue-controlled the exoskeleton for a drinking task with and without visual feedback on a screen in three experimental sessions. As a baseline, the participants performed the drinking task with a standard gamepad. The results showed that it was possible to control the exoskeleton with the tongue even without visual feedback and to perform the drinking task at 65.1% of the speed of the gamepad. In a clinical case study, an individual with tetraplegia further succeeded in fully controlling the exoskeleton and performing the drinking task only 5.6% slower than the able-bodied group. This study demonstrated the first single-modal control interface that can enable individuals with complete tetraplegia to fully and continuously control a five-DOF upper limb exoskeleton and perform a drinking task after only 2 h of training. The interface was used both with and without visual feedback.
Affiliation(s)
- Mostafa Mohammadi
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Hendrik Knoche
- Human Machine Interaction, Department of Architecture, Design and Media Technology, Aalborg University, Aalborg, Denmark
- Mikkel Thøgersen
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Stefan Hein Bengtson
- Human Machine Interaction, Department of Architecture, Design and Media Technology, Aalborg University, Aalborg, Denmark
- Muhammad Ahsan Gull
- Department of Materials and Production, Aalborg University, Aalborg, Denmark
- Bo Bentsen
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Michael Gaihede
- Department of Clinical Medicine, Aalborg University, Aalborg, Denmark
- Lotte N S Andreasen Struijk
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
10
IMU-Based Hand Gesture Interface Implementing a Sequence-Matching Algorithm for the Control of Assistive Technologies. Signals 2021. [DOI: 10.3390/signals2040043]
Abstract
Assistive technologies (ATs) often have a high dimensionality of possible movements (e.g., an assistive robot with several degrees of freedom, or a computer), but users have to control them with low-dimensionality sensors and interfaces (e.g., switches). This paper presents the development of an open-source interface based on a sequence-matching algorithm for the control of ATs. Sequence matching allows the user to input several different commands with low-dimensionality sensors by recognizing not only their output but also their sequential pattern through time, similarly to Morse code. In this paper, the algorithm is applied to the recognition of hand gestures, inputted using an inertial measurement unit worn by the user. An SVM-based algorithm, designed to be robust with small training sets (e.g., five examples per class), was developed to recognize gestures in real time. Finally, the interface is applied to control a computer's mouse and keyboard. The interface was compared against (and combined with) the head movement-based AssystMouse software. The hand gesture interface showed encouraging results for this application but could also be used with other body parts (e.g., head and feet) and could control various ATs (e.g., assistive robotic arms and prostheses).
11
Jafar MR, Nagesh DS. Literature review on assistive devices available for quadriplegic people: Indian context. Disabil Rehabil Assist Technol 2021; 18:1-13. [PMID: 34176416] [DOI: 10.1080/17483107.2021.1938708]
Abstract
PURPOSE This literature review aims to establish the current state of the art in self-help devices (SHDs) available for people with quadriplegia. MATERIALS AND METHODS We searched original articles, technical and case studies, conference articles, and literature reviews published between 2014 and 2019 with the keywords ("Self-help devices" OR "Assistive Devices" OR "Assistive Product" OR "Assistive Technology") AND "Quadriplegia" in Science Direct, PubMed, the IEEE Xplore digital library, and Web of Science. RESULTS A total of 222 articles were found. After removing duplicates and screening the articles based on their titles and abstracts, 80 articles remained. We then reviewed the full texts and discarded articles unrelated to SHD development, or concerning patients who require mechanical ventilation or whose upper limb is functional (C2 or above and T2 or below injuries). After applying these exclusion criteria, 75 articles were used for further review. CONCLUSION The abandonment rate of SHDs currently reported in the literature is very high. The major requirements of the users were independence and improved quality of life. The situation in India is considerably worse than in developed countries: people with spinal cord injury in India are often uneducated and very poor, with an average income of 3000 ₹ (41$). They require SHDs and training specially designed for them, keeping their needs in mind. Implications for rehabilitation: People with quadriplegia are totally dependent on caregivers. Assistive devices not only help these people perform day-to-day tasks but also give them self-confidence. Even though many self-help devices are currently available, they still do not fulfil the requirements of people with quadriplegia; hence such devices have a very high abandonment rate. This study provides evidence that developing devices after understanding the functional and non-functional requirements of these subjects will decrease the abandonment rate and increase the effectiveness of the devices. The results of this study can be used for planning and developing assistive devices that are more focussed on fulfilling the requirements of people with quadriplegia.
Affiliation(s)
- Mohd Rizwan Jafar
- Department of Mechanical Engineering, Delhi Technological University, Delhi, India
- D S Nagesh
- Department of Mechanical Engineering, Delhi Technological University, Delhi, India
12
Udupa S, Kamat VR, Menassa CC. Shared autonomy in assistive mobile robots: a review. Disabil Rehabil Assist Technol 2021:1-22. [PMID: 34133906] [DOI: 10.1080/17483107.2021.1928778]
Abstract
PURPOSE Shared autonomy has played a major role in assistive mobile robotics, as it can effectively balance user satisfaction and smooth functioning of systems by adapting to each user's needs and preferences. Many shared control paradigms have been developed over the years. Despite these advancements, however, shared control paradigms have not been widely adopted, as several integral aspects have not fully matured. The purpose of this paper is to discuss and review various aspects of shared control and the technologies leading up to the current advancements in shared control for assistive mobile robots. METHODS A comprehensive review of the literature was conducted, following a dichotomy of studies from the pre-2000 and post-2000 periods to cover both the early developments and the current state of the art in this domain. RESULTS A systematic review of 135 research papers and 7 review papers selected from the literature was conducted. To organize the reviewed work, a 6-level ladder categorization was developed based on the extent of autonomy shared between the human and the robot in the use of assistive mobile robots. This taxonomy highlights the chronological improvements in the domain. CONCLUSION Most prior studies have focussed on basic functionalities, paving the way for research to now focus on the higher levels of the ladder taxonomy. Further research in the domain must focus on ensuring safety in mobility and adaptability to varying environments. Implications for rehabilitation: Shared autonomy in assistive mobile robots plays a vital role in adapting effectively to ensure safety while also considering user comfort. Users' immediate desires should be considered in decision making to ensure that the users remain in control of the assistive robots. The current focus of research should be the successful adaptation of assistive mobile robots to varying environments to assure the safety of the user.
Affiliation(s)
- Sumukha Udupa
- Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA; Robotics Institute, University of Michigan, Ann Arbor, MI, USA
- Vineet R Kamat
- Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA; Robotics Institute, University of Michigan, Ann Arbor, MI, USA
- Carol C Menassa
- Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA; Robotics Institute, University of Michigan, Ann Arbor, MI, USA
13
Abstract
We will increasingly depend on automation to support our manufacturing and daily living, and robots are likely to take an important place in this. Unfortunately, not all robots are currently accessible to all users: users with visual, hearing, motor or cognitive disabilities were not considered during the design, implementation or interaction phase, creating accessibility barriers for users with such limitations. This research presents a proposal for accessibility guidelines for human-robot interaction (HRI). The guidelines were evaluated by seventeen HRI designers and/or developers. A questionnaire of nine five-point Likert scale questions and six open-ended questions was developed to evaluate the proposed guidelines in terms of four main factors: usability, social acceptance, user experience and social impact; the questions act as indicators for each factor. The majority (15 of 17 participants) agreed that the guidelines help them design and implement accessible robot interfaces and applications. Some had considered a few ad hoc guidelines in their design practice, but none showed awareness of, or had applied, all the proposed guidelines; 72% of the proposed guidelines had each been applied by eight or fewer participants. Moreover, 16 of 17 participants would use the proposed guidelines in their future robot designs or evaluations. The participants emphasized the importance of aligning the proposed guidelines with safety requirements, the environment of interaction (indoor or outdoor), cost and users' expectations.
Collapse
|
14
|
AMiCUS 2.0-System Presentation and Demonstration of Adaptability to Personal Needs by the Example of an Individual with Progressed Multiple Sclerosis. SENSORS 2020; 20:s20041194. [PMID: 32098240 PMCID: PMC7070692 DOI: 10.3390/s20041194] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/08/2020] [Revised: 02/10/2020] [Accepted: 02/18/2020] [Indexed: 01/02/2023]
Abstract
AMiCUS is a human–robot interface that enables tetraplegics to control an assistive robotic arm in real time using only head motion, allowing them to perform simple manipulation tasks independently. The interface may be used as a standalone system or to provide direct control as part of a semi-autonomous system. In this work, we present our new gesture-free prototype, AMiCUS 2.0, which has been designed with special attention to accessibility and ergonomics. As such, AMiCUS 2.0 addresses the needs of tetraplegics with the additional impairments that may accompany multiple sclerosis. In an experimental setup, AMiCUS 1.0 and 2.0 are compared with each other, showing higher accessibility and usability for AMiCUS 2.0. Moreover, in an activity of daily living, a proof of concept is provided that an individual with progressed multiple sclerosis can operate the robotic arm within the temporal and functional scope required to perform direct-control tasks in a commercial semi-autonomous system. The results indicate that AMiCUS 2.0 takes an important step towards closing the gaps in assistive technology, being accessible to those who rely on such technology the most.
Collapse
|
15
|
Cio YSLK, Raison M, Leblond Menard C, Achiche S. Proof of Concept of an Assistive Robotic Arm Control Using Artificial Stereovision and Eye-Tracking. IEEE Trans Neural Syst Rehabil Eng 2019; 27:2344-2352. [PMID: 31675337 DOI: 10.1109/tnsre.2019.2950619] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Assistive robotic arms have become popular for helping users with upper-limb disabilities achieve autonomy in their daily tasks, such as drinking and grasping objects in general. Usually, these robotic arms are controlled with an adapted joystick. Joysticks are user-friendly for the general approach to an object; however, they are not as intuitive when an object must be approached accurately, especially when obstacles are present. Alternatively, the combined use of artificial stereovision and eye-tracking seems to be a promising solution, as the user's vision is usually dissociated from their upper-limb disability. Hence, the objective of this study was to develop a proof of concept for the control of an assistive robotic arm using a low-cost combination of stereovision and eye-tracking. Using the developed control system, a typically developed person was able to control the robotic arm, successfully reaching and grasping an object in 92% of trials without obstacles, with an average time of 13.8 seconds. A second set of trials with one obstacle had a success rate of 91% with an average time of 17.3 seconds, and a final set with two obstacles had a success rate of 98% with an average time of 18.4 seconds. Furthermore, the combined cost of the eye-tracker and stereovision hardware remains below $400.
Collapse
|
16
|
Hildebrand M, Bonde F, Kobborg RVN, Andersen C, Norman AF, Thogersen M, Bengtson SH, Dosen S, Struijk NSLA. Semi-Autonomous Tongue Control of an Assistive Robotic Arm for Individuals with Quadriplegia. IEEE Int Conf Rehabil Robot 2019; 2019:157-162. [PMID: 31374623 DOI: 10.1109/icorr.2019.8779457] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Individuals suffering from quadriplegia can achieve increased independence by using an assistive robotic manipulator (ARM). However, due to their disability, the interfaces that can be used to operate such devices are limited. A versatile intraoral tongue control interface (ITCI) has previously been developed for this user group, as the tongue is usually spared from disability. A previous study showed that the ITCI can provide direct and continuous control of 6-7 degrees of freedom (DoF) of an ARM, owing to its high number of inputs (18). In the present pilot study, we investigated whether semi-automation might further improve the efficiency of the ITCI when controlling an ARM. This was achieved by adding a camera to the end effector of the ARM and using computer vision algorithms to guide the ARM to grasp a target object. Three ITCI control schemes and one joystick control scheme were tested and compared: 1) manual Cartesian control with a base-frame reference point; 2) manual Cartesian control with an end-effector reference point; 3) manual Cartesian control with an end-effector reference point and an autonomous grasp function; and 4) regular JACO2 joystick control. The results indicated that end-effector control was superior to base-frame control in total task time, number of commands issued and path efficiency. The addition of the automatic grasp function did not improve performance, but resulted in fewer collisions/displacements of the target object when grasping.
Collapse
|
17
|
Mohammadi M, Knoche H, Gaihede M, Bentsen B, Andreasen Struijk LNS. A high-resolution tongue-based joystick to enable robot control for individuals with severe disabilities. IEEE Int Conf Rehabil Robot 2019; 2019:1043-1048. [PMID: 31374767 DOI: 10.1109/icorr.2019.8779434] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Assistive robotic arms have shown the potential to improve the quality of life of people with severe disabilities. However, these individuals still lack a high-performance, intuitive control interface for robots with 6-7 DoF. An inductive tongue computer interface (ITCI) was recently tested for robot control, and the study illustrated potential in this field. This paper investigates, through two studies, the possibility of developing a high-performance tongue-based joystick-like controller for robots. The first study compared different methods for mapping the 18 sensor signals to a 2D coordinate, as on a touchpad. The second evaluated the performance of a novel approach for emulating an analog joystick with the ITCI, based on the ISO 9241-411 standard. Two subjects performed a multi-directional tapping test using a standard analog joystick, using the ITCI held in one hand and operated by the other, and finally using the ITCI operated by tongue when mounted inside the mouth. Throughput was measured as the evaluation parameter. The results show that contact on the touchpads can be localized with approximately 1 mm accuracy. The effective throughput of the ITCI system in the multi-directional tapping test was 2.03 bps when held in the hand and 1.31 bps when used inside the mouth.
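The throughput figures in this abstract come from the ISO 9241-411 multi-directional tapping methodology, in which effective throughput is the mean over conditions of the effective index of difficulty divided by movement time. A minimal sketch of that calculation is below; the distances, widths, and times are hypothetical sample values, not data from the study.

```python
import math

def effective_throughput(distances, widths, movement_times):
    """Mean-of-means effective throughput (bits/s) per ISO 9241-411.

    distances: nominal target distances D per condition
    widths: effective target widths We (4.133 * SD of endpoint error), same units as D
    movement_times: mean movement times MT per condition, in seconds
    """
    tps = []
    for d, we, mt in zip(distances, widths, movement_times):
        ide = math.log2(d / we + 1.0)  # effective index of difficulty (bits)
        tps.append(ide / mt)           # throughput for this condition (bits/s)
    return sum(tps) / len(tps)

# Hypothetical tapping conditions: D and We in mm, MT in seconds
tp = effective_throughput([100, 200], [20, 25], [1.2, 1.6])
```

Computing We from the spread of actual endpoints (rather than using the nominal width) is what makes the measure comparable across devices with very different error profiles, such as a hand-held versus a tongue-operated joystick.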
Collapse
|
18
|
Kaseler RL, Leerskov K, Andreasen Struijk LNS, Dremstrup K, Jochumsen M. Designing a brain computer interface for control of an assistive robotic manipulator using steady state visually evoked potentials. IEEE Int Conf Rehabil Robot 2019; 2019:1067-1072. [PMID: 31374771 DOI: 10.1109/icorr.2019.8779376] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
An assistive robotic manipulator (ARM) can provide independence and improve the quality of life of patients suffering from tetraplegia. However, properly controlling such a device to a satisfactory level without any motor function requires a very high-performing brain-computer interface (BCI). BCIs based on steady-state visually evoked potentials (SSVEP) are among the best performing. This study therefore investigates the design of a system for full-workspace control of a 7-degrees-of-freedom ARM. An SSVEP signal is elicited by observing a visual stimulus flickering at a specific frequency and phase. This study investigates the best combination of unique frequencies and phases to provide a 16-target BCI by testing three different systems offline. Furthermore, a fourth system is developed to investigate the impact of the stimulation monitor's refresh rate. Experiments conducted on two subjects suggest that a 16-target BCI created from four unique frequencies and 16 unique phases provides the best performance. Subject 1 reached a maximum estimated ITR of 235 bits/min, while subject 2 reached 140 bits/min. The findings suggest that the optimal SSVEP stimuli for generating 16 targets use a low number of frequencies and a high number of unique phases. Moreover, the findings suggest no need to consider the monitor refresh rate if stimuli are modulated using a sinusoidal signal sampled at the refresh rate.
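The ITR values quoted in bits/min for an N-target BCI are conventionally estimated with the Wolpaw formula from the number of targets, the selection accuracy, and the time per selection. A minimal sketch of that estimate follows; the 95% accuracy and 1-second selection time in the example are hypothetical, not figures reported by the study.

```python
import math

def itr_bits_per_min(n_targets, accuracy, selection_time_s):
    """Wolpaw information transfer rate estimate for an N-target BCI."""
    n, p, t = n_targets, accuracy, selection_time_s
    bits = math.log2(n)  # perfect-accuracy bits per selection
    if 0.0 < p < 1.0:
        # penalty terms for misclassification, errors spread over n-1 wrong targets
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / t)

# Hypothetical example: 16 targets, 95% accuracy, 1 s per selection
itr = itr_bits_per_min(16, 0.95, 1.0)
```

The formula makes the trade-off in the abstract concrete: adding targets raises the log2(N) term, but only if accuracy and selection time do not degrade too much in return.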
Collapse
|
19
|
Orejuela-Zapata JF, Rodriguez S, Ramirez GL. Self-Help Devices for Quadriplegic Population: A Systematic Literature Review. IEEE Trans Neural Syst Rehabil Eng 2019; 27:692-701. [DOI: 10.1109/tnsre.2019.2901399] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
20
|
Bertomeu-Motos A, Ezquerro S, Barios JA, Lledó LD, Domingo S, Nann M, Martin S, Soekadar SR, Garcia-Aracil N. User activity recognition system to improve the performance of environmental control interfaces: a pilot study with patients. J Neuroeng Rehabil 2019; 16:10. [PMID: 30646915 PMCID: PMC6334466 DOI: 10.1186/s12984-018-0477-5] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2018] [Accepted: 11/18/2018] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Assistive technologies aim to increase quality of life and reduce dependence on caregivers and on the long-term care system. Several studies have demonstrated the effectiveness of assistive technology for environmental control and communication systems. Progress in brain-computer interface (BCI) research, together with exoskeletons, enables a person with motor impairment to interact with new elements in the environment. This paper aims to evaluate the environmental control interface (ECI) developed under the AIDE project, a multimodal interface able to analyze and extract relevant information from the environment as well as from the identification of residual abilities, behaviors and intentions of the user. METHODS This study evaluated the ECI in a simulated scenario using a two-screen layout: one screen with the ECI and the other with a simulated home environment developed for this purpose. Sensorimotor rhythms and horizontal oculoversion, acquired through BCI2000, a multipurpose standard BCI platform, were used to control the ECI online after user training and system calibration. Eight subjects with different neurological diseases and spinal cord injury participated in this study. The subjects performed simulated activities of daily living (ADLs), i.e. actions in the simulated environment such as drinking, switching on a lamp or raising the bed head, for ten minutes in each of two modes: AIDE mode, which uses a prediction model to recognize the user's intention and facilitate the scan, and Manual mode, without a prediction model. RESULTS The mean task time in AIDE mode was lower than in Manual mode, i.e. the users were able to perform more tasks in AIDE mode during the same time; the difference was statistically significant (p < 0.001). Regarding the steps, i.e. the number of abstraction levels crossed in the ECI to perform an ADL, the users performed one step in 90% of the tasks using AIDE mode, whereas at least three steps were necessary in Manual mode. The user's intention was predicted using conditional random fields (CRF), with a global accuracy of about 87%. CONCLUSIONS Environment analysis and the identification of the user's behaviors can be used to predict the user's intention, opening a new paradigm in the design of ECIs. Although the developed ECI was tested only in a simulated home environment, it can easily be adapted to a real environment, increasing the user's independence at home.
Collapse
Affiliation(s)
- Arturo Bertomeu-Motos
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, Elche, 03202 Spain
| | - Santiago Ezquerro
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, Elche, 03202 Spain
| | - Juan A. Barios
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, Elche, 03202 Spain
| | - Luis D. Lledó
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, Elche, 03202 Spain
| | - Sergio Domingo
- BJ Adaptaciones, St Mare de Déu del Coll, 70, Barcelona, 08023 Spain
| | - Marius Nann
- University Hospital of Tuebingen, Applied Neurotechnology Lab, Calwerstr. 14, Tübingen, D-72076 Germany
| | - Suzanne Martin
- The Cedar Foundation, 1 Upper Lisburn Road, Belfast, BT10 0GW UK
| | - Surjo R. Soekadar
- Clinical Neurotechnology Laboratory, Neuroscience Research Center (NWFZ), Charité University Medicine Berlin, Charitéplatz 1, Berlin, 10117 Germany
| | - Nicolas Garcia-Aracil
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, Elche, 03202 Spain
| |
Collapse
|
21
|
Chu FJ, Xu R, Zhang Z, Vela PA, Ghovanloo M. The Helping Hand: An Assistive Manipulation Framework Using Augmented Reality and Tongue-Drive Interfaces. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2018; 2018:2158-2161. [PMID: 30440831 DOI: 10.1109/embc.2018.8512668] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
Abstract
A human-in-the-loop system is proposed to enable collaborative manipulation tasks for persons with physical disabilities. Studies show that the subject's cognitive burden decreases as the autonomy of the assistive system increases. Our framework obtains high-level intent from the user to specify manipulation tasks. The system processes sensor input to interpret the user's environment. Augmented-reality glasses provide egocentric visual feedback of this interpretation and summarize robot affordances on a menu. A tongue-drive system serves as the input modality for triggering a robotic arm to execute the tasks. Assistance experiments compare the system to Cartesian control and to state-of-the-art approaches. Our system achieves competitive results with faster completion times by simplifying manipulation tasks.
Collapse
|
22
|
Fonseca L, Bo A, Guiraud D, Navarro B, Gelis A, Azevedo-Coste C. Investigating Upper Limb Movement Classification on Users with Tetraplegia as a Possible Neuroprosthesis Interface. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2018; 2018:5053-5056. [PMID: 30441476 DOI: 10.1109/embc.2018.8513418] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Spinal cord injury (SCI), stroke and other nervous-system conditions can result in partial or total paralysis of an individual's limbs. Numerous technologies have been proposed to assist neurorehabilitation or movement restoration, e.g. robotics or neuroprostheses. However, individuals with tetraplegia often find it difficult to pilot these devices. We developed a system based on a single inertial measurement unit located on the upper limb that classifies performed movements using principal component analysis. We analyzed three calibration algorithms: unsupervised learning, supervised learning and adaptive learning. Eight participants with tetraplegia (C4-C7) piloted three different postures of a robotic hand. We achieved 89% accuracy using the supervised learning algorithm. Through offline simulation, we found accuracies of 76% for unsupervised learning and 88% for adaptive learning.
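The abstract names PCA as the feature-reduction step but does not detail the classifier or calibration. A minimal sketch of one plausible pipeline is below, using synthetic IMU-like feature windows and a nearest-centroid rule for the supervised calibration; both the data and the centroid classifier are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical IMU feature windows (samples x 6 features) for two movement classes
class_a = rng.normal(0.0, 0.3, size=(40, 6)) + np.array([1, 0, 0, 0, 0, 0])
class_b = rng.normal(0.0, 0.3, size=(40, 6)) + np.array([0, 1, 0, 0, 0, 0])
X = np.vstack([class_a, class_b])
y = np.array([0] * 40 + [1] * 40)

# PCA via SVD on mean-centred data, keeping the top 2 components
mu = X.mean(axis=0)
_, _, vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ vt[:2].T

# Supervised calibration: one centroid per labelled class in PCA space
centroids = np.stack([Z[y == c].mean(axis=0) for c in (0, 1)])

def classify(sample):
    """Project a feature window into PCA space and pick the nearest centroid."""
    z = (sample - mu) @ vt[:2].T
    return int(np.argmin(np.linalg.norm(centroids - z, axis=1)))

acc = np.mean([classify(X[i]) == y[i] for i in range(len(y))])
```

An unsupervised variant of the calibration could cluster Z without labels (e.g. k-means) and an adaptive variant could update the centroids as new windows arrive, mirroring the three algorithms the study compares.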
Collapse
|
23
|
Fall CL, Quevillon F, Blouin M, Latour S, Campeau-Lecours A, Gosselin C, Gosselin B. A Multimodal Adaptive Wireless Control Interface for People With Upper-Body Disabilities. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2018; 12:564-575. [PMID: 29877820 DOI: 10.1109/tbcas.2018.2810256] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
This paper describes a multimodal body-machine interface (BoMI) to help individuals with upper-limb disabilities use advanced assistive technologies, such as robotic arms. The proposed system uses a wearable and wireless body sensor network (WBSN) supporting up to six sensor nodes to measure the natural upper-body gestures of the users and translate them into control commands. Natural gestures of the head and upper-body parts, as well as muscular activity, are measured with inertial measurement units (IMUs) and surface electromyography (sEMG) using custom-designed multimodal wireless sensor nodes. An IMU sensing node is attached to a headset worn by the user; it has a size of 2.9 cm × 2.9 cm, a maximum power consumption of 31 mW, and provides an angular precision of 1°. Multimodal patch sensor nodes, including both IMU and sEMG sensing modalities, are placed over the user's able body parts to measure motion and muscular activity. These nodes have a size of 2.5 cm × 4.0 cm and a maximum power consumption of 11 mW. The proposed BoMI runs on a Raspberry Pi. It can adapt to several types of users through different control scenarios using head and shoulder motion as well as muscular activity, and provides a power autonomy of up to 24 h. JACO, a 6-DoF assistive robotic arm, is used as a testbed to evaluate the performance of the proposed BoMI. Ten able-bodied subjects performed ADLs while operating the AT device, using the Test d'Évaluation des Membres Supérieurs de Personnes Âgées to evaluate and compare the proposed BoMI with the conventional joystick controller. It is shown that the users can perform all tasks with the proposed BoMI almost as fast as with the joystick controller, with only 30% time overhead on average, while being potentially more accessible to upper-body-disabled users who cannot use the conventional joystick controller. Tests show that control performance with the proposed BoMI improved by up to 17% on average after three trials.
Collapse
|