1
Santos Cardoso AS, Mohammadi M, Kaseler RL, Jochumsen M, Andreasen Struijk LNS. Assessing Mode-Switching Strategies for Assistive Robotic Manipulators Using a Preliminary Version of the Novel Non-invasive Tongue-Computer Interface. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. PMID: 38082858. DOI: 10.1109/embc40787.2023.10340946.
Abstract
The inductive tongue-computer interface allows individuals with tetraplegia to control assistive devices. However, controlling assistive robotic arms often requires more than 14 different commands, which cannot always fit into a single control layout. Previous studies have separated the commands into modes, but few have investigated strategies for switching between them. In this feasibility study, we compare the efficiency of switching modes using buttons, swipe gestures, and double taps with a preliminary version of a new non-invasive mouthpiece unit (nMPU), which includes an integrated activation unit and a single sensor board. Three participants controlled a JACO assistive robot to pick up a bottle using different mode-switching strategies. Compared with switching modes with buttons, switching with swipes and double taps increased the task completion time by 21% and 58%, respectively. Therefore, we recommend that configurations with multiple modes for the non-invasive tongue-computer interface include buttons for mode-switching.
Clinical relevance: Cumbersome mode-switching strategies can lower a control interface's responsiveness and contribute to end-user abandonment of assistive technologies. This study showed that using buttons to switch modes is more reliable. Moreover, this study will inform the development of future control layouts with improved usability.
2
de Almeida e Bueno L, Kwong MT, Bergmann JHM. Performance of Oral Cavity Sensors: A Systematic Review. Sensors (Basel) 2023; 23:588. PMID: 36679385. PMCID: PMC9862524. DOI: 10.3390/s23020588.
Abstract
Technological advancements are enabling new applications within biomedical engineering. As a connection point between the outer environment and the human system, the oral cavity offers unique opportunities for sensing technologies. This paper systematically reviews the performance of measurement systems tested in the human oral cavity. Performance was defined by metrics related to accuracy and agreement estimation. A comprehensive search identifying human studies that reported on the accuracy or agreement of intraoral sensors found 85 research papers. Most of the literature (62%) was in dentistry, followed by neurology (21%) and physical medicine and rehabilitation (12%). The remaining papers were on internal medicine, obstetrics, and aerospace medicine. Most of the studies applied force or pressure sensors (32%), while optical and image sensors were applied most widely across fields. The main challenges for future adoption include the lack of large human trials, the maturity of emerging technologies (e.g., biochemical sensors), and the absence of standardized evaluation in specific fields. Future research should employ robust performance metrics to evaluate these systems and incorporate real-world evidence into the evaluation process. Oral cavity sensors offer the potential for applications in healthcare and wellbeing, but for many technologies, more research is needed.
Affiliation(s)
- Man Ting Kwong
- Guy’s and St. Thomas’ NHS Foundation Trust, St. Thomas’ Hospital, Westminster Bridge Rd., London SE1 7EH, UK
3
Thøgersen MB, Mohammadi M, Gull MA, Bengtson SH, Kobbelgaard FV, Bentsen B, Khan BYA, Severinsen KE, Bai S, Bak T, Moeslund TB, Kanstrup AM, Andreasen Struijk LNS. User Based Development and Test of the EXOTIC Exoskeleton: Empowering Individuals with Tetraplegia Using a Compact, Versatile, 5-DoF Upper Limb Exoskeleton Controlled through Intelligent Semi-Automated Shared Tongue Control. Sensors (Basel) 2022; 22:6919. PMID: 36146260. PMCID: PMC9502221. DOI: 10.3390/s22186919.
Abstract
This paper presents the EXOTIC, a novel assistive upper limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons mainly focuses on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority, even though it poses important technical requirements. We considered all sources of design requirements, from basic technical functions to real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable, easy-to-don-and-doff exoskeleton capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer vision guidance system that can be enabled by the user when relevant; (3) a tongue control interface allowing full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests, the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia who needed a ventilator. The users confirmed the usability of the EXOTIC.
Affiliation(s)
- Mikkel Berg Thøgersen
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Mostafa Mohammadi
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Muhammad Ahsan Gull
- Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Stefan Hein Bengtson
- Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Bo Bentsen
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Benjamin Yamin Ali Khan
- Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Kåre Eg Severinsen
- Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Shaoping Bai
- Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Thomas Bak
- Department of Electronic Systems, Aalborg University, 9220 Aalborg, Denmark
- Thomas Baltzer Moeslund
- Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Lotte N. S. Andreasen Struijk
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
4
Kirtas O, Veltink P, Lontis R, Mohammadi M, Andreasen Struijk LNS. Development of inductive sensors for a robotic interface based on noninvasive tongue control. IEEE Int Conf Rehabil Robot 2022; 2022:1-6. PMID: 36176082. DOI: 10.1109/icorr55369.2022.9896548.
Abstract
Tongue-based robotic interfaces have shown the potential to control assistive robotic devices developed for individuals with severe disabilities due to spinal cord injury. However, current tongue-robot interfaces require invasive methods, such as piercing, to attach an activation unit (AU) to the tongue. A noninvasive tongue interface concept, which uses a frame-integrated AU instead of a tongue-attached AU, was previously proposed. However, compact one-piece sensor printed circuit boards (PCBs) are still needed to enable activation of all inductive sensors. In this study, we developed and tested four designs of compact one-piece sensor PCBs incorporating inductive sensors for a noninvasive tongue-robot interface. We measured the electrical parameters of the developed sensors to detect activation and compared them with a sensor from the current version of the inductive tongue-computer interface (ITCI) by moving AUs with different contact surfaces across the sensor surfaces. The results showed that the newly developed inductive sensors had higher and wider activation than the ITCI sensor, and that the AU with a flat contact surface produced 3.5-4 times higher activation than the AU with a spherical contact surface. Higher sensor activation can yield a higher signal-to-noise ratio and thus a higher AU tracking resolution.
5
Computer Vision-Based Adaptive Semi-Autonomous Control of an Upper Limb Exoskeleton for Individuals with Tetraplegia. Appl Sci (Basel) 2022. DOI: 10.3390/app12094374.
Abstract
We propose the use of computer vision for adaptive semi-autonomous control of an upper limb exoskeleton to assist users with severe tetraplegia and increase their independence and quality of life. A tongue-based interface was used together with the semi-autonomous control so that individuals with complete tetraplegia could use it despite being paralyzed from the neck down. The semi-autonomous control uses computer vision to detect nearby objects and estimate how to grasp them, assisting the user in controlling the exoskeleton. Three control schemes were tested: non-autonomous control (i.e., manual control using the tongue), semi-autonomous control with a fixed level of autonomy, and semi-autonomous control with a confidence-based adaptive level of autonomy. Studies with participants with and without tetraplegia were carried out. The control schemes were evaluated both in terms of performance, such as the time and number of commands needed to complete a given task, and in terms of user ratings. The studies showed a clear and significant improvement in both performance and user ratings with either of the semi-autonomous control schemes. The adaptive semi-autonomous control outperformed the fixed version in some scenarios, namely in the more complex tasks and with users who had more training with the system.
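The abstract does not specify how the confidence-based adaptive autonomy blends the user's tongue commands with the vision system's commands. As an illustration only, a minimal confidence-weighted blending sketch; the function name, the linear blending rule, and the clamp values are assumptions, not taken from the paper:

```python
import numpy as np

def blend_commands(user_cmd, auto_cmd, confidence, min_autonomy=0.0, max_autonomy=0.9):
    """Blend a manual (tongue) command with an autonomous (vision) command.

    The autonomy level scales with the vision system's detection confidence,
    clamped to [min_autonomy, max_autonomy] so the user always retains
    some direct control. (Illustrative rule only -- not the paper's scheme.)
    """
    alpha = float(np.clip(confidence, min_autonomy, max_autonomy))
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(auto_cmd)

# Low confidence: the blended command stays close to the user's input.
low = blend_commands([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], confidence=0.1)
# High confidence: the blended command leans toward the autonomous grasp plan,
# but max_autonomy keeps a share of manual control.
high = blend_commands([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], confidence=0.95)
```

A fixed level of autonomy, the second scheme in the study, corresponds to holding `alpha` constant instead of deriving it from the detection confidence.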
6
Kaeseler RL, Johansson TW, Struijk LNSA, Jochumsen M. Feature- and classification analysis for detection and classification of tongue movements from single-trial pre-movement EEG. IEEE Trans Neural Syst Rehabil Eng 2022; 30:678-687. PMID: 35290187. DOI: 10.1109/tnsre.2022.3157959.
Abstract
Individuals with severe tetraplegia can benefit from brain-computer interfaces (BCIs). While most movement-related BCI systems focus on right/left hand and/or foot movements, very few studies have considered tongue movements to construct a multiclass BCI. The aim of this study was to decode four movement directions of the tongue (left, right, up, and down) from single-trial pre-movement EEG and to provide a feature and classifier investigation. In offline analyses (from ten healthy participants), detection and classification were performed using temporal, spectral, entropy, and template features classified with linear discriminant analysis, support vector machine, random forest, or multilayer perceptron classifiers. Besides the 4-class classification scenario, all possible 3- and 2-class scenarios were tested to find the most discriminable movement types. Linear discriminant analysis achieved, on average, the highest classification accuracies for both movement detection and classification. The right and down tongue movements provided the highest and lowest detection accuracies (95.3±4.3% and 91.7±4.8%), respectively. The 4-class classification achieved an accuracy of 62.6±7.2%, while the best 3-class classification (left, right, and up) and 2-class classification (left and right) achieved accuracies of 75.6±8.4% and 87.7±8.0%, respectively. Using only a combination of the temporal and template feature groups further improved classification accuracy. Presumably, this is because these feature groups utilize the movement-related cortical potentials, which differ noticeably between the left and right brain hemispheres for the different movements. This study shows that the cortical representation of the tongue is useful for extracting control signals for multiclass movement-detection BCIs.
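The core of the pipeline above is multiclass classification of per-trial feature vectors with linear discriminant analysis. A minimal self-contained sketch on synthetic data: under equal priors and a shared identity covariance, LDA reduces to a nearest-class-mean rule, which is what is implemented here; the feature counts, trial counts, and class separation are all invented for illustration and do not reproduce the paper's data or exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for single-trial pre-movement EEG feature vectors:
# 200 trials x 40 features, four classes (left/right/up/down tongue movement).
n_trials, n_features, n_classes = 200, 40, 4
y = rng.integers(0, n_classes, size=n_trials)
X = rng.normal(size=(n_trials, n_features)) + 0.5 * y[:, None]  # class-dependent shift

# Split into training and test trials.
idx = np.arange(n_trials)
train, test = idx < 150, idx >= 150

# Nearest-class-mean rule: equivalent to LDA with equal priors and a
# shared identity covariance -- a simplification of the paper's classifier.
means = np.stack([X[train & (y == c)].mean(axis=0) for c in range(n_classes)])
dists = ((X[test, None, :] - means[None, :, :]) ** 2).sum(axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
```

A full LDA additionally estimates a shared covariance matrix from the training trials and classifies in the whitened space; the nearest-mean shortcut keeps the sketch dependency-free.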
7
Mohammadi M, Knoche H, Thøgersen M, Bengtson SH, Gull MA, Bentsen B, Gaihede M, Severinsen KE, Andreasen Struijk LNS. Eyes-Free Tongue Gesture and Tongue Joystick Control of a Five DOF Upper-Limb Exoskeleton for Severely Disabled Individuals. Front Neurosci 2022; 15:739279. PMID: 34975367. PMCID: PMC8718615. DOI: 10.3389/fnins.2021.739279.
Abstract
Spinal cord injury can leave the affected individual severely disabled, with a low level of independence and quality of life. Assistive upper-limb exoskeletons are one solution that can enable an individual with tetraplegia (paralysis of both arms and legs) to perform simple activities of daily living by mobilizing the arm. Providing an efficient user interface that offers full, continuous control of such a device, safely and intuitively, with multiple degrees of freedom (DOFs) still remains a challenge. In this study, a control interface for an assistive upper-limb exoskeleton with five DOFs based on an intraoral tongue-computer interface (ITCI) for individuals with tetraplegia was proposed. Furthermore, we evaluated eyes-free use of the ITCI for the first time and compared two tongue-operated control methods, one based on tongue gestures and the other based on dynamic virtual buttons and joystick-like control. Ten able-bodied participants tongue-controlled the exoskeleton in a drinking task with and without visual feedback on a screen in three experimental sessions. As a baseline, the participants performed the drinking task with a standard gamepad. The results showed that it was possible to control the exoskeleton with the tongue even without visual feedback and to perform the drinking task at 65.1% of the speed of the gamepad. In a clinical case study, an individual with tetraplegia further succeeded in fully controlling the exoskeleton and performing the drinking task only 5.6% slower than the able-bodied group. This study demonstrated the first single-modal control interface that enables individuals with complete tetraplegia to fully and continuously control a five-DOF upper-limb exoskeleton and perform a drinking task after only 2 h of training. The interface was used both with and without visual feedback.
Affiliation(s)
- Mostafa Mohammadi
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Hendrik Knoche
- Human Machine Interaction, Department of Architecture, Design and Media Technology, Aalborg University, Aalborg, Denmark
- Mikkel Thøgersen
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Stefan Hein Bengtson
- Human Machine Interaction, Department of Architecture, Design and Media Technology, Aalborg University, Aalborg, Denmark
- Muhammad Ahsan Gull
- Department of Materials and Production, Aalborg University, Aalborg, Denmark
- Bo Bentsen
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
- Michael Gaihede
- Department of Clinical Medicine, Aalborg University, Aalborg, Denmark
- Lotte N S Andreasen Struijk
- Neurorehabilitation Robotics and Engineering, Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, Aalborg, Denmark
8
Mohammadi M, Knoche H, Struijk LNSA. Continuous Tongue Robot Mapping for Paralyzed Individuals Improves the Functional Performance of Tongue-Based Robotic Assistance. IEEE Trans Biomed Eng 2021; 68:2552-2562. PMID: 33513095. DOI: 10.1109/tbme.2021.3055250.
Abstract
Individuals with tetraplegia face a challenging life due to a lack of independence and autonomy. Assistive robots have the potential to assist with activities of daily living and thus improve quality of life. However, an efficient and reliable control interface for severely disabled individuals is still missing. An intraoral tongue-computer interface (ITCI) for people with tetraplegia has previously been introduced and tested for controlling a robotic manipulator in a study deploying discrete tongue-robot mapping. To improve the efficiency of the interface, the current study proposed virtual buttons based on the ITCI and evaluated them in combination with a joystick-like control implementation, enabling continuous control commands. Twelve able-bodied volunteers participated in a three-day experiment. They controlled an assistive robotic manipulator with the tongue to perform two tasks: pouring water into a cup (PW) and picking up a roll of tape (PUT). Four different tongue-robot mapping methods were compared. The results showed that continuous commands reduced the task completion time by 16% and the number of commands in the PUT test by 20% compared with discrete commands. The highest success rates were 77.8% for the PUT test and 100% for the PW test, both achieved by the control methods with continuous commands. Thus, the study demonstrated that incorporating continuous commands can improve the performance of the ITCI system for controlling robotic manipulators.
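The abstract describes joystick-like continuous control but not the exact mapping. As an illustration only, a minimal sketch of one plausible joystick-style mapping from a tongue contact point to a continuous planar velocity command; the function name, dead-zone radius, and gain are assumptions, not values from the paper:

```python
import math

def joystick_velocity(contact_xy, center_xy, dead_zone=0.1, gain=0.05):
    """Map a tongue contact point on the sensor pad to a continuous
    planar velocity command, joystick-style.

    Displacement from the virtual joystick centre sets both direction
    and speed; contacts inside the dead zone produce no motion, which
    helps reject unintended tongue contacts. (Illustrative mapping only.)
    """
    dx = contact_xy[0] - center_xy[0]
    dy = contact_xy[1] - center_xy[1]
    if math.hypot(dx, dy) < dead_zone:
        return (0.0, 0.0)
    return (gain * dx, gain * dy)

# A contact offset from the centre yields a proportional velocity command;
# a contact near the centre yields no motion.
moving = joystick_velocity((0.5, 0.2), (0.0, 0.0))
idle = joystick_velocity((0.05, 0.0), (0.0, 0.0))
```

A discrete mapping, by contrast, would emit a fixed-step command per button activation regardless of where within the button the tongue lands, which is consistent with the completion-time difference the study reports.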