1
Penna MF, Giordano L, Tortora S, Astarita D, Amato L, Dell’Agnello F, Menegatti E, Gruppioni E, Vitiello N, Crea S, Trigili E. A muscle synergies-based controller to drive a powered upper-limb exoskeleton in reaching tasks. Wearable Technologies 2024; 5:e14. PMID: 39575326; PMCID: PMC11579892; DOI: 10.1017/wtc.2024.16.
Abstract
This work introduces a real-time intention decoding algorithm grounded in muscle synergies (Syn-ID). The algorithm detects the electromyographic (EMG) onset and infers the direction of the movement during reaching tasks to control a powered shoulder-elbow exoskeleton. Features related to muscle synergies are used in a Gaussian Mixture Model and probability accumulation-based logic to infer the user's movement direction. The performance of the algorithm was verified in a feasibility study including eight healthy participants. The experiments comprised a transparent session, during which the exoskeleton did not provide any assistance, and an assistive session in which the Syn-ID strategy was employed. Participants were asked to reach eight targets equally spaced on a circle of 25 cm radius (adjusted chance level: 18.1%). The results showed an average accuracy of 48.7% at 0.6 s after the EMG onset. Most estimation errors fell on directions adjacent to the actual one (type 1 error: 33.4%). Effects of the assistance were observed as a statistically significant reduction in the activation of Posterior Deltoid and Triceps Brachii. The final positions of the movements during the assistive session were on average 1.42 cm from the expected ones, both when the directions were estimated correctly and when type 1 errors occurred. Therefore, combining accurate estimates with type 1 errors, we computed a modified accuracy of 82.10±6.34%. Results were benchmarked against a purely kinematics-based approach. The Syn-ID showed better performance in the first portion of the movement (0.14 s after EMG onset).
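The decoder described above combines synergy-related EMG features, a Gaussian Mixture Model, and probability accumulation. The paper's actual features, model structure, and thresholds are not reproduced here; the snippet below is only a minimal sketch of that general idea, assuming precomputed synergy-activation feature vectors per EMG window, eight direction classes, and an invented accumulation threshold.

```python
# Minimal sketch (not the authors' implementation): class-conditional GMMs over
# synergy-activation features, with posterior accumulation across EMG windows.
import numpy as np
from sklearn.mixture import GaussianMixture

N_DIRECTIONS = 8          # eight reaching targets, as in the abstract
N_SYNERGY_FEATURES = 4    # illustrative feature dimension

def train_direction_gmms(features, labels, n_components=2):
    """Fit one GMM per direction on synergy-activation feature vectors."""
    models = []
    for d in range(N_DIRECTIONS):
        gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                              random_state=0)
        gmm.fit(features[labels == d])
        models.append(gmm)
    return models

def decode_direction(models, feature_stream, threshold=0.9):
    """Accumulate per-window posteriors until one direction dominates."""
    accumulated = np.zeros(N_DIRECTIONS)
    for window_features in feature_stream:            # one feature vector per EMG window
        loglik = np.array([m.score_samples(window_features[None, :])[0] for m in models])
        posterior = np.exp(loglik - loglik.max())
        posterior /= posterior.sum()                   # uniform prior over directions
        accumulated += posterior
        confidence = accumulated / accumulated.sum()
        if confidence.max() >= threshold:
            return int(confidence.argmax()), float(confidence.max())
    return int(accumulated.argmax()), float(accumulated.max() / accumulated.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic training data: one Gaussian cloud per direction.
    centers = rng.normal(size=(N_DIRECTIONS, N_SYNERGY_FEATURES)) * 3
    X = np.vstack([centers[d] + rng.normal(scale=0.3, size=(50, N_SYNERGY_FEATURES))
                   for d in range(N_DIRECTIONS)])
    y = np.repeat(np.arange(N_DIRECTIONS), 50)
    models = train_direction_gmms(X, y)
    stream = centers[3] + rng.normal(scale=0.3, size=(10, N_SYNERGY_FEATURES))
    print(decode_direction(models, stream))
```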
Affiliation(s)
- Michele Francesco Penna
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Luca Giordano
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Stefano Tortora
- Department of Information Engineering, University of Padova, Padova, Italy
- Padova Neuroscience Center, University of Padova, Padova, Italy
- Davide Astarita
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Lorenzo Amato
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Filippo Dell’Agnello
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Emanuele Menegatti
- Department of Information Engineering, University of Padova, Padova, Italy
- Padova Neuroscience Center, University of Padova, Padova, Italy
- Nicola Vitiello
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Simona Crea
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
- Emilio Trigili
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, Pisa, Italy
2
Manero A, Rivera V, Fu Q, Schwartzman JD, Prock-Gibbs H, Shah N, Gandhi D, White E, Crawford KE, Coathup MJ. Emerging Medical Technologies and Their Use in Bionic Repair and Human Augmentation. Bioengineering (Basel) 2024; 11:695. PMID: 39061777; PMCID: PMC11274085; DOI: 10.3390/bioengineering11070695.
Abstract
As both the proportion of older people and the length of life increase globally, a rise in age-related degenerative diseases, disability, and prolonged dependency is projected. However, more sophisticated biomedical materials, as well as an improved understanding of human disease, are forecast to revolutionize the diagnosis and treatment of conditions ranging from osteoarthritis to Alzheimer's disease, as well as impact disease prevention. Another, albeit quieter, revolution is also taking place within society: human augmentation. In this context, humans seek to improve themselves, metamorphosing through self-discipline or, more recently, through the use of emerging medical technologies, with the goal of transcending aging and mortality. In this review, and in the pursuit of improved medical care following aging, disease, disability, or injury, we first highlight cutting-edge and emerging materials-based neuroprosthetic technologies designed to restore limb or organ function. We highlight the potential for these technologies to be utilized to augment human performance beyond the range of natural performance. We discuss and explore the growing social movement of human augmentation and the idea that it is possible and desirable to use emerging technologies to push the boundaries of what it means to be a healthy human into the realm of superhuman performance and intelligence. This potential future capability is contrasted with limitations in right-to-repair legislation, which may create challenges for patients. Now is the time for continued discussion of the ethical strategies for research, implementation, and long-term device sustainability or repair.
Affiliation(s)
- Albert Manero
- Limbitless Solutions, University of Central Florida, 12703 Research Parkway, Suite 100, Orlando, FL 32826, USA (V.R.)
- Biionix Cluster, University of Central Florida, Orlando, FL 32827, USA; (Q.F.); (K.E.C.)
- Viviana Rivera
- Limbitless Solutions, University of Central Florida, 12703 Research Parkway, Suite 100, Orlando, FL 32826, USA (V.R.)
- Qiushi Fu
- Biionix Cluster, University of Central Florida, Orlando, FL 32827, USA; (Q.F.); (K.E.C.)
- Department of Mechanical and Aerospace Engineering, University of Central Florida, Orlando, FL 32816, USA
- Jonathan D. Schwartzman
- College of Medicine, University of Central Florida, Orlando, FL 32827, USA; (J.D.S.); (H.P.-G.); (N.S.); (D.G.); (E.W.)
- Hannah Prock-Gibbs
- College of Medicine, University of Central Florida, Orlando, FL 32827, USA; (J.D.S.); (H.P.-G.); (N.S.); (D.G.); (E.W.)
- Neel Shah
- College of Medicine, University of Central Florida, Orlando, FL 32827, USA; (J.D.S.); (H.P.-G.); (N.S.); (D.G.); (E.W.)
- Deep Gandhi
- College of Medicine, University of Central Florida, Orlando, FL 32827, USA; (J.D.S.); (H.P.-G.); (N.S.); (D.G.); (E.W.)
- Evan White
- College of Medicine, University of Central Florida, Orlando, FL 32827, USA; (J.D.S.); (H.P.-G.); (N.S.); (D.G.); (E.W.)
- Kaitlyn E. Crawford
- Biionix Cluster, University of Central Florida, Orlando, FL 32827, USA; (Q.F.); (K.E.C.)
- Department of Materials Science and Engineering, University of Central Florida, Orlando, FL 32816, USA
- Melanie J. Coathup
- Biionix Cluster, University of Central Florida, Orlando, FL 32827, USA; (Q.F.); (K.E.C.)
- College of Medicine, University of Central Florida, Orlando, FL 32827, USA; (J.D.S.); (H.P.-G.); (N.S.); (D.G.); (E.W.)
3
Lee J, Miri S, Bayro A, Kim M, Jeong H, Yeo WH. Biosignal-integrated robotic systems with emerging trends in visual interfaces: A systematic review. Biophysics Reviews 2024; 5:011301. PMID: 38510371; PMCID: PMC10903439; DOI: 10.1063/5.0185568.
Abstract
Human-machine interfaces (HMI) are currently a trendy and rapidly expanding area of research. Interestingly, the human user does not readily observe the interface between humans and machines. Instead, interactions between the machine and electrical signals from the user's body are obscured by complex control algorithms. The result is effectively a one-way street, wherein data is only transmitted from human to machine. Thus, a gap remains in the literature: how can information be effectively conveyed to the user to enable mutual understanding between humans and machines? Here, this paper reviews recent advancements in biosignal-integrated wearable robotics, with a particular emphasis on "visualization"-the presentation of relevant data, statistics, and visual feedback to the user. This review article covers various signals of interest, such as electroencephalograms and electromyograms, and explores novel sensor architectures and key materials. Recent developments in wearable robotics are examined from control and mechanical design perspectives. Additionally, we discuss current visualization methods and outline the field's future direction. While much of the HMI field focuses on biomedical and healthcare applications, such as rehabilitation of spinal cord injury and stroke patients, this paper also covers less common applications in manufacturing, defense, and other domains.
Affiliation(s)
- Sina Miri
- Department of Mechanical and Industrial Engineering, The University of Illinois at Chicago, Chicago, Illinois 60607, USA
- Allison Bayro
- School of Biological and Health Systems Engineering, Ira A. Fulton Schools of Engineering, Arizona State University, Tempe, Arizona 85287, USA
- Myunghee Kim
- Department of Mechanical and Industrial Engineering, The University of Illinois at Chicago, Chicago, Illinois 60607, USA
- Heejin Jeong
- Author to whom correspondence should be addressed
- Woon-Hong Yeo
- Author to whom correspondence should be addressed
4
Fischer-Janzen A, Wendt TM, Van Laerhoven K. A scoping review of gaze and eye tracking-based control methods for assistive robotic arms. Front Robot AI 2024; 11:1326670. PMID: 38440775; PMCID: PMC10909843; DOI: 10.3389/frobt.2024.1326670.
Abstract
Background: Assistive robotic arms (ARAs) are designed to assist physically disabled people with daily activities. Existing joysticks and head controls are not suitable for severely disabled people such as those with locked-in syndrome. Therefore, eye tracking control is part of ongoing research. The related literature spans many disciplines, creating a heterogeneous field that makes it difficult to gain an overview. Objectives: This work focuses on ARAs that are controlled by gaze and eye movements. By answering the research questions, this paper provides details on the design of the systems, a comparison of input modalities, methods for measuring the performance of these controls, and an outlook on research areas that have gained interest in recent years. Methods: This review was conducted as outlined in the PRISMA 2020 Statement. After identifying a wide range of approaches in use, the authors decided to use the PRISMA-ScR extension for a scoping review to present the results. The identification process was carried out by screening three databases. After the screening process, a snowball search was conducted. Results: 39 articles and 6 reviews were included in this article. Characteristics related to the system and study design were extracted and presented, divided into three groups based on the use of eye tracking. Conclusion: This paper aims to provide an overview for researchers new to the field by offering insight into eye tracking-based robot controllers. We have identified open questions that need to be answered in order to provide people with severe motor function loss with systems that are highly usable and accessible.
Affiliation(s)
- Anke Fischer-Janzen
- Faculty Economy, Work-Life Robotics Institute, University of Applied Sciences Offenburg, Offenburg, Germany
- Thomas M. Wendt
- Faculty Economy, Work-Life Robotics Institute, University of Applied Sciences Offenburg, Offenburg, Germany
- Kristof Van Laerhoven
- Ubiquitous Computing, Department of Electrical Engineering and Computer Science, University of Siegen, Siegen, Germany
5
Bates M, Sunderam S. Hand-worn devices for assessment and rehabilitation of motor function and their potential use in BCI protocols: a review. Front Hum Neurosci 2023; 17:1121481. PMID: 37484920; PMCID: PMC10357516; DOI: 10.3389/fnhum.2023.1121481.
Abstract
Introduction Various neurological conditions can impair hand function. Affected individuals cannot fully participate in activities of daily living due to the lack of fine motor control. Neurorehabilitation emphasizes repetitive movement and subjective clinical assessments that require clinical experience to administer. Methods Here, we perform a review of literature focused on the use of hand-worn devices for rehabilitation and assessment of hand function. We paid particular attention to protocols that involve brain-computer interfaces (BCIs) since BCIs are gaining ground as a means for detecting volitional signals as the basis for interactive motor training protocols to augment recovery. All devices reviewed either monitor, assist, stimulate, or support hand and finger movement. Results A majority of studies reviewed here test or validate devices through clinical trials, especially for stroke. Even though sensor gloves are the most commonly employed type of device in this domain, they have certain limitations. Many such gloves use bend or inertial sensors to monitor the movement of individual digits, but few monitor both movement and applied pressure. The use of such devices in BCI protocols is also uncommon. Discussion We conclude that hand-worn devices that monitor both flexion and grip will benefit both clinical diagnostic assessment of function during treatment and closed-loop BCI protocols aimed at rehabilitation.
Affiliation(s)
- Madison Bates
- Neural Systems Lab, F. Joseph Halcomb III, M.D. Department of Biomedical Engineering, University of Kentucky, Lexington, KY, United States
6
Catalán JM, Trigili E, Nann M, Blanco-Ivorra A, Lauretti C, Cordella F, Ivorra E, Armstrong E, Crea S, Alcañiz M, Zollo L, Soekadar SR, Vitiello N, García-Aracil N. Hybrid brain/neural interface and autonomous vision-guided whole-arm exoskeleton control to perform activities of daily living (ADLs). J Neuroeng Rehabil 2023; 20:61. PMID: 37149621; PMCID: PMC10164333; DOI: 10.1186/s12984-023-01185-w.
Abstract
BACKGROUND The aging of the population and the progressive increase of life expectancy in developed countries are leading to a high incidence of age-related cerebrovascular diseases, which affect people's motor and cognitive capabilities and might result in the loss of arm and hand functions. Such conditions have a detrimental impact on people's quality of life. Assistive robots have been developed to help people with motor or cognitive disabilities to perform activities of daily living (ADLs) independently. Most of the robotic systems proposed in the state of the art for assisting with ADLs are external manipulators and exoskeletal devices. The main objective of this study is to compare the performance of a hybrid EEG/EOG interface to perform ADLs when the user is controlling an exoskeleton rather than using an external manipulator. METHODS Ten impaired participants (5 males and 5 females, mean age 52 ± 16 years) were instructed to use both systems to perform a drinking task and a pouring task comprising multiple subtasks. For each device, two modes of operation were studied: synchronous mode (the user received a visual cue indicating the sub-tasks to be performed at each time) and asynchronous mode (the user started and finished each of the sub-tasks independently). Fluent control was assumed when the time for successful initializations remained below 3 s, and reliable control when it remained below 5 s. The NASA-TLX questionnaire was used to evaluate the task workload. For the trials involving the use of the exoskeleton, a custom Likert-scale questionnaire was used to evaluate the user's experience in terms of perceived comfort, safety, and reliability. RESULTS All participants were able to control both systems fluently and reliably. However, results suggest better performance of the exoskeleton over the external manipulator (75% of successful initializations remained below 3 s with the exoskeleton and below 5 s with the external manipulator). CONCLUSIONS Although the results of our study in terms of fluency and reliability of EEG control suggest better performance of the exoskeleton over the external manipulator, such results cannot be considered conclusive, due to the heterogeneity of the population under test and the relatively limited number of participants.
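The fluency and reliability criteria above (initialization times below 3 s and 5 s, respectively) amount to simple threshold counts over the measured initialization times. The short sketch below illustrates that computation on made-up values; the function name and example data are not from the study.

```python
# Sketch of the fluency/reliability criteria described above: fraction of
# successful initializations whose "time to initialize" (TTI) stays below 3 s / 5 s.
import numpy as np

def initialization_rates(tti_seconds, fluent_s=3.0, reliable_s=5.0):
    tti = np.asarray(tti_seconds, dtype=float)
    return {
        "fluent_rate": float(np.mean(tti < fluent_s)),     # share of TTIs under 3 s
        "reliable_rate": float(np.mean(tti < reliable_s)), # share of TTIs under 5 s
    }

# Illustrative TTI values (seconds), not measured data.
print(initialization_rates([1.8, 2.4, 2.9, 3.6, 4.2, 2.1]))
```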
Affiliation(s)
- José M Catalán
- Robotics and Artificial Intelligence Group of the Bioengineering Institute, Miguel Hernandez University, 03202, Elche, Spain.
- Emilio Trigili
- BioRobotics Institute, Scuola Superiore Sant'Anna, 56025, Pontedera, Italy.
- Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy.
- Marius Nann
- Clinical Neurotechnology Laboratory, Charité - Universitätsmedizin Berlin, 10117 Berlin, Germany
- Andrea Blanco-Ivorra
- Robotics and Artificial Intelligence Group of the Bioengineering Institute, Miguel Hernandez University, 03202, Elche, Spain
- Clemente Lauretti
- Laboratory of Biomedical Robotics and Biomicrosystems, Università Campus Bio-Medico di Roma, 00128, Rome, Italy
- Francesca Cordella
- Laboratory of Biomedical Robotics and Biomicrosystems, Università Campus Bio-Medico di Roma, 00128, Rome, Italy
- Eugenio Ivorra
- University Institute for Human-Centered Technology Research (Human-Tech), Universitat Politècnica de València, 46022, Valencia, Spain
- Simona Crea
- BioRobotics Institute, Scuola Superiore Sant'Anna, 56025, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy
- IRCCS, Fondazione Don Carlo Gnocchi, Milan, Italy
- Mariano Alcañiz
- University Institute for Human-Centered Technology Research (Human-Tech), Universitat Politècnica de València, 46022, Valencia, Spain
- Loredana Zollo
- Laboratory of Biomedical Robotics and Biomicrosystems, Università Campus Bio-Medico di Roma, 00128, Rome, Italy
- Surjo R Soekadar
- Clinical Neurotechnology Laboratory, Charité - Universitätsmedizin Berlin, 10117 Berlin, Germany
- Nicola Vitiello
- BioRobotics Institute, Scuola Superiore Sant'Anna, 56025, Pontedera, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, Pisa, Italy
- IRCCS, Fondazione Don Carlo Gnocchi, Milan, Italy
- Nicolás García-Aracil
- Robotics and Artificial Intelligence Group of the Bioengineering Institute, Miguel Hernandez University, 03202, Elche, Spain
7
Zhang Z, Li D, Zhao Y, Fan Z, Xiang J, Wang X, Cui X. A flexible speller based on time-space frequency conversion SSVEP stimulation paradigm under dry electrode. Front Comput Neurosci 2023; 17:1101726. PMID: 36817318; PMCID: PMC9929550; DOI: 10.3389/fncom.2023.1101726.
Abstract
Introduction The speller is a representative application for demonstrating the performance of a brain-computer interface (BCI) paradigm. Due to its advantages of short analysis time and high accuracy, the SSVEP paradigm has been widely used in wet-electrode BCI speller systems. However, wet-electrode operation is cumbersome and gives subjects a poor experience. In addition, in asynchronous SSVEP systems based on threshold analysis, the stimuli flicker continuously from the beginning to the end of the experiment, which leads to visual fatigue. Dry electrodes are simple to operate and provide a comfortable experience for subjects. An EOG signal can be used to avoid prolonged SSVEP stimulation, thus reducing fatigue. Methods This study first designed a brain-controlled switch based on a continuous-blink EOG signal and the SSVEP signal to improve the flexibility of the BCI speller. Second, in order to increase the number of speller instructions, we designed the time-space frequency conversion (TSFC) SSVEP stimulus paradigm by constantly changing the time and space frequency of SSVEP sub-stimulus blocks, and designed a speller in a dry-electrode environment. Results Seven subjects participated and completed the experiments. The results showed that the accuracy of the brain-controlled switch designed in this study was up to 94.64%, and all the subjects could use the speller flexibly. The designed 60-character speller based on the TSFC-SSVEP stimulus paradigm achieved an accuracy of 90.18% and an information transfer rate (ITR) of 117.05 bits/min. All subjects could output the specified characters in a short time. Discussion This study designed and implemented a multi-instruction SSVEP speller based on dry electrodes. Through the combination of EOG and SSVEP signals, the speller can be flexibly controlled. The frequency of each SSVEP stimulation sub-block is recoded in time and space by the TSFC-SSVEP stimulation paradigm, which greatly increases the number of output instructions of the BCI system in a dry-electrode environment. This work only uses the FBCCA algorithm to test the stimulus paradigm, which requires a long stimulation time. In the future, we will use trained algorithms to study the stimulus paradigm and improve its overall performance.
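The reported 117.05 bits/min is consistent with the standard Wolpaw information transfer rate formula applied to a 60-target speller at 90.18% accuracy. The sketch below implements that textbook formula; the 2.5 s selection time used in the example is an assumed value for illustration, not a figure taken from the paper.

```python
# Standard Wolpaw ITR: bits per selection scaled to bits per minute.
import math

def wolpaw_itr_bits_per_min(n_targets, accuracy, selection_time_s):
    """ITR = [log2 N + P log2 P + (1-P) log2((1-P)/(N-1))] * 60 / T."""
    p, n = accuracy, n_targets
    bits = math.log2(n)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / selection_time_s

# 60-character speller at 90.18% accuracy; the 2.5 s per selection is an
# assumed value used only to illustrate the formula.
print(round(wolpaw_itr_bits_per_min(60, 0.9018, 2.5), 2))
```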
8
Bibliometric analysis on Brain-computer interfaces in a 30-year period. Applied Intelligence 2022. DOI: 10.1007/s10489-022-04226-4.
9
Thøgersen MB, Mohammadi M, Gull MA, Bengtson SH, Kobbelgaard FV, Bentsen B, Khan BYA, Severinsen KE, Bai S, Bak T, Moeslund TB, Kanstrup AM, Andreasen Struijk LNS. User Based Development and Test of the EXOTIC Exoskeleton: Empowering Individuals with Tetraplegia Using a Compact, Versatile, 5-DoF Upper Limb Exoskeleton Controlled through Intelligent Semi-Automated Shared Tongue Control. Sensors (Basel, Switzerland) 2022; 22:6919. PMID: 36146260; PMCID: PMC9502221; DOI: 10.3390/s22186919.
Abstract
This paper presents the EXOTIC, a novel assistive upper limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons mainly focuses on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority, even though it poses important technical requirements. We considered all sources of design requirements, from the basic technical functions to the real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable, easy to don and doff exoskeleton capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer vision guidance system that can be enabled by the user when relevant; (3) a tongue control interface allowing for full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests, the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia and the need for a ventilator. The users confirmed the usability of the EXOTIC.
Affiliation(s)
- Mikkel Berg Thøgersen
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Mostafa Mohammadi
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Muhammad Ahsan Gull
- Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Stefan Hein Bengtson
- Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Bo Bentsen
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
- Benjamin Yamin Ali Khan
- Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Kåre Eg Severinsen
- Spinal Cord Injury Centre of Western Denmark, Viborg Regional Hospital, 8800 Viborg, Denmark
- Shaoping Bai
- Department of Materials and Production Technology, Aalborg University, 9220 Aalborg, Denmark
- Thomas Bak
- Department of Electronic Systems, Aalborg University, 9220 Aalborg, Denmark
- Thomas Baltzer Moeslund
- Visual Analysis and Perception (VAP) Lab, Department of Architecture, Design, and Media Technology, Aalborg University, 9000 Aalborg, Denmark
- Lotte N. S. Andreasen Struijk
- Center for Rehabilitation Robotics, Department of Health Science and Technology, Aalborg University, 9220 Aalborg, Denmark
10
Struijk LNSA, Kanstrup AM, Bai S, Bak T, Thogersen MB, Mohammadi M, Bengtson SH, Kobbelgaard FV, Gull MA, Bentsen B, Severinsen KE, Kasch H, Moeslund TB. The impact of interdisciplinarity and user involvement on the design and usability of an assistive upper limb exoskeleton - a case study on the EXOTIC. IEEE Int Conf Rehabil Robot 2022; 2022:1-5. PMID: 36176141; DOI: 10.1109/icorr55369.2022.9896500.
Abstract
This study describes an interdisciplinary approach to develop a 5-degrees-of-freedom assistive upper limb exoskeleton (ULE) for users with severe to complete functional tetraplegia. Four different application levels were identified for the ULE, ranging from basic technical application to interaction with users, interaction with caregivers, and interaction with society, each level posing requirements for the design and functionality of the ULE. These requirements were addressed through an interdisciplinary collaboration involving users, clinicians, and researchers within social sciences and humanities, mechanical engineering, control engineering, media technology, and biomedical engineering. The results showed that the developed ULE, the EXOTIC, had a high level of usability, safety, and adoptability. Further, the results showed that several topics are important to explicitly address in relation to the facilitation of interdisciplinary collaboration, including defining a common language, a joint visualization of the end goal, and a physical frame for the collaboration, such as a shared laboratory. The study underlined the importance of interdisciplinarity, and we believe that future collaboration amongst interdisciplinary researchers and centres, including at an international level, can strongly facilitate the usefulness and adoption of assistive exoskeletons and similar technologies.
11
Zheng L, Feng W, Ma Y, Lian P, Xiao Y, Yi Z, Wu X. Ensemble learning method based on temporal, spatial features with multi-scale filter banks for motor imagery EEG classification. Biomed Signal Process Control 2022. DOI: 10.1016/j.bspc.2022.103634.
12
Belkhiria C, Boudir A, Hurter C, Peysakhovich V. EOG-Based Human-Computer Interface: 2000-2020 Review. Sensors (Basel, Switzerland) 2022; 22:4914. PMID: 35808414; PMCID: PMC9269776; DOI: 10.3390/s22134914.
Abstract
Electro-oculography (EOG)-based brain-computer interface (BCI) is a relevant technology influencing physical medicine, daily life, gaming and even the aeronautics field. EOG-based BCI systems record activity related to users' intention, perception and motor decisions. It converts the bio-physiological signals into commands for external hardware, and it executes the operation expected by the user through the output device. EOG signal is used for identifying and classifying eye movements through active or passive interaction. Both types of interaction have the potential for controlling the output device by performing the user's communication with the environment. In the aeronautical field, investigations of EOG-BCI systems are being explored as a relevant tool to replace the manual command and as a communicative tool dedicated to accelerating the user's intention. This paper reviews the last two decades of EOG-based BCI studies and provides a structured design space with a large set of representative papers. Our purpose is to introduce the existing BCI systems based on EOG signals and to inspire the design of new ones. First, we highlight the basic components of EOG-based BCI studies, including EOG signal acquisition, EOG device particularity, extracted features, translation algorithms, and interaction commands. Second, we provide an overview of EOG-based BCI applications in the real and virtual environment along with the aeronautical application. We conclude with a discussion of the actual limits of EOG devices regarding existing systems. Finally, we provide suggestions to gain insight for future design inquiries.
Affiliation(s)
- Chama Belkhiria
- ISAE-SUPAERO, Université de Toulouse, 31400 Toulouse, France;
- Atlal Boudir
- ENAC, Université de Toulouse, 31400 Toulouse, France; (A.B.); (C.H.)
- Christophe Hurter
- ENAC, Université de Toulouse, 31400 Toulouse, France; (A.B.); (C.H.)
13
Fazli E, Rakhtala SM, Mirrashid N, Karimi HR. Real-time implementation of a super twisting control algorithm for an upper limb wearable robot. Mechatronics 2022; 84:102808. DOI: 10.1016/j.mechatronics.2022.102808.
14
Computer Vision-Based Adaptive Semi-Autonomous Control of an Upper Limb Exoskeleton for Individuals with Tetraplegia. Applied Sciences (Basel) 2022. DOI: 10.3390/app12094374.
Abstract
We propose the use of computer vision for adaptive semi-autonomous control of an upper limb exoskeleton for assisting users with severe tetraplegia to increase independence and quality of life. A tongue-based interface was used together with the semi-autonomous control such that individuals with complete tetraplegia were able to use it despite being paralyzed from the neck down. The semi-autonomous control uses computer vision to detect nearby objects and estimate how to grasp them to assist the user in controlling the exoskeleton. Three control schemes were tested: non-autonomous (i.e., manual control using the tongue) control, semi-autonomous control with a fixed level of autonomy, and semi-autonomous control with a confidence-based adaptive level of autonomy. Studies on experimental participants with and without tetraplegia were carried out. The control schemes were evaluated both in terms of performance, such as the time and number of commands needed to complete a given task, and in terms of ratings from the users. The studies showed a clear and significant improvement in both performance and user ratings when using either of the semi-autonomous control schemes. The adaptive semi-autonomous control outperformed the fixed version in some scenarios, namely, in the more complex tasks and with users with more training in using the system.
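The adaptive scheme above varies the level of autonomy with the confidence of the vision-based grasp estimate. The exact rule used by the authors is not given here, so the snippet is a minimal sketch of one plausible confidence-to-autonomy blending, with all thresholds and names invented for illustration.

```python
# Sketch of confidence-based shared control: blend the user's tongue command with
# the vision system's suggested motion, weighting by detection confidence.
# The blending rule and thresholds are illustrative, not the authors' controller.
import numpy as np

def blended_velocity(user_cmd, vision_cmd, confidence,
                     min_conf=0.3, max_conf=0.9):
    """Return an end-effector velocity command.

    confidence -> autonomy level alpha in [0, 1]: below min_conf the user is
    fully in control, above max_conf the vision guidance dominates, with
    linear interpolation in between.
    """
    alpha = np.clip((confidence - min_conf) / (max_conf - min_conf), 0.0, 1.0)
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(vision_cmd)

# Example: weak object detection -> command stays close to the user's input.
print(blended_velocity([0.05, 0.0, 0.0], [0.02, 0.03, -0.01], confidence=0.4))
```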
15
Lanotte F, McKinney Z, Grazi L, Chen B, Crea S, Vitiello N. Adaptive Control Method for Dynamic Synchronization of Wearable Robotic Assistance to Discrete Movements: Validation for Use Case of Lifting Tasks. IEEE Transactions on Robotics 2021. DOI: 10.1109/tro.2021.3073836.
16
Xie C, Yang Q, Huang Y, Su S, Xu T, Song R. A Hybrid Arm-Hand Rehabilitation Robot With EMG-Based Admittance Controller. IEEE Transactions on Biomedical Circuits and Systems 2021; 15:1332-1342. PMID: 34813476; DOI: 10.1109/tbcas.2021.3130090.
Abstract
Reach-and-grasp is one of the most fundamental activities in daily life, while few rehabilitation robots provide integrated and active training of the arm and hand for patients after stroke to improve their mobility. In this study, a novel hybrid arm-hand rehabilitation robot (HAHRR) was built for the reach-and-grasp task. This hybrid structure consisted of a cable-driven module for three-dimensional arm motion and an exoskeleton for hand motion, which enabled assistance of the arm and hand simultaneously. To implement active compliance control, an EMG-based admittance controller was applied to the HAHRR. Experimental results showed that the HAHRR with the EMG-based admittance controller could not only assist the subject in fulfilling the reach-and-grasp task, but also generate smoother trajectories compared with the force-sensing-based admittance controller. These findings also suggested that the proposed approach might be applicable to post-stroke arm-hand rehabilitation training.
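An admittance controller maps an interaction input to a reference motion through a virtual mass-damper model; in the EMG-based variant described above, the driving term comes from muscle activity rather than a force sensor. The discrete-time sketch below shows that general structure only; the parameters and the EMG-to-effort mapping are illustrative assumptions, not the HAHRR controller.

```python
# Minimal discrete-time admittance sketch: a virtual mass-damper turns a drive
# signal (here, an EMG-derived effort estimate) into a reference velocity.
# Parameters and the EMG-to-effort mapping are illustrative assumptions.
import numpy as np

class AdmittanceController:
    def __init__(self, virtual_mass=2.0, virtual_damping=8.0, dt=0.01):
        self.m, self.b, self.dt = virtual_mass, virtual_damping, dt
        self.velocity = 0.0

    def step(self, drive_force):
        """Integrate M*dv/dt + B*v = F for one control period."""
        acc = (drive_force - self.b * self.velocity) / self.m
        self.velocity += acc * self.dt
        return self.velocity

def emg_effort(emg_window, gain=30.0):
    """Map a rectified EMG window to a virtual driving force (illustrative)."""
    return gain * float(np.mean(np.abs(emg_window)))

controller = AdmittanceController()
rng = np.random.default_rng(0)
for _ in range(100):                      # 1 s of control at 100 Hz
    v_ref = controller.step(emg_effort(rng.normal(scale=0.2, size=20)))
print(round(v_ref, 4))
```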
17
Barios JA, Ezquerro S, Bertomeu-Motos A, Catalan JM, Sanchez-Aparicio JM, Donis-Barber L, Fernandez E, Garcia-Aracil N. Movement-Related EEG Oscillations of Contralesional Hemisphere Discloses Compensation Mechanisms of Severely Affected Motor Chronic Stroke Patients. Int J Neural Syst 2021; 31:2150053. PMID: 34719347; DOI: 10.1142/s0129065721500532.
Abstract
Conventional rehabilitation strategies for stroke survivors become difficult when voluntary movements are severely disturbed. Combining passive limb mobilization, robotic devices, and EEG-based brain-computer interface (BCI) systems might improve treatment and clinical follow-up of these patients, but detailed knowledge of the neurophysiological mechanisms involved in functional recovery, which might help in tailoring stroke treatment strategies, is lacking. Movement-related EEG changes (EEG event-related desynchronization (ERD) in the μ and β bands, an indicator of motor cortex activation traditionally used for BCI systems) were evaluated in a group of 23 paralyzed chronic stroke patients in two unilateral motor tasks alternating paretic and healthy hands ((i) passive movement, using a hand exoskeleton, and (ii) voluntary movement), and compared to nine healthy subjects. In tasks using the unaffected hand, we observed an increase of contralesional hemisphere activation for the stroke patient group. Unexpectedly, when using the paralyzed hand, motor cortex activation was reduced or absent in the severely affected group of patients, while patients with moderate motor deficit showed an activation greater than the control group. Cortical activation was reduced or absent in the damaged hemisphere of all the patients in both tasks. Significant differences related to the severity of motor deficit were found in the time course of the μ and β band power ratio in the EEG of the contralesional hemisphere while moving the affected hand. These findings suggest the presence of different compensation mechanisms in the contralesional hemisphere of stroke patients related to the grade of motor disability, which might turn quantitative EEG during a movement task, obtained from a BCI system controlling a robotic device included in a rehabilitation task, into a valuable tool for monitoring clinical progression, evaluating recovery, and tailoring treatment of stroke patients.
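Event-related desynchronization is conventionally quantified as the relative band-power decrease of a movement period with respect to a pre-movement baseline. The sketch below shows this standard ERD% computation; the band limits, window lengths, and synthetic data are generic choices, not the study's pipeline.

```python
# Standard ERD% computation: relative band-power change of a movement period with
# respect to a pre-movement baseline. Generic sketch, not the study's pipeline.
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(x, fs, band):
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, x)
    return np.mean(filtered ** 2)

def erd_percent(epoch, fs, band=(8.0, 13.0), baseline_s=1.0):
    """ERD% = (A - R) / R * 100, with R taken from the first `baseline_s` seconds."""
    n_base = int(baseline_s * fs)
    reference = band_power(epoch[:n_base], fs, band)   # pre-movement baseline power
    active = band_power(epoch[n_base:], fs, band)      # movement-period power
    return 100.0 * (active - reference) / reference

# Synthetic single-channel epoch: 1 s baseline + 2 s of attenuated mu rhythm.
fs, rng = 250, np.random.default_rng(0)
t_base, t_move = np.arange(fs) / fs, np.arange(2 * fs) / fs
epoch = np.concatenate([np.sin(2 * np.pi * 10 * t_base) + 0.1 * rng.normal(size=fs),
                        0.4 * np.sin(2 * np.pi * 10 * t_move) + 0.1 * rng.normal(size=2 * fs)])
print(round(erd_percent(epoch, fs), 1))   # negative value = desynchronization
```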
Affiliation(s)
- Juan A Barios
- Biomedical Neuroengineering Research Group (nBio), Miguel Hernández University, Avda. de la Universidad s/n, 03202 Elche, Spain
- Laboratory for New Technologies in Neurorehabilitation, Fundación Instituto San Jose, Pinar San Jose s/n, 28003 Madrid, Spain
- Santiago Ezquerro
- Biomedical Neuroengineering Research Group (nBio), Miguel Hernández University, Avda. de la Universidad s/n, 03202 Elche, Spain
- Laboratory for New Technologies in Neurorehabilitation, Fundación Instituto San Jose, Pinar San Jose s/n, 28003 Madrid, Spain
- Arturo Bertomeu-Motos
- Biomedical Neuroengineering Research Group (nBio), Miguel Hernández University, Avda. de la Universidad s/n, 03202 Elche, Spain
- Jose M Catalan
- Biomedical Neuroengineering Research Group (nBio), Miguel Hernández University, Avda. de la Universidad s/n, 03202 Elche, Spain
- Jose M Sanchez-Aparicio
- Laboratory for New Technologies in Neurorehabilitation, Fundación Instituto San Jose, Pinar San Jose s/n, 28003 Madrid, Spain
- Luis Donis-Barber
- Laboratory for New Technologies in Neurorehabilitation, Fundación Instituto San Jose, Pinar San Jose s/n, 28003 Madrid, Spain
- Eduardo Fernandez
- Biomedical Neuroengineering Research Group (nBio), Miguel Hernández University, Avda. de la Universidad s/n, 03202 Elche, Spain
- Nicolas Garcia-Aracil
- Biomedical Neuroengineering Research Group (nBio), Miguel Hernández University, Avda. de la Universidad s/n, 03202 Elche, Spain
- Laboratory for New Technologies in Neurorehabilitation, Fundación Instituto San Jose, Pinar San Jose s/n, 28003 Madrid, Spain
18
A Modular Mobile Robotic Platform to Assist People with Different Degrees of Disability. Applied Sciences (Basel) 2021. DOI: 10.3390/app11157130.
Abstract
Robots to support elderly people in living independently and to assist disabled people in carrying out the activities of daily living independently have demonstrated good results. Basically, there are two approaches: one is based on mobile robot assistants, such as Care-O-bot, PR2, and Tiago, among others; the other is the use of an external robotic arm or a robotic exoskeleton fixed or mounted on a wheelchair. In this paper, a modular mobile robotic platform to assist moderately and severely impaired people, based on an upper limb robotic exoskeleton mounted on a robotized wheelchair, is presented. This mobile robotic platform can be customized for each user’s needs by exploiting its modularity. Finally, experimental results are presented from a simulated home environment with a living room and a kitchen area, designed to reproduce the interaction of the user with different elements of a home. In this experiment, a subject suffering from multiple sclerosis performed different activities of daily living (ADLs) using the platform in front of a group of clinicians composed of nurses, doctors, and occupational therapists. After that, the subject and the clinicians replied to a usability questionnaire. The results were quite good, but two key factors emerged that need to be improved: the complexity and the cumbersome aspect of the platform.
19
Soekadar SR, Kohl SH, Mihara M, von Lühmann A. Optical brain imaging and its application to neurofeedback. Neuroimage Clin 2021; 30:102577. PMID: 33545580; PMCID: PMC7868728; DOI: 10.1016/j.nicl.2021.102577.
Abstract
Besides passive recording of brain electric or magnetic activity, non-ionizing electromagnetic or optical radiation can also be used for real-time brain imaging. Here, changes in the radiation's absorption or scattering allow for continuous in vivo assessment of regional neurometabolic and neurovascular activity. Besides magnetic resonance imaging (MRI), functional near-infrared spectroscopy (fNIRS) has in recent years also been successfully established in real-time metabolic brain imaging. In contrast to MRI, fNIRS is portable and can be applied at the bedside or in everyday life environments, e.g., to restore communication and movement. Here we provide a comprehensive overview of the history and state-of-the-art of real-time optical brain imaging with a special emphasis on its clinical use towards neurofeedback and brain-computer interface (BCI) applications. Besides pointing to the most critical challenges in clinical use, novel approaches that combine real-time optical neuroimaging with other recording modalities (e.g., electro- or magnetoencephalography) are also described, and their use in the context of neuroergonomics, neuroenhancement, or neuroadaptive systems is discussed.
Affiliation(s)
- Surjo R Soekadar
- Clinical Neurotechnology Laboratory, Dept. of Psychiatry and Psychotherapy, Neuroscience Research Center, Campus Charité Mitte (CCM), Charité - University Medicine of Berlin, Berlin, Germany.
- Simon H Kohl
- JARA-Institute Molecular Neuroscience and Neuroimaging (INM-11), Jülich Research Centre, Jülich, Germany; Child Neuropsychology Section, Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, Medical Faculty, RWTH Aachen University, Germany
- Masahito Mihara
- Department of Neurology, Kawasaki Medical School, Kurashiki-City, Okayama, Japan
- Alexander von Lühmann
- Machine Learning Department, Computer Science, Technische Universität Berlin, Berlin, Germany; Neurophotonics Center, Biomedical Engineering, Boston University, Boston, USA
20
Paek AY, Brantley JA, Sujatha Ravindran A, Nathan K, He Y, Eguren D, Cruz-Garza JG, Nakagome S, Wickramasuriya DS, Chang J, Rashed-Al-Mahfuz M, Amin MR, Bhagat NA, Contreras-Vidal JL. A Roadmap Towards Standards for Neurally Controlled End Effectors. IEEE Open Journal of Engineering in Medicine and Biology 2021; 2:84-90. PMID: 35402986; PMCID: PMC8979628; DOI: 10.1109/ojemb.2021.3059161.
Abstract
The control and manipulation of various types of end effectors, such as powered exoskeletons, prostheses, and ‘neural’ cursors, by brain-machine interface (BMI) systems has been the target of many research projects. A seamless “plug and play” interface between any BMI and end effector is desired, wherein similar user intents cause similar end effectors to behave identically. This report is based on the outcomes of an IEEE Standards Association Industry Connections working group on End Effectors for Brain-Machine Interfacing that convened to identify and address gaps in the existing standards for BMI-based solutions, with a focus on the end-effector component. A roadmap towards standardization of end effectors for BMI systems is discussed by identifying current device standards that are applicable to end effectors. While current standards address basic electrical and mechanical safety and, to some extent, performance requirements, several gaps exist pertaining to unified terminologies, data communication protocols, patient safety, and risk mitigation.
Affiliation(s)
- Justin A Brantley
- University of Houston, Houston, TX 77204, USA
- Department of Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, USA
- Jesus G Cruz-Garza
- University of Houston, Houston, TX 77204, USA
- Department of Design and Environmental Analysis, Cornell University, Ithaca, NY 14853, USA
- Md Rashed-Al-Mahfuz
- University of Houston, Houston, TX 77204, USA
- Department of Computer Science and Engineering, University of Rajshahi, Rajshahi 6205, Bangladesh
- Nikunj A Bhagat
- University of Houston, Houston, TX 77204, USA
- Feinstein Institutes for Medical Research, Manhasset, NY 11030, USA
21
Belkhiria C, Peysakhovich V. Electro-Encephalography and Electro-Oculography in Aeronautics: A Review Over the Last Decade (2010-2020). Frontiers in Neuroergonomics 2020; 1:606719. PMID: 38234309; PMCID: PMC10790927; DOI: 10.3389/fnrgo.2020.606719.
Abstract
Electro-encephalography (EEG) and electro-oculography (EOG) are methods of electrophysiological monitoring that have potentially fruitful applications in neuroscience, clinical exploration, the aeronautical industry, and other sectors. These methods are often the most straightforward way of evaluating brain oscillations and eye movements, as they use standard laboratory or mobile techniques. This review describes the potential of EEG and EOG systems and the application of these methods in aeronautics. For example, EEG and EOG signals can be used to design brain-computer interfaces (BCI) and to interpret brain activity, such as monitoring the mental state of a pilot in determining their workload. The main objectives of this review are to (i) offer an in-depth review of the literature on the basics of EEG and EOG and their application in aeronautics; (ii) explore the methodology and trends of research in combined EEG-EOG studies over the last decade; and (iii) provide methodological guidelines for beginners and experts when applying these methods in environments outside the laboratory, with a particular focus on human factors and aeronautics. The study used databases from the scientific, clinical, and neural engineering fields. The review first introduces the characteristics and the application of both EEG and EOG in aeronautics, undertaking a large review of relevant literature, from early to more recent studies. We then built a novel taxonomy model that includes 150 combined EEG-EOG papers published in peer-reviewed scientific journals and conferences from January 2010 to March 2020. Several data elements were reviewed for each study (e.g., pre-processing, extracted features, and performance metrics), which were then examined to uncover trends in aeronautics and summarize interesting methods from this important body of literature. Finally, the review considers the advantages and limitations of these methods as well as future challenges.
22
Nann M, Peekhaus N, Angerhöfer C, Soekadar SR. Feasibility and Safety of Bilateral Hybrid EEG/EOG Brain/Neural-Machine Interaction. Front Hum Neurosci 2020; 14:580105. PMID: 33362490; PMCID: PMC7756108; DOI: 10.3389/fnhum.2020.580105.
Abstract
Cervical spinal cord injuries (SCIs) often lead to loss of motor function in both hands and legs, limiting autonomy and quality of life. While it was shown that unilateral hand function can be restored after SCI using a hybrid electroencephalography/electrooculography (EEG/EOG) brain/neural hand exoskeleton (B/NHE), it remained unclear whether such a hybrid paradigm could also be used for operating two hand exoskeletons, e.g., in the context of bimanual tasks such as eating with fork and knife. To test whether EEG/EOG signals allow for fluent and reliable as well as safe and user-friendly bilateral B/NHE control, eight healthy participants (six females, mean age 24.1 ± 3.2 years) as well as four chronic tetraplegics (four males, mean age 51.8 ± 15.2 years) performed a complex sequence of EEG-controlled bilateral grasping and EOG-controlled releasing motions of two exoskeletons visually presented on a screen. A novel EOG command performed by prolonged horizontal eye movements (>1 s) to the left or right was introduced as a reliable switch to activate either the left or right exoskeleton. Fluent EEG control was defined as an average “time to initialize” (TTI) for grasping motions below 3 s. Reliable EEG control was assumed when classification accuracy exceeded 80%. Safety was defined as a “time to stop” (TTS) for all unintended grasping motions within 2 s. After the experiment, tetraplegics were asked to rate the user-friendliness of bilateral B/NHE control using Likert scales. Average TTI and accuracy of EEG-controlled operations were 2.14 ± 0.66 s and 85.89 ± 15.81% across healthy participants, and 1.90 ± 0.97 s and 81.25 ± 16.99% across tetraplegics. Except for one tetraplegic, all participants met the safety requirements. With 88 ± 11% of the maximum achievable score, tetraplegics rated the control paradigm as user-friendly and reliable. These results suggest that hybrid EEG/EOG B/NHE control of two assistive devices is feasible and safe, paving the way to test this paradigm in larger clinical trials performing bimanual tasks in everyday life environments.
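The laterality switch above relies on detecting a horizontal eye movement sustained for more than 1 s. The detector below is a minimal threshold-and-duration sketch of that idea; the amplitude threshold, sampling rate, and sign convention are assumptions, not the study's calibration.

```python
# Sketch of a prolonged-horizontal-eye-movement switch: the horizontal EOG channel
# must stay beyond an amplitude threshold for at least `min_duration_s` to select
# the left or right device. Threshold and sign convention are illustrative assumptions.
import numpy as np

def detect_switch(heog, fs, amp_threshold=100.0, min_duration_s=1.0):
    """Return 'left', 'right', or None from a horizontal EOG trace (microvolts)."""
    min_samples = int(min_duration_s * fs)
    for sign, label in ((1, "right"), (-1, "left")):
        above = sign * np.asarray(heog) > amp_threshold
        run = 0
        for flag in above:
            run = run + 1 if flag else 0
            if run >= min_samples:
                return label
    return None

# Synthetic trace: 1.2 s sustained rightward deflection embedded in baseline noise.
fs, rng = 200, np.random.default_rng(0)
trace = rng.normal(scale=10, size=3 * fs)
trace[fs:fs + int(1.2 * fs)] += 150.0
print(detect_switch(trace, fs))   # -> 'right'
```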
Affiliation(s)
- Marius Nann
- Clinical Neurotechnology Lab, Charité - University Medicine Berlin, Berlin, Germany
- Applied Neurotechnology Lab, University Hospital Tübingen, Tübingen, Germany
- Niels Peekhaus
- Clinical Neurotechnology Lab, Charité - University Medicine Berlin, Berlin, Germany
- Applied Neurotechnology Lab, University Hospital Tübingen, Tübingen, Germany
- Cornelius Angerhöfer
- Clinical Neurotechnology Lab, Charité - University Medicine Berlin, Berlin, Germany
- Applied Neurotechnology Lab, University Hospital Tübingen, Tübingen, Germany
- Surjo R Soekadar
- Clinical Neurotechnology Lab, Charité - University Medicine Berlin, Berlin, Germany
- Applied Neurotechnology Lab, University Hospital Tübingen, Tübingen, Germany
23
Intra-cortical brain-machine interfaces for controlling upper-limb powered muscle and robotic systems in spinal cord injury. Clin Neurol Neurosurg 2020; 196:106069. DOI: 10.1016/j.clineuro.2020.106069.
24
Ditz JC, Schwarz A, Müller-Putz GR. Perturbation-evoked potentials can be classified from single-trial EEG. J Neural Eng 2020; 17:036008. PMID: 32299075; DOI: 10.1088/1741-2552/ab89fb.
Abstract
OBJECTIVE Loss of balance control can have serious consequences for the interaction between humans and machines as well as for the general well-being of humans. Perceived balance perturbations are always accompanied by a specific cortical activation, the so-called perturbation-evoked potential (PEP). In this study, we investigate the possibility of classifying PEPs from ongoing EEG. APPROACH Fifteen healthy subjects were exposed to seated whole-body perturbations. Each participant performed 120 trials, being rapidly tilted 60 times to the right and 60 times to the left. MAIN RESULTS We achieved classification accuracies of more than 85% between PEPs and rest EEG using a window-based classification approach. Different window lengths and electrode layouts were compared. We were able to achieve excellent classification performance (87.6 ± 8.0% accuracy) by using a short window length of 200 ms and a minimal electrode layout consisting of only the Cz electrode. The peak classification accuracy coincides in time with the strongest component of PEPs, called N1. SIGNIFICANCE We showed that PEPs can be discriminated from ongoing EEG with high accuracy. These findings can contribute to the development of a system that can detect balance perturbations online.
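The window-based approach above slides a short window over single-trial EEG and discriminates perturbation epochs from rest. The sketch below illustrates such a pipeline with a 200 ms window from a single channel and a shrinkage-regularized LDA; the data are synthetic and the classifier choice is an assumption, not necessarily the one used in the study.

```python
# Sketch of window-based single-channel classification of perturbation-evoked
# epochs vs. rest EEG: 200 ms windows from one channel fed to a shrinkage LDA.
# The window length and single-channel layout follow the abstract; the rest is assumed.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs, win_s = 250, 0.2
n_win = int(fs * win_s)                       # 200 ms -> 50 samples at 250 Hz
rng = np.random.default_rng(0)

def synthetic_trials(n_trials, evoked):
    """One-channel windows; perturbation trials get an N1-like negative deflection."""
    x = rng.normal(scale=1.0, size=(n_trials, n_win))
    if evoked:
        x -= 1.5 * np.hanning(n_win)          # stand-in for the N1 component
    return x

X = np.vstack([synthetic_trials(60, evoked=True), synthetic_trials(60, evoked=False)])
y = np.array([1] * 60 + [0] * 60)

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print(cross_val_score(clf, X, y, cv=5).mean())   # chance level would be ~0.5
```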
Affiliation(s)
- Jonas C Ditz
- Institute of Neural Engineering, Graz University of Technology, Graz, Austria
- Methods in Medical Informatics, Department of Computer Science, University of Tübingen, Tübingen, Germany
25
Hosni SM, Shedeed HA, Mabrouk MS, Tolba MF. EEG-EOG based Virtual Keyboard: Toward Hybrid Brain Computer Interface. Neuroinformatics 2020; 17:323-341. PMID: 30368637; DOI: 10.1007/s12021-018-9402-0.
Abstract
The past twenty years have ignited a new spark in electroencephalography (EEG) research, pursued to develop innovative brain-computer interfaces (BCIs) that help severely disabled people live a better life with a high degree of independence. Current BCIs are more theoretical than practical and suffer from numerous challenges. New research trends propose combining EEG with other simple and efficient bioelectric inputs, such as electro-oculography (EOG) resulting from eye movements, to produce more practical and robust hybrid brain-computer interface (hBCI) or brain/neuronal computer interface (BNCI) systems. Working towards this purpose, existing research in EOG-based human-computer interaction (HCI) applications must be organized and surveyed in order to develop a vision of the potential benefits of combining both input modalities and to give rise to new designs that maximize these benefits. Our aim is to support and inspire the design of new hBCI systems based on both EEG and EOG signals. To do so, we first survey current EOG-based HCI systems, with a particular focus on EOG-based systems for communication using a virtual keyboard. Then, we survey current EEG-EOG virtual keyboards, highlighting the design protocols employed. We conclude with a discussion of the potential advantages of combining both systems, with recommendations that give insight into future design issues for all EEG-EOG hBCI systems. Finally, a general architecture is proposed for a new EEG-EOG hBCI system. The proposed hybrid system completely alters the traditional view of the eye-movement features present in the EEG signal as artifacts that should be removed; instead, EOG traces are extracted from the EEG in the proposed hybrid architecture and are considered as an additional input modality, sharing control according to the chosen design protocol.
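The architecture proposed above treats eye-movement activity in the EEG as an additional input rather than an artifact. As a rough illustration of that idea only, the sketch below derives surrogate vertical and horizontal EOG traces from frontal EEG channels via low-pass-filtered bipolar combinations; the channel pairs and cut-off frequency are assumptions, not the authors' design.

```python
# Illustrative extraction of EOG-like traces from frontal EEG channels:
# low-frequency bipolar combinations approximate vertical/horizontal eye movement.
# Channel pairs and the 10 Hz cut-off are assumptions for this sketch.
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(x, fs, cutoff=10.0):
    b, a = butter(4, cutoff, btype="lowpass", fs=fs)
    return filtfilt(b, a, x)

def eog_from_eeg(eeg, channel_index, fs):
    """eeg: array (n_channels, n_samples); channel_index: dict name -> row."""
    veog = lowpass(eeg[channel_index["Fp1"]] + eeg[channel_index["Fp2"]], fs) / 2.0
    heog = lowpass(eeg[channel_index["F8"]] - eeg[channel_index["F7"]], fs)
    return veog, heog

fs = 250
idx = {"Fp1": 0, "Fp2": 1, "F7": 2, "F8": 3}
eeg = np.random.default_rng(0).normal(size=(4, 5 * fs))
veog, heog = eog_from_eeg(eeg, idx, fs)
print(veog.shape, heog.shape)
```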
Affiliation(s)
- Sarah M Hosni
- Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
- Howida A Shedeed
- Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
- Mai S Mabrouk
- Biomedical Engineering Department, Misr University for Science and Technology, Giza, Egypt
- Mohamed F Tolba
- Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
26
Badesa FJ, Diez JA, Catalan JM, Trigili E, Cordella F, Nann M, Crea S, Soekadar SR, Zollo L, Vitiello N, Garcia-Aracil N. Physiological Responses During Hybrid BNCI Control of an Upper-Limb Exoskeleton. SENSORS (BASEL, SWITZERLAND) 2019; 19:E4931. [PMID: 31726745 PMCID: PMC6891352 DOI: 10.3390/s19224931] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/30/2019] [Revised: 10/30/2019] [Accepted: 11/05/2019] [Indexed: 11/20/2022]
Abstract
When combined with assistive robotic devices, such as wearable robotics, brain/neural-computer interfaces (BNCI) have the potential to restore the ability of handicapped people to carry out activities of daily living. To improve the applicability of such systems, workload and stress should be kept to a minimum. Here, we investigated users' physiological reactions during extended use of both input modalities of a hybrid control interface. Eleven BNCI-naive healthy volunteers participated in the experiments. All participants sat in a comfortable chair in front of a desk and wore a whole-arm exoskeleton as well as wearable devices for monitoring physiological, electroencephalographic (EEG) and electrooculographic (EOG) signals. The experimental protocol consisted of three phases: (i) set-up, calibration and BNCI training; (ii) familiarization; and (iii) the experimental phase, during which each subject performed EEG and EOG tasks. After completing each task, participants filled in the NASA-TLX questionnaire and the Self-Assessment Manikin (SAM). We found significant differences (p < 0.05) in heart rate variability (HRV) and skin conductance level (SCL) between the two biosignal modalities (EEG, EOG) of the BNCI. This indicates that EEG control is associated with higher stress (reflected in decreased HRV) and higher mental workload (reflected in increased SCL) compared with EOG control. In addition, HRV and SCL modulations correlated with the subjects' workload perception and emotional responses assessed through the NASA-TLX questionnaire and SAM.
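For readers unfamiliar with this kind of within-subject analysis, the snippet below sketches a paired, non-parametric comparison of HRV and SCL between the two control modalities. All numbers are synthetic placeholders, not study data.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
n_subjects = 11

# Hypothetical per-subject means: lower HRV and higher SCL during EEG control.
hrv_eeg = rng.normal(35.0, 5.0, n_subjects)        # RMSSD-like index in ms
hrv_eog = hrv_eeg + rng.normal(6.0, 3.0, n_subjects)
scl_eeg = rng.normal(8.0, 1.0, n_subjects)         # skin conductance in microsiemens
scl_eog = scl_eeg - rng.normal(1.0, 0.5, n_subjects)

for name, a, b in [("HRV", hrv_eeg, hrv_eog), ("SCL", scl_eeg, scl_eog)]:
    stat, p = wilcoxon(a, b)                        # paired signed-rank test
    print(f"{name}: EEG vs EOG, W = {stat:.1f}, p = {p:.3f}")
```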
Affiliation(s)
- Francisco J. Badesa
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, 03202 Alicante, Spain
- Universidad de Cádiz, Av. de la Universidad n10, 11519 Puerto Real, Spain
- New technologies for Neurorehabilitation Lab., Av. de la Hospitalidad, s/n, 28054 Madrid, Spain
- Jorge A. Diez
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, 03202 Alicante, Spain
- New technologies for Neurorehabilitation Lab., Av. de la Hospitalidad, s/n, 28054 Madrid, Spain
- Jose Maria Catalan
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, 03202 Alicante, Spain
- New technologies for Neurorehabilitation Lab., Av. de la Hospitalidad, s/n, 28054 Madrid, Spain
- Emilio Trigili
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Viale Rinaldo Piaggio 34, 56025 Pontedera, Pisa, Italy
- Francesca Cordella
- Unit of Advanced Robotics and Human-centred Technologies, Campus Bio-Medico University of Rome, 00128 Rome, Italy
- Marius Nann
- Applied Neurotechnology Laboratory, Department of Psychiatry and Psychotherapy, University Hospital of Tübingen, Calwerstr. 14, 72076 Tübingen, Germany
- Simona Crea
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Viale Rinaldo Piaggio 34, 56025 Pontedera, Pisa, Italy
- IRCCS Fondazione Don Carlo Gnocchi, Via Alfonso Capecelatro 66, 20148 Milan, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, 56025 Pontedera, Pisa, Italy
- Surjo R. Soekadar
- Clinical Neurotechnology Laboratory, Department of Psychiatry and Psychotherapy (CCM), Charité-Universitätsmedizin Berlin, Charitéplatz 1, 10117 Berlin, Germany
- Loredana Zollo
- Unit of Advanced Robotics and Human-centred Technologies, Campus Bio-Medico University of Rome, 00128 Rome, Italy
- Nicola Vitiello
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Viale Rinaldo Piaggio 34, 56025 Pontedera, Pisa, Italy
- IRCCS Fondazione Don Carlo Gnocchi, Via Alfonso Capecelatro 66, 20148 Milan, Italy
- Department of Excellence in Robotics & AI, Scuola Superiore Sant’Anna, 56025 Pontedera, Pisa, Italy
- Nicolas Garcia-Aracil
- Miguel Hernández University of Elche, Av. Universidad w/n, Ed. Innova, 03202 Alicante, Spain
- New technologies for Neurorehabilitation Lab., Av. de la Hospitalidad, s/n, 28054 Madrid, Spain
27
Schwarz A, Pereira J, Kobler R, Muller-Putz GR. Unimanual and Bimanual Reach-and-Grasp Actions Can Be Decoded From Human EEG. IEEE Trans Biomed Eng 2019; 67:1684-1695. [PMID: 31545707 DOI: 10.1109/tbme.2019.2942974] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
While most tasks of daily life can be handled through a small number of different grasps, many tasks require the action of both hands. In these bimanual tasks, the second hand has either a supporting role (e.g., fixating a jar) or a more active role (e.g., grasping a pot by both handles). In this study, we attempt to discriminate the neural correlates of unimanual (performed with the left or right hand) from bimanual reach-and-grasp actions using the low-frequency time-domain electroencephalogram (EEG). In a self-initiated movement task, 15 healthy participants were asked to perform unimanual (palmar and lateral grasps with the left and right hand) and bimanual (double lateral, mixed palmar/lateral) reach-and-grasps on objects of daily life. Using EEG time-domain features in the frequency range of 0.3-3 Hz, we achieved multiclass classification accuracies of 38.6 ± 6.6% (7 classes, 17.1% chance level) for a combination of 6 movements and 1 rest condition. The grand-average confusion matrix shows the highest true positive rate (TPR) for the rest condition (63%), while TPRs for the movement classes ranged from 33% to 41%. The underlying movement-related cortical potentials (MRCPs) show significant differences between unimanual conditions (e.g., left-hand vs. right-hand grasps) as well as between unimanual and bimanual conditions, both of which can be attributed to lateralization effects. We believe that these findings can be exploited and further used in attempts to provide persons with spinal cord injury a form of natural control of bimanual neuroprostheses.
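A minimal sketch of this style of decoding, low-frequency time-domain EEG features fed to a regularized linear classifier, is given below. The data are synthetic and the filter and classifier choices are common defaults for MRCP-style decoding, not necessarily those used in the study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 256                                   # sampling rate in Hz (assumed)
N_CH, N_CLASSES, TRIALS = 16, 7, 30        # channels, classes (6 movements + rest), trials/class
rng = np.random.default_rng(3)

sos = butter(4, [0.3, 3.0], btype="bandpass", fs=FS, output="sos")

def synth_trial(label):
    """Synthetic 2-s trial: class-specific slow component plus noise, band-pass filtered."""
    t = np.arange(2 * FS) / FS
    slow = 0.5 * np.sin(2 * np.pi * 1.0 * t + label)      # class-dependent phase
    eeg = rng.normal(0.0, 1.0, (N_CH, t.size)) + slow
    return sosfiltfilt(sos, eeg, axis=1)

X, y = [], []
for label in range(N_CLASSES):
    for _ in range(TRIALS):
        X.append(synth_trial(label)[:, ::16].ravel())      # downsample to 16 Hz, flatten
        y.append(label)
X, y = np.array(X), np.array(y)

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")   # shrinkage-regularized LDA
print("7-class accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```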
28
Bockbrader M. Upper limb sensorimotor restoration through brain–computer interface technology in tetraparesis. CURRENT OPINION IN BIOMEDICAL ENGINEERING 2019. [DOI: 10.1016/j.cobme.2019.09.002] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/27/2022]
29
Wang X, Song Q, Zhou S, Tang J, Chen K, Cao H. Multi-connection load compensation and load information calculation for an upper-limb exoskeleton based on a six-axis force/torque sensor. INT J ADV ROBOT SYST 2019. [DOI: 10.1177/1729881419863186] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
In this article, a method of multi-connection load compensation and load information calculation for an upper-limb exoskeleton is proposed, based on a six-axis force/torque sensor installed between the exoskeleton and the end effector. The proposed load compensation method uses the mounted sensor to measure the force and torque between the exoskeleton and loads with different connections, and adds a compensator to the controller to cancel the load-induced component of the human–robot interaction force, so that the interaction force is used only to operate the exoskeleton. As a result, the operator can manipulate the exoskeleton with the same interaction force when lifting loads of different weights with a passive or fixed connection, and the human–robot interaction force is minimized. Moreover, the proposed load information calculation method can accurately calculate the weight of the load and the position of its center of gravity relative to the exoskeleton end effector, which is necessary for estimating the center of gravity of the upper-limb exoskeleton and for the stability control of a whole-body exoskeleton. To verify the effectiveness of the proposed method, we performed load-handling and operational-stability experiments; the results showed that the method achieved the expected function.
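The load-information part of the abstract reduces to simple statics: the load mass follows from the magnitude of the measured force, and the center of gravity from the cross-product relation between force and torque. The sketch below assumes gravity is the only external load and combines two sensor orientations in a least-squares solve; the variable names and poses are illustrative, not the authors' implementation.

```python
import numpy as np

G = 9.81                                        # gravitational acceleration (m/s^2)

def skew(v):
    """Skew-symmetric matrix such that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_load(forces, torques):
    """Estimate load mass and CoG position (sensor frame) from static F/T readings
    taken at different sensor orientations while holding the same load."""
    forces, torques = np.asarray(forces), np.asarray(torques)
    mass = np.mean(np.linalg.norm(forces, axis=1)) / G
    # tau_i = r x F_i = -skew(F_i) @ r  ->  stack all poses and solve for r.
    A = np.vstack([-skew(f) for f in forces])
    b = np.hstack(torques)
    r, *_ = np.linalg.lstsq(A, b, rcond=None)
    return mass, r

# Ground truth used to generate the readings: a 4 kg load whose CoG sits
# 0.10 m ahead of and 0.05 m below the sensor origin.
m_true, r_true = 4.0, np.array([0.10, 0.0, -0.05])
gravity_dirs = [np.array([0.0, 0.0, -1.0]), np.array([-1.0, 0.0, 0.0])]   # two sensor poses
F = [m_true * G * g for g in gravity_dirs]
T = [np.cross(r_true, f) for f in F]

mass, r = estimate_load(F, T)
print(f"estimated mass = {mass:.2f} kg, CoG = {np.round(r, 3)} m")
```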
Affiliation(s)
- Xin Wang
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Qiuzhi Song
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Shitong Zhou
- Beijing Research Institute of Precise Mechanical and Electronic Control Equipment, Beijing, China
- Jing Tang
- School of Information Engineering, Wuhan University of Technology, Wuhan, China
- Kezhong Chen
- China Ship Development and Design Center, Wuhan, China
- Heng Cao
- School of Automation Science and Electrical Engineering, Beihang University, Beijing, China
30
Kim HH, Jeong J. Decoding electroencephalographic signals for direction in brain-computer interface using echo state network and Gaussian readouts. Comput Biol Med 2019; 110:254-264. [PMID: 31233971 DOI: 10.1016/j.compbiomed.2019.05.024] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2019] [Revised: 05/31/2019] [Accepted: 05/31/2019] [Indexed: 10/26/2022]
Abstract
BACKGROUND Noninvasive brain-computer interfaces (BCIs) for movement control via electroencephalography (EEG) have been extensively investigated. However, most previous studies decoded user intention for movement directions based on sensorimotor rhythms during motor imagery. BCI systems that map imagined movements of body parts (e.g., the left or right hand) onto movement directions (e.g., leftward or rightward movement of a machine or cursor) are less intuitive and less convenient because of their complex training procedures. Thus, methods for directly decoding the user's intended movement direction are urgently needed. METHODS Here, we describe a novel method for directly decoding the user's intended movement direction using an echo state network and Gaussian readouts. Importantly, the network parameters were optimized using a genetic algorithm to achieve better decoding performance. We tested the decoding performance of this method with four healthy subjects and an inexpensive, 14-channel wireless EEG system, and then compared the outcome with that of a conventional machine learning method. RESULTS We showed that this decoding method successfully classified eight directions of intended movement (approximately 95% accuracy). CONCLUSIONS We suggest that an echo state network with Gaussian readouts can be a useful decoding method for directly reading the user's intended movement direction, even with an inexpensive and portable EEG system.
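The following sketch shows the general structure of an echo state network with a radial-basis ("Gaussian") readout stage for direction classification. The reservoir sizes, synthetic data, and ridge readout are illustrative assumptions; the genetic-algorithm parameter optimization described in the paper is omitted.

```python
import numpy as np
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
N_IN, N_RES, N_DIR = 14, 200, 8               # 14 EEG channels, reservoir size, 8 directions

W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.normal(0.0, 1.0, (N_RES, N_RES))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # keep the spectral radius below 1

def reservoir_state(u, leak=0.3):
    """Run one trial (time x channels) through the leaky reservoir; return the final state."""
    x = np.zeros(N_RES)
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u_t + W @ x)
    return x

def gaussian_features(states, centers, sigma=None):
    """Radial-basis ('Gaussian') readout features computed from reservoir states."""
    d2 = ((states[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    if sigma is None:
        sigma = np.sqrt(np.median(d2))         # median heuristic for the kernel width
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Synthetic trials: each direction adds a small channel-dependent bias to the EEG.
trials, labels = [], []
for d in range(N_DIR):
    bias = 0.3 * np.sin(2 * np.pi * d / N_DIR + np.arange(N_IN))
    for _ in range(20):
        trials.append(rng.normal(0.0, 1.0, (128, N_IN)) + bias)
        labels.append(d)

states = np.array([reservoir_state(u) for u in trials])
centers = states[rng.choice(len(states), size=30, replace=False)]
X = gaussian_features(states, centers)

print("8-direction accuracy:",
      cross_val_score(RidgeClassifier(alpha=1.0), X, np.array(labels), cv=5).mean())
```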
Affiliation(s)
- Hoon-Hee Kim
- Department of Bio and Brain Engineering, College of Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, 34141, Republic of Korea
- Jaeseung Jeong
- Department of Bio and Brain Engineering, College of Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, 34141, Republic of Korea
- Program of Brain and Cognitive Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, 34141, Republic of Korea
31
Park SJ, Park CH. Suit-type Wearable Robot Powered by Shape-memory-alloy-based Fabric Muscle. Sci Rep 2019; 9:9157. [PMID: 31235870 PMCID: PMC6591235 DOI: 10.1038/s41598-019-45722-x] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/27/2018] [Accepted: 06/13/2019] [Indexed: 11/09/2022] Open
Abstract
A suit-type wearable robot (STWR) is a new type of soft wearable robot (SWR) that can be worn easily anywhere and anytime to assist the wearer's muscular strength: it is worn like normal clothing and remains comfortable even when unpowered. This paper proposes an STWR that uses shape-memory-alloy-based fabric muscles (SFMs) as actuators. The STWR, which weighs less than 1 kg, has a simple structure with the following components: SFMs, wire encoders for measuring the contraction length of the SFMs, and BOA fasteners that fix the actuators to the forearms. In this study, a position controller for the SFM using the wire encoder was developed, and a prototype STWR was fabricated using this position controller. Moreover, the STWR was mounted on a mannequin and step-response experiments were performed in which the mannequin's arms lifted barbells weighing 2 kg and 4 kg to a target position. With the 2 kg barbell, a fast response was observed, reaching the target position in less than 1 s in all steps except the initial heating step. With the 4 kg barbell, the response of the SFM was noticeably slower, reaching the target position in approximately 3 s in all steps except the initial heating step. The SFM-based STWR could overcome the limitations of conventional robots in terms of weight and inconvenience, thereby demonstrating the application potential of STWRs.
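A position controller of the kind described, with wire-encoder feedback driving the heating power of the SMA fabric muscle, might look like the sketch below. The first-order plant model, gains, and limits are invented for illustration and do not reproduce the authors' controller.

```python
import numpy as np

DT = 0.01                                  # control period in seconds
TARGET = 0.04                              # desired contraction in metres

class SFMPlant:
    """Toy first-order model: contraction follows heating power with a slow time constant."""
    def __init__(self, tau=1.5, gain=0.06):
        self.x, self.tau, self.gain = 0.0, tau, gain
    def step(self, power):                 # power normalised to [0, 1]
        self.x += DT / self.tau * (self.gain * power - self.x)
        return self.x                      # what the wire encoder would report

def pi_control(error, integ, kp=40.0, ki=40.0):
    """PI law mapping contraction error to heating power; heating cannot be negative."""
    integ += error * DT
    power = float(np.clip(kp * error + ki * integ, 0.0, 1.0))
    return power, integ

plant, integ = SFMPlant(), 0.0
for _ in range(int(5.0 / DT)):             # 5-second step response
    power, integ = pi_control(TARGET - plant.x, integ)
    plant.step(power)

print(f"contraction after 5 s: {plant.x * 1000:.1f} mm (target {TARGET * 1000:.0f} mm)")
```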
Affiliation(s)
- Seong Jun Park
- Department of Robotics & Mechatronics, Korea Institute of Machinery & Materials, Daejeon, 34103, Korea
- Cheol Hoon Park
- Department of Robotics & Mechatronics, Korea Institute of Machinery & Materials, Daejeon, 34103, Korea
32
Trigili E, Grazi L, Crea S, Accogli A, Carpaneto J, Micera S, Vitiello N, Panarese A. Detection of movement onset using EMG signals for upper-limb exoskeletons in reaching tasks. J Neuroeng Rehabil 2019; 16:45. [PMID: 30922326 PMCID: PMC6440169 DOI: 10.1186/s12984-019-0512-1] [Citation(s) in RCA: 48] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/11/2018] [Accepted: 03/08/2019] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND To assist people with disabilities, exoskeletons must be provided with human-robot interfaces and smart algorithms capable of identifying the user's movement intentions. Surface electromyographic (sEMG) signals could be suitable for this purpose, but their applicability in shared control schemes for real-time operation of assistive devices in daily-life activities is limited by high inter-subject variability, which requires custom calibration and training. Here, we developed a machine-learning-based algorithm for detecting the user's motion intention from electromyographic signals, and discussed its applicability to controlling an upper-limb exoskeleton for people with severe arm disabilities. METHODS Ten healthy participants, sitting in front of a screen while wearing the exoskeleton, were asked to perform several reaching movements toward three LEDs presented in random order. EMG signals from seven upper-limb muscles were recorded. Data were analyzed offline and used to develop an algorithm that identifies the onset of movement for two different events: moving from a resting position toward the LED (Go-forward) and returning to the resting position (Go-backward). A set of subject-independent time-domain EMG features was selected according to information-theoretic criteria, and their probability distributions during rest and movement phases were modeled by a two-component Gaussian Mixture Model (GMM). Two types of onset detectors were tested: the first based on features extracted from single muscles, the second on features from multiple muscles. Their performance in terms of sensitivity, specificity and latency was assessed for the two events with a leave-one-subject-out test. RESULTS The onset of movement was detected with a maximum sensitivity of 89.3% for Go-forward and 60.9% for Go-backward events. The best specificities were 96.2% and 94.3%, respectively. For both events, the algorithm detected the onset before the actual movement, and its computational load was compatible with real-time applications. CONCLUSIONS The detection performance and low computational load make the proposed algorithm promising for the control of upper-limb exoskeletons in real-time applications. Its fast initial calibration also makes it suitable for helping people with severe arm disabilities to perform assisted functional tasks.
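The core of the method, modeling a time-domain EMG feature with a two-component GMM and declaring onset from the posterior probability of the "movement" component, can be sketched as follows. The synthetic signal, RMS feature, thresholds, and debouncing rule are illustrative assumptions rather than the authors' exact choices.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

FS = 1000
rng = np.random.default_rng(5)

# Synthetic rectified EMG: rest for 2 s, then movement for 1 s.
rest = np.abs(rng.normal(0.0, 0.05, 2 * FS))
move = np.abs(rng.normal(0.0, 0.30, 1 * FS))
emg = np.concatenate([rest, move])

# Feature: RMS over 100 ms sliding windows with a 10 ms hop.
win, hop = int(0.1 * FS), int(0.01 * FS)
starts = np.arange(0, emg.size - win, hop)
rms = np.array([np.sqrt(np.mean(emg[s:s + win] ** 2)) for s in starts])

# Fit a 2-component GMM on the (unlabelled) feature stream.
gmm = GaussianMixture(n_components=2, random_state=0).fit(rms.reshape(-1, 1))
move_comp = int(np.argmax(gmm.means_))            # component with the larger mean
post = gmm.predict_proba(rms.reshape(-1, 1))[:, move_comp]

# Onset: first window whose movement posterior exceeds 0.9
# for 5 consecutive windows (a simple debouncing rule).
above = post > 0.9
onset = next((i for i in range(len(above) - 5) if above[i:i + 5].all()), None)
if onset is not None:
    print(f"onset detected at {(starts[onset] + win) / FS:.2f} s (true onset at 2.00 s)")
```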
Affiliation(s)
- Emilio Trigili
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Lorenzo Grazi
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Simona Crea
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- IRCCS Fondazione Don Carlo Gnocchi, Milan, Italy
- Jacopo Carpaneto
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Silvestro Micera
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- Bertarelli Foundation Chair in Translational NeuroEngineering, Center for Neuroprosthetics and Institute of Bioengineering, School of Engineering, École Polytechnique Federale de Lausanne, Lausanne, Switzerland
- Nicola Vitiello
- The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
- IRCCS Fondazione Don Carlo Gnocchi, Milan, Italy
33
Fujiwara T, Ushiba J, Soekadar SR. Neurorehabilitation: Neural Plasticity and Functional Recovery 2018. Neural Plast 2019; 2019:7812148. [PMID: 30804993 PMCID: PMC6360543 DOI: 10.1155/2019/7812148] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2018] [Accepted: 12/06/2018] [Indexed: 11/18/2022] Open
Affiliation(s)
- Toshiyuki Fujiwara
- Department of Rehabilitation Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo, Tokyo 113-8421, Japan
- Junichi Ushiba
- Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa 223-8522, Japan
- Keio Institute of Pure and Applied Sciences (KiPAS), Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa 223-8522, Japan
- Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa 223-8522, Japan
- Surjo R. Soekadar
- Clinical Neurotechnology Laboratory, Department of Psychiatry and Psychotherapy, Neuroscience Research Center (NWFZ), Charité-University Medicine Berlin, Germany
- Department of Psychiatry and Psychotherapy, Eberhard-Karls-University Tübingen, Germany
34
Restoration of Finger and Arm Movements Using Hybrid Brain/Neural Assistive Technology in Everyday Life Environments. SPRINGERBRIEFS IN ELECTRICAL AND COMPUTER ENGINEERING 2019. [DOI: 10.1007/978-3-030-05668-1_5] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/14/2022]
35
Abstract
Summary: In this work, we present NEUROExos, a novel generation of upper-limb exoskeletons developed in recent years at The BioRobotics Institute of Scuola Superiore Sant’Anna (Italy). Specifically, we present our attempts to progressively (i) improve ergonomics and safety, (ii) reduce encumbrance and weight, and (iii) develop more intuitive human–robot cognitive interfaces. Our latest prototype, described here for the first time, extends the field of application to assistance in activities of daily living thanks to its compact and portable design. The experimental studies carried out on these devices are summarized, and a perspective on future developments is presented.
36
Wu Q, Wu H. Development, Dynamic Modeling, and Multi-Modal Control of a Therapeutic Exoskeleton for Upper Limb Rehabilitation Training. SENSORS 2018; 18:s18113611. [PMID: 30356005 PMCID: PMC6263634 DOI: 10.3390/s18113611] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/26/2018] [Revised: 10/12/2018] [Accepted: 10/17/2018] [Indexed: 11/16/2022]
Abstract
Robot-assisted training is a promising technology in clinical rehabilitation, providing effective treatment to patients with motor disabilities. In this paper, a multi-modal control strategy for a therapeutic upper-limb exoskeleton is proposed to assist disabled persons in performing patient-passive and patient-cooperative training. A comprehensive overview of the exoskeleton, which has seven actuated degrees of freedom, is given. The dynamic modeling and parameter identification strategies of the human-robot interaction system are analyzed. Moreover, an adaptive sliding mode controller with disturbance observer (ASMCDO) is developed to ensure position-control accuracy in patient-passive training. A cascade-proportional-integral-derivative (CPID)-based impedance controller with a graphical, game-like interface is designed to improve interaction compliance and motivate the active participation of patients in patient-cooperative training. Three typical experiments are conducted to verify the feasibility of the proposed control strategy: trajectory tracking, trajectory tracking with impedance adjustment, and intention-based training. The experimental results suggest that the tracking error of the ASMCDO controller is smaller than that of a terminal sliding mode controller. By optimally changing the impedance parameters of the CPID-based impedance controller, the training intensity can be adjusted to meet the requirements of different patients.
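The patient-cooperative mode described above combines an outer impedance (admittance) law with an inner position loop. The single-joint sketch below illustrates that structure with a plain PID inner loop; the plant model, gains, and impedance parameters are invented and do not correspond to the CPID controller in the paper.

```python
import numpy as np

DT = 0.002                                  # control period in seconds
M_D, B_D, K_D = 0.5, 4.0, 20.0              # desired inertia, damping, stiffness

def impedance_update(tau_int, e, de):
    """Admittance form M_D*e'' + B_D*e' + K_D*e = tau_int, integrated with Euler.
    e is the deviation from the nominal trajectory allowed by the interaction torque."""
    dde = (tau_int - B_D * de - K_D * e) / M_D
    de += dde * DT
    e += de * DT
    return e, de

def pid(err, state, kp=120.0, ki=40.0, kd=6.0):
    """Inner position loop (a plain PID stands in for the cascade controller)."""
    integ, prev = state
    integ += err * DT
    deriv = (err - prev) / DT
    return kp * err + ki * integ + kd * deriv, (integ, err)

# Single-joint simulation: nominal trajectory is a slow sine; the wearer pushes
# against the robot with 2 N·m between 1.0 s and 1.5 s.
I = 0.1                                     # toy joint inertia (kg·m²)
q = dq = e = de = 0.0
pid_state = (0.0, 0.0)
max_dev = 0.0

for k in range(int(3.0 / DT)):
    t = k * DT
    q_nom = 0.3 * np.sin(0.5 * np.pi * t)
    tau_int = 2.0 if 1.0 <= t < 1.5 else 0.0
    e, de = impedance_update(tau_int, e, de)
    q_ref = q_nom + e                        # compliant reference for the inner loop
    u, pid_state = pid(q_ref - q, pid_state)
    dq += (u + tau_int) / I * DT             # toy rigid-body dynamics
    q += dq * DT
    max_dev = max(max_dev, e)

print(f"max compliant deviation during the push: {max_dev:.3f} rad "
      f"(steady-state value tau/K_D = {2.0 / K_D:.2f} rad)")
```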
Affiliation(s)
- Qingcong Wu
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Hongtao Wu
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- State Key Laboratory of Robotics and System, Harbin Institute of Technology (HIT), Harbin 150001, China