1
Park J, Lee Y, Cho S, Choe A, Yeom J, Ro YG, Kim J, Kang DH, Lee S, Ko H. Soft Sensors and Actuators for Wearable Human-Machine Interfaces. Chem Rev 2024; 124:1464-1534. [PMID: 38314694] [DOI: 10.1021/acs.chemrev.3c00356]
Abstract
Haptic human-machine interfaces (HHMIs) combine tactile sensation and haptic feedback to allow humans to interact closely with machines and robots, providing immersive experiences and convenient lifestyles. Significant progress has been made in developing wearable sensors that accurately detect physical and electrophysiological stimuli with improved softness, functionality, reliability, and selectivity. In addition, soft actuating systems have been developed to provide high-quality haptic feedback by precisely controlling force, displacement, frequency, and spatial resolution. In this Review, we discuss the latest technological advances in soft sensors and actuators for wearable HHMIs, focusing in particular on material and structural approaches that enable the sensing and feedback properties an effective wearable HHMI requires. Furthermore, promising practical applications of current HHMI technology in areas such as the metaverse, robotics, and user-interactive devices are discussed in detail. Finally, the Review concludes with an outlook on next-generation HHMI technology.
Affiliation(s)
- Jonghwa Park, Youngoh Lee, Seungse Cho, Ayoung Choe, Jeonghee Yeom, Yun Goo Ro, Jinyoung Kim, Dong-Hee Kang, Seungjae Lee, Hyunhyub Ko
- School of Energy and Chemical Engineering, Ulsan National Institute of Science and Technology (UNIST), Ulsan Metropolitan City 44919, Republic of Korea
2
Tanwear A, Liang X, Paz E, Bohnert T, Ghannam R, Ferreira R, Heidari H. Spintronic Eyeblink Gesture Sensor With Wearable Interface System. IEEE Trans Biomed Circuits Syst 2022; 16:779-792. [PMID: 35830413] [DOI: 10.1109/tbcas.2022.3190689]
Abstract
This work presents an eyeblink system that detects magnets placed on the eyelid via integrated magnetic sensors and an analogue circuit mounted on an eyewear frame (without lenses). The eyelid magnets were detected using tunnelling magnetoresistance (TMR) bridge sensors with a sensitivity of 14 mV/V/Oe, positioned centre-right and centre-left of the eyewear frame. Each eye side has a single TMR sensor wired to its own circuit, where the signal was band-pass filtered (attenuating components below 0.5 Hz and above 30 Hz) and amplified to detect the weak magnetic field produced by the N42 neodymium magnets (3 mm diameter, 0.5 mm thickness) attached to a medical tape strip, for the adult-age demographic. Each command sequence consisted of a trigger (right eyeblink) followed by the appropriate command: a right, left or both-eye blink. The eyeblink gesture system showed good repeatability, enabling blink classification based on an analogue signal amplitude threshold. As a result, the signal can be scaled, classified and integrated with a Bluetooth module in real time, enabling end-users to connect to other Bluetooth-enabled devices for wireless assistive technologies. The eyeblink system was tested by 14 participants via a stimulus-based game. Within an average time of 185 s, the system demonstrated a group mean accuracy of 72% for 40 commands, and the maximum information transfer rate (ITR) across participants was 35.95 bits per minute.
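As an illustrative aside, the band-pass stage described in this abstract (passing roughly 0.5-30 Hz) has a simple digital counterpart. The sketch below is not the paper's analogue circuit; the 250 Hz sampling rate and the synthetic blink-like signal are assumptions for demonstration:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs, low=0.5, high=30.0, order=4):
    """Zero-phase Butterworth band-pass keeping the 0.5-30 Hz band."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)  # filter forward and backward (no phase lag)

fs = 250                     # assumed sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
# synthetic eyeblink pulse + baseline drift + 50 Hz interference
raw = np.exp(-((t - 1.0) ** 2) / 0.01) + 0.5 * t + 0.2 * np.sin(2 * np.pi * 50 * t)
clean = bandpass(raw, fs)    # drift (<0.5 Hz) and mains-range noise (>30 Hz) attenuated
```

Blink events could then be detected with a simple amplitude threshold on `clean`, mirroring the threshold-based classification the authors describe.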
3
Ramakrishnan J, Mavaluru D, Sakthivel RS, Alqahtani AS, Mubarakali A, Retnadhas M. Brain–computer interface for amyotrophic lateral sclerosis patients using deep learning network. Neural Comput Appl 2022. [DOI: 10.1007/s00521-020-05026-y]
4
Liu K, Yu Y, Liu Y, Tang J, Liang X, Chu X, Zhou Z. A novel brain-controlled wheelchair combined with computer vision and augmented reality. Biomed Eng Online 2022; 21:50. [PMID: 35883092] [PMCID: PMC9327337] [DOI: 10.1186/s12938-022-01020-8]
Abstract
BACKGROUND Brain-controlled wheelchairs (BCWs) are important applications of brain-computer interfaces (BCIs). Currently, most BCWs are semiautomatic, and this interaction strategy is slow when users want to reach a target of interest in their immediate environment. METHODS To this end, we combined computer vision (CV) and augmented reality (AR) with a BCW and proposed the CVAR-BCW, a BCW with a novel automatic interaction strategy. The CVAR-BCW uses a translucent head-mounted display (HMD) as the user interface, uses CV to automatically detect environments, and shows the detected targets through AR technology. Once a user has chosen a target, the CVAR-BCW can automatically navigate to it. Because the semiautomatic strategy remains useful in some scenarios, we also integrated a semiautomatic interaction framework into the CVAR-BCW, and the user can switch between the two strategies. RESULTS We recruited 20 non-disabled subjects for this study and used the accuracy, information transfer rate (ITR), and average time required for the CVAR-BCW to reach each designated target as performance metrics. The experimental results showed that our CVAR-BCW performed well in indoor environments: the average accuracies across all subjects were 83.6% (automatic) and 84.1% (semiautomatic), the average ITRs were 8.2 bits/min (automatic) and 8.3 bits/min (semiautomatic), the average times required to reach a target were 42.4 s (automatic) and 93.4 s (semiautomatic), and the average workloads and degrees of fatigue for the two strategies were both approximately 20. CONCLUSIONS Our CVAR-BCW provides a user-centric interaction approach and a good framework for integrating more advanced artificial intelligence technologies, which may be useful in the field of disability assistance.
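Several entries in this list report the information transfer rate (ITR) in bits/min. For readers unfamiliar with the metric, the widely used Wolpaw formula can be sketched as follows (the example numbers are illustrative, not taken from the paper):

```python
import math

def wolpaw_itr(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate in bits/min.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    p, n = accuracy, n_targets
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    elif p == 0:
        bits = 0.0  # a never-correct selector carries no usable rate here
    return bits * 60.0 / seconds_per_selection

# e.g. 4 targets, 85% accuracy, one selection every 5 s
print(round(wolpaw_itr(4, 0.85, 5.0), 2))  # → 13.83
```

Note that the formula assumes all targets are equally likely and errors are uniformly distributed, so it is an idealization of the rates the papers report.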
Affiliation(s)
- Kaixuan Liu, Yang Yu, Yadong Liu, Jingsheng Tang, Xinbin Liang, Xingxing Chu, Zongtan Zhou
- College of Intelligence Science and Technology, National University of Defense Technology, Changsha 410073, Hunan, China
5
Belkhiria C, Boudir A, Hurter C, Peysakhovich V. EOG-Based Human-Computer Interface: 2000-2020 Review. Sensors (Basel) 2022; 22:4914. [PMID: 35808414] [PMCID: PMC9269776] [DOI: 10.3390/s22134914]
Abstract
Electro-oculography (EOG)-based brain-computer interfaces (BCIs) are a relevant technology influencing physical medicine, daily life, gaming and even aeronautics. EOG-based BCI systems record activity related to users' intention, perception and motor decisions, convert these bio-physiological signals into commands for external hardware, and execute the operation expected by the user through the output device. The EOG signal is used to identify and classify eye movements through active or passive interaction; both types of interaction can control the output device and support the user's communication with the environment. In the aeronautical field, EOG-BCI systems are being explored as a tool to replace manual commands and as a communication channel that accelerates the expression of the user's intention. This paper reviews the last two decades of EOG-based BCI studies and provides a structured design space with a large set of representative papers. Our purpose is to introduce the existing BCI systems based on EOG signals and to inspire the design of new ones. First, we highlight the basic components of EOG-based BCI studies, including EOG signal acquisition, device particularities, extracted features, translation algorithms, and interaction commands. Second, we provide an overview of EOG-based BCI applications in real and virtual environments, along with aeronautical applications. We conclude with a discussion of the current limits of EOG devices in existing systems and provide suggestions to guide future design inquiries.
Affiliation(s)
- Chama Belkhiria
- ISAE-SUPAERO, Université de Toulouse, 31400 Toulouse, France
- Atlal Boudir
- ENAC, Université de Toulouse, 31400 Toulouse, France
- Christophe Hurter
- ENAC, Université de Toulouse, 31400 Toulouse, France
6
Current State of Robotics in Hand Rehabilitation after Stroke: A Systematic Review. Appl Sci (Basel) 2022. [DOI: 10.3390/app12094540]
Abstract
Among the methods of hand-function rehabilitation after stroke, robot-assisted rehabilitation is widely used: hand rehabilitation robots can provide functional training of the hand or assist a paralyzed hand with activities of daily living. However, patients with hand disorders consistently report that some user needs are not being met. The purpose of this review is to understand why these needs are not adequately addressed, to survey recent research on hand rehabilitation robots, and to summarize future trends in the hope of aiding researchers in this area. We first provide a comprehensive review of research institutions, commercial products, and the literature, identifying the state of the art and the deficiencies of functional hand rehabilitation robots to guide the development of subsequent devices. The review focuses specifically on actuation and control, as user needs primarily concern actuation and control strategies. We also review hand detection technologies and compare them with patient needs. The results show that recent trends favor new lightweight materials to improve hand adaptability, intelligent control methods for human-robot interaction to improve control robustness and accuracy, and virtual reality (VR) task positioning to improve the effectiveness of active rehabilitation training.
7
Cai X, Pan J. Toward a Brain-Computer Interface- and Internet of Things-Based Smart Ward Collaborative System Using Hybrid Signals. J Healthc Eng 2022; 2022:6894392. [PMID: 35480157] [PMCID: PMC9038386] [DOI: 10.1155/2022/6894392]
Abstract
This study proposes a brain-computer interface (BCI)- and Internet of Things (IoT)-based smart ward collaborative system using hybrid signals. The system comprises a hybrid asynchronous BCI control system, based on electroencephalography (EEG), electrooculography (EOG) and gyro signals, and an IoT monitoring and management system. The hybrid BCI control system introduces a GUI paradigm with cursor movement: the user steers cursor area selection with the gyro and triggers cursor clicks with blink-related EOG, while attention-related EEG signals are classified with a support-vector machine (SVM) to make the final judgment. Combining the judgment of the cursor area with the judgment of the attention state reduces the false-operation rate of the hybrid BCI system. The accuracy of the hybrid BCI control system was 96.65 ± 1.44%, and the false-operation rate and command response time were 0.89 ± 0.42 events/min and 2.65 ± 0.48 s, respectively. These results show the application potential of the hybrid BCI control system in daily tasks. In addition, we develop an architecture to connect intelligent things in a smart ward based on narrowband Internet of Things (NB-IoT) technology. The results demonstrate that our system provides superior communication transmission quality.
Affiliation(s)
- Xugang Cai
- School of Software, South China Normal University, Guangzhou 510631, China
- Jiahui Pan
- School of Software, South China Normal University, Guangzhou 510631, China
- Pazhou Lab, Guangzhou 510330, China
8
Chen X, Yu Y, Tang J, Zhou L, Liu K, Liu Z, Chen S, Wang J, Zeng LL, Liu J, Hu D. Clinical validation of BCI-controlled wheelchairs in subjects with severe spinal cord injury. IEEE Trans Neural Syst Rehabil Eng 2022; 30:579-589. [PMID: 35259107] [DOI: 10.1109/tnsre.2022.3156661]
Abstract
Brain-controlled wheelchairs are one of the most promising applications that can help people gain mobility after their normal interaction pathways have been compromised by neuromuscular diseases. The feasibility of using brain signals to control wheelchairs has been well demonstrated by healthy people in previous studies. However, most potential users of brain-controlled wheelchairs are people suffering from severe physical disabilities or who are in a "locked-in" state. To further validate the clinical practicability of our previously proposed P300-based brain-controlled wheelchair, in this study, 10 subjects with severe spinal cord injuries participated in three experiments and completed ten predefined tasks in each experiment. The average accuracy and information transfer rate (ITR) were 94.8% and 4.2 bits/min, respectively. Moreover, we evaluated the physiological and cognitive burdens experienced by these individuals before and after the experiments. There were no significant changes in vital signs during the experiment, indicating minimal physiological and cognitive burden. The patients' average systolic blood pressure before and after the experiment was 113±13.7 mmHg and 114±11.9 mmHg, respectively (P=0.122). The patients' average heart rates before and after the experiment were 79±8.4/min and 79±8.2/min, respectively (P=0.147). The average task load, measured by the National Aeronautics and Space Administration task load index, ranged from 10.0 to 25.5. The results suggest that the proposed P300-based brain-controlled wheelchair is safe and reliable; additionally, it does not significantly increase the patient's physical and mental task burden, demonstrating its potential value in clinical applications. Our study promotes the development of a more practical brain-controlled wheelchair system.
9
Esposito D, Centracchio J, Andreozzi E, Gargiulo GD, Naik GR, Bifulco P. Biosignal-Based Human-Machine Interfaces for Assistance and Rehabilitation: A Survey. Sensors (Basel) 2021; 21:6863. [PMID: 34696076] [PMCID: PMC8540117] [DOI: 10.3390/s21206863]
Abstract
By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs that take such biosignals as inputs to control various applications. This survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, outlining the state of the art and identifying emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were used to classify the biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both. A moderate increase can be observed in studies on robotic control, prosthetic control, and gesture recognition in the last decade, whereas studies on the other targets grew only slightly. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for each specific application.
Affiliation(s)
- Daniele Esposito
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Jessica Centracchio
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Emilio Andreozzi
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Gaetano D. Gargiulo
- School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia
- The MARCS Institute, Western Sydney University, Penrith, NSW 2751, Australia
- Ganesh R. Naik
- School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia
- The Adelaide Institute for Sleep Health, Flinders University, Bedford Park, SA 5042, Australia
- Paolo Bifulco
- Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
10
Pérez-Reynoso FD, Rodríguez-Guerrero L, Salgado-Ramírez JC, Ortega-Palacios R. Human-Machine Interface: Multiclass Classification by Machine Learning on 1D EOG Signals for the Control of an Omnidirectional Robot. Sensors (Basel) 2021; 21:5882. [PMID: 34502773] [PMCID: PMC8434373] [DOI: 10.3390/s21175882]
Abstract
People with severe disabilities require assistance to perform routine activities; a Human-Machine Interface (HMI) allows them to activate devices that respond to their needs. In this work, an HMI based on electrooculography (EOG) is presented: the instrumentation is mounted on portable glasses that acquire both horizontal and vertical EOG signals. Each recorded eye movement is assigned a class and categorized with the one-hot encoding technique, which is used to test the precision and sensitivity of different machine learning classification algorithms at identifying new data from the eye registration; the algorithm also discriminates blinks so that they do not disturb the acquisition of eyeball position commands. The classifier is implemented to control a three-wheeled omnidirectional robot and thereby validate the response of the interface. This work proposes real-time signal classification and customization of the interface, minimizing the user's learning curve. Preliminary results showed that it is possible to generate trajectories to control an omnidirectional robot, with a view to a future assistance system that controls position through gaze orientation.
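The one-hot encoding step mentioned in this abstract is straightforward to illustrate. The class names below are hypothetical stand-ins for the paper's eye-movement classes:

```python
import numpy as np

# hypothetical eye-movement classes; the paper's exact label set may differ
CLASSES = ["center", "up", "down", "left", "right", "blink"]

def one_hot(label, classes=CLASSES):
    """Encode a class label as a one-hot vector (1 at the class index, 0 elsewhere)."""
    vec = np.zeros(len(classes), dtype=int)
    vec[classes.index(label)] = 1
    return vec

print(one_hot("left"))  # → [0 0 0 1 0 0]
```

Encoded this way, each class is equidistant from every other, which avoids imposing an artificial ordering on the eye-movement labels during classifier training.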
Affiliation(s)
- Liliam Rodríguez-Guerrero
- Research Center on Technology of Information and Systems (CITIS), Electric and Control Academic Group, Universidad Autónoma del Estado de Hidalgo (UAEH), Pachuca de Soto 42039, Mexico
- Rocío Ortega-Palacios
- Biomedical Engineering, Universidad Politécnica de Pachuca (UPP), Zempoala 43830, Mexico
11
Zhou H, Li D, He X, Hui X, Guo H, Hu C, Mu X, Wang ZL. Bionic Ultra-Sensitive Self-Powered Electromechanical Sensor for Muscle-Triggered Communication Application. Adv Sci (Weinh) 2021; 8:e2101020. [PMID: 34081406] [PMCID: PMC8336610] [DOI: 10.1002/advs.202101020]
Abstract
The past few decades have witnessed tremendous progress in human-machine interfaces (HMIs) for communication, education, and manufacturing. However, owing to the limitations of signal acquisition devices, research on HMIs for communication aids for the disabled has progressed slowly. Here, inspired by the croaking behavior of frogs, a bionic triboelectric nanogenerator (TENG)-based ultra-sensitive self-powered electromechanical sensor is developed for muscle-triggered communication HMI applications. The sensor possesses a high sensitivity (54.6 mV/mm), a high-intensity signal (±700 mV), and a wide sensing range (0-5 mm); the signal intensity is 206 times higher than that of traditional biopotential electromyography methods. By leveraging machine learning algorithms and Morse code, safe, accurate (96.3%), and stable communication-aid HMI applications are achieved. This bionic TENG-based electromechanical sensor provides a valuable toolkit for HMI applications for the disabled and brings new insights into the interdisciplinary integration of TENG technology and bionics.
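The Morse-code layer of such a muscle-triggered communication pipeline is easy to sketch. The decoder below assumes dot/dash symbols have already been extracted from the sensor signal; the 0.3 s dash threshold is an assumption for illustration, not a value from the paper:

```python
# minimal Morse table (letters only); extend with digits as needed
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode(morse_message):
    """Decode space-separated Morse symbols; ' / ' separates words."""
    words = morse_message.split(" / ")
    return " ".join(
        "".join(MORSE.get(symbol, "?") for symbol in word.split())
        for word in words
    )

def pulses_to_morse(pulse_widths, dash_threshold=0.3):
    """Map muscle-pulse durations (s) to dots/dashes; the threshold is an assumption."""
    return "".join("-" if w >= dash_threshold else "." for w in pulse_widths)

print(decode("... --- ..."))             # → SOS
print(pulses_to_morse([0.1, 0.5, 0.1]))  # → .-.
```

In a real system the pulse widths would come from thresholded TENG sensor events, and the symbol/letter gaps would likewise be inferred from inter-pulse timing.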
Affiliation(s)
- Hong Zhou
- Key Laboratory of Optoelectronic Technology & Systems, Ministry of Education, and International R & D center of Micro-nano Systems and New Materials Technology, Chongqing University, Chongqing 400044, P. R. China
- Dongxiao Li
- Key Laboratory of Optoelectronic Technology & Systems, Ministry of Education, and International R & D center of Micro-nano Systems and New Materials Technology, Chongqing University, Chongqing 400044, P. R. China
- Xianming He
- Key Laboratory of Optoelectronic Technology & Systems, Ministry of Education, and International R & D center of Micro-nano Systems and New Materials Technology, Chongqing University, Chongqing 400044, P. R. China
- Xindan Hui
- Key Laboratory of Optoelectronic Technology & Systems, Ministry of Education, and International R & D center of Micro-nano Systems and New Materials Technology, Chongqing University, Chongqing 400044, P. R. China
- Hengyu Guo
- Department of Applied Physics, Chongqing University, Chongqing 400044, P. R. China
- Chenguo Hu
- Department of Applied Physics, Chongqing University, Chongqing 400044, P. R. China
- Xiaojing Mu
- Key Laboratory of Optoelectronic Technology & Systems, Ministry of Education, and International R & D center of Micro-nano Systems and New Materials Technology, Chongqing University, Chongqing 400044, P. R. China
- Zhong Lin Wang
- Beijing Institute of Nanoenergy and Nanosystems, Chinese Academy of Sciences, Beijing 100083, P. R. China
- School of Material Science and Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0245, USA
12
A Bibliometric Analysis of Human-Machine Interaction Methodology for Electric-Powered Wheelchairs Driving from 1998 to 2020. Int J Environ Res Public Health 2021; 18:7567. [PMID: 34300017] [PMCID: PMC8304937] [DOI: 10.3390/ijerph18147567]
Abstract
Electric power wheelchairs (EPWs) enhance the mobility of the elderly and the disabled, while the human-machine interaction (HMI) determines how precisely the human intention is delivered and how efficiently human-machine cooperation is conducted. A bibliometric quantitative analysis of 1154 publications related to this research field, published between 1998 and 2020, was conducted. We identified the development status, contributors, hot topics, and potential future research directions of this field. We believe that combining intelligence and humanization in EPW HMI systems based on human-machine collaboration is an emerging trend in EPW HMI methodology research. Particular attention should be paid to evaluating the applicability and benefits of an EPW HMI methodology for its users, as well as its contribution to society. This study offers researchers a comprehensive understanding of EPW HMI studies over the past 22 years, tracing the latest trends from their evolutionary footprints and offering forward-looking insights for future research.
13
Singh HP, Kumar P. Developments in the human machine interface technologies and their applications: a review. J Med Eng Technol 2021; 45:552-573. [PMID: 34184601] [DOI: 10.1080/03091902.2021.1936237]
Abstract
Human-machine interface (HMI) techniques use bioelectrical signals to achieve real-time synchronized communication between the human body and a machine. HMI technology not only provides real-time control access but can also control multiple functions at a single instant with modest human input and increased efficiency. HMI technologies enable advanced control in numerous applications such as health monitoring, medical diagnostics, prosthetic and assistive devices, the automotive and aerospace industries, robotic control and many other fields. In this paper, various physiological signals, their acquisition and processing techniques, and their respective applications in different HMI technologies are discussed.
Affiliation(s)
- Harpreet Pal Singh, Parlad Kumar
- Department of Mechanical Engineering, Punjabi University, Patiala, India
14
Wang X, Xiao Y, Deng F, Chen Y, Zhang H. Eye-Movement-Controlled Wheelchair Based on Flexible Hydrogel Biosensor and WT-SVM. Biosensors (Basel) 2021; 11:198. [PMID: 34208524] [PMCID: PMC8234407] [DOI: 10.3390/bios11060198]
Abstract
To assist patients with restricted mobility in controlling a wheelchair freely, this paper presents an eye-movement-controlled wheelchair prototype based on a flexible hydrogel biosensor and a Wavelet Transform-Support Vector Machine (WT-SVM) algorithm. Considering the poor deformability and biocompatibility of rigid metal electrodes, we propose a flexible hydrogel biosensor made of conductive HPC/PVA (hydroxypropyl cellulose/polyvinyl alcohol) hydrogel on a flexible PDMS (polydimethylsiloxane) substrate. The biosensor is affixed to the wheelchair user's forehead to collect electrooculogram (EOG) and strain signals, which form the basis for recognizing eye movements. The low Young's modulus (286 kPa) and exceptional breathability (water vapor transmission rate of 18 g m-2 h-1) of the biosensor ensure conformal and unobtrusive adhesion to the epidermis. To improve the recognition accuracy of eye movements (straight, upward, downward, left, and right), the WT-SVM algorithm classifies EOG and strain signals according to different features (amplitude, duration, interval). The average recognition accuracy reaches 96.3%, so the wheelchair can be manipulated precisely.
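The general shape of a WT-SVM pipeline (wavelet sub-band features feeding a support-vector machine) can be sketched as below. This is a toy reconstruction on synthetic epochs, not the authors' implementation; the Haar transform, the energy features, and the signal model are all assumptions:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def haar_features(signal, levels=3):
    """Energies of Haar wavelet sub-bands (a simple stand-in for the WT step)."""
    feats, a = [], signal
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)  # approximation coefficients
        feats.append(np.sum(d ** 2))
    feats.append(np.sum(a ** 2))
    return np.array(feats)

def fake_epoch(cls, n=256):
    """Synthetic stand-in for a segmented EOG/strain epoch of a given class."""
    t = np.linspace(0, 1, n, endpoint=False)
    return (1 + 0.5 * cls) * np.sin(2 * np.pi * (2 + 3 * cls) * t) + 0.1 * rng.standard_normal(n)

# 5 eye-movement classes, 20 epochs each
X = np.array([haar_features(fake_epoch(c)) for c in range(5) for _ in range(20)])
y = np.repeat(np.arange(5), 20)

clf = SVC(kernel="linear").fit(X, y)  # SVM on wavelet-energy features
```

In a real system the epochs would come from the forehead biosensor, and features such as amplitude, duration, and interval (as the abstract lists) would be appended to the wavelet energies.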
Affiliation(s)
- Fangming Deng
- School of Electrical and Automation Engineering, East China Jiaotong University, Nanchang 330013, China; (X.W.); (Y.X.); (Y.C.); (H.Z.)
|
15
|
Udupa S, Kamat VR, Menassa CC. Shared autonomy in assistive mobile robots: a review. Disabil Rehabil Assist Technol 2021:1-22. [PMID: 34133906 DOI: 10.1080/17483107.2021.1928778] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Abstract
PURPOSE Shared autonomy has played a major role in assistive mobile robotics as it has the potential to effectively balance user satisfaction and smooth functioning of systems by adapting itself to each user's needs and preferences. Many shared control paradigms have been developed over the years. However, despite these advancements, shared control paradigms have not been widely adopted as there are several integral aspects that have not fully matured. The purpose of this paper is to discuss and review various aspects of shared control and the technologies leading up to the current advancements in shared control for assistive mobile robots. METHODS A comprehensive review of the literature was conducted following a dichotomy of studies from the pre-2000 and the post-2000 periods to focus on both the early developments and the current state of the art in this domain. RESULTS A systematic review of 135 research papers and 7 review papers selected from the literature was conducted. To facilitate the organization of the reviewed work, a 6-level ladder categorization was developed based on the extent of autonomy shared between the human and the robot in the use of assistive mobile robots. This taxonomy highlights the chronological improvements in this domain. CONCLUSION It was found that most prior studies have focussed on basic functionalities, thus paving the way for research to now focus on the higher levels of the ladder taxonomy. 
It was concluded that further research in the domain must focus on ensuring safety in mobility and adaptability to varying environments. Implications for rehabilitation: Shared autonomy in assistive mobile robots plays a vital role in adapting effectively to ensure safety while also considering user comfort. Users' immediate desires should be considered in decision making to ensure that users remain in control of the assistive robots. Current research should focus on the successful adaptation of assistive mobile robots to varying environments to assure the safety of the user.
Affiliation(s)
- Sumukha Udupa
- Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA; Robotics Institute, University of Michigan, Ann Arbor, MI, USA
- Vineet R Kamat
- Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA; Robotics Institute, University of Michigan, Ann Arbor, MI, USA
- Carol C Menassa
- Department of Civil and Environmental Engineering, University of Michigan, Ann Arbor, MI, USA; Robotics Institute, University of Michigan, Ann Arbor, MI, USA
|
16
|
EEG-Based Eye Movement Recognition Using Brain-Computer Interface and Random Forests. SENSORS 2021; 21:s21072339. [PMID: 33801663 PMCID: PMC8036672 DOI: 10.3390/s21072339] [Citation(s) in RCA: 17] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/19/2021] [Revised: 03/22/2021] [Accepted: 03/25/2021] [Indexed: 11/24/2022]
Abstract
Discrimination of eye movements and visual states is a flourishing field of research, and there is an urgent need for non-manual, EEG-based wheelchair control and navigation systems. This paper presents a novel system that utilizes a brain–computer interface (BCI) to capture electroencephalographic (EEG) signals from human subjects during eye movements and subsequently classify them into six categories by applying a random forests (RF) classification algorithm. RF is an ensemble learning method that constructs a series of decision trees in which each tree gives a class prediction, and the class with the highest number of predictions becomes the model's prediction. The categories of the proposed random forests brain–computer interface (RF-BCI) are defined according to the position of the subject's eyes: open, closed, left, right, up, and down. The purpose of the RF-BCI is to serve as an EEG-based control system for driving an electromechanical wheelchair (rehabilitation device). The proposed approach has been tested using a dataset containing 219 records taken from 10 different patients. The BCI used the EPOC Flex head cap system, which includes 32 saline felt sensors for capturing the subjects' EEG signals. Each sensor captured four different brain waves (delta, theta, alpha, and beta) per second. These signals were split into 4-second windows, resulting in 512 samples per record, and the band energy was extracted for each EEG rhythm. The proposed system was compared with naïve Bayes, Bayes network, k-nearest neighbors (K-NN), multilayer perceptron (MLP), support vector machine (SVM), J48-C4.5 decision tree, and bagging classification algorithms. The experimental results showed that the RF algorithm outperformed the other approaches, achieving a high level of accuracy (85.39%) for a 6-class classification. This method exploits the high spatial information acquired from the Emotiv EPOC Flex wearable EEG recording device and successfully demonstrates the potential of this device for BCI wheelchair technology.
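The band-energy features described above (four rhythms over 4-second, 512-sample windows, i.e. 128 Hz) can be reproduced with a plain FFT. The band edges below are conventional values and an assumption, since the abstract does not list them:

```python
import numpy as np

# conventional EEG band edges in Hz (assumed, not taken from the paper)
BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0),
         "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_energies(window, fs=128):
    """Spectral energy per EEG rhythm for one channel window."""
    freqs = np.fft.rfftfreq(window.size, 1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

fs = 128
t = np.arange(4 * fs) / fs             # 4-second window -> 512 samples
window = np.sin(2 * np.pi * 10.0 * t)  # pure 10 Hz tone, i.e. alpha band
energies = band_energies(window, fs)
```

In the paper, per-rhythm energies of this kind form the feature vector passed to the random forests classifier.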
|
17
|
Kaur A. Wheelchair control for disabled patients using EMG/EOG based human machine interface: a review. J Med Eng Technol 2020; 45:61-74. [PMID: 33302770 DOI: 10.1080/03091902.2020.1853838] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/11/2023]
Abstract
The human-machine interface (HMI) and bio-signals have been used to control rehabilitation equipment and improve the lives of people with severe disabilities. This paper presents a review of electromyogram (EMG) and electrooculogram (EOG) signal-based control systems for driving wheelchairs for the disabled. For a paralysed person, EOG is one of the most useful signals, helping them communicate successfully with the environment by using eye movements. In the case of amputation, selecting muscles according to the distribution of power and frequency contributes strongly to specific wheelchair motions. Taking into account the day-to-day activities of persons with disabilities, both technologies are being used to design EMG- or EOG-based wheelchairs. This review examines a total of 70 EMG studies and 25 EOG studies published from 2000 to 2019. In addition, it covers current technologies used in wheelchair systems for signal capture, filtering, characterisation, and classification, including control commands such as left and right turns, forward and reverse motion, acceleration, deceleration, and wheelchair stop.
Affiliation(s)
- Amanpreet Kaur
- Department of Electronics and Communication Engineering, Thapar Institute of Engineering and Technology, Patiala, India
|
18
|
Tanwear A, Liang X, Liu Y, Vuckovic A, Ghannam R, Bohnert T, Paz E, Freitas PP, Ferreira R, Heidari H. Spintronic Sensors Based on Magnetic Tunnel Junctions for Wireless Eye Movement Gesture Control. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2020; 14:1299-1310. [PMID: 32991289 DOI: 10.1109/tbcas.2020.3027242] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
Tracking eye gestures with wearable technologies can undoubtedly improve the quality of life of people with mobility and physical impairments; here this is achieved with spintronic sensors based on the tunnel magnetoresistance (TMR) effect in a human-machine interface. Our design integrates three TMR sensors on an eyeglass frame to detect relative movement between the sensors and tiny magnets embedded in an in-house fabricated contact lens. Using TMR sensors with a sensitivity of 11 mV/V/Oe and ten <1 mm3 magnets embedded within a lens, an eye gesture system was implemented with a sampling frequency of up to 28 Hz. Three discrete eye movements were successfully classified when a participant looked up, right or left, using a threshold-based classifier. Moreover, our proof-of-concept real-time interaction system was tested on 13 participants, who played a simplified Tetris game using their eye movements. All participants successfully completed the game, with an average accuracy of 90.8%.
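A threshold-based classifier of the kind this abstract describes can be sketched as a simple decision rule over the sensor readings. The sensor layout (one normalised signal per gaze direction) and the 0.5 threshold are illustrative assumptions, not taken from the paper:

```python
def classify_gaze(top, left, right, thresh=0.5):
    """Map three normalised TMR sensor readings to a gaze gesture.
    Sensor placement and the 0.5 threshold are assumptions."""
    if top > thresh:
        return "up"
    if left > thresh:
        return "left"
    if right > thresh:
        return "right"
    return "neutral"

# one synthetic reading per gesture, plus a sub-threshold (neutral) one
readings = [(0.9, 0.1, 0.2), (0.1, 0.8, 0.1), (0.0, 0.1, 0.7), (0.1, 0.2, 0.1)]
gestures = [classify_gaze(*r) for r in readings]
```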
|
19
|
Martínez-Cerveró J, Ardali MK, Jaramillo-Gonzalez A, Wu S, Tonin A, Birbaumer N, Chaudhary U. Open Software/Hardware Platform for Human-Computer Interface Based on Electrooculography (EOG) Signal Classification. SENSORS 2020; 20:s20092443. [PMID: 32344820 PMCID: PMC7248971 DOI: 10.3390/s20092443] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/21/2020] [Revised: 04/21/2020] [Accepted: 04/23/2020] [Indexed: 12/28/2022]
Abstract
Electrooculography (EOG) signals have been widely used in human-computer interfaces (HCIs). The HCI systems proposed in the literature rely on self-designed or closed environments, which restricts the number of potential users and applications. Here, we present a system for classifying four directions of eye movement from EOG signals. The system is built on open-source ecosystems: the Raspberry Pi single-board computer, the OpenBCI biosignal acquisition device, and an open-source Python library. The designed system is cheap, compact, and easy to carry, and can be replicated or modified. We used the maximum, minimum, and median trial values as features to train a Support Vector Machine (SVM) classifier. A mean accuracy of 90% was obtained from 7 out of 10 subjects for online classification of Up, Down, Left, and Right movements. This classification system can be used as an input for an HCI, i.e., for assisted communication for paralyzed people.
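The maximum/minimum/median trial features are simple to compute. The sketch below pairs them with a dependency-free nearest-centroid classifier standing in for the paper's SVM, on synthetic "Up"/"Down" trials (both the toy data and the stand-in classifier are assumptions made to keep the sketch self-contained):

```python
import numpy as np

def trial_features(trial):
    """The three per-trial features used by the paper."""
    return np.array([trial.max(), trial.min(), np.median(trial)])

# synthetic trials: positive deflections stand for "Up", negative for "Down"
rng = np.random.default_rng(0)
ups = [trial_features(np.abs(rng.normal(1.0, 0.1, 100))) for _ in range(20)]
downs = [trial_features(-np.abs(rng.normal(1.0, 0.1, 100))) for _ in range(20)]
X = np.vstack(ups + downs)
y = np.array([0] * 20 + [1] * 20)

# nearest-centroid classification: assign each trial to the closest class mean
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[:, None, :] - centroids) ** 2).sum(axis=-1), axis=1)
accuracy = float((pred == y).mean())
```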
Affiliation(s)
- Jayro Martínez-Cerveró
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Silcherstraße 5, 72076 Tübingen, Germany
- Majid Khalili Ardali
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Silcherstraße 5, 72076 Tübingen, Germany
- Andres Jaramillo-Gonzalez
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Silcherstraße 5, 72076 Tübingen, Germany
- Shizhe Wu
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Silcherstraße 5, 72076 Tübingen, Germany
- Alessandro Tonin
- Wyss-Center for Bio- and Neuro-Engineering, Chemin des Mines 9, Ch 1202 Geneva, Switzerland
- Niels Birbaumer
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Silcherstraße 5, 72076 Tübingen, Germany
- Ujwal Chaudhary
- Institute of Medical Psychology and Behavioural Neurobiology, University of Tübingen, Silcherstraße 5, 72076 Tübingen, Germany
- Wyss-Center for Bio- and Neuro-Engineering, Chemin des Mines 9, Ch 1202 Geneva, Switzerland
- Correspondence:
|
20
|
Huang Q, Zhang Z, Yu T, He S, Li Y. An EEG-/EOG-Based Hybrid Brain-Computer Interface: Application on Controlling an Integrated Wheelchair Robotic Arm System. Front Neurosci 2019; 13:1243. [PMID: 31824245 PMCID: PMC6882933 DOI: 10.3389/fnins.2019.01243] [Citation(s) in RCA: 30] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/09/2019] [Accepted: 11/04/2019] [Indexed: 11/13/2022] Open
Abstract
Most existing brain-computer interfaces (BCIs) are designed to control a single assistive device, such as a wheelchair, a robotic arm or a prosthetic limb. However, many daily tasks require combined functions that can only be realized by integrating multiple robotic devices. Such integration raises the required control accuracy and makes reliable control more challenging to achieve than in the single-device case. In this study, we propose a novel hybrid BCI with high accuracy, based on electroencephalogram (EEG) and electrooculogram (EOG) signals, to control an integrated wheelchair robotic arm system. The user turns the wheelchair left/right by performing left/right hand motor imagery (MI) and generates other commands for the wheelchair and the robotic arm by performing eye blinks and eyebrow-raising movements. Twenty-two subjects participated in an MI training session, and five of them completed a mobile self-drinking experiment, which was designed purposely with high accuracy requirements. The results demonstrated that the proposed hybrid BCI could provide satisfactory control accuracy for a system consisting of multiple robotic devices and showed the potential of BCI-controlled systems in complex daily tasks.
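A hybrid BCI of this kind ultimately reduces to mapping detected EEG/EOG events to device commands for whichever device is active. The event names and command table below are illustrative assumptions, not the paper's actual protocol:

```python
# hypothetical (event, mode) -> command table for a wheelchair + robotic-arm system
COMMANDS = {
    ("wheelchair", "left_mi"):  "turn_left",
    ("wheelchair", "right_mi"): "turn_right",
    ("wheelchair", "blink"):    "start_stop",
    ("arm", "blink"):           "grasp",
    ("arm", "eyebrow_raise"):   "next_target",
}

def dispatch(mode, event):
    """Translate one detected event into a command for the active device;
    unknown events are ignored rather than acted on (a safety default)."""
    return COMMANDS.get((mode, event), "no_op")

log = [dispatch("wheelchair", "left_mi"),
       dispatch("arm", "eyebrow_raise"),
       dispatch("arm", "left_mi")]   # MI is not mapped in arm mode here
```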
Affiliation(s)
- Qiyun Huang
- Center for Brain Computer Interfaces and Brain Information Processing, South China University of Technology, Guangzhou, China
- Zhijun Zhang
- Center for Brain Computer Interfaces and Brain Information Processing, South China University of Technology, Guangzhou, China
- Tianyou Yu
- Center for Brain Computer Interfaces and Brain Information Processing, South China University of Technology, Guangzhou, China
- Shenghong He
- MRC Brain Network Dynamics Unit, Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
- Yuanqing Li
- Center for Brain Computer Interfaces and Brain Information Processing, South China University of Technology, Guangzhou, China
|
21
|
Qassim HM, Eesee AK, Osman OT, Jarjees MS. Controlling a motorized electric wheelchair based on face tilting. BIO-ALGORITHMS AND MED-SYSTEMS 2019. [DOI: 10.1515/bams-2019-0033] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Disability, specifically impairment of the upper and/or lower limbs, has a direct impact on patients' quality of life. Nowadays, motorized wheelchairs supported by mobility-aid techniques have been devised to improve the quality of life of these patients by increasing their independence. This study presents a platform to control a motorized wheelchair based on face tilting. A real-time face-tilt tracking system using a webcam and a microcontroller circuit has been designed and implemented to control the movement directions of the motorized wheelchair. Four commands were adequate to perform the required movements (forward, right, left, and stop). The platform showed excellent performance in controlling the motorized wheelchair using face tilting, and the position of the eyes was found to be the most useful facial feature for tracking face tilt.
Affiliation(s)
- Hassan M. Qassim
- Technical Engineering College, Northern Technical University, Mosul, Iraq
- Omar T. Osman
- Technical Engineering College, Northern Technical University, Mosul, Iraq
|
22
|
Bio-potentials for smart control applications. HEALTH AND TECHNOLOGY 2019. [DOI: 10.1007/s12553-019-00314-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
|
23
|
Graybill P, Kiani M. Eyelid Drive System: An Assistive Technology Employing Inductive Sensing of Eyelid Movement. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2019; 13:203-213. [PMID: 30475729 DOI: 10.1109/tbcas.2018.2882510] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
This paper presents the design, development, and validation of the eyelid drive system (EDS), an assistive technology comprising a specialized pair of glasses and millimeter-sized passive resonators, attached to the user's eyelids, that transduce eyelid movement (blinking and winking) through inductive sensing. The theory of operation and design optimization with simulations are presented. A proof-of-concept prototype EDS was constructed using a pair of nonprescription glasses and commercial-off-the-shelf components. In benchtop tests with model eyelids, the EDS demonstrated basic functionality. Initial trials were performed involving six human subjects interacting with custom designed graphical user interfaces on a computer. A group mean accuracy of 96.3% was achieved using a set of four different commands at a response rate of 3 s. A mean information transfer rate (ITR) of 56.1 b/min over all subjects was achieved with a set of six different commands at a response rate of 1.5 s. This proof-of-concept device consumes 51.6 mW of power. The EDS compares favorably with related eye-interfacing assistive technologies and provides a unique combination of advantages, including high accuracy and ITR, wearability, insensitivity to lighting and noise conditions, obviation of facial electrodes, and the use of nonexaggerated gestures.
|
24
|
Huang Q, Chen Y, Zhang Z, He S, Zhang R, Liu J, Zhang Y, Shao M, Li Y. An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries. J Neural Eng 2019; 16:026021. [PMID: 30620927 DOI: 10.1088/1741-2552/aafc88] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
OBJECTIVE In this study, we combine a wheelchair and an intelligent robotic arm based on electrooculogram (EOG) signals to help patients with spinal cord injuries (SCIs) accomplish a self-drinking task. The main challenge is to accurately control the wheelchair so that a randomly located object falls within the limited reachable space of the robotic arm (length: 0.8 m; width: 0.4 m; height: 0.6 m). This requires decimeter-level precision, which has not yet been demonstrated for either EOG-based or EEG-based systems. APPROACH A novel high-precision EOG-based human-machine interface (HMI) is proposed that can effectively translate two kinds of eye movements (i.e. blinking and eyebrow raising) into various commands. For the wheelchair, positional precision reaches decimeter level and the minimal steering angle is [Formula: see text]. For the intelligent robotic arm, shared control is implemented based on the EOG-based HMI, two cameras and the arm's own intelligence. MAIN RESULTS After brief training, five healthy subjects and five paralyzed patients with severe SCIs successfully completed three experiments. For the healthy subjects/patients with SCIs, the system achieved an average accuracy of 99.3%/97.3%, an average response time of 1.91 s/2.02 s per command and an average stop-response time of 1.30 s/1.36 s, with a zero false-operation rate. SIGNIFICANCE The EOG-based HMI provides sufficiently precise control to integrate a wheelchair and a robotic arm into a system that can help patients with SCIs accomplish a self-drinking task. (ChiCTR1800019764).
Affiliation(s)
- Qiyun Huang
- Center for Brain Computer Interfaces and Brain Information Processing, South China University of Technology, Guangzhou 510640, People's Republic of China
|
25
|
Fall CL, Quevillon F, Blouin M, Latour S, Campeau-Lecours A, Gosselin C, Gosselin B. A Multimodal Adaptive Wireless Control Interface for People With Upper-Body Disabilities. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS 2018; 12:564-575. [PMID: 29877820 DOI: 10.1109/tbcas.2018.2810256] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
This paper describes a multimodal body-machine interface (BoMI) to help individuals with upper-limb disabilities use advanced assistive technologies, such as robotic arms. The proposed system uses a wearable and wireless body sensor network (WBSN) supporting up to six sensor nodes to measure the natural upper-body gestures of the users and translate them into control commands. Natural gestures of the head and upper-body parts, as well as muscular activity, are measured with inertial measurement units (IMUs) and surface electromyography (sEMG) using custom-designed multimodal wireless sensor nodes. An IMU sensing node is attached to a headset worn by the user. It has a size of 2.9 cm × 2.9 cm, a maximum power consumption of 31 mW, and provides an angular precision of 1°. Multimodal patch sensor nodes, including both IMU and sEMG sensing modalities, are placed over the user's able body parts to measure motion and muscular activity. These nodes have a size of 2.5 cm × 4.0 cm and a maximum power consumption of 11 mW. The proposed BoMI runs on a Raspberry Pi. It can adapt to several types of users through different control scenarios using head and shoulder motion as well as muscular activity, and provides a power autonomy of up to 24 h. JACO, a 6-DoF assistive robotic arm, is used as a testbed to evaluate the performance of the proposed BoMI. Ten able-bodied subjects performed activities of daily living while operating the assistive device, using the Test d'Évaluation des Membres Supérieurs de Personnes Âgées to evaluate and compare the proposed BoMI with the conventional joystick controller. The users could perform all tasks with the proposed BoMI almost as fast as with the joystick controller, with only 30% time overhead on average, while the BoMI is potentially more accessible to upper-body-disabled users who cannot use a conventional joystick controller. Tests show that control performance with the proposed BoMI improved by up to 17% on average after three trials.
|
26
|
A Novel Feature Optimization for Wearable Human-Computer Interfaces Using Surface Electromyography Sensors. SENSORS 2018; 18:s18030869. [PMID: 29543737 PMCID: PMC5877383 DOI: 10.3390/s18030869] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/04/2018] [Revised: 03/11/2018] [Accepted: 03/13/2018] [Indexed: 12/01/2022]
Abstract
Novel human-computer interfaces (HCIs) using bioelectrical signals as input are valuable tools for improving the lives of people with disabilities. In this paper, surface electromyography (sEMG) signals induced by four classes of wrist movements were acquired from four sites on the lower arm with our designed system. Forty-two features were extracted from the time, frequency and time-frequency domains. Optimal channels were determined from the single-channel classification performance rank. Optimal features were selected according to a modified entropy criterion (EC) and a Fisher discrimination (FD) criterion. The feature selection results were evaluated by four different classifiers and compared with other conventional feature subsets. In online tests, the wearable system acquired real-time sEMG signals, and the selected features and trained classifier model were used to control a telecar through four different paradigms in a designed environment with simple obstacles. Performance was evaluated based on travel time (TT) and recognition rate (RR). The hardware evaluation results verified the feasibility of our acquisition system and ensured signal quality. Single-channel analysis indicated that the channel located on the extensor carpi ulnaris (ECU) performed best, with a mean classification accuracy of 97.45% over all movement pairs. Channels placed on the ECU and the extensor carpi radialis (ECR) were selected according to the accuracy rank. Experimental results showed that the proposed FD method outperformed other feature selection methods and single-type features. The combination of FD and random forest (RF) performed best in offline analysis, with a 96.77% multi-class RR. Online results illustrated that the state-machine paradigm with a 125 ms window had the highest maneuverability and was closest to real-life control. Subjects accomplished the online sessions with the three sEMG-based paradigms in average times of 46.02, 49.06 and 48.08 s, respectively. These experiments validate the feasibility of the proposed real-time wearable HCI system and algorithms, providing a potential assistive-device interface for persons with disabilities.
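The Fisher discrimination (FD) criterion used for feature ranking can be sketched with the standard per-feature Fisher score (between-class scatter of the class means over the within-class variance); the paper's exact variant may differ, so treat this as an assumption. The toy data pairs one informative feature with one pure-noise feature:

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature Fisher score: between-class scatter over
    within-class variance (standard form; the paper's variant is assumed)."""
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        between += Xc.shape[0] * (Xc.mean(axis=0) - overall_mean) ** 2
        within += Xc.shape[0] * Xc.var(axis=0)
    return between / within

# feature 0: mean shifts with the class; feature 1: ignores the class
rng = np.random.default_rng(1)
n = 50
informative = np.r_[rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n)]
noise = rng.normal(0.0, 1.0, 2 * n)
X = np.c_[informative, noise]
y = np.r_[np.zeros(n), np.ones(n)]
scores = fisher_scores(X, y)
ranking = np.argsort(scores)[::-1]   # feature indices, most discriminative first
```

Features at the top of `ranking` would then be kept for the classifier, as in the paper's optimal-feature selection step.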
|