1
Pierella C, D'Antuono C, Marchesi G, Menotti CE, Casadio M. A Computer Interface Controlled by Upper Limb Muscles: Effects of a Two Weeks Training on Younger and Older Adults. IEEE Trans Neural Syst Rehabil Eng 2023; 31:3744-3751. [PMID: 37676798] [DOI: 10.1109/tnsre.2023.3312981]
Abstract
As the population worldwide ages, there is a growing need for assistive technology and effective human-machine interfaces to address the wider range of motor disabilities that older adults may experience. Motor disabilities can make it difficult for individuals to perform basic daily tasks, such as getting dressed, preparing meals, or using a computer. The goal of this study was to investigate the effect of two weeks of training with a myoelectric computer interface (MCI) on motor functions in younger and older adults. Twenty people were recruited for the study: thirteen younger adults (range: 22-35 years old) and seven older adults (range: 61-78 years old). Participants completed six training sessions of about 2 hours each, during which the activity of the right and left biceps and trapezius muscles was mapped into a control signal for a computer cursor. Results highlighted significant improvements in cursor control, and therefore in muscle coordination, in both groups. With training, all participants became faster and more accurate, although the two age groups learned with different dynamics. Responses to a questionnaire on system usability and quality indicated general consensus on the system's ease of use and intuitiveness. These findings suggest that the proposed MCI training can be a powerful tool in the framework of assistive technologies for both younger and older adults. Further research is needed to determine the optimal duration and intensity of MCI training for different age groups and to investigate the long-term effects of training on physical and cognitive function.
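The abstract does not specify the control law, but a minimal sketch of a typical MCI mapping consistent with it (smoothed EMG envelopes from the four recorded muscles driving the two cursor axes) is given below; the muscle-to-axis assignment, gains, and dead-zone handling are illustrative assumptions, not values from the paper.

```python
import numpy as np

def emg_envelope(emg, fs=1000, win_ms=200):
    """Rectify raw EMG and smooth it with a moving-average window."""
    win = max(1, int(fs * win_ms / 1000))
    return np.convolve(np.abs(emg), np.ones(win) / win, mode="same")

def cursor_velocity(envelopes, gains, rest_levels):
    """Map four muscle envelopes (right/left biceps, right/left trapezius)
    to a 2D cursor velocity. Each muscle pulls the cursor along one axis;
    activity below the resting baseline is ignored (dead zone)."""
    act = np.maximum(envelopes - rest_levels, 0.0) * gains
    r_bic, l_bic, r_trap, l_trap = act
    return np.array([r_bic - l_bic,     # horizontal: biceps pair
                     r_trap - l_trap])  # vertical: trapezius pair

# Illustrative call with made-up envelope values.
env = np.array([0.40, 0.10, 0.20, 0.12])
print(cursor_velocity(env, gains=np.full(4, 2.0), rest_levels=np.full(4, 0.05)))
```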
2
Patwardhan S, Gladhill KA, Joiner WM, Schofield JS, Lee BS, Sikdar S. Using principles of motor control to analyze performance of human machine interfaces. Sci Rep 2023; 13:13273. [PMID: 37582852] [PMCID: PMC10427694] [DOI: 10.1038/s41598-023-40446-5]
Abstract
There have been significant advances in biosignal extraction techniques to drive external biomechatronic devices or to use as inputs to sophisticated human machine interfaces. The control signals are typically derived from biological signals such as myoelectric measurements made either from the surface of the skin or subcutaneously, and other biosignal sensing modalities are emerging. With improvements in sensing modalities and control algorithms, it is becoming possible to robustly control the target position of an end-effector. It remains largely unknown, however, to what extent these improvements can lead to naturalistic, human-like movement. In this paper, we sought to answer this question. We utilized a sensing paradigm called sonomyography, based on continuous ultrasound imaging of forearm muscles. Unlike myoelectric control strategies, which measure electrical activation and use the extracted signals to determine the velocity of an end-effector, sonomyography measures muscle deformation directly with ultrasound and uses the extracted signals to proportionally control the position of an end-effector. Previously, we showed that users were able to accurately and precisely perform a virtual target acquisition task using sonomyography. In this work, we investigate the time course of the control trajectories derived from sonomyography. We show that the trajectories users take to reach virtual targets reflect the kinematic characteristics observed in biological limbs. Specifically, during a target acquisition task, the velocity profiles followed the minimum-jerk trajectory described for point-to-point arm reaching movements, with similar times to target. In addition, the ultrasound-derived trajectories showed a systematic delay and scaling of peak movement velocity as movement distance increased. We believe this is the first evaluation of the similarities between control policies for coordinated movements of jointed limbs and those based on position control signals extracted at the level of individual muscles. These results have strong implications for the future development of control paradigms for assistive technologies.
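The minimum-jerk profile the authors compare against is the standard model for point-to-point reaches (Flash and Hogan, 1985). A short, self-contained sketch of that model (illustrative only, not the authors' analysis code) shows the bell-shaped velocity profile and the scaling of peak velocity with movement distance reported in the abstract:

```python
import numpy as np

def minimum_jerk(x0, xf, T, n=200):
    """Minimum-jerk trajectory: x(t) = x0 + (xf - x0)(10*tau^3 - 15*tau^4 + 6*tau^5),
    with tau = t/T; returns time, position, and velocity samples."""
    t = np.linspace(0.0, T, n)
    tau = t / T
    x = x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    v = (xf - x0) / T * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
    return t, x, v

# The velocity peaks at mid-movement with v_max = 1.875 * distance / duration,
# so longer movements have proportionally higher peak velocities.
t, x, v = minimum_jerk(x0=0.0, xf=0.3, T=1.0)
print(f"peak velocity: {v.max():.3f} (theory: {1.875 * 0.3 / 1.0:.3f})")
```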
Affiliation(s)
- Keri Anne Gladhill: Department of Psychology, George Mason University, Fairfax, VA 22030, USA
- Wilsaan M Joiner: Department of Neurobiology, Physiology and Behavior, University of California, Davis, Davis, CA 95616, USA
- Jonathon S Schofield: Mechanical and Aerospace Engineering Department, University of California, Davis, Davis, CA 95616, USA
- Ben Seiyon Lee: Department of Statistics, George Mason University, Fairfax, VA 22030, USA
- Siddhartha Sikdar: Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA; Center for Adaptive Systems of Brain-Body Interactions, Fairfax, VA 22030, USA
3
Patwardhan S, Gladhill KA, Joiner WM, Schofield JS, Sikdar S. Using Principles of Motor Control to Analyze Performance of Human Machine Interfaces. Research Square [preprint] 2023: rs.3.rs-2763325. [PMID: 37292730] [PMCID: PMC10246101] [DOI: 10.21203/rs.3.rs-2763325/v1]
4
Rojas M, Ponce P, Molina A. Development of a Sensing Platform Based on Hands-Free Interfaces for Controlling Electronic Devices. Front Hum Neurosci 2022; 16:867377. [PMID: 35754778] [PMCID: PMC9231433] [DOI: 10.3389/fnhum.2022.867377]
Abstract
Hands-free interfaces are essential for people with limited mobility to interact with biomedical or electronic devices. However, there are not enough sensing platforms that can quickly tailor an interface to users with disabilities. This article therefore proposes a sensing platform that patients with mobility impairments can use to manipulate electronic devices, thereby increasing their independence. A new sensing scheme is developed using three hands-free signals as inputs: voice commands, head movements, and eye gestures. These signals are obtained with non-invasive sensors: a microphone for speech commands, an accelerometer to detect inertial head movements, and infrared oculography to register eye gestures. The signals are processed and received as user commands by an output unit, which provides several communication ports for sending control signals to other devices. The interaction methods are intuitive and could extend the possibilities for people with disabilities to manipulate local or remote digital systems. As a case study, two volunteers with severe disabilities used the sensing platform to steer a power wheelchair. Participants performed 15 common skills for wheelchair users, and their capacities were evaluated according to a standard test. Using head control, volunteers A and B scored 93.3% and 86.6%, respectively; using voice control, they scored 63.3% and 66.6%, respectively. These results show that the end-users achieved high performance, completing most of the skills with the head-movement interface; in contrast, they were unable to complete most of the skills using voice control. The results provide valuable information for tailoring the sensing platform to end-user needs.
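As an illustration of the head-movement channel, a tilt estimate from a head-mounted accelerometer can be thresholded into discrete wheelchair commands; the sketch below is a plausible minimal scheme, and the axis conventions and the 20-degree threshold are assumptions rather than values from the article:

```python
import math

def head_command(ax, ay, az, thresh_deg=20.0):
    """Convert one accelerometer sample (in g) into a discrete command:
    pitch forward/back -> FORWARD/REVERSE, roll left/right -> turns."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    if pitch > thresh_deg:
        return "FORWARD"
    if pitch < -thresh_deg:
        return "REVERSE"
    if roll > thresh_deg:
        return "TURN_RIGHT"
    if roll < -thresh_deg:
        return "TURN_LEFT"
    return "STOP"

print(head_command(0.50, 0.02, 0.85))  # head tilted forward -> FORWARD
```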
Affiliation(s)
- Mario Rojas: Tecnologico de Monterrey, School of Engineering and Sciences, Mexico City, Mexico
- Pedro Ponce: Tecnologico de Monterrey, School of Engineering and Sciences, Mexico City, Mexico
- Arturo Molina: Tecnologico de Monterrey, School of Engineering and Sciences, Mexico City, Mexico
5
Esposito D, Centracchio J, Andreozzi E, Gargiulo GD, Naik GR, Bifulco P. Biosignal-Based Human-Machine Interfaces for Assistance and Rehabilitation: A Survey. Sensors 2021; 21:6863. [PMID: 34696076] [PMCID: PMC8540117] [DOI: 10.3390/s21206863]
Abstract
By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and to identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were used to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application, using six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets show only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase HMI complexity, so their usefulness should be carefully evaluated for each specific application.
Affiliation(s)
- Daniele Esposito: Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Jessica Centracchio: Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Emilio Andreozzi: Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
- Gaetano D. Gargiulo: School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia; The MARCS Institute, Western Sydney University, Penrith, NSW 2751, Australia
- Ganesh R. Naik (corresponding author): School of Engineering, Design and Built Environment, Western Sydney University, Penrith, NSW 2747, Australia; The Adelaide Institute for Sleep Health, Flinders University, Bedford Park, SA 5042, Australia
- Paolo Bifulco: Department of Electrical Engineering and Information Technologies, Polytechnic and Basic Sciences School, University of Naples “Federico II”, 80125 Naples, Italy
6
Zhu B, Zhang D, Chu Y, Zhao X, Zhang L, Zhao L. Face-Computer Interface (FCI): Intent Recognition Based on Facial Electromyography (fEMG) and Online Human-Computer Interface With Audiovisual Feedback. Front Neurorobot 2021; 15:692562. [PMID: 34335220] [PMCID: PMC8322851] [DOI: 10.3389/fnbot.2021.692562]
Abstract
Patients who have lost the ability to control their limbs, such as those with upper limb amputation or high paraplegia, are usually unable to care for themselves. Establishing a natural, stable, and comfortable human-computer interface (HCI) for controlling rehabilitation robots and other controllable equipment would address many of their difficulties. In this study, a complete limbs-free face-computer interface (FCI) framework based on facial electromyography (fEMG), covering both offline analysis and online control of mechanical equipment, is proposed. Six facial movements involving the eyebrows, eyes, and mouth are used in this FCI. In the offline stage, 12 models, eight types of features, and three feature-combination methods for model input were studied and compared in detail. In the online stage, four well-designed sessions were used to control a robotic arm to complete a drinking task in three ways (by touch screen, and by fEMG with and without audio feedback) to verify the proposed FCI framework and compare its performance. Three features and one model, with an average offline recognition accuracy of 95.3% (maximum 98.8%, minimum 91.4%), were selected for the online scenarios. The condition with audio feedback performed better than the one without. All subjects completed the drinking task within a few minutes using the FCI: the average and smallest time differences between the touch screen and fEMG with audio feedback were only 1.24 and 0.37 min, respectively.
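The abstract does not name the 12 models or eight feature types; as a stand-in, the sketch below pairs classic time-domain EMG features with scikit-learn's linear discriminant analysis to show the shape of such an offline pipeline (synthetic data replaces real windowed fEMG):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def window_features(x):
    """Classic time-domain EMG features for one analysis window: mean
    absolute value, root mean square, waveform length, zero crossings."""
    zc = np.sum(np.abs(np.diff(np.signbit(x).astype(int))))
    return np.array([np.mean(np.abs(x)),
                     np.sqrt(np.mean(x**2)),
                     np.sum(np.abs(np.diff(x))),
                     zc])

def featurize(windows):
    """Concatenate per-channel features for each multi-channel window."""
    return np.array([np.concatenate([window_features(ch) for ch in w])
                     for w in windows])

# Synthetic stand-in: 120 training windows, 4 channels, 256 samples each,
# with labels 0..5 for the six facial movements.
rng = np.random.default_rng(0)
X_train = featurize(rng.standard_normal((120, 4, 256)))
y_train = rng.integers(0, 6, 120)
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(clf.predict(featurize(rng.standard_normal((3, 4, 256)))))
```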
Affiliation(s)
- Bo Zhu: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China; Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, China; University of Chinese Academy of Sciences, Beijing, China
- Daohui Zhang: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China; Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, China
- Yaqi Chu: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China; Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, China; University of Chinese Academy of Sciences, Beijing, China
- Xingang Zhao: State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China; Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, China
- Lixin Zhang: Rehabilitation Center, Shengjing Hospital of China Medical University, Shenyang, China
- Lina Zhao: Rehabilitation Center, Shengjing Hospital of China Medical University, Shenyang, China
7
Kaur A. Wheelchair control for disabled patients using EMG/EOG based human machine interface: a review. J Med Eng Technol 2020; 45:61-74. [PMID: 33302770] [DOI: 10.1080/03091902.2020.1853838]
Abstract
Human-machine interfaces (HMIs) driven by bio-signals have been used to control rehabilitation equipment and improve the lives of people with severe disabilities. This paper presents a review of electromyogram (EMG)- and electrooculogram (EOG)-based control systems for driving wheelchairs for people with disabilities. For a paralysed person, EOG is one of the most useful signals, enabling successful communication with the environment through eye movements. In the case of amputation, selecting muscles according to their power and frequency distribution contributes substantially to producing specific wheelchair motions. Taking into account the day-to-day activities of people with disabilities, both technologies are used to design EMG- or EOG-based wheelchairs. This review examines a total of 70 EMG studies and 25 EOG studies published from 2000 to 2019. In addition, it covers the current technologies used in wheelchair systems for signal capture, filtering, characterisation, and classification, including control commands such as left and right turns, forward and reverse motion, acceleration, deceleration, and wheelchair stop.
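As a concrete example of the command layer such reviews cover, a pair of EOG channels can be thresholded into the wheelchair commands listed above; the thresholds and channel polarities below are illustrative assumptions, not values from the reviewed studies:

```python
def eog_command(h_eog_uv, v_eog_uv, thresh_uv=200.0):
    """Map horizontal/vertical EOG deflections (microvolts) to a command:
    strong horizontal saccades select turns, vertical ones select
    forward motion or stop; otherwise keep the current state."""
    if h_eog_uv > thresh_uv:
        return "TURN_RIGHT"
    if h_eog_uv < -thresh_uv:
        return "TURN_LEFT"
    if v_eog_uv > thresh_uv:
        return "FORWARD"
    if v_eog_uv < -thresh_uv:
        return "STOP"
    return "HOLD"  # no saccade detected

print(eog_command(350.0, 10.0))  # rightward saccade -> TURN_RIGHT
```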
Affiliation(s)
- Amanpreet Kaur: Department of Electronics and Communication Engineering, Thapar Institute of Engineering and Technology, Patiala, India