Reference Citation Analysis
For: Krausz NE, Lamotte D, Batzianoulis I, Hargrove LJ, Micera S, Billard A. Intent Prediction Based on Biomechanical Coordination of EMG and Vision-Filtered Gaze for End-Point Control of an Arm Prosthesis. IEEE Trans Neural Syst Rehabil Eng 2020;28:1471-1480. [PMID: 32386160] [DOI: 10.1109/tnsre.2020.2992885] [Citations in RCA: 14]
Cited by Other Articles
1. Xia H, Zhang Y, Rajabi N, Taleb F, Yang Q, Kragic D, Li Z. Shaping high-performance wearable robots for human motor and sensory reconstruction and enhancement. Nat Commun 2024;15:1760. [PMID: 38409128] [PMCID: PMC10897332] [DOI: 10.1038/s41467-024-46249-0] [Citations in RCA: 0] Open Access
2. Segas E, Mick S, Leconte V, Dubois O, Klotz R, Cattaert D, de Rugy A. Intuitive movement-based prosthesis control enables arm amputees to reach naturally in virtual reality. eLife 2023;12:RP87317. [PMID: 37847150] [PMCID: PMC10581689] [DOI: 10.7554/elife.87317] [Citations in RCA: 0] Open Access
3. Yang B, Chen X, Xiao X, Yan P, Hasegawa Y, Huang J. Gaze and Environmental Context-Guided Deep Neural Network and Sequential Decision Fusion for Grasp Intention Recognition. IEEE Trans Neural Syst Rehabil Eng 2023;31:3687-3698. [PMID: 37703142] [DOI: 10.1109/tnsre.2023.3314503] [Citations in RCA: 0]
4. Yang S, Garg NP, Gao R, Yuan M, Noronha B, Ang WT, Accoto D. Learning-Based Motion-Intention Prediction for End-Point Control of Upper-Limb-Assistive Robots. Sensors (Basel) 2023;23:2998. [PMID: 36991709] [PMCID: PMC10056111] [DOI: 10.3390/s23062998] [Citations in RCA: 0]
5. Shi P, Fang K, Yu H. Design and control of intelligent bionic artificial hand based on image recognition. Technol Health Care 2023;31:21-35. [PMID: 35723126] [DOI: 10.3233/thc-213320] [Citations in RCA: 0]
6. Qu J, Guo H, Wang W, Dang S. Prediction of Human-Computer Interaction Intention Based on Eye Movement and Electroencephalograph Characteristics. Front Psychol 2022;13:816127. [PMID: 35496176] [PMCID: PMC9039167] [DOI: 10.3389/fpsyg.2022.816127] [Citations in RCA: 1] Open Access
7. Bao T, Xie SQ, Yang P, Zhou P, Zhang ZQ. Towards Robust, Adaptive and Reliable Upper-Limb Motion Estimation Using Machine Learning and Deep Learning: A Survey in Myoelectric Control. IEEE J Biomed Health Inform 2022;26:3822-3835. [PMID: 35294368] [DOI: 10.1109/jbhi.2022.3159792] [Citations in RCA: 8]
8. Karrenbach M, Boe D, Sie A, Bennett R, Rombokas E. Improving automatic control of upper-limb prosthesis wrists using gaze-centered eye tracking and deep learning. IEEE Trans Neural Syst Rehabil Eng 2022;30:340-349. [PMID: 35100118] [DOI: 10.1109/tnsre.2022.3147772] [Citations in RCA: 0]
9. Lotti N, Xiloyannis M, Missiroli F, Bokranz C, Chiaradia D, Frisoli A, Riener R, Masia L. Myoelectric or Force Control? A Comparative Study on a Soft Arm Exosuit. IEEE Trans Robot 2022. [DOI: 10.1109/tro.2021.3137748] [Citations in RCA: 7]
10. Park H, Kim S, Nussbaum MA, Srinivasan D. Effects of using a whole-body powered exoskeleton during simulated occupational load-handling tasks: A pilot study. Appl Ergon 2022;98:103589. [PMID: 34563748] [DOI: 10.1016/j.apergo.2021.103589] [Citations in RCA: 2]
11. Mouchoux J, Bravo-Cabrera MA, Dosen S, Schilling AF, Markovic M. Impact of Shared Control Modalities on Performance and Usability of Semi-autonomous Prostheses. Front Neurorobot 2021;15:768619. [PMID: 34975446] [PMCID: PMC8718752] [DOI: 10.3389/fnbot.2021.768619] [Citations in RCA: 0] Open Access
12. Crocher V, Singh R, Newn J, Oetomo D. Towards a Gaze-Informed Movement Intention Model for Robot-Assisted Upper-Limb Rehabilitation. Annu Int Conf IEEE Eng Med Biol Soc 2021;2021:6155-6158. [PMID: 34892521] [DOI: 10.1109/embc46164.2021.9629610] [Citations in RCA: 0]
13. Zhu B, Zhang D, Chu Y, Zhao X, Zhang L, Zhao L. Face-Computer Interface (FCI): Intent Recognition Based on Facial Electromyography (fEMG) and Online Human-Computer Interface With Audiovisual Feedback. Front Neurorobot 2021;15:692562. [PMID: 34335220] [PMCID: PMC8322851] [DOI: 10.3389/fnbot.2021.692562] [Citations in RCA: 4] Open Access
14. Koochaki F, Najafizadeh L. A Data-Driven Framework for Intention Prediction via Eye Movement With Applications to Assistive Systems. IEEE Trans Neural Syst Rehabil Eng 2021;29:974-984. [PMID: 34038364] [DOI: 10.1109/tnsre.2021.3083815] [Citations in RCA: 1]
© 2004-2024 Baishideng Publishing Group Inc. All rights reserved. 7041 Koll Center Parkway, Suite 160, Pleasanton, CA 94566, USA