1
Tao Q, Liu J, Zheng Y, Yang Y, Lin C, Guang C. Evaluation of an Active Disturbance Rejection Controller for Ophthalmic Robots with Piezo-Driven Injector. Micromachines 2024;15:833. PMID: 39064342; PMCID: PMC11278564; DOI: 10.3390/mi15070833.
Abstract
Retinal vein cannulation involves puncturing an occluded vessel on the micron scale, where even a single millinewton of force can cause permanent damage. An ophthalmic robot with a piezo-driven injector is precise enough to perform this delicate procedure, but the uncertain viscoelastic characteristics of the vessel make it difficult to achieve the desired contact force without harming the retina. To address this issue, the paper uses a viscoelastic contact model to describe the mechanical characteristics of retinal blood vessels. The uncertainty in the viscoelastic properties is treated as an internal disturbance of the contact model, and an active disturbance rejection controller is then proposed to precisely control the contact force. The experimental results show that this method can precisely adjust the contact force at the millinewton level even when the viscoelastic parameters vary significantly (by up to 403.8%). The root mean square (RMS) and maximum steady-state errors are 0.32 mN and 0.41 mN, respectively. The response time is below 2.51 s with no obvious overshoot.
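The core idea of active disturbance rejection control, lumping the uncertain viscoelastic dynamics into a single "total disturbance" that an extended state observer estimates and cancels, can be sketched as follows. The first-order plant model, gains, and setpoint below are illustrative stand-ins, not the parameters of the cited paper.

```python
import numpy as np

def simulate_adrc(f_ref=5.0, a_true=3.0, b_true=350.0, b0=400.0,
                  wo=80.0, wc=15.0, dt=1e-3, t_end=3.0):
    """Linear ADRC on a first-order contact-force plant f' = -a*f + b*u.
    The true (a, b) stand in for the uncertain viscoelastic parameters
    and are unknown to the controller; all numbers are illustrative."""
    n = int(t_end / dt)
    f = 0.0                # actual contact force (mN)
    z1, z2 = 0.0, 0.0      # ESO states: force estimate, total disturbance
    hist = np.empty(n)
    for i in range(n):
        u = (wc * (f_ref - z1) - z2) / b0     # disturbance rejection + P law
        e = z1 - f                            # observer innovation
        z1 += dt * (z2 + b0 * u - 2.0 * wo * e)
        z2 += dt * (-wo**2 * e)
        f += dt * (-a_true * f + b_true * u)  # plant (hidden from controller)
        hist[i] = f
    return hist

forces = simulate_adrc()   # settles near the 5 mN setpoint
```

Because the observer folds the model mismatch (here, b0 ≠ b_true and the unknown stiffness term) into z2 and the control law subtracts it, the tracking loop behaves like a simple first-order response even when the "viscoelastic" parameters are wrong.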
Affiliation(s)
- Qiannan Tao
- School of Energy and Power Engineering, Beihang University, Beijing 100191, China
- Jianjun Liu
- School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Yu Zheng
- College of Automation and College of Artificial Intelligence, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
- Yang Yang
- School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Chuang Lin
- School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China
- Chenhan Guang
- School of Mechanical and Materials Engineering, North China University of Technology, Beijing 100144, China
2
Esfandiari M, Kim JW, Zhao B, Amirkhani G, Hadi M, Gehlbach P, Taylor RH, Iordachita I. Cooperative vs. Teleoperation Control of the Steady Hand Eye Robot with Adaptive Sclera Force Control: A Comparative Study. Proc IEEE Int Conf Robot Autom (ICRA) 2024:8209-8215. PMID: 39421218; PMCID: PMC11486505; DOI: 10.1109/icra57147.2024.10611084.
Abstract
A surgeon's physiological hand tremor can significantly impact the outcome of delicate and precise retinal surgery, such as retinal vein cannulation (RVC) and epiretinal membrane peeling. Robot-assisted eye surgery technology provides ophthalmologists with advanced capabilities such as hand tremor cancellation, hand motion scaling, and safety constraints, enabling them to perform these otherwise challenging and high-risk surgeries with high precision and safety. The Steady-Hand Eye Robot (SHER) in cooperative control mode can filter out the surgeon's hand tremor; however, this mode cannot provide another important safety feature: minimizing the contact force between the surgical instrument and the sclera to avoid tissue damage. Other capabilities, such as hand motion scaling and haptic feedback, also require a teleoperation control framework. In this work, for the first time, we implement a teleoperation control mode incorporating an adaptive sclera force control algorithm, using a PHANTOM Omni haptic device and a force-sensing surgical instrument equipped with Fiber Bragg Grating (FBG) sensors attached to the SHER 2.1 end-effector. The adaptive sclera force control algorithm allows the robot to dynamically minimize the tool-sclera contact force. Moreover, for the first time, we compare the performance of the proposed adaptive teleoperation mode with the cooperative mode by conducting a vessel-following experiment inside an eye phantom under a microscope.
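The effect of such a sclera force control layer, commanding a compensating tool motion so the measured tool-sclera force decays toward zero, can be illustrated with a much simpler, non-adaptive admittance-style sketch. The linear spring model of the sclera, the gain, and all numbers are invented for illustration and are not the paper's adaptive law.

```python
import numpy as np

def sclera_force_decay(k_s=120.0, gain=0.05, x0=2.0, dt=1e-3, t_end=1.0):
    """Command a tool velocity opposite the measured scleral force so the
    tool-sclera contact force decays toward zero. The sclera is modeled
    as a linear spring; all parameters are illustrative."""
    n = int(t_end / dt)
    x = x0                        # lateral scleral deflection (mm)
    forces = np.empty(n)
    for i in range(n):
        f = k_s * x               # measured scleral force (mN), FBG stand-in
        x += -gain * f * dt       # admittance-style corrective velocity
        forces[i] = f
    return forces

forces = sclera_force_decay()
```

Under this proportional law the force decays exponentially; an adaptive scheme like the paper's would additionally adjust the gain online as the tissue response varies.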
Affiliation(s)
- Mojtaba Esfandiari
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Ji Woong Kim
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Botao Zhao
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Golchehr Amirkhani
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Muhammad Hadi
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Peter Gehlbach
- Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD 21287, USA
- Russell H Taylor
- Department of Computer Science and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Iulian Iordachita
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
3
Luo J, Zhou X, Zeng C, Jiang Y, Qi W, Xiang K, Pang M, Tang B. Robotics Perception and Control: Key Technologies and Applications. Micromachines 2024;15:531. PMID: 38675342; PMCID: PMC11052398; DOI: 10.3390/mi15040531.
Abstract
The integration of advanced sensor technologies has significantly propelled the dynamic development of robotics, inaugurating a new era in automation and artificial intelligence. Given the rapid advancements in robotics technology, its core area, robot control technology, has attracted increasing attention. Notably, sensors and sensor fusion technologies, which are considered essential for enhancing robot control, have been widely and successfully applied in the field of robotics. Therefore, the integration of sensors and sensor fusion techniques with robot control technologies, which enables adaptation to various tasks in new situations, is emerging as a promising approach. This review delineates how sensors and sensor fusion technologies are combined with robot control technologies. It presents nine types of sensors used in robot control, discusses representative control methods, and summarizes their applications across various domains. Finally, the survey discusses existing challenges and potential future directions.
Affiliation(s)
- Jing Luo
- School of Automation, Wuhan University of Technology, Wuhan 430070, China
- Chongqing Research Institute, Wuhan University of Technology, Chongqing 401135, China
- Xiangyu Zhou
- School of Automation, Wuhan University of Technology, Wuhan 430070, China
- Chao Zeng
- Department of Informatics, University of Hamburg, 22527 Hamburg, Germany
- Yiming Jiang
- School of Robotics, Hunan University, Changsha 410082, China
- Wen Qi
- School of Future Technology, South China University of Technology, Guangzhou 510641, China
- Kui Xiang
- School of Automation, Wuhan University of Technology, Wuhan 430070, China
- Muye Pang
- School of Automation, Wuhan University of Technology, Wuhan 430070, China
- Biwei Tang
- School of Automation, Wuhan University of Technology, Wuhan 430070, China
4
Posselli NR, Bernstein PS, Abbott JJ. Eye-mounting goggles to bridge the gap between benchtop experiments and in vivo robotic eye surgery. Sci Rep 2023;13:15503. PMID: 37726336; PMCID: PMC10509142; DOI: 10.1038/s41598-023-42561-9.
Abstract
A variety of robot-assisted surgical systems have been proposed to improve the precision of eye surgery. Evaluation of these systems has typically relied on benchtop experiments with artificial or enucleated eyes. However, this does not properly account for the types of head motion that are common among patients undergoing eye surgery and that a clinical robotic system will encounter. In vivo experiments are clinically realistic, but they are risky and thus require the robotic system to be at a sufficiently mature state of development. In this paper, we describe a low-cost device that mounts an artificial or enucleated eye to standard swim goggles worn by a human volunteer, allowing more realistic evaluation of eye-surgery robots after benchtop studies and prior to in vivo studies. The mounted eye can rotate about its center, with a rotational stiffness matching that of an anesthetized patient's eye. We describe surgeon feedback and technical analyses to verify that various aspects of the design are sufficient for simulating a patient's eye during surgery.
Affiliation(s)
- Nicholas R Posselli
- Robotics Center and Department of Mechanical Engineering, University of Utah, Salt Lake City, UT, USA
- Paul S Bernstein
- Department of Ophthalmology and Visual Sciences, Moran Eye Center, University of Utah, Salt Lake City, UT, USA
- Jake J Abbott
- Robotics Center and Department of Mechanical Engineering, University of Utah, Salt Lake City, UT, USA
5
Wu Z, Chen D, Pan C, Zhang G, Chen S, Shi J, Meng C, Zhao X, Tao B, Chen D, Liu W, Ding H, Tang Z. Surgical Robotics for Intracerebral Hemorrhage Treatment: State of the Art and Future Directions. Ann Biomed Eng 2023;51:1933-1941. PMID: 37405558; PMCID: PMC10409846; DOI: 10.1007/s10439-023-03295-x.
Abstract
Intracerebral hemorrhage (ICH) is a stroke subtype with high mortality and disability, and there are no proven medical treatments that improve the functional outcome of ICH patients. Robot-assisted neurosurgery is a significant advancement in the development of minimally invasive surgery for ICH. This review covers the latest advances and future directions of surgical robots for ICH. First, three robotic systems for neurosurgery applied to ICH are illustrated. Second, the key technologies of robot-assisted surgery for ICH are introduced in terms of stereotactic technique and navigation, the puncture instrument, and hematoma evacuation. Finally, the limitations of current surgical robots are summarized, and a possible development direction, termed "multisensor fusion and intelligent aspiration control of minimally invasive surgical robots for ICH", is discussed. It is expected that the new generation of surgical robots for ICH will facilitate quantitative, precise, individualized, standardized treatment strategies.
Affiliation(s)
- Zhuojin Wu
- Department of Neurology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, China
- Danyang Chen
- Department of Neurology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, China
- Chao Pan
- Department of Neurology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, China
- Ge Zhang
- Department of Neurology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, China
- Shiling Chen
- Department of Neurology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, China
- Jian Shi
- School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
- Cai Meng
- School of Mechanical Engineering and Automation, Beihang University, Beijing 100083, China
- Xingwei Zhao
- School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
- Bo Tao
- School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
- Diansheng Chen
- School of Mechanical Engineering and Automation, Beihang University, Beijing 100083, China
- Wenjie Liu
- Beijing WanTeFu Medical Instrument Co., Ltd, Beijing 102299, China
- Han Ding
- School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
- Zhouping Tang
- Department of Neurology, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan 430030, China
6
Ebrahimi A, Sefati S, Gehlbach P, Taylor RH, Iordachita I. Simultaneous Online Registration-Independent Stiffness Identification and Tip Localization of Surgical Instruments in Robot-assisted Eye Surgery. IEEE Trans Robot 2023;39:1373-1387. PMID: 37377922; PMCID: PMC10292740; DOI: 10.1109/tro.2022.3201393.
Abstract
Notable challenges during retinal surgery lend themselves to robotic assistance, which has proven beneficial in providing safe, steady-hand manipulation. Efficient assistance from robots relies heavily on accurate sensing of surgical states (e.g., instrument tip position and tool-to-tissue interaction forces). Many existing tool-tip localization methods require preoperative frame registrations or instrument calibrations. In this study, using an iterative approach that combines vision- and force-based methods, we develop calibration- and registration-independent (RI) algorithms to provide online estimates of instrument stiffness (least squares and adaptive). The estimates are then combined with a state-space model based on the forward kinematics (FWK) of the Steady-Hand Eye Robot (SHER) and Fiber Bragg Grating (FBG) sensor measurements. This is accomplished using a Kalman filtering (KF) approach to improve estimation of the deflected instrument tip position during robot-assisted eye surgery. The experiments demonstrate that when the online RI stiffness estimates are used, the instrument tip localization results surpass those obtained from preoperative offline stiffness calibrations.
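The flavor of online stiffness identification from streaming force and deflection data can be sketched with a generic recursive least-squares (RLS) estimator fitting the scalar model force = k × deflection. The forgetting factor and synthetic data below are illustrative, and the paper's estimator details may differ.

```python
import numpy as np

def rls_stiffness(forces, deflections, lam=0.98, k0=0.0, p0=1e3):
    """Generic recursive least squares for the scalar model f = k * d,
    with forgetting factor lam so the estimate can track slowly
    varying stiffness. Illustrative, not the paper's exact estimator."""
    k, p = k0, p0
    for f, d in zip(forces, deflections):
        g = p * d / (lam + d * p * d)   # RLS gain
        k += g * (f - k * d)            # innovation update
        p = (p - g * d * p) / lam       # covariance update
    return k

# Synthetic stream: true stiffness 3.5 with small measurement noise
rng = np.random.default_rng(0)
d = rng.uniform(0.5, 2.0, 200)           # tip deflections (arbitrary units)
f = 3.5 * d + rng.normal(0.0, 0.01, 200)
k_hat = rls_stiffness(f, d)
```

Such an online estimate can then feed a Kalman filter as a model parameter, replacing an offline calibration.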
Affiliation(s)
- Ali Ebrahimi
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Shahriar Sefati
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Peter Gehlbach
- Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD 21287, USA
- Russell H Taylor
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Department of Computer Science and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Iulian Iordachita
- Department of Mechanical Engineering and Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
7
Patel N, Urias M, Ebrahimi A, Taylor RH, Gehlbach P, Iordachita I. Force-based Control for Safe Robot-assisted Retinal Interventions: In Vivo Evaluation in Animal Studies. IEEE Trans Med Robot Bionics 2022;4:578-587. PMID: 36033345; PMCID: PMC9410268; DOI: 10.1109/tmrb.2022.3191441.
Abstract
In recent years, robotic assistance in vitreoretinal surgery has moved from the benchtop to the operating room. Emerging robotic systems improve tool maneuverability, provide precise tool motions in the constrained intraocular environment, and reduce or remove hand tremor. However, often due to their stiff and bulky mechanical structure, they diminish the perception of tool-to-sclera (scleral) forces on which the surgeon relies for eyeball manipulation. In this paper, we measure these scleral forces and actively control the robot to keep them under a predefined threshold. Scleral forces are measured using a Fiber Bragg Grating (FBG) based force-sensing instrument in an in vivo rabbit eye model under manual operation, cooperative robotic assistance with no scleral force control (NC), adaptive scleral force norm control (ANC), and adaptive scleral force component control (ACC). To the best of our knowledge, this is the first time scleral forces have been measured in an in vivo eye model during robot-assisted vitreoretinal procedures. An experienced retinal surgeon repeated an intraocular tool manipulation (ITM) task 10 times in four in vivo rabbit eyes and a phantom eyeball, for a total of 50 repetitions in each control mode. Statistical analysis shows that during the in vivo studies the ANC and ACC control schemes restrict the duration of undesired scleral forces to 4.41% and 14.53%, respectively, compared with 43.30% and 35.28% in the manual and NC cases. These results show that active robot control schemes can maintain applied scleral forces below a desired threshold during robot-assisted vitreoretinal surgery. The scleral force measurements in this study may enable a better understanding of tool-to-sclera interactions during vitreoretinal surgery, and the proposed control strategies could be extended to other microsurgical and robot-assisted interventions.
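The idea behind threshold-based norm control, intervening only when the measured scleral force norm exceeds a safety limit, can be sketched as follows. The threshold, gain, and the simple proportional corrective velocity are illustrative stand-ins, not the adaptive laws evaluated in the paper.

```python
import numpy as np

def scleral_norm_correction(f_xy, f_max=120.0, gain=0.04):
    """Return a corrective velocity for the tool insertion point that
    unloads the sclera when the measured force norm exceeds f_max (mN),
    and zero velocity when the force is within the safe range.
    All parameters are illustrative."""
    f = np.asarray(f_xy, dtype=float)
    norm = np.linalg.norm(f)
    if norm <= f_max:
        return np.zeros_like(f)              # safe: no intervention
    # push along the measured force direction, proportional to the excess
    return gain * (norm - f_max) * (f / norm)
```

A component-control variant would instead apply the same logic per force axis; an adaptive scheme would adjust the gain online.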
Affiliation(s)
- Niravkumar Patel
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Indian Institute of Technology Madras, Chennai, India
- Muller Urias
- Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD 21287, USA
- Ali Ebrahimi
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Russell H Taylor
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Peter Gehlbach
- Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD 21287, USA
- Iulian Iordachita
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
8
Short Circuit and Broken Rotor Faults Severity Discrimination in Induction Machines Using Non-invasive Optical Fiber Technology. Energies 2022. DOI: 10.3390/en15020577.
Abstract
Multiple techniques continue to be used simultaneously in the condition monitoring and fault detection of electric machines, as no single technique yet provides an all-round fault-finding solution. Having various fault-detection techniques is useful because two or more can be combined into a more comprehensive, application-dependent condition-monitoring solution, especially given the increasing role these machines are expected to play in the transition to a more sustainable environment, where many more electric machines will be required. This paper presents a novel non-invasive optical-fiber stray-flux technique for the condition monitoring and fault detection of induction machines. A giant magnetostrictive transducer made of Terfenol-D was bonded onto a fiber Bragg grating to form a composite FBG-T sensor, which uses the machine's stray flux to determine its internal condition. Three machine conditions were investigated: healthy, broken rotor, and short-circuit inter-turn fault. A tri-axial auto-data-logging flux meter was used to obtain stray magnetic flux measurements, and the numerical results obtained with LabVIEW were analyzed in MATLAB. The optimal positioning and sensitivity of the FBG-T sensor were found to be transverse and 19.3810 pm/μT, respectively. The experimental results showed that the FBG-T sensor accurately distinguished each of the three machine conditions by the order of magnitude of its Bragg wavelength shifts, with the most severe fault reaching shifts of hundreds of picometres (pm), compared with the low-to-mid-hundred and high-hundred picometre ranges for the healthy and broken rotor conditions, respectively. A fast Fourier transform (FFT) analysis performed on the measured stray flux revealed that its spectral content affected the magnetostrictive behavior of the magnetic dipoles in the Terfenol-D transducer, which translated into strain on the fiber gratings.
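The FFT step of such an analysis, resolving a dominant supply-frequency component and small fault-related sidebands in a stray-flux signal, can be sketched on synthetic data. The 50 Hz supply component and ±4 Hz sidebands below are illustrative, not measurements from the paper.

```python
import numpy as np

def stray_flux_spectrum(fs=1000.0, t_end=2.0):
    """One-sided amplitude spectrum of a synthetic stray-flux signal:
    a 50 Hz supply component plus small fault-related sidebands.
    Frequencies and amplitudes are illustrative."""
    t = np.arange(0.0, t_end, 1.0 / fs)
    flux = (np.sin(2 * np.pi * 50 * t)
            + 0.05 * np.sin(2 * np.pi * 46 * t)   # lower sideband
            + 0.05 * np.sin(2 * np.pi * 54 * t))  # upper sideband
    spec = 2.0 * np.abs(np.fft.rfft(flux)) / len(t)   # amplitude scaling
    freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
    return freqs, spec

freqs, spec = stray_flux_spectrum()
peak_hz = freqs[np.argmax(spec)]   # dominant supply component
```

With a 2 s record the bin spacing is 0.5 Hz, so the 46 Hz and 54 Hz sidebands land exactly on bins and their amplitudes are read off directly.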
9
Zhou M, Wu J, Ebrahimi A, Patel N, Liu Y, Navab N, Gehlbach P, Knoll A, Nasseri MA, Iordachita I. Spotlight-based 3D Instrument Guidance for Autonomous Task in Robot-assisted Retinal Surgery. IEEE Robot Autom Lett 2021;6:7750-7757. PMID: 35309100; PMCID: PMC8932929; DOI: 10.1109/lra.2021.3100937.
Abstract
Retinal surgery is known to be a complicated and challenging task, even for retina specialists. Image-guided robot-assisted intervention is among the novel and promising solutions that may enhance human capabilities during microsurgery. In this paper, a novel method is proposed for 3D navigation of a microsurgical instrument based on the projection of a spotlight during robot-assisted retinal surgery. To test the feasibility and effectiveness of the proposed method, a vessel-tracking task with a Remote Center of Motion (RCM) constraint is performed in a phantom using the Steady-Hand Eye Robot (SHER). Manual tracking, cooperative-control tracking with the SHER, and spotlight-based automatic tracking with the SHER are compared. The results show that spotlight-based automatic tracking achieves an average tracking error of 0.013 mm and a distance-keeping error of 0.1 mm from the desired range, a significant improvement over the manual and cooperative control methods alone.
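One way a projected spotlight can encode depth is through the spot's size: for an idealized cone of light, the spot radius grows linearly with the distance from the fiber tip to the retina. The inverse mapping below is a purely illustrative geometric sketch under that cone assumption, with invented fiber radius and beam angle; it is not the paper's actual navigation method.

```python
import math

def tip_distance_from_spot(spot_radius_mm, r_fiber_mm=0.05,
                           half_angle_deg=15.0):
    """Invert the cone-projection model
        spot_radius = r_fiber + distance * tan(half_angle)
    to recover the tip-to-retina distance from the observed spot radius.
    Fiber radius and beam half-angle are hypothetical values."""
    return (spot_radius_mm - r_fiber_mm) / math.tan(math.radians(half_angle_deg))
```

In practice the spot would be segmented from the microscope image and its radius converted to millimeters via the camera calibration before applying such a model.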
Affiliation(s)
- Mingchuan Zhou
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Department of Computer Science, Technische Universität München, 85748 München, Germany
- Jiahao Wu
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- T Stone Robotics Institute, Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, HKSAR, China
- Ali Ebrahimi
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Niravkumar Patel
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Yunhui Liu
- T Stone Robotics Institute, Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, HKSAR, China
- Nassir Navab
- Department of Computer Science, Technische Universität München, 85748 München, Germany
- Peter Gehlbach
- Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD 21287, USA, and Department of Electrical Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
- Alois Knoll
- Department of Computer Science, Technische Universität München, 85748 München, Germany
- M. Ali Nasseri
- Augenklinik und Poliklinik, Klinikum rechts der Isar der Technischen Universität München, 81675 München, Germany
- Iulian Iordachita
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA