1
Đalić V, Jovanović V, Marić P. Submillimeter-Accurate Markerless Hand-Eye Calibration Based on a Robot's Flange Features. Sensors (Basel) 2024; 24:1071. PMID: 38400232; PMCID: PMC10892941; DOI: 10.3390/s24041071.
Abstract
An accurate and reliable estimation of the transformation matrix between an optical sensor and a robot is a key aspect of the hand-eye system calibration process in vision-guided robotic applications. This paper presents a novel approach to markerless hand-eye calibration that achieves streamlined, flexible, and highly accurate results, even without error compensation. The calibration procedure is mainly based on using the robot's tool center point (TCP) as the reference point. The TCP coordinates are estimated from the robot's flange point cloud, considering its geometrical features. A mathematical model streamlining the conventional marker-based hand-eye calibration is derived. Furthermore, a novel algorithm for the automatic estimation of the flange's geometric features from its point cloud, based on 3D circle fitting, the least-squares method, and a nearest-neighbor (NN) approach, is proposed. The accuracy of the proposed algorithm is validated using a calibration setting ring as the ground truth. Furthermore, to establish the minimal required number and configuration of calibration points, the impact of the number and selection of unique flange positions on the calibration accuracy is investigated and validated by real-world experiments. Our experimental findings strongly indicate that our hand-eye system, employing the proposed algorithm, enables the estimation of the transformation between the robot and the 3D scanner with submillimeter accuracy, even when using a minimum of four non-coplanar points for calibration. Our approach improves the calibration accuracy by approximately four times compared to the state of the art, while eliminating the need for error compensation. Moreover, it reduces the required number of flange positions by approximately 40%, and by even more if the calibration procedure uses just four properly selected flange positions. The presented findings introduce a more efficient hand-eye calibration procedure, offering superior simplicity of implementation and increased precision in various robotic applications.
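The flange-feature estimation above rests on fitting a 3D circle to flange points by least squares. A minimal sketch of that step, assuming a plane fit by SVD followed by an algebraic (Kåsa) circle fit; the function name and the Kåsa variant are our assumptions, not necessarily the paper's exact algorithm:

```python
import numpy as np

def fit_circle_3d(points):
    """Least-squares 3D circle fit: plane via SVD, then a 2D Kasa circle fit."""
    centroid = points.mean(axis=0)
    # The right singular vector of the smallest singular value is the plane normal
    _, _, vt = np.linalg.svd(points - centroid)
    u, v, normal = vt[0], vt[1], vt[2]
    # Project centered points onto the in-plane basis (u, v)
    q = (points - centroid) @ np.stack([u, v], axis=1)  # shape (N, 2)
    # Kasa fit: x^2 + y^2 = 2a x + 2b y + c, center (a, b), r^2 = c + a^2 + b^2
    A = np.column_stack([2 * q[:, 0], 2 * q[:, 1], np.ones(len(q))])
    rhs = (q ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    center = centroid + a * u + b * v
    return center, radius, normal
```

The returned center is the candidate TCP reference point; the normal gives the flange axis direction.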
Affiliation(s)
- Velibor Đalić
- Faculty of Electrical Engineering, University of Banja Luka, Patre 5, 78000 Banja Luka, Bosnia and Herzegovina
2
Zhou K, Huang X, Li S, Li G. Convolutional neural network-based pose mapping estimation as an alternative to traditional hand-eye calibration. Rev Sci Instrum 2023; 94:065002. PMID: 37862475; DOI: 10.1063/5.0147783.
Abstract
The vision system is a crucial technology for realizing the automation and intelligence of industrial robots, and the accuracy of hand-eye calibration is crucial in determining the relationship between the camera and the robot end. Parallel robots are widely used in automated assembly due to their high positioning accuracy and large carrying capacity, but traditional hand-eye calibration methods may not be applicable because of their limited motion range and the resulting accuracy problems. To address this issue, we propose a pose nonlinear-mapping estimation method to solve the hand-eye calibration problem and construct a 1-D pose estimation convolutional neural network (PECNN) whose strong performance is established through experiments and discussion. The PECNN achieves an end-to-end mapping from the variation of the target-object pose to the variation of the robot-end pose. Our experiments show that the proposed hand-eye calibration method has high accuracy and can be applied to the automated assembly tasks of vision-guided parallel robots. Moreover, the method is also applicable to most parallel and serial robots.
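The PECNN maps pose variations to pose variations end to end. As a hypothetical, much simpler stand-in illustrating only that data flow (an affine least-squares baseline, not the paper's network; the function names are ours):

```python
import numpy as np

def fit_linear_pose_map(dX, dY):
    """Fit an affine map dY ~ [dX, 1] @ W by least squares.

    dX: (N, 6) variations of the target-object pose,
    dY: (N, 6) variations of the robot-end pose."""
    A = np.column_stack([dX, np.ones(len(dX))])
    W, *_ = np.linalg.lstsq(A, dY, rcond=None)
    return W  # (7, 6): linear part stacked over a bias row

def apply_pose_map(W, dx):
    """Predict the robot-end pose variation for one target-pose variation."""
    return np.append(dx, 1.0) @ W
```

The CNN replaces this linear map with a learned nonlinear one, which is what lets it absorb effects a fixed transformation matrix cannot.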
Affiliation(s)
- Kuai Zhou
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Xiang Huang
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Shuanggao Li
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Gen Li
- Suzhou Research Institute, Nanjing University of Aeronautics and Astronautics, Suzhou, China
3
Liu J, Sun W, Zhao Y, Zheng G. Ultrasound Probe and Hand-Eye Calibrations for Robot-Assisted Needle Biopsy. Sensors (Basel) 2022; 22:9465. PMID: 36502167; PMCID: PMC9740029; DOI: 10.3390/s22239465.
Abstract
In robot-assisted ultrasound-guided needle biopsy, it is essential to calibrate the ultrasound probe and to perform hand-eye calibration of the robot in order to establish a link between intra-operatively acquired ultrasound images and robot-assisted needle insertion. Based on a high-precision optical tracking system, novel methods for ultrasound probe and robot hand-eye calibration are proposed. Specifically, we first fix optically trackable markers to the ultrasound probe and to the robot, respectively. We then design a five-wire phantom to calibrate the ultrasound probe. Finally, an effective method for hand-eye calibration is proposed that takes advantage of the steady movement of the robot without requiring an additional calibration frame or solving the AX=XB equation. After calibration, our system allows for in situ definition of target lesions and aiming trajectories from intra-operatively acquired ultrasound images in order to align the robot for precise needle biopsy. Comprehensive experiments were conducted to evaluate the accuracy of individual components of our system as well as the overall system accuracy. The experimental results demonstrate the efficacy of the proposed methods.
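A standard building block in tracked-marker calibration pipelines like this one is pivot calibration: with the tool tip held at a fixed point while the marker is rotated, each pose satisfies R_i t + p_i = d for the unknown tip offset t and pivot point d. A generic least-squares sketch of that step (not necessarily the authors' exact formulation):

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Solve R_i t + p_i = d for tip offset t (marker frame) and pivot d (tracker frame)."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, p) in enumerate(zip(rotations, translations)):
        A[3 * i:3 * i + 3, :3] = R           # unknown t enters through R_i
        A[3 * i:3 * i + 3, 3:] = -np.eye(3)  # unknown d enters with -I
        b[3 * i:3 * i + 3] = -p
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]
```

At least three poses with distinct, non-parallel rotations are needed for a unique solution.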
4
Sun W, Liu J, Zhao Y, Zheng G. A Novel Point Set Registration-Based Hand-Eye Calibration Method for Robot-Assisted Surgery. Sensors (Basel) 2022; 22:8446. PMID: 36366144; PMCID: PMC9656731; DOI: 10.3390/s22218446.
Abstract
Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety compared with manual implantation. In developing such a system, hand-eye calibration is an essential component that aims to determine the transformation between a position-tracking system and a robot-arm system. In this paper, we propose an effective hand-eye calibration method, namely registration-based hand-eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX=XB equation. Our hand-eye calibration method consists of tool-tip pivot calibrations in two coordinate systems, in addition to paired-point matching, where the point pairs are generated via the steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand-eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand-eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.
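The paired-point matching step can be sketched with the standard Kabsch/SVD rigid registration, assuming known point correspondences; this is the generic solver for the problem class, not claimed to be the authors' exact implementation:

```python
import numpy as np

def register_points(P, Q):
    """Rigid (R, t) minimizing sum ||R P_i + t - Q_i||^2 via Kabsch/SVD.

    P, Q: (N, 3) corresponding point sets."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

With point pairs collected in the tracker frame and the robot frame, (R, t) directly gives the hand-eye transformation without the AX=XB formulation.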
5
Abstract
A robot can identify the position of a target and complete a grasp based on the hand–eye calibration algorithm, through which the relationship between the robot coordinate system and the camera coordinate system is established. The accuracy of the hand–eye calibration algorithm affects the real-time performance of the visual servo system and the robot manipulation. The traditional calibration technique is based on the mathematical model AX = XB, in which X represents the transformation relating the camera coordinate system (A) to the robot coordinate system (B). The traditional solution to the transformation matrix has inherent limitations and instability. To address this problem, an optimized neural-network-based hand–eye calibration method was developed to establish a non-linear relationship between robot coordinates and pixel coordinates that can compensate for the nonlinear distortion of the camera lens. The learning process of the hand–eye calibration model can be interpreted as B = f(A), a coordinate transformation relationship trained by the neural network. An accurate hand–eye calibration model is finally obtained by continuously optimizing the network structure and parameters via training. The accuracy and stability of the method were verified by experiments on a robot grasping system.
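For contrast with the learned mapping, the traditional AX = XB model admits classical closed-form solvers. A Park–Martin-style sketch under the assumption of at least two motion pairs with non-parallel rotation axes (this is the textbook baseline the abstract argues against, not this paper's method):

```python
import numpy as np

def log_so3(R):
    """Rotation matrix -> axis-angle (rotation) vector."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2.0 * np.sin(theta))

def solve_ax_xb(As, Bs):
    """Closed-form AX = XB from homogeneous motion pairs (A_i, B_i).

    Rotation: alpha_i = R_X beta_i solved as a Wahba problem via SVD;
    translation: stacked least squares on (R_Ai - I) t_X = R_X t_Bi - t_Ai."""
    alphas = np.array([log_so3(A[:3, :3]) for A in As])
    betas = np.array([log_so3(B[:3, :3]) for B in Bs])
    H = betas.T @ alphas
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    Rx = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    rhs = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx, *_ = np.linalg.lstsq(C, rhs, rcond=None)
    X = np.eye(4)
    X[:3, :3] = Rx
    X[:3, 3] = tx
    return X
```

The linear model assumes an ideal pinhole camera; the neural-network approach exists precisely because lens distortion violates that assumption.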
6
Richter F, Lu J, Orosco RK, Yip MC. Robotic Tool Tracking Under Partially Visible Kinematic Chain: A Unified Approach. IEEE Trans Robot 2021. DOI: 10.1109/tro.2021.3111441.
7
Özgüner O, Shkurti T, Huang S, Hao R, Jackson RC, Newman WS, Çavuşoğlu MC. Camera-Robot Calibration for the da Vinci® Robotic Surgery System. IEEE Trans Autom Sci Eng 2020; 17:2154-2161. PMID: 33746640; PMCID: PMC7978174; DOI: 10.1109/tase.2020.2986503.
Abstract
The development of autonomous or semi-autonomous surgical robots stands to improve the performance of existing teleoperated equipment, but requires fine hand-eye calibration between the free-moving endoscopic camera and patient-side manipulator arms (PSMs). A novel method of solving this problem for the da Vinci® robotic surgical system and kinematically similar systems is presented. First, a series of image-processing and optical-tracking operations are performed to compute the coordinate transformation between the endoscopic camera view frame and an optical-tracking marker permanently affixed to the camera body. Then, the kinematic properties of the PSM are exploited to compute the coordinate transformation between the kinematic base frame of the PSM and an optical marker permanently affixed thereto. Using these transformations, it is then possible to compute the spatial relationship between the PSM and the endoscopic camera using only one tracker snapshot of the two markers. The effectiveness of this calibration is demonstrated by successfully guiding the PSM end effector to points of interest identified through the camera. Additional tests on a surgical task, namely grasping a surgical needle, are also performed to validate the proposed method. The resulting visually-guided robot positioning accuracy is better than the earlier hand-eye calibration results reported in the literature for the da Vinci® system, while supporting intraoperative update of the calibration and requiring only devices that are already commonly used in the surgical environment.
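The single-snapshot idea reduces to composing known homogeneous transforms through the two tracked markers. A minimal sketch with hypothetical frame names, where T_a_b maps points expressed in frame b into frame a (the exact frame conventions here are our assumption, not the paper's):

```python
import numpy as np

def inv_T(T):
    """Invert a rigid homogeneous transform without a general matrix inverse."""
    Ti = np.eye(4)
    Ti[:3, :3] = T[:3, :3].T
    Ti[:3, 3] = -T[:3, :3].T @ T[:3, 3]
    return Ti

def camera_to_base(T_cam_mc, T_trk_mc, T_trk_mp, T_mp_base):
    """Chain: camera -> camera marker (mc) -> tracker (trk) -> PSM marker (mp) -> PSM base.

    T_cam_mc and T_mp_base are the pre-computed fixed-marker transforms;
    T_trk_mc and T_trk_mp come from one tracker snapshot of both markers."""
    return T_cam_mc @ inv_T(T_trk_mc) @ T_trk_mp @ T_mp_base
```

Because only the two tracker poses change at run time, the calibration can be refreshed intraoperatively from a single snapshot.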
Affiliation(s)
- Orhan Özgüner
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
- Thomas Shkurti
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
- Siqi Huang
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
- Ran Hao
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
- Russell C Jackson
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
- Wyatt S Newman
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
- M Cenk Çavuşoğlu
- Department of Electrical Engineering and Computer Science, Case Western Reserve University, Cleveland, OH
8
Sun Y, Pan B, Guo Y, Fu Y, Niu G. Vision-based hand-eye calibration for robot-assisted minimally invasive surgery. Int J Comput Assist Radiol Surg 2020; 15:2061-2069. PMID: 32808149; DOI: 10.1007/s11548-020-02245-5.
Abstract
PURPOSE: Knowledge of the laparoscope's view can greatly improve operating room (OR) efficiency. In vision-based computer-assisted surgery, hand-eye calibration establishes the coordinate relationship between the laparoscope and the robot's slave arm. While significant advances have been made in hand-eye calibration in recent years, an efficient algorithm for minimally invasive surgical robots remains a major challenge; in particular, estimating the hand-eye transformation without an external calibration object in the abdominal environment is still a critical problem. METHODS: We propose a novel hand-eye calibration algorithm for robot-assisted minimally invasive surgery (RMIS) that relies purely on the surgical instrument already present in the operating scenario. Our model is formed from the geometric information of the surgical instrument and the remote center-of-motion (RCM) constraint. We also enhance the algorithm with a stereo laparoscope model. RESULTS: Validation on synthetic simulations and an experimental surgical robot system was conducted to evaluate the proposed method. The results show that the proposed method can perform hand-eye calibration without a calibration object. CONCLUSION: A vision-based hand-eye calibration method is developed. We demonstrate the feasibility of performing hand-eye calibration using existing components of the surgical robot system, improving surgical OR efficiency.
Affiliation(s)
- Yanwen Sun
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China
- Bo Pan
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China.
- Yongchen Guo
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China
- Yili Fu
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China
- Guojun Niu
- School of Mechanical Engineering and Automation, Zhejiang Sci-Tech University, Hangzhou, China
9
Zhong F, Wang Z, Chen W, He K, Wang Y, Liu YH. Hand-Eye Calibration of Surgical Instrument for Robotic Surgery Using Interactive Manipulation. IEEE Robot Autom Lett 2020. DOI: 10.1109/lra.2020.2967685.
10
Robust and Accurate Hand-Eye Calibration Method Based on Schur Matric Decomposition. Sensors (Basel) 2019; 19:4490. PMID: 31623249; PMCID: PMC6832585; DOI: 10.3390/s19204490.
Abstract
To improve the accuracy and robustness of hand–eye calibration, a hand–eye calibration method based on Schur matric decomposition is proposed in this paper. The accuracy of such methods strongly depends on the quality of the observation data, so preprocessing the observation data is essential. As with traditional two-step hand–eye calibration methods, we first solve for the rotation parameters, after which the translation vector can be immediately determined. A general solution is obtained from one observation through Schur matric decomposition, decreasing the degrees of freedom from three to two. Observation-data preprocessing is one of the basic unresolved problems in hand–eye calibration; a discriminant equation for deleting outliers is deduced based on Schur matric decomposition, and the preprocessing problem is solved using outlier detection, which significantly improves robustness. The proposed method was validated by both simulations and experiments. The results show that the prediction errors of rotation and translation were 0.06 arcmin and 1.01 mm, respectively, and that the proposed method performed much better in outlier detection. A minimal configuration for the unique solution is also proven from a new perspective.
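The paper's discriminant equation comes from the Schur decomposition itself; a simpler necessary condition that can likewise screen outlier observations is that in AX = XB the rotation parts of A and B are similar matrices and therefore must share the same rotation angle. A sketch of that weaker check (not the paper's discriminant):

```python
import numpy as np

def rotation_angle(R):
    """Rotation angle of a 3x3 rotation matrix, from its trace."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def flag_outliers(As, Bs, tol=0.01):
    """Flag (A_i, B_i) pairs whose rotation angles differ by more than tol radians.

    Valid AX = XB observations satisfy angle(R_A) == angle(R_B) exactly,
    so a large mismatch indicates a corrupted measurement."""
    return [abs(rotation_angle(A[:3, :3]) - rotation_angle(B[:3, :3])) > tol
            for A, B in zip(As, Bs)]
```

Pairs that pass this screen would then be handed to the actual solver, mirroring the preprocess-then-solve structure the abstract describes.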