1. Zhou K, Huang X, Li S, Li G. Convolutional neural network-based pose mapping estimation as an alternative to traditional hand-eye calibration. Rev Sci Instrum 2023; 94:065002. [PMID: 37862475 DOI: 10.1063/5.0147783]
Abstract
The vision system is a crucial technology for realizing the automation and intelligence of industrial robots, and the accuracy of hand-eye calibration is crucial in determining the relationship between the camera and the robot end. Parallel robots are widely used in automated assembly due to their high positioning accuracy and large carrying capacity, but traditional hand-eye calibration methods may not be applicable due to their limited motion range and the resulting accuracy problems. To address this issue, we propose a nonlinear pose-mapping estimation method to solve the hand-eye calibration problem and construct a 1-D pose estimation convolutional neural network (PECNN) whose performance is demonstrated through experiments and discussion. The PECNN achieves an end-to-end mapping from the variation of the target object pose to the variation of the robot end pose. Our experiments show that the proposed hand-eye calibration method has high accuracy and can be applied to the automated assembly tasks of vision-guided parallel robots. Moreover, the method is also applicable to most parallel robots and tandem robots.
Affiliation(s)
- Kuai Zhou
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Xiang Huang
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Shuanggao Li
- College of Mechanical and Electrical Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
- Gen Li
- Suzhou Research Institute, Nanjing University of Aeronautics and Astronautics, Suzhou, China
2. Sever K, Golušin LM, Lončar J. Optimization of Gradient Descent Parameters in Attitude Estimation Algorithms. Sensors (Basel) 2023; 23:2298. [PMID: 36850898 PMCID: PMC9962275 DOI: 10.3390/s23042298]
Abstract
Attitude estimation methods provide modern consumer, industrial, and space systems with an estimate of body orientation based on noisy sensor measurements. The gradient descent algorithm is one of the most recent methods for optimal attitude estimation; its iterative nature demands adequate adjustment of the algorithm parameters, which is often overlooked in the literature. Here, we present the effects of the step size, the maximum number of iterations, the initial quaternion, and different propagation methods on the quality of the estimate in noiseless and noisy conditions. A novel figure of merit and a termination criterion that defines the algorithm's accuracy are proposed. Furthermore, guidelines for selecting the optimal set of parameters, so as to achieve the highest accuracy of the estimate using the fewest iterations, are proposed and verified in simulations and experimentally, based on measurements acquired from an in-house developed model of a satellite attitude determination and control system. The proposed attitude estimation method, based on the gradient descent algorithm and a complementary filter, automatically adjusts the number of iterations, keeping the average below 0.5; this reduces the demand on processing power and energy consumption and makes the method suitable for low-power applications.
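As an illustration of the tuning parameters this abstract discusses (step size, iteration cap, termination tolerance, initial quaternion), here is a minimal gradient-descent attitude sketch in Python/NumPy. It is not the authors' algorithm: the cost function, the numerical gradient, and the names (`rotmat`, `gd_attitude`) are illustrative assumptions.

```python
import numpy as np

def rotmat(q):
    """Rotation matrix (body -> reference) for unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([[1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
                     [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
                     [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)]])

def gd_attitude(accel, q0, step=0.1, max_iter=200, tol=1e-8):
    """Gradient-descent attitude estimate from one normalized accelerometer
    reading: minimize ||R(q)^T g - accel||^2 with gravity g = (0, 0, 1).
    Step size, iteration cap, termination tolerance, and initial quaternion
    are the parameters whose tuning the paper studies. A numerical gradient
    keeps the sketch short; a real filter uses the analytic Jacobian."""
    g = np.array([0.0, 0.0, 1.0])
    cost = lambda q: np.sum((rotmat(q).T @ g - accel) ** 2)
    q = q0 / np.linalg.norm(q0)
    eps = 1e-6
    for it in range(max_iter):
        grad = np.empty(4)
        for k in range(4):                      # central-difference gradient
            dq = np.zeros(4); dq[k] = eps
            grad[k] = (cost(q + dq) - cost(q - dq)) / (2 * eps)
        if np.linalg.norm(grad) < tol:          # termination criterion
            break
        q = q - step * grad
        q /= np.linalg.norm(q)                  # stay on the unit sphere
    return q, it + 1
```

Too small a step wastes iterations and too large a step oscillates, which is exactly the trade-off the paper quantifies against its figure of merit.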
3. Li X, Xiao Y, Wang B, Ren H, Zhang Y, Ji J. Automatic targetless LiDAR–camera calibration: a survey. Artif Intell Rev 2022. [DOI: 10.1007/s10462-022-10317-y]
4. Sun W, Liu J, Zhao Y, Zheng G. A Novel Point Set Registration-Based Hand-Eye Calibration Method for Robot-Assisted Surgery. Sensors (Basel) 2022; 22:8446. [PMID: 36366144 PMCID: PMC9656731 DOI: 10.3390/s22218446]
Abstract
Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety when compared with manual implantation. In developing such a system, hand-eye calibration is an essential component that aims to determine the transformation between a position tracking system and a robot-arm system. In this paper, we propose an effective hand-eye calibration method, namely registration-based hand-eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX=XB equation. Our hand-eye calibration method consists of tool-tip pivot calibrations in two coordinate systems, in addition to paired-point matching, where the point pairs are generated via the steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand-eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand-eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.
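The tool-tip pivot calibration step mentioned in this abstract has a compact least-squares form that is easy to sketch. The Python/NumPy fragment below is a generic illustration, not the authors' RHC implementation; the function name and interface are assumptions.

```python
import numpy as np

def pivot_calibration(Rs, ts):
    """Least-squares pivot calibration. Each tracked pose (R_i, t_i) of a
    tool pivoting about a fixed point satisfies R_i p_tip + t_i = p_pivot,
    i.e. [R_i  -I] [p_tip; p_pivot] = -t_i. Stacking all poses gives an
    overdetermined linear system in the six unknowns."""
    A = np.vstack([np.hstack([R, -np.eye(3)]) for R in Rs])
    b = np.hstack([-t for t in ts])
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    return x[:3], x[3:]   # tip in the tool frame, pivot in the tracker frame
```

At least two poses with distinct orientations are required for a unique solution; in practice many poses are taken and averaged through the least-squares solve.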
5. Enebuse I, Ibrahim BKSMK, Foo M, Matharu RS, Ahmed H. Accuracy evaluation of hand-eye calibration techniques for vision-guided robots. PLoS One 2022; 17:e0273261. [PMID: 36260640 PMCID: PMC9581431 DOI: 10.1371/journal.pone.0273261]
Abstract
Hand-eye calibration is an important step in controlling a vision-guided robot in applications such as part assembly, bin picking, and inspection operations. Many methods for estimating hand-eye transformations have been proposed in the literature with varying degrees of complexity and accuracy. However, the success of a vision-guided application is highly impacted by the accuracy of the hand-eye calibration between the vision system and the robot. The level of this accuracy depends on several factors, such as the rotation and translation noise and the rotation and translation motion range, that must be considered during calibration. Previous studies and benchmarking of the proposed algorithms have largely focused on the combined effect of rotation and translation noise. This study provides insight into the impact of rotation and translation noise acting in isolation on hand-eye calibration accuracy. This deviates from the most common method of assessing hand-eye calibration accuracy based on pose noise (combined rotation and translation noise). We also evaluated the impact of the robot motion range used during the hand-eye calibration operation, which is rarely considered. We provide a quantitative evaluation of our study using six commonly used algorithms from an implementation perspective. We comparatively analyse the performance of these algorithms through simulation case studies and experimental validation using a physical Universal Robots UR5e robot. Our results show that these algorithms perform differently as the noise conditions vary rather than following a general trend. For example, the simultaneous methods are more resistant to rotation noise, whereas the separate methods are better at dealing with translation noise. Additionally, while increasing the robot rotation motion span during calibration enhances the accuracy of the separate methods, it has a negative effect on the simultaneous methods. Conversely, increasing the translation motion range improves the accuracy of simultaneous methods but degrades the accuracy of the separate methods. These findings suggest that those conditions should be considered when benchmarking algorithms or performing a calibration process for enhanced accuracy.
Affiliation(s)
- Ikenna Enebuse
- Centre for Future Transport and Cities, Coventry University, Coventry, United Kingdom
- Mathias Foo
- School of Engineering, University of Warwick, Coventry, United Kingdom
- Ranveer S. Matharu
- Centre for Future Transport and Cities, Coventry University, Coventry, United Kingdom
- Hafiz Ahmed
- Nuclear Futures Institute, Bangor University, Bangor, United Kingdom
6. Significance of Camera Pixel Error in the Calibration Process of a Robotic Vision System. Appl Sci (Basel) 2022. [DOI: 10.3390/app12136406]
Abstract
Although robotic vision systems offer a promising technology solution for rapid and reconfigurable in-process 3D inspection of complex and large parts in contemporary manufacturing, measurement accuracy poses a challenge for their wide deployment. One of the key issues in adopting a robotic vision system is understanding the extent of its measurement errors, which are directly correlated with the calibration process. In this paper, possible sources of practical and inherent measurement uncertainty involved in the calibration process of a robotic vision system are discussed. The system considered in this work consists of an image sensor mounted on an industrial robot manipulator with six degrees of freedom. Based on a series of experimental tests and computer simulations, the paper gives a comprehensive performance comparison of different calibration approaches and shows the impact of measurement uncertainties on the calibration process. It has been found from the error sensitivity analysis that minor uncertainties in the calibration process can significantly affect the accuracy of the robotic vision system. Further investigations suggest that errors induced in the image calibration patterns have a more adverse effect on the hand–eye calibration process than angular errors in the robot joints.
7. Wu J, Liu M, Huang Y, Jin C, Wu Y, Yu C. SE(n)++: An Efficient Solution to Multiple Pose Estimation Problems. IEEE Trans Cybern 2022; 52:3829-3840. [PMID: 32877345 DOI: 10.1109/tcyb.2020.3015039]
Abstract
In robotic applications, many pose problems involve solving the homogeneous transformation based on the special Euclidean group SE(n). However, due to the nonconvexity of SE(n), many of these solvers treat rotation and translation separately, and the computational efficiency is still unsatisfactory. A new technique called SE(n)++ is proposed in this article that exploits a novel mapping from SE(n) to SO(n+1). The mapping transforms the coupling between rotation and translation into a unified formulation on the Lie group and gives better analytical results and computational performances. Specifically, three major pose problems are considered in this article, that is, the point-cloud registration, the hand-eye calibration, and the SE(n) synchronization. Experimental validations have confirmed the effectiveness of the proposed SE(n)++ method in open datasets.
8. Sarabandi S, Porta JM, Thomas F. Hand-Eye Calibration Made Easy Through a Closed-Form Two-Stage Method. IEEE Robot Autom Lett 2022. [DOI: 10.1109/lra.2022.3146943]
9.
Abstract
A classic hand-eye system involves hand-eye calibration and robot-world and hand-eye calibration. Insofar as hand-eye calibration can solve only the hand-eye transformation, this study aims to determine the robot-world and hand-eye transformations simultaneously based on the robot-world and hand-eye equation. According to whether the rotation part and the translation part of the equation are decoupled, the methods can be divided into separable solutions and simultaneous solutions. The separable solutions solve the rotation part before solving the translation part, so the estimated errors of the rotation are transferred to the translation. In this study, a method was proposed for calculation with rotation and translation coupled, involving a closed-form solution based on the Kronecker product and an iterative solution based on the Gauss–Newton algorithm. The feasibility was further tested using simulated and real data, and the superiority was verified by comparison with the results obtained by available methods. Finally, we improved a method that can solve the singularity problem caused by the parameterization of the rotation matrix, which can be widely used in robot-world and hand-eye calibration. The results show that the prediction errors of rotation and translation based on the proposed method can be reduced to 0.26° and 1.67 mm, respectively.
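The Kronecker-product closed form mentioned in this abstract rests on the vectorization identity vec(ACB) = (Bᵀ ⊗ A) vec(C). The following Python/NumPy sketch illustrates that family of solutions for AX = YB under the stated identity; the function names and the SO(3) projection step are assumptions, not the authors' exact algorithm.

```python
import numpy as np

def solve_ax_yb(As, Bs):
    """Kronecker-product closed form for robot-world/hand-eye calibration
    AX = YB (4x4 homogeneous transforms). Rotations: with column-stacked vec,
    vec(R_A R_X) = (I (x) R_A) vec(R_X) and vec(R_Y R_B) = (R_B^T (x) I) vec(R_Y),
    so [I (x) R_A, -(R_B^T (x) I)] [vec R_X; vec R_Y] = 0: nullspace via SVD.
    Translations then satisfy R_A t_X - t_Y = R_Y t_B - t_A."""
    K = []
    for A, B in zip(As, Bs):
        RA, RB = A[:3, :3], B[:3, :3]
        K.append(np.hstack([np.kron(np.eye(3), RA),
                            -np.kron(RB.T, np.eye(3))]))
    v = np.linalg.svd(np.vstack(K))[2][-1]      # 18-dim nullspace vector
    RX = v[:9].reshape(3, 3, order='F')         # undo column-stacking
    RY = v[9:].reshape(3, 3, order='F')
    if np.linalg.det(RX) < 0:                   # fix the overall sign
        RX, RY = -RX, -RY

    def to_so3(M):                              # nearest rotation matrix
        U, _, Vt = np.linalg.svd(M)
        R = U @ Vt
        if np.linalg.det(R) < 0:
            R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
        return R

    RX, RY = to_so3(RX), to_so3(RY)
    C = np.vstack([np.hstack([A[:3, :3], -np.eye(3)]) for A in As])
    d = np.hstack([RY @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t = np.linalg.lstsq(C, d, rcond=None)[0]    # [t_X; t_Y]
    X = np.eye(4); X[:3, :3] = RX; X[:3, 3] = t[:3]
    Y = np.eye(4); Y[:3, :3] = RY; Y[:3, 3] = t[3:]
    return X, Y
```

Because rotation and translation are solved in sequence here, rotation error leaks into the translation estimate, which is precisely the coupling issue the abstract's Gauss–Newton refinement addresses.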
10. Zhang Y, Qiu Z, Zhang X. Calibration method for hand-eye system with rotation and translation couplings. Appl Opt 2019; 58:5375-5387. [PMID: 31504005 DOI: 10.1364/ao.58.005375]
Abstract
This paper develops a novel hand-eye calibration method for hand-eye systems with rotation and translation coupling terms. First, a nonlinear camera model with distortion terms and a model of a hand-eye system with rotation and translation coupling terms are established. Based on a non-linear optimization method and a reverse projection method, a decoupling calibration method for a lower-degree-of-freedom hand-eye system is proposed. Then the path planning for the calibration process is carried out. Based on the analysis of coupling constraints and hand-eye system motion constraints, three types of hand-eye calibration paths with high efficiency and easy operation are developed. In addition, the influence of key parameters on hand-eye calibration accuracy is analyzed. Finally, calibration experiments and parametric influence experiments are carried out. The results demonstrate that the proposed method is effective and practical for calibrating the hand-eye system.
11. Koide K, Menegatti E. General Hand–Eye Calibration Based on Reprojection Error Minimization. IEEE Robot Autom Lett 2019. [DOI: 10.1109/lra.2019.2893612]
12. Stereo Camera Head-Eye Calibration Based on Minimum Variance Approach Using Surface Normal Vectors. Sensors (Basel) 2018; 18:3706. [PMID: 30384481 PMCID: PMC6263920 DOI: 10.3390/s18113706]
Abstract
This paper presents a stereo camera-based head-eye calibration method that aims to find the globally optimal transformation between a robot’s head and its eye. This method is highly intuitive and simple, so it can be used in a vision system for humanoid robots without any complex procedures. To achieve this, we introduce an extended minimum variance approach for head-eye calibration using surface normal vectors instead of 3D point sets. The presented method considers both positional and orientational error variances between visual measurements and kinematic data in head-eye calibration. Experiments using both synthetic and real data show the accuracy and efficiency of the proposed method.
13. Hunt CL, Sharma A, Osborn LE, Kaliki RR, Thakor NV. Predictive trajectory estimation during rehabilitative tasks in augmented reality using inertial sensors. Proc IEEE Biomed Circuits Syst Conf (BioCAS) 2018. [PMID: 38501114 PMCID: PMC10947724 DOI: 10.1109/biocas.2018.8584805]
Abstract
This paper presents a wireless kinematic tracking framework used for biomechanical analysis during rehabilitative tasks in augmented and virtual reality. The framework uses low-cost inertial measurement units and exploits the rigid connections of the human skeletal system to provide egocentric position estimates of joints to centimeter accuracy. On-board sensor fusion combines information from three-axis accelerometers, gyroscopes, and magnetometers to provide robust estimates in real-time. Sensor precision and accuracy were validated using the root mean square error of estimated joint angles against ground truth goniometer measurements. The sensor network produced a mean estimate accuracy of 2.81° with 1.06° precision, resulting in a maximum hand tracking error of 7.06 cm. As an application, the network is used to collect kinematic information from an unconstrained object manipulation task in augmented reality, from which dynamic movement primitives are extracted to characterize natural task completion in N = 3 able-bodied human subjects. These primitives are then leveraged for trajectory estimation in both a generalized and a subject-specific scheme resulting in 0.187 cm and 0.161 cm regression accuracy, respectively. Our proposed kinematic tracking network is wireless, accurate, and especially useful for predicting voluntary actuation in virtual and augmented reality applications.
Affiliation(s)
- Christopher L Hunt
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218 USA
- Avinash Sharma
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218 USA
- Luke E Osborn
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218 USA
- Rahul R Kaliki
- Infinite Biomedical Technologies, LLC, Baltimore, MD 21218 USA
- Nitish V Thakor
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, MD 21218 USA
- Singapore Institute for Neurotechnology, National University of Singapore, 119077 Singapore
14. Pachtrachai K, Vasconcelos F, Chadebecq F, Allan M, Hailes S, Pawar V, Stoyanov D. Adjoint Transformation Algorithm for Hand-Eye Calibration with Applications in Robotic Assisted Surgery. Ann Biomed Eng 2018; 46:1606-1620. [PMID: 30051249 PMCID: PMC6154014 DOI: 10.1007/s10439-018-2097-4]
Abstract
Hand–eye calibration aims at determining the unknown rigid transformation between the coordinate systems of a robot arm and a camera. Existing hand–eye algorithms using closed-form solutions followed by iterative non-linear refinement provide accurate calibration results within a broad range of robotic applications. However, in the context of surgical robotics hand–eye calibration is still a challenging problem due to the required accuracy within the millimetre range, coupled with a large displacement between endoscopic cameras and the robot end-effector. This paper presents a new method for hand–eye calibration based on the adjoint transformation of twist motions that solves the problem iteratively through alternating estimations of rotation and translation. We show that this approach converges to a solution with a higher accuracy than closed form initializations within a broad range of synthetic and real experiments. We also propose a stereo hand–eye formulation that can be used in the context of both our proposed method and previous state-of-the-art closed form solutions. Experiments with real data are conducted with a stereo laparoscope, the KUKA robot arm manipulator, and the da Vinci surgical robot, showing that both our new alternating solution and the explicit representation of stereo camera hand–eye relations contribute to a higher calibration accuracy.
Affiliation(s)
- Krittin Pachtrachai
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK.
- Francisco Vasconcelos
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
- François Chadebecq
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
- Max Allan
- Intuitive Surgical, Sunnyvale, CA, USA
- Stephen Hailes
- Department of Computer Science, University College London, London, UK
- Vijay Pawar
- Department of Computer Science, University College London, London, UK
- Danail Stoyanov
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
15. Pachtrachai K, Vasconcelos F, Dwyer G, Pawar V, Hailes S, Stoyanov D. CHESS—Calibrating the Hand-Eye Matrix With Screw Constraints and Synchronization. IEEE Robot Autom Lett 2018. [DOI: 10.1109/lra.2018.2800088]
16. Wang Z, Liu Z, Ma Q, Cheng A, Liu YH, Kim S, Deguet A, Reiter A, Kazanzides P, Taylor RH. Vision-Based Calibration of Dual RCM-Based Robot Arms in Human-Robot Collaborative Minimally Invasive Surgery. IEEE Robot Autom Lett 2018. [DOI: 10.1109/lra.2017.2737485]
17. Shah M, Bostelman R, Legowik S, Hong T. Calibration of mobile manipulators using 2D positional features. Measurement 2018; 124. [PMID: 30996508 PMCID: PMC6463307 DOI: 10.1016/j.measurement.2018.04.024]
Abstract
Robotic manipulators are increasingly being attached to Automatic Ground Vehicles (AGVs) to aid the efficiency of assembly in manufacturing systems. However, calibrating these mobile manipulators is difficult, as the offset between the robotic manipulator and the AGV is often unknown. This paper provides a novel, simple, and low-cost method for calibrating and measuring the performance of mobile manipulators by using data collected from a laser retroreflector that digitally detects the horizontal two-dimensional (2D) position of reflectors on an artifact, as well as a navigation system that provides the heading angle and 2D position of the AGV. The method is presented mathematically by providing a closed-form solution to the positional component of the 2D robot-world/hand-eye calibration problem AX = YB. The method is then applied to simulated data as well as data collected in a laboratory setting, and compared to other methods.
Affiliation(s)
- Mili Shah
- Department of Mathematics and Statistics, Loyola University Maryland, 4501 North Charles Street, Baltimore, MD 21210, United States
- Intelligent Systems Division, National Institute of Standards and Technology, 100 Bureau Drive, Gaithersburg, MD 20899, United States
- Roger Bostelman
- Intelligent Systems Division, National Institute of Standards and Technology, 100 Bureau Drive, Gaithersburg, MD 20899, United States
- Le2i, Universite de Bourgogne, BP 47870, 21078 Dijon, France
- Steven Legowik
- Robotic Research, LLC, 555 Quince Orchard Road, Gaithersburg, MD 20878, United States
- Tsai Hong
- Intelligent Systems Division, National Institute of Standards and Technology, 100 Bureau Drive, Gaithersburg, MD 20899, United States
18. A computationally efficient method for hand-eye calibration. Int J Comput Assist Radiol Surg 2017; 12:1775-1787. [PMID: 28726116 PMCID: PMC5608875 DOI: 10.1007/s11548-017-1646-x]
Abstract
Purpose: Surgical robots with cooperative control and semiautonomous features have shown increasing clinical potential, particularly for repetitive tasks under imaging and vision guidance. Effective performance of an autonomous task requires accurate hand–eye calibration so that the transformation between the robot coordinate frame and the camera coordinates is well defined. In practice, due to changes in surgical instruments, online hand–eye calibration must be performed regularly. In order to ensure seamless execution of the surgical procedure without affecting the normal surgical workflow, it is important to derive fast and efficient hand–eye calibration methods. Methods: We present a computationally efficient iterative method for hand–eye calibration. In this method, the dual quaternion is introduced to represent the rigid transformation, and a two-step iterative method is proposed to recover the real and dual parts of the dual quaternion simultaneously, and thus the rotation and translation of the transformation. Results: The proposed method was applied to determine the rigid transformation between the stereo laparoscope and the robot manipulator. Promising experimental and simulation results show the number of iterations required for convergence dropping from more than 30, for a standard optimization method, to 3, illustrating the effectiveness and efficiency of the proposed method.
19. Registration of a hybrid robot using the Degradation-Kronecker method and a purely nonlinear method. Robotica 2016. [DOI: 10.1017/s0263574715000338]
Abstract
Although the registration of a robot is crucial in order to identify its pose with respect to a tracking system, there is no reported solution to address this issue for a hybrid robot. Different from classical registration, the registration of a hybrid robot requires solving an equation with three unknowns, two of which are coupled together. This property makes it difficult to obtain a closed-form solution. This paper is a first attempt to solve the registration of a hybrid robot. The Degradation-Kronecker (D-K) method is proposed as an optimal closed-form solution for the registration of a hybrid robot in this paper. Since closed-form methods generally suffer from limited accuracy, a purely nonlinear (PN) method is proposed to complement the D-K method. With simulation and experiment results, it has been found that both methods are robust. The PN method is more accurate but slower as compared to the D-K method. The fast computation property of the D-K method makes it appropriate to be applied in real-time circumstances, while the PN method is suitable to be applied where good accuracy is preferred.
20. Ulrich M, Steger C. Hand-eye calibration of SCARA robots using dual quaternions. Pattern Recognit Image Anal 2016. [DOI: 10.1134/s1054661816010272]
21.
Abstract
Whenever a sensor is mounted on a robot hand, it is important to know the relationship between the sensor and the hand. The problem of determining this relationship is referred to as the hand-eye calibration problem. Hand-eye calibration is important in at least two types of tasks: (1) map sensor-centered measurements into the robot workspace frame and (2) tasks allowing the robot to precisely move the sensor. In the past some solutions were proposed, particularly in the case of the sensor being a television camera. With almost no exception, all existing solutions attempt to solve a homogeneous matrix equation of the form AX = XB. This article has the following main contributions. First we show that there are two possible formulations of the hand-eye calibration problem. One formulation is the classic one just mentioned. A second formulation takes the form of the following homogeneous matrix equation: MY = M'YB. The advantage of the latter formulation is that the extrinsic and intrinsic parameters of the camera need not be made explicit. Indeed, this formulation directly uses the 3×4 perspective matrices (M and M') associated with two positions of the camera with respect to the calibration frame. Moreover, this formulation together with the classic one covers a wider range of camera-based sensors to be calibrated with respect to the robot hand: single scan-line cameras, stereo heads, range finders, etc. Second, we develop a common mathematical framework to solve for the hand-eye calibration problem using either of the two formulations. We represent rotation by a unit quaternion and present two methods: (1) a closed-form solution for solving for rotation using unit quaternions and then solving for translation and (2) a nonlinear technique for simultaneously solving for rotation and translation. Third, we perform a stability analysis both for our two methods and for the linear method developed by Tsai and Lenz (1989). This analysis allows the comparison of the three methods. In light of this comparison, the nonlinear optimization method, which solves for rotation and translation simultaneously, seems to be the most robust one with respect to noise and measurement errors.
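The closed-form quaternion route described in this abstract (solve rotation first, then translation) reduces R_A R_X = R_X R_B to a linear system, since the rotation part is equivalent to q_A ⊗ q_X = q_X ⊗ q_B. The Python/NumPy sketch below follows that classic recipe; function names are illustrative, and it is not claimed to be the authors' exact implementation.

```python
import numpy as np

def quat_from_R(R):
    """Unit quaternion (w, x, y, z) from a rotation matrix, with w >= 0 so
    that conjugate rotation pairs get sign-consistent quaternions."""
    tr = np.trace(R)
    if tr > 0:
        s = 2.0 * np.sqrt(tr + 1.0)
        q = np.array([0.25 * s, (R[2, 1] - R[1, 2]) / s,
                      (R[0, 2] - R[2, 0]) / s, (R[1, 0] - R[0, 1]) / s])
    else:
        i = int(np.argmax(np.diag(R)))
        j, k = (i + 1) % 3, (i + 2) % 3
        s = 2.0 * np.sqrt(R[i, i] - R[j, j] - R[k, k] + 1.0)
        q = np.empty(4)
        q[0] = (R[k, j] - R[j, k]) / s
        q[1 + i] = 0.25 * s
        q[1 + j] = (R[j, i] + R[i, j]) / s
        q[1 + k] = (R[k, i] + R[i, k]) / s
    return q if q[0] >= 0 else -q

def quat_L(q):  # quat_L(p) @ q == p (*) q  (Hamilton product)
    w, x, y, z = q
    return np.array([[w, -x, -y, -z], [x, w, -z, y],
                     [y, z, w, -x], [z, -y, x, w]])

def quat_R(q):  # quat_R(p) @ q == q (*) p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z], [x, w, z, -y],
                     [y, -z, w, x], [z, y, -x, w]])

def R_from_quat(q):
    w, x, y, z = q
    return np.array([[1-2*(y*y+z*z), 2*(x*y-w*z), 2*(x*z+w*y)],
                     [2*(x*y+w*z), 1-2*(x*x+z*z), 2*(y*z-w*x)],
                     [2*(x*z-w*y), 2*(y*z+w*x), 1-2*(x*x+y*y)]])

def hand_eye_quat(As, Bs):
    """Closed-form AX = XB: the rotation satisfies qA (*) qX = qX (*) qB,
    i.e. (quat_L(qA) - quat_R(qB)) qX = 0, solved by SVD over all motion
    pairs; the translation then follows from (R_A - I) t_X = R_X t_B - t_A."""
    M = np.vstack([quat_L(quat_from_R(A[:3, :3])) -
                   quat_R(quat_from_R(B[:3, :3])) for A, B in zip(As, Bs)])
    qX = np.linalg.svd(M)[2][-1]
    RX = R_from_quat(qX / np.linalg.norm(qX))
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.hstack([RX @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tX = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4); X[:3, :3] = RX; X[:3, 3] = tX
    return X
```

At least two robot motions with non-parallel rotation axes are needed for a unique solution; the sequential structure is also why rotation noise propagates into the translation, the weakness the abstract's simultaneous nonlinear method avoids.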
Affiliation(s)
- Radu Horaud
- LIFIA-IMAG and Inria Rhône-Alpes 46, avenue Félix Viallet 38031 Grenoble, France
- Fadi Dornaika
- LIFIA-IMAG and Inria Rhône-Alpes 46, avenue Félix Viallet 38031 Grenoble, France
Collapse
|
22
|
Abstract
To relate measurements made by a sensor mounted on a mechanical link to the robot’s coordinate frame, we must first estimate the transformation between these two frames. Many algorithms have been proposed for this so-called hand-eye calibration, but they do not treat the relative position and orientation in a unified way. In this paper, we introduce the use of dual quaternions, which are the algebraic counterpart of screws. Then we show how a line transformation can be written with the dual-quaternion product. We algebraically prove that if we consider the camera and motor transformations as screws, then only the line coefficients of the screw axes are relevant regarding the hand-eye calibration. The dual-quaternion parameterization facilitates a new simultaneous solution for the hand-eye rotation and translation using the singular value decomposition. Real-world performance is assessed directly in the application of hand-eye information for stereo reconstruction, as well as in the positioning of cameras. Both real and synthetic experiments show the superiority of the approach over two other proposed methods.
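The dual-quaternion algebra this abstract builds on is compact enough to sketch directly. In the sketch below (helper names are ours), a rigid motion with unit rotation quaternion q and translation t is encoded as q̂ = q + ε(½ t ⊗ q); the dual-quaternion product then composes motions exactly as 4×4 matrix multiplication would:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product, scalar-first (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def to_dq(q, t):
    """Rigid motion (unit quaternion q, translation t) -> (real, dual) pair,
    with dual part 0.5 * (0, t) ⊗ q."""
    return q, 0.5 * qmul(np.concatenate([[0.0], t]), q)

def dq_mul(a, b):
    """Dual-quaternion product (a0 + eps*a1)(b0 + eps*b1): composes motions."""
    return qmul(a[0], b[0]), qmul(a[0], b[1]) + qmul(a[1], b[0])

def dq_translation(dq):
    """Recover the translation: vector part of 2 * dual ⊗ conj(real)."""
    r, d = dq
    conj = r * np.array([1.0, -1.0, -1.0, -1.0])
    return 2.0 * qmul(d, conj)[1:]
```

Composing (q1, t1) with (q2, t2) this way yields rotation q1 ⊗ q2 and translation R1 t2 + t1, matching the matrix composition, and the unit constraint dot(real, dual) = 0 is preserved.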
|
23
|
Abstract
In this paper, we propose a new flexible method for hand-eye calibration. The vast majority of existing hand-eye calibration techniques require a calibration rig that is used in conjunction with camera pose estimation methods. Instead, we combine structure-from-motion with known robot motions, and we show that the solution can be obtained in linear form. The latter solves for both the hand-eye parameters and the unknown scale factor inherent to structure-from-motion methods. The algebraic analysis made possible by such a linear formulation allows investigation not only of the well-known case of general screw motions but also of such singular motions as pure translations, pure rotations, and planar motions. In essence, the robot-mounted camera looks at an unknown rigid layout, tracks points over an image sequence, and estimates the camera-to-robot relationship. Such a self-calibration process is relevant for unmanned vehicles, robots working in remote places, and so forth. We conduct a large number of experiments that validate the quality of the method by comparing it with existing ones.
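A linear formulation of this kind can be sketched as follows (a hypothetical implementation in the spirit of the abstract, not the authors' code): the rotation constraint R_A R_X = R_X R_B is vectorized with Kronecker products and solved as a null-space problem, and the translation equation is augmented with the unknown structure-from-motion scale λ as a fourth unknown:

```python
import numpy as np

def hand_eye_linear(As, Bs):
    """Linear hand-eye calibration with unknown scale.
    As: robot motions (4x4 homogeneous). Bs: camera motions whose
    translations are known only up to a common scale factor lam.
    Returns (R_X, t_X, lam)."""
    # Rotation: vec(R_A R_X) = vec(R_X R_B) with column-major vec gives
    # (I ⊗ R_A - R_B^T ⊗ I) vec(R_X) = 0; stack and take the null vector.
    K = np.vstack([np.kron(np.eye(3), A[:3, :3]) - np.kron(B[:3, :3].T, np.eye(3))
                   for A, B in zip(As, Bs)])
    v = np.linalg.svd(K)[2][-1]
    M = v.reshape(3, 3, order='F')
    # The null vector is c * R_X with det(M) = c^3; normalize by the cube root.
    RX = M / np.cbrt(np.linalg.det(M))
    # Translation + scale: (R_A - I) t_X - lam * (R_X t_B) = -t_A, stacked.
    C = np.vstack([np.hstack([A[:3, :3] - np.eye(3), -(RX @ B[:3, 3]).reshape(3, 1)])
                   for A, B in zip(As, Bs)])
    d = np.concatenate([-A[:3, 3] for A in As])
    sol = np.linalg.lstsq(C, d, rcond=None)[0]
    return RX, sol[:3], sol[3]
```

On noise-free synthetic data with at least two motions about distinct axes, the rotation null space is one-dimensional and both t_X and the scale λ are recovered exactly.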
Affiliation(s)
- Nicolas Andreff
- Institut Français de Mécanique Avancée, BP 265, 63175 Aubière Cedex, France
- Radu Horaud
- INRIA Rhône-Alpes and GRAVIR-IMAG, 655, av. de l’Europe, 38330 Montbonnot Saint Martin, France
- Bernard Espiau
- INRIA Rhône-Alpes and GRAVIR-IMAG, 655, av. de l’Europe, 38330 Montbonnot Saint Martin, France
|
24
|
Thompson S, Stoyanov D, Schneider C, Gurusamy K, Ourselin S, Davidson B, Hawkes D, Clarkson MJ. Hand-eye calibration for rigid laparoscopes using an invariant point. Int J Comput Assist Radiol Surg 2016; 11:1071-80. [PMID: 26995597 PMCID: PMC4893361 DOI: 10.1007/s11548-016-1364-9] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/12/2016] [Accepted: 02/24/2016] [Indexed: 01/22/2023]
Abstract
PURPOSE Laparoscopic liver resection has significant advantages over open surgery due to less patient trauma and faster recovery times, yet it can be difficult due to the restricted field of view and lack of haptic feedback. Image guidance provides a potential solution but one current challenge is in accurate "hand-eye" calibration, which determines the position and orientation of the laparoscope camera relative to the tracking markers. METHODS In this paper, we propose a simple and clinically feasible calibration method based on a single invariant point. The method requires no additional hardware, can be constructed by theatre staff during surgical setup, requires minimal image processing and can be visualised in real time. Real-time visualisation allows the surgical team to assess the calibration accuracy before use in surgery. In addition, in the laboratory, we have developed a laparoscope with an electromagnetic tracking sensor attached to the camera end and an optical tracking marker attached to the distal end. This enables a comparison of tracking performance. RESULTS We have evaluated our method in the laboratory and compared it to two widely used methods, "Tsai's method" and "direct" calibration. The new method is of comparable accuracy to existing methods, and we show RMS projected error due to calibration of 1.95 mm for optical tracking and 0.85 mm for EM tracking, versus 4.13 and 1.00 mm respectively, using existing methods. The new method has also been shown to be workable under sterile conditions in the operating room. CONCLUSION We have proposed a new method of hand-eye calibration, based on a single invariant point. Initial experience has shown that the method provides visual feedback, satisfactory accuracy and can be performed during surgery. We also show that an EM sensor placed near the camera would provide significantly improved image overlay accuracy.
Affiliation(s)
- Stephen Thompson
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Danail Stoyanov
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Crispin Schneider
- Division of Surgery, Hampstead Campus, UCL Medical School, Royal Free Hospital, 9th Floor, Rowland Hill Street, London, UK
- Kurinchi Gurusamy
- Division of Surgery, Hampstead Campus, UCL Medical School, Royal Free Hospital, 9th Floor, Rowland Hill Street, London, UK
- Sébastien Ourselin
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Brian Davidson
- Division of Surgery, Hampstead Campus, UCL Medical School, Royal Free Hospital, 9th Floor, Rowland Hill Street, London, UK
- David Hawkes
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Matthew J Clarkson
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
|
25
|
Heller J, Havlena M, Pajdla T. Globally Optimal Hand-Eye Calibration Using Branch-and-Bound. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 2016; 38:1027-1033. [PMID: 26353364 DOI: 10.1109/tpami.2015.2469299] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
This paper introduces a novel solution to the hand-eye calibration problem. It uses camera measurements directly and, at the same time, requires neither prior knowledge of the external camera calibrations nor a known calibration target. Our algorithm uses a branch-and-bound approach to minimize an objective function based on the epipolar constraint, and it employs linear programming to decide the bounding step of the algorithm. Our technique is able to recover both the unknown rotation and translation simultaneously, and the solution is guaranteed to be globally optimal with respect to the L∞-norm.
|
26
|
Vemuri AS, Nicolau S, Sportes A, Marescaux J, Soler L, Ayache N. Interoperative Biopsy Site Relocalization in Endoluminal Surgery. IEEE Trans Biomed Eng 2015; 63:1862-1873. [PMID: 26625405 DOI: 10.1109/tbme.2015.2503981] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
Barrett's oesophagus, a premalignant condition of the oesophagus, has been on the rise in recent years. The standard diagnostic protocol for Barrett's involves obtaining biopsies at suspicious regions along the oesophagus. The localization and tracking of these biopsy sites "interoperatively" poses a significant challenge for providing targeted treatments and tracking disease progression. This paper proposes an approach to provide guided navigation and relocalization of the biopsy sites using an electromagnetic tracking system. The characteristic of our approach over existing ones is the integration of an electromagnetic sensor at the flexible endoscope tip, so that the endoscopic camera depth inside the oesophagus can be computed in real time, allowing an image from a previous exploration at the same depth to be retrieved and displayed. We first describe our system setup and methodology for interoperative registration. We then propose three incremental experiments of our approach: the first on synthetic data with a realistic noise model, to analyze the error bounds of our system; the second on in vivo pig data, using an optical tracking system to provide a pseudo ground truth. The accuracy results obtained were consistent with the synthetic experiments despite uncertainty introduced by breathing motion, and remain inside an acceptable error margin according to medical experts. Finally, a third experiment, designed using data from pigs to simulate a real task of biopsy site relocalization, was evaluated by ten gastrointestinal experts. It clearly demonstrated the benefit of our system toward assisted guidance by improving the biopsy site retrieval rate from 47.5% to 94%.
|
27
|
Malti A, Barreto JP. Hand-eye and radial distortion calibration for rigid endoscopes. Int J Med Robot 2013; 9:441-54. [PMID: 23303645 DOI: 10.1002/rcs.1478] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/26/2012] [Indexed: 11/05/2022]
Abstract
BACKGROUND In this paper, we propose a non-linear calibration method for a hand-eye system equipped with a camera undergoing radial distortion, such as a rigid endoscope. Whereas classic methods propose either a separate estimation of the camera intrinsics and the hand-eye transform, or a mixed non-linear estimation of both hand-eye and camera intrinsics assuming a pin-hole model, the proposed approach enables a simultaneous refinement of the hand-eye and camera parameters, including the distortion factor, with only three frames of the calibration pattern. METHODS Our approach relies on three steps: (i) linear initial estimates of hand-eye and radial distortion with a minimum number of frames: a single image to estimate the radial distortion and three frames to estimate the initial hand-eye transform; (ii) expressing the camera extrinsics with respect to the hand-eye and world-grid transforms; and (iii) bundle adjustment on the reprojection error with respect to the distortion parameters, the camera intrinsics and the hand-eye transform. RESULTS Our method is quantitatively compared with state-of-the-art linear and non-linear methods. We show that our method provides a 3D reconstruction error of approximately 5% of the size of the 3D shape. CONCLUSIONS Our experimental results show the effectiveness of simultaneously estimating hand-eye and distortion parameters for 3D reconstruction.
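As a toy illustration of the kind of distortion model being refined, consider a one-parameter polynomial radial distortion x_d = x_u(1 + k‖x_u‖²) in normalized image coordinates (a generic model; the paper's exact parameterization may differ). Applying the distortion is trivial, and inverting it by fixed-point iteration converges quickly for realistic k:

```python
import numpy as np

def distort(x, k):
    """Apply one-parameter radial distortion to a normalized image point:
    x_d = x_u * (1 + k * |x_u|^2)."""
    return x * (1.0 + k * np.dot(x, x))

def undistort(xd, k, iters=20):
    """Invert the model by fixed-point iteration: x_u = x_d / (1 + k * |x_u|^2),
    starting from x_u = x_d. Converges for moderate distortion."""
    xu = xd.copy()
    for _ in range(iters):
        xu = xd / (1.0 + k * np.dot(xu, xu))
    return xu
```

In a bundle adjustment like the one described, k would simply be one more unknown in the reprojection-error minimization alongside the intrinsics and the hand-eye transform.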
Affiliation(s)
- Abed Malti
- ALCoV-ISIT, UMR 6284 CNRS/Université d'Auvergne, Clermont-Ferrand, France
|
28
|
Ernst F, Richter L, Matthäus L, Martens V, Bruder R, Schlaefer A, Schweikard A. Non-orthogonal tool/flange and robot/world calibration. Int J Med Robot 2012; 8:407-20. [DOI: 10.1002/rcs.1427] [Citation(s) in RCA: 44] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 01/30/2012] [Indexed: 11/07/2022]
Affiliation(s)
- Lars Matthäus
- Eemagine Medical Imaging Solutions GmbH, 10243 Berlin, Germany
- Volker Martens
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
- Ralf Bruder
- Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
|
29
|
Abstract
SUMMARY When computer vision techniques are used in robotics, robotic hand–eye calibration is a very important research task. Many algorithms have been proposed for hand–eye calibration. Building on these algorithms, we introduce a new hand–eye calibration algorithm in this paper, which employs screw motion theory to establish a hand–eye matrix equation using quaternions and obtains a simultaneous result for rotation and translation by solving linear equations. The algorithm proposed in this paper has high accuracy and stable computational efficiency and can be understood easily. Both simulations and real experiments show the superiority of our algorithm over the comparative algorithms.
|
30
|
Comparing calibration approaches for 3D ultrasound probes. Int J Comput Assist Radiol Surg 2008; 4:203-13. [DOI: 10.1007/s11548-008-0258-x] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2008] [Accepted: 09/14/2008] [Indexed: 10/21/2022]
|
31
|
Mirzaei F, Roumeliotis S. A Kalman Filter-Based Algorithm for IMU-Camera Calibration: Observability Analysis and Performance Evaluation. IEEE T ROBOT 2008. [DOI: 10.1109/tro.2008.2004486] [Citation(s) in RCA: 302] [Impact Index Per Article: 18.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
32
|
Abstract
This paper presents new vector quantization based methods for selecting well-suited data for hand-eye calibration from a given sequence of hand and eye movements. Data selection can improve the accuracy of classic hand-eye calibration, and make it possible in the first place in situations where the standard approach of manually selecting positions is inconvenient or even impossible, especially when using continuously recorded data. A variety of methods is proposed, which differ from each other in the dimensionality of the vector quantization compared to the degrees of freedom of the rotation representation, and how the rotation angle is incorporated. The performance of the proposed vector quantization based data selection methods is evaluated using data obtained from a manually moved optical tracking system (hand) and an endoscopic camera (eye).
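One such selection scheme can be sketched with a small amount of numpy (our own construction, not the authors' exact method): represent each relative rotation as a unit quaternion, vector-quantize the quaternions with k-means, and keep the motion nearest each code vector, so the retained motions are well spread over the recorded rotations:

```python
import numpy as np

def select_motions(quats, k, iters=30):
    """Vector-quantize unit quaternions (scalar-first, shape (N, 4)) with
    k-means and return indices of the k motions nearest the code vectors.
    Antipodal quaternions q and -q encode the same rotation, so signs are
    fixed first by forcing a non-negative scalar part."""
    q = np.where(quats[:, :1] < 0.0, -quats, quats)
    # Farthest-point initialization: deterministic and well spread.
    codes = q[[0]]
    while len(codes) < k:
        d = np.linalg.norm(q[:, None, :] - codes[None], axis=-1).min(axis=1)
        codes = np.vstack([codes, q[d.argmax()]])
    # Lloyd iterations with centroids renormalized to unit length.
    for _ in range(iters):
        assign = np.linalg.norm(q[:, None, :] - codes[None], axis=-1).argmin(axis=1)
        for j in range(k):
            members = q[assign == j]
            if len(members):
                c = members.mean(axis=0)
                codes[j] = c / np.linalg.norm(c)
    d = np.linalg.norm(q[:, None, :] - codes[None], axis=-1)
    return np.unique(d.argmin(axis=0))
```

The paper's variants differ in the dimensionality of the quantized representation and in how the rotation angle enters; the sketch above quantizes the full 4-D quaternion.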
Affiliation(s)
- Jochen Schmidt
- Centre for Artificial Intelligence Research, Auckland University of Technology, Auckland, New Zealand
- Heinrich Niemann
- Lehrstuhl für Mustererkennung, Universität Erlangen-Nürnberg, 91058 Erlangen, Germany
|
33
|
Fassi I, Legnani G. Hand to sensor calibration: A geometrical interpretation of the matrix equation AX = XB. J Robot Syst 2005. [DOI: 10.1002/rob.20082] [Citation(s) in RCA: 54] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
|
34
|
Schmidt J, Vogt F, Niemann H. Calibration-Free Hand-Eye Calibration: A Structure-from-Motion Approach. Lecture Notes in Computer Science, Springer, 2005. [DOI: 10.1007/11550518_9] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 03/03/2023]
|
35
|
Mitschke M, Navab N. Recovering the X-ray projection geometry for three-dimensional tomographic reconstruction with additional sensors: attached camera versus external navigation system. Med Image Anal 2003; 7:65-78. [PMID: 12467722 DOI: 10.1016/s1361-8415(02)00091-9] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
Abstract
Three-dimensional tomographic reconstruction using intra-operative mobile C-arms could provide physicians with new and exciting tools for image-guided surgery. Recovery of the projection geometry of mobile X-ray systems is a crucial step for such reconstruction procedures. Recent work on medical imaging describes the use of optical or electro-magnetic sensor systems in order to navigate surgical instruments. These systems can also be used for the estimation of C-arm motion, and therefore for the recovery of the projection geometry of the X-ray C-arm. In this case, the mathematical problem that needs to be solved is equivalent to the hand-eye calibration well studied by both the computer vision and robotics community. We first study the recovery of the motion and projection geometry using five different hand-eye calibration methods proposed in the literature. The optical navigation system POLARIS from Northern Digital Inc. was used in our experiments. The results of the estimated motion and projection geometry using the five hand-eye calibration methods are compared with the same results obtained using an off-the-shelf CCD camera attached to the mobile C-arm. The experimental results include three-dimensional tomographic reconstruction results using our mobile C-arm. We show that even though the motion of the C-arm is more precisely recovered using the navigation system, the projection geometry is better estimated using the attached CCD camera.
Affiliation(s)
- M Mitschke
- Siemens AG Medical Solutions, Henkestrasse 127, 91052 Erlangen, Germany
|
36
|
|
37
|
Xibilia MG, Muscato G, Fortuna L, Arena P. Multilayer Perceptrons to Approximate Quaternion Valued Functions. Neural Netw 1997; 10:335-342. [PMID: 12662531 DOI: 10.1016/s0893-6080(96)00048-2] [Citation(s) in RCA: 101] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
In this paper a new type of multilayer feedforward neural network is introduced. Such a structure, called hypercomplex multilayer perceptron (HMLP), is developed in quaternion algebra and allows quaternionic input and output signals to be dealt with, requiring a lower number of neurons than the real MLP, thus providing a reduced computational complexity. The structure introduced represents a generalization of the multilayer perceptron in the complex space (CMLP) reported in the literature. The fundamental result reported in the paper is a new density theorem which makes HMLPs universal interpolators of quaternion valued continuous functions. Moreover the proof of the density theorem can be restricted in order to formulate a density theorem in the complex space. Due to the identity between the quaternion and the four-dimensional real space, such a structure is also useful to approximate multidimensional real valued functions with a lower number of real parameters, decreasing the probability of being trapped in local minima during the learning phase. A numerical example is also reported in order to show the efficiency of the proposed structure. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.
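The parameter saving of an HMLP-style layer is easy to see in code. Below is a minimal sketch of one quaternion-valued dense layer (class and helper names are ours): each of the m outputs is a sum of quaternion products of the n quaternion inputs with quaternion weights, followed by a componentwise activation. It uses 4(nm + m) real parameters, versus 16nm + 4m for a real MLP layer acting on the same 4n real inputs:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternion arrays with shape (..., 4), scalar-first.
    Broadcasts over leading dimensions."""
    w1, x1, y1, z1 = np.moveaxis(a, -1, 0)
    w2, x2, y2, z2 = np.moveaxis(b, -1, 0)
    return np.stack([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2], axis=-1)

class QuaternionLayer:
    """One HMLP-style layer: m quaternion outputs from n quaternion inputs.
    Weights and biases are themselves quaternions."""
    def __init__(self, n, m, rng):
        self.W = rng.normal(size=(m, n, 4)) * 0.1   # quaternion weights
        self.b = rng.normal(size=(m, 4)) * 0.1      # quaternion biases
    def __call__(self, x):                          # x: (n, 4) quaternion inputs
        s = qmul(self.W, x[None, :, :]).sum(axis=1) + self.b
        return np.tanh(s)                           # componentwise activation
```

The density theorem in the paper is what justifies stacking such layers as universal interpolators of quaternion-valued continuous functions; the sketch only shows the forward pass and the parameter count.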
|
38
|
Zhuang H, Wang K, Roth Z. Simultaneous calibration of a robot and a hand-mounted camera. IEEE Trans Robot Autom 1995. [DOI: 10.1109/70.466601] [Citation(s) in RCA: 72] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
|
39
|
Park F, Martin B. Robot sensor calibration: solving AX=XB on the Euclidean group. IEEE Trans Robot Autom 1994. [DOI: 10.1109/70.326576] [Citation(s) in RCA: 300] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
|
40
|
|
41
|
Zhuang H, Shiu YC. A noise-tolerant algorithm for robotic hand-eye calibration with or without sensor orientation measurement. IEEE Trans Syst Man Cybern 1993. [DOI: 10.1109/21.247898] [Citation(s) in RCA: 59] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
|
42
|
|