1
Dowrick T, Xiao G, Nikitichev D, Dursun E, van Berkel N, Allam M, Koo B, Ramalhinho J, Thompson S, Gurusamy K, Blandford A, Stoyanov D, Davidson BR, Clarkson MJ. Evaluation of a calibration rig for stereo laparoscopes. Med Phys 2023; 50:2695-2704. [PMID: 36779419] [PMCID: PMC10614700] [DOI: 10.1002/mp.16310]
Abstract
BACKGROUND Accurate camera and hand-eye calibration are essential to ensure high-quality results in image-guided surgery applications. The process must also be simple enough for a nonexpert user to carry out in a surgical setting. PURPOSE This work seeks to identify a suitable method for tracked stereo laparoscope calibration within theater. METHODS A custom calibration rig was designed to enable rapid calibration in a surgical setting, and was compared against freehand calibration. Stereo reprojection, stereo reconstruction, tracked stereo reprojection, and tracked stereo reconstruction error metrics were used to evaluate calibration quality. RESULTS Use of the calibration rig reduced mean errors compared with freehand calibration: reprojection (1.47 px [SD 0.13] vs. 3.14 px [SD 2.11], p-value 1e-8), reconstruction (1.37 mm [SD 0.10] vs. 10.10 mm [SD 4.54], p-value 6e-7), and tracked reconstruction (1.38 mm [SD 0.10] vs. 12.64 mm [SD 4.34], p-value 1e-6). The use of a ChArUco pattern yielded slightly lower reprojection errors, while a dot grid produced lower reconstruction errors and was more robust under strong global illumination. CONCLUSION The use of the calibration rig results in a statistically significant decrease in calibration error metrics versus freehand calibration, and represents the preferred approach for use in the operating theater.
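The reprojection metric used in this evaluation is the standard one for calibrated pinhole cameras. As a minimal sketch (the function and variable names below are assumptions, not the paper's code), RMS reprojection error can be computed as:

```python
import numpy as np

def rms_reprojection_error(K, R, t, pts3d, pts2d):
    """RMS reprojection error (px) for a pinhole camera.

    K: 3x3 intrinsic matrix; R, t: extrinsics mapping world -> camera;
    pts3d: Nx3 calibration-target points (world frame);
    pts2d: Nx2 detected image points (px).
    """
    cam = pts3d @ R.T + t               # world -> camera frame
    proj = cam @ K.T                    # apply intrinsics
    proj = proj[:, :2] / proj[:, 2:3]   # perspective divide
    return float(np.sqrt(np.mean(np.sum((proj - pts2d) ** 2, axis=1))))
```

A uniform 1 px shift of every detected point yields an RMS error of exactly 1 px, which is a quick sanity check for the implementation.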
Affiliation(s)
- Thomas Dowrick
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Guofang Xiao
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Daniil Nikitichev
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Eren Dursun
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Niels van Berkel
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Moustafa Allam
- Royal Free Campus, UCL Medical School, Royal Free Hospital, London, UK
- Bongjin Koo
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Joao Ramalhinho
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Stephen Thompson
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Ann Blandford
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
- Danail Stoyanov
- Wellcome EPSRC Centre for Interventional and Surgical Sciences, UCL, London, UK
2
Abstract
A classic hand-eye system involves both hand-eye calibration and robot-world and hand-eye calibration. Because hand-eye calibration alone can recover only the hand-eye transformation, this study aims to determine the robot-world and hand-eye transformations simultaneously from the robot-world and hand-eye equation. Depending on whether the rotation and translation parts of the equation are decoupled, methods can be divided into separable solutions and simultaneous solutions. Separable solutions solve the rotation part before the translation part, so the estimation errors of the rotation are transferred to the translation. In this study, a method was proposed that keeps rotation and translation coupled; it comprises a closed-form solution based on the Kronecker product and an iterative solution based on the Gauss–Newton algorithm. Feasibility was tested on simulated and real data, and superiority was verified by comparison with the results of an available method. Finally, we improved a method that resolves the singularity problem caused by the parameterization of the rotation matrix, which can be widely used in robot-world and hand-eye calibration. The results show that the prediction errors of rotation and translation with the proposed method are reduced to $0.26^\circ$ and $1.67$ mm, respectively.
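The Kronecker-product closed-form step described above can be sketched in a few lines of numpy. This is not the paper's implementation (the Gauss–Newton refinement is omitted and all names are mine): it solves $A_i X = Y B_i$ for the hand-eye transform $X$ and robot-world transform $Y$ by vectorising the rotation constraint $R_{A}R_{X} = R_{Y}R_{B}$ into a null-space problem, then solving the translations by linear least squares.

```python
import numpy as np

def solve_robot_world_hand_eye(As, Bs):
    """Closed-form simultaneous robot-world / hand-eye calibration.

    Solves A_i X = Y B_i for 4x4 rigid transforms X and Y using the
    Kronecker-product identities (column-major vec):
        vec(RA @ RX) = (I3 kron RA) @ vec(RX)
        vec(RY @ RB) = (RB.T kron I3) @ vec(RY)
    """
    n = len(As)
    K = np.zeros((9 * n, 18))
    for i, (A, B) in enumerate(zip(As, Bs)):
        RA, RB = A[:3, :3], B[:3, :3]
        K[9*i:9*i+9, :9] = np.kron(np.eye(3), RA)
        K[9*i:9*i+9, 9:] = -np.kron(RB.T, np.eye(3))
    # [vec(RX); vec(RY)] spans the null space of K (last right-singular vector).
    _, _, Vt = np.linalg.svd(K)
    v = Vt[-1]

    def to_rotation(w):
        M = w.reshape(3, 3, order="F")     # undo column-major vec
        M *= np.sign(np.linalg.det(M))     # resolve the global sign ambiguity
        U, _, Vt_ = np.linalg.svd(M)
        return U @ Vt_                     # nearest orthogonal matrix

    RX, RY = to_rotation(v[:9]), to_rotation(v[9:])
    # Translation part: RA @ tX - tY = RY @ tB - tA, linear in (tX, tY).
    C = np.zeros((3 * n, 6))
    d = np.zeros(3 * n)
    for i, (A, B) in enumerate(zip(As, Bs)):
        C[3*i:3*i+3, :3] = A[:3, :3]
        C[3*i:3*i+3, 3:] = -np.eye(3)
        d[3*i:3*i+3] = RY @ B[:3, 3] - A[:3, 3]
    t = np.linalg.lstsq(C, d, rcond=None)[0]
    X, Y = np.eye(4), np.eye(4)
    X[:3, :3], X[:3, 3] = RX, t[:3]
    Y[:3, :3], Y[:3, 3] = RY, t[3:]
    return X, Y
```

Because rotation and translation are recovered from one coupled equation per measurement, rotation error does not propagate into translation the way it does in separable two-step methods.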
3
Pachtrachai K, Vasconcelos F, Edwards P, Stoyanov D. Learning to Calibrate - Estimating the Hand-eye Transformation Without Calibration Objects. IEEE Robot Autom Lett 2021. [DOI: 10.1109/lra.2021.3098942]
4
Robust and Accurate Hand-Eye Calibration Method Based on Schur Matric Decomposition. Sensors 2019; 19:4490. [PMID: 31623249] [PMCID: PMC6832585] [DOI: 10.3390/s19204490]
Abstract
To improve the accuracy and robustness of hand-eye calibration, a hand-eye calibration method based on Schur matrix decomposition is proposed in this paper. Because the accuracy of such methods depends strongly on the quality of the observation data, preprocessing the observations is essential. As with traditional two-step hand-eye calibration methods, we first solve for the rotation, after which the translation vector can be determined immediately. A general solution was obtained from a single observation through Schur matrix decomposition, reducing the degrees of freedom from three to two. Observation-data preprocessing is one of the basic unresolved problems in hand-eye calibration, and a discriminant equation for deleting outliers was deduced from the Schur decomposition. Solving this preprocessing problem through outlier detection significantly improved robustness. The proposed method was validated by both simulations and experiments. The results show that the prediction errors of rotation and translation were 0.06 arcmin and 1.01 mm respectively, and the proposed method performed much better in outlier detection. A minimal configuration for a unique solution was also proven from a new perspective.
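The separable two-step structure this abstract describes (rotation first, then translation) is the classic family of hand-eye solvers. The sketch below is a generic axis-based $AX = XB$ solver in that family (in the spirit of Park and Martin), not the paper's Schur-decomposition method; all names are mine. It recovers the rotation by orthogonal Procrustes on the motion axes, then the translation by least squares, which is exactly why rotation error propagates into the translation step.

```python
import numpy as np

def rotation_log(R):
    """Axis-angle vector (log map) of a 3x3 rotation matrix."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def two_step_hand_eye(As, Bs):
    """Separable AX = XB solver: rotation via orthogonal Procrustes on
    the motion axes (alpha_i = RX @ beta_i), then translation from the
    stacked linear system (RA - I) @ tX = RX @ tB - tA."""
    alphas = np.array([rotation_log(A[:3, :3]) for A in As])
    betas = np.array([rotation_log(B[:3, :3]) for B in Bs])
    # Procrustes: RX maximises sum_i alpha_i . (RX beta_i).
    U, _, Vt = np.linalg.svd(alphas.T @ betas)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # keep det(RX) = +1
    RX = U @ D @ Vt
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.hstack([RX @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tX = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = RX, tX
    return X
```

At least two motion pairs with non-parallel rotation axes are needed for a unique solution, which is the minimal-configuration question the paper revisits.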
5
Pachtrachai K, Vasconcelos F, Dwyer G, Hailes S, Stoyanov D. Hand-Eye Calibration With a Remote Centre of Motion. IEEE Robot Autom Lett 2019. [DOI: 10.1109/lra.2019.2924845]
6
Nguyen H, Pham QC. On the Covariance of $\boldsymbol X$ in $\boldsymbol A\boldsymbol X = \boldsymbol X\boldsymbol B$. IEEE Trans Robot 2018. [DOI: 10.1109/tro.2018.2861905]
7
Pachtrachai K, Vasconcelos F, Chadebecq F, Allan M, Hailes S, Pawar V, Stoyanov D. Adjoint Transformation Algorithm for Hand-Eye Calibration with Applications in Robotic Assisted Surgery. Ann Biomed Eng 2018; 46:1606-1620. [PMID: 30051249] [PMCID: PMC6154014] [DOI: 10.1007/s10439-018-2097-4]
Abstract
Hand–eye calibration aims at determining the unknown rigid transformation between the coordinate systems of a robot arm and a camera. Existing hand–eye algorithms using closed-form solutions followed by iterative non-linear refinement provide accurate calibration results within a broad range of robotic applications. However, in the context of surgical robotics hand–eye calibration is still a challenging problem due to the required accuracy within the millimetre range, coupled with a large displacement between endoscopic cameras and the robot end-effector. This paper presents a new method for hand–eye calibration based on the adjoint transformation of twist motions that solves the problem iteratively through alternating estimations of rotation and translation. We show that this approach converges to a solution with a higher accuracy than closed form initializations within a broad range of synthetic and real experiments. We also propose a stereo hand–eye formulation that can be used in the context of both our proposed method and previous state-of-the-art closed form solutions. Experiments with real data are conducted with a stereo laparoscope, the KUKA robot arm manipulator, and the da Vinci surgical robot, showing that both our new alternating solution and the explicit representation of stereo camera hand–eye relations contribute to a higher calibration accuracy.
Affiliation(s)
- Krittin Pachtrachai
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
- Francisco Vasconcelos
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
- François Chadebecq
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
- Max Allan
- Intuitive Surgical, Sunnyvale, CA, USA
- Stephen Hailes
- Department of Computer Science, University College London, London, UK
- Vijay Pawar
- Department of Computer Science, University College London, London, UK
- Danail Stoyanov
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences (WEISS) and the Department of Computer Science, University College London, London, UK
8
Chen ECS, Morgan I, Jayarathne U, Ma B, Peters TM. Hand-eye calibration using a target registration error model. Healthc Technol Lett 2017; 4:157-162. [PMID: 29184657] [PMCID: PMC5683221] [DOI: 10.1049/htl.2017.0072]
Abstract
Surgical cameras are prevalent in modern operating theatres and are often used as a surrogate for direct vision. Visualisation techniques (e.g. image fusion) made possible by tracking the camera require accurate hand-eye calibration between the camera and the tracking system. The authors introduce the concept of 'guided hand-eye calibration', where calibration measurements are facilitated by a target registration error (TRE) model. They formulate hand-eye calibration as a registration problem between homologous point-line pairs. For each measurement, the position of a monochromatic ball-tip stylus (a point) and its projection onto the image (a line) is recorded, and the TRE of the resulting calibration is predicted using a TRE model. The TRE model is then used to guide the placement of the calibration tool, so that the subsequent measurement minimises the predicted TRE. Assessing TRE after each measurement produces accurate calibration using a minimal number of measurements. As a proof of principle, they evaluated guided calibration using a webcam and an endoscopic camera. Their endoscopic camera results suggest that millimetre TRE is achievable when at least 15 measurements are acquired with the tracker sensor ∼80 cm away on the laparoscope handle for a target ∼20 cm away from the camera.
Affiliation(s)
- Isabella Morgan
- Biomedical Engineering, University of Waterloo, Waterloo, Ontario, Canada
- Burton Ma
- Department of Electrical Engineering and Computer Science, York University, Toronto, Ontario, Canada
9
Morgan I, Jayarathne U, Rankin A, Peters TM, Chen ECS. Hand-eye calibration for surgical cameras: a Procrustean Perspective-n-Point solution. Int J Comput Assist Radiol Surg 2017; 12:1141-1149. [PMID: 28425030] [DOI: 10.1007/s11548-017-1590-9]
Abstract
PURPOSE Surgical cameras are prevalent in modern operating theatres and are often used as surrogates for direct vision. A surgical navigation system is a useful adjunct, but requires an accurate "hand-eye" calibration to determine the geometric relationship between the surgical camera and its tracking markers. METHODS Using a tracked ball-tip stylus, we formulated hand-eye calibration as a Perspective-n-Point problem, which can be solved efficiently and accurately using as few as 15 measurements. RESULTS The proposed hand-eye calibration algorithm was applied to three types of camera and validated against five other widely used methods. Using projection error as the accuracy metric, our proposed algorithm compared favourably with existing methods. CONCLUSION We present a fully automated hand-eye calibration technique, based on Procrustean point-to-line registration, which provides superior results for calibrating surgical cameras compared to existing methods.
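The point-to-line formulation underlying this approach pairs a tracked 3D stylus tip with the viewing ray through its image projection; a correct hand-eye transform maps every tip position onto its ray. The measurement-model sketch below (names and conventions are my assumptions, not the authors' code) computes the residuals that such a registration minimises:

```python
import numpy as np

def point_line_residuals(E, pts_marker, ray_dirs):
    """Point-to-line residuals for a candidate hand-eye transform E.

    E: 4x4 transform taking stylus-tip points from the tracker/marker
    frame into the camera frame.
    pts_marker: Nx3 tracked ball-tip positions (marker frame).
    ray_dirs: Nx3 unit back-projection directions of the tip in the
    image; each viewing ray passes through the camera centre.
    Returns the perpendicular distance of each transformed point to
    its corresponding viewing ray.
    """
    p_cam = pts_marker @ E[:3, :3].T + E[:3, 3]
    along = np.sum(p_cam * ray_dirs, axis=1, keepdims=True)  # scalar projection onto ray
    closest = along * ray_dirs                               # nearest point on the ray
    return np.linalg.norm(p_cam - closest, axis=1)
```

Iteratively pairing each point with its closest on-ray point and solving a rigid Procrustes fit drives these residuals toward zero, which is the registration idea the paper builds on.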
Affiliation(s)
- Adam Rankin
- Robarts Research Institute, Western University, London, ON, Canada
- Terry M Peters
- Robarts Research Institute, Western University, London, ON, Canada
- Elvis C S Chen
- Robarts Research Institute, Western University, London, ON, Canada
10
Thompson S, Stoyanov D, Schneider C, Gurusamy K, Ourselin S, Davidson B, Hawkes D, Clarkson MJ. Hand-eye calibration for rigid laparoscopes using an invariant point. Int J Comput Assist Radiol Surg 2016; 11:1071-1080. [PMID: 26995597] [PMCID: PMC4893361] [DOI: 10.1007/s11548-016-1364-9]
Abstract
PURPOSE Laparoscopic liver resection has significant advantages over open surgery due to less patient trauma and faster recovery times, yet it can be difficult due to the restricted field of view and lack of haptic feedback. Image guidance provides a potential solution but one current challenge is in accurate "hand-eye" calibration, which determines the position and orientation of the laparoscope camera relative to the tracking markers. METHODS In this paper, we propose a simple and clinically feasible calibration method based on a single invariant point. The method requires no additional hardware, can be constructed by theatre staff during surgical setup, requires minimal image processing and can be visualised in real time. Real-time visualisation allows the surgical team to assess the calibration accuracy before use in surgery. In addition, in the laboratory, we have developed a laparoscope with an electromagnetic tracking sensor attached to the camera end and an optical tracking marker attached to the distal end. This enables a comparison of tracking performance. RESULTS We have evaluated our method in the laboratory and compared it to two widely used methods, "Tsai's method" and "direct" calibration. The new method is of comparable accuracy to existing methods, and we show RMS projected error due to calibration of 1.95 mm for optical tracking and 0.85 mm for EM tracking, versus 4.13 and 1.00 mm respectively, using existing methods. The new method has also been shown to be workable under sterile conditions in the operating room. CONCLUSION We have proposed a new method of hand-eye calibration, based on a single invariant point. Initial experience has shown that the method provides visual feedback, satisfactory accuracy and can be performed during surgery. We also show that an EM sensor placed near the camera would provide significantly improved image overlay accuracy.
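The core constraint of the invariant-point idea is that the fixed point, expressed in world coordinates through any tracked pose and the hand-eye transform, must come out the same every time. The sketch below is not the authors' method (which works from monocular projections with real-time visual feedback); it is a simplified linear variant assuming the fixed point can be reconstructed in 3D in the camera frame at each pose (e.g. from a stereo laparoscope), with all names mine:

```python
import numpy as np

def invariant_point_hand_eye(tracker_poses, cam_pts):
    """Linear hand-eye estimate from a single invariant point.

    tracker_poses: list of 4x4 marker->world transforms T_i.
    cam_pts: the fixed point reconstructed in the camera frame at each pose.
    Solves T_i @ E @ q_i = p for the hand-eye transform E (camera->marker)
    and the unknown world point p, then projects the rotation block of E
    onto SO(3). Unknowns: vec(RE) (9), tE (3), p (3).
    """
    n = len(tracker_poses)
    A = np.zeros((3 * n, 15))
    b = np.zeros(3 * n)
    for i, (T, q) in enumerate(zip(tracker_poses, cam_pts)):
        Rt, tt = T[:3, :3], T[:3, 3]
        # Rt @ (RE @ q + tE) + tt = p, rearranged with unknowns on the left:
        A[3*i:3*i+3, :9] = np.kron(q, Rt)    # (q^T kron Rt) @ vec(RE), column-major
        A[3*i:3*i+3, 9:12] = Rt
        A[3*i:3*i+3, 12:] = -np.eye(3)
        b[3*i:3*i+3] = -tt
    u = np.linalg.lstsq(A, b, rcond=None)[0]
    M = u[:9].reshape(3, 3, order="F")
    U, _, Vt = np.linalg.svd(M)              # project onto the rotation group
    RE = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    E = np.eye(4)
    E[:3, :3], E[:3, 3] = RE, u[9:12]
    return E, u[12:]
```

With varied tracked poses the stacked system is generically full rank, so the hand-eye transform and the invariant point fall out of a single least-squares solve.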
Affiliation(s)
- Stephen Thompson
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Danail Stoyanov
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Crispin Schneider
- Division of Surgery, Hampstead Campus, UCL Medical School, Royal Free Hospital, 9th Floor, Rowland Hill Street, London, UK
- Kurinchi Gurusamy
- Division of Surgery, Hampstead Campus, UCL Medical School, Royal Free Hospital, 9th Floor, Rowland Hill Street, London, UK
- Sébastien Ourselin
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Brian Davidson
- Division of Surgery, Hampstead Campus, UCL Medical School, Royal Free Hospital, 9th Floor, Rowland Hill Street, London, UK
- David Hawkes
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
- Matthew J Clarkson
- Centre for Medical Image Computing, Front Engineering Building, University College London, Malet Place, London, UK
11
Malti A, Barreto JP. Hand-eye and radial distortion calibration for rigid endoscopes. Int J Med Robot 2013; 9:441-454. [PMID: 23303645] [DOI: 10.1002/rcs.1478]
Abstract
BACKGROUND In this paper, we propose a non-linear calibration method for a hand-eye system whose camera undergoes radial distortion, such as a rigid endoscope. Whereas classic methods either estimate the camera intrinsics and the hand-eye transform separately, or perform a mixed non-linear estimation of both while assuming a pin-hole model, the proposed approach refines the hand-eye transform and the camera parameters, including the distortion factor, simultaneously from only three frames of the calibration pattern. METHODS Our approach relies on three steps: (i) linear initial estimates of the hand-eye transform and radial distortion from a minimum number of frames, with a single image to estimate the radial distortion and three frames to estimate the initial hand-eye transform; (ii) expressing the camera extrinsics with respect to the hand-eye and world-grid transforms; and (iii) bundle adjustment on the reprojection error with respect to the distortion parameters, the camera intrinsics, and the hand-eye transform. RESULTS Our method is quantitatively compared with state-of-the-art linear and non-linear methods, and yields a 3D reconstruction error of approximately 5% of the size of the 3D shape. CONCLUSIONS Our experimental results show the effectiveness of simultaneously estimating hand-eye and distortion parameters for 3D reconstruction.
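For concreteness, a one-parameter polynomial radial distortion model of the general kind estimated here can be sketched as follows (symbols and function names below are generic illustrations, not the paper's notation):

```python
import numpy as np

def distort(xy, k1):
    """Apply one-parameter radial distortion in normalised image
    coordinates: x_d = x_u * (1 + k1 * r^2), with r^2 = x_u^2 + y_u^2."""
    xy = np.asarray(xy, dtype=float)
    r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
    return xy * (1.0 + k1 * r2)

def undistort(xy_d, k1, iters=20):
    """Invert the model by fixed-point iteration,
    x_u <- x_d / (1 + k1 * |x_u|^2), starting from x_u = x_d.
    Converges quickly when |k1| * r^2 is small."""
    xy_d = np.asarray(xy_d, dtype=float)
    xy_u = xy_d.copy()
    for _ in range(iters):
        r2 = np.sum(xy_u ** 2, axis=-1, keepdims=True)
        xy_u = xy_d / (1.0 + k1 * r2)
    return xy_u
```

Negative $k_1$ gives the barrel distortion typical of endoscopes; the principal point (r = 0) is unaffected, which is a convenient sanity check.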
Affiliation(s)
- Abed Malti
- ALCoV-ISIT, UMR 6284 CNRS/Université d'Auvergne, Clermont-Ferrand, France