1
Ha HG, Gu K, Jeung D, Hong J, Lee H. Simulated augmented reality-based calibration of optical see-through head-mounted display for surgical navigation. Int J Comput Assist Radiol Surg 2024;19:1647-1657. [PMID: 38777946] [DOI: 10.1007/s11548-024-03164-5]
Abstract
PURPOSE Calibration of an optical see-through head-mounted display is critical for augmented reality-based surgical navigation. While conventional methods have advanced, calibration errors remain significant. Moreover, prior research has focused primarily on calibration accuracy and procedure, neglecting the impact on the overall surgical navigation system. Consequently, these enhancements do not necessarily translate to accurate augmented reality in the optical see-through head-mounted display due to systemic errors, including those in calibration. METHOD This study introduces a simulated augmented reality-based calibration to address these issues. By replicating the augmented reality that appears in the optical see-through head-mounted display, the method achieves a calibration that compensates for augmented reality errors, thereby reducing them. The process involves two distinct calibration approaches, followed by adjusting the transformation matrix to minimize displacement in the simulated augmented reality. RESULTS The efficacy of this method was assessed through two accuracy evaluations: registration accuracy and augmented reality accuracy. Experimental results showed an average translational error of 2.14 mm and rotational error of 1.06° across axes in both approaches. Additionally, augmented reality accuracy, measured by the overlay regions' ratio, increased to approximately 95%. These findings confirm the enhancement in both calibration and augmented reality accuracy with the proposed method. CONCLUSION The study presents a calibration method using simulated augmented reality, which minimizes augmented reality errors. This approach, requiring minimal manual intervention, offers a more robust and precise calibration technique for augmented reality applications in surgical navigation.
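The core step described in the abstract is adjusting a transformation matrix so that the displacement between the simulated AR overlay and its reference is minimized. As a hedged illustration only (not the authors' implementation), given paired 3D points from the simulated overlay and their reference positions, a least-squares rigid correction can be computed in closed form with the Kabsch algorithm:

```python
import numpy as np

def correction_transform(simulated_pts, reference_pts):
    """Least-squares rigid correction (Kabsch / SVD) aligning simulated
    AR overlay points onto their reference positions, reducing the
    residual AR displacement. Both inputs are (N, 3) arrays."""
    cs = simulated_pts.mean(axis=0)
    cr = reference_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (simulated_pts - cs).T @ (reference_pts - cr)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cr - R @ cs
    return R, t  # R @ p + t maps simulated points onto references
```

In practice such a correction would be composed with the display-calibration matrix; the point pairs here are hypothetical stand-ins for whatever fiducials the simulated AR provides.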
Affiliation(s)
- Ho-Gun Ha
- Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
- Kyeongmo Gu
- Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
- Deokgi Jeung
- Department of Robotics and Mechatronics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
- Jaesung Hong
- Department of Robotics and Mechatronics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
- Hyunki Lee
- Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science and Technology (DGIST), 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
2
Jeung D, Choi H, Ha HG, Oh SH, Hong J. Intraoperative zoom lens calibration for high magnification surgical microscope. Comput Methods Programs Biomed 2023;238:107618. [PMID: 37247472] [DOI: 10.1016/j.cmpb.2023.107618]
Abstract
BACKGROUND AND OBJECTIVES An augmented reality (AR)-based surgical guidance system is often used with high-magnification zoom lens systems such as a surgical microscope, particularly in neurology or otolaryngology. To superimpose the internal structures of relevant organs on the microscopy image, an accurate calibration process to obtain the camera intrinsic and hand-eye parameters of the microscope is essential. However, conventional calibration methods are unsuitable for surgical microscopes because of their narrow depth of focus at high magnifications. To realize AR-based surgical guidance with a high-magnification surgical microscope, we herein propose a new calibration method that is applicable to the highest magnification levels as well as low magnifications. METHODS The key idea of the proposed method is to find the relationship between the focal length and the hand-eye parameters, which remains constant regardless of the magnification level. Based on this, even if the magnification changes arbitrarily during surgery, the intrinsic and hand-eye parameters are recalculated quickly and accurately with one or two pictures of the pattern. We also developed a dedicated calibration tool with a prism to take focused pattern images without interfering with the surgery. RESULTS The proposed calibration method ensured an AR error of < 1 mm for all magnification levels. In addition, the variation of focal length was within 1% regardless of the magnification level, whereas the corresponding variation with the conventional calibration method exceeded 20% at high magnification levels. CONCLUSIONS The comparative study showed that the proposed method has outstanding accuracy and reproducibility for a high-magnification surgical microscope. The proposed calibration method is applicable to various endoscope or microscope systems with zoom lenses.
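The abstract's key idea is a relationship between focal length and hand-eye parameters that stays fixed across magnification levels. As a hedged illustration only, assuming a hypothetical linear dependence of the hand-eye translation t_z on focal length f (the paper's actual relation may differ), recalculating t_z at an arbitrary zoom level could look like:

```python
import numpy as np

def fit_tz_vs_focal(focals, tz_values):
    """Fit t_z = a * f + b from a few reference calibrations.
    This linear model is an illustrative assumption, not the
    paper's exact formulation."""
    a, b = np.polyfit(focals, tz_values, 1)
    return a, b

def tz_at(f, a, b):
    """Recalculate the hand-eye translation for a new zoom level
    from a single focal-length estimate."""
    return a * f + b
```

During surgery, f would be re-estimated from one or two focused pattern images and the hand-eye parameters derived from the fitted relation instead of rerunning a full calibration.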
Affiliation(s)
- Deokgi Jeung
- Department of Robotics and Mechatronics Engineering, DGIST, 333 Techno Jungang-daero, Daegu 42988, Republic of Korea
- Ho-Gun Ha
- Division of Intelligent Robot, DGIST, Daegu, Republic of Korea
- Seung-Ha Oh
- Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University College of Medicine, Seoul, Republic of Korea; Sensory Organ Research Institute, Seoul National University Medical Research Center, Seoul, Republic of Korea
- Jaesung Hong
- Department of Robotics and Mechatronics Engineering, DGIST, 333 Techno Jungang-daero, Daegu 42988, Republic of Korea
3
Ha HG, Han G, Lee S, Nam K, Joung S, Park I, Hong J. Robot-patient registration for optical tracker-free robotic fracture reduction surgery. Comput Methods Programs Biomed 2023;228:107239. [PMID: 36410266] [DOI: 10.1016/j.cmpb.2022.107239]
Abstract
BACKGROUND AND OBJECTIVE Image-guided robotic surgery for fracture reduction is a medical procedure in which surgeons control a surgical robot to align the fractured bones by using a navigation system that shows the rotation and distance of bone movement. In such robotic surgeries, it is necessary to estimate the relationship between the robot and patient (bone), a task known as robot-patient registration, to realize the navigation. Through this registration, the real-world fracture state can be simulated in the virtual space of the navigation system. METHODS This paper proposes an approach to robot-patient registration for an optical-tracker-free robotic fracture-reduction system. Instead of an optical tracker, which is a three-dimensional position localizer, X-ray images are used to perform the robot-patient registration by combining the relationships of both the robot and the patient with respect to the C-arm. The proposed method consists of two registration steps: an initial registration followed by a refined registration that adopts particle swarm optimization with a minimum cross-reprojection error based on bidirectional X-ray images. To address features rendered unrecognizable by interference between the robot and bone, we also developed attachable robot features. The attached robot features could be clearly extracted from the X-ray images, and precise registration could be achieved through the particle swarm optimization. RESULTS The proposed method was evaluated in phantom and ex vivo experiments involving a caprine cadaver. For the phantom experiments, the average translational and rotational errors were 1.88 mm and 2.45°, respectively; the corresponding errors in the ex vivo experiments were 2.64 mm and 3.32°. The results demonstrated the effectiveness of the proposed robot-patient registration.
CONCLUSIONS The proposed method enables estimation of the three-dimensional relationship between fractured bones in the real world using only two-dimensional images, and this relationship is accurately simulated in virtual space for the navigation. Therefore, a reduction procedure for successful treatment of bone fractures in image-guided robotic surgery can be expected with the aid of the proposed registration method.
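The refined registration step adopts particle swarm optimization over a reprojection-error cost. A generic global-best PSO sketch (not the authors' implementation), demonstrated here on a toy 2-D registration cost:

```python
import numpy as np

def pso_minimize(cost, dim, bounds, n_particles=40, iters=200, seed=0):
    """Minimal global-best particle swarm optimizer.
    cost: callable mapping a length-`dim` vector to a scalar.
    bounds: (lo, hi) scalar box constraints for every dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # per-particle best positions
    pcost = np.array([cost(p) for p in x])
    g = pbest[pcost.argmin()].copy()              # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + cognitive pull toward pbest + social pull toward g
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        g = pbest[pcost.argmin()].copy()
    return g, float(pcost.min())
```

In the actual system the cost would be the cross-reprojection error of the robot features in the bidirectional X-ray images, and the search space the six rigid-transform parameters rather than a 2-D translation.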
Affiliation(s)
- Ho-Gun Ha
- Division of Intelligent Robot, DGIST, 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
- Gukyeong Han
- Department of Robotics and Mechatronics Engineering, DGIST, 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
- Seongpung Lee
- R&D Center, Curexo Inc., 4-5, Yanghyeon-ro 405 Beon-gil, Jungwon-gu, Seongnam-si, Gyeonggi-do 13438, Republic of Korea
- Kwonsun Nam
- R&D Center, SAMICK THK Co., Ltd., Jinwi2sandan-ro, Jinwi-myeon, Pyeongtaek-si, Gyeonggi-do 17708, Republic of Korea
- Sanghyun Joung
- Medical Device and Robot Institute of Park, Kyungpook National University, Global plaza 1006, 80, Daehak-ro, Buk-gu, Daegu 41566, Republic of Korea
- Ilhyung Park
- Medical Device and Robot Institute of Park, Kyungpook National University, Global plaza 1006, 80, Daehak-ro, Buk-gu, Daegu 41566, Republic of Korea; Department of Orthopaedic Surgery, School of Medicine, Kyungpook National University Hospital, 130 Dongdeok-ro, Jung-gu, Daegu 41944, Republic of Korea
- Jaesung Hong
- Department of Robotics and Mechatronics Engineering, DGIST, 333 Techno Jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu 42988, Republic of Korea
4
Heterogeneous Stitching of X-ray Images According to Homographic Evaluation. J Digit Imaging 2021;34:1249-1263. [PMID: 34505959] [DOI: 10.1007/s10278-021-00503-9]
Abstract
The C-arm X-ray system is a common intraoperative imaging modality used to observe the state of a fractured bone in orthopedic surgery. Using the C-arm, the bone fragments are aligned during surgery, and their lengths and angles with respect to the entire bone are measured to verify the fracture reduction. Since the field of view of the C-arm is too narrow to visualize the entire bone, a panoramic X-ray image constructed by stitching multiple images is used to extend the view. To achieve X-ray image stitching with feature detection, the extraction of accurate and densely matched features within the overlap region between images is imperative. However, since the features are highly affected by the properties and sizes of the overlap regions in consecutive X-ray images, the accuracy and density of matched features cannot be guaranteed. To solve this problem, a heterogeneous stitching of X-ray images was proposed, in which the stitching strategy was selected for each overlap region based on a homographic evaluation. To acquire sufficiently matched features within the limited overlap region, integrated feature detection was used to estimate a homography. The homography was then evaluated to confirm its accuracy. When the estimated homography was incorrect, local regions around the matched features were derived from the integrated feature detection and substituted to re-estimate the homography. Successful X-ray image stitching of the C-arm was achieved by estimating the optimal homography for each image. Based on phantom and ex-vivo experiments, we confirmed that the proposed method constructs panoramic X-ray images more robustly than conventional methods.
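The estimate-then-evaluate loop described above can be sketched with a plain DLT homography and a reprojection-error check. This is a minimal NumPy illustration using hypothetical matched points; the paper's integrated feature detection and region-substitution strategy are not reproduced here:

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform (DLT): homography H mapping src -> dst
    from >= 4 matched 2D points (rows of (N, 2) arrays)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each match contributes two rows of the DLT system A h = 0
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def reprojection_error(H, src, dst):
    """Mean pixel error of H applied to src; used to accept or
    reject an estimated homography before stitching."""
    p = np.c_[src, np.ones(len(src))] @ H.T
    proj = p[:, :2] / p[:, 2:3]
    return float(np.linalg.norm(proj - dst, axis=1).mean())
```

A stitching loop would accept H only when `reprojection_error` falls below a pixel threshold and otherwise re-estimate from substituted local matches.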
5
Chen L, Zhang F, Zhan W, Gan M, Sun L. Optimization of virtual and real registration technology based on augmented reality in a surgical navigation system. Biomed Eng Online 2020;19:1. [PMID: 31915014] [PMCID: PMC6950982] [DOI: 10.1186/s12938-019-0745-z]
Abstract
Background Surgical navigation systems have become essential tools that enable doctors to perform complex operations accurately and safely. However, the traditional navigation interface is intended only for two-dimensional observation and thus does not display the full spatial information of the lesion area. Moreover, the image navigation interface is separated from the operating area, so the doctor needs to switch the field of vision between the screen and the patient's lesion area. In this paper, augmented reality (AR) technology was applied to spinal surgery to provide more intuitive information to surgeons, and the accuracy of virtual and real registration was improved via research on AR technology. During the operation, the doctor could observe the AR image and the true shape of the internal spine through the skin. Methods To improve the accuracy of virtual and real registration, a registration technique based on an improved identification method and a robot-assisted method was proposed. The experimental procedure was optimized by using the improved identification method, and X-ray images were used to verify the effectiveness of the puncture performed by the robot. Results The final experimental results show that the average accuracy of the virtual and real registration based on the general identification method was 9.73 ± 0.46 mm (range 8.90–10.23 mm). The average accuracy of the virtual and real registration based on the improved identification method was 3.54 ± 0.13 mm (range 3.36–3.73 mm), an improvement of approximately 65% over the general identification method. The highest accuracy of the virtual and real registration based on the robot-assisted method was 2.39 mm, an improvement of approximately 28.5% over the improved identification method. Conclusion The experimental results show that the two optimized methods are highly effective. The proposed AR navigation system has high accuracy and stability and may have value in future spinal surgeries.
Affiliation(s)
- Long Chen
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China
- Fengfeng Zhang
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China; Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215123, China
- Wei Zhan
- Department of Radiation Oncology, The First Affiliated Hospital of Soochow University, Suzhou, China
- Minfeng Gan
- Department of Radiation Oncology, The First Affiliated Hospital of Soochow University, Suzhou, China
- Lining Sun
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China; Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215123, China
6
Hosseinian S, Arefi H, Navab N. Toward an End-to-End Calibration for Mobile C-Arm in Combination with a Depth Sensor for Surgical Augmented Reality Applications. Sensors 2019;20(1):36. [PMID: 31861606] [PMCID: PMC6982695] [DOI: 10.3390/s20010036]
Abstract
C-arm X-ray imaging is commonly applied in operating rooms for guiding orthopedic surgeries, and augmented reality (AR) with C-arm X-ray images during surgery is an efficient way to facilitate procedures for surgeons. However, the accurate calibration required for C-arm-based surgical AR remains challenging due to the limitations of C-arm imaging systems, such as the instability of C-arm calibration parameters and the narrow field of view. We extend existing methods using a depth camera and propose a new calibration procedure consisting of calibration of the C-arm imaging system and 3D/2D calibration of an RGB-D camera and the C-arm system, with a new method that achieves reliable data and promising accuracy while remaining consistent with standard surgical protocols. For the calibration procedure, we apply bundle adjustment equations with a 3D-designed Lego multi-modal phantom, in contrast to previous methods in which planar calibration phantoms were applied. Using our method, the X-ray image was visualized upon the 3D data with a mean overlay error of 1.03 mm. The evaluations showed that the proposed calibration procedure provides promising accuracy for AR surgeries and improves the flexibility and robustness of existing C-arm calibration methods for surgical augmented reality (using a C-arm and an RGB-D sensor). Moreover, the results showed the efficiency of our method in compensating for the effects of C-arm movement on calibration parameters: the overlay error for non-zero rotational movement of the C-arm was improved by using a virtual detector.
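The 3D/2D calibration step amounts to estimating a projection matrix that links 3D phantom points to their 2D detections in the X-ray image. A minimal DLT sketch on synthetic points (an illustration of the underlying geometry only; the paper's bundle-adjustment formulation is more elaborate):

```python
import numpy as np

def estimate_projection(pts3d, pts2d):
    """DLT estimate of a 3x4 projection matrix P (up to scale) from
    >= 6 non-coplanar 3D points and their 2D image detections."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        # Each correspondence contributes two rows of A p = 0
        A.append([-X, -Y, -Z, -1, 0, 0, 0, 0, u * X, u * Y, u * Z, u])
        A.append([0, 0, 0, 0, -X, -Y, -Z, -1, v * X, v * Y, v * Z, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 4)

def project(P, pts3d):
    """Apply P and return pixel coordinates (the overlay positions)."""
    p = np.c_[pts3d, np.ones(len(pts3d))] @ P.T
    return p[:, :2] / p[:, 2:3]
```

A non-coplanar phantom (like the Lego phantom in the paper) is what makes this 3D DLT well-posed, whereas planar targets only constrain a homography.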
Affiliation(s)
- Sahar Hosseinian
- School of Surveying and Geospatial Engineering, College of Engineering, University of Tehran, Tehran 1439957131, Iran
- Hossein Arefi
- School of Surveying and Geospatial Engineering, College of Engineering, University of Tehran, Tehran 1439957131, Iran
- Nassir Navab
- Chair for Computer Aided Medical Procedures & Augmented Reality, Faculty of Computer Science, Technical University of Munich, Boltzmannstr. 3, 85748 Garching b. Munich, Germany