1
Dho YS, Lee BC, Moon HC, Kim KM, Kang H, Lee EJ, Kim MS, Kim JW, Kim YH, Park SJ, Park CK. Validation of real-time inside-out tracking and depth realization technologies for augmented reality-based neuronavigation. Int J Comput Assist Radiol Surg 2024; 19:15-25. [PMID: 37442869] [DOI: 10.1007/s11548-023-02993-0]
Abstract
PURPOSE Concomitant with the significant advances in computing technology, the use of augmented reality-based navigation in clinical applications is being actively researched. In this light, we developed novel object tracking and depth realization technologies to apply augmented reality-based neuronavigation to brain surgery. METHODS We developed real-time inside-out tracking based on visual inertial odometry and a visual inertial simultaneous localization and mapping algorithm. A cube quick response (QR) marker and depth data obtained from light detection and ranging (LiDAR) sensors are used for continuous tracking. For depth realization, order-independent transparency, clipping, annotation, and measurement functions were developed. In this study, the augmented reality model of a brain tumor patient was applied to a life-size three-dimensional (3D) printed model of the patient. RESULTS Using real-time inside-out tracking, we confirmed that the augmented reality model remained consistent with the 3D printed patient model without flutter, regardless of the movement of the visualization device. The coordination accuracy during real-time inside-out tracking was also validated: the average movement error was 0.34 ± 0.21 mm on the X axis and 0.04 ± 0.08 mm on the Y axis. Further, the application of order-independent transparency with multilayer alpha blending and filtered alpha compositing improved the perception of overlapping internal brain structures. Clipping, annotation, and measurement functions were also developed to aid depth perception and worked correctly during real-time coordination. We named this system METAMEDIP navigation. CONCLUSIONS The results validate the efficacy of the real-time inside-out tracking and depth realization technologies. With these novel technologies developed for continuous tracking and depth perception in augmented reality environments, we were able to overcome critical obstacles in the development of clinically applicable augmented reality neuronavigation.
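The order-independent transparency idea above can be made concrete with a small sketch. The following Python/NumPy fragment contrasts conventional back-to-front alpha compositing, which requires depth-sorted fragments, with a weighted, order-independent blend in the spirit of the multilayer alpha blending described above; it is a minimal illustration, not the METAMEDIP implementation, and the depth weighting function and the sample fragments are assumptions.

    # Minimal NumPy sketch of order-independent transparency (OIT) for overlapping
    # semi-transparent structures. NOT the METAMEDIP implementation; the depth-based
    # weighting function below is an assumption for illustration only.
    import numpy as np

    def alpha_composite_back_to_front(layers):
        """Conventional compositing: requires layers sorted far-to-near."""
        out = np.zeros(3)
        for color, alpha, _depth in sorted(layers, key=lambda l: -l[2]):
            out = alpha * np.asarray(color) + (1.0 - alpha) * out
        return out

    def weighted_blended_oit(layers):
        """Order-independent approximation: each fragment contributes
        independently, so no per-pixel depth sorting is needed."""
        accum = np.zeros(3)
        accum_w = 0.0
        transmittance = 1.0
        for color, alpha, depth in layers:                # any order
            w = alpha * max(1e-2, 1.0 / (1.0 + depth))    # assumed depth weight
            accum += w * np.asarray(color)
            accum_w += w
            transmittance *= (1.0 - alpha)
        avg_color = accum / max(accum_w, 1e-6)
        return (1.0 - transmittance) * avg_color          # background assumed black

    # Example: tumor (red), ventricle (blue), cortex (gray) fragments at one pixel.
    layers = [([1.0, 0.2, 0.2], 0.6, 30.0),   # (RGB, alpha, depth in mm)
              ([0.2, 0.2, 1.0], 0.4, 55.0),
              ([0.8, 0.8, 0.8], 0.3, 10.0)]
    print(alpha_composite_back_to_front(layers))
    print(weighted_blended_oit(layers))

Because each fragment contributes independently in the weighted variant, overlapping structures such as tumor, ventricles, and cortex can be blended without resolving their exact per-pixel depth order.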
Affiliation(s)
- Yun-Sik Dho: Neuro-Oncology Clinic, National Cancer Center, Goyang, Republic of Korea
- Byeong Cheol Lee: Research and Science Division, Research and Development Center, MEDICALIP Co. Ltd., Seoul, Republic of Korea
- Hyeong Cheol Moon: Department of Neurosurgery, Chungbuk National University Hospital, Cheongju, Republic of Korea
- Kyung Min Kim: Department of Neurosurgery, Inha University Hospital, Inha University College of Medicine, Incheon, Korea
- Ho Kang: Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Eun Jung Lee: Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Min-Sung Kim: Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Jin Wook Kim: Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Yong Hwy Kim: Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Sang Joon Park: Research and Science Division, Research and Development Center, MEDICALIP Co. Ltd., Seoul, Republic of Korea; Department of Radiology, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
- Chul-Kee Park: Department of Neurosurgery, Seoul National University Hospital, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080, Republic of Korea
2
Heiliger C, Heiliger T, Deodati A, Winkler A, Grimm M, Kalim F, Esteban J, Mihatsch L, Ehrlich V Treuenstätt VH, Mohamed KA, Andrade D, Frank A, Solyanik O, Mandal S, Werner J, Eck U, Navab N, Karcz K. AR visualizations in laparoscopy: surgeon preferences and depth assessment of vascular anatomy. Minim Invasiv Ther 2023; 32:190-198. [PMID: 37293947] [DOI: 10.1080/13645706.2023.2219739]
Abstract
Introduction: This study compares five augmented reality (AR) vasculature visualization techniques in a mixed-reality laparoscopy simulator with 50 medical professionals and analyzes their impact on the surgeon. Material and methods: The ability of each visualization technique to convey depth was measured using the participants' accuracy in an objective depth sorting task. Demographic data and subjective measures, such as the preference for each AR visualization technique and potential application areas, were collected with questionnaires. Results: Although differences in the objective measurements were observed across the visualization techniques, they were not statistically significant. In the subjective measures, however, 55% of the participants rated visualization technique II, 'Opaque with single-color Fresnel highlights', as their favorite. Participants felt that AR could be useful for various surgeries, especially complex surgeries (100%). Almost all participants agreed that AR could potentially improve surgical parameters, such as patient safety (88%), complication rate (84%), and identification of risk structures (96%). Conclusions: More studies are needed on the effect of different visualizations on task performance, as well as more sophisticated and effective visualization techniques for the operating room. With the findings of this study, we encourage the development of new study setups to advance surgical AR.
Affiliation(s)
- Christian Heiliger: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Thomas Heiliger: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Alessandra Deodati: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Alexander Winkler: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany; Computer Aided Medical Procedures & Augmented Reality (CAMP), Technical University of Munich (TUM), Munich, Germany
- Matthias Grimm: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany; Computer Aided Medical Procedures & Augmented Reality (CAMP), Technical University of Munich (TUM), Munich, Germany; Maxer Endoscopy GmbH, Wurmlingen, Germany
- Javier Esteban: Computer Aided Medical Procedures & Augmented Reality (CAMP), Technical University of Munich (TUM), Munich, Germany
- Lorenz Mihatsch: Department of Anesthesiology and Intensive Care Medicine, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Viktor H Ehrlich V Treuenstätt: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Khaled Ahmed Mohamed: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Dorian Andrade: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Alexander Frank: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Olga Solyanik: Department of Radiology, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Jens Werner: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
- Ulrich Eck: Computer Aided Medical Procedures & Augmented Reality (CAMP), Technical University of Munich (TUM), Munich, Germany
- Nassir Navab: Computer Aided Medical Procedures & Augmented Reality (CAMP), Technical University of Munich (TUM), Munich, Germany; Laboratory for Computational Sensing and Robotics, Whiting School of Engineering, Johns Hopkins University, Baltimore, MD, USA
- Konrad Karcz: Department of General, Visceral, and Transplantation Surgery, Hospital of the LMU Munich, Ludwig-Maximilians-Universität (LMU), Munich, Germany
3
Katayama M, Mitsuno D, Ueda K. Clinical Application to Improve the "Depth Perception Problem" by Combining Augmented Reality and a 3D Printing Model. Plast Reconstr Surg Glob Open 2023; 11:e5071. [PMID: 37361506] [PMCID: PMC10289554] [DOI: 10.1097/gox.0000000000005071]
Abstract
In our experience with intraoperative evaluation and educational applications of augmented reality technology, the illusion of depth has been a major problem. To address this depth perception problem, we conducted two experiments combining different three-dimensional (3D) models and holograms and varying the observation angles using an augmented reality device. Methods In experiment 1, when observing holograms projected on the surface layer of the model (bone model) or holograms projected on a layer deeper than the model (body surface model), the observers' first impressions regarding which combination made it easier to understand positional relationships were investigated. In experiment 2, to achieve a more quantitative evaluation, the observers were asked to measure the distance between two specific points on the surface and deep layers from two angles for each of the above combinations. Statistical analysis was performed on the measurement error for this distance. Results In experiment 1, the three-dimensional positional relationships were easier to understand with the bone model than with the body surface model. In experiment 2, the measurement error differed little between conditions and was not large enough to cause a misunderstanding of the depth relationship between the surface and deep layers. Conclusions Any combination can be used for preoperative examination and anatomical study purposes. In particular, projecting holograms onto a deep model, and observing positional relationships not only from the operator's viewpoint but also from multiple other angles, is preferable because it reduces confusion caused by the depth perception problem and improves understanding of the anatomy.
Affiliation(s)
- Misato Katayama: Department of Plastic and Reconstructive Surgery, Osaka Medical and Pharmaceutical University, Takatsuki City, Osaka, Japan
- Daisuke Mitsuno: Department of Plastic and Reconstructive Surgery, Osaka Medical and Pharmaceutical University, Takatsuki City, Osaka, Japan
- Koichi Ueda: Department of Plastic and Reconstructive Surgery, Osaka Medical and Pharmaceutical University, Takatsuki City, Osaka, Japan
4
Jeung D, Choi H, Ha HG, Oh SH, Hong J. Intraoperative zoom lens calibration for high magnification surgical microscope. Comput Methods Programs Biomed 2023; 238:107618. [PMID: 37247472] [DOI: 10.1016/j.cmpb.2023.107618]
Abstract
BACKGROUND AND OBJECTIVES An augmented reality (AR)-based surgical guidance system is often used with high-magnification zoom lens systems such as a surgical microscope, particularly in neurology or otolaryngology. To superimpose the internal structures of relevant organs on the microscopy image, an accurate calibration process to obtain the camera intrinsic and hand-eye parameters of the microscope is essential. However, conventional calibration methods are unsuitable for surgical microscopes because of their narrow depth of focus at high magnifications. To realize AR-based surgical guidance with a high-magnification surgical microscope, we herein propose a new calibration method that is applicable to the highest magnification levels as well as low magnifications. METHODS The key idea of the proposed method is to find the relationship between the focal length and the hand-eye parameters, which remains constant regardless of the magnification level. Based on this, even if the magnification changes arbitrarily during surgery, the intrinsic and hand-eye parameters are recalculated quickly and accurately from one or two pictures of the pattern. We also developed a dedicated calibration tool with a prism to take focused pattern images without interfering with the surgery. RESULTS The proposed calibration method ensured an AR error of < 1 mm at all magnification levels. In addition, the variation of the focal length was within 1% regardless of the magnification level, whereas the corresponding variation with the conventional calibration method exceeded 20% at high magnification levels. CONCLUSIONS The comparative study showed that the proposed method has outstanding accuracy and reproducibility for a high-magnification surgical microscope. The proposed calibration method is applicable to various endoscope or microscope systems with zoom lenses.
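For readers unfamiliar with the two calibration problems involved, the sketch below shows the standard OpenCV building blocks: intrinsic calibration, which yields the focal length at a given magnification, and hand-eye calibration, which relates the tracked microscope body to its camera. It does not reproduce the paper's focal-length-to-hand-eye relationship for arbitrary zoom levels; all variable names and the choice of the Tsai method are illustrative assumptions.

    # Standard calibration building blocks (OpenCV), shown for orientation only.
    import cv2
    import numpy as np

    # obj_pts: list of (N,3) float32 checkerboard corners in pattern coordinates
    # img_pts: list of (N,2) float32 detected corners, one array per focused image
    def calibrate_intrinsics(obj_pts, img_pts, image_size):
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_pts, img_pts, image_size, None, None)
        focal_length_px = (K[0, 0] + K[1, 1]) / 2.0   # mean of fx, fy in pixels
        return K, dist, rvecs, tvecs, focal_length_px

    # R_gripper2base / t_gripper2base: tracked poses of the microscope body ("hand")
    # R_target2cam / t_target2cam: pattern poses from solvePnP at each station
    def calibrate_hand_eye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam):
        R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
            R_gripper2base, t_gripper2base, R_target2cam, t_target2cam,
            method=cv2.CALIB_HAND_EYE_TSAI)
        return R_cam2gripper, t_cam2gripper

In the paper's setting, the point is that the hand-eye result can be tied to the focal length so that only the quick intrinsic update is repeated when the surgeon changes magnification.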
Affiliation(s)
- Deokgi Jeung: Department of Robotics and Mechatronics Engineering, DGIST, 333 Techno Jungang-Daero, Daegu 42988, Republic of Korea
- Ho-Gun Ha: Division of Intelligent Robot, DGIST, Daegu, Republic of Korea
- Seung-Ha Oh: Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University College of Medicine, Seoul, Republic of Korea; Sensory Organ Research Institute, Seoul National University Medical Research Center, Seoul, Republic of Korea
- Jaesung Hong: Department of Robotics and Mechatronics Engineering, DGIST, 333 Techno Jungang-Daero, Daegu 42988, Republic of Korea
5
Jeung D, Jung K, Lee HJ, Hong J. Augmented reality-based surgical guidance for wrist arthroscopy with bone-shift compensation. Comput Methods Programs Biomed 2023; 230:107323. [PMID: 36608430] [DOI: 10.1016/j.cmpb.2022.107323]
Abstract
BACKGROUND AND OBJECTIVES The intraoperative joint configuration differs from that in preoperative CT/MR images because of the motion applied during surgery, which can lead to an inaccurate approach to surgical targets. This study aims to provide real-time augmented reality (AR)-based surgical guidance for wrist arthroscopy based on a bone-shift model, validated through an in vivo computed tomography (CT) study. METHODS To accurately visualize concealed wrist bones on the intra-articular arthroscopic image, we propose a surgical guidance system with a novel bone-shift compensation method using noninvasive fiducial markers. First, to measure the effect of traction during surgery, two noninvasive fiducial markers were attached before surgery. In addition, two virtual link models connecting the wrist bones were implemented. When wrist traction occurs during the operation, the movement of the fiducial marker is measured, and bone-shift compensation is applied to move the virtual links in the direction of the traction. The proposed bone-shift compensation method was verified with the in vivo CT data of 10 participants. Finally, to introduce AR, camera calibration for the arthroscope parameters was performed, and a patient-specific template was used for registration between the patient and the wrist bone model. As a result, a virtual bone model with three-dimensional information could be accurately projected onto the two-dimensional arthroscopic image plane. RESULTS The proposed method estimated the position of the wrist bones under traction within a margin of 1.4 mm. After bone-shift compensation was applied, the target point error was reduced by 33.6% for the lunate, 63.3% for the capitate, 55.0% for the scaphoid, and 74.8% for the trapezoid compared with the preoperative wrist CT. In addition, a phantom experiment simulating the real surgical environment was conducted. The AR display expanded the effective field of view (FOV) of the arthroscope and helped in visualizing the anatomical structures around the bones. CONCLUSIONS This study demonstrated the successful handling of the AR error caused by wrist traction using the proposed method. In addition, the method allowed accurate AR visualization of the concealed bones and expansion of the limited FOV of the arthroscope. The proposed bone-shift compensation can also be applied to other joints, such as the knees or shoulders, by representing their bone movements using corresponding virtual links. Moreover, the movement of the joint skin during surgery can be measured using noninvasive fiducial markers in the same manner as for the wrist joint.
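A minimal sketch of the final overlay step is given below: a traction offset measured from the noninvasive skin fiducials is applied to the virtual bone model, which is then projected onto the 2D arthroscopic image with a calibrated camera model. The single linear shift stands in for the paper's virtual-link bone-shift model and is a simplifying assumption, as are the variable names.

    # Sketch of the AR overlay step: shift a virtual bone model by the traction
    # offset measured from noninvasive skin fiducials (a simplified stand-in for
    # the virtual-link bone-shift model), then project it onto the 2-D
    # arthroscopic image. Names and the linear shift are assumptions.
    import cv2
    import numpy as np

    def compensate_and_project(bone_pts_3d, marker_pre, marker_intra,
                               rvec, tvec, K, dist, link_gain=1.0):
        # Traction offset of the fiducial marker between pre- and intra-op states.
        shift = (marker_intra - marker_pre) * link_gain    # (3,), reference frame
        shifted = bone_pts_3d + shift                      # move concealed bones
        img_pts, _ = cv2.projectPoints(shifted.astype(np.float32),
                                       rvec, tvec, K, dist)
        return img_pts.reshape(-1, 2)                      # pixel coordinates

Here rvec, tvec, K, and dist would come from the arthroscope calibration and patient registration mentioned in the abstract.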
Affiliation(s)
- Deokgi Jeung: Department of Robotics and Mechatronics Engineering, DGIST, Daegu, South Korea
- Kyunghwa Jung: Department of Robotics and Mechatronics Engineering, DGIST, Daegu, South Korea; Korea Research Institute of Standards and Science, Daejeon, South Korea
- Hyun-Joo Lee: Department of Orthopaedic Surgery, School of Medicine, Kyungpook National University, Kyungpook National University Hospital, Daegu, South Korea
- Jaesung Hong: Department of Robotics and Mechatronics Engineering, DGIST, Daegu, South Korea
6
Durrani S, Onyedimma C, Jarrah R, Bhatti A, Nathani KR, Bhandarkar AR, Mualem W, Ghaith AK, Zamanian C, Michalopoulos GD, Alexander AY, Jean W, Bydon M. The Virtual Vision of Neurosurgery: How Augmented Reality and Virtual Reality are Transforming the Neurosurgical Operating Room. World Neurosurg 2022; 168:190-201. [DOI: 10.1016/j.wneu.2022.10.002]
7
Lin L, Gao Y, Aung ZM, Xu H, Wang B, Yang X, Chai G, Xie L. Preliminary reports of augmented-reality assisted craniofacial bone fracture reduction. J Plast Reconstr Aesthet Surg 2022; 75:e1-e8. [DOI: 10.1016/j.bjps.2022.06.105]
8
Rahimov C, Aliyev D, Rahimov N, Farzaliyev I. Mixed reality in the reconstruction of orbital floor: An experimental and clinical evaluative study. Ann Maxillofac Surg 2022; 12:46-53. [PMID: 36199454] [PMCID: PMC9527844] [DOI: 10.4103/ams.ams_141_21]
9
Jung K, Kim H, Kholinne E, Park D, Choi H, Lee S, Shin MJ, Kim DM, Hong J, Koh KH, Jeon IH. Navigation-assisted anchor insertion in shoulder arthroscopy: a validity study. BMC Musculoskelet Disord 2020; 21:812. [PMID: 33278892] [PMCID: PMC7719245] [DOI: 10.1186/s12891-020-03808-y]
Abstract
BACKGROUND This study aimed to compare conventional and navigation-assisted arthroscopic rotator cuff repair in terms of anchor screw insertion. METHODS The surgical performance of five operators using the conventional and the proposed navigation-assisted systems was compared in a phantom surgical model and in cadaveric shoulders. The participating operators were divided into two groups, an expert group (n = 3) and a novice group (n = 2). In the phantom model, the experimental tasks included anchor insertion in the rotator cuff footprint and suture retrieval. A motion analysis camera system was used to track the surgeons' hand movements. The surgical performance metrics included the total path length, number of movements, and surgical duration. In the cadaveric experiments, the repeatability and reproducibility of the anchor insertion angle were compared among the three experts, and the feasibility of navigation-assisted anchor insertion was validated. RESULTS No significant differences in the total path length, number of movements, and time taken were found between the conventional and proposed systems in the phantom model. In the cadaveric experiments, however, the clustering of the anchor insertion angles indicated that the proposed system enabled both novice and expert operators to reproducibly insert the anchor at an angle close to the predetermined target angle, resulting in an angle error of < 2° (P = 0.0002). CONCLUSION The proposed navigation-assisted system raised surgical performance from a novice level to an expert level. All the experts achieved high repeatability and reproducibility for anchor insertion. The navigation-assisted system may help surgeons, including inexperienced ones, insert suture anchors in the correct direction by providing better guidance for anchor orientation. LEVEL OF EVIDENCE A retrospective study (level 2).
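The performance metrics named above can be computed directly from a tracked hand trajectory; the following sketch illustrates one way to do so, assuming a fixed sampling rate and a velocity threshold for segmenting discrete movements (both values are assumptions, not taken from the study).

    # Sketch of the surgical performance metrics (total path length, number of
    # movements, duration) from a tracked hand trajectory of shape (frames, 3) in mm.
    import numpy as np

    def surgical_metrics(traj_mm, fps=100.0, vel_threshold_mm_s=20.0):
        steps = np.diff(traj_mm, axis=0)                  # per-frame displacement
        step_len = np.linalg.norm(steps, axis=1)
        total_path_length = step_len.sum()                # mm
        speed = step_len * fps                            # mm/s
        moving = speed > vel_threshold_mm_s
        # A "movement" is a contiguous run of frames above the speed threshold.
        n_movements = int(np.sum(moving[1:] & ~moving[:-1])) + int(moving[0])
        duration_s = len(traj_mm) / fps
        return total_path_length, n_movements, duration_s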
Affiliation(s)
- Kyunghwa Jung: Department of Robotics Engineering, DGIST, Daegu, Republic of Korea
- Hyojune Kim: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea
- Erica Kholinne: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea; Department of Orthopedic Surgery, St. Carolus Hospital, Jakarta, Indonesia
- Dongjun Park: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea
- Hyunseok Choi: Department of Robotics Engineering, DGIST, Daegu, Republic of Korea
- Seongpung Lee: Department of Robotics Engineering, DGIST, Daegu, Republic of Korea
- Myung-Jin Shin: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea
- Dong-Min Kim: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea
- Jaesung Hong: Department of Robotics Engineering, DGIST, Daegu, Republic of Korea
- Kyoung Hwan Koh: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea
- In-Ho Jeon: Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea
10
Intraoperative 3-dimensional Projection of Blood Vessels on Body Surface Using an Augmented Reality System. Plast Reconstr Surg Glob Open 2020; 8:e3028. [PMID: 32983783] [PMCID: PMC7489712] [DOI: 10.1097/gox.0000000000003028]
Abstract
Preoperative understanding of the course of the blood vessels is important for approaching the surgical field safely. In two cases in which vascular abnormalities were suspected, we projected the blood vessels onto the surgical field using the augmented reality device HoloLens. A splint was made so that the patient could be kept in a fixed position while undergoing computed tomographic angiography. Three-dimensional (3D) data on the blood vessels, skin surfaces, bones, and the three points chosen for alignment were segmented and then projected onto the body surface as holograms using the HoloLens. Two types of hologram projection were used: in projection type 1, the body contours were projected as a line; in projection type 2, the body surface was projected as a meshed skin surface. With projection type 2 rather than projection type 1, we gained a better understanding of the 3D anatomic findings and deformation characteristics, including the anatomic variation of the blood vessels and the positional relationships between the organs and body surfaces. To some extent, depth perception could be obtained by recognizing the bone, vessels, or tumor inside the meshed skin surface. Our new method allows 3D visualization of blood vessels from the body surface and helps in understanding their 3D anatomic variation, and it can be applied as long as the blood vessels can be visualized on imaging.
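Alignment from three chosen points, as described above, is typically solved as a least-squares rigid registration. The sketch below shows the standard SVD-based (Kabsch) solution for corresponding 3D landmarks; it is a generic illustration, not the HoloLens-specific pipeline, and the example coordinates are invented.

    # Least-squares rigid alignment of corresponding 3-D landmarks (Kabsch).
    import numpy as np

    def rigid_align(src_pts, dst_pts):
        """Find R, t minimizing ||R @ src + t - dst|| over corresponding points."""
        src = np.asarray(src_pts, float)
        dst = np.asarray(dst_pts, float)
        src_c, dst_c = src.mean(0), dst.mean(0)
        H = (src - src_c).T @ (dst - dst_c)               # 3x3 covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                                # proper rotation
        t = dst_c - R @ src_c
        return R, t

    # Example: three CT landmarks and their measured positions on the patient.
    ct_pts = [[0, 0, 0], [100, 0, 0], [0, 80, 0]]
    patient_pts = [[12, 5, 3], [112, 5, 3], [12, 85, 3]]
    R, t = rigid_align(ct_pts, patient_pts)

The resulting R and t place the segmented vessel, bone, and skin holograms in the patient's coordinate frame before display.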
11
Goo HW, Park SJ, Yoo SJ. Advanced Medical Use of Three-Dimensional Imaging in Congenital Heart Disease: Augmented Reality, Mixed Reality, Virtual Reality, and Three-Dimensional Printing. Korean J Radiol 2020; 21:133-145. [PMID: 31997589] [PMCID: PMC6992436] [DOI: 10.3348/kjr.2019.0625]
Abstract
Three-dimensional (3D) imaging and image reconstruction play a prominent role in the diagnosis, treatment planning, and post-therapeutic monitoring of patients with congenital heart disease. More interactive and realistic medical experiences take advantage of advanced visualization techniques like augmented, mixed, and virtual reality. Further, 3D printing is now used in medicine. All these technologies improve the understanding of the complex morphologies of congenital heart disease. In this review article, we describe the technical advantages and disadvantages of various advanced visualization techniques and their medical applications in the field of congenital heart disease. In addition, unresolved issues and future perspectives of these evolving techniques are described.
Affiliation(s)
- Hyun Woo Goo: Department of Radiology and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul, Korea
- Sang Joon Park: Department of Radiology, Biomedical Research Institute, Seoul National University Hospital, Seoul, Korea
- Shi Joon Yoo: Department of Diagnostic Imaging, The Hospital for Sick Children, University of Toronto, Toronto, Canada
12
Li G, Dong J, Wang J, Cao D, Zhang X, Cao Z, Lu G. The clinical application value of mixed-reality-assisted surgical navigation for laparoscopic nephrectomy. Cancer Med 2020; 9:5480-5489. [PMID: 32543025] [PMCID: PMC7402835] [DOI: 10.1002/cam4.3189]
Abstract
Purpose: Laparoscopic nephrectomy (LN) has become the preferred treatment for renal cell carcinoma (RCC). Adequate preoperative assessment and intraoperative navigation are key to the successful implementation of LN. The aim of this study was to evaluate the clinical application value of mixed-reality-assisted surgical navigation (MRASN) in LN. Patients and Methods: A total of 100 patients with stage T1N0M0 renal tumors who underwent laparoscopic partial nephrectomy (LPN) or laparoscopic radical nephrectomy (LRN) were prospectively enrolled and divided into a mixed-reality-assisted laparoscopic nephrectomy (MRALN) group (n = 50) and a non-mixed-reality-assisted laparoscopic nephrectomy (non-MRALN) group (n = 50). All patients underwent renal contrast-enhanced CT scans. The CT DICOM data of all patients in the MRALN group were imported into the mixed-reality (MR) postprocessing workstation, underwent holographic three-dimensional visualization (V3D) modeling, and were displayed in MR. We adopted the Likert scale to evaluate the clinical application value of MRASN. The consistency of the evaluators was assessed using the Cohen kappa coefficient (k). Results: There were no significant differences in patient demographic indicators between the MRALN group and the non-MRALN group (P > .05). The subjective scores of the clinical application value of MRASN for operative plan formulation, intraoperative navigation, remote consultation, teaching guidance, and doctor-patient communication were higher in the MRALN group than in the non-MRALN group (all P < .001). There were significantly more patients for whom LPN was successfully implemented in the MRALN group than in the non-MRALN group (82% vs 46%, P < .001). The MRALN group had a shorter operative time (OT) and warm ischemia time (WIT) and less estimated blood loss (EBL) than the non-MRALN group (all P < .001). Conclusion: MRASN is helpful for operative plan formulation, intraoperative navigation, remote consultation, teaching guidance, and doctor-patient communication. MRALN may effectively improve the rate of successful LPN implementation and reduce the OT, WIT, and EBL.
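The evaluator-consistency check mentioned above (Cohen kappa) can be reproduced in a few lines; the ratings below are invented, and a weighted kappa (weights="quadratic") may be preferable for ordinal Likert data.

    # Sketch of an inter-rater agreement check with Cohen's kappa; scores are made up.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [5, 4, 4, 3, 5, 2, 4, 5]     # hypothetical Likert scores (1-5)
    rater_b = [5, 4, 3, 3, 5, 2, 4, 4]
    kappa = cohen_kappa_score(rater_a, rater_b)   # 1.0 = perfect agreement
    print(f"Cohen's kappa: {kappa:.2f}")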
Affiliation(s)
- Guan Li: Department of Radiology, Jinling Hospital, Nanjing Medical University, Nanjing, China
- Jie Dong: Department of Urology, Jinling Hospital, Nanjing Medical University, Nanjing, China
- Jinbao Wang: Department of Radiology, General Hospital of Northern Theater Command, Shenyang, China
- Dongbing Cao: Department of Urology, Cancer Hospital of China Medical University, Shenyang, China
- Xin Zhang: Department of Radiology, The First Affiliated Hospital of China Medical University, Shenyang, China
- Zhiqiang Cao: Department of Urology, General Hospital of Northern Theater Command, Shenyang, China
- Guangming Lu: Department of Radiology, Jinling Hospital, Nanjing Medical University, Nanjing, China
13
Zhang M, Wang L. Re: Francesco Porpiglia, Enrico Checcucci, Daniele Amparore, et al. Three-dimensional Augmented Reality Robot-assisted Partial Nephrectomy in Case of Complex Tumours (PADUA ≥ 10): A New Intraoperative Tool Overcoming the Ultrasound Guidance. Eur Urol. In press. https://doi.org/10.1016/j.eururo.2019.11.024. Eur Urol 2020; 77:e161-e162. [PMID: 32303382] [DOI: 10.1016/j.eururo.2020.03.037]
Affiliation(s)
- Mengda Zhang: Department of Urology, The Third Xiangya Hospital, Central South University, Changsha, Hunan, P.R. China; Department of Urology, Xiangya Hospital, Central South University, Changsha, Hunan, P.R. China
- Long Wang: Department of Urology, The Third Xiangya Hospital, Central South University, Changsha, Hunan, P.R. China; Department of Urology, Xiangya Hospital, Central South University, Changsha, Hunan, P.R. China
14
Budhathoki S, Alsadoon A, Prasad P, Haddad S, Maag A. Augmented reality for narrow area navigation in jaw surgery: Modified tracking by detection volume subtraction algorithm. Int J Med Robot 2020; 16:e2097. [DOI: 10.1002/rcs.2097]
Affiliation(s)
- Srijana Budhathoki: School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia; Department of Information Technology, Study Group Australia, Sydney Campus, Australia
- Abeer Alsadoon: School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia; Department of Information Technology, Study Group Australia, Sydney Campus, Australia
- P.W.C. Prasad: School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia; Department of Information Technology, Study Group Australia, Sydney Campus, Australia
- Sami Haddad: Department of Oral and Maxillofacial Services, Greater Western Sydney Area Health Services, Sydney, Australia; Department of Oral and Maxillofacial Services, Central Coast Area Health, Gosford, Australia
- Angelika Maag: School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia; Department of Information Technology, Study Group Australia, Sydney Campus, Australia
15
A Skin-Conformal, Stretchable, and Breathable Fiducial Marker Patch for Surgical Navigation Systems. Micromachines 2020; 11:mi11020194. [PMID: 32070015] [PMCID: PMC7074652] [DOI: 10.3390/mi11020194]
Abstract
Augmented reality (AR) surgical navigation systems have attracted considerable attention because they assist medical professionals in visualizing the location of ailments within the human body that are not readily seen with the naked eye. Taking medical imaging with a parallel C-shaped arm (C-arm) as an example, surgical sites are typically targeted in real time using an optical tracking device and a fiducial marker. These markers then guide operators who are using a multifunctional endoscope apparatus by signaling the direction or distance needed to reach the affected parts of the body. In this way, fiducial markers are used to accurately protect the vessels and nerves exposed during the surgical process. Although these systems have already shown potential for precision implantation, delamination of the fiducial marker, which is a critical component of the system, from human skin remains a challenge because of the mechanical mismatch between the marker and skin, causing registration problems that lead to poor position alignment and surgical degradation. To overcome this challenge, the mechanical modulus and stiffness of the marker patch should be lowered to approximately 150 kPa, which is comparable to that of the epidermis, while improving functionality. Herein, we present a skin-conformal, stretchable yet breathable fiducial marker for application in AR-based surgical navigation systems. By adopting pore patterns, we were able to create a fiducial marker with a skin-like low modulus and breathability. When attached to the skin, the fiducial marker was easily identified using optical recognition equipment and showed skin-conformal adhesion when stretched and shrunk repeatedly. As such, we believe the marker is a good fiducial marker candidate for patients undergoing procedures with surgical navigation systems.
16
Lee S, Shim S, Ha HG, Lee H, Hong J. Simultaneous Optimization of Patient-Image Registration and Hand-Eye Calibration for Accurate Augmented Reality in Surgery. IEEE Trans Biomed Eng 2020; 67:2669-2682. [PMID: 31976878] [DOI: 10.1109/tbme.2020.2967802]
Abstract
OBJECTIVE Augmented reality (AR) navigation using a position sensor in endoscopic surgery relies on the quality of patient-image registration and hand-eye calibration. Conventional methods collect the data needed to compute the two output transformation matrices separately. However, the AR display setting during surgery generally differs from that during the preoperative processes. Although conventional methods can identify optimal solutions under the initial conditions, AR display errors are unavoidable during surgery owing to the inherent computational complexity of AR processes, such as error accumulation over successive matrix multiplications, and owing to tracking errors of the position sensor. METHODS We propose the simultaneous optimization of patient-image registration and hand-eye calibration in an AR environment before surgery. The relationship between the endoscope and a virtual object to be overlaid is first calculated using an endoscopic image, which also functions as a reference during optimization. After including the tracking information from the position sensor, patient-image registration and hand-eye calibration are optimized in a least-squares sense. RESULTS Experiments with synthetic data verify that the proposed method is less sensitive to computation and tracking errors. A phantom experiment with a position sensor was also conducted. The accuracy of the proposed method is significantly higher than that of the conventional method. CONCLUSION The AR accuracy of the proposed method is compared with those of the conventional methods, and the superiority of the proposed method is verified. SIGNIFICANCE This study demonstrates that the proposed method has substantial potential for improving AR navigation accuracy.
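A schematic sketch of the joint refinement is given below: both unknown transforms are parameterized as 6-vectors (rotation vector plus translation), and the residual compares the camera-to-object pose predicted through the tracking chain with the reference pose measured from the endoscopic image. The chain convention, frame names, and use of SciPy's least_squares are assumptions for illustration, not the authors' implementation.

    # Schematic joint refinement of hand-eye calibration and patient-image
    # registration (NOT the paper's implementation; chain convention assumed).
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation as Rot

    def to_mat(x6):
        T = np.eye(4)
        T[:3, :3] = Rot.from_rotvec(x6[:3]).as_matrix()
        T[:3, 3] = x6[3:]
        return T

    def residuals(params, T_trk_sensor_list, T_trk_patient_list, T_cam_obj_ref_list):
        T_cam_sensor = to_mat(params[:6])      # hand-eye (unknown)
        T_patient_img = to_mat(params[6:])     # patient-image registration (unknown)
        res = []
        for T_ts, T_tp, T_ref in zip(T_trk_sensor_list, T_trk_patient_list,
                                     T_cam_obj_ref_list):
            # predicted pose of the image/virtual object in the camera frame
            T_pred = T_cam_sensor @ np.linalg.inv(T_ts) @ T_tp @ T_patient_img
            d = np.linalg.inv(T_ref) @ T_pred
            res.extend(Rot.from_matrix(d[:3, :3]).as_rotvec())  # rotation error
            res.extend(d[:3, 3])                                 # translation error
        return np.asarray(res)

    def optimize(x0, T_trk_sensor_list, T_trk_patient_list, T_cam_obj_ref_list):
        sol = least_squares(residuals, x0,
                            args=(T_trk_sensor_list, T_trk_patient_list,
                                  T_cam_obj_ref_list))
        return to_mat(sol.x[:6]), to_mat(sol.x[6:])

Optimizing both transforms against the same image-derived reference is what distinguishes this approach from calibrating each transform in isolation.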
17
Gao Y, Lin L, Chai G, Xie L. A feasibility study of a new method to enhance the augmented reality navigation effect in mandibular angle split osteotomy. J Craniomaxillofac Surg 2019; 47:1242-1248. [DOI: 10.1016/j.jcms.2019.04.005]
18
Supporting mandibular resection with intraoperative navigation utilizing augmented reality technology – A proof of concept study. J Craniomaxillofac Surg 2019; 47:854-859. [DOI: 10.1016/j.jcms.2019.03.004]
19
Pietruski P, Majak M, Świątek-Najwer E, Żuk M, Popek M, Jaworowski J, Mazurek M. Supporting fibula free flap harvest with augmented reality: A proof-of-concept study. Laryngoscope 2019; 130:1173-1179. [PMID: 31132152] [DOI: 10.1002/lary.28090]
Abstract
OBJECTIVE To analyze a novel navigation system utilizing augmented reality (AR) as a supporting method for fibula free flap (FFF) harvest and fabrication. METHODS A total of 126 simulated osteotomies supported with a cutting guide or one of two AR-based intraoperative navigation modules, simple AR (sAR) or navigated AR (nAR), were carried out on 18 identical models of the fibula (42 osteotomies per method). After fusing postoperative computed tomography scans of the operated fibulas with the virtual surgical plan based on preoperative images, the objective outcomes, namely angular deviations from the planned osteotomy trajectory (°) and deviations of control points marked on the trajectory (mm), were determined. RESULTS All analyzed methods provided similar accuracy of assisted osteotomies. The only significant difference concerned angular deviation in the sagittal plane, which was smaller after the cutting guide-assisted procedures than after the application of sAR and nAR (4.1 ± 2.29 vs. 5.08 ± 3.64 degrees, P = 0.031, and 4.1 ± 2.29 vs. 4.97 ± 2.91 degrees, P = 0.002, respectively). The mean deviation of control points after the cutting guide-assisted procedures was 2.76 ± 1.06 mm, compared with 2.67 ± 1.09 mm for sAR and 2.95 ± 1.11 mm for nAR. CONCLUSION Our study demonstrated that both novel AR-based methods provided accuracy of assisted harvesting and contouring of the FFF similar to that of the cutting guides. This fact, as well as the acceptability of the concept to clinicians, justifies their further development and evaluation in preclinical settings. LEVEL OF EVIDENCE NA.
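The two outcome measures can be computed as below, assuming each osteotomy is represented by a cut-plane normal and a set of control points on its trajectory (a simplifying assumption, not the authors' exact fusion workflow).

    # Sketch of the outcome measures: angular deviation between planned and
    # achieved cut planes, and control-point deviations along the trajectory.
    import numpy as np

    def angular_deviation_deg(normal_planned, normal_actual):
        a = np.asarray(normal_planned, float); a /= np.linalg.norm(a)
        b = np.asarray(normal_actual, float); b /= np.linalg.norm(b)
        cosang = np.clip(abs(a @ b), -1.0, 1.0)    # orientation-independent
        return np.degrees(np.arccos(cosang))

    def control_point_deviation_mm(pts_planned, pts_actual):
        d = np.linalg.norm(np.asarray(pts_actual) - np.asarray(pts_planned), axis=1)
        return d.mean(), d.std()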
Affiliation(s)
- Piotr Pietruski: Department of Applied Pharmacy and Bioengineering, Medical University of Warsaw, Warsaw, Poland
- Marcin Majak: Department of Biomedical Engineering, Mechatronics and Theory of Mechanisms, Wroclaw University of Technology, Wroclaw, Poland; Department of Radiology, Medical Centre of Postgraduate Education, Gruca Orthopaedic and Trauma Teaching Hospital, Otwock, Poland
- Ewelina Świątek-Najwer: Department of Biomedical Engineering, Mechatronics and Theory of Mechanisms, Wroclaw University of Technology, Wroclaw, Poland
- Magdalena Żuk: Department of Biomedical Engineering, Mechatronics and Theory of Mechanisms, Wroclaw University of Technology, Wroclaw, Poland
- Michał Popek: Department of Biomedical Engineering, Mechatronics and Theory of Mechanisms, Wroclaw University of Technology, Wroclaw, Poland
- Janusz Jaworowski: Department of Applied Pharmacy and Bioengineering, Medical University of Warsaw, Warsaw, Poland; Timeless Plastic Surgery Clinic, Warsaw, Poland
- Maciej Mazurek: Department of Applied Pharmacy and Bioengineering, Medical University of Warsaw, Warsaw, Poland
20
Ahn J, Choi H, Hong J, Hong J. Tracking Accuracy of a Stereo Camera-Based Augmented Reality Navigation System for Orthognathic Surgery. J Oral Maxillofac Surg 2019; 77:1070.e1-1070.e11. [DOI: 10.1016/j.joms.2018.12.032]
21
Song C, Jeon S, Lee S, Ha HG, Kim J, Hong J. Augmented reality-based electrode guidance system for reliable electroencephalography. Biomed Eng Online 2018; 17:64. [PMID: 29793498] [PMCID: PMC5968572] [DOI: 10.1186/s12938-018-0500-x]
Abstract
Background In longitudinal electroencephalography (EEG) studies, repeatable electrode positioning is essential for reliable EEG assessment. Conventional methods use anatomical landmarks as fiducial locations for electrode placement. Because the landmarks are identified manually, the EEG assessment is inevitably unreliable owing to individual variations among subjects and examiners. To overcome this unreliability, an augmented reality (AR) visualization-based electrode guidance system is proposed. Methods The proposed electrode guidance system uses AR visualization to replace manual electrode positioning. After the facial surface of a subject is scanned and registered by an RGB-D camera, the AR view of the initial electrode positions, serving as reference positions, is overlaid on the current electrode positions in real time. Thus, it can guide the placement of subsequent electrodes with high repeatability. Results The experimental results with a phantom show that the repeatability of electrode positioning was improved compared with that of the conventional 10–20 positioning system. Conclusion The proposed AR guidance system improves electrode positioning performance with a cost-effective setup that uses only an RGB-D camera. This system can be used as an alternative to the international 10–20 system.
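After the stored face scan has been registered to the live RGB-D scan, the guidance step reduces to reporting, for each electrode, the offset between its reference position and its current position. The sketch below assumes the rigid transform (R, t) from that surface registration is already available; all names are illustrative.

    # Sketch of the guidance step: transform stored reference electrode positions
    # into the current camera frame and report the remaining offset per electrode.
    import numpy as np

    def electrode_offsets(ref_positions, current_positions, R, t):
        ref_in_current = (R @ np.asarray(ref_positions).T).T + t   # Nx3, mm
        offsets = np.asarray(current_positions) - ref_in_current   # guidance vectors
        distances = np.linalg.norm(offsets, axis=1)                # mm to target
        return offsets, distances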
Affiliation(s)
- Chanho Song: Department of Robotics Engineering, DGIST, Techno jungang-daero, Daegu, Republic of Korea
- Sangseo Jeon: Department of Robotics Engineering, DGIST, Techno jungang-daero, Daegu, Republic of Korea
- Seongpung Lee: Department of Robotics Engineering, DGIST, Techno jungang-daero, Daegu, Republic of Korea
- Ho-Gun Ha: Department of Robotics Engineering, DGIST, Techno jungang-daero, Daegu, Republic of Korea
- Jonghyun Kim: Department of Robotics Engineering, DGIST, Techno jungang-daero, Daegu, Republic of Korea
- Jaesung Hong: Department of Robotics Engineering, DGIST, Techno jungang-daero, Daegu, Republic of Korea
22
Intraoperative Evaluation of Body Surface Improvement by an Augmented Reality System That a Clinician Can Modify. Plast Reconstr Surg Glob Open 2017; 5:e1432. [PMID: 28894655] [PMCID: PMC5585428] [DOI: 10.1097/gox.0000000000001432]
Abstract
BACKGROUND Augmented reality (AR) technology, which can combine computer-generated images with a real scene, has recently been reported in the medical field. We devised an AR system for evaluating improvements of the body surface, which is important in plastic surgery. METHODS We constructed an AR system that is easy to modify by combining existing devices and free software. We superimposed three-dimensional (3D) images of the body surface and the bone (obtained from VECTRA H1 and CT) onto the actual surgical field using Moverio BT-200 smart glasses and evaluated improvements of the body surface in 8 cases. RESULTS In all cases, the 3D image was successfully projected onto the surgical field. Improvement of the display method for the 3D image made it easier to distinguish the different shapes in the 3D image and the surgical field, making comparison easier. In a patient with fibrous dysplasia, the symmetrized body surface image was useful for confirming improvement of the real body surface. In a patient with a complex facial fracture, the simulated bone image was useful as a reference for reduction. In a patient with an osteoma of the forehead, simultaneously displayed images of the body surface and the bone made it easier to understand their positional relationships. CONCLUSIONS This study confirmed that AR technology is helpful for evaluation of the body surface in several clinical applications. Our findings are useful not only for body surface evaluation but also for the effective utilization of AR technology in the field of plastic surgery.
23
Virtual Reality and Augmented Reality in Plastic Surgery: A Review. Arch Plast Surg 2017; 44:179-187. [PMID: 28573091] [PMCID: PMC5447526] [DOI: 10.5999/aps.2017.44.3.179]
Abstract
Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.
24
The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal 2017; 37:66-90. [DOI: 10.1016/j.media.2017.01.007]