1. Bamps K, Bertels J, Minten L, Puvrez A, Coudyzer W, De Buck S, Ector J. Phantom study of augmented reality framework to assist epicardial punctures. J Med Imaging (Bellingham) 2024; 11:035002. [PMID: 38817712] [PMCID: PMC11135927] [DOI: 10.1117/1.jmi.11.3.035002]
Abstract
Purpose: The objective of this study was to evaluate an augmented reality (AR) system designed to improve guidance, accuracy, and visualization during the subxiphoidal approach for epicardial ablation. Approach: An AR application was developed for the HoloLens 2 to project real-time needle trajectories and patient-specific 3D organs. Needle tracking was implemented to give the operator real-time feedback and facilitate needle navigation. The application was evaluated in three experiments: overlay accuracy, puncture accuracy, and a pre-clinical evaluation on a phantom. Results: The AR system achieved an overlay accuracy of 2.36 ± 2.04 mm and a puncture accuracy of 1.02 ± 2.41 mm. In the pre-clinical phantom evaluation, needle puncture error was 7.43 ± 2.73 mm with AR guidance versus 22.62 ± 9.37 mm without. Conclusions: The AR platform has the potential to improve the accuracy and safety of percutaneous epicardial access for mapping and ablation of cardiac arrhythmias, thereby reducing complications and improving patient outcomes.
Affiliation(s)
- Kobe Bamps
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
- KU Leuven, ESAT-PSI, Leuven, Belgium
- Lennert Minten
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
- Alexis Puvrez
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
- Stijn De Buck
- KU Leuven, ESAT-PSI, Leuven, Belgium
- KU Leuven, Department of Imaging and Pathology, Leuven, Belgium
- Joris Ector
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
2. Nawawithan N, Young J, Bettati P, Rathgeb AP, Pruitt KT, Frimpter J, Kim H, Yu J, Driver D, Shiferaw A, Chaudhari A, Johnson BA, Gahan J, Yu J, Fei B. An augmented reality and high-speed optical tracking system for laparoscopic surgery. Proc SPIE Int Soc Opt Eng 2024; 12928:129280E. [PMID: 38708143] [PMCID: PMC11069178] [DOI: 10.1117/12.3008448]
Abstract
While minimally invasive laparoscopic surgery can reduce blood loss, hospital stay, and recovery time compared to open surgery, it suffers from a limited field of view and difficulty in locating subsurface targets. Our proposed solution applies an augmented reality (AR) system to overlay pre-operative images, such as those from magnetic resonance imaging (MRI), onto the target organ in the user's real-world environment, providing critical information on the location of subsurface lesions to guide surgical procedures in real time. An infrared motion-tracking camera system was employed to obtain real-time position data of the patient and surgical instruments. For hologram registration, fiducial markers were used to track and map virtual coordinates to the real world. In this study, phantom models of each organ were constructed to test the reliability and accuracy of the AR-guided laparoscopic system, with root mean square error (RMSE) used to evaluate the targeting accuracy of the laparoscopic interventional procedure. Our results demonstrated a registration error of 2.42 ± 0.79 mm and a procedural targeting error of 4.17 ± 1.63 mm; the system will be further refined for potential clinical procedures.
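The RMSE targeting metric reported in this abstract can be sketched as follows; the function name and sample coordinates are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def targeting_rmse(measured_tips, target):
    """Root mean square error (mm) of achieved tip positions about a target."""
    measured_tips = np.asarray(measured_tips, dtype=float)
    target = np.asarray(target, dtype=float)
    # Squared Euclidean distance of each attempt from the target center
    squared_distances = np.sum((measured_tips - target) ** 2, axis=1)
    return float(np.sqrt(squared_distances.mean()))

# Hypothetical needle-tip positions (mm) scattered around a target at the origin
tips = [[1.0, 2.0, 2.0], [0.0, 3.0, 4.0], [2.0, 2.0, 1.0]]
print(round(targeting_rmse(tips, [0.0, 0.0, 0.0]), 3))  # → 3.786
```

Note that RMSE over distances penalizes large outliers more than a plain mean error would, which is why it is a common choice for targeting accuracy.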
Affiliation(s)
- Nati Nawawithan
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Bioengineering, University of Texas at Dallas, Richardson, TX
- Jeff Young
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Bioengineering, University of Texas at Dallas, Richardson, TX
- Patric Bettati
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Bioengineering, University of Texas at Dallas, Richardson, TX
- Armand P. Rathgeb
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Bioengineering, University of Texas at Dallas, Richardson, TX
- Kelden T. Pruitt
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Bioengineering, University of Texas at Dallas, Richardson, TX
- Jordan Frimpter
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Henry Kim
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Jonathan Yu
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Davis Driver
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Amanuel Shiferaw
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Aditi Chaudhari
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Brett A. Johnson
- Department of Urology, University of Texas Southwestern Medical Center, Dallas, TX
- Jeffrey Gahan
- Department of Urology, University of Texas Southwestern Medical Center, Dallas, TX
- James Yu
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Radiology, University of Texas Southwestern Medical Center, Dallas, TX
- Baowei Fei
- Center for Imaging and Surgical Innovation, University of Texas at Dallas, Richardson, TX
- Department of Bioengineering, University of Texas at Dallas, Richardson, TX
- Department of Radiology, University of Texas Southwestern Medical Center, Dallas, TX
3. Elsakka A, Park BJ, Marinelli B, Swinburne NC, Schefflein J. Virtual and Augmented Reality in Interventional Radiology: Current Applications, Challenges, and Future Directions. Tech Vasc Interv Radiol 2023; 26:100919. [PMID: 38071031] [PMCID: PMC11152052] [DOI: 10.1016/j.tvir.2023.100919]
Abstract
Virtual reality (VR) and augmented reality (AR) are emerging technologies with the potential to revolutionize interventional radiology (IR). These innovations offer advantages in patient care, interventional planning, and educational training by improving the visualization and navigation of medical images. Despite progress, several challenges hinder their widespread adoption, including limitations in navigation systems, cost, clinical acceptance, and technical constraints of AR/VR equipment. However, ongoing research holds promise, with recent advancements such as shape-sensing needles and improved organ deformation modeling. The development of deep learning techniques, particularly for medical image segmentation, presents a promising avenue for addressing existing accuracy and precision issues. Future applications of AR/VR in IR include simulation-based training, preprocedural planning, intraprocedural guidance, and increased patient engagement. As these technologies advance, they are expected to facilitate telemedicine, enhance operational efficiency, and improve patient outcomes, marking a new frontier in interventional radiology.
Affiliation(s)
- Ahmed Elsakka
- Neuroradiology Service, Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY
- Brian J Park
- Oregon Health & Science University, Portland, OR
- Brett Marinelli
- Interventional Radiology Service, Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY
- Nathaniel C Swinburne
- Neuroradiology Service, Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY
- Javin Schefflein
- Neuroradiology Service, Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY
4. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681] [DOI: 10.1088/1361-6560/acaf23]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. The types of AR visualization are divided into two categories: in situ visualization and non-in situ visualization. The content rendered by AR visualization varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in different surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as the future development trend. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
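As an illustration of the point-based registration category named in this review, a minimal least-squares rigid registration between corresponding fiducial points (the classic Kabsch/SVD solution) might look like the sketch below; the function name and fiducial data are hypothetical, not taken from any of the reviewed systems.

```python
import numpy as np

def rigid_register(src, dst):
    """Best-fit rotation R and translation t such that dst ≈ R @ src + t (Kabsch)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical fiducials: recover a known 30° rotation about z plus a shift
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 1.0])
src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
dst = src @ R_true.T + t_true
R, t = rigid_register(src, dst)
```

With noise-free correspondences the recovered `R` and `t` match the ground truth; in practice the residual after registration is reported as fiducial registration error.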
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
5. Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103] [PMCID: PMC10466024] [DOI: 10.1016/j.media.2022.102361]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58), and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception, with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and the mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab, coupled with design that incorporates human-factors considerations to solve clear clinical problems, should ensure that the significant current research efforts succeed.
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
6. HoloUS: Augmented reality visualization of live ultrasound images using HoloLens for ultrasound-guided procedures. Int J Comput Assist Radiol Surg 2021; 17:385-391. [PMID: 34817764] [DOI: 10.1007/s11548-021-02526-7]
Abstract
PURPOSE Microsoft HoloLens is a pair of augmented reality (AR) smart glasses that could improve the intraprocedural visualization of ultrasound-guided procedures. With the wearable HoloLens headset, an ultrasound image can be virtually rendered and registered with the ultrasound transducer and placed directly in the practitioner's field of view. METHODS A custom application, called HoloUS, was developed using the HoloLens and a portable ultrasound machine connected through a wireless network. A custom 3D-printed case with an AR-pattern for the ultrasound transducer permitted ultrasound image tracking and registration. Voice controls on the HoloLens supported the scaling and movement of the ultrasound image as desired. The ultrasound images were streamed and displayed in real-time. A user study was performed to assess the effectiveness of the HoloLens as an alternative display platform. Novices and experts were timed on tasks involving targeting simulated vessels using a needle in a custom phantom. RESULTS Technical characterization of the HoloUS app was conducted using frame rate, tracking accuracy, and latency as performance metrics. The app ran at 25 frames/s, had an 80-ms latency, and could track the transducer with an average reprojection error of 0.0435 pixels. With AR visualization, the novices' times improved by 17% but the experts' times decreased slightly by 5%, which may reflect the experts' training and experience bias. CONCLUSION The HoloUS application was found to enhance user experience and simplify hand-eye coordination. By eliminating the need to alternately observe the patient and the ultrasound images presented on a separate monitor, the proposed AR application has the potential to improve efficiency and effectiveness of ultrasound-guided procedures.
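The reprojection-error metric used to characterize transducer tracking above can be sketched with a simple pinhole projection; the intrinsics, pose, and point sets below are hypothetical values for illustration, not those of the HoloUS system.

```python
import numpy as np

def mean_reprojection_error(K, R, t, pts3d, pts2d):
    """Mean pixel distance between projected 3D model points and 2D detections."""
    # Transform model points into the camera frame, then apply intrinsics K
    cam = R @ np.asarray(pts3d, dtype=float).T + np.asarray(t, dtype=float).reshape(3, 1)
    proj = (K @ cam).T
    proj = proj[:, :2] / proj[:, 2:3]  # perspective divide to pixel coordinates
    return float(np.linalg.norm(proj - np.asarray(pts2d, dtype=float), axis=1).mean())

# Hypothetical pinhole intrinsics and a marker plane 2 m in front of the camera
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 2.0])
pts3d = np.array([[-0.05, -0.05, 0.0], [0.05, -0.05, 0.0], [0.05, 0.05, 0.0]])
pts2d = (K @ (R @ pts3d.T + t.reshape(3, 1))).T
pts2d = pts2d[:, :2] / pts2d[:, 2:3]   # exact detections for this sketch
print(mean_reprojection_error(K, R, t, pts3d, pts2d))  # → 0.0
```

In a real tracking pipeline the 2D detections come from the camera image, so the error is nonzero and reflects both calibration and pose-estimation quality.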
7. Augmented Reality Based Surgical Navigation of Complex Pelvic Osteotomies—A Feasibility Study on Cadavers. Appl Sci (Basel) 2021. [DOI: 10.3390/app11031228]
Abstract
Augmented reality (AR)-based surgical navigation may offer new possibilities for the safe and accurate execution of complex osteotomies. In this study we investigated the feasibility of navigating the periacetabular osteotomy of Ganz (PAO), known as one of the most complex orthopedic interventions, on two cadaveric pelves under realistic operating room conditions. Preoperative planning was conducted on computed tomography (CT)-reconstructed 3D models using in-house developed software, which allowed the creation of cutting-plane objects for planning the osteotomies and reorienting the acetabular fragment. An AR application was developed comprising point-based registration, motion compensation, and guidance for the osteotomies as well as fragment reorientation. Navigation accuracy was evaluated on CT-reconstructed 3D models, resulting in an error of 10.8 mm for osteotomy starting points and 5.4° for osteotomy directions. The reorientation errors were 6.7°, 7.0° and 0.9° for the x-, y- and z-axis, respectively. The average postoperative error of the lateral center-edge (LCE) angle was 4.5°. Our study demonstrated that AR-based execution of complex osteotomies is feasible. Fragment realignment navigation needs further improvement, although it is already more accurate than the state of the art in PAO surgery.
8. Long DJ, Li M, De Ruiter QMB, Hecht R, Li X, Varble N, Blain M, Kassin MT, Sharma KV, Sarin S, Krishnasamy VP, Pritchard WF, Karanian JW, Wood BJ, Xu S. Comparison of Smartphone Augmented Reality, Smartglasses Augmented Reality, and 3D CBCT-guided Fluoroscopy Navigation for Percutaneous Needle Insertion: A Phantom Study. Cardiovasc Intervent Radiol 2021; 44:774-781. [PMID: 33409547] [DOI: 10.1007/s00270-020-02760-7]
Abstract
PURPOSE To compare needle placement performance using an augmented reality (AR) navigation platform implemented on smartphone or smartglasses devices to that of CBCT-guided fluoroscopy in a phantom. MATERIALS AND METHODS An AR application was developed to display a planned percutaneous needle trajectory on the smartphone (iPhone7) and smartglasses (HoloLens1) devices in real time. Two AR-guided needle placement systems and CBCT-guided fluoroscopy with navigation software (XperGuide, Philips) were compared using an anthropomorphic phantom (CIRS, Norfolk, VA). Six interventional radiologists each performed 18 independent needle placements using smartphone (n = 6), smartglasses (n = 6), and XperGuide (n = 6) guidance. Placement error was defined as the distance from the needle tip to the target center. Placement time was recorded. For XperGuide, dose-area product (DAP, mGy*cm2) and fluoroscopy time (sec) were recorded. Statistical comparisons were made using a two-way repeated measures ANOVA. RESULTS The placement error using the smartphone, smartglasses, or XperGuide was similar (3.98 ± 1.68 mm, 5.18 ± 3.84 mm, 4.13 ± 2.38 mm, respectively, p = 0.11). Compared to CBCT-guided fluoroscopy, the smartphone and smartglasses reduced placement time by 38% (p = 0.02) and 55% (p = 0.001), respectively. The DAP for insertion using XperGuide was 3086 ± 2920 mGy*cm2, and no intra-procedural radiation was required for augmented reality. CONCLUSIONS Smartphone- and smartglasses-based augmented reality reduced needle placement time and radiation exposure while maintaining placement accuracy compared to a clinically validated needle navigation platform.
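The placement-error definition and the percentage time reductions reported in this abstract amount to the following computations; the sample values are hypothetical, not the study's data.

```python
import numpy as np

def placement_errors(tips, targets):
    """Euclidean distance (mm) from each needle tip to its target center."""
    tips = np.asarray(tips, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return np.linalg.norm(tips - targets, axis=1)

def percent_time_reduction(baseline_times, method_times):
    """Relative reduction in mean placement time versus the baseline (%)."""
    b, m = np.mean(baseline_times), np.mean(method_times)
    return 100.0 * (b - m) / b

# Hypothetical values: one placement and two sets of placement times (seconds)
errs = placement_errors([[3.0, 4.0, 0.0]], [[0.0, 0.0, 0.0]])
print(errs)  # → [5.]
print(percent_time_reduction([100.0, 120.0], [60.0, 72.0]))  # → 40.0
```

A paired comparison such as the study's two-way repeated measures ANOVA would then operate on these per-placement errors and times grouped by operator and guidance method.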
Affiliation(s)
- Dilara J Long
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Quirina M B De Ruiter
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Rachel Hecht
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Nicole Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Philips Research of North America, Cambridge, MA, 02141, USA
- Maxime Blain
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Michael T Kassin
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Karun V Sharma
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, Washington, DC, USA
- Shawn Sarin
- Department of Interventional Radiology, George Washington University Hospital, Washington, DC, USA
- Venkatesh P Krishnasamy
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- William F Pritchard
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- John W Karanian
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
9. Andrews CM, Henry AB, Soriano IM, Southworth MK, Silva JR. Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices. IEEE J Transl Eng Health Med 2020; 9:4900214. [PMID: 33489483] [PMCID: PMC7819530] [DOI: 10.1109/jtehm.2020.3045642]
Abstract
Many clinical procedures would benefit from direct and intuitive real-time visualization of anatomy, surgical plans, or other information crucial to the procedure. Three-dimensional augmented reality (3D-AR) is an emerging technology that has the potential to assist physicians with spatial reasoning during clinical interventions. The most intriguing applications of 3D-AR involve visualizations of anatomy or surgical plans that appear directly on the patient. However, commercially available 3D-AR devices have spatial localization errors that are too large for many clinical procedures. For this reason, a variety of approaches for improving 3D-AR registration accuracy have been explored. The focus of this review is on the methods, accuracy, and clinical applications of registering 3D-AR devices with the clinical environment. The works cited represent a variety of approaches for registering holograms to patients, including manual registration, computer vision-based registration, and registrations that incorporate external tracking systems. Evaluations of user accuracy when performing clinically relevant tasks suggest that accuracies of approximately 2 mm are feasible. 3D-AR device limitations due to the vergence-accommodation conflict or other factors attributable to the headset hardware add on the order of 1.5 mm of error compared to conventional guidance. Continued improvements to 3D-AR hardware will decrease these sources of error.
Affiliation(s)
- Christopher M. Andrews
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
- SentiAR, Inc., St. Louis, MO 63108, USA
- Jonathan R. Silva
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
10. Heinrich F, Schwenderling L, Joeres F, Lawonn K, Hansen C. Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion. IEEE Trans Vis Comput Graph 2020; 26:3568-3575. [PMID: 33006930] [DOI: 10.1109/tvcg.2020.3023637]
Abstract
Augmented reality (AR) may be a useful technique to overcome issues of conventionally used navigation systems supporting medical needle insertions, such as increased mental workload and complicated hand-eye coordination. Previous research primarily focused on developing AR navigation systems for specific display devices, but differences between the employed methods have not been systematically investigated. To this end, a user study involving a needle insertion task was conducted, comparing different AR display techniques with a monitor-based approach as the baseline condition for the visualization of navigation information. A video see-through stationary display, an optical see-through head-mounted display and a spatial AR projector-camera system were investigated in this comparison. Results suggest advantages of using projected navigation information in terms of lower task completion time, lower angular deviation and affirmative subjective participant feedback. Techniques requiring an intermediate view on screens, i.e., the stationary display and the baseline condition, showed less favorable results. Thus, benefits of providing AR navigation information compared to a conventionally used method could be identified. The significant results on objective measures, together with the identified advantages and disadvantages of individual display techniques, contribute to the development and design of improved needle navigation systems.
11. Park BJ, Hunt SJ, Nadolski GJ, Gade TP. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: a phantom study using HoloLens 2. Sci Rep 2020; 10:18620. [PMID: 33122766] [PMCID: PMC7596500] [DOI: 10.1038/s41598-020-75676-4]
Abstract
Out-of-plane lesions pose challenges for CT-guided interventions. Augmented reality (AR) headsets can provide holographic 3D guidance to assist CT-guided targeting. A prospective trial was performed assessing CT-guided lesion targeting on an abdominal phantom with and without AR guidance using HoloLens 2. Eight operators performed a cumulative total of 86 needle passes. Total needle redirections, radiation dose, procedure time, and puncture rates of nontargeted lesions were compared with and without AR. The mean number of needle passes to reach the target decreased from 7.4 without AR to 3.4 with AR (p = 0.011). Mean CT dose index decreased from 28.7 mGy without AR to 16.9 mGy with AR (p = 0.009). Mean procedure time decreased from 8.93 min without AR to 4.42 min with AR (p = 0.027). The puncture rate of a nontargeted lesion decreased from 11.9% without AR (7/59 passes) to 0% with AR (0/27 passes). First needle passes were closer to the ideal target trajectory with AR than without (4.6° vs 8.0° offset, respectively, p = 0.018). AR reduced variability and elevated the performance of all operators to the same level irrespective of prior clinical experience. AR guidance can provide significant improvements in procedural efficiency and radiation dose savings when targeting out-of-plane lesions.
Affiliation(s)
- Brian J Park
- Oregon Health and Science University School of Medicine, 3181 SW Sam Jackson Park Rd, Portland, OR, 97239, USA
- Stephen J Hunt
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA, 19104, USA
- Gregory J Nadolski
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA, 19104, USA
- Terence P Gade
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA, 19104, USA
12. Park BJ, Hunt SJ, Martin C, Nadolski GJ, Wood BJ, Gade TP. Augmented and Mixed Reality: Technologies for Enhancing the Future of IR. J Vasc Interv Radiol 2020; 31:1074-1082. [PMID: 32061520] [DOI: 10.1016/j.jvir.2019.09.020]
Abstract
Augmented and mixed reality are emerging interactive and display technologies that can merge virtual objects, in either 2 or 3 dimensions, with the real world. Image guidance is the cornerstone of interventional radiology. With augmented or mixed reality, medical imaging can be more readily accessible or displayed in actual 3-dimensional space during procedures to enhance guidance at the times when this information is most needed. In this review, the current state of these technologies is addressed, followed by a fundamental overview of their inner workings and the challenges of 3-dimensional visualization. Finally, current and potential future applications in interventional radiology are highlighted.
Collapse
Affiliation(s)
- Brian J Park
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
- Stephen J Hunt
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
- Charles Martin
- Department of Interventional Radiology, Cleveland Clinic, Cleveland, Ohio
- Gregory J Nadolski
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
- Bradford J Wood
- Interventional Radiology, National Institutes of Health, Bethesda, Maryland
- Terence P Gade
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
13
Heinrich F, Schwenderling L, Becker M, Skalej M, Hansen C. HoloInjection: augmented reality support for CT-guided spinal needle injections. Healthc Technol Lett 2019; 6:165-171. [PMID: 32038851 PMCID: PMC6942927 DOI: 10.1049/htl.2019.0062]
Abstract
The correct placement of needles is decisive for the success of many minimally invasive interventions and therapies. These needle insertions are usually guided only by radiological imaging and can benefit from additional navigation support. Augmented reality (AR) is a promising tool to conveniently provide the needed information and may thus overcome the limitations of existing approaches. To this end, a prototypical AR application was developed to guide the insertion of needles to spinal targets using the mixed reality glasses Microsoft HoloLens. The system's registration accuracy was measured, and three guidance visualisation concepts were evaluated in a comparison study with respect to the achievable in-plane and out-of-plane needle orientation errors. Results suggested high registration accuracy and showed that the AR prototype is suitable for reducing out-of-plane orientation errors. Limitations, such as comparatively high in-plane orientation errors, effects of the viewing position, and missing image slices, indicate potential for improvement that needs to be addressed before transferring the application to clinical trials.
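The in-plane/out-of-plane error split evaluated in this abstract can be illustrated with a minimal sketch. This is not the authors' code; the function name and the particular decomposition (elevation relative to the image-slice normal for out-of-plane, projected angle for in-plane) are illustrative assumptions.

```python
import numpy as np

def orientation_errors(needle_dir, planned_dir, plane_normal):
    """Split the angular deviation of a tracked needle from a planned
    trajectory into in-plane and out-of-plane components (degrees).

    All inputs are 3D direction vectors; the guiding image plane is
    given by its normal.
    """
    n = needle_dir / np.linalg.norm(needle_dir)
    p = planned_dir / np.linalg.norm(planned_dir)
    k = plane_normal / np.linalg.norm(plane_normal)

    # Out-of-plane error: difference in elevation out of the image plane
    # between the needle and the planned trajectory.
    out_needle = np.degrees(np.arcsin(np.clip(np.dot(n, k), -1.0, 1.0)))
    out_planned = np.degrees(np.arcsin(np.clip(np.dot(p, k), -1.0, 1.0)))
    out_of_plane = out_needle - out_planned

    # In-plane error: angle between the two directions after projecting
    # both into the image plane.
    n_proj = n - np.dot(n, k) * k
    p_proj = p - np.dot(p, k) * k
    cos_a = np.dot(n_proj, p_proj) / (np.linalg.norm(n_proj) * np.linalg.norm(p_proj))
    in_plane = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return in_plane, out_of_plane
```

A needle tilted 10° within the image plane then yields a 10° in-plane error and zero out-of-plane error, matching the two quantities the study reports separately.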
Affiliation(s)
- Florian Heinrich
- Faculty of Computer Science, University of Magdeburg, Universitätsplatz 2, 39106 Magdeburg, Germany; Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany
- Luisa Schwenderling
- Faculty of Computer Science, University of Magdeburg, Universitätsplatz 2, 39106 Magdeburg, Germany; Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany
- Mathias Becker
- Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany; Department of Neuroradiology, University Hospital Magdeburg, Leipziger Strasse 44, 39120 Magdeburg, Germany
- Martin Skalej
- Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany; Department of Neuroradiology, University Hospital Magdeburg, Leipziger Strasse 44, 39120 Magdeburg, Germany
- Christian Hansen
- Faculty of Computer Science, University of Magdeburg, Universitätsplatz 2, 39106 Magdeburg, Germany; Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany
14
Beams R, Kim AS, Badano A. Transverse chromatic aberration in virtual reality head-mounted displays. Opt Express 2019; 27:24877-24884. [PMID: 31510369 DOI: 10.1364/oe.27.024877]
Abstract
We demonstrate a method for measuring the transverse chromatic aberration (TCA) in a virtual reality head-mounted display. The method relies on acquiring images of a digital bar pattern and measuring the displacement of different color bars. This procedure was used to characterize the TCAs in the Oculus Go, Oculus Rift, Samsung Gear, and HTC Vive. The results show noticeable TCAs for the Oculus devices for angles larger than 5° from the center of the field of view. TCA is less noticeable in the Vive in part due to off-axis monochromatic aberrations. Finally, user measurements were conducted, which were in excellent agreement with the laboratory results.
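The measurement principle described — image a digital bar pattern through the headset and read off the displacement between color channels — can be sketched as follows. The helper names and the arcminute-per-pixel calibration factor are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bar_centroid(channel, threshold=0.5):
    """Horizontal centroid (in pixels) of a single bright vertical bar
    in one color channel of the captured image."""
    mask = channel > threshold * channel.max()
    cols = np.nonzero(mask.any(axis=0))[0]
    weights = channel[:, cols].sum(axis=0)
    return np.average(cols, weights=weights)

def transverse_ca(img_rgb, arcmin_per_px):
    """Red-blue displacement of the same bar, converted to visual angle.

    img_rgb: H x W x 3 float array containing one bar; arcmin_per_px is
    the camera/display calibration at that field angle.
    """
    r = bar_centroid(img_rgb[..., 0])
    b = bar_centroid(img_rgb[..., 2])
    return (r - b) * arcmin_per_px
```

Repeating this for bars placed at increasing field angles would reproduce the kind of TCA-versus-eccentricity curve the study reports for each headset.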
15
Heinrich F, Joeres F, Lawonn K, Hansen C. Comparison of Projective Augmented Reality Concepts to Support Medical Needle Insertion. IEEE Trans Vis Comput Graph 2019; 25:2157-2167. [PMID: 30892210 DOI: 10.1109/tvcg.2019.2903942]
Abstract
Augmented reality (AR) is a promising tool to improve instrument navigation in needle-based interventions. Limited research has been conducted regarding suitable navigation visualizations. In this work, three navigation concepts based on existing approaches were compared in a user study using a projective AR setup. Each concept was implemented with three different scales for accuracy-to-color mapping and two methods of navigation indicator scaling. Participants were asked to perform simulated needle insertion tasks with each of the resulting 18 prototypes. Insertion angle and insertion depth accuracies were measured and analyzed, as well as task completion time and participants' subjectively perceived task difficulty. Results show a clear ranking of visualization concepts across variables. Less consistent results were obtained for the color and indicator scaling factors. Results suggest that logarithmic indicator scaling achieved better accuracy, but participants perceived it to be more difficult than linear scaling. With specific results for angle and depth accuracy, our study contributes to the future composition of improved navigation support and systems for precise needle insertion or similar applications.
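The linear and logarithmic indicator-scaling methods compared in this study can be contrasted in a minimal sketch. The 20 mm maximum error range and the normalization to [0, 1] are assumptions for illustration, not values from the paper.

```python
import math

def indicator_scale(error_mm, max_error_mm=20.0, mode="linear"):
    """Map the current needle placement error to a normalized indicator
    value in [0, 1].

    'linear' changes the indicator proportionally with the error;
    'log' compresses large errors, keeping the indicator sensitive near
    the target, where fine adjustments matter most.
    """
    e = min(max(error_mm, 0.0), max_error_mm)
    if mode == "linear":
        return e / max_error_mm
    if mode == "log":
        return math.log1p(e) / math.log1p(max_error_mm)
    raise ValueError(f"unknown mode: {mode}")
```

For a 2 mm error, the logarithmic mapping yields a noticeably larger indicator change than the linear one, which is consistent with the finding that logarithmic scaling achieved better accuracy while feeling more difficult to participants.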