1
Necker FN, Cholok DJ, Shaheen MS, Fischer MJ, Gifford K, Le Castillo C, Scholz M, Leuze CW, Daniel BL, Momeni A. Suture Packaging as a Marker for Intraoperative Image Alignment in Augmented Reality on Mobile Devices. Plast Reconstr Surg Glob Open 2024; 12:e5933. [PMID: 38919516] [PMCID: PMC11199004] [DOI: 10.1097/gox.0000000000005933]
Abstract
Preoperative vascular imaging has become standard practice in the planning of microsurgical breast reconstruction. Translating perforator locations from radiological findings to the patient's abdomen, however, is often neither easy nor intuitive. Techniques using three-dimensional printing or patient-specific guides have been introduced to superimpose anatomy onto the abdomen for reference. Augmented and mixed reality are being actively investigated for perforator mapping by superimposing virtual models directly onto the patient, but most techniques have found only limited adoption due to complexity and cost. Additionally, a critical step is aligning virtual models to the patient. We propose repurposing suture packaging as an image tracking marker. Tracking markers allow quick and easy alignment of virtual models to the individual patient's anatomy, whereas current techniques are often complicated or expensive and limit intraoperative use of augmented reality models. Suture packs are sterile, readily available, and can be used to align abdominal models on the patient. Using an iPad, the augmented reality models automatically align in the correct position by using a suture pack as a tracking marker. Given the ubiquity of iPads, combining these devices with readily available suture packs will predictably lower the barrier to entry and utilization of this technology. Here, our workflow is presented along with its intraoperative utilization. Additionally, we investigated the accuracy of this technology.
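The core of this workflow is ordinary planar-marker pose estimation: once the suture pack is located in the camera image, the CT-derived model can be rendered relative to it. The paper's iPad app presumably uses the platform's native image tracking; purely as an illustration of the underlying math, here is a minimal Python/OpenCV sketch in which the marker size, camera intrinsics, corner coordinates, and model vertex are all made-up values, and corner detection is assumed to have happened upstream:

```python
# Minimal sketch: estimate the pose of a planar tracking marker (e.g., a suture
# pack of known physical size) from its detected image corners, then map a
# CT-derived virtual model vertex into camera space for rendering.
import cv2
import numpy as np

# Physical marker corners in the marker's own frame (a 70 x 40 mm pack;
# illustrative size, not from the paper).
W, H = 0.070, 0.040  # meters
object_pts = np.array([[-W/2, -H/2, 0], [W/2, -H/2, 0],
                       [W/2, H/2, 0], [-W/2, H/2, 0]], dtype=np.float64)

# Detected marker corners in the image (pixels) and calibrated intrinsics.
image_pts = np.array([[512, 340], [690, 352], [684, 470], [508, 455]],
                     dtype=np.float64)
K = np.array([[1400.0, 0, 640], [0, 1400.0, 360], [0, 0, 1]])
dist = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)

# A vertex of the abdominal model, expressed in the marker frame (i.e.,
# registered once to the suture pack's position on the abdomen).
model_vertex_marker = np.array([0.02, 0.10, 0.05])
vertex_camera = R @ model_vertex_marker + tvec.ravel()  # where to render it
print(ok, vertex_camera)
```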
Affiliation(s)
- Fabian N. Necker
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, Calif
- Institute of Functional and Clinical Anatomy, Digital Anatomy Lab, Faculty of Medicine, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
- Department of Surgery, Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, Calif
- David J. Cholok
- Department of Surgery, Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, Calif
- Mohammed S. Shaheen
- Department of Surgery, Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, Calif
- Marc J. Fischer
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, Calif
- Kyle Gifford
- Department of Radiology, 3D and Quantitative Imaging, Stanford University School of Medicine, Stanford, Calif
- Chris Le Castillo
- Department of Radiology, 3D and Quantitative Imaging, Stanford University School of Medicine, Stanford, Calif
- Michael Scholz
- Institute of Functional and Clinical Anatomy, Digital Anatomy Lab, Faculty of Medicine, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
- Christoph W. Leuze
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, Calif
- Bruce L. Daniel
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, Calif
- Arash Momeni
- Department of Surgery, Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, Calif
2
Borde T, Saccenti L, Li M, Varble NA, Hazen LA, Kassin MT, Ukeh IN, Horton KM, Delgado JF, Martin C, Xu S, Pritchard WF, Karanian JW, Wood BJ. Smart goggles augmented reality CT-US fusion compared to conventional fusion navigation for percutaneous needle insertion. Int J Comput Assist Radiol Surg 2024. [PMID: 38814530] [DOI: 10.1007/s11548-024-03148-5]
Abstract
PURPOSE Targeting accuracy determines outcomes for percutaneous needle interventions. Augmented reality (AR) in interventional radiology (IR) may improve procedural guidance and facilitate access to complex locations. This study aimed to evaluate percutaneous needle placement accuracy using a goggle-based AR system compared to an ultrasound (US)-based fusion navigation system. METHODS Six interventional radiologists performed 24 independent needle placements in an anthropomorphic phantom (CIRS 057A) in four needle guidance cohorts (n = 6 each): (1) US-based fusion, (2) goggle-based AR with stereoscopically projected anatomy (AR-overlay), (3) goggle AR without the projection (AR-plain), and (4) CT-guided freehand. US-based fusion included US/CT registration with electromagnetic (EM) needle, transducer, and patient tracking. For AR-overlay, the US image, EM-tracked needle, and stereoscopic anatomical structures and targets were superimposed over the phantom. Needle placement accuracy (distance from needle tip to target center), placement time (from skin puncture to final position), and procedure time (time to completion) were measured. RESULTS Mean needle placement accuracy using US-based fusion, AR-overlay, AR-plain, and freehand was 4.5 ± 1.7 mm, 7.0 ± 4.7 mm, 4.7 ± 1.7 mm, and 9.2 ± 5.8 mm, respectively. AR-plain demonstrated accuracy comparable to US-based fusion (p = 0.7) and AR-overlay (p = 0.06). Excluding two outliers, AR-overlay accuracy improved to 5.9 ± 2.6 mm. US-based fusion had the longest mean placement time (44.3 ± 27.7 s) of all navigation cohorts (p < 0.001). The longest procedure times were recorded with AR-overlay (34 ± 10.2 min) compared to AR-plain (22.7 ± 8.6 min, p = 0.09), US-based fusion (19.5 ± 5.6 min, p = 0.02), and freehand (14.8 ± 1.6 min, p = 0.002). CONCLUSION Goggle-based AR showed no difference in needle placement accuracy compared to the commercially available US-based fusion navigation platform. Differences in accuracy and procedure times were apparent with different display modes (with/without stereoscopic projections). The AR-based projection of the US image and needle trajectory over the body may be a helpful tool to enhance visuospatial orientation. Thus, this study refines the potential role of AR for needle placements, which may serve as a catalyst for informed implementation of AR techniques in IR.
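The study's accuracy endpoint (Euclidean distance from needle tip to target center) and its pairwise cohort comparisons are simple to reproduce in outline. A minimal sketch with simulated coordinates (not study data), using Welch's t-test as a generic stand-in for the pairwise tests:

```python
# Minimal sketch: needle placement error as tip-to-target distance, compared
# across two guidance cohorts. All coordinates are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
target = np.array([60.0, 45.0, 80.0])  # target center in CT coordinates (mm)

def placement_errors(tips):
    """Distance from each recorded needle tip to the target center (mm)."""
    return np.linalg.norm(tips - target, axis=1)

# Six simulated placements per cohort (stand-ins for US-fusion and AR-plain).
tips_fusion = target + rng.normal(0, 2.5, size=(6, 3))
tips_ar = target + rng.normal(0, 2.7, size=(6, 3))

err_fusion = placement_errors(tips_fusion)
err_ar = placement_errors(tips_ar)

# Welch's t-test (unequal variances) as one plausible pairwise comparison.
t, p = stats.ttest_ind(err_fusion, err_ar, equal_var=False)
print(f"fusion {err_fusion.mean():.1f}±{err_fusion.std(ddof=1):.1f} mm, "
      f"AR {err_ar.mean():.1f}±{err_ar.std(ddof=1):.1f} mm, p={p:.2f}")
```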
Affiliation(s)
- Tabea Borde
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Laetitia Saccenti
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Henri Mondor Biomedical Research Institute, Inserm U955, Team N°18, Créteil, France
- Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Nicole A Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Philips Healthcare, Cambridge, MA, 02141, USA
- Lindsey A Hazen
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Michael T Kassin
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Ifechi N Ukeh
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Keith M Horton
- Department of Radiology, Georgetown Medical School, Medstar Washington Hospital Center, Washington, DC, 20007, USA
- Jose F Delgado
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Fischell Department of Bioengineering, University of Maryland, College Park, MD, 20742, USA
- Charles Martin
- Department of Interventional Radiology, Cleveland Clinic, Cleveland, OH, 44195, USA
- Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- William F Pritchard
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- John W Karanian
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, 10 Center Drive, Room 3N320, MSC 1182, Bethesda, MD, 20892, USA
- Fischell Department of Bioengineering, University of Maryland, College Park, MD, 20742, USA
3
Bamps K, Bertels J, Minten L, Puvrez A, Coudyzer W, De Buck S, Ector J. Phantom study of augmented reality framework to assist epicardial punctures. J Med Imaging (Bellingham) 2024; 11:035002. [PMID: 38817712] [PMCID: PMC11135927] [DOI: 10.1117/1.jmi.11.3.035002]
Abstract
Purpose The objective of this study is to evaluate the accuracy of an augmented reality (AR) system in improving guidance, accuracy, and visualization during the subxiphoidal approach for epicardial ablation. Approach An AR application was developed to project real-time needle trajectories and patient-specific 3D organs using the HoloLens 2. Additionally, needle tracking was implemented to offer real-time feedback to the operator, facilitating needle navigation. The AR application was evaluated in three experiments: overlay accuracy, puncture accuracy, and a pre-clinical evaluation on a phantom. Results Overlay accuracy of the AR system was 2.36 ± 2.04 mm, and puncture accuracy using the AR system was 1.02 ± 2.41 mm. In the pre-clinical phantom evaluation, needle puncture error was 7.43 ± 2.73 mm with AR guidance versus 22.62 ± 9.37 mm without AR guidance. Conclusions Overall, the AR platform has the potential to enhance the accuracy and safety of percutaneous epicardial access for mapping and ablation of cardiac arrhythmias, thereby reducing complications and improving patient outcomes.
Affiliation(s)
- Kobe Bamps
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
- KU Leuven, ESAT-PSI, Leuven, Belgium
- Lennert Minten
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
- Alexis Puvrez
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
- Stijn De Buck
- KU Leuven, ESAT-PSI, Leuven, Belgium
- KU Leuven, Department of Imaging and Pathology, Leuven, Belgium
- Joris Ector
- KU Leuven, Department of Cardiovascular Sciences, Leuven, Belgium
4
Sharma N, Mallela AN, Khan T, Canton SP, Kass NM, Steuer F, Jardini J, Biehl J, Andrews EG. Evolution of the meta-neurosurgeon: A systematic review of the current technical capabilities, limitations, and applications of augmented reality in neurosurgery. Surg Neurol Int 2024; 15:146. [PMID: 38742013] [PMCID: PMC11090549] [DOI: 10.25259/sni_167_2024]
Abstract
Background Augmented reality (AR) applications in neurosurgery have expanded over the past decade with the introduction of headset-based platforms. Many studies have focused either on preoperative planning, tailoring the approach to the patient's anatomy and pathology, or on intraoperative surgical navigation, primarily realized as AR navigation through microscope oculars. Additional efforts have been made to validate AR in trainee and patient education and to investigate novel surgical approaches. Our objective was to provide a systematic overview of AR in neurosurgery, describe the current limitations of this technology, and highlight several of its applications. Methods We performed a literature search in PubMed/Medline to identify papers that addressed the use of AR in neurosurgery. The authors screened 375 papers; 57 were selected, analyzed, and included in this systematic review. Results AR has made significant inroads in neurosurgery, particularly in neuronavigation. In spinal neurosurgery, this has primarily been used for pedicle screw placement. AR-based neuronavigation also has significant applications in cranial neurosurgery, including neurovascular surgery, neurosurgical oncology, and skull base neurosurgery. Other potential applications include operating room streamlining, trainee and patient education, and telecommunications. Conclusion AR has already made a significant impact in neurosurgery in the above domains and has the potential to be a paradigm-altering technology. Future development should focus on both validating these applications and extending the role of AR.
Affiliation(s)
- Nikhil Sharma
- School of Medicine, University of Pittsburgh, Pittsburgh, United States
- Arka N. Mallela
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, United States
- Talha Khan
- Department of Computing and Information, University of Pittsburgh, Pittsburgh, United States
- Stephen Paul Canton
- Department of Orthopaedic Surgery, University of Pittsburgh Medical Center, Pittsburgh, United States
- Fritz Steuer
- School of Medicine, University of Pittsburgh, Pittsburgh, United States
- Jacquelyn Jardini
- Department of Biology, Haverford College, Haverford, Pennsylvania, United States
- Jacob Biehl
- Department of Computing and Information, University of Pittsburgh, Pittsburgh, United States
- Edward G. Andrews
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, United States
5
Lee KH, Li M, Varble N, Negussie AH, Kassin MT, Arrichiello A, Carrafiello G, Hazen LA, Wakim PG, Li X, Xu S, Wood BJ. Smartphone Augmented Reality Outperforms Conventional CT Guidance for Composite Ablation Margins in Phantom Models. J Vasc Interv Radiol 2024; 35:452-461.e3. [PMID: 37852601] [DOI: 10.1016/j.jvir.2023.10.005]
Abstract
PURPOSE To develop and evaluate a smartphone augmented reality (AR) system for ablation of a large (50-mm) liver tumor, with treatment planning for composite overlapping ablation zones. MATERIALS AND METHODS A smartphone AR application was developed to display the tumor, probe, projected probe paths, ablated zones, and the real-time percentage of the target tumor volume ablated. Fiducial markers were attached to phantoms and to an ablation probe hub for tracking. The system was evaluated with tissue-mimicking thermochromic phantoms and gel phantoms. Four interventional radiologists each performed 2 trials of 3 probe insertions per trial using AR guidance versus computed tomography (CT) guidance in 2 gel phantoms. Insertion points and optimal probe paths were predetermined. On Gel Phantom 2, serial ablated zones were saved and continuously displayed after each probe placement/adjustment, enabling feedback and iterative planning. The percentages of tumor ablated for AR guidance versus CT guidance, and with versus without display of recorded ablated zones, were compared among interventional radiologists with pairwise t-tests. RESULTS Mean percentages of tumor ablated were 36% ± 7 for CT freehand guidance and 47% ± 4 for AR guidance (P = .004). Mean composite percentages of tumor ablated for AR guidance were 43% ± 1 without and 50% ± 2 with display of recorded ablation zones (P = .033). There was no strong correlation between AR-guided percentage of ablation and years of experience (r < 0.5), whereas there was a strong correlation between CT-guided percentage of ablation and years of experience (r > 0.9). CONCLUSIONS A smartphone AR guidance system for dynamic iterative ablation of a large (50-mm) liver tumor was accurate, performed better than conventional CT guidance, especially for less experienced interventional radiologists, and promoted more standardized performance across experience levels.
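The application's headline feature, a live percentage of the target tumor volume covered by the union of overlapping ablation zones, can be approximated on a voxel grid. A minimal sketch with illustrative geometry (sphere centers, radii, and grid resolution are invented, not from the study):

```python
# Minimal sketch: percentage of a spherical 50-mm tumor covered by the union
# of overlapping ablation zones, accumulated after each probe placement.
import numpy as np

res = 1.0  # mm per voxel
grid = np.arange(-40, 41, res)
X, Y, Z = np.meshgrid(grid, grid, grid, indexing="ij")

def sphere(center, radius):
    cx, cy, cz = center
    return (X - cx)**2 + (Y - cy)**2 + (Z - cz)**2 <= radius**2

tumor = sphere((0, 0, 0), 25.0)  # 50-mm tumor

# Three overlapping ablation zones (e.g., 30-mm zones from sequential probe
# placements), unioned after each insertion to mimic the iterative display.
ablated = np.zeros_like(tumor)
for center in [(-10, 0, 0), (10, 8, 0), (5, -10, 5)]:
    ablated |= sphere(center, 15.0)
    pct = 100.0 * np.count_nonzero(tumor & ablated) / np.count_nonzero(tumor)
    print(f"after zone at {center}: {pct:.1f}% of tumor ablated")
```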
Affiliation(s)
- Katerina H Lee
- McGovern Medical School at UTHealth, Houston, Texas; Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Ming Li
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Nicole Varble
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland; Philips Research North America, Cambridge, Massachusetts
- Ayele H Negussie
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Michael T Kassin
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Antonio Arrichiello
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Gianpaolo Carrafiello
- Department of Radiology, Foundation IRCCS Ca' Granda Ospedale Maggiore Policlinico, University of Milan, Milan, Italy
- Lindsey A Hazen
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Paul G Wakim
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Sheng Xu
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Bradford J Wood
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
6
Lui C, Polster R, Bullen J, Baqui Z, Ilaslan H, Neill M, Simpfendorfer C, Altahawi F, Polster J. Smartphone application with 3D-printed needle guide for faster and more accurate CT-guided interventions in a phantom. Skeletal Radiol 2024; 53:567-573. [PMID: 37725165] [DOI: 10.1007/s00256-023-04453-x]
Abstract
OBJECTIVE To determine whether a needle guidance device combining a 3D-printed component with a smartphone decreases the number of passes and the time required to perform a standard CT-guided needle procedure in a phantom study. MATERIALS AND METHODS A 3D-printed mechanical guide with built-in apertures for various needle sizes was designed and printed. It was mounted on a smartphone and used to direct commercially available spring-loaded biopsy devices. A smartphone software application was developed that uses the phone's sensors to provide the real-time location of a lesion in space, based on parameters derived from preprocedural CT images. The physical linkage of guide, smartphone, and needle allowed the operator to manipulate the assembly as a single unit, with a real-time graphical representation of the lesion shown on the smartphone display. Two radiology trainees and three staff radiologists targeted 5 lesions with and without the device (50 total procedures). The number of passes and the time taken to reach each lesion were recorded. RESULTS Use of the smartphone needle guide decreased the mean number of passes (with guide, 1.8; without guide, 3.4; P < 0.001) and the mean time taken (with guide, 1.6 min; without guide, 2.7 min; P = 0.005) to perform a standard CT-guided procedure. On average, the decreases in number of passes and procedure time were more pronounced among trainees (P < 0.001). CONCLUSION The combination of a mechanical guide and smartphone can reduce the number of needle passes and the time needed to reach a lesion in a phantom, for both trainees and experienced radiologists.
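At its core, such a guide must turn a preprocedural CT target into an insertion depth and angulation at the chosen entry point. The paper does not publish the app's internal math, so the decomposition below is a generic sketch with illustrative coordinates:

```python
# Minimal sketch: planning parameters a needle guide derives from CT, namely
# insertion depth and trajectory angles from entry point to lesion.
import numpy as np

entry = np.array([120.0, 85.0, 40.0])   # skin entry point (mm, CT frame)
lesion = np.array([95.0, 130.0, 15.0])  # lesion center (mm, CT frame)

v = lesion - entry
depth = np.linalg.norm(v)               # how far to advance the needle (mm)

# Decompose the trajectory into an in-plane angle and an out-of-plane
# (craniocaudal) tilt relative to the axial CT plane (z = table axis here).
in_plane = np.degrees(np.arctan2(v[1], v[0]))
out_of_plane = np.degrees(np.arcsin(v[2] / depth))

print(f"depth {depth:.1f} mm, in-plane {in_plane:.1f} deg, "
      f"out-of-plane tilt {out_of_plane:.1f} deg")
```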
Affiliation(s)
- Christopher Lui
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Rylan Polster
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Jennifer Bullen
- Quantitative Health Sciences, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Zeeshan Baqui
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Hakan Ilaslan
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Matthew Neill
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Claus Simpfendorfer
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Faysal Altahawi
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
- Joshua Polster
- Imaging Institute, Cleveland Clinic, 9500 Euclid Ave, Cleveland, OH, 44195, USA
7
Charalampopoulos G, Bale R, Filippiadis D, Odisio BC, Wood B, Solbiati L. Navigation and Robotics in Interventional Oncology: Current Status and Future Roadmap. Diagnostics (Basel) 2023; 14:98. [PMID: 38201407] [PMCID: PMC10795729] [DOI: 10.3390/diagnostics14010098]
Abstract
Interventional oncology (IO) is the field of interventional radiology that provides minimally invasive procedures under imaging guidance for the diagnosis and treatment of malignant tumors. Sophisticated devices can be utilized to increase standardization, accuracy, outcomes, and repeatability in performing percutaneous IO techniques. These technologies can reduce variability and human error and can outperform human hand-eye coordination and spatial reasoning, thus potentially normalizing an otherwise broad diversity of IO techniques, with impact on simulation, training, navigation, outcomes, and performance, as well as on verification of the desired minimum ablation margin or other measures of procedural success. Stereotactic navigation and robotic systems may yield specific advantages, such as the potential to reduce procedure duration and ionizing radiation exposure while at the same time increasing accuracy. Enhanced accuracy, in turn, is linked to improved outcomes in many clinical scenarios. The present review focuses on the current role of percutaneous navigation systems and robotics in diagnostic and therapeutic IO procedures. The currently available alternatives are presented, including their potential impact on clinical practice as reflected in the peer-reviewed medical literature. A review of such data may inform wiser investment of time and resources toward the most impactful IR/IO applications of robotics and navigation, both to standardize techniques and to address unmet clinical needs.
Affiliation(s)
- Georgios Charalampopoulos
- 2nd Department of Radiology, University General Hospital “ATTIKON”, Medical School, National and Kapodistrian University of Athens, 1 Rimini Str, 12462 Athens, Greece
- Reto Bale
- Interventional Oncology/Stereotaxy and Robotics, Department of Radiology, Medical University of Innsbruck, 6020 Innsbruck, Austria
- Dimitrios Filippiadis
- 2nd Department of Radiology, University General Hospital “ATTIKON”, Medical School, National and Kapodistrian University of Athens, 1 Rimini Str, 12462 Athens, Greece
- Bruno C. Odisio
- Department of Interventional Radiology, The University of Texas MD Anderson Cancer Center, Houston, TX 77030, USA
- Bradford Wood
- Interventional Radiology and Center for Interventional Oncology, NIH Clinical Center and National Cancer Institute, National Institutes of Health, Bethesda, MD 20892, USA
- Luigi Solbiati
- Department of Radiology, IRCCS Humanitas Research Hospital, Rozzano (Milano), Italy, and Department of Biomedical Sciences, Humanitas University, Pieve Emanuele (Milano), 20072 Milano, Italy
8
Duan X, He R, Jiang Y, Cui F, Wen H, Chen X, Hao Z, Zeng Y, Liu H, Shi J, Cheong H, Dong M, U K, Jiang S, Wang W, Liang H, Liu J, He J. Robot-assisted navigation for percutaneous localization of peripheral pulmonary nodule: an in vivo swine study. Quant Imaging Med Surg 2023; 13:8020-8030. [PMID: 38106331] [PMCID: PMC10721995] [DOI: 10.21037/qims-23-716]
Abstract
Background Robot-assisted surgery (RAS) systems have been developed but rarely applied to lung nodule localization. This study aimed to assess the feasibility and safety of a robot-assisted navigation system for percutaneous lung nodule localization. Methods A computed tomography (CT)-guided robot-assisted navigation system was used to localize simulated peripheral nodules in swine lungs by fluorescent agent injection. After localization, fluorescence-guided thoracoscopic wedge resection was performed. The deviation between the target point and the needle tip was measured using professional 3-dimensional (3D) distance measurement software. The primary outcome was localization accuracy (deviation). The secondary outcomes were the localization-related complication rate, localization duration, and success rate. Results A total of 4 pigs were enrolled, and 20 peripheral lung nodules were created and localized successfully. All nodules underwent subsequent wedge resection for verification. The mean 3D deviation was 3.81 mm (standard deviation [SD]: 1.29 mm; 95% confidence interval [CI]: 2.936-4.536 mm). The technical success rate for localization was 100%, and the mean localization time was 14.69 minutes (SD: 4.67 minutes). The complication rate was 5% (1/20), with 1 pneumothorax after localization; no mortality occurred. Conclusions This pilot animal study demonstrated the promising potential of the robot-assisted navigation technique for peripheral lung nodule localization, with high accuracy and feasibility. Further clinical trials are needed to validate its safety compared to traditional manual localization.
Affiliation(s)
- Xingguang Duan
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- School of Medical Technology, Beijing Institute of Technology, Beijing, China
- Rui He
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Yu Jiang
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Fei Cui
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Hao Wen
- School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
- Zhexue Hao
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Yuan Zeng
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Hui Liu
- Department of Anesthesia, the First Affiliated Hospital of Guangzhou Medical University, Guangzhou, China
- Jipeng Shi
- True Health Medical Technology Co. Ltd., Hengqin, China
- Houiam Cheong
- True Health Medical Technology Co. Ltd., Hengqin, China
- Mengxing Dong
- True Health Medical Technology Co. Ltd., Hengqin, China
- Shunjun Jiang
- Department of Pharmacology, the First Affiliated Hospital of Guangzhou Medical University, Guangzhou, China
- Wei Wang
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Hengrui Liang
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Jun Liu
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Jianxing He
- Department of Thoracic Surgery and Oncology, the First Affiliated Hospital of Guangzhou Medical University, State Key Laboratory of Respiratory Disease, National Clinical Research Center for Respiratory Disease, Guangzhou Institute of Respiratory Health, Guangzhou, China
- Southern Medical University, Guangzhou, China
9
Albano D, Messina C, Gitto S, Chianca V, Sconfienza LM. Bone biopsies guided by augmented reality: a pilot study. Eur Radiol Exp 2023; 7:40. [PMID: 37468652] [PMCID: PMC10356701] [DOI: 10.1186/s41747-023-00353-w]
Abstract
PURPOSE To test the technical feasibility of an augmented reality (AR) navigation system to guide bone biopsies. METHODS We enrolled patients undergoing percutaneous computed tomography (CT)-guided bone biopsy with a novel AR navigation system. Data from prospectively enrolled patients (AR group) were compared with data obtained retrospectively from previous standard CT-guided bone biopsies (control group). We evaluated procedure duration, number of CT passes, patient radiation dose (dose-length product), complications, and specimen adequacy. Technical success was defined as the ability to complete the procedure as planned, reaching the target center. Technical efficacy was assessed by evaluating specimen adequacy. RESULTS Eight patients (4 males) aged 58 ± 24 years (mean ± standard deviation) were enrolled in the AR group and compared with 8 controls (4 males) aged 60 ± 15 years. No complications were observed. Procedure duration, number of CT passes, and radiation dose were 22 ± 5 min, a median of 4 (interquartile range, 4-6), and 1,034 ± 672 mGy·cm for the AR group, and 23 ± 5 min, 9 (7.75-11.25), and 1,954 ± 993 mGy·cm for controls, respectively. No significant difference was observed for procedure duration (p = 0.878). Conversely, the number of CT passes and radiation dose were significantly lower for the AR group (p < 0.001 and p = 0.021, respectively). Technical success and technical efficacy were 100% for both groups. CONCLUSIONS This AR navigation system is safe, feasible, and effective; it can decrease radiation exposure and the number of CT passes during bone biopsies without increasing procedure duration. RELEVANCE STATEMENT This augmented reality (AR) navigation system is a safe and feasible guide for bone biopsies; it may decrease the number of CT passes and the patient's radiation dose. KEY POINTS • This AR navigation system is a safe guide for bone biopsies. • It decreases the number of CT passes and the patient's radiation exposure. • Procedure duration was similar to that of standard CT-guided biopsy. • Technical success was 100%, as the target was reached in all patients. • Technical efficacy was 100%, as the specimen was adequate in all patients.
Affiliation(s)
- Carmelo Messina
- IRCCS Istituto Ortopedico Galeazzi, Milan, 20161, Italy
- Dipartimento di Scienze Biomediche per la Salute, Università degli Studi di Milano, Milan, 20122, Italy
- Salvatore Gitto
- IRCCS Istituto Ortopedico Galeazzi, Milan, 20161, Italy
- Dipartimento di Scienze Biomediche per la Salute, Università degli Studi di Milano, Milan, 20122, Italy
- Vito Chianca
- Clinica Di Radiologia EOC IIMSI, Lugano, Switzerland
- Ospedale Evangelico Betania, Via Argine 604, Naples, 80147, Italy
- Luca Maria Sconfienza
- IRCCS Istituto Ortopedico Galeazzi, Milan, 20161, Italy
- Dipartimento di Scienze Biomediche per la Salute, Università degli Studi di Milano, Milan, 20122, Italy
10
Bruners P. [CT-guided local ablative interventions]. Radiologie (Heidelb) 2023. [PMID: 37306751] [DOI: 10.1007/s00117-023-01164-1]
Abstract
BACKGROUND Applicator-based local ablation under computed tomography (CT) guidance for the treatment of malignant tumors has found its way into clinical routine. OBJECTIVES The basic principles of the different ablation technologies and their specific clinical fields of application are described. MATERIALS AND METHODS A comprehensive literature review of applicator-based ablation techniques was carried out. RESULTS Radiofrequency ablation (RFA) and microwave ablation (MWA) are two image-guided hyperthermic treatment modalities established for the treatment of primary and secondary liver malignancies. Both techniques are also applied for local ablative therapy of lung and kidney tumors. Cryoablation is mainly used for local ablation of T1 kidney cancer and, owing to its intrinsic analgesic characteristics, for application in the musculoskeletal system. Nonresectable pancreatic tumors and centrally located liver malignancies can be treated with irreversible electroporation, a nonthermal ablation modality that preserves the structure of the extracellular matrix, including blood vessels and ducts. Technical advancements in the field of CT-guided interventions include the use of robotics, various tracking and navigation technologies, and augmented reality, with the goals of higher precision, shorter intervention times, and thereby reduced radiation exposure. CONCLUSION Percutaneous ablation techniques under CT guidance are an essential part of interventional radiology and are suited for local treatment of malignancies in most organ systems.
Affiliation(s)
- Philipp Bruners
- Klinik für Diagnostische und Interventionelle Radiologie, Universitätsklinik RWTH Aachen, Pauwelsstraße 30, 52074, Aachen, Germany
11
Suresh D, Aydin A, James S, Ahmed K, Dasgupta P. The Role of Augmented Reality in Surgical Training: A Systematic Review. Surg Innov 2023; 30:366-382. [PMID: 36412148] [PMCID: PMC10331622] [DOI: 10.1177/15533506221140506]
Abstract
This review aims to provide an update on the role of augmented reality (AR) in surgical training and to investigate whether the use of AR improves performance measures in surgical trainees compared to traditional approaches. PUBMED, EMBASE, Google Scholar, Cochrane Library, British Library and Science Direct were searched following PRISMA guidelines. All English-language original studies pertaining to AR in surgical training were eligible for inclusion. Qualitative analysis was performed and results were categorised according to simulator models, which were then evaluated using Messick's framework for validity and McGaghie's translational outcomes for simulation-based learning. Of the 1132 results retrieved, 45 were included in the study. 29 platforms were identified, with the highest 'level of effectiveness' recorded as 3. In terms of validity parameters, 10 AR models received a strong 'content validity' score of 2, and 15 models had a 'response processes' score ≥ 1. 'Internal structure' and 'consequences' were largely not discussed. 'Relations to other variables' was the best-assessed criterion, with 9 platforms achieving a high score of 2. Overall, the Microsoft HoloLens received the highest level of recommendation for both validity and level of effectiveness. Augmented reality in surgical education is feasible and effective as an adjunct to traditional training. The Microsoft HoloLens has shown the most promising results across all parameters and produced improved performance measures in surgical trainees. For the other simulator models, further research with stronger study designs is required to validate the use of AR in surgical training.
Affiliation(s)
- Dhivya Suresh
- Guy’s, King’s and St Thomas’ School of Medical Education, King’s College London, London, UK
- Abdullatif Aydin
- MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
- Stuart James
- Department of General Surgery, Princess Royal University Hospital, London, UK
- Kamran Ahmed
- MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
- Prokar Dasgupta
- MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, UK
12
Hayasaka T, Kawano K, Onodera Y, Suzuki H, Nakane M, Kanoto M, Kawamae K. Comparison of accuracy between augmented reality/mixed reality techniques and conventional techniques for epidural anesthesia using a practice phantom model kit. BMC Anesthesiol 2023; 23:171. [PMID: 37210521] [DOI: 10.1186/s12871-023-02133-w]
Abstract
BACKGROUND This study used an epidural anesthesia practice kit (model) to evaluate the accuracy of epidural anesthesia performed with standard (blind) techniques versus augmented/mixed reality technology, and to assess whether visualization using augmented/mixed reality technology would facilitate epidural anesthesia. METHODS This study was conducted at Yamagata University Hospital (Yamagata, Japan) between February and June 2022. Thirty medical students with no experience in epidural anesthesia were randomly divided into augmented reality (-), augmented reality (+), and semi-augmented reality groups, with 10 students in each group. Epidural anesthesia was performed using the paramedian approach with an epidural anesthesia practice kit. The augmented reality (-) group performed epidural anesthesia without HoloLens 2 and the augmented reality (+) group with HoloLens 2. The semi-augmented reality group performed epidural anesthesia without HoloLens 2 after 30 s of spinal image construction using HoloLens 2. The distance between the ideal needle insertion point at the epidural space and the participant's insertion point was compared. RESULTS Four medical students in the augmented reality (-) group, zero in the augmented reality (+) group, and one in the semi-augmented reality group failed to insert the needle into the epidural space. The epidural space puncture point distances for the augmented reality (-), augmented reality (+), and semi-augmented reality groups were 8.7 (5.7-14.3) mm, 3.5 (1.8-8.0) mm (P = 0.017), and 4.9 (3.2-5.9) mm (P = 0.027), respectively; significant differences were observed relative to the augmented reality (-) group. CONCLUSIONS Augmented/mixed reality technology has the potential to contribute significantly to the improvement of epidural anesthesia techniques.
Affiliation(s)
- Tatsuya Hayasaka
- Department of Anesthesiology, Yamagata University Hospital, 2-2-2 Iidanishi, Yamagata City, Yamagata, 990-9585, Japan
- Kazuharu Kawano
- Department of Medicine, Yamagata University School of Medicine, Yamagata, Japan
- Yu Onodera
- Critical Care Center, Yamagata University Hospital, Yamagata, Japan
- Hiroto Suzuki
- Critical Care Center, Yamagata University Hospital, Yamagata, Japan
- Masaki Nakane
- Department of Emergency and Critical Care Medicine, Yamagata University Hospital, Yamagata, Japan
- Masafumi Kanoto
- Department of Radiology, Division of Diagnostic Radiology, Yamagata University Hospital, Yamagata, Japan
- Kaneyuki Kawamae
- Department of Anesthesiology, Yamagata University Hospital, 2-2-2 Iidanishi, Yamagata City, Yamagata, 990-9585, Japan
13
Chiou SY, Liu LS, Lee CW, Kim DH, Al-masni MA, Liu HL, Wei KC, Yan JL, Chen PY. Augmented Reality Surgical Navigation System Integrated with Deep Learning. Bioengineering (Basel) 2023; 10:617. [PMID: 37237687] [PMCID: PMC10215407] [DOI: 10.3390/bioengineering10050617]
Abstract
Most current surgical navigation methods rely on optical navigators with images displayed on an external screen. However, minimizing distractions during surgery is critical, and the spatial information displayed in this arrangement is non-intuitive. Previous studies have proposed combining optical navigation systems with augmented reality (AR) to provide surgeons with intuitive planar and three-dimensional imagery during surgery. However, these studies have mainly focused on visual aids and have paid relatively little attention to real surgical guidance aids. Moreover, the use of augmented reality reduces system stability and accuracy, and optical navigation systems are costly. This paper therefore proposes an augmented reality surgical navigation system based on image positioning that achieves the desired system advantages with low cost, high stability, and high accuracy. The system also provides intuitive guidance for the surgical target point, entry point, and trajectory. Once the surgeon uses the navigation stick to indicate the position of the surgical entry point, the connection between the surgical target and the surgical entry point is immediately displayed on the AR device (tablet or HoloLens glasses), and a dynamic auxiliary line is shown to assist with incision angle and depth. Clinical trials were conducted for extra-ventricular drainage (EVD) surgery, and surgeons confirmed the system's overall benefit. A "virtual object automatic scanning" method is proposed that achieves a high accuracy of 1 ± 0.1 mm for the AR-based system. Furthermore, a deep learning-based U-Net segmentation network is incorporated to enable automatic identification of the hydrocephalus location by the system. The system achieves recognition accuracy, sensitivity, and specificity of 99.93%, 93.85%, and 95.73%, respectively, representing a significant improvement over previous studies.
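The reported recognition accuracy, sensitivity, and specificity are standard confusion-matrix metrics over segmentation masks. A minimal sketch with a synthetic mask pair (not the study's data or its U-Net):

```python
# Minimal sketch: voxel-wise accuracy, sensitivity, and specificity of a
# predicted segmentation mask against ground truth.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.random((64, 64)) > 0.7        # stand-in ground-truth mask
pred = truth.copy()
flip = rng.random((64, 64)) > 0.98        # a few simulated errors
pred[flip] = ~pred[flip]

tp = np.count_nonzero(pred & truth)       # true positives
tn = np.count_nonzero(~pred & ~truth)     # true negatives
fp = np.count_nonzero(pred & ~truth)      # false positives
fn = np.count_nonzero(~pred & truth)      # false negatives

accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)              # recall on the target voxels
specificity = tn / (tn + fp)
print(f"acc {accuracy:.4f}, sens {sensitivity:.4f}, spec {specificity:.4f}")
```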
Affiliation(s)
- Shin-Yan Chiou
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Nuclear Medicine, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Li-Sheng Liu
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Chia-Wei Lee
- Department of Electrical Engineering, College of Engineering, Chang Gung University, Kwei-Shan, Taoyuan 333, Taiwan
- Dong-Hyun Kim
- Department of Electrical and Electronic Engineering, College of Engineering, Yonsei University, Seodaemun-gu, Seoul 03722, Republic of Korea
- Mohammed A. Al-masni
- Department of Artificial Intelligence, College of Software & Convergence Technology, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
- Hao-Li Liu
- Department of Electrical Engineering, National Taiwan University, Taipei 106, Taiwan
- Kuo-Chen Wei
- New Taipei City Tucheng Hospital, Tao-Yuan, Tucheng, New Taipei City 236, Taiwan
- Jiun-Lin Yan
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
- Pin-Yuan Chen
- Department of Neurosurgery, Keelung Chang Gung Memorial Hospital, Keelung 204, Taiwan
14
Li M, Mehralivand S, Xu S, Varble N, Bakhutashvili I, Gurram S, Pinto PA, Choyke PL, Wood BJ, Turkbey B. HoloLens augmented reality system for transperineal free-hand prostate procedures. J Med Imaging (Bellingham) 2023; 10:025001. [PMID: 36875636] [PMCID: PMC9976411] [DOI: 10.1117/1.jmi.10.2.025001]
Abstract
Purpose An augmented reality (AR) system was developed to facilitate free-hand real-time needle guidance for transperineal prostate (TP) procedures and to overcome the limitations of a traditional guidance grid. Approach The HoloLens AR system enables the superimposition of annotated anatomy derived from preprocedural volumetric images onto a patient and addresses the most challenging part of free-hand TP procedures by providing real-time needle tip localization and needle depth visualization during insertion. The AR system accuracy, or image overlay accuracy (n = 56), and needle targeting accuracy (n = 24) were evaluated within a 3D-printed phantom. Three operators each used a planned-path guidance method (n = 4) and free-hand guidance (n = 4) to guide needles into targets in a gel phantom. Placement error was recorded. The feasibility of the system was further evaluated by delivering soft tissue markers into tumors of an anthropomorphic pelvic phantom via the perineum. Results The image overlay error was 1.29 ± 0.57 mm, and needle targeting error was 2.13 ± 0.52 mm. The planned-path guidance placements showed similar error compared to the free-hand guidance (4.14 ± 1.08 mm versus 4.20 ± 1.08 mm, p = 0.90). The markers were successfully implanted either into or in close proximity to the target lesion. Conclusions The HoloLens AR system can provide accurate needle guidance for TP interventions. AR support for free-hand lesion targeting is feasible and may provide more flexibility than grid-based methods, due to the real-time 3D and immersive experience during free-hand TP procedures.
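The paper does not state which registration algorithm underlies its overlay-accuracy numbers; a common choice for aligning planned fiducials with their tracked positions is the Kabsch (Horn) rigid transform. A minimal sketch with simulated fiducials:

```python
# Minimal sketch: solve the rigid transform between planned (CT) and observed
# (headset-tracked) fiducial positions with the Kabsch algorithm, then report
# the residual fiducial registration error. All positions are simulated.
import numpy as np

def kabsch(P, Q):
    """Rigid (R, t) minimizing sum ||R @ p + t - q||^2 over point pairs."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

rng = np.random.default_rng(2)
ct_fiducials = rng.uniform(0, 100, size=(6, 3))             # planned (mm)
observed = ct_fiducials + np.array([5.0, -3.0, 2.0])        # shifted in space
observed += rng.normal(0, 0.5, observed.shape)              # tracking noise

R, t = kabsch(ct_fiducials, observed)
residual = np.linalg.norm(ct_fiducials @ R.T + t - observed, axis=1)
print(f"fiducial registration error {residual.mean():.2f} mm")
```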
Affiliation(s)
- Ming Li
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Sherif Mehralivand
- National Institutes of Health, Molecular Imaging Branch, National Cancer Institute, Bethesda, Maryland, United States
- Sheng Xu
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Nicole Varble
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Philips Research of North America, Cambridge, Massachusetts, United States
- Ivane Bakhutashvili
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Sandeep Gurram
- National Institutes of Health, Urologic Oncology Branch, National Cancer Institute, Bethesda, Maryland, United States
- Peter A. Pinto
- National Institutes of Health, Urologic Oncology Branch, National Cancer Institute, Bethesda, Maryland, United States
- Peter L. Choyke
- National Institutes of Health, Molecular Imaging Branch, National Cancer Institute, Bethesda, Maryland, United States
- Bradford J. Wood
- National Institutes of Health, Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, Bethesda, Maryland, United States
- Baris Turkbey
- National Institutes of Health, Molecular Imaging Branch, National Cancer Institute, Bethesda, Maryland, United States
15
Out-of-Plane Needle Placements Using 3D Augmented Reality Protractor on Smartphone: An Experimental Phantom Study. Cardiovasc Intervent Radiol 2023; 46:675-679. [PMID: 36658373] [DOI: 10.1007/s00270-023-03357-6]
Abstract
PURPOSE To evaluate the accuracy of needle placement using a three-dimensional (3D) augmented reality (AR) protractor on smartphones (AR Puncture). MATERIALS AND METHODS An AR protractor for smartphones, rotatable in three directions against the CT plane and displaying angle guidance lines, was developed. The protractor center can be adjusted to an entry point either by manually moving the smartphone with the protractor center fixed at the center of the screen (Fix-On-Screen) or by image tracking with a printed QR code placed at the entry point (QR-Tracking). Needle placement was performed by viewing a target line in the tangent direction with the bull's-eye method. Needle placement errors for placements by four operators in six out-of-plane directions in a phantom using a smartphone (iPhone XR, Apple, Cupertino, CA, USA) were compared between the two registration methods. RESULTS No significant difference in average needle placement error was observed between the Fix-On-Screen and QR-Tracking methods (5.6 ± 1.7 mm vs. 6.1 ± 2.9 mm, p = 0.475). The average procedural time of the Fix-On-Screen method was shorter than that of the QR-Tracking method (71.0 ± 23.9 s vs. 98.4 ± 59.5 s, p = 0.042). CONCLUSION The accuracies of out-of-plane needle placements using the 3D AR protractor were equally high with the two registration methods, with short procedure times. In clinical use, the Fix-On-Screen registration method would be more convenient because no additional markers are required.
16
Khan T, Biehl JT, Andrews EG, Babichenko D. A systematic comparison of the accuracy of monocular RGB tracking and LiDAR for neuronavigation. Healthc Technol Lett 2022; 9:91-101. [PMID: 36514478] [PMCID: PMC9731545] [DOI: 10.1049/htl2.12036]
Abstract
With the advent of augmented reality (AR), the use of AR-guided systems in the field of medicine has gained traction. However, wide-scale adoption of these systems requires highly accurate and reliable tracking. In this work, the tracking accuracy of two technology platforms, LiDAR and Vuforia, is developed and rigorously tested for a catheter placement neurological procedure. Nine hundred experiments were performed for each technology across various combinations of catheter lengths and insertion trajectories. This analysis shows that the LiDAR platform outperformed Vuforia, the state of the art in monocular RGB tracking solutions: LiDAR had 75% less radial distance error and 26% less angle deviation error. The results provide key insights into the value and utility of LiDAR-based tracking in AR guidance systems.
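The two reported metrics, radial distance error and angle deviation error, are straightforward vector computations. A minimal sketch with illustrative tip positions and trajectory directions (not the study's measurements):

```python
# Minimal sketch: radial distance error (tip-to-target) and angle deviation
# between a planned and a tracked catheter trajectory.
import numpy as np

planned_tip = np.array([30.0, 40.0, 25.0])    # planned tip position (mm)
tracked_tip = np.array([31.1, 38.9, 25.6])    # position reported by tracking
radial_error = np.linalg.norm(tracked_tip - planned_tip)  # mm

planned_dir = np.array([0.0, 0.0, 1.0])       # planned insertion direction
tracked_dir = np.array([0.05, -0.02, 0.998])  # tracked direction
cos = np.dot(planned_dir, tracked_dir) / (
    np.linalg.norm(planned_dir) * np.linalg.norm(tracked_dir))
angle_error = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

print(f"radial error {radial_error:.2f} mm, "
      f"angle deviation {angle_error:.2f} deg")
```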
Affiliation(s)
- Talha Khan
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
- Jacob T. Biehl
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
- Edward G. Andrews
- Department of Neurological Surgery, School of Medicine, University of Pittsburgh, Pittsburgh, PA, USA
- Dmitriy Babichenko
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
17
Moreta-Martínez R, Rubio-Pérez I, García-Sevilla M, García-Elcano L, Pascau J. Evaluation of optical tracking and augmented reality for needle navigation in sacral nerve stimulation. Comput Methods Programs Biomed 2022; 224:106991. [PMID: 35810510] [DOI: 10.1016/j.cmpb.2022.106991]
Abstract
BACKGROUND AND OBJECTIVE Sacral nerve stimulation (SNS) is a minimally invasive procedure in which an electrode lead is implanted through the sacral foramina to stimulate the sacral nerve, modulating colonic and urinary functions. One of the most crucial steps in SNS procedures is the placement of the tined lead close to the sacral nerve. However, needle insertion is very challenging for surgeons: several x-ray projections are required to interpret the needle position correctly, and in many cases multiple punctures are needed, increasing surgical time and the patient's discomfort and pain. In this work we propose and evaluate two different navigation systems to guide electrode placement in SNS surgeries, designed to reduce surgical time, minimize patient discomfort and improve surgical outcomes. METHODS For the first alternative, we developed open-source navigation software to guide electrode placement by real-time needle tracking with an optical tracking system (OTS). For the second, we present a smartphone-based AR application that displays virtual guidance elements directly on the affected area, using a 3D-printed reference marker placed on the patient; this guidance facilitates needle insertion along a predefined trajectory. Both techniques were evaluated to determine whether they obtain better results than the current surgical procedure. To compare the proposals with the clinical method, we developed an x-ray software tool that calculates a digitally reconstructed radiograph, simulating the fluoroscopy acquisitions during the procedure. Twelve physicians (inexperienced and experienced users) performed needle insertions through several specified targets to evaluate the alternative SNS guidance methods on a realistic patient-based phantom. RESULTS With each navigation solution, users took less time on average to complete each insertion (36.83 s and 44.43 s for the OTS and AR methods, respectively) and needed fewer punctures on average to reach the target (1.23 and 1.96 for the OTS and AR methods, respectively) than with the standard clinical method (189.28 s and 3.65 punctures). CONCLUSIONS We have shown two navigation alternatives that could improve surgical outcomes by significantly reducing needle insertions, surgical time and patient pain in SNS procedures. We believe these solutions are feasible for training surgeons and could even replace the current SNS clinical procedure.
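A digitally reconstructed radiograph is, at its simplest, a line integral of CT attenuation along each ray. The authors' tool is not described at that level of detail, so the following is a generic orthographic (parallel-ray) sketch on a synthetic volume; clinical DRRs typically use perspective cone-beam geometry:

```python
# Minimal sketch: a digitally reconstructed radiograph (DRR) by summing
# attenuation along one axis of a CT volume (parallel-ray approximation).
import numpy as np

rng = np.random.default_rng(3)
ct_hu = rng.normal(0, 300, size=(128, 128, 128))   # stand-in CT volume (HU)

# Crude HU -> linear attenuation mapping (water-referenced, clipped at air).
mu = np.clip((ct_hu + 1000.0) / 1000.0, 0, None)
drr = mu.sum(axis=2)                                # integrate along the rays

# Log-compress to a displayable radiograph-like image.
image = np.log1p(drr)
image = (255 * (image - image.min()) / np.ptp(image)).astype(np.uint8)
print(image.shape, image.dtype)
```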
Affiliation(s)
- Rafael Moreta-Martínez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid 28007, Spain
| | - Inés Rubio-Pérez
- Servicio de Cirugía General, Hospital Universitario La Paz, Madrid 28046, Spain
| | - Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid 28007, Spain
| | - Laura García-Elcano
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Centro de Investigación Médica Aplicada, Clínica Universidad de Navarra, Madrid 28027, Spain
| | - Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Leganés 28911, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid 28007, Spain.
| |
|
18
|
Multicenter assessment of augmented reality registration methods for image-guided interventions. Radiol Med 2022; 127:857-865. [DOI: 10.1007/s11547-022-01515-3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2021] [Accepted: 06/13/2022] [Indexed: 10/17/2022]
|
19
|
Thermal Ablation of Liver Tumors Guided by Augmented Reality: An Initial Clinical Experience. Cancers (Basel) 2022; 14:cancers14051312. [PMID: 35267620 PMCID: PMC8909771 DOI: 10.3390/cancers14051312] [Citation(s) in RCA: 21] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2022] [Revised: 02/25/2022] [Accepted: 02/27/2022] [Indexed: 02/06/2023] Open
Abstract
Background: Over the last two decades, augmented reality (AR) has been used as a visualization tool in many medical fields to increase precision, limit the radiation dose, and decrease variability among operators. Here, we report the first in vivo study of a novel AR system for the guidance of percutaneous interventional oncology procedures. Methods: Eight patients with 15 liver tumors (0.7–3.0 cm; mean 1.56 ± 0.55 cm) underwent percutaneous thermal ablations using AR guidance (the Endosight system). Prior to the intervention, the patients were evaluated with US and CT. The targeted nodules were segmented and three-dimensionally (3D) reconstructed from CT images, and the probe trajectory to the target was defined. The procedures were guided solely by AR, with the position of the probe tip subsequently confirmed by conventional imaging. The primary endpoints were targeting accuracy, system setup time, and targeting time (from target visualization to correct needle insertion). Technical success was also evaluated and validated by co-registration software. Upon completion, the operators were assessed for cybersickness or other symptoms related to the use of AR. Results: System setup and procedural targeting were rapid (setup: mean 14.3 min, range 12.0–17.2 min; targeting: mean 4.3 min, range 3.2–5.7 min). High targeting accuracy (mean 3.4 mm; range 2.6–4.2 mm) was accompanied by technical success in all 15 lesions (complete ablation of the tumor, with 13/15 lesions achieving a >90% 5-mm periablational margin). No intra-/periprocedural complications or operator cybersickness were observed. Conclusions: AR guidance is highly accurate and allows for the confident performance of percutaneous thermal ablations.
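The technical-success criterion (complete ablation with a >90% 5-mm periablational margin) can be checked on co-registered segmentation masks. A sketch under the assumption of isotropic 1-mm voxels and hypothetical boolean mask arrays; this is not the co-registration software used in the study:

import numpy as np
from scipy import ndimage

def margin_coverage(tumor, ablation, margin_mm=5, voxel_mm=1.0):
    """Fraction of the tumor-plus-margin volume covered by the
    ablation zone (both inputs are boolean voxel masks)."""
    r = int(round(margin_mm / voxel_mm))
    # Spherical structuring element of radius r voxels
    zz, yy, xx = np.ogrid[-r:r + 1, -r:r + 1, -r:r + 1]
    ball = zz**2 + yy**2 + xx**2 <= r**2
    target = ndimage.binary_dilation(tumor, structure=ball)
    return (target & ablation).sum() / target.sum()

# Illustrative masks only
tumor = np.zeros((40, 40, 40), bool)
tumor[18:22, 18:22, 18:22] = True
ablation = np.zeros_like(tumor)
ablation[10:30, 10:30, 10:30] = True
print(margin_coverage(tumor, ablation))  # 1.0 -> full 5-mm margin ablated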
|
20
|
Suzuki K, Morita S, Endo K, Yamamoto T, Sakai S. Noncontact measurement of puncture needle angle using augmented reality technology in computed tomography-guided biopsy: stereotactic coordinate design and accuracy evaluation. Int J Comput Assist Radiol Surg 2022; 17:745-750. [PMID: 35190975 DOI: 10.1007/s11548-022-02572-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2021] [Accepted: 01/26/2022] [Indexed: 11/26/2022]
Abstract
PURPOSE This study introduces a new handheld-device application for noncontact, real-time measurement of the angle of a biopsy needle using augmented reality (AR) image tracking technology, and discusses how the related coordinate design was optimized for computed tomography (CT)-guided biopsy procedures. METHODS An in-house noncontact angle measurement application was developed using AR platform software. The application tracks the position and direction of a printed texture located on the handle of a biopsy needle. The needle direction was factorized into two components, tilt and roll. Tilt was defined to follow the tilting of the CT gantry so that roll would match the angle measured in CT images. In this study, CT-guided tumor biopsies were performed using the conventional guiding method with a protractor. The true needle roll was measured by CT imaging and then compared with the roll measurement provided by the application running on a mobile phone. RESULTS This study enrolled 18 cases of tumor biopsy (five renal tumors, five lung tumors, four retroperitoneal tumors, one soft tissue tumor, one thyroid tumor, one mesentery tumor, and one bone tumor). The measurement accuracy, defined as the average difference between AR and CT, was −0.2°, and the measurement precision, defined as the standard deviation of that difference, was 2.0°. The coefficient of determination (R²) was 0.996. CONCLUSION Noncontact needle measurement software using AR technology is sufficiently reliable for use in clinical settings. A real-time display of the needle angle that also shows the direction of the CT gantry is expected to enable simple biopsy needle navigation.
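The reported accuracy, precision, and R² follow directly from the paired AR and CT roll measurements, and the tilt/roll factorization is a simple decomposition of the needle direction vector. A sketch with invented numbers; the axis conventions and the way R² is computed here are assumptions, not the authors' implementation:

import numpy as np

def tilt_roll_deg(direction):
    """Split a needle direction into roll (the angle seen on the axial CT
    image) and tilt (the out-of-plane component, along the gantry tilt).
    Assumed convention: x-y is the axial plane, z the slice axis."""
    d = np.asarray(direction, float)
    x, y, z = d / np.linalg.norm(d)
    roll = np.degrees(np.arctan2(x, -y))   # in-plane angle on the axial image
    tilt = np.degrees(np.arcsin(z))        # deviation out of the slice plane
    return tilt, roll

def agreement_stats(ar_deg, ct_deg):
    """Accuracy = mean AR-CT difference, precision = SD of the difference,
    plus a coefficient of determination treating AR as a predictor of CT."""
    ar, ct = np.asarray(ar_deg, float), np.asarray(ct_deg, float)
    diff = ar - ct
    r2 = 1.0 - np.sum((ct - ar) ** 2) / np.sum((ct - ct.mean()) ** 2)
    return diff.mean(), diff.std(ddof=1), r2

# Illustrative paired roll angles (degrees), not the study's data
ct = np.array([12.0, 25.5, 40.2, 8.7, 33.1, 19.9])
ar = ct + np.array([-0.5, 1.2, -2.1, 0.3, 1.8, -0.9])
print(tilt_roll_deg([0.3, -0.9, 0.1]))
print(agreement_stats(ar, ct))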
Affiliation(s)
- Kazufumi Suzuki
- Department of Diagnostic Imaging and Nuclear Medicine, Tokyo Women's Medical University, 8-1, Kawada-cho, Shinjuku-ku, Tokyo, 162-8666, Japan.
| | - Satoru Morita
- Department of Diagnostic Imaging and Nuclear Medicine, Tokyo Women's Medical University, 8-1, Kawada-cho, Shinjuku-ku, Tokyo, 162-8666, Japan
| | - Kenji Endo
- Department of Diagnostic Imaging and Nuclear Medicine, Tokyo Women's Medical University, 8-1, Kawada-cho, Shinjuku-ku, Tokyo, 162-8666, Japan
| | - Takahiro Yamamoto
- Department of Diagnostic Imaging and Nuclear Medicine, Tokyo Women's Medical University, 8-1, Kawada-cho, Shinjuku-ku, Tokyo, 162-8666, Japan
| | - Shuji Sakai
- Department of Diagnostic Imaging and Nuclear Medicine, Tokyo Women's Medical University, 8-1, Kawada-cho, Shinjuku-ku, Tokyo, 162-8666, Japan
| |
|
21
|
Mixed Reality Needle Guidance Application on Smartglasses Without Pre-procedural CT Image Import with Manually Matching Coordinate Systems. Cardiovasc Intervent Radiol 2022; 45:349-356. [PMID: 35022858 DOI: 10.1007/s00270-021-03029-3] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/20/2021] [Accepted: 10/28/2021] [Indexed: 11/02/2022]
Abstract
PURPOSE To develop and assess the accuracy of a mixed reality (MR) needle guidance application on smartglasses. MATERIALS AND METHODS An MR needle guidance application was developed on HoloLens 2 that requires no pre-procedural CT image reconstruction or import; instead, the spatial and MR coordinate systems are matched manually. First, the accuracy of the target locations in the image overlay (63 points arranged on a 45 × 35 × 21 cm box) and of needle angles from 0° to 80° placed using the MR application was verified. Then, needle placement errors from 12 different entry points in a phantom, by seven operators (four physicians and three non-physicians), were compared between MR guidance and the conventional method using protractors, with a linear mixed model. RESULTS The average errors of the target locations and needle angles placed using the MR application were 5.9 ± 2.6 mm and 2.3 ± 1.7°, respectively. The average needle insertion error with MR guidance was slightly smaller than with the conventional method (8.4 ± 4.0 mm vs. 9.6 ± 5.1 mm, p = 0.091), particularly in the out-of-plane approach (9.6 ± 3.5 mm vs. 12.3 ± 4.6 mm, p = 0.003). The procedural time was longer with MR guidance than with the conventional method (412 ± 134 s vs. 219 ± 66 s, p < 0.001). CONCLUSION MR needle guidance without pre-procedural CT image import is feasible when the coordinate systems are matched manually, and the accuracy of needle insertion is slightly better than that of the conventional method.
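Manually matching the spatial and MR coordinate systems amounts to estimating a rigid transform between them; with a few corresponding points this is the classic least-squares (Kabsch) problem. A generic sketch, not the paper's HoloLens implementation:

import numpy as np

def rigid_align(src, dst):
    """Least-squares rotation R and translation t with dst ~= R @ src + t,
    estimated from matched 3D point sets (N x 3)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Illustrative: recover a known rotation about z plus a translation
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (6, 3))
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([5.0, -2.0, 10.0])
R, t = rigid_align(src, dst)
print(np.allclose(R, R_true, atol=1e-6), t)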
|
22
|
Christou AS, Amalou A, Lee H, Rivera J, Li R, Kassin MT, Varble N, Tsz Ho Tse Z, Xu S, Wood BJ. Image-Guided Robotics for Standardized and Automated Biopsy and Ablation. Semin Intervent Radiol 2021; 38:565-575. [PMID: 34853503 DOI: 10.1055/s-0041-1739164] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Abstract
Image-guided robotics for biopsy and ablation aims to minimize procedure times; reduce needle manipulations, radiation, and complications; and enable treatment of larger and more complex tumors, while facilitating standardization for more uniform and improved outcomes. Robotic navigation of needles enables standardized, reproducible procedures via real-time precision feedback, while avoiding radiation exposure to the operator. Robots can be integrated with computed tomography (CT), cone-beam CT, magnetic resonance imaging, and ultrasound in various configurations, including stereotactic, table-mounted, floor-mounted, and patient-mounted designs. The history, challenges, solutions, and questions facing interventional radiology (IR) and interventional oncology are reviewed to enable responsible clinical adoption and value definition via ergonomics, workflows, business models, and outcome data. IR-integrated robotics is ready for broader adoption. The robots are coming!
Affiliation(s)
- Anna S Christou
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
| | - Amel Amalou
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
| | - HooWon Lee
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
| | - Jocelyne Rivera
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
| | - Rui Li
- Tandon School of Engineering, New York University, Brooklyn, New York
| | - Michael T Kassin
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
| | - Nicole Varble
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland; Philips Research North America, Cambridge, Massachusetts
| | - Zion Tsz Ho Tse
- Department of Electrical Engineering, University of York, Heslington, York, United Kingdom
| | - Sheng Xu
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
| | - Bradford J Wood
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland; Department of Radiology and Imaging Sciences, National Institutes of Health, Bethesda, Maryland; National Cancer Institute, National Institutes of Health, Bethesda, Maryland; Interventional Radiology, Radiology and Imaging Sciences, National Institutes of Health, Bethesda, Maryland
| |
|
23
|
Zhao Z, Poyhonen J, Chen Cai X, Sophie Woodley Hooper F, Ma Y, Hu Y, Ren H, Song W, Tsz Ho Tse Z. Augmented reality technology in image-guided therapy: State-of-the-art review. Proc Inst Mech Eng H 2021; 235:1386-1398. [PMID: 34304631 PMCID: PMC8573682 DOI: 10.1177/09544119211034357] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/17/2023]
Abstract
Image-guided therapies have been on the rise in recent years as they can achieve higher accuracy and are less invasive than traditional methods. By combining augmented reality technology with image-guided therapy, surgeons can observe more organs and tissues, improving surgical accuracy. In this review, 233 publications (dated from 2015 to 2020) on the design and application of augmented reality-based systems for image-guided therapy, covering both research prototypes and commercial products, were considered; based on their functions and applications, sixteen studies were selected. The engineering specifications and applications were analyzed and summarized for each study. Finally, future directions and existing challenges in the field were summarized and discussed.
Affiliation(s)
- Zhuo Zhao
- School of Electrical and Computer Engineering, University of Georgia, Athens, GA, USA
| | - Jasmin Poyhonen
- Department of Electronic Engineering, University of York, York, UK
| | - Xin Chen Cai
- Department of Electronic Engineering, University of York, York, UK
| | | | - Yangmyung Ma
- Hull York Medical School, University of York, York, UK
| | - Yihua Hu
- Department of Electronic Engineering, University of York, York, UK
| | - Hongliang Ren
- Department of Electronic Engineering, The Chinese University of Hong Kong (CUHK), Hong Kong, China
| | - Wenzhan Song
- School of Electrical and Computer Engineering, University of Georgia, Athens, GA, USA
| | - Zion Tsz Ho Tse
- Department of Electronic Engineering, University of York, York, UK
| |
|
24
|
Long DJ, Li M, De Ruiter QMB, Hecht R, Li X, Varble N, Blain M, Kassin MT, Sharma KV, Sarin S, Krishnasamy VP, Pritchard WF, Karanian JW, Wood BJ, Xu S. Comparison of Smartphone Augmented Reality, Smartglasses Augmented Reality, and 3D CBCT-guided Fluoroscopy Navigation for Percutaneous Needle Insertion: A Phantom Study. Cardiovasc Intervent Radiol 2021; 44:774-781. [PMID: 33409547 DOI: 10.1007/s00270-020-02760-7] [Citation(s) in RCA: 17] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/14/2020] [Accepted: 12/23/2020] [Indexed: 11/30/2022]
Abstract
PURPOSE To compare needle placement performance using an augmented reality (AR) navigation platform implemented on smartphone or smartglasses devices to that of CBCT-guided fluoroscopy in a phantom. MATERIALS AND METHODS An AR application was developed to display a planned percutaneous needle trajectory on the smartphone (iPhone7) and smartglasses (HoloLens1) devices in real time. The two AR-guided needle placement systems and CBCT-guided fluoroscopy with navigation software (XperGuide, Philips) were compared using an anthropomorphic phantom (CIRS, Norfolk, VA). Six interventional radiologists each performed 18 independent needle placements using smartphone (n = 6), smartglasses (n = 6), and XperGuide (n = 6) guidance. Placement error was defined as the distance from the needle tip to the target center. Placement time was recorded. For XperGuide, dose-area product (DAP, mGy·cm²) and fluoroscopy time (sec) were recorded. Statistical comparisons were made using a two-way repeated measures ANOVA. RESULTS The placement error using the smartphone, smartglasses, or XperGuide was similar (3.98 ± 1.68 mm, 5.18 ± 3.84 mm, 4.13 ± 2.38 mm, respectively, p = 0.11). Compared to CBCT-guided fluoroscopy, the smartphone and smartglasses reduced placement time by 38% (p = 0.02) and 55% (p = 0.001), respectively. The DAP for insertion using XperGuide was 3086 ± 2920 mGy·cm², and no intra-procedural radiation was required with augmented reality. CONCLUSIONS Smartphone- and smartglasses-based augmented reality reduced needle placement time and radiation exposure while maintaining placement accuracy compared to a clinically validated needle navigation platform.
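Placement error and the repeated-measures comparison across guidance methods can be sketched as follows. The long-format table, the statsmodels call, and all numbers are illustrative, not the authors' analysis code; only one within-subject factor is shown, whereas the study used a two-way design:

import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def placement_error(tip, target):
    """Distance (mm) from the needle tip to the target center."""
    return float(np.linalg.norm(np.asarray(tip, float) - np.asarray(target, float)))

# Illustrative long-format data: 6 operators x 3 guidance methods,
# one mean error per cell (the study had 6 placements per cell)
rng = np.random.default_rng(1)
rows = [{"operator": f"op{i}", "method": m, "error_mm": rng.normal(loc, 1.0)}
        for i in range(6)
        for m, loc in [("smartphone", 4.0), ("smartglasses", 5.2),
                       ("xperguide", 4.1)]]
df = pd.DataFrame(rows)
print(placement_error([1.0, 2.0, 3.0], [1.5, 2.0, 4.0]))
res = AnovaRM(df, depvar="error_mm", subject="operator", within=["method"]).fit()
print(res)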
Affiliation(s)
- Dilara J Long
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA.
| | - Quirina M B De Ruiter
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Rachel Hecht
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Nicole Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA; Philips Research of North America, Cambridge, MA, 02141, USA
| | - Maxime Blain
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Michael T Kassin
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Karun V Sharma
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, Washington, DC, USA
| | - Shawn Sarin
- Department of Interventional Radiology, George Washington University Hospital, Washington, DC, USA
| | - Venkatesh P Krishnasamy
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - William F Pritchard
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - John W Karanian
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| |
|
25
|
Heinrich F, Schwenderling L, Joeres F, Lawonn K, Hansen C. Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2020; 26:3568-3575. [PMID: 33006930 DOI: 10.1109/tvcg.2020.3023637] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
Augmented reality (AR) may be a useful technique for overcoming issues of conventionally used navigation systems supporting medical needle insertions, such as increased mental workload and complicated hand-eye coordination. Previous research has primarily focused on developing AR navigation systems for specific display devices, but differences between the employed methods had not been investigated. To this end, a user study involving a needle insertion task was conducted, comparing different AR display techniques with a monitor-based approach as the baseline condition for visualizing navigation information. A video see-through stationary display, an optical see-through head-mounted display, and a spatial AR projector-camera system were investigated. Results suggest advantages of projected navigation information in terms of lower task completion time, lower angular deviation, and affirmative subjective participant feedback. Techniques requiring an intermediate view on screens, i.e., the stationary display and the baseline condition, showed less favorable results. Thus, benefits of providing AR navigation information over a conventionally used method could be identified. The significant results on objective measures, together with the identified advantages and disadvantages of individual display techniques, contribute to the development and design of improved needle navigation systems.
|
26
|
Park BJ, Hunt SJ, Nadolski GJ, Gade TP. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: a phantom study using HoloLens 2. Sci Rep 2020; 10:18620. [PMID: 33122766 PMCID: PMC7596500 DOI: 10.1038/s41598-020-75676-4] [Citation(s) in RCA: 27] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2020] [Accepted: 10/19/2020] [Indexed: 12/14/2022] Open
Abstract
Out-of-plane lesions pose challenges for CT-guided interventions. Augmented reality (AR) headsets are capable of providing holographic 3D guidance to assist CT-guided targeting. A prospective trial was performed assessing CT-guided lesion targeting on an abdominal phantom with and without AR guidance using HoloLens 2. Eight operators performed a cumulative total of 86 needle passes. Total needle redirections, radiation dose, procedure time, and puncture rates of nontargeted lesions were compared with and without AR. The mean number of needle passes to reach the target was reduced from 7.4 without AR to 3.4 with AR (p = 0.011). The mean CT dose index decreased from 28.7 mGy without AR to 16.9 mGy with AR (p = 0.009). Mean procedure time was reduced from 8.93 min without AR to 4.42 min with AR (p = 0.027). The puncture rate of a nontargeted lesion decreased from 11.9% without AR (7/59 passes) to 0% with AR (0/27 passes). First needle passes were closer to the ideal target trajectory with AR than without (4.6° vs 8.0° offset, respectively, p = 0.018). AR reduced variability and elevated the performance of all operators to the same level, irrespective of prior clinical experience. AR guidance can provide significant improvements in procedural efficiency and radiation dose savings for targeting out-of-plane lesions.
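The abstract does not state which statistical test produced these p-values; a paired comparison across operators is one plausible analysis. A sketch with invented per-operator means, for illustration only:

import numpy as np
from scipy import stats

# Invented per-operator mean needle passes, without vs with AR (8 operators)
passes_no_ar = np.array([9.1, 6.8, 7.5, 8.2, 6.0, 7.9, 7.0, 6.7])
passes_ar    = np.array([3.9, 3.1, 3.6, 3.8, 2.9, 3.4, 3.2, 3.3])

t, p = stats.ttest_rel(passes_no_ar, passes_ar)   # paired t-test
w = stats.wilcoxon(passes_no_ar, passes_ar)       # nonparametric alternative
print(f"paired t = {t:.2f}, p = {p:.4f}; Wilcoxon p = {w.pvalue:.4f}")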
Affiliation(s)
- Brian J Park
- Oregon Health and Science University School of Medicine, 3181 SW Sam Jackson Park Rd, Portland, OR, 97239, USA.
| | - Stephen J Hunt
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA, 19104, USA
| | - Gregory J Nadolski
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA, 19104, USA
| | - Terence P Gade
- Perelman School of Medicine, University of Pennsylvania, 3400 Civic Center Blvd, Philadelphia, PA, 19104, USA
| |
|
27
|
Li M, Seifabadi R, Long D, De Ruiter Q, Varble N, Hecht R, Negussie AH, Krishnasamy V, Xu S, Wood BJ. Smartphone- versus smartglasses-based augmented reality (AR) for percutaneous needle interventions: system accuracy and feasibility study. Int J Comput Assist Radiol Surg 2020; 15:1921-1930. [PMID: 32734314 DOI: 10.1007/s11548-020-02235-7] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/03/2020] [Accepted: 07/14/2020] [Indexed: 11/26/2022]
Abstract
PURPOSE To compare the system accuracy and needle placement performance of smartphone- and smartglasses-based augmented reality (AR) for percutaneous needle interventions. METHODS An AR platform was developed to enable the superimposition of annotated anatomy and a planned needle trajectory onto a patient in real time. The system accuracy of the AR display on smartphone (iPhone7) and smartglasses (HoloLens1) devices was evaluated on a 3D-printed phantom. The target overlay error was measured as the distance between actual and virtual targets (n = 336) on the AR display, derived from preprocedural CT. The needle overlay angle was measured as the angular difference between actual and virtual needles (n = 12) on the AR display. Three operators each used the iPhone (n = 8), the HoloLens (n = 8), and the CT-guided freehand technique (n = 8) to guide needles into targets in a phantom. Needle placement error was measured with post-placement CT. Needle placement time was recorded from needle puncture to navigation completion. RESULTS The target overlay error of the iPhone was comparable to the HoloLens (1.75 ± 0.59 mm, 1.74 ± 0.86 mm, respectively, p = 0.9). The needle overlay angle of the iPhone and HoloLens was similar (0.28 ± 0.32°, 0.41 ± 0.23°, respectively, p = 0.26). The iPhone-guided needle placements showed reduced error compared to the HoloLens (2.58 ± 1.04 mm, 3.61 ± 2.25 mm, respectively, p = 0.05) and increased time (87 ± 17 s, 71 ± 27 s, respectively, p = 0.02). Both AR devices reduced placement error compared to CT-guided freehand (15.92 ± 8.06 mm, both p < 0.001). CONCLUSION An augmented reality platform employed on smartphone and smartglasses devices may provide accurate display and navigation guidance for percutaneous needle-based interventions.
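Both system-accuracy metrics reduce to paired point distances and vector angles. A minimal sketch with invented coordinates, not the study's data:

import numpy as np

def target_overlay_error_mm(actual, virtual):
    """Per-target distance between actual and AR-displayed target positions."""
    return np.linalg.norm(np.asarray(actual, float) - np.asarray(virtual, float),
                          axis=1)

def needle_overlay_angle_deg(actual_dir, virtual_dir):
    """Angle between actual and AR-displayed needle directions."""
    a = np.asarray(actual_dir, float)
    v = np.asarray(virtual_dir, float)
    cos = np.dot(a, v) / (np.linalg.norm(a) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Invented coordinates (mm) and directions
actual = np.array([[10.0, 20.0, 30.0], [40.0, 50.0, 60.0]])
virtual = actual + np.array([[1.2, -0.8, 0.5], [-0.9, 1.1, 0.4]])
err = target_overlay_error_mm(actual, virtual)
print(f"target overlay error: {err.mean():.2f} mm (mean of {err.size} targets)")
print(needle_overlay_angle_deg([0.0, 0.0, 1.0], [0.005, 0.0, 1.0]))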
Affiliation(s)
- Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA.
| | - Reza Seifabadi
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Dilara Long
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Quirina De Ruiter
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Nicole Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Philips Research of North America, Cambridge, MA, USA
| | - Rachel Hecht
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Ayele H Negussie
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Venkatesh Krishnasamy
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| | - Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
| |
|
28
|
Solbiati L, Gennaro N, Muglia R. Augmented Reality: From Video Games to Medical Clinical Practice. Cardiovasc Intervent Radiol 2020; 43:1427-1429. [PMID: 32632853 DOI: 10.1007/s00270-020-02575-6] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/09/2020] [Accepted: 06/22/2020] [Indexed: 12/31/2022]
Affiliation(s)
- Luigi Solbiati
- Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Milan, Italy; Department of Radiology, Humanitas Clinical and Research Hospital, Rozzano, Milan, Italy.
| | - Nicolò Gennaro
- Training School in Radiology, Humanitas University, Pieve Emanuele, Milan, Italy
| | - Riccardo Muglia
- Training School in Radiology, Humanitas University, Pieve Emanuele, Milan, Italy
| |
|