1. Galyfos G, Pikula M, Skalski A, Vagena S, Filis K, Sigala F. Using a novel three-dimensional holographic technology to perform open vascular surgery procedures. J Vasc Surg Cases Innov Tech 2024;10:101440. PMID: 38464890; PMCID: PMC10924202; DOI: 10.1016/j.jvscit.2024.101440.
Abstract
Augmented reality technology has been introduced into everyday clinical practice in recent years. Several surgical specialties have begun using such technology for preoperative planning as well as intraoperatively. Regarding vascular surgery, a limited number of reports have described the benefits, mainly for endovascular procedures. We present a novel three-dimensional holographic system that we used to perform an open vascular procedure.
Affiliations
- George Galyfos, Marcel Pikula, Andrzej Skalski, Sylvia Vagena, Konstantinos Filis, Frangiska Sigala: Vascular Unit, First Propedeutic Department of Surgery, National and Kapodistrian University of Athens, Hippocration Hospital, Athens, Greece

2. Boogaard LL, Notten K, Kluivers K, Van der Wal S, Maal TJJ, Verhamme L. Accuracy of augmented reality-guided needle placement for pulsed radiofrequency treatment of pudendal neuralgia: a pilot study on a phantom model. PeerJ 2024;12:e17127. PMID: 38560457; PMCID: PMC10981882; DOI: 10.7717/peerj.17127.
Abstract
Background: Pudendal neuralgia (PN) is a chronic neuropathy that causes pain, numbness, and dysfunction in the pelvic region. The current state-of-the-art treatment is pulsed radiofrequency (PRF), in which a needle must be placed close to the pudendal nerve for neuromodulation. Given the 5-mm effective range of PRF, the accuracy of needle placement is important. This study aimed to investigate the potential of augmented reality (AR) guidance for improving the accuracy of needle placement in PRF treatment for pudendal neuralgia. Methods: In this pilot study, eight subjects performed needle placements on an in-house developed phantom model of the pelvis using AR guidance provided by an in-house developed application on the HoloLens 2. The accuracy of needle placement was calculated from virtual 3D models of the needle and the targeted phantom nerve, derived from CBCT scans. Results: The median Euclidean distance between the needle tip and the target was 4.37 mm (IQR 5.16), the median lateral distance was 3.25 mm (IQR 4.62), and the median depth distance was 1.94 mm (IQR 7.07). Conclusion: This study describes the first method for determining the accuracy of patient-specific needle placement under AR guidance. The approach could improve the accuracy of PRF needle placement for pudendal neuralgia, resulting in better treatment outcomes.
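Error metrics of the kind reported above can be reproduced from 3D coordinates with a few lines of linear algebra. A minimal sketch, assuming hypothetical needle-tip, target, and planned-trajectory coordinates in millimetres rather than the study's data:

```python
import numpy as np

def placement_errors(tip, target, trajectory):
    """Decompose needle-tip error into Euclidean, lateral, and depth parts.

    tip, target : (3,) positions in mm.
    trajectory  : (3,) unit vector of the planned needle direction.
    """
    d = np.asarray(tip, float) - np.asarray(target, float)
    euclidean = np.linalg.norm(d)
    depth = float(np.dot(d, trajectory))               # error along the needle axis
    lateral = np.linalg.norm(d - depth * trajectory)   # error perpendicular to it
    return euclidean, lateral, abs(depth)

# Hypothetical example: tip 3 mm short and 2 mm off-axis from the target.
traj = np.array([0.0, 0.0, 1.0])
print(placement_errors([2.0, 0.0, -3.0], [0.0, 0.0, 0.0], traj))
# -> (3.605..., 2.0, 3.0)
```

Taking the lateral and depth components perpendicular and parallel to the planned needle direction is one reasonable reading of the metrics named in the abstract.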
Affiliations
- Lars L. Boogaard: Radboudumc 3D Lab and Department of Obstetrics and Gynaecology, Radboud University Medical Centre, Nijmegen, The Netherlands
- Kim Notten: Department of Obstetrics and Gynaecology, Radboud University Medical Centre, Nijmegen, The Netherlands
- Kirsten Kluivers: Department of Obstetrics and Gynaecology, Radboud University Medical Centre, Nijmegen, The Netherlands
- Selina Van der Wal: Department of Anesthesiology, Pain and Palliative Care, Radboud University Medical Centre, Nijmegen, The Netherlands
- Thomas J. J. Maal: Radboudumc 3D Lab, Radboud University Medical Centre, Nijmegen, The Netherlands
- Luc Verhamme: Radboudumc 3D Lab, Radboud University Medical Centre, Nijmegen, The Netherlands

3. Hatzl J, Henning D, Böckler D, Hartmann N, Meisenbacher K, Uhl C. Comparing Different Registration and Visualization Methods for Navigated Common Femoral Arterial Access: A Phantom Model Study Using Mixed Reality. J Imaging 2024;10:76. PMID: 38667974; PMCID: PMC11051344; DOI: 10.3390/jimaging10040076.
Abstract
Mixed reality (MxR) enables the projection of virtual three-dimensional objects into the user's field of view via a head-mounted display (HMD). This phantom model study investigated three different workflows for navigated common femoral arterial (CFA) access and compared them to a conventional sonography-guided technique as a control. A total of 160 punctures were performed by 10 operators (5 experts and 5 non-experts). A successful CFA puncture was defined as a puncture at the mid-level of the femoral head with the needle tip at the central lumen line, at a 0° coronal insertion angle and a 45° sagittal insertion angle. Positional errors were quantified using cone-beam computed tomography following each attempt. Mixed effect modeling revealed that the distance from the needle entry site to the mid-level of the femoral head was significantly shorter for the navigated techniques than for the control group. This highlights that three-dimensional visualization could increase the safety of CFA access. However, the navigated workflows are infrastructurally complex, have limited usability, and are associated with relevant costs. While navigated techniques appear to be a potentially beneficial adjunct for safe CFA access, future developments should aim to reduce workflow complexity, avoid optical tracking systems, and offer more pragmatic methods of registration and instrument tracking.
Affiliations
- Johannes Hatzl, Daniel Henning, Dittmar Böckler, Niklas Hartmann, Katrin Meisenbacher: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany
- Christian Uhl: Department of Vascular and Endovascular Surgery, University Hospital Heidelberg, 69120 Heidelberg, Germany; Department of Vascular Surgery, University Hospital RWTH Aachen, 52074 Aachen, Germany

4. van der Woude R, Fitski M, van der Zee JM, van de Ven CP, Bökkerink GMJ, Wijnen MHWA, Meulstee JW, van Doormaal TPC, Siepel FJ, van der Steeg AFW. Clinical Application and Further Development of Augmented Reality Guidance for the Surgical Localization of Pediatric Chest Wall Tumors. J Pediatr Surg 2024:S0022-3468(24)00105-2. PMID: 38472040; DOI: 10.1016/j.jpedsurg.2024.02.023.
Abstract
BACKGROUND: Surgical treatment of pediatric chest wall tumors requires accurate surgical planning and tumor localization to achieve radical resections while sparing as much healthy tissue as possible. Augmented Reality (AR) could facilitate surgical decision making by improving anatomical understanding and intraoperative tumor localization. We present our clinical experience with the use of an AR system for intraoperative tumor localization during chest wall resections, together with the pre-clinical results of a new registration method intended to improve our conventional AR system. METHODS: From January 2021, we used the HoloLens 2 for pre-incisional tumor localization during all chest wall resections in our center. A patient-specific 3D model was projected onto the patient using a five-point registration method based on anatomical landmarks. Furthermore, we developed and pre-clinically tested a surface matching method that allows post-incisional AR guidance by performing registration on the exposed surface of the ribs. RESULTS: Successful registration and holographic overlay were achieved in eight patients. The projection appeared most accurate when landmarks were positioned in a non-symmetric configuration close to the tumor. Disagreements between the overlay and the expected tumor location were mainly due to user-dependent registration errors. Pre-clinical tests of the surface matching method demonstrated the feasibility of registration on the exposed ribs. CONCLUSIONS: Our results demonstrate the applicability of AR guidance for pre- and post-incisional localization of pediatric chest wall tumors during surgery. The system has the potential to enable intraoperative 3D visualization, thereby facilitating surgical planning and management of chest wall resections. LEVEL OF EVIDENCE: IV. TYPE OF STUDY: Treatment Study.
Affiliations
- Rémi van der Woude: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands; Technical Medicine, TechMed Centre, University of Twente, Drienerlolaan 5, 7522 NB, Enschede, the Netherlands
- Matthijs Fitski: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Jasper M van der Zee: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands; Technical Medicine, TechMed Centre, University of Twente, Drienerlolaan 5, 7522 NB, Enschede, the Netherlands
- Cornelis P van de Ven: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Guus M J Bökkerink: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Marc H W A Wijnen: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands
- Tristan P C van Doormaal: Augmedit B.V., Naarden, the Netherlands; Department of Neurosurgery, Brain Division, University Medical Center, Utrecht, the Netherlands
- Françoise J Siepel: Robotics and Mechatronics, TechMed Centre, University of Twente, Drienerlolaan 5, 7522 NB, Enschede, the Netherlands
- Alida F W van der Steeg: Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584 CS, Utrecht, the Netherlands

5. Morley CT, Arreola DM, Qian L, Lynn AL, Veigulis ZP, Osborne TF. Mixed Reality Surgical Navigation System: Positional Accuracy Based on Food and Drug Administration Standard. Surg Innov 2024;31:48-57. PMID: 38019844; PMCID: PMC10773158; DOI: 10.1177/15533506231217620.
Abstract
BACKGROUND: Computer-assisted surgical navigation systems are designed to improve outcomes by providing clinicians with procedural guidance information. New technologies, such as mixed reality, offer the potential for more intuitive, efficient, and accurate procedural guidance. The goal of this study was to assess the positional accuracy and consistency of a clinical mixed reality system that utilizes commercially available wireless head-mounted displays (HMDs), custom software, and localization instruments. METHODS: Independent teams using the second-generation Microsoft HoloLens hardware, Medivis SurgicalAR software, and localization instruments tested the accuracy of the combined system at different institutions, times, and locations. The ASTM F2554-18 consensus standard for computer-assisted surgical systems, as recognized by the U.S. FDA, was used to measure performance. A total of 288 tests were performed. RESULTS: The system demonstrated consistent results, with an average accuracy better than one millimeter (0.75 ± 0.37 mm SD). CONCLUSION: The independently acquired positional tracking accuracies exceed those of conventional in-market surgical navigation tracking systems and the FDA-recognized standard. Importantly, this performance was achieved at two different institutions, using an international testing standard, and with a system that included a commercially available off-the-shelf wireless head-mounted display and software.
Affiliations
- David M. Arreola: US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA
- Zachary P. Veigulis: US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA; Department of Business Analytics, Tippie College of Business, University of Iowa, Iowa City, IA, USA
- Thomas F. Osborne: US Department of Veterans Affairs, Palo Alto Healthcare System, Palo Alto, CA, USA; Department of Radiology, Stanford University School of Medicine, Stanford, CA, USA

6. Takács A, Hardi E, Cavalcante BGN, Szabó B, Kispélyi B, Joób-Fancsaly Á, Mikulás K, Varga G, Hegyi P, Kivovics M. Advancing accuracy in guided implant placement: a comprehensive meta-analysis evaluating the accuracy of available implant placement methods. J Dent 2023;139:104748. PMID: 37863173; DOI: 10.1016/j.jdent.2023.104748.
Abstract
OBJECTIVES: This meta-analysis aimed to determine the accuracy of currently available computer-assisted implant surgery (CAIS) modalities under in vitro conditions and investigate whether these novel techniques can achieve clinically acceptable accuracy. DATA: In vitro studies comparing the postoperative implant position with the preoperative plan were included. Risk of bias was assessed using the Quality Assessment Tool For In Vitro Studies (QUIN Tool), and a sensitivity analysis was conducted using funnel plots. SOURCES: A systematic search was performed on April 18, 2023, using three databases: MEDLINE (via PubMed), EMBASE, and the Cochrane Central Register of Controlled Trials. No filters or restrictions were applied during the search. RESULTS: A total of 5,894 studies were identified, and eligible studies were included following study selection. Robotic and static CAIS (sCAIS) had the most accurate and clinically acceptable outcomes. sCAIS was further divided according to the level of guidance; among the sCAIS groups, fully guided implant placement had the greatest accuracy. Augmented reality-based CAIS (AR-based CAIS) had clinically acceptable results for all outcomes except apical global deviation. Dynamic CAIS (dCAIS) demonstrated clinically safe results, except for horizontal apical deviation. Freehand implant placement was associated with the greatest number of errors. CONCLUSIONS: Fully guided sCAIS demonstrated the most predictable outcomes, whereas freehand implant placement demonstrated the lowest accuracy. AR-based and robotic CAIS may be promising alternatives. CLINICAL SIGNIFICANCE: To our knowledge, this is the first meta-analysis to evaluate the accuracy of robotic CAIS and to compare the accuracy of the various CAIS modalities.
Affiliations
- Anna Takács: Department of Community Dentistry, Semmelweis University, Szentkirályi utca 40, 1088 Budapest, Hungary; Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary
- Eszter Hardi: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Department of Oro-Maxillofacial Surgery and Stomatology, Semmelweis University, Mária utca 52, 1085 Budapest, Hungary
- Bianca Golzio Navarro Cavalcante: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Department of Oral Biology, Semmelweis University, Nagyvárad tér 4, 1089 Budapest, Hungary
- Bence Szabó: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary
- Barbara Kispélyi: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Department of Prosthodontics, Semmelweis University, Szentkirályi utca 47, 1088 Budapest, Hungary
- Árpád Joób-Fancsaly: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Department of Oro-Maxillofacial Surgery and Stomatology, Semmelweis University, Mária utca 52, 1085 Budapest, Hungary
- Krisztina Mikulás: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Department of Prosthodontics, Semmelweis University, Szentkirályi utca 47, 1088 Budapest, Hungary
- Gábor Varga: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Department of Oral Biology, Semmelweis University, Nagyvárad tér 4, 1089 Budapest, Hungary
- Péter Hegyi: Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary; Institute for Translational Medicine, Szentágothai Research Centre, Medical School, University of Pécs, Szigeti út 12, 7624 Pécs, Hungary; Division of Pancreatic Diseases, Heart and Vascular Center, Semmelweis University, Városmajor utca 68, 1122 Budapest, Hungary
- Márton Kivovics: Department of Community Dentistry, Semmelweis University, Szentkirályi utca 40, 1088 Budapest, Hungary; Centre for Translational Medicine, Semmelweis University, Üllői út 26, 1085 Budapest, Hungary

7. Qi Z, Bopp MHA, Nimsky C, Chen X, Xu X, Wang Q, Gan Z, Zhang S, Wang J, Jin H, Zhang J. A Novel Registration Method for a Mixed Reality Navigation System Based on a Laser Crosshair Simulator: A Technical Note. Bioengineering (Basel) 2023;10:1290. PMID: 38002414; PMCID: PMC10669875; DOI: 10.3390/bioengineering10111290.
Abstract
Mixed Reality Navigation (MRN) is pivotal in augmented reality-assisted intelligent neurosurgical interventions. However, existing MRN registration methods face challenges in concurrently achieving low user dependency, high accuracy, and clinical applicability. This study proposes and evaluates a novel registration method based on a laser crosshair simulator, which is designed to replicate the scanner frame's position on the patient. The system autonomously calculates the transformation mapping coordinates from the tracking space to the reference image space. A mathematical model and workflow for registration were designed, and a Universal Windows Platform (UWP) application was developed on HoloLens-2. Finally, a head phantom was used to measure the system's target registration error (TRE). The proposed method was successfully implemented, obviating the need for user interactions with virtual objects during the registration process. Regarding accuracy, the average deviation was 3.7 ± 1.7 mm. The method shows encouraging results in efficiency and intuitiveness and marks a valuable advancement in low-cost, easy-to-use MRN systems. Its potential for enhancing accuracy and adaptability in interventional procedures makes this approach promising for improving surgical outcomes.
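For context, a target registration error of the kind measured on the head phantom can be computed by mapping tracked target points into image space and comparing them with their ground-truth positions. A minimal sketch, with a hypothetical transform and coordinates (none of the values come from the paper):

```python
import numpy as np

def target_registration_error(T, targets_tracking, targets_image):
    """Mean/SD TRE for a rigid registration.

    T                : (4, 4) homogeneous transform, tracking -> image space.
    targets_tracking : (N, 3) target positions measured in tracking space (mm).
    targets_image    : (N, 3) ground-truth positions in reference image space (mm).
    """
    pts = np.hstack([targets_tracking, np.ones((len(targets_tracking), 1))])
    mapped = (T @ pts.T).T[:, :3]                    # targets mapped into image space
    errors = np.linalg.norm(mapped - targets_image, axis=1)
    return errors.mean(), errors.std(ddof=1)

# Hypothetical example: a pure 2 mm translation error along x on three targets.
T = np.eye(4); T[0, 3] = 2.0
tracking = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.]])
image = tracking.copy()
print(target_registration_error(T, tracking, image))  # -> (2.0, 0.0)
```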
Affiliations
- Ziyu Qi: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Miriam H. A. Bopp: Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Christopher Nimsky: Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Xiaolei Chen: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Xinghua Xu: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Qun Wang: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; Medical School of Chinese PLA, Beijing 100853, China
- Shiyu Zhang: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; Medical School of Chinese PLA, Beijing 100853, China
- Haitao Jin: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China; Medical School of Chinese PLA, Beijing 100853, China; NCO School, Army Medical University, Shijiazhuang 050081, China
- Jiashu Zhang: Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China

8. Cao Z, Xiu Y, Yu D, Li X, Yang C, Li Z. Clinical Value of Mixed Reality-Assisted Puncture Navigation for Percutaneous Nephrolithotripsy. Urology 2023;176:219-225. PMID: 36921844; DOI: 10.1016/j.urology.2022.12.067.
Abstract
OBJECTIVE: To evaluate the clinical value of mixed reality-assisted puncture navigation (MRAPN) in percutaneous nephrolithotripsy (PCNL). METHODS: Two hundred patients with kidney stones undergoing PCNL were enrolled and grouped according to surgical procedure into an MRAPN group (n = 100) and a non-mixed reality-assisted puncture (non-MRAPN) group (n = 100). For all patients in the MRAPN group, CT data in DICOM format were imported into 3D reconstruction and mixed reality (MR) post-processing workstations for holographic 3D visualization modelling. Parameters such as operative time (OT), puncture time (PT), number of puncture attempts, and estimated blood loss (EBL) were compared, and a Likert scale was used to assess the clinical value of MRAPN. The Cohen κ coefficient was employed to evaluate consistency among assessors, and safety was assessed. RESULTS: There were no significant differences in patient demographics or preoperative general information between the MRAPN and non-MRAPN groups (P > .05). Subjective scores for surgical planning, intraoperative navigation, didactic guidance, and physician-patient communication were higher with MRAPN (all P < .001). The PT was significantly shorter in the MRAPN group (P < .001), with a shorter overall OT and lower EBL (P < .001). There were no significant differences in length of hospital stay or in preoperative or postoperative creatinine (all P > .05). CONCLUSION: MRAPN can safely and effectively improve the success of percutaneous renal puncture, reduce complications, and decrease the PT, OT, and EBL.
Affiliations
- Zhiqiang Cao: Department of Urology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China; Department of Burn and Plastic Surgery, General Hospital of Northern Theater Command, Shenyang, Liaoning, China
- Yiping Xiu: Department of Burn and Plastic Surgery, General Hospital of Northern Theater Command, Shenyang, Liaoning, China
- Dongyang Yu: Department of Urology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China
- Xinyang Li: Department of Urology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China
- Caleb Yang: Department of Nutritional Sciences and Toxicology, University of California, Berkeley, CA, USA
- Zhenhua Li: Department of Urology, Shengjing Hospital of China Medical University, Shenyang, Liaoning, China

9. Tzelnick S, Rampinelli V, Sahovaler A, Franz L, Chan HHL, Daly MJ, Irish JC. Skull-Base Surgery: A Narrative Review on Current Approaches and Future Developments in Surgical Navigation. J Clin Med 2023;12:2706. PMID: 37048788; PMCID: PMC10095207; DOI: 10.3390/jcm12072706.
Abstract
Surgical navigation technology combines patient imaging studies with intraoperative real-time data to improve surgical precision and patient outcomes. The navigation workflow can also include preoperative planning, which can reliably simulate the intended resection and reconstruction. The advantage of this approach in skull-base surgery is that it guides access into a complex three-dimensional area and orients tumors intraoperatively with regard to critical structures, such as the orbit, carotid artery, and brain. This enhances a surgeon's ability to preserve normal anatomy while resecting tumors with adequate margins. The aim of this narrative review is to outline the state of the art and future directions of surgical navigation in the skull base, focusing on the advantages and pitfalls of this technique. We also present our group's experience in this field within the frame of current research trends.
Affiliations
- Sharon Tzelnick: Division of Head and Neck Surgery, Princess Margaret Cancer Center, University of Toronto, Toronto, ON M5G 2M9, Canada; Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Vittorio Rampinelli: Unit of Otorhinolaryngology - Head and Neck Surgery, Department of Medical and Surgical Specialties, Radiologic Sciences and Public Health, University of Brescia, 25121 Brescia, Italy; Technology for Health (PhD Program), Department of Information Engineering, University of Brescia, 25121 Brescia, Italy
- Axel Sahovaler: Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada; Head & Neck Surgery Unit, University College London Hospitals, London NW1 2PG, UK
- Leonardo Franz: Department of Neuroscience DNS, Otolaryngology Section, University of Padova, 35122 Padua, Italy
- Harley H. L. Chan: Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Michael J. Daly: Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Jonathan C. Irish: Division of Head and Neck Surgery, Princess Margaret Cancer Center, University of Toronto, Toronto, ON M5G 2M9, Canada; Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada

10. Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023;85:102757. PMID: 36706637; DOI: 10.1016/j.media.2023.102757.
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main driver of the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, for whom AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability, and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, paving the way for novel, innovative directions and translation into the medical routine.
Affiliations
- Christina Gsaxner: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li: Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek: Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany

11. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023;68. PMID: 36580681; DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization is divided into two categories, in situ visualization and non-in situ visualization, with varied rendering content. Registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. Tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in different surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
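Of the registration families listed above, point-based rigid registration is the most compact to illustrate. A minimal sketch of the standard SVD-based (Kabsch/Horn-style) least-squares solution, using hypothetical fiducial coordinates:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points.

    src, dst : (N, 3) arrays of corresponding fiducial positions.
    Returns R (3x3 rotation) and t (3,) such that dst ≈ src @ R.T + t.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical fiducials: dst is src rotated 90 degrees about z and shifted.
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
dst = src @ Rz.T + np.array([5., 0., 0.])
R, t = rigid_registration(src, dst)
fre = np.linalg.norm(src @ R.T + t - dst, axis=1).mean()  # fiducial registration error
print(np.round(R, 3), np.round(t, 3), round(fre, 6))
```

The residual computed at the end is the fiducial registration error; the clinically relevant target registration error is measured at points away from the fiducials.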
Affiliations
- Longfei Ma, Tianqi Huang, Jie Wang, Hongen Liao: Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, People's Republic of China

12. Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023;61:19-27. PMID: 36513525; DOI: 10.1016/j.bjoms.2022.08.007.
Abstract
Augmented-reality (AR) head-mounted devices (HMD) allow the wearer to have digital images superimposed onto their field of vision. They are being used to superimpose annotations onto the surgical field, akin to a navigation system. This review examines published validation studies on HMD-AR systems, their reported protocols, and their outcomes, with the aim of establishing commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles published between January 2015 and January 2021. Studies that examined the registration of AR content using an HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration were recorded, and a meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using the HoloLens (Microsoft) (n = 22) and the nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard-tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), four reported registration errors using surface markers (mean (SD) 3.8 (3.7) mm), and three reported registration errors using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies in this review used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered a minimum acceptable standard and should be taken into consideration when procedural applications are selected.
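The mean (SD) figures above are pooled across studies. A minimal sketch of one standard way to merge study-level means and SDs into a single group estimate (the inputs are hypothetical, not the review's data, and the review's actual meta-analytic weighting may differ):

```python
import numpy as np

def pooled_mean_sd(means, sds, ns):
    """Combine per-study means/SDs into one group mean (SD).

    Uses the standard formula for merging group statistics:
    pooled variance = (within-group SS + between-group SS) / (N - 1).
    """
    means, sds, ns = map(np.asarray, (means, sds, ns))
    N = ns.sum()
    grand_mean = np.sum(ns * means) / N
    ss_within = np.sum((ns - 1) * sds**2)
    ss_between = np.sum(ns * (means - grand_mean)**2)
    return grand_mean, np.sqrt((ss_within + ss_between) / (N - 1))

# Hypothetical registration errors (mm) from three phantom studies.
print(pooled_mean_sd(means=[2.1, 3.0, 2.6], sds=[0.8, 1.2, 1.0], ns=[10, 15, 12]))
```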
Affiliations
- Soudeh Chegini, Eddie Edwards, Mark McGurk, Matthew Clarkson, Clare Schilling: University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom

13. Bagher Zadeh Ansari N, Léger É, Kersten-Oertel M. VentroAR: an augmented reality platform for ventriculostomy using the Microsoft HoloLens. Comput Methods Biomech Biomed Eng Imaging Vis 2022. DOI: 10.1080/21681163.2022.2156394.
Affiliations
- Étienne Léger: Department of Computer Science and Software Engineering, Concordia University, Montreal, QC, Canada
- Marta Kersten-Oertel: Department of Computer Science and Software Engineering, Concordia University, Montreal, QC, Canada; PERFORM Centre, Concordia University, Montreal, QC, Canada

14. Meulstee J, Bussink T, Delye H, Xi T, Borstlap W, Maal T. Surgical guides versus augmented reality to transfer a virtual surgical plan for open cranial vault reconstruction: A pilot study. Adv Oral Maxillofac Surg 2022. DOI: 10.1016/j.adoms.2022.100334.

15. Gupta A, Ambade R. From Diagnosis to Therapy: The Role of Virtual and Augmented Reality in Orthopaedic Trauma Surgery. Cureus 2022;14:e29099. PMID: 36249662; PMCID: PMC9557249; DOI: 10.7759/cureus.29099.
Abstract
Advancements in computer-assisted surgery (CAS) and surgical training aim to boost operative precision and enhance patient safety by reducing procedure-related problems. Orthopaedic training and practice have started to change as a result of the incorporation of reality technologies like virtual reality (VR), augmented reality (AR), and mixed reality (MR) into CAS. Today's trainees can engage in realistic and highly interactive operative simulations without supervision, and the coronavirus disease 2019 (COVID-19) pandemic has increased the need for the adoption of such breakthrough technologies. VR is an interactive technology that enables personalised care and could support effective patient-centered rehabilitation. It is a valid and trustworthy evaluation method for determining joint range of motion, function, and balance in physical rehabilitation, and it may make it possible to customise care, motivate patients, boost compliance, and track their progress. AR supplementation in orthopaedic surgery has shown promising results in pre-clinical settings, with improvements in surgical accuracy and reproducibility, decreased operating times, and less radiation exposure. Because less direct patient observation is needed, this may lessen the workload clinicians must bear; commercially available systems also often support home-based therapy. The objectives of this review are to evaluate the available technology, understand the evidence regarding its benefit, and consider the problems of implementation in clinical practice. The use of this technology, its practical and moral ramifications, and how it will affect orthopaedic doctors and their patients are also covered. This review offers a current and thorough analysis of reality technologies and their uses in orthopaedic surgery.

16. Multicenter assessment of augmented reality registration methods for image-guided interventions. Radiol Med 2022;127:857-865. DOI: 10.1007/s11547-022-01515-3.

17. Grunbeck IA, Teatini A, Kumar RP, Elle OJ, Wiig O. Evaluation and Comparison of Target Registration Error in Active and Passive Optical Tracking Systems. Annu Int Conf IEEE Eng Med Biol Soc 2022;2022:3476-3480. PMID: 36085841; DOI: 10.1109/embc48229.2022.9871591.
Abstract
Optical tracking systems combined with imaging modalities such as computed tomography and magnetic resonance imaging are important parts of image guided surgery systems. By determining the location and orientation of surgical tools relative to a patient's reference system, tracking systems assist surgeons during the planning and execution of image guided procedures. Therefore, knowledge of the tracking system-induced error is of great importance. To this end, this study compared one passive and two active optical tracking systems in terms of their Target Registration Error. Two experiments were performed to measure the systems' accuracy, testing the impact of factors such as the size of the measuring volume, length of surgical instruments and environmental conditions with orthopedic procedures in mind. According to the performed experiments, the active systems achieved significantly higher accuracy than the tested passive system, reporting an overall accuracy of 0.063 mm (SD = 0.025) and 0.259 mm (SD = 0.152), respectively.
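A minimal sketch of how such an accuracy comparison can be tested statistically, using Welch's t-test on simulated TRE samples seeded with the means and SDs quoted above (sample sizes and distributional assumptions are illustrative only, not the study's protocol):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated repeated TRE measurements (mm), seeded with the reported figures.
tre_active = rng.normal(loc=0.063, scale=0.025, size=30).clip(min=0)
tre_passive = rng.normal(loc=0.259, scale=0.152, size=30).clip(min=0)

# Welch's t-test: does not assume equal variances across the two systems.
t_stat, p_value = stats.ttest_ind(tre_active, tre_passive, equal_var=False)
print(f"active mean: {tre_active.mean():.3f} mm, passive mean: {tre_passive.mean():.3f} mm")
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```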

18. von Haxthausen F, Moreta-Martinez R, Pose Díez de la Lastra A, Pascau J, Ernst F. UltrARsound: in situ visualization of live ultrasound images using HoloLens 2. Int J Comput Assist Radiol Surg 2022;17:2081-2091. PMID: 35776399; PMCID: PMC9515035; DOI: 10.1007/s11548-022-02695-z.
Abstract
Purpose: Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach to track retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy. Methods: The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy, and the frequency and latency of displayed images. Results: Tracking is performed with a median accuracy of 1.98 mm/1.81° for the static setting when using the Kalman filter. In a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms. Conclusions: In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. For tracking, no additional hardware nor modifications to HoloLens 2 are required, making it a cheap and easy-to-use approach. Moreover, a minimal latency of displayed images enables real-time perception for the sonographer.
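A minimal sketch of the kind of Kalman filtering described for smoothing noisy marker positions, here with a constant-velocity motion model; the time step and noise covariances are assumptions for illustration, not values from the paper:

```python
import numpy as np

class ConstantVelocityKalman:
    """Kalman filter for a 3D position with a constant-velocity motion model."""

    def __init__(self, dt=0.05, process_var=1.0, meas_var=4.0):
        self.x = np.zeros(6)                       # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 100.0                 # large initial uncertainty
        self.F = np.eye(6)
        self.F[:3, 3:] = np.eye(3) * dt            # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only position is measured
        self.Q = np.eye(6) * process_var           # process noise covariance
        self.R = np.eye(3) * meas_var              # measurement noise covariance

    def update(self, z):
        # Predict step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct step with measurement z (3D position in mm).
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                          # filtered position estimate

# Hypothetical noisy depth-camera measurements of a marker moving along x.
rng = np.random.default_rng(1)
kf = ConstantVelocityKalman()
for k in range(5):
    true_pos = np.array([10.0 + 2.0 * k, 0.0, 50.0])
    print(np.round(kf.update(true_pos + rng.normal(0, 2.0, 3)), 2))
```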
Affiliations
- Felix von Haxthausen: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain; Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Schleswig-Holstein, Germany
- Rafael Moreta-Martinez: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain
- Alicia Pose Díez de la Lastra: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain
- Javier Pascau: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Spain
- Floris Ernst: Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562 Lübeck, Schleswig-Holstein, Germany

19. Pose-Díez-de-la-Lastra A, Moreta-Martinez R, García-Sevilla M, García-Mato D, Calvo-Haro JA, Mediavilla-Santos L, Pérez-Mañanes R, von Haxthausen F, Pascau J. HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions. Sensors (Basel) 2022;22:4915. PMID: 35808407; PMCID: PMC9269857; DOI: 10.3390/s22134915.
Abstract
This work analyzed the use of Microsoft HoloLens 2 in orthopedic oncological surgeries and compared it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. These contained a small adaptor for a 3D-printed AR marker whose characteristic patterns were easily recognized by both Microsoft HoloLens devices. The newer model improved the AR projection accuracy by almost 25%, and both devices yielded an RMSE below 3 mm. After ascertaining this improvement in the second model, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons' feedback on comfort, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newer model facilitate its implementation in actual surgical scenarios. All of the results indicate that Microsoft HoloLens 2 is better in all aspects affecting surgical interventions and support its use in future experiences.
Affiliations
- Alicia Pose-Díez-de-la-Lastra: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Rafael Moreta-Martinez: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- Mónica García-Sevilla: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- David García-Mato: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain
- José Antonio Calvo-Haro: Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain; Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Lydia Mediavilla-Santos: Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain; Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Rubén Pérez-Mañanes: Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain; Servicio de Cirugía Ortopédica y Traumatología, Hospital General Universitario Gregorio Marañón, 28007 Madrid, Spain; Departamento de Cirugía, Facultad de Medicina, Universidad Complutense de Madrid, 28040 Madrid, Spain
- Felix von Haxthausen: Institute for Robotics and Cognitive Systems, University of Lübeck, 23562 Lübeck, Germany
- Javier Pascau: Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, 28911 Leganés, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, 28007 Madrid, Spain

20. Role and Utility of Mixed Reality Technology in Laparoscopic Partial Nephrectomy: Outcomes of a Prospective RCT Using an Indigenously Developed Software. Adv Urol 2022;2022:8992051. PMID: 35615077; PMCID: PMC9126718; DOI: 10.1155/2022/8992051.
Abstract
Objective: To develop software for mixed reality (MR) anatomical model creation and to study its intraoperative clinical utility in facilitating laparoscopic partial nephrectomy (LPN). Materials and Methods: After institutional review board approval, 47 patients were prospectively randomized for LPN into two groups: the control group (24 patients) underwent the operation with intraoperative ultrasound (US) guidance, and the experimental group (23 patients) with the smart glasses HoloLens 2 (Microsoft, Seattle, WA, USA). Our team developed an open-source software package called "HLOIA," which allowed an MR anatomical model of the kidney with its vascular pedicle and tumor to be created and used during surgery. The study period extended from June 2020 to February 2021, during which demographic, perioperative, and pathological data were collected for all qualifying patients. The utility of the MR model during LPN was assessed through a 5-point Likert scale questionnaire completed by the surgeon immediately after LPN. Patient characteristics were tested using the chi-square test for categorical variables and Student's t-test or the Mann-Whitney test for continuous variables. Results: Comparison of the variables between the groups revealed statistically significant differences only in the time for renal pedicle exposure and the time from renal pedicle exposure to detection of the tumor localization (p < 0.001), both in favor of the experimental group. The surgeon's impression of the utility of the MR model, assessed with the proposed questionnaire, showed high scores for all statements. Conclusions: The developed open-source software "HLOIA" allowed the operating urologist to create a mixed reality anatomical model which, when used with smart glasses, improved the time for renal pedicle exposure and the time for renal tumor identification without compromising safety.

21. Real-time augmented reality application in presurgical planning and lesion scalp localization by a smartphone. Acta Neurochir (Wien) 2022;164:1069-1078. PMID: 34448914; DOI: 10.1007/s00701-021-04968-z.
Abstract
OBJECTIVE: A smartphone augmented reality (AR) application (app) was explored for clinical use in presurgical planning and lesion scalp localization. METHODS: We programmed an AR app on a smartphone. The accuracy of the AR app was tested on a 3D-printed head model using the Euclidean distance of displacement of virtual objects. For clinical validation, 14 patients with brain tumors were included in the study. Preoperative MRI images were used to generate 3D models for the AR content, which were then transferred to the smartphone AR app. Tumor scalp localization was marked and a surgical corridor was planned on the patient's head by viewing AR images on the smartphone screen. Standard neuronavigation was applied to evaluate the accuracy of the smartphone. Max-margin distance (MMD) and area overlap ratio (AOR) were measured to quantitatively validate the clinical accuracy of the smartphone AR technique. RESULTS: In the model validation, the total mean Euclidean distance of virtual object displacement using the smartphone AR app was 4.7 ± 2.3 mm. In the clinical validation, the mean duration of AR app usage was 168.5 ± 73.9 s, the total mean MMD was 6.7 ± 3.7 mm, and the total mean AOR was 79%. CONCLUSIONS: The smartphone AR app provides a new way to observe intracranial anatomy in situ and makes surgical planning more intuitive and efficient. Localization accuracy was satisfactory for lesions larger than 15 mm.
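A minimal sketch of an area overlap ratio computation on rasterized scalp projections; the masks are hypothetical, and defining AOR as overlap area divided by the reference (neuronavigation) area is an assumption about the metric, not the paper's stated formula:

```python
import numpy as np

def area_overlap_ratio(mask_ar, mask_nav):
    """Overlap of the AR-projected lesion outline with the neuronavigation one.

    mask_ar, mask_nav : 2D boolean arrays, rasterized scalp projections.
    Returns the overlapping area as a fraction of the reference (navigation) area.
    """
    overlap = np.logical_and(mask_ar, mask_nav).sum()
    return overlap / mask_nav.sum()

# Hypothetical 100x100 masks: two circular projections offset by a few pixels.
yy, xx = np.mgrid[:100, :100]
nav = (xx - 50) ** 2 + (yy - 50) ** 2 <= 20 ** 2
ar = (xx - 54) ** 2 + (yy - 50) ** 2 <= 20 ** 2
print(f"AOR = {area_overlap_ratio(ar, nav):.0%}")
```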

22. Projected cutting guides using an augmented reality system to improve surgical margins in maxillectomies: A preclinical study. Oral Oncol 2022;127:105775. DOI: 10.1016/j.oraloncology.2022.105775.
23
|
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103 PMCID: PMC10466024 DOI: 10.1016/j.media.2022.102361] [Citation(s) in RCA: 19] [Impact Index Per Article: 9.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2021] [Revised: 11/17/2021] [Accepted: 01/10/2022] [Indexed: 12/11/2022]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Collapse
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
| | - P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| |
Collapse
|
24
|
Uhl C, Hatzl J, Meisenbacher K, Zimmer L, Hartmann N, Böckler D. Mixed-Reality-Assisted Puncture of the Common Femoral Artery in a Phantom Model. J Imaging 2022; 8:jimaging8020047. [PMID: 35200749 PMCID: PMC8874567 DOI: 10.3390/jimaging8020047] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2021] [Revised: 02/12/2022] [Accepted: 02/14/2022] [Indexed: 12/15/2022] Open
Abstract
Percutaneous femoral arterial access is daily practice in a variety of medical specialties and enables physicians worldwide to perform endovascular interventions. The reported incidence of percutaneous femoral arterial access complications is 3–18% and often results from suboptimal puncture location due to insufficient visualization of the target vessel. The purpose of this proof-of-concept study was to evaluate the feasibility and the positional error of a mixed-reality (MR)-assisted puncture of the common femoral artery in a phantom model using a commercially available navigation system. In total, 15 MR-assisted punctures were performed. Cone-beam computed tomography angiography (CTA) was used following each puncture to allow quantification of positional error of needle placements in the axial and sagittal planes. Technical success was achieved in 14/15 cases (93.3%) with a median axial positional error of 1.0 mm (IQR 1.3) and a median sagittal positional error of 1.1 mm (IQR 1.6). The median duration of the registration process and needle insertion was 2 min (IQR 1.0). MR-assisted puncture of the common femoral artery is feasible with acceptable positional errors in a phantom model. Future studies should aim to measure and reduce the positional error resulting from MR registration.
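The axial and sagittal error components reported as median (IQR) above can be illustrated with a short sketch; which vector components map to which imaging plane is an assumption here, and the offsets are synthetic:

```python
# Decompose 3D needle-tip offsets into plane-wise components and summarise
# each as median (IQR), mirroring the style of reporting above.
import numpy as np

# Hypothetical tip-to-target offsets (x, y, z) in mm from repeated punctures
offsets = np.array([[0.8, 0.6, 1.2], [1.5, -0.4, 0.9],
                    [-0.7, 1.1, 1.8], [0.3, 0.9, -1.0]])

def median_iqr(values):
    q1, med, q3 = np.percentile(values, [25, 50, 75])
    return med, q3 - q1

axial = np.linalg.norm(offsets[:, :2], axis=1)  # assumed in-plane (x, y) error
sagittal = np.abs(offsets[:, 2])                # assumed depth (z) error

for name, err in (("axial", axial), ("sagittal", sagittal)):
    med, iqr = median_iqr(err)
    print(f"{name} positional error: median {med:.1f} mm (IQR {iqr:.1f})")
```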
Collapse
|
25
|
Architecture of a Hybrid Video/Optical See-through Head-Mounted Display-Based Augmented Reality Surgical Navigation Platform. INFORMATION 2022. [DOI: 10.3390/info13020081] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/11/2022] Open
Abstract
In the context of image-guided surgery, augmented reality (AR) represents an enticing, potentially ground-breaking improvement, especially when paired with wearability in the case of open surgery. Commercially available AR head-mounted displays (HMDs), designed for general purposes, are increasingly used outside their indications to develop surgical guidance applications, with the ambition of demonstrating the potential of AR in surgery. The applications proposed in the literature underline the appetite for AR guidance in the operating room, together with the limitations that prevent commercial HMDs from fully answering that need. The medical domain demands specifically developed devices that address, together with ergonomics, the achievement of surgical accuracy objectives and compliance with medical device regulations. In the framework of an EU Horizon 2020 project, a hybrid video and optical see-through augmented reality headset, paired with a software architecture, both specifically designed to be seamlessly integrated into the surgical workflow, has been developed. In this paper, the overall architecture of the system is described. The developed AR HMD surgical navigation platform was positively tested on seven patients to aid the surgeon while performing Le Fort 1 osteotomy in cranio-maxillofacial surgery, demonstrating the value of the hybrid approach and the safety and usability of the navigation platform.
Collapse
|
26
|
Bussink T, Maal T, Meulstee J, Xi T. Augmented reality guided condylectomy. Br J Oral Maxillofac Surg 2022; 60:991-993. [DOI: 10.1016/j.bjoms.2022.01.008] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/03/2022] [Revised: 01/21/2022] [Accepted: 01/26/2022] [Indexed: 10/19/2022]
|
27
|
Cercenelli L, Babini F, Badiali G, Battaglia S, Tarsitano A, Marchetti C, Marcelli E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front Oncol 2022; 11:804748. [PMID: 35071009 PMCID: PMC8770836 DOI: 10.3389/fonc.2021.804748] [Citation(s) in RCA: 12] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2021] [Accepted: 12/10/2021] [Indexed: 11/13/2022] Open
Abstract
Background Augmented Reality (AR) represents an evolution of navigation-assisted surgery, providing surgeons with a virtual aid contextually merged with the real surgical field. We recently reported a case series of AR-assisted fibular flap harvesting for mandibular reconstruction. However, the registration accuracy between the real and the virtual content needs to be systematically evaluated before widely promoting this tool in clinical practice. In this paper, after describing the AR-based protocol implemented for both a tablet and HoloLens 2 smart glasses, we evaluated in a first test session the achievable registration accuracy with the two display solutions, and in a second test session the success rate in executing the AR-guided skin paddle incision task on a 3D-printed leg phantom. Methods From a real computed tomography dataset, 3D virtual models of a human leg, including fibula, arteries and skin with the planned paddle profile for harvesting, were obtained. All virtual models were imported into Unity software to develop a marker-less AR application suitable for use both via tablet and via the HoloLens 2 headset. The registration accuracy for both solutions was verified on a 3D-printed leg phantom obtained from the virtual models, by repeatedly applying the tracking function and computing pose deviations between the AR-projected virtual skin paddle profile and the real one transferred to the phantom via a CAD/CAM cutting guide. The success rate in completing the AR-guided task of skin paddle harvesting was evaluated using CAD/CAM templates positioned on the phantom model surface. Results On average, the marker-less AR protocol showed comparable registration errors (ranging within 1-5 mm) for the tablet-based and HoloLens-based solutions. Registration accuracy seems to be quite sensitive to ambient light conditions. We found a good success rate in completing the AR-guided task within an error margin of 4 mm (97% and 100% for tablet and HoloLens, respectively). All subjects reported greater usability and ergonomics for the HoloLens 2 solution. Conclusions Results revealed that the proposed marker-less AR-based protocol may guarantee a registration error within 1-5 mm for assisting skin paddle harvesting in the clinical setting. Optimal lighting conditions and further improvement of marker-less tracking technologies have the potential to increase the efficiency and precision of this AR-assisted reconstructive surgery.
Collapse
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
| | - Federico Babini
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
| | - Giovanni Badiali
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
| | - Salvatore Battaglia
- Maxillofacial Surgery Unit, Policlinico San Marco University Hospital, University of Catania, Catania, Italy
| | - Achille Tarsitano
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
| | - Claudio Marchetti
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
| | - Emanuela Marcelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
| |
Collapse
|
28
|
García-Sevilla M, Moreta-Martinez R, García-Mato D, Arenas de Frutos G, Ochandiano S, Navarro-Cuéllar C, Sanjuán de Moreta G, Pascau J. Surgical Navigation, Augmented Reality, and 3D Printing for Hard Palate Adenoid Cystic Carcinoma En-Bloc Resection: Case Report and Literature Review. Front Oncol 2022; 11:741191. [PMID: 35059309 PMCID: PMC8763795 DOI: 10.3389/fonc.2021.741191] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2021] [Accepted: 11/26/2021] [Indexed: 12/18/2022] Open
Abstract
Adenoid cystic carcinoma is a rare and aggressive tumor representing less than 1% of head and neck cancers. This malignancy often arises from the minor salivary glands, with the palate being its most common location. Surgical en-bloc resection with clear margins is the primary treatment. However, this location presents a limited line of sight and a high risk of injuries, making the surgical procedure challenging. In this context, technologies such as intraoperative navigation can become an effective tool, reducing morbidity and improving the safety and accuracy of the procedure. Although their use is widespread in fields such as neurosurgery, their application in maxillofacial surgery has not been widely evidenced. One reason is the need to rigidly fixate a navigation reference to the patient, which often entails an invasive setup. In this work, we studied three alternative and less invasive setups using optical tracking, 3D printing and augmented reality. We evaluated their precision in a patient-specific phantom, obtaining errors below 1 mm. The optimum setup was finally applied in a clinical case, where the navigation software was used to guide the tumor resection. Points were collected along the surgical margins after resection and compared with the real ones identified in the postoperative CT. Distances of less than 2 mm were obtained in 90% of the samples. Moreover, the navigation provided confidence to the surgeons, who could then undertake a less invasive and more conservative approach. The postoperative CT scans showed adequate resection margins and confirmed that the patient is free of disease after two years of follow-up.
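The margin check described above, comparing points collected along the resection margins with those identified on the postoperative CT, can be sketched as a nearest-neighbour distance query; the KD-tree approach and all coordinates are illustrative assumptions:

```python
# Fraction of navigated margin samples lying within 2 mm of the margin
# identified on postoperative CT, via nearest-neighbour distances.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
ct_margin = rng.uniform(0, 50, size=(500, 3))              # hypothetical CT margin points (mm)
collected = ct_margin[:40] + rng.normal(0, 1.0, (40, 3))   # navigated samples near them

dist, _ = cKDTree(ct_margin).query(collected)
print(f"{(dist < 2.0).mean():.0%} of samples within 2 mm of the CT margin")
```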
Collapse
Affiliation(s)
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain.,Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
| | - Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain.,Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
| | - David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain.,Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
| | - Gema Arenas de Frutos
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain.,Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
| | - Santiago Ochandiano
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain.,Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
| | - Carlos Navarro-Cuéllar
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain.,Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
| | - Guillermo Sanjuán de Moreta
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain.,Servicio de Otorrinolaringología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
| | - Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain.,Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
| |
Collapse
|
29
|
Bori E, Pancani S, Vigliotta S, Innocenti B. Validation and accuracy evaluation of automatic segmentation for knee joint pre-planning. Knee 2021; 33:275-281. [PMID: 34739958 DOI: 10.1016/j.knee.2021.10.016] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/07/2021] [Revised: 09/28/2021] [Accepted: 10/12/2021] [Indexed: 02/02/2023]
Abstract
BACKGROUND The proper use of three-dimensional (3D) models generated from medical imaging data in clinical preoperative planning, training and consultation depends on first demonstrating that the models replicate the patient's anatomy accurately. Therefore, this study investigated the dimensional accuracy of 3D reconstructions of the knee joint generated from computed tomography scans via automatic segmentation by comparing them with 3D models generated through manual segmentation. METHODS Three unpaired, fresh-frozen right legs were investigated. Three-dimensional models of the femur and the tibia of each leg were manually segmented using commercial software and compared in terms of geometrical accuracy with the 3D models automatically segmented using proprietary software. Bony landmarks were identified and used to calculate clinically relevant distances: femoral epicondylar distance; posterior femoral epicondylar distance; femoral trochlear groove length; and tibial knee center tubercle distance (TKCTD). Pearson's correlation coefficient and Bland–Altman plots were used to evaluate the level of agreement between measured distances. RESULTS Differences between parameters measured on 3D models manually and automatically segmented were below 1 mm (range: -0.06 to 0.72 mm), except for TKCTD (between 1.00 and 1.40 mm in two specimens). In addition, there was a significant, strong correlation between measurements. CONCLUSIONS The results obtained are comparable to those reported in previous studies investigating the accuracy of bone 3D reconstruction. Automatic segmentation techniques can be used to quickly reconstruct reliable 3D models of bone anatomy, and these results may contribute to the spread of this technology in preoperative and operative settings, where it has shown considerable potential.
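A minimal sketch of the agreement analysis named above: Pearson's correlation plus Bland–Altman bias and 95% limits of agreement between paired manual and automatic measurements. The measurement values are invented for illustration:

```python
# Pearson correlation and Bland-Altman bias / limits of agreement between
# manually and automatically segmented distance measurements (synthetic data).
import numpy as np
from scipy import stats

manual = np.array([75.2, 81.0, 78.4, 44.1, 46.0, 45.3])      # mm, hypothetical
automatic = np.array([75.6, 81.5, 78.9, 44.9, 46.4, 46.1])

r, p = stats.pearsonr(manual, automatic)
diff = automatic - manual
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"Pearson r = {r:.3f} (p = {p:.4f})")
print(f"bias = {bias:.2f} mm, 95% LoA = [{bias - loa:.2f}, {bias + loa:.2f}] mm")
```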
Collapse
Affiliation(s)
- Edoardo Bori
- BEAMS Department, Université Libre de Bruxelles, Bruxelles, Belgium.
| | | | | | | |
Collapse
|
30
|
Sahovaler A, Chan HHL, Gualtieri T, Daly M, Ferrari M, Vannelli C, Eu D, Manojlovic-Kolarski M, Orzell S, Taboni S, de Almeida JR, Goldstein DP, Deganello A, Nicolai P, Gilbert RW, Irish JC. Augmented Reality and Intraoperative Navigation in Sinonasal Malignancies: A Preclinical Study. Front Oncol 2021; 11:723509. [PMID: 34790568 PMCID: PMC8591179 DOI: 10.3389/fonc.2021.723509] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2021] [Accepted: 10/12/2021] [Indexed: 11/13/2022] Open
Abstract
Objective To report the first use of a novel projected augmented reality (AR) system in open sinonasal tumor resections in preclinical models and to compare the AR approach with an advanced intraoperative navigation (IN) system. Methods Four tumor models were created. Five head and neck surgeons participated in the study, performing virtual osteotomies. Unguided, AR, IN, and AR + IN simulations were performed. Statistical comparisons between approaches were obtained. Intratumoral cut rate was the main outcome. The groups were also compared in terms of the percentage of intratumoral, close, adequate, and excessive distances from the tumor. Data from a wearable gaze-tracking headset and NASA Task Load Index questionnaire results were analyzed as well. Results A total of 335 cuts were simulated. Intratumoral cuts were observed in 20.7%, 9.4%, 1.2%, and 0% of the unguided, AR, IN, and AR + IN simulations, respectively (p < 0.0001). AR was superior to the unguided approach in univariate and multivariate models. The percentage of time looking at the screen during the procedures was 55.5% for the unguided approaches and 0%, 78.5%, and 61.8% in AR, IN, and AR + IN, respectively (p < 0.001). The combined approach significantly reduced screen time compared with the IN procedure alone. Conclusion We reported the use of a novel AR system for oncological resections in open sinonasal approaches, with improved margin delineation compared with unguided techniques. AR improved the gaze-toggling drawback of IN. Further refinements of the AR system are needed before translating our experience to clinical practice.
Collapse
Affiliation(s)
- Axel Sahovaler
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Harley H L Chan
- Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Tommaso Gualtieri
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada.,Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST "Spedali Civili di Brescia," Brescia, Italy
| | - Michael Daly
- Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Marco Ferrari
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada.,Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST "Spedali Civili di Brescia," Brescia, Italy.,Section of Otorhinolaryngology-Head and Neck Surgery, University of Padua-Azienda Ospedaliera di Padova, Padua, Italy
| | - Claire Vannelli
- Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Donovan Eu
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Mirko Manojlovic-Kolarski
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Susannah Orzell
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Stefano Taboni
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada.,Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST "Spedali Civili di Brescia," Brescia, Italy.,Section of Otorhinolaryngology-Head and Neck Surgery, University of Padua-Azienda Ospedaliera di Padova, Padua, Italy
| | - John R de Almeida
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - David P Goldstein
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Alberto Deganello
- Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST "Spedali Civili di Brescia," Brescia, Italy
| | - Piero Nicolai
- Section of Otorhinolaryngology-Head and Neck Surgery, University of Padua-Azienda Ospedaliera di Padova, Padua, Italy
| | - Ralph W Gilbert
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Jonathan C Irish
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| |
Collapse
|
31
|
VeLight: A 3D virtual reality tool for CT-based anatomy teaching and training. J Vis (Tokyo) 2021. [DOI: 10.1007/s12650-021-00790-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
|
32
|
Glas HH, Kraeima J, van Ooijen PMA, Spijkervet FKL, Yu L, Witjes MJH. Augmented Reality Visualization for Image-Guided Surgery: A Validation Study Using a Three-Dimensional Printed Phantom. J Oral Maxillofac Surg 2021; 79:1943.e1-1943.e10. [PMID: 34033801 DOI: 10.1016/j.joms.2021.04.001] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/31/2020] [Revised: 04/01/2021] [Accepted: 04/01/2021] [Indexed: 01/21/2023]
Abstract
BACKGROUND Oral and maxillofacial surgery currently relies on virtual surgery planning based on image data (CT, MRI). Three-dimensional (3D) visualizations are typically used to plan and predict the outcome of complex surgical procedures. To translate the virtual surgical plan to the operating room, it is either converted into physical 3D-printed guides or directly translated using real-time navigation systems. PURPOSE This study aims to improve the translation of the virtual surgery plan to a surgical procedure, such as oncologic or trauma surgery, in terms of accuracy and speed. Here we report an augmented reality visualization technique for image-guided surgery. It describes how surgeons can visualize and interact with the virtual surgery plan and navigation data while in the operating room. The user friendliness and usability were objectified in a formal user study that compared our augmented reality assisted technique to the gold standard setup of a perioperative navigation system (Brainlab). Moreover, the accuracy of typical navigation tasks, such as reaching landmarks and following trajectories, was compared. RESULTS Overall completion time of navigation tasks was 1.71 times faster using augmented reality (P = .034). Accuracy improved significantly using augmented reality (P < .001); for reaching physical landmarks, the effect was weaker (P = .087). Although the participants were relatively unfamiliar with VR/AR (rated 2.25/5) and gesture-based interaction (rated 2/5), they reported that navigation tasks become easier to perform using augmented reality (difficulty Brainlab rated 3.25/5, HoloLens 2.4/5). CONCLUSION The proposed workflow can be used in a wide range of image-guided surgery procedures as an addition to existing verified image guidance systems. The results of this user study imply that our technique enables typical navigation tasks to be performed faster and more accurately compared to the current gold standard. In addition, qualitative feedback on our augmented reality assisted technique was more positive compared to the standard setup.
Collapse
Affiliation(s)
- H H Glas
- Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands.
| | - J Kraeima
- Technical Physician, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
| | - P M A van Ooijen
- Associate Professor Faculty of Medical Sciences, Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
| | - F K L Spijkervet
- Professor, Oral and Maxillofacial Surgeon, Head of the Department, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
| | - L Yu
- Lecturer in the Department of Computer Science and Software Engineering (CSSE), Department of Radiology, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
| | - M J H Witjes
- Oral and Maxillofacial Surgeon, Principal Investigator, Department of Oral & Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
| |
Collapse
|
33
|
Lareyre F, Chaudhuri A, Adam C, Carrier M, Mialhe C, Raffort J. Applications of Head-Mounted Displays and Smart Glasses in Vascular Surgery. Ann Vasc Surg 2021; 75:497-512. [PMID: 33823254 DOI: 10.1016/j.avsg.2021.02.033] [Citation(s) in RCA: 7] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/26/2021] [Revised: 02/22/2021] [Accepted: 02/25/2021] [Indexed: 12/11/2022]
Abstract
OBJECTIVES Advances in virtual, augmented and mixed reality have led to the development of wearable technologies including head mounted displays (HMD) and smart glasses. While there is growing interest in their potential applications in health, only a few studies have so far addressed their use in vascular surgery. The aim of this review was to summarize the fundamental notions associated with these technologies and to discuss potential applications and current limits for their use in vascular surgery. METHODS A comprehensive literature review was performed to introduce the fundamental concepts and provide an overview of applications of HMD and smart glasses in surgery. RESULTS HMD and smart glasses demonstrated potential value for the education of surgeons, including anatomical teaching, surgical training, teaching and telementoring. Applications for pre-surgical planning have been developed in general and cardiac surgery and could be transposed for use in vascular surgery. The use of wearable technologies in the operating room has also been investigated in both general and cardiovascular surgery and has demonstrated potential value for image-guided surgery and data collection. CONCLUSION Studies performed so far represent a proof of concept of the value of HMD and smart glasses in vascular surgery, both for the education of surgeons and for surgical practice. Although these technologies have shown encouraging results for applications in vascular surgery, technical improvements and further clinical research in large series are required before they can be adopted in daily clinical practice.
Collapse
Affiliation(s)
- Fabien Lareyre
- Department of Vascular Surgery, Hospital of Antibes-Juan-les-Pins, France; Université Côte d'Azur, CHU, Inserm U1065, C3M, Nice, France.
| | - Arindam Chaudhuri
- Bedfordshire-Milton Keynes Vascular Centre, Bedfordshire Hospitals NHS Foundation Trust, Bedford, UK
| | - Cédric Adam
- Laboratory of Applied Mathematics and Computer Science (MICS), CentraleSupélec, Université Paris-Saclay, France
| | - Marion Carrier
- Laboratory of Applied Mathematics and Computer Science (MICS), CentraleSupélec, Université Paris-Saclay, France
| | - Claude Mialhe
- Cardiovascular Surgery Unit, Cardio Thoracic Centre of Monaco, Monaco
| | - Juliette Raffort
- Université Côte d'Azur, CHU, Inserm U1065, C3M, Nice, France; Clinical Chemistry Laboratory, University Hospital of Nice, France
| |
Collapse
|
34
|
Fick T, van Doormaal JAM, Hoving EW, Regli L, van Doormaal TPC. Holographic patient tracking after bed movement for augmented reality neuronavigation using a head-mounted display. Acta Neurochir (Wien) 2021; 163:879-884. [PMID: 33515122 PMCID: PMC7966201 DOI: 10.1007/s00701-021-04707-4] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/26/2020] [Accepted: 01/04/2021] [Indexed: 11/27/2022]
Abstract
BACKGROUND Holographic neuronavigation has several potential advantages compared to conventional neuronavigation systems. We present the first report of a holographic neuronavigation system with patient-to-image registration and patient tracking with a reference array using an augmented reality head-mounted display (AR-HMD). METHODS Three patients undergoing an intracranial neurosurgical procedure were included in this pilot study. The relevant anatomy was first segmented in 3D and then uploaded as holographic scene in our custom neuronavigation software. Registration was performed using point-based matching using anatomical landmarks. We measured the fiducial registration error (FRE) as the outcome measure for registration accuracy. A custom-made reference array with QR codes was integrated in the neurosurgical setup and used for patient tracking after bed movement. RESULTS Six registrations were performed with a mean FRE of 8.5 mm. Patient tracking was achieved with no visual difference between the registration before and after movement. CONCLUSIONS This first report shows a proof of principle of intraoperative patient tracking using a standalone holographic neuronavigation system. The navigation accuracy should be further optimized to be clinically applicable. However, it is likely that this technology will be incorporated in future neurosurgical workflows because the system improves spatial anatomical understanding for the surgeon.
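The point-based (paired-point) registration and fiducial registration error (FRE) above can be sketched with the standard SVD (Kabsch/Horn) solution; defining FRE as the RMS residual of the fitted landmarks is one common convention, and the landmark coordinates below are synthetic:

```python
# Rigid paired-point registration via SVD and the resulting FRE.
import numpy as np

def rigid_register(src, dst):
    """Least-squares R, t such that dst ≈ R @ src + t (Kabsch/Horn)."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, dst.mean(0) - R @ src.mean(0)

rng = np.random.default_rng(1)
image_pts = rng.uniform(0, 100, (5, 3))          # anatomical landmarks, image space (mm)
theta = np.deg2rad(20.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
patient_pts = image_pts @ R_true.T + np.array([5.0, -3.0, 10.0])
patient_pts += rng.normal(0, 1.5, patient_pts.shape)  # simulated pointing noise

R, t = rigid_register(image_pts, patient_pts)
residuals = np.linalg.norm(image_pts @ R.T + t - patient_pts, axis=1)
print(f"FRE = {np.sqrt((residuals ** 2).mean()):.1f} mm")
```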
Collapse
Affiliation(s)
- T Fick
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584, CS, Utrecht, The Netherlands.
| | - J A M van Doormaal
- Department of Oral and Maxillofacial surgery, University Medical Centre Utrecht, Heidelberglaan 100, 3584, CX, Utrecht, The Netherlands
| | - E W Hoving
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Heidelberglaan 25, 3584, CS, Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Centre Utrecht, Heidelberglaan 100, 3584, CX, Utrecht, The Netherlands
| | - L Regli
- Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zürich, Switzerland
| | - T P C van Doormaal
- Department of Neurosurgery, University Medical Centre Utrecht, Heidelberglaan 100, 3584, CX, Utrecht, The Netherlands
- Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zürich, Switzerland
| |
Collapse
|
35
|
Cofano F, Di Perna G, Bozzaro M, Longo A, Marengo N, Zenga F, Zullo N, Cavalieri M, Damiani L, Boges DJ, Agus M, Garbossa D, Calì C. Augmented Reality in Medical Practice: From Spine Surgery to Remote Assistance. Front Surg 2021; 8:657901. [PMID: 33859995 PMCID: PMC8042331 DOI: 10.3389/fsurg.2021.657901] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/24/2021] [Accepted: 03/08/2021] [Indexed: 11/19/2022] Open
Abstract
Background: While performing surgeries in the OR, surgeons and assistants often need to access a range of information regarding surgical planning and/or procedures related to the surgery itself, or the accessory equipment needed to perform certain operations. The accessibility of this information often relies on the physical presence of technical and medical specialists in the OR, which has become increasingly difficult due to the limitations imposed by the COVID emergency to avoid overcrowded environments or external personnel. Here, we analyze several scenarios in which we equipped OR personnel with augmented reality (AR) glasses, allowing a remote specialist to guide OR operations through voice and ad-hoc visuals superimposed on the field of view of the operator wearing them. Methods: This study is a preliminary case series of prospectively collected data on the use of AR assistance in spine surgery from January to July 2020. The technology was used on a cohort of 12 patients affected by degenerative lumbar spine disease with lumbar sciatica co-morbidities. Surgeons and OR specialists were equipped with AR devices, customized with P2P videoconference commercial apps or customized holographic apps. The devices were tested during surgeries for lumbar arthrodesis in a multicenter experience involving the authors' institutions. Findings: A total of 12 lumbar arthrodeses were performed while using the described AR technology, with applications spanning telementoring (3), teaching (2), and surgical planning superimposition and interaction with the hologram using a custom application for Microsoft HoloLens (1). Surgeons wearing the AR goggles reported positive feedback regarding ergonomics, wearability and comfort during the procedure; being able to visualize a 3D reconstruction during surgery was perceived as a straightforward benefit, allowing procedures to be sped up and thus limiting postoperative complications. The possibility of remotely interacting with a specialist through the glasses was a potent added value during the COVID emergency, given the limited access of non-resident personnel to the OR. Interpretation: By allowing surgeons to overlay digital medical content on the actual surroundings, augmented reality surgery can be exploited easily in multiple scenarios by adapting commercially available or custom-made apps to several use cases. The possibility of observing the operating theater directly through the eyes of the surgeon could be a game-changer, giving inexperienced surgeons the chance to be virtually at the site of the operation, or allowing a remote experienced operator to wisely guide an inexperienced surgeon during a procedure.
Collapse
Affiliation(s)
- Fabio Cofano
- Neurosurgery Unit, Department of Neuroscience "Rita Levi Montalcini," University of Torino, Turin, Italy.,Spine Surgery Unit, Humanitas Gradenigo, Turin, Italy
| | - Giuseppe Di Perna
- Neurosurgery Unit, Department of Neuroscience "Rita Levi Montalcini," University of Torino, Turin, Italy
| | - Marco Bozzaro
- Spine Surgery Unit, Humanitas Gradenigo, Turin, Italy
| | | | - Nicola Marengo
- Neurosurgery Unit, Department of Neuroscience "Rita Levi Montalcini," University of Torino, Turin, Italy
| | - Francesco Zenga
- Neurosurgery Unit, Department of Neuroscience "Rita Levi Montalcini," University of Torino, Turin, Italy
| | - Nicola Zullo
- Spine Surgery Unit, Casa di Cura Città di Bra, Bra, Italy
| | | | - Luca Damiani
- Intravides SRL, Palazzo degli Istituti Anatomici, Turin, Italy.,LD Consulting, Chiavari, Italy
| | - Daniya J Boges
- Intravides SRL, Palazzo degli Istituti Anatomici, Turin, Italy.,BESE Division, King Abdullah University of Science and Technology, Thuwal, Saudi Arabia
| | - Marco Agus
- College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
| | - Diego Garbossa
- Neurosurgery Unit, Department of Neuroscience "Rita Levi Montalcini," University of Torino, Turin, Italy
| | - Corrado Calì
- Neuroscience Institute Cavalieri Ottolenghi, Orbassano, Italy.,Department of Neuroscience "Rita Levi Montalcini," University of Torino, Turin, Italy
| |
Collapse
|
36
|
Gsaxner C, Pepe A, Li J, Ibrahimpasic U, Wallner J, Schmalstieg D, Egger J. Augmented Reality for Head and Neck Carcinoma Imaging: Description and Feasibility of an Instant Calibration, Markerless Approach. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE 2021; 200:105854. [PMID: 33261944 DOI: 10.1016/j.cmpb.2020.105854] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/10/2020] [Accepted: 11/16/2020] [Indexed: 06/12/2023]
Abstract
BACKGROUND AND OBJECTIVE Augmented reality (AR) can help to overcome current limitations in computer assisted head and neck surgery by granting "X-ray vision" to physicians. Still, the acceptance of AR in clinical applications is limited by technical and clinical challenges. We aim to demonstrate the benefit of a marker-free, instant calibration AR system for head and neck cancer imaging, which we hypothesize to be acceptable and practical for clinical use. METHODS We implemented a novel AR system for visualization of medical image data registered with the head or face of the patient prior to intervention. Our system allows the localization of head and neck carcinoma in relation to the outer anatomy. Our system does not require markers or stationary infrastructure, provides instant calibration and allows 2D and 3D multi-modal visualization for head and neck surgery planning via an AR head-mounted display. We evaluated our system in a pre-clinical user study with eleven medical experts. RESULTS Medical experts rated our application with a system usability scale score of 74.8 ± 15.9, which signifies above-average ("good") usability and clinical acceptance. An average of 12.7 ± 6.6 minutes of training time was needed by physicians before they were able to navigate the application without assistance. CONCLUSIONS Our AR system is characterized by a slim and easy setup, short training time and high usability and acceptance. Therefore, it presents a promising, novel tool for visualizing head and neck cancer imaging and pre-surgical localization of target structures.
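The 74.8 ± 15.9 figure above is a System Usability Scale (SUS) score, which has a fixed scoring rule: odd items contribute (score − 1), even items (5 − score), and the sum is scaled by 2.5 to a 0-100 range. The responses below are hypothetical:

```python
# Standard SUS scoring over ten 5-point Likert items.
import numpy as np

def sus_score(responses):
    """responses: 10 Likert answers (1-5), item 1 first."""
    r = np.asarray(responses)
    odd = r[0::2] - 1     # items 1, 3, 5, 7, 9
    even = 5 - r[1::2]    # items 2, 4, 6, 8, 10
    return 2.5 * (odd.sum() + even.sum())

raters = [[4, 2, 4, 1, 5, 2, 4, 2, 4, 2],   # hypothetical expert answers
          [5, 1, 4, 2, 4, 2, 5, 1, 4, 3]]
scores = [sus_score(r) for r in raters]
print(f"SUS: {np.mean(scores):.1f} ± {np.std(scores, ddof=1):.1f}")
```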
Collapse
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria.
| | - Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
| | - Jianning Li
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
| | - Una Ibrahimpasic
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria
| | - Jürgen Wallner
- Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria; Department of Cranio-Maxillofacial Surgery, AZ Monica Hospital Antwerp and Antwerp University Hospital, Antwerp, Belgium.
| | - Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria
| | - Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, Inffeldgasse 16, 8010 Graz, Austria; Department of Oral and Maxillofacial Surgery, Medical University of Graz, Auenbruggerplatz 5, 8036 Graz, Austria; Computer Algorithms for Medicine Laboratory, Graz, Austria; BioTechMed-Graz, Mozartgasse 12/II, 8010 Graz, Austria.
| |
Collapse
|
37
|
Teatini A, Kumar RP, Elle OJ, Wiig O. Mixed reality as a novel tool for diagnostic and surgical navigation in orthopaedics. Int J Comput Assist Radiol Surg 2021; 16:407-414. [PMID: 33555563 PMCID: PMC7946663 DOI: 10.1007/s11548-020-02302-z] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2020] [Accepted: 12/14/2020] [Indexed: 12/15/2022]
Abstract
Purpose This study presents a novel surgical navigation tool developed in a mixed reality environment for orthopaedic surgery. Joint and skeletal deformities affect all age groups and greatly reduce the range of motion of the joints. These deformities are notoriously difficult to diagnose and to correct through surgery. Method We have developed a surgical tool which integrates surgical instrument tracking and augmented reality through a head mounted display. This allows the surgeon to visualise bones with the illusion of possessing "X-ray" vision. The studies presented below aim to assess the accuracy of the surgical navigation tool in tracking a location at the tip of the surgical instrument in holographic space. Results Results show that the average accuracy provided by the navigation tool is around 8 mm, and qualitative assessment by the orthopaedic surgeons provided positive feedback in terms of the capabilities for diagnostic use. Conclusions More improvements are necessary for the navigation tool to be accurate enough for surgical applications; however, this new tool has the potential to improve diagnostic accuracy and allow for safer and more precise surgeries, as well as provide better learning conditions for orthopaedic surgeons in training.
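Locating the instrument tip in holographic space typically means applying the tracked marker pose to a fixed tip offset expressed in the marker frame; how the cited tool does this internally is not stated, so the formulation and all numbers below are generic assumptions:

```python
# Map a calibrated tip offset from the tracked marker frame into world
# coordinates, and measure the tip localisation error against a reference.
import numpy as np

def tip_in_world(R, t, tip_offset_marker):
    """World position of the tip given marker rotation R and translation t."""
    return R @ tip_offset_marker + t

theta = np.deg2rad(30.0)                       # hypothetical marker pose
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([100.0, 50.0, 200.0])             # mm, marker origin in world
tip_offset = np.array([0.0, 0.0, -150.0])      # mm, tip along the tool shaft

tip = tip_in_world(R, t, tip_offset)
ground_truth = np.array([98.0, 47.0, 55.0])    # hypothetical reference point
print(f"tip localisation error: {np.linalg.norm(tip - ground_truth):.1f} mm")
```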
Collapse
Affiliation(s)
- Andrea Teatini
- The Intervention Centre, Oslo University Hospital, Oslo, Norway.
- Department of Informatics, University of Oslo, Oslo, Norway.
| | - Rahul P Kumar
- The Intervention Centre, Oslo University Hospital, Oslo, Norway
| | - Ole Jakob Elle
- The Intervention Centre, Oslo University Hospital, Oslo, Norway
- Department of Informatics, University of Oslo, Oslo, Norway
| | - Ola Wiig
- Department of Orthopaedic Surgery, Oslo University Hospital, Oslo, Norway
| |
Collapse
|
38
|
Fotouhi J, Mehrfard A, Song T, Johnson A, Osgood G, Unberath M, Armand M, Navab N. Development and Pre-Clinical Analysis of Spatiotemporal-Aware Augmented Reality in Orthopedic Interventions. IEEE TRANSACTIONS ON MEDICAL IMAGING 2021; 40:765-778. [PMID: 33166252 PMCID: PMC8317976 DOI: 10.1109/tmi.2020.3037013] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
Suboptimal interaction with patient data and challenges in mastering 3D anatomy based on ill-posed 2D interventional images are essential concerns in image-guided therapies. Augmented reality (AR) has been introduced in operating rooms in the last decade; however, in image-guided interventions, it has often only been considered as a visualization device improving traditional workflows. As a consequence, the technology has yet to gain the maturity it requires to redefine procedures, user interfaces, and interactions. The main contribution of this paper is to reveal how exemplary workflows are redefined by taking full advantage of head-mounted displays when entirely co-registered with the imaging system at all times. The awareness of the system of the geometric and physical characteristics of X-ray imaging allows the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for placing a K-wire in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared the results with the outcomes from baseline standard operative and non-immersive AR procedures, which had yielded errors of [4.61mm, 4.76°, 4.77°] and [5.13mm, 1.78°, 1.43°], respectively, for wire placement, and abduction and anteversion during THA. We hope that our holistic approach towards improving the interface of surgery not only augments the surgeon's capabilities but also augments the surgical team's experience in carrying out an effective intervention with reduced complications, and provides novel approaches for documenting procedures for training purposes.
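The abduction and anteversion errors above can be illustrated by comparing planned and achieved cup axes; the radiographic angle convention used here (x lateral, y anterior, z superior; anteversion as the angle to the coronal plane) is an assumption, not the paper's stated method, and the axes are invented:

```python
# Abduction (inclination) and anteversion errors from planned vs. achieved
# acetabular cup axes, under an assumed radiographic convention.
import numpy as np

def cup_angles(axis):
    a = axis / np.linalg.norm(axis)
    anteversion = np.degrees(np.arcsin(a[1]))       # angle to the coronal plane
    abduction = np.degrees(np.arctan2(a[0], a[2]))  # inclination within it
    return abduction, anteversion

planned = np.array([0.64, 0.34, 0.69])    # hypothetical planned cup axis
achieved = np.array([0.66, 0.31, 0.68])   # hypothetical achieved cup axis

for name, p_ang, a_ang in zip(("abduction", "anteversion"),
                              cup_angles(planned), cup_angles(achieved)):
    print(f"{name} error: {abs(p_ang - a_ang):.2f} deg")
```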
Collapse
|
39
|
Velazco-Garcia JD, Shah DJ, Leiss EL, Tsekos NV. A modular and scalable computational framework for interactive immersion into imaging data with a holographic augmented reality interface. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE 2021; 198:105779. [PMID: 33045556 DOI: 10.1016/j.cmpb.2020.105779] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/29/2020] [Accepted: 09/26/2020] [Indexed: 06/11/2023]
Abstract
BACKGROUND AND OBJECTIVE Modern imaging scanners produce an ever-growing body of 3D/4D multimodal data requiring image analytics and visualization of fused images, segmentations, and information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has shown potential. This work describes a framework (FI3D) for interactive immersion with data, integration of image processing and analytics, and rendering and fusion with an AR interface. METHODS The FI3D was designed and endowed with modules to communicate with peripherals, including imaging scanners and HMDs, and to provide computational power for data acquisition and processing. The core of FI3D is deployed to a dedicated computational unit that performs the computationally demanding processes in real-time, and the HMD is used as a display output peripheral and an input peripheral through gestures and voice commands. FI3D offers user-made processing and analysis dedicated modules. Users can customize and optimize these for a particular workflow while incorporating current or future libraries. RESULTS The FI3D framework was used to develop a workflow for processing, rendering, and visualization of CINE MRI cardiac sets. In this version, the data were loaded from a remote database, and the endocardium and epicardium of the left ventricle (LV) were segmented using a machine learning model and transmitted to a HoloLens HMD to be visualized in 4D. Performance results show that the system is capable of maintaining an image stream of one image per second with a resolution of 512 × 512. Also, it can modify visual properties of the holograms at 1 update per 16 milliseconds (62.5 Hz) while providing enough resources for the segmentation and surface reconstruction tasks without hindering the HMD. CONCLUSIONS We provide a system design and framework to be used as a foundation for medical applications that benefit from AR visualization, removing several technical challenges from the developmental pipeline.
Collapse
Affiliation(s)
- Jose D Velazco-Garcia
- MRI Lab, Dept. of CS, University of Houston, 4800 Calhoun Road PGH 501, Houston, TX, USA.
| | - Dipan J Shah
- Cardiovascular MRI Lab, Houston Methodist DeBakey Heart and Vascular Center, 6550 Fannin St., Smith Tower - Suite 1801, Houston, USA.
| | - Ernst L Leiss
- MRI Lab, Dept. of CS, University of Houston, 4800 Calhoun Road PGH 501, Houston, TX, USA.
| | - Nikolaos V Tsekos
- MRI Lab, Dept. of CS, University of Houston, 4800 Calhoun Road PGH 501, Houston, TX, USA.
| |
Collapse
|
40
|
Intraoperative Feedback and Quality Control in Orbital Reconstruction: The Past, the Present, and the Future. Atlas Oral Maxillofac Surg Clin North Am 2020; 29:97-108. [PMID: 33516542 DOI: 10.1016/j.cxom.2020.11.006] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022]
|
41
|
Andrews CM, Henry AB, Soriano IM, Southworth MK, Silva JR. Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices. IEEE JOURNAL OF TRANSLATIONAL ENGINEERING IN HEALTH AND MEDICINE 2020; 9:4900214. [PMID: 33489483 PMCID: PMC7819530 DOI: 10.1109/jtehm.2020.3045642] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/28/2020] [Revised: 11/13/2020] [Accepted: 12/03/2020] [Indexed: 12/15/2022]
Abstract
Many clinical procedures would benefit from direct and intuitive real-time visualization of anatomy, surgical plans, or other information crucial to the procedure. Three-dimensional augmented reality (3D-AR) is an emerging technology that has the potential to assist physicians with spatial reasoning during clinical interventions. The most intriguing applications of 3D-AR involve visualizations of anatomy or surgical plans that appear directly on the patient. However, commercially available 3D-AR devices have spatial localization errors that are too large for many clinical procedures. For this reason, a variety of approaches for improving 3D-AR registration accuracy have been explored. The focus of this review is on the methods, accuracy, and clinical applications of registering 3D-AR devices with the clinical environment. The works cited represent a variety of approaches for registering holograms to patients, including manual registration, computer vision-based registration, and registrations that incorporate external tracking systems. Evaluations of user accuracy when performing clinically relevant tasks suggest that accuracies of approximately 2 mm are feasible. 3D-AR device limitations due to the vergence-accommodation conflict or other factors attributable to the headset hardware add on the order of 1.5 mm of error compared to conventional guidance. Continued improvements to 3D-AR hardware will decrease these sources of error.
Collapse
Affiliation(s)
- Christopher M. Andrews
- Department of Biomedical Engineering, Washington University in St. Louis, McKelvey School of Engineering, St. Louis, MO 63130, USA
- SentiAR, Inc., St. Louis, MO 63108, USA
| | | | | | | | - Jonathan R. Silva
- Department of Biomedical Engineering, Washington University in St. Louis, McKelvey School of Engineering, St. Louis, MO 63130, USA
| |
Collapse
|
42
|
Fick T, van Doormaal JAM, Hoving EW, Willems PWA, van Doormaal TPC. Current Accuracy of Augmented Reality Neuronavigation Systems: Systematic Review and Meta-Analysis. World Neurosurg 2020; 146:179-188. [PMID: 33197631 DOI: 10.1016/j.wneu.2020.11.029] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2020] [Revised: 11/04/2020] [Accepted: 11/05/2020] [Indexed: 12/17/2022]
Abstract
BACKGROUND Augmented reality neuronavigation (ARN) systems can overlay three-dimensional anatomy and disease without the need for a two-dimensional external monitor. Accuracy is crucial for their clinical applicability. We performed a systematic review regarding the reported accuracy of ARN systems and compared them with the accuracy of conventional infrared neuronavigation (CIN). METHODS PubMed and Embase were searched for ARN and CIN systems. For ARN, the type of system, method of patient-to-image registration, accuracy method, and accuracy of the system were noted. For CIN, navigation accuracy, expressed as target registration error (TRE), was noted. A meta-analysis was performed comparing the TRE of ARN and CIN systems. RESULTS Thirty-five studies were included, 12 for ARN and 23 for CIN. ARN systems could be divided into head-mounted display and heads-up display systems. In ARN, 4 methods were encountered for patient-to-image registration, of which point-pair matching was the one most frequently used. Five methods for assessing accuracy were described. Ninety-four TRE measurements of ARN systems were compared with 9058 TRE measurements of CIN systems. Mean TRE was 2.5 mm (95% confidence interval, 0.7-4.4) for ARN systems and 2.6 mm (95% confidence interval, 2.1-3.1) for CIN systems. CONCLUSIONS In ARN, there seems to be a lack of agreement regarding the best method to assess accuracy. Nevertheless, ARN systems seem able to achieve an accuracy comparable to CIN systems. Future studies should be prospective and compare TREs, which should be measured in a standardized fashion.
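In the spirit of the meta-analysis above, a pooled mean TRE with a 95% confidence interval can be sketched with fixed-effect inverse-variance weighting; the pooling model and the study-level summaries below are assumptions, not the authors' data:

```python
# Fixed-effect inverse-variance pooling of study-level mean TREs.
import numpy as np

studies = [(2.2, 1.0, 15), (3.1, 1.4, 20), (2.6, 0.9, 30)]  # (mean mm, SD, n), hypothetical

means = np.array([s[0] for s in studies])
var_of_mean = np.array([s[1] ** 2 / s[2] for s in studies])
weights = 1.0 / var_of_mean

pooled = np.sum(weights * means) / weights.sum()
se = np.sqrt(1.0 / weights.sum())
print(f"pooled TRE = {pooled:.1f} mm "
      f"(95% CI {pooled - 1.96 * se:.1f} to {pooled + 1.96 * se:.1f})")
```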
Collapse
Affiliation(s)
- Tim Fick
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands
- Jesse A M van Doormaal
- Department of Oral and Maxillofacial Surgery, University Medical Centre Utrecht, Utrecht, The Netherlands
- Eelco W Hoving
- Department of Neuro-oncology, Princess Máxima Center for Pediatric Oncology, Utrecht, The Netherlands
- Peter W A Willems
- Department of Neurosurgery, University Medical Centre Utrecht, Utrecht, The Netherlands
- Tristan P C van Doormaal
- Department of Neurosurgery, University Medical Centre Utrecht, Utrecht, The Netherlands; Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zurich, University of Zurich, Switzerland
43
van Doormaal TPC, van Doormaal JAM, Mensink T. Clinical Accuracy of Holographic Navigation Using Point-Based Registration on Augmented-Reality Glasses. Oper Neurosurg (Hagerstown) 2020; 17:588-593. [PMID: 31081883 PMCID: PMC6995446 DOI: 10.1093/ons/opz094] [Citation(s) in RCA: 42] [Impact Index Per Article: 10.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/23/2018] [Accepted: 12/25/2018] [Indexed: 01/05/2023] Open
Abstract
BACKGROUND As current augmented-reality (AR) smart glasses are self-contained, powerful computers that project 3-dimensional holograms able to maintain their position in physical space, they could theoretically be used as a low-cost, stand-alone neuronavigation system. OBJECTIVE To determine the feasibility and accuracy of holographic neuronavigation (HN) using AR smart glasses. METHODS We programmed a fully functioning neuronavigation system on commercially available smart glasses (HoloLens®, Microsoft, Redmond, Washington) and tested its accuracy and feasibility in the operating room. The fiducial registration error (FRE) was measured for both HN and conventional neuronavigation (CN) (Brainlab, Munich, Germany) using point-based registration on a plastic head model. Subsequently, we measured HN and CN FRE in 3 patients. RESULTS A stereoscopic view of the holograms was successfully achieved in all experiments. In the plastic head measurements, the mean HN FRE was 7.2 ± 1.8 mm compared to a mean CN FRE of 1.9 ± 0.45 mm (mean difference: –5.3 mm; 95% confidence interval [CI]: –6.7 to –3.9). In the 3 patients, the mean HN FRE was 4.4 ± 2.5 mm compared to a mean CN FRE of 3.6 ± 0.5 mm (mean difference: –0.8 mm; 95% CI: –3.0 to 4.6). CONCLUSION Owing to the potential benefits and promising results, we believe that HN could eventually find application in operating rooms. However, several improvements will have to be made before the device can be used in clinical practice.
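FRE, as reported above, is measured at the fiducials themselves after the registration transform has been fitted, unlike TRE, which uses separate targets. A minimal sketch, reusing the R, t convention from the registration example earlier in this list; input names are assumptions:

```python
# Sketch: FRE as the per-fiducial residuals of an already-fitted rigid
# transform (R, t). Inputs are assumed N x 3 arrays in millimetres.
import numpy as np

def fre_mm(R, t, moving_fiducials, fixed_fiducials):
    mapped = moving_fiducials @ R.T + t      # apply fitted transform
    return np.linalg.norm(mapped - fixed_fiducials, axis=1)

# Reporting mean +/- SD in the study's style (illustrative only):
# residuals = fre_mm(R, t, hologram_points, head_model_points)
# print(f"FRE = {residuals.mean():.1f} +/- {residuals.std(ddof=1):.1f} mm")
```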
Affiliation(s)
- Tristan P C van Doormaal
- Rudolf Magnus Institute of Neuroscience, Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands; Brain Technology Institute, Utrecht, The Netherlands
- Jesse A M van Doormaal
- Rudolf Magnus Institute of Neuroscience, Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands; Brain Technology Institute, Utrecht, The Netherlands
- Tom Mensink
- Brain Technology Institute, Utrecht, The Netherlands
44

45
Kraeima J, Glas HH, Merema BBJ, Vissink A, Spijkervet FKL, Witjes MJH. Three-dimensional virtual surgical planning in the oncologic treatment of the mandible. Oral Dis 2020; 27:14-20. [PMID: 32881177 DOI: 10.1111/odi.13631] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/30/2019] [Revised: 07/30/2020] [Accepted: 08/22/2020] [Indexed: 02/04/2023]
Abstract
OBJECTIVES In cases of surgical removal of oral squamous cell carcinomas, resection of mandibular bone is frequently part of the treatment. Such resections now frequently include the application of 3D virtual surgical planning (VSP) and guided surgery techniques. In this paper, current methods for 3D VSP, leads for optimisation of the workflow, and patient-specific applications of guides and implants are reviewed. RECENT FINDINGS Current methods for 3D VSP enable multi-modality fusion of images; this fusion is not restricted to a specific software package or workflow. New strategies for 3D VSP in oral and maxillofacial surgery include finite element analysis, deep learning, and advanced augmented reality techniques. These strategies aim to improve treatment in terms of accuracy, predictability, and safety. CONCLUSIONS Application of the discussed novel technologies and strategies will improve the accuracy and safety of mandibular resection and reconstruction planning. Accurate, easy-to-use, safe, and efficient three-dimensional VSP can be applied for every patient with a malignancy requiring resection of the mandible.
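As one hedged illustration of the multi-modality image fusion such VSP workflows depend on, the sketch below rigidly aligns a CT and an MRI volume with SimpleITK's Mattes mutual information metric. The file names and parameter choices are placeholders, not the authors' pipeline; real workflows add masking, multi-resolution schedules, and careful initialisation.

```python
# Sketch: rigid CT-to-MRI fusion via mutual information (SimpleITK).
import SimpleITK as sitk

fixed = sitk.ReadImage("ct.nii.gz", sitk.sitkFloat32)    # placeholder paths
moving = sitk.ReadImage("mri.nii.gz", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetInitialTransform(sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY))

transform = reg.Execute(fixed, moving)       # fitted rigid alignment
fused = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
```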
Affiliation(s)
- Joep Kraeima
- Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Haye H Glas
- Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Bram Barteld Jan Merema
- Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Arjan Vissink
- Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Fred K L Spijkervet
- Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
- Max J H Witjes
- Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
46
Value of the surgeon's sightline on hologram registration and targeting in mixed reality. Int J Comput Assist Radiol Surg 2020; 15:2027-2039. [PMID: 32984934 PMCID: PMC7671978 DOI: 10.1007/s11548-020-02263-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2020] [Accepted: 09/14/2020] [Indexed: 12/12/2022]
Abstract
Purpose Mixed reality (MR) is being evaluated as a visual tool for surgical navigation. The current literature presents unclear results on intraoperative accuracy using the Microsoft HoloLens 1®. This study aims to assess the impact of the surgeon's sightline in an inside-out, marker-based MR navigation system for open surgery. Methods Surgeons at Akershus University Hospital tested this system. A custom-made phantom was used, containing 18 wire target crosses within its inner walls. A CT scan was obtained in order to segment all wire targets into a single 3D model (hologram). An in-house software application (CTrue), developed for the Microsoft HoloLens 1, uploaded 3D models and automatically registered the 3D model with the phantom. Based on the surgeon's sightline while registering and targeting (free sightline [F] or strictly perpendicular sightline [P]), 4 scenarios were developed (FF, PF, FP, PP). Target error distance (TED) was obtained in three different working axes (X, Y, Z).
Results Six surgeons (5 males, age 29–62) were enrolled. A total of 864 measurements were collected across the 4 scenarios, each tested twice. Scenario PP showed the smallest TED in the X, Y, and Z axes (mean = 2.98 ± 1.33 mm, 2.28 ± 1.45 mm, and 2.78 ± 1.91 mm, respectively). Scenario FF showed the largest TED in the X, Y, and Z axes (mean = 10.03 ± 3.19 mm, 6.36 ± 3.36 mm, and 16.11 ± 8.91 mm, respectively). Multiple comparison tests, grouped by scenario and axis, showed that the majority of scenario comparisons had significantly different TED values (p < 0.05). The Y axis always presented the smallest TED regardless of the scenario tested. Conclusion A strictly perpendicular working sightline in relation to the 3D model achieves the best accuracy. Shortcomings in this technology as an intraoperative visual cue can be overcome by sightline correction. Incidentally, this is the preferred working angle for open surgery.
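The TED metric used here reduces to absolute per-axis components plus a Euclidean distance between the hologram target and the physical wire cross. A minimal sketch with invented coordinates, not the study's data:

```python
# Illustrative per-axis and Euclidean target error distance (TED).
import numpy as np

hologram_xyz = np.array([12.1, 40.3, 7.9])   # invented positions (mm)
phantom_xyz = np.array([10.0, 39.1, 5.5])

per_axis_ted = np.abs(hologram_xyz - phantom_xyz)          # X, Y, Z errors
euclidean_ted = float(np.linalg.norm(hologram_xyz - phantom_xyz))
print(per_axis_ted, f"{euclidean_ted:.2f} mm")             # ~3.41 mm here
```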
47
Gibby J, Cvetko S, Javan R, Parr R, Gibby W. Use of augmented reality for image-guided spine procedures. Eur Spine J 2020; 29:1823-1832. [DOI: 10.1007/s00586-020-06495-4] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/22/2019] [Revised: 04/07/2020] [Accepted: 05/31/2020] [Indexed: 12/14/2022]
48
Early Feasibility Studies of Augmented Reality Navigation for Lateral Skull Base Surgery. Otol Neurotol 2020; 41:883-888. [DOI: 10.1097/mao.0000000000002724] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
49
Li G, Dong J, Wang J, Cao D, Zhang X, Cao Z, Lu G. The clinical application value of mixed-reality-assisted surgical navigation for laparoscopic nephrectomy. Cancer Med 2020; 9:5480-5489. [PMID: 32543025 PMCID: PMC7402835 DOI: 10.1002/cam4.3189] [Citation(s) in RCA: 16] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2020] [Revised: 05/06/2020] [Accepted: 05/07/2020] [Indexed: 12/22/2022] Open
Abstract
Purpose Laparoscopic nephrectomy (LN) has become the preferred surgical treatment for renal cell carcinoma (RCC). Adequate preoperative assessment and intraoperative navigation are key to the successful implementation of LN. The aim of this study was to evaluate the clinical application value of mixed-reality-assisted surgical navigation (MRASN) in LN. Patients and Methods A total of 100 patients with stage T1N0M0 renal tumors who underwent laparoscopic partial nephrectomy (LPN) or laparoscopic radical nephrectomy (LRN) were prospectively enrolled and divided into a mixed-reality-assisted laparoscopic nephrectomy (MRALN) group (n = 50) and a non-mixed-reality-assisted laparoscopic nephrectomy (non-MRALN) group (n = 50). All patients underwent renal contrast-enhanced CT scans. The CT DICOM data of all patients in the MRALN group were imported into a mixed-reality (MR) postprocessing workstation, underwent holographic three-dimensional visualization (V3D) modeling, and were displayed in MR. We adopted a Likert scale to evaluate the clinical application value of MRASN. The consistency of evaluators was assessed using the Cohen kappa coefficient (κ). Results There were no significant differences in patient demographic indicators between the MRALN group and the non-MRALN group (P > .05). The subjective scores of MRASN clinical application value in operative plan formulation, intraoperative navigation, remote consultation, teaching guidance, and doctor-patient communication were higher in the MRALN group than in the non-MRALN group (all P < .001). There were significantly more patients for whom LPN was successfully implemented in the MRALN group than in the non-MRALN group (82% vs 46%, P < .001). The MRALN group had a shorter operative time (OT) and warm ischemia time (WIT) and less estimated blood loss (EBL) than the non-MRALN group (all P < .001). Conclusion MRASN is helpful for operative plan formulation, intraoperative navigation, remote consultation, teaching guidance, and doctor-patient communication. MRALN may effectively improve the rate of successful LPN and reduce OT, WIT, and EBL.
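The Cohen kappa used above to check rater consistency is simple to compute directly. A self-contained sketch with invented Likert ratings, not the study's data:

```python
# Cohen's kappa: kappa = (p_o - p_e) / (1 - p_e), where p_o is observed
# agreement and p_e is agreement expected by chance.
import numpy as np

def cohen_kappa(r1, r2):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    labels = np.union1d(r1, r2)
    p_o = np.mean(r1 == r2)                          # observed agreement
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c)    # chance agreement
              for c in labels)
    return (p_o - p_e) / (1 - p_e)

print(cohen_kappa([5, 4, 4, 3, 5], [5, 4, 3, 3, 5]))  # ~0.71
```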
Affiliation(s)
- Guan Li
- Department of Radiology, Jinling Hospital, Nanjing Medical University, Nanjing, China
- Jie Dong
- Department of Urology, Jinling Hospital, Nanjing Medical University, Nanjing, China
- Jinbao Wang
- Department of Radiology, General Hospital of Northern Theater Command, Shenyang, China
- Dongbing Cao
- Department of Urology, Cancer Hospital of China Medical University, Shenyang, China
- Xin Zhang
- Department of Radiology, The First Affiliated Hospital of China Medical University, Shenyang, China
- Zhiqiang Cao
- Department of Urology, General Hospital of Northern Theater Command, Shenyang, China
- Guangming Lu
- Department of Radiology, Jinling Hospital, Nanjing Medical University, Nanjing, China
50
Salmas M, Chronopoulos E, Chytas D. Comment on: "A Novel Evaluation Model for a Mixed-Reality Surgical Navigation System: Where Microsoft HoloLens Meets the Operating Room". Surg Innov 2020; 27:702-703. [PMID: 32490724 DOI: 10.1177/1553350620927607] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Affiliation(s)
- Marios Salmas
- Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Greece
- Efstathios Chronopoulos
- 2nd Orthopaedic Department, School of Medicine, National and Kapodistrian University of Athens, "Konstantopoulio-Patission" Hospital, Greece
- Dimitrios Chytas
- 2nd Orthopaedic Department, School of Medicine, National and Kapodistrian University of Athens, "Konstantopoulio-Patission" Hospital, Greece; Department of Anatomy, School of Medicine, European University of Cyprus, Cyprus