1.
Kihara T, Keller A, Ogawa T, Armand M, Martin-Gomez A. Evaluating the feasibility of using augmented reality for tooth preparation. J Dent 2024; 148:105217. PMID: 38944264. DOI: 10.1016/j.jdent.2024.105217.
Abstract
OBJECTIVES Tooth preparation is complicated because it requires the preparation of an abutment while simultaneously predicting the ideal shape of the tooth. This study aimed to develop and evaluate a system using augmented reality (AR) head-mounted displays (HMDs) that provide dynamic navigation capabilities for tooth preparation. METHODS The proposed system utilizes optical see-through HMDs to overlay digital information onto the real world and enrich the user's environment. By integrating tracking algorithms and three-dimensional modeling, the system provides real-time visualization and navigation capabilities during tooth preparation using two different visualization techniques. The experimental setup involved a comprehensive analysis of the distance to the surface and cross-sectional angles between the ideal and prepared teeth using three scenarios: traditional (without AR), overlay (AR-assisted visualization of the ideal prepared tooth), and cross-sectional (AR-assisted visualization with cross-sectional views and angular displays). RESULTS A user study (N = 24) revealed that the cross-sectional approach was more effective for angle adjustment and reduced the occurrence of over-reduction. Additional questionnaires revealed that the AR-assisted approaches were perceived as less difficult, with the cross-sectional approach excelling in terms of performance. CONCLUSIONS Visualization and navigation using cross-sectional approaches have the potential to support safer tooth preparation with less over-reduction than traditional and overlay approaches do. The angular displays provided by the cross-sectional approach are considered helpful for tooth preparation. CLINICAL SIGNIFICANCE The AR navigation system can assist dentists during tooth preparation and has the potential to enhance the accuracy and safety of prosthodontic treatment.
Affiliation(s)
- Takuya Kihara
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400N, Charles Street, Baltimore, MD 21218, USA; Department of Fixed Prosthodontics, School of Dental Medicine, Tsurumi University, 2-1-3 Tsurumi, Tsurumi-ku, Yokohama, Kanagawa, 734-8501, Japan.
- Andreas Keller
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400N, Charles Street, Baltimore, MD 21218, USA; Department of Computer Science, Technical University of Munich, Munich, Germany
- Takumi Ogawa
- Department of Fixed Prosthodontics, School of Dental Medicine, Tsurumi University, 2-1-3 Tsurumi, Tsurumi-ku, Yokohama, Kanagawa, 734-8501, Japan
- Mehran Armand
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400N, Charles Street, Baltimore, MD 21218, USA; Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA; Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA; Department of Orthopaedic Surgery, Johns Hopkins University, Baltimore, MD, USA
- Alejandro Martin-Gomez
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400N, Charles Street, Baltimore, MD 21218, USA; Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA; The Malone Center for Engineering in Healthcare, Johns Hopkins University, Baltimore, MD, USA
2.
Rieder M, Remschmidt B, Gsaxner C, Gaessler J, Payer M, Zemann W, Wallner J. Augmented Reality-Guided Extraction of Fully Impacted Lower Third Molars Based on Maxillofacial CBCT Scans. Bioengineering (Basel) 2024; 11:625. PMID: 38927861. PMCID: PMC11200966. DOI: 10.3390/bioengineering11060625.
Abstract
(1) Background: This study aimed to integrate an augmented reality (AR) image-guided surgery (IGS) system, based on preoperative cone beam computed tomography (CBCT) scans, into clinical practice. (2) Methods: In preclinical and clinical surgical setups, an AR-guided visualization system based on Microsoft's HoloLens 2 was assessed for complex lower third molar (LTM) extractions. In this study, the system's potential intraoperative feasibility and usability are described first. Preparation and operating times for each procedure were measured, as well as the system's usability, using the System Usability Scale (SUS). (3) Results: A total of six LTMs (n = 6) were analyzed, two extracted from human cadaver head specimens (n = 2) and four from clinical patients (n = 4). The average preparation time was 166 ± 44 s, while the operation time averaged 21 ± 5.9 min. The overall mean SUS score was 79.1 ± 9.3. When analyzed separately, the usability score categorized the AR-guidance system as "good" in clinical patients and "best imaginable" in human cadaver head procedures. (4) Conclusions: This translational study analyzed the first successful and functionally stable application of the HoloLens technology for complex LTM extraction in clinical patients. Further research is needed to refine the technology's integration into clinical practice to improve patient outcomes.
Affiliation(s)
- Marcus Rieder
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Bernhard Remschmidt
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria
- Jan Gaessler
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Michael Payer
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Wolfgang Zemann
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Juergen Wallner
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
3.
Wang X, Yang C, Liu Z, Zhang J, Xue C, Xing L, Zheng Y, Geng C, Yin X. R-MFE-TCN: A correlation prediction model between body surface and tumor during respiratory movement. Med Phys 2024. PMID: 38801342. DOI: 10.1002/mp.17183.
Abstract
BACKGROUND 2D CT image-guided radiofrequency ablation (RFA) is a minimally invasive treatment that can destroy liver tumors without removing them. However, CT images provide only limited static information, and the tumor moves with the patient's respiration. Accurately locating tumors under free-breathing conditions is therefore an urgent, unsolved problem. PURPOSE The purpose of this study is to propose a respiratory correlation prediction model for a mixed-reality surgical assistance system, the Riemannian and Multivariate Feature Enhanced Temporal Convolutional Network (R-MFE-TCN), and to achieve accurate respiratory correlation prediction. METHODS The model adopts a respiration-oriented Riemannian information enhancement strategy to expand the diversity of the dataset. A new Multivariate Feature Enhancement (MFE) module is proposed to retain respiratory information so that the network can fully exploit the correlation between internal and external data: a dual channel retains multivariate respiratory features, and multi-headed self-attention captures periodic peak-to-valley information from the respiratory signal. This information significantly improves the prediction performance of the network. Particle swarm optimization (PSO) is used for hyperparameter optimization. In the experiment, internal and external respiratory motion trajectories were obtained from seven patients, with the first six patients selected as the training set; the respiratory signal was sampled at 21 Hz. RESULTS Extensive experiments on the dataset demonstrate the good performance of this method, which improves prediction accuracy while remaining robust. The method reduces delay deviation under long-window prediction: at 400 ms, the average RMSE and MAE are 0.0453 and 0.0361 mm, respectively, better than competing methods. CONCLUSION The R-MFE-TCN can be extended to respiratory correlation prediction in different clinical situations, meeting the accuracy requirements for respiratory delay prediction in surgical assistance.
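The RMSE and MAE figures reported above are standard trajectory-error metrics. As a hedged illustration (not code from the paper), they can be computed over a predicted versus observed tumor-motion trace as follows; the function name and the conversion of the 400 ms horizon at the stated 21 Hz sampling rate are assumptions of this sketch:

```python
import numpy as np

def rmse_mae(predicted, actual):
    """Root-mean-square error and mean absolute error (mm) between a
    predicted and an observed motion trajectory, sample by sample."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    err = predicted - actual
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    return rmse, mae

# At a 21 Hz sampling rate, a 400 ms prediction horizon corresponds
# to roughly 0.4 s * 21 Hz = 8 samples ahead.
horizon = round(0.4 * 21)
```

Evaluating at the 400 ms horizon then amounts to comparing each predicted sample with the observation `horizon` steps later.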
Affiliation(s)
- Xuehu Wang
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Chang Yang
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Ziqi Liu
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Jushuo Zhang
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Chao Xue
- Senior Department of Orthopedics, the Fourth Medical Center of PLA General Hospital, Beijing, China
- Lihong Xing
- Affiliated Hospital of Hebei University, Baoding, China
- Yongchang Zheng
- Department of Liver Surgery, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College (CAMS & PUMC), Beijing, China
- Chen Geng
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- Xiaoping Yin
- Affiliated Hospital of Hebei University, Baoding, China
4.
Bochet Q, Raoul G, Lauwers L, Nicot R. Augmented reality in implantology: Virtual surgical checklist and augmented implant placement. J Stomatol Oral Maxillofac Surg 2024:101813. PMID: 38452901. DOI: 10.1016/j.jormas.2024.101813.
Abstract
OBJECTIVES The aim of the present study was to create a pedagogical checklist for the implant surgical protocol, combined with augmented reality (AR)-guided freehand surgery delivered to inexperienced surgeons through a head-mounted display (HMD) with tracking. METHODS The anatomical model of a patient with two missing mandibular teeth requiring conventional single-tooth implants was selected. The computed tomography (CT) scans were extracted and imported into segmentation and implant planning software. A patient-specific dental splint supported a 3D-printed QR code through an intermediate strut. A checklist was generated to guide the surgical procedure. After tracking, the AR-HMD projected the virtual pre-surgical plan (inferior alveolar nerve (IAN), implant axis, implant location) onto the real 3D-printed anatomical models. The entire drilling sequence followed the manufacturer's recommendations and was performed on 3D-printed anatomical models. After the implant surgical procedure, CT of the 3D-printed models was performed to compare the actual and simulated implant placements. All procedures in the study were performed in accordance with the Declaration of Helsinki. RESULTS In total, two implants were placed in a 3D-printed anatomical model of a female patient who required implant rehabilitation for dental agenesis at the second mandibular premolar positions (#35 and #45). Superimposition of the actual and simulated implants showed high concordance between them. CONCLUSION AR in education offers crucial surgical information to novice surgeons in real time. However, the benefits provided by AR in clinical and educational implantology must be demonstrated in further studies involving larger numbers of patients, surgeons and trainees.
Affiliation(s)
- Quentin Bochet
- Univ. Lille, CHU Lille, Department of Oral and Maxillofacial Surgery, Lille F-59000, France
- Gwénaël Raoul
- Univ. Lille, CHU Lille, INSERM, Department of Oral and Maxillo-Facial Surgery, U1008 - Advanced Drug Delivery Systems, Lille F-59000, France
- Ludovic Lauwers
- Univ. Lille, CHU Lille, Department of Oral and Maxillofacial Surgery, URL 2694 - METRICS, Lille F-59000, France
- Romain Nicot
- Univ. Lille, CHU Lille, INSERM, Department of Oral and Maxillo-Facial Surgery, U1008 - Advanced Drug Delivery Systems, Lille F-59000, France; CNRS, Centrale Lille, Univ. Lille, UMR 9013 - LaMcube - Laboratoire de Mécanique, Multiphysique, Multiéchelle, Lille F-59000, France.
5.
Hunt R, Scarpace L, Rock J. Integration of Augmented Reality Into Glioma Resection Surgery: A Case Report. Cureus 2024; 16:e53573. PMID: 38445166. PMCID: PMC10914376. DOI: 10.7759/cureus.53573.
Abstract
Augmented reality (AR) is an exciting technology that has garnered considerable attention in the field of neurosurgery. Despite this, clinical use of this technology is still in its infancy. An area of great potential for this technology is the ability to display 3D anatomy overlaid with the patient to assist with presurgical and intraoperative decision-making. A 39-year-old woman presented with headaches and was experiencing what was described as a whooshing sound. MRI revealed the presence of a large left frontal mass involving the genu of the corpus callosum, with heterogeneous enhancement and central hemorrhagic necrosis, confirmed to be a glioma. She underwent a craniotomy with intraoperative MRI for resection. An augmented reality system was used to superimpose 3D holographic anatomy onto the patient's head for surgical planning. This report highlights a new AR technology and its immediate application to cranial neurosurgery. It is critical to document new uses of this technology as the field continues to integrate AR as well as other next-generation technologies into practice.
Affiliation(s)
- Rachel Hunt
- Neurosurgery, Henry Ford Health System, Detroit, USA
- Lisa Scarpace
- Neurosurgery, Henry Ford Health System, Detroit, USA
- Jack Rock
- Neurosurgery, Henry Ford Health System, Detroit, USA
6.
Kantak PA, Bartlett S, Chaker A, Harmon S, Mansour T, Pawloski J, Telemi E, Yeo H, Winslow S, Cohen J, Scarpace L, Robin A, Rock JP. Augmented Reality Registration System for Visualization of Skull Landmarks. World Neurosurg 2024; 182:e369-e376. PMID: 38013107. DOI: 10.1016/j.wneu.2023.11.110.
Abstract
BACKGROUND Augmented reality (AR) is an emerging technology in neurosurgery with the potential to become a strategic tool in the delivery of care and education for trainees. Advances in technology have demonstrated promising use for improving visualization and spatial awareness of critical neuroanatomic structures. In this report, we employ a novel AR registration system for the visualization and targeting of skull landmarks. METHODS A markerless AR system was used to register 3-dimensional reconstructions of suture lines onto the head via a head-mounted display. Participants were required to identify craniometric points with and without AR assistance. Targeting error was measured as the Euclidean distance between the user-defined location and the true craniometric point on the subjects' heads. RESULTS All participants successfully registered 3-dimensional reconstructions onto the subjects' heads. Targeting accuracy was significantly improved with AR (3.59 ± 1.29 mm). Across all target points, AR increased accuracy by an average of 19.96 ± 3.80 mm. Posttest surveys revealed that participants felt the technology increased their confidence in identifying landmarks (4.6/5) and that the technology will be useful for clinical care (4.2/5). CONCLUSIONS While several areas of improvement and innovation can further enhance the use of AR in neurosurgery, this report demonstrates the feasibility of a markerless headset-based AR system for visualizing craniometric points on the skull. As the technology continues to advance, AR is expected to play an increasingly significant role in neurosurgery, transforming how surgeries are performed and improving patient care.
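The targeting-error metric described above is simply the Euclidean distance between a user-selected point and the ground-truth craniometric point. A minimal sketch (not from the paper; the function and point names are illustrative assumptions):

```python
import numpy as np

def targeting_error(user_point, true_point):
    """Euclidean distance (mm) between the user-defined location and
    the true craniometric point, both given as 3D coordinates."""
    user_point = np.asarray(user_point, dtype=float)
    true_point = np.asarray(true_point, dtype=float)
    return float(np.linalg.norm(user_point - true_point))

# Example: a point picked 3 mm lateral and 4 mm superior to the target
# is 5 mm off.
err = targeting_error([0.0, 0.0, 0.0], [3.0, 4.0, 0.0])
```

Averaging this error across all target points, with and without AR assistance, yields the comparison the study reports.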
Affiliation(s)
- Pranish A Kantak
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Seamus Bartlett
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Anisse Chaker
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Samuel Harmon
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Tarek Mansour
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Jacob Pawloski
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Edvin Telemi
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Heegook Yeo
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Samantha Winslow
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Lisa Scarpace
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Adam Robin
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Jack P Rock
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA.
7.
Qi Z, Jin H, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. The Feasibility and Accuracy of Holographic Navigation with Laser Crosshair Simulator Registration on a Mixed-Reality Display. Sensors (Basel) 2024; 24:896. PMID: 38339612. PMCID: PMC10857152. DOI: 10.3390/s24030896.
Abstract
Addressing conventional neurosurgical navigation systems' high costs and complexity, this study explores the feasibility and accuracy of a simplified, cost-effective mixed reality navigation (MRN) system based on a laser crosshair simulator (LCS). A new automatic registration method was developed, featuring coplanar laser emitters and a recognizable target pattern. The workflow was integrated into Microsoft's HoloLens-2 for practical application. The study assessed the system's precision by utilizing life-sized 3D-printed head phantoms based on computed tomography (CT) or magnetic resonance imaging (MRI) data from 19 patients (female/male: 7/12, average age: 54.4 ± 18.5 years) with intracranial lesions. Six to seven CT/MRI-visible scalp markers were used as reference points per case. The LCS-MRN's accuracy was evaluated through landmark-based and lesion-based analyses, using metrics such as target registration error (TRE) and Dice similarity coefficient (DSC). The system demonstrated immersive capabilities for observing intracranial structures across all cases. Analysis of 124 landmarks showed a TRE of 3.0 ± 0.5 mm, consistent across various surgical positions. The DSC of 0.83 ± 0.12 correlated significantly with lesion volume (Spearman rho = 0.813, p < 0.001). Therefore, the LCS-MRN system is a viable tool for neurosurgical planning, highlighting its low user dependency, cost-efficiency, and accuracy, with prospects for future clinical application enhancements.
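The two accuracy metrics used above, target registration error (TRE) over landmarks and the Dice similarity coefficient (DSC) over lesion volumes, follow standard definitions. A hedged sketch (not the authors' code; names and inputs are illustrative):

```python
import numpy as np

def mean_tre(registered_landmarks, reference_landmarks):
    """Mean target registration error (mm): average Euclidean distance
    between corresponding landmark pairs after registration."""
    a = np.asarray(registered_landmarks, dtype=float)
    b = np.asarray(reference_landmarks, dtype=float)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary lesion masks:
    2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    return float(2.0 * intersection / (a.sum() + b.sum()))
```

The study's landmark-based analysis corresponds to `mean_tre` over the scalp markers, and the lesion-based analysis to `dice` between the holographically projected and ground-truth lesion volumes.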
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Ruochu Xiong
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, Kanazawa 920-8641, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
8.
Tao B, Fan X, Wang F, Chen X, Shen Y, Wu Y. Comparison of the accuracy of dental implant placement using dynamic and augmented reality-based dynamic navigation: An in vitro study. J Dent Sci 2024; 19:196-202. PMID: 38303816. PMCID: PMC10829549. DOI: 10.1016/j.jds.2023.05.006.
Abstract
Background/purpose Augmented reality has gradually been applied in dental implant surgery. However, whether a dynamic navigation system integrated with augmented reality technology further improves accuracy remains unknown. The purpose of this study was to investigate the accuracy of dental implant placement using dynamic navigation and augmented reality-based dynamic navigation systems. Materials and methods Thirty-two cone-beam CT (CBCT) scans from clinical patients were collected and used to generate 64 phantoms allocated to the augmented reality-based dynamic navigation (ARDN) group or the conventional dynamic navigation (DN) group. The primary outcomes were global coronal, apical and angular deviations, measured after image fusion. A linear mixed model with a random intercept was used; P < 0.05 was considered statistically significant. Results A total of 242 dental implants were placed across the two groups. The global coronal, apical and angular deviations of the ARDN and DN groups were 1.31 ± 0.67 mm vs. 1.18 ± 0.59 mm, 1.36 ± 0.67 mm vs. 1.39 ± 0.55 mm, and 3.72 ± 2.13° vs. 3.1 ± 1.56°, respectively. No significant differences were found for the coronal and apical deviations (P = 0.16 and 0.6, respectively), but the DN group had a significantly lower angular deviation than the ARDN group (P = 0.02). Conclusion The augmented reality-based dynamic navigation system achieved accuracy similar to that of the conventional dynamic navigation system at the coronal and apical points, but showed a higher angular deviation.
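The coronal, apical and angular deviations compared above are conventionally derived from the coronal and apical center points of the planned and placed implants. A minimal sketch under that assumption (not the study's code; function and variable names are hypothetical):

```python
import numpy as np

def implant_deviations(planned_coronal, planned_apical,
                       placed_coronal, placed_apical):
    """Global coronal/apical deviations (mm) and angular deviation
    (degrees) between a planned and a placed implant, each defined by
    its coronal and apical center points in 3D."""
    pc = np.asarray(planned_coronal, dtype=float)
    pa = np.asarray(planned_apical, dtype=float)
    qc = np.asarray(placed_coronal, dtype=float)
    qa = np.asarray(placed_apical, dtype=float)
    coronal = float(np.linalg.norm(pc - qc))
    apical = float(np.linalg.norm(pa - qa))
    u = (pa - pc) / np.linalg.norm(pa - pc)  # planned implant axis
    v = (qa - qc) / np.linalg.norm(qa - qc)  # placed implant axis
    # Clamp the dot product to avoid arccos domain errors from rounding.
    angle = float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))
    return coronal, apical, angle
```

With perfectly matching points all three deviations are zero; tilting the placed axis while keeping the coronal entry point fixed produces a purely angular (and apical) deviation.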
Affiliation(s)
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Feng Wang
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
9.
Vidal-Sicart S, Goñi E, Cebrecos I, Rioja ME, Perissinotti A, Sampol C, Vidal O, Saavedra-Pérez D, Ferrer A, Martí C, Ferrer Rebolleda J, García Velloso MJ, Orozco-Cortés J, Díaz-Feijóo B, Niñerola-Baizán A, Valdés Olmos RA. Continuous innovation in precision radio-guided surgery. Rev Esp Med Nucl Imagen Mol 2024; 43:39-54. PMID: 37963516. DOI: 10.1016/j.remnie.2023.11.001.
Abstract
Since its origins, nuclear medicine has faced technological changes that have required modifying operating modes and adapting protocols. In radioguided surgery, the incorporation of preoperative scintigraphic imaging and intraoperative detection with the gamma probe gave sentinel lymph node biopsy the definitive boost it needed to become a standard procedure for melanoma and breast cancer. Successive technological innovations and the consequent adaptation of protocols combine the disruptive with the gradual: clear examples are the introduction of SPECT/CT preoperatively and drop-in probes intraoperatively. Other innovative aspects with possible application in radioguided surgery build on artificial intelligence, navigation and telecare.
Affiliation(s)
- Sergi Vidal-Sicart
- Servicio de Medicina Nuclear, Hospital Clínic Barcelona, Barcelona, Spain; Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain.
- Elena Goñi
- Servicio de Medicina Nuclear, Hospital Universitario de Navarra, Pamplona, Spain
- Isaac Cebrecos
- Instituto Clínic de Ginecología, Obstetricia y Neonatología (ICGON), Hospital Clínic Barcelona, Barcelona, Spain
- Andrés Perissinotti
- Servicio de Medicina Nuclear, Hospital Clínic Barcelona, Barcelona, Spain; Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain; Centro de Investigación Biomédica en Red de Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), ISCIII, Madrid, Spain
- Catalina Sampol
- Servicio de Medicina Nuclear, Hospital Universitario Son Espases, Palma de Mallorca, Spain
- Oscar Vidal
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain; Cirugía General y Digestiva, ICMDiM, Hospital Clínic de Barcelona, Barcelona, Spain; Departamento de Cirugía, Universitat de Barcelona, Barcelona, Spain
- David Saavedra-Pérez
- Cirugía General y Digestiva, ICMDiM, Hospital Clínic de Barcelona, Barcelona, Spain
- Ada Ferrer
- Servicio de Cirugía Maxilofacial, Hospital Clínic Barcelona, Barcelona, Spain
- Carles Martí
- Servicio de Cirugía Maxilofacial, Hospital Clínic Barcelona, Barcelona, Spain
- José Ferrer Rebolleda
- Servicio Medicina Nuclear Ascires, Hospital General Universitario de Valencia, Valencia, Spain
- Jhon Orozco-Cortés
- Servicio de Medicina Nuclear, Hospital Clínico Universitario de Valencia, Barcelona, Spain
- Berta Díaz-Feijóo
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain; Instituto Clínic de Ginecología, Obstetricia y Neonatología (ICGON), Hospital Clínic Barcelona, Barcelona, Spain; Departamento de Cirugía, Universitat de Barcelona, Barcelona, Spain
- Aida Niñerola-Baizán
- Servicio de Medicina Nuclear, Hospital Clínic Barcelona, Barcelona, Spain; Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain; Centro de Investigación Biomédica en Red de Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), ISCIII, Madrid, Spain; Departamento de Biomedicina, Facultad de Medicina, Universitat de Barcelona, Barcelona, Spain
- Renato Alfredo Valdés Olmos
- Department of Radiology, Section of Nuclear Medicine & Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands
10
Olexa J, Trang A, Kim K, Rakovec M, Saadon J, Parker W. Augmented Reality-Assisted Placement of Ommaya Reservoir for Cyst Aspiration: A Case Report. Cureus 2024; 16:e52383. [PMID: 38371146 PMCID: PMC10870692 DOI: 10.7759/cureus.52383]
Abstract
Image guidance technologies can significantly improve the accuracy and safety of intracranial catheter insertions. Augmented reality (AR) allows surgeons to visualize 3D information overlaid onto a patient's head. As such, AR has emerged as a novel image guidance technology that offers unique advantages when navigating intracranial targets. A 71-year-old woman with a history of brain metastasis from breast cancer and prior resection surgery and chemotherapy presented with altered mental status and generalized weakness worse on her left side. Magnetic resonance imaging (MRI) demonstrated right frontotemporoparietal edema with a contrast-enhancing mass. MR perfusion confirmed an active tumor with an enlarging right temporal pole cyst. A cyst aspiration was performed via Ommaya reservoir placement. Neuro-navigation (BrainLab, Munich, Germany) and AR navigation were used to plan the trajectory from the temporal gyrus to the cyst. Post-operative computed tomography (CT) demonstrated good placement of the reservoir, reconstitution of the temporal horn of the lateral ventricle with decreased external mass effect, and no areas of hemorrhage. AR has tremendous potential in the field of neurosurgery for improving the accuracy and safety of procedures. This case demonstrates an encouraging application of AR and can serve as an example to drive expanded clinical use of this technology.
Affiliation(s)
- Joshua Olexa
- Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Annie Trang
- Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Kevin Kim
- Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Maureen Rakovec
- Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Jordan Saadon
- Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Whitney Parker
- Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
11
Kos TM, Colombo E, Bartels LW, Robe PA, van Doormaal TPC. Evaluation Metrics for Augmented Reality in Neurosurgical Preoperative Planning, Surgical Navigation, and Surgical Treatment Guidance: A Systematic Review. Oper Neurosurg (Hagerstown) 2023; 26:01787389-990000000-01007. [PMID: 38146941 PMCID: PMC11008635 DOI: 10.1227/ons.0000000000001009]
Abstract
BACKGROUND AND OBJECTIVE Recent years have shown an advancement in the development of augmented reality (AR) technologies for preoperative visualization, surgical navigation, and intraoperative guidance for neurosurgery. However, proving added value for AR in clinical practice is challenging, partly because of a lack of standardized evaluation metrics. We performed a systematic review to provide an overview of the reported evaluation metrics for AR technologies in neurosurgical practice and to establish a foundation for assessment and comparison of such technologies. METHODS PubMed, Embase, and Cochrane were searched systematically for publications on assessment of AR for cranial neurosurgery on September 22, 2022. The findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS The systematic search yielded 830 publications; 114 were screened full text, and 80 were included for analysis. Among the included studies, 5% dealt with preoperative visualization using AR, with user perception as the most frequently reported metric. The majority (75%) researched AR technology for surgical navigation, with registration accuracy, clinical outcome, and time measurements as the most frequently reported metrics. In addition, 20% studied the use of AR for intraoperative guidance, with registration accuracy, task outcome, and user perception as the most frequently reported metrics. CONCLUSION For quality benchmarking of AR technologies in neurosurgery, evaluation metrics should be specific to the risk profile and clinical objectives of the technology. A key focus should be on using validated questionnaires to assess user perception; ensuring clear and unambiguous reporting of registration accuracy, precision, robustness, and system stability; and accurately measuring task performance in clinical studies. We provide an overview suggesting which evaluation metrics to use per AR application and innovation phase, aiming to improve the assessment of the added value of AR for neurosurgical practice and to facilitate its integration in the clinical workflow.
Affiliation(s)
- Tessa M. Kos
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Elisa Colombo
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- L. Wilbert Bartels
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Pierre A. Robe
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P. C. van Doormaal
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
12
Qi Z, Bopp MHA, Nimsky C, Chen X, Xu X, Wang Q, Gan Z, Zhang S, Wang J, Jin H, Zhang J. A Novel Registration Method for a Mixed Reality Navigation System Based on a Laser Crosshair Simulator: A Technical Note. Bioengineering (Basel) 2023; 10:1290. [PMID: 38002414 PMCID: PMC10669875 DOI: 10.3390/bioengineering10111290]
Abstract
Mixed Reality Navigation (MRN) is pivotal in augmented reality-assisted intelligent neurosurgical interventions. However, existing MRN registration methods struggle to achieve low user dependency, high accuracy, and clinical applicability at the same time. This study proposes and evaluates the feasibility and accuracy of a novel registration method built around a laser crosshair simulator designed to replicate the scanner frame's position on the patient. The system autonomously calculates the transformation that maps coordinates from the tracking space to the reference image space. A mathematical model and workflow for registration were designed, and a Universal Windows Platform (UWP) application was developed on HoloLens 2. Finally, a head phantom was used to measure the system's target registration error (TRE). The proposed method was successfully implemented, obviating the need for user interaction with virtual objects during registration. Regarding accuracy, the average deviation was 3.7 ± 1.7 mm. The method shows encouraging results in efficiency and intuitiveness and marks a valuable advance toward low-cost, easy-to-use MRN systems. Its potential for enhancing accuracy and adaptability in interventional procedures makes the approach promising for improving surgical outcomes.
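The target registration error used above to quantify accuracy has a simple operational form: map known points through the estimated transform and average their Euclidean distance to the corresponding positions in reference-image space. A minimal Python/numpy sketch of that definition (function name, points, and the 3.7 mm offset are illustrative, not taken from the cited study):

```python
import numpy as np

def target_registration_error(T_est, pts_tracking, pts_image):
    """Mean Euclidean distance (mm) between tracking-space points mapped
    through the estimated 4x4 homogeneous transform and their known
    positions in reference-image space."""
    pts_h = np.c_[pts_tracking, np.ones(len(pts_tracking))]  # homogeneous coords
    mapped = (T_est @ pts_h.T).T[:, :3]
    return float(np.linalg.norm(mapped - pts_image, axis=1).mean())

# Toy check: an estimate that is off by a pure 3.7 mm translation in x
# yields a TRE of 3.7 mm at every target point.
T_est = np.eye(4)
T_est[0, 3] = 3.7
pts = np.random.default_rng(0).uniform(-50, 50, (4, 3))
print(round(target_registration_error(T_est, pts, pts), 2))  # 3.7
```

In practice the target points are anatomical landmarks not used to compute the registration, which is what distinguishes TRE from fiducial registration error.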
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Xinghua Xu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
13
Fan X, Tao B, Tu P, Shen Y, Wu Y, Chen X. A novel mixed reality-guided dental implant placement navigation system based on virtual-actual registration. Comput Biol Med 2023; 166:107560. [PMID: 37847946 DOI: 10.1016/j.compbiomed.2023.107560]
Abstract
BACKGROUND The key to successful dental implant surgery is to place the implants accurately along the preoperatively planned paths. Surgical navigation systems can significantly improve the safety and accuracy of implantation, but the surgeon's frequent shifts of view between the surgical site and the computer screen remain a problem. Mixed reality technology promises to solve it: by wearing a HoloLens device, the surgeon sees the virtual three-dimensional (3D) image aligned with the actual surgical site in the same field of view. METHODS This study applied mixed reality technology to dental implant surgery navigation. The first step was to reconstruct a virtual 3D model from preoperative cone-beam CT (CBCT) images. The relative position between objects was then obtained using the navigation device and the HoloLens camera. Via virtual-actual registration algorithms, the transformation matrices between the HoloLens devices and the navigation tracker were acquired through HoloLens-tracker registration, and those between the virtual model and the patient phantom through image-phantom registration. In addition, a surgical drill calibration algorithm provided the transformation matrices between the surgical drill and the patient phantom. Together, these algorithms allow real-time tracking of the surgical drill's location and orientation relative to the patient phantom under the navigation device. With the aid of the HoloLens 2, the virtual 3D images and the actual patient phantom can be aligned accurately, giving surgeons a clear visualization of the implant path. RESULTS Phantom experiments were conducted on 30 patient phantoms, with a total of 102 dental implants inserted. Comparison of the actual implant paths with the preoperatively planned paths showed that the system achieved a coronal deviation of 1.507 ± 0.155 mm, an apical deviation of 1.542 ± 0.143 mm, and an angular deviation of 3.468 ± 0.339°. These deviations did not differ significantly from navigation-guided implant placement and were better than freehand placement. CONCLUSION The proposed system integrates the preoperatively planned dental implant paths with the patient phantom, helping surgeons achieve adequate accuracy in traditional dental implant surgery. It is expected to be applicable to animal and cadaveric experiments in further studies.
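The chain of registrations described in the methods (HoloLens-tracker, image-phantom, drill calibration) ultimately comes down to composing 4×4 homogeneous transforms. A small self-contained sketch of that composition (the frame names and numbers below are hypothetical, not the authors' implementation):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose(*transforms):
    """Chain 4x4 transforms left to right: compose(T_ab, T_bc) maps
    frame-c coordinates into frame a."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

# Hypothetical tracker measurements: tracker->phantom and tracker->drill.
# The drill pose expressed in the phantom frame is then
#   T_phantom_drill = inv(T_tracker_phantom) @ T_tracker_drill.
T_tracker_phantom = make_T(np.eye(3), [10.0, 0.0, 0.0])
T_tracker_drill = make_T(np.eye(3), [10.0, 5.0, 0.0])
T_phantom_drill = compose(np.linalg.inv(T_tracker_phantom), T_tracker_drill)
print(T_phantom_drill[:3, 3])  # drill tip sits 5 mm along the phantom's y-axis
```

The same composition pattern extends to any number of intermediate frames, which is why each extra registration step in such a pipeline adds its own error contribution to the final drill-to-phantom pose.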
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
14
Martinho FC, Griffin IL, Price JB, Tordik PA. Augmented Reality and 3-Dimensional Dynamic Navigation System Integration for Osteotomy and Root-end Resection. J Endod 2023; 49:1362-1368. [PMID: 37453501 DOI: 10.1016/j.joen.2023.07.007]
Abstract
INTRODUCTION Augmented reality (AR) superimposes high-definition computer-generated virtual content onto the existing environment, providing users with an enhanced perception of reality. This study investigates the feasibility of integrating an AR head-mounted device into a 3-dimensional dynamic navigation system (3D-DNS) for osteotomy and root-end resection (RER), and compares the accuracy and efficiency of AR + 3D-DNS with 3D-DNS alone. METHODS Seventy-two tooth roots of 3D-printed surgical jaw models were divided into two groups: AR + 3D-DNS (n = 36) and 3D-DNS (n = 36). Cone-beam computed tomography scans were taken pre- and postoperatively. The osteotomy and RER were virtually planned in X-guide software and delivered under 3D-DNS guidance. For the AR + 3D-DNS group, an AR head-mounted device (Microsoft HoloLens 2) was integrated into the 3D-DNS. The 2D and 3D deviations were calculated, and the osteotomy and RER time and the number of procedural mishaps were recorded. RESULTS Osteotomy and RER were completed in all samples (72/72). AR + 3D-DNS was more accurate than 3D-DNS, showing lower 2D and 3D deviation values (P < .05), and was also faster (P < .05). There was no significant difference in the number of mishaps (P > .05). CONCLUSIONS Within the limitations of this in vitro study, integrating an AR head-mounted device into 3D-DNS is feasible for osteotomy and RER. AR improved the accuracy and time efficiency of 3D-DNS in osteotomy and RER. Head-mounted AR has the potential to be safely and reliably integrated into 3D-DNS for endodontic microsurgery.
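Deviation metrics of this kind compare a planned target point with the delivered one. One plausible reading, purely for illustration (the cited study does not publish its formulas, so the function and the split into an in-plane 2D component are assumptions): 3D deviation as the Euclidean distance between planned and achieved points, 2D deviation as its in-plane component.

```python
import numpy as np

def deviations(planned, actual):
    """Hypothetical 2D/3D deviation metrics between a planned and an
    achieved target point (coordinates in mm):
    3D deviation = full Euclidean distance;
    2D deviation = the in-plane (x, y) component of that distance."""
    d = np.asarray(actual, float) - np.asarray(planned, float)
    return float(np.linalg.norm(d[:2])), float(np.linalg.norm(d))

d2, d3 = deviations([0.0, 0.0, 0.0], [0.3, 0.4, 1.2])
print(d2, d3)  # 0.5 mm in-plane, 1.3 mm overall
```

Reporting both numbers separates lateral placement error from depth error, which matters clinically because over-deep resection carries different risks than lateral drift.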
Affiliation(s)
- Frederico C Martinho
- Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Ina L Griffin
- Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Jeffery B Price
- Division of Oral Radiology, Department of Oncology and Diagnostic Sciences, University of Maryland, School of Dentistry, Baltimore, Maryland
- Patricia A Tordik
- Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
15
Remschmidt B, Rieder M, Gsaxner C, Gaessler J, Payer M, Wallner J. Augmented Reality-Guided Apicoectomy Based on Maxillofacial CBCT Scans. Diagnostics (Basel) 2023; 13:3037. [PMID: 37835780 PMCID: PMC10572956 DOI: 10.3390/diagnostics13193037]
Abstract
Implementation of augmented reality (AR) image guidance systems using preoperative cone beam computed tomography (CBCT) scans in apicoectomies promises to help surgeons avoid the iatrogenic complications associated with this procedure. This study aims to evaluate the intraoperative feasibility and usability of HoloLens 2, an established AR image guidance device, in the context of apicoectomies. Three experienced surgeons each carried out four AR-guided apicoectomies on human cadaver head specimens. The preparation and operating times of each procedure were measured, and the subjective usability of HoloLens for AR image guidance in apicoectomies was assessed with the System Usability Scale (SUS). In total, twelve AR-guided apicoectomies were performed on six human cadaver head specimens (n = 12). The average preparation time was 162 (±34) s, and the surgical procedure itself took on average 9 (±2) min, with no statistically significant difference between the three surgeons. Quantification of the usability of HoloLens yielded a mean SUS score of 80.4 (±6.8), indicating an "excellent" usability level. In conclusion, this study suggests the suitability, practicality, and simplicity of AR image guidance systems such as the HoloLens in apicoectomies and advocates their routine implementation.
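The SUS score reported above follows a fixed, fully specified scoring rule over ten 1-5 Likert items: odd-numbered (positively worded) items contribute their score minus 1, even-numbered (negatively worded) items contribute 5 minus their score, and the sum is scaled by 2.5 to a 0-100 range. A short sketch of that rule (the response sheet below is made up for illustration):

```python
def sus_score(responses):
    """System Usability Scale: ten Likert items scored 1-5.
    Odd items (index 0, 2, ...) contribute (score - 1); even items
    contribute (5 - score); the total is scaled by 2.5 to 0-100."""
    assert len(responses) == 10, "SUS requires exactly 10 item responses"
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5

# A uniformly favorable sheet: 5 on positive items, 1 on negative items.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

With this scaling, a mean score of 80.4 sits well above the commonly cited average of 68, consistent with the "excellent" label in the abstract.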
Affiliation(s)
- Bernhard Remschmidt
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Marcus Rieder
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria
- Jan Gaessler
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Michael Payer
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Juergen Wallner
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
16
Tu M, Jung H, Moghadam A, Raythatha J, Hsu J, Kim J. Exploring the Performance of Geometry-Based Markerless Registration in a Simulated Surgical Environment: A Comparative Study of Registration Algorithms in Medical Augmented Reality. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. [PMID: 38083251 DOI: 10.1109/embc40787.2023.10341197]
Abstract
Augmented Reality (AR) has been utilized in multiple applications in the medical field, such as augmenting Computed Tomography (CT) images onto the patient's body during surgery. However, one of the challenges in its utilization is registering the preoperative CT images to the patient's body accurately. The current registration process requires prior attachment of tracking markers and their localization within the body and the CT images. This process can be cumbersome, error-prone, and dependent on the surgeon's experience. Moreover, medical instruments, drapes, or the body itself may occlude the markers. In light of these limitations, markerless registration algorithms have the potential to aid the registration process in the clinical setting. While such algorithms have been used successfully in other sectors, such as multimedia, they have not yet been thoroughly investigated in a clinical setting, especially in surgery, where patient positioning and the surgical environment pose more challenging cases. In this paper, we benchmarked and evaluated the performance of six state-of-the-art markerless registration algorithms from the multimedia sector by registering a CT image onto a whole-body phantom dataset acquired in a simulated surgical environment. We also analyzed the suitability of these algorithms for the surgical setting and discussed their potential for the advancement of AR-assisted surgery. Clinical Relevance: Our study provides insight into the potential of AR-assisted surgery and helps practitioners choose the most suitable registration algorithm for their needs, improving patient outcomes, reducing the risk of surgical errors, and shortening preoperative planning time.
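Many geometry-based markerless registration algorithms of the kind benchmarked here iterate a closed-form rigid alignment step (Kabsch/Procrustes) with re-estimated point correspondences, as in ICP. A self-contained sketch of that core step on synthetic data (this is the generic technique, not the paper's specific benchmark code):

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (Kabsch/Procrustes) mapping src onto
    dst, given known point correspondences. ICP-style markerless
    registration repeats this step with re-estimated correspondences."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic test: rotate a random cloud by 30 degrees about z and shift it,
# then recover the exact transform from the correspondences.
rng = np.random.default_rng(1)
src = rng.uniform(-1, 1, (20, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.3])
dst = src @ R_true.T + t_true
R, t = rigid_align(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

What separates the benchmarked algorithms is not this closed-form step but how they find correspondences under partial overlap and occlusion, which is exactly where surgical scenes are harder than multimedia ones.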
17
Ochi Y, Yanai S, Yoshino Y, Sawada M, Sakate S, Kanno K, Andou M. Clinical use of mixed reality for laparoscopic myomectomy. Int J Gynaecol Obstet 2023. [PMID: 36965106 DOI: 10.1002/ijgo.14765]
18
Zari G, Condino S, Cutolo F, Ferrari V. Magic Leap 1 versus Microsoft HoloLens 2 for the Visualization of 3D Content Obtained from Radiological Images. Sensors (Basel) 2023; 23:3040. [PMID: 36991751 PMCID: PMC10054537 DOI: 10.3390/s23063040]
Abstract
The adoption of extended reality solutions is growing rapidly in the healthcare world. Augmented reality (AR) and virtual reality (VR) interfaces can bring advantages to various medical-health sectors; it is thus not surprising that the medical mixed reality (MR) market is among the fastest growing. The present study reports a comparison between two of the most popular MR head-mounted displays, Magic Leap 1 and Microsoft HoloLens 2, for the visualization of 3D medical imaging data. We evaluate the functionality and performance of both devices through a user study in which surgeons and residents assessed the visualization of 3D computer-generated anatomical models. The digital content is obtained through a dedicated medical imaging suite (the Verima imaging suite) developed by the Italian start-up Witapp s.r.l. According to our performance analysis in terms of frame rate, there are no significant differences between the two devices. The surgical staff expressed a clear preference for Magic Leap 1, particularly for its better visualization quality and ease of interaction with the 3D virtual content. Nonetheless, even though the questionnaire results were slightly more positive for Magic Leap 1, the spatial understanding of the 3D anatomical model in terms of depth relations and spatial arrangement was positively evaluated for both devices.
Affiliation(s)
- Giulia Zari
- Information Engineering Department, University of Pisa, Via Girolamo Caruso, 16, 56122 Pisa, Italy
- Sara Condino
- Information Engineering Department, University of Pisa, Via Girolamo Caruso, 16, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo
- Information Engineering Department, University of Pisa, Via Girolamo Caruso, 16, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Vincenzo Ferrari
- Information Engineering Department, University of Pisa, Via Girolamo Caruso, 16, 56122 Pisa, Italy
- EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy