1
Al Hamad KQ, Said KN, Engelschalk M, Matoug-Elwerfelli M, Gupta N, Eric J, Ali SA, Ali K, Daas H, Abu Alhaija ES. Taxonomic discordance of immersive realities in dentistry: A systematic scoping review. J Dent 2024;146:105058. [PMID: 38729286 DOI: 10.1016/j.jdent.2024.105058]
Abstract
OBJECTIVES This review aimed to map taxonomy frameworks, descriptions, and applications of immersive technologies in the dental literature. DATA The Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for scoping reviews (PRISMA-ScR) guidelines were followed, and the protocol was registered on the Open Science Framework platform (https://doi.org/10.17605/OSF.IO/H6N8M). SOURCES A systematic search was conducted in the MEDLINE (via PubMed), Scopus, and Cochrane Library databases, complemented by a manual search. STUDY SELECTION A total of 84 articles were included, with 81% published between 2019 and 2023. Most studies were experimental (62%), including education (25%), protocol feasibility (20%), in vitro (11%), and cadaver (6%) studies. Other study types included clinical reports/technique articles (24%), clinical studies (9%), technical notes/tips to the reader (4%), and randomized controlled trials (1%). Three-quarters of the included studies were published in the oral and maxillofacial surgery (38%), dental education (26%), and implant (12%) disciplines. Methods of display included head-mounted display (HMD) devices (55%), see-through screens (32%), 2D screen displays (11%), and projector displays (2%). Descriptions of immersive realities were fragmented and inconsistent, with no clear taxonomy framework for the umbrella and subset terms, including virtual reality (VR), augmented reality (AR), mixed reality (MR), augmented virtuality (AV), extended reality, and X reality. CONCLUSIONS Immersive reality applications in dentistry are gaining popularity, with a notable surge in the number of publications in the last 5 years. Ambiguities are apparent in the descriptions of immersive realities. A taxonomy framework based on method of display (full or partial) and reality class (VR, AR, or MR) is proposed. CLINICAL SIGNIFICANCE Understanding the different reality classes can be perplexing due to their blurred boundaries and conceptual overlap.
Immersive technologies offer novel educational and clinical applications. This domain is fast developing. With the current fragmented and inconsistent terminologies, a comprehensive taxonomy framework is necessary.
Affiliation(s)
- Khaled Q Al Hamad
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Khalid N Said
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Marcus Engelschalk
- Department of Oral and Maxillofacial Surgery, University Medical Center Hamburg-Eppendorf, Germany
- Nidhi Gupta
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Jelena Eric
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Shaymaa A Ali
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Kamran Ali
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Hanin Daas
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
2
Li F, Gao Q, Wang N, Greene N, Song T, Dianat O, Azimi E. Mixed reality guided root canal therapy. Healthc Technol Lett 2024;11:167-178. [PMID: 38638496 PMCID: PMC11022218 DOI: 10.1049/htl2.12077]
Abstract
Root canal therapy (RCT) is a widely performed procedure in dentistry, with over 25 million individuals undergoing it annually. The procedure is carried out to address inflammation or infection within the root canal system of affected teeth. However, accurately aligning CT scan information with the patient's tooth has posed challenges, leading to errors in tool positioning and potentially negative outcomes. To overcome these challenges, a mixed reality application was developed using an optical see-through head-mounted display (OST-HMD). The application incorporates visual cues, an augmented mirror, and dynamically updated multi-view CT slices to address depth perception issues and achieve accurate tooth localization, comprehensive canal exploration, and prevention of perforation during RCT. A preliminary experimental assessment showed significant improvements in procedural accuracy. Specifically, with the system, positional accuracy improved from 1.4 to 0.4 mm (more than a 70% gain) using an optical tracker (NDI) and from 2.8 to 2.4 mm using an HMD, thereby achieving submillimeter accuracy with NDI. Six participants were enrolled in the user study, which found an average displacement on the crown plane of 1.27 ± 0.83 cm, an average depth error of 0.90 ± 0.72 cm, and an average angular deviation of 1.83 ± 0.83°. Our error analysis further highlights the impact of HMD spatial localization and head motion on the registration and calibration process. Through seamless integration of CT image information with the patient's tooth, our mixed reality application assists dentists in achieving precise tool placement. This advance has the potential to elevate the quality of root canal procedures, ensuring better accuracy and enhancing overall treatment outcomes.
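Positional and angular deviations like those reported above are typically computed from the planned and performed tool poses: a Euclidean distance between corresponding points and the angle between tool axes. A minimal sketch of that computation (all values are illustrative assumptions, not the study's measurements):

```python
import numpy as np

def placement_deviation(planned_tip, planned_axis, actual_tip, actual_axis):
    """Return (positional deviation, angular deviation in degrees)
    between a planned and a performed tool pose."""
    positional = float(np.linalg.norm(
        np.asarray(actual_tip, float) - np.asarray(planned_tip, float)))
    u = np.asarray(planned_axis, float)
    v = np.asarray(actual_axis, float)
    u /= np.linalg.norm(u)
    v /= np.linalg.norm(v)
    # Clip guards against floating-point values just outside [-1, 1].
    angular = float(np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))))
    return positional, angular

# Hypothetical planned vs. performed poses (points in mm).
pos_dev, ang_dev = placement_deviation([0, 0, 0], [0, 0, 1],
                                       [0.3, 0.2, 0.1], [0.0, 0.05, 1.0])
```

The same two metrics underlie the crown-plane displacement and angular deviation figures quoted in the abstract.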
Affiliation(s)
- Fangjie Li
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, USA
- Qingying Gao
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Nengyu Wang
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Nicholas Greene
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Tianyu Song
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Omid Dianat
- School of Dentistry, University of Maryland, Baltimore, Maryland, USA
- Ehsan Azimi
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
3
Fan X, Tao B, Tu P, Shen Y, Wu Y, Chen X. A novel mixed reality-guided dental implant placement navigation system based on virtual-actual registration. Comput Biol Med 2023;166:107560. [PMID: 37847946 DOI: 10.1016/j.compbiomed.2023.107560]
Abstract
BACKGROUND The key to successful dental implant surgery is to place the implants accurately along the pre-operatively planned paths. Surgical navigation systems can significantly improve the safety and accuracy of implantation. However, the surgeon's frequent shifts of view between the surgical site and the computer screen are problematic; mixed reality technology, delivered through a HoloLens device, is expected to solve this by aligning the virtual three-dimensional (3D) image with the actual surgical site in the same field of view. METHODS This study utilized mixed reality technology to enhance dental implant surgery navigation. The first step was reconstructing a virtual 3D model from pre-operative cone-beam CT (CBCT) images. The relative positions between objects were then obtained using the navigation device and the HoloLens camera. Via virtual-actual registration algorithms, the transformation matrices between the HoloLens device and the navigation tracker were acquired through HoloLens-tracker registration, and the transformation matrices between the virtual model and the patient phantom through image-phantom registration. In addition, a surgical drill calibration algorithm provided the transformation matrices between the surgical drill and the patient phantom. These algorithms allow real-time tracking of the surgical drill's location and orientation relative to the patient phantom under the navigation device. With the aid of the HoloLens 2, virtual 3D images and the actual patient phantom can be aligned accurately, providing surgeons with a clear visualization of the implant path. RESULTS Phantom experiments were conducted using 30 patient phantoms, with a total of 102 dental implants inserted. Comparisons between the actual implant paths and the pre-operatively planned paths showed that the system achieved a coronal deviation of 1.507 ± 0.155 mm, an apical deviation of 1.542 ± 0.143 mm, and an angular deviation of 3.468 ± 0.339°. The deviation was not significantly different from that of navigation-guided dental implant placement but better than that of freehand placement. CONCLUSION The proposed system integrates the pre-operatively planned dental implant paths with the patient phantom, helping surgeons achieve adequate accuracy in traditional dental implant surgery. Furthermore, the system is expected to be applicable to animal and cadaveric experiments in further studies.
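The virtual-actual registration chain described above composes several rigid transforms: the tracker reports the drill pose and the patient-phantom pose in its own frame, and chaining those readings yields the drill pose relative to the phantom. A minimal sketch with 4x4 homogeneous transforms (all poses and frame names are illustrative assumptions, not the paper's data):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Hypothetical tracker readings: drill and patient-phantom poses in the tracker frame.
T_tracker_drill = make_transform(rot_z(np.pi / 6), np.array([10.0, 5.0, 2.0]))
T_tracker_phantom = make_transform(rot_z(-np.pi / 4), np.array([0.0, 20.0, 1.0]))

# Chain the readings to express the drill pose relative to the patient phantom.
T_phantom_drill = np.linalg.inv(T_tracker_phantom) @ T_tracker_drill

# The drill-frame origin (e.g. the tip after calibration) in phantom coordinates.
tip_in_phantom = (T_phantom_drill @ np.array([0.0, 0.0, 0.0, 1.0]))[:3]
```

In a full system, further transforms from HoloLens-tracker registration and image-phantom registration would be composed the same way to overlay the virtual model on the phantom.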
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
4
Riad Deglow E, Zubizarreta-Macho Á, González Menéndez H, Lorrio Castro J, Galparsoro Catalán A, Tzironi G, Lobo Galindo AB, Alonso Ezpeleta LÓ, Hernández Montero S. Comparative analysis of two navigation techniques based on augmented reality technology for the orthodontic mini-implants placement. BMC Oral Health 2023;23:542. [PMID: 37543581 PMCID: PMC10403882 DOI: 10.1186/s12903-023-03261-y]
Abstract
To analyze and compare the accuracy and prevalence of root contact of a conventional freehand technique and two navigation techniques based on augmented reality technology for orthodontic self-drilling mini-implant placement. Methods Two hundred and seven orthodontic self-drilling mini-implants were placed using either a conventional freehand technique (FHT) or one of two navigation techniques based on augmented reality technology (AR TOOTH and AR SCREWS). Accuracy across different dental sectors was also analyzed. CBCT and intraoral scans were taken both prior to and following mini-implant placement. The angular and horizontal deviations were then analyzed; these measurements were taken at the coronal entry point and apical endpoint between the planned and placed mini-implants. In addition, any complications resulting from mini-implant placement, such as root perforations, were also analyzed across all dental sectors. Results The statistical analysis showed significant differences between study groups with regard to the coronal entry-point (p < 0.001), apical end-point (p < 0.001), and angular (p < 0.001) deviations. Furthermore, statistically significant differences were shown between placement sites at the coronal entry-point (p < 0.0001) and apical end-point (p < 0.001). Additionally, eight root perforations were observed in the FHT group, while there were none with either of the augmented reality navigation techniques. Conclusions Navigation techniques based on augmented reality technology affect the accuracy of orthodontic self-drilling mini-implant placement and result in fewer intraoperative complications compared with the conventional freehand technique. The AR TOOTH technique showed more accurate results between planned and placed mini-implants than the AR SCREWS and conventional freehand techniques.
Affiliation(s)
- Elena Riad Deglow
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda Universidad, 1. 28691, Villanueva de La Cañada, Madrid, Spain
- Álvaro Zubizarreta-Macho
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda Universidad, 1. 28691, Villanueva de La Cañada, Madrid, Spain
- Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Héctor González Menéndez
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda Universidad, 1. 28691, Villanueva de La Cañada, Madrid, Spain
- Juan Lorrio Castro
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda Universidad, 1. 28691, Villanueva de La Cañada, Madrid, Spain
- Agustín Galparsoro Catalán
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda Universidad, 1. 28691, Villanueva de La Cañada, Madrid, Spain
- Georgia Tzironi
- Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Ana Belén Lobo Galindo
- Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Sofía Hernández Montero
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda Universidad, 1. 28691, Villanueva de La Cañada, Madrid, Spain
5
Woodward J, Ruiz J. Analytic Review of Using Augmented Reality for Situational Awareness. IEEE Trans Vis Comput Graph 2023;29:2166-2183. [PMID: 35007195 DOI: 10.1109/tvcg.2022.3141585]
Abstract
Situational awareness is the perception and understanding of the surrounding environment. Maintaining situational awareness is vital for performance and error prevention in safety-critical domains. Prior work has examined applying augmented reality (AR) to improving situational awareness, but has mainly focused on the applicability of AR rather than on information design. Hence, there is a need to investigate how to design the presentation of information, especially in AR headsets, to increase users' situational awareness. We conducted a systematic literature review to research how information is currently presented in AR, especially in systems used for situational awareness. Comparing current presentations of information to existing design recommendations aided in identifying future areas of design. In addition, this survey discusses opportunities and challenges in applying AR to increasing users' situational awareness.
6
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023;68. [PMID: 36580681 DOI: 10.1088/1361-6560/acaf23]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. The types of AR visualization are divided into two categories: in situ visualization and non-in situ visualization. The rendered content of AR visualization varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications were evaluated through model experiments and animal experiments, and there are relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
7
Fan X, Zhu Q, Tu P, Joskowicz L, Chen X. A review of advances in image-guided orthopedic surgery. Phys Med Biol 2023;68. [PMID: 36595258 DOI: 10.1088/1361-6560/acaae9]
Abstract
Orthopedic surgery remains technically demanding due to the complex anatomical structures and cumbersome surgical procedures. The introduction of image-guided orthopedic surgery (IGOS) has significantly decreased the surgical risk and improved the operation results. This review focuses on the application of recent advances in artificial intelligence (AI), deep learning (DL), augmented reality (AR) and robotics in image-guided spine surgery, joint arthroplasty, fracture reduction and bone tumor resection. For the pre-operative stage, key technologies of AI and DL based medical image segmentation, 3D visualization and surgical planning procedures are systematically reviewed. For the intra-operative stage, the development of novel image registration, surgical tool calibration and real-time navigation are reviewed. Furthermore, the combination of the surgical navigation system with AR and robotic technology is also discussed. Finally, the current issues and prospects of the IGOS system are discussed, with the goal of establishing a reference and providing guidance for surgeons, engineers, and researchers involved in the research and development of this area.
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Qiyang Zhu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Leo Joskowicz
- School of Computer Science and Engineering, The Hebrew University of Jerusalem, Jerusalem, Israel
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
8
Leung T, Dam VV, Lee DH. Accuracy of Augmented Reality-Assisted Navigation in Dental Implant Surgery: Systematic Review and Meta-analysis. J Med Internet Res 2023;25:e42040. [PMID: 36598798 PMCID: PMC9856431 DOI: 10.2196/42040]
Abstract
BACKGROUND The novel concept of immersive 3D augmented reality (AR) surgical navigation has recently been introduced in the medical field. This method allows surgeons to directly focus on the surgical objective without having to look at a separate monitor. In the dental field, the recently developed AR-assisted dental implant navigation system (AR navigation), which uses innovative image technology to directly visualize and track a presurgical plan over an actual surgical site, has attracted great interest. OBJECTIVE This study is the first systematic review and meta-analysis study that aimed to assess the accuracy of dental implants placed by AR navigation and compare it with that of the widely used implant placement methods, including the freehand method (FH), template-based static guidance (TG), and conventional navigation (CN). METHODS Individual search strategies were used in PubMed (MEDLINE), Scopus, ScienceDirect, Cochrane Library, and Google Scholar to search for articles published until March 21, 2022. This study was performed in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines and registered in the International Prospective Register of Systematic Reviews (PROSPERO) database. Peer-reviewed journal articles evaluating the positional deviations of dental implants placed using AR-assisted implant navigation systems were included. Cohen d statistical power analysis was used to investigate the effect size estimate and CIs of standardized mean differences (SMDs) between data sets. RESULTS Among the 425 articles retrieved, 15 articles were considered eligible for narrative review, 8 articles were considered for single-arm meta-analysis, and 4 were included in a 2-arm meta-analysis. 
The mean lateral, global, depth, and angular deviations of the dental implant placed using AR navigation were 0.90 (95% CI 0.78-1.02) mm, 1.18 (95% CI 0.95-1.41) mm, 0.78 (95% CI 0.48-1.08) mm, and 3.96° (95% CI 3.45°-4.48°), respectively. The accuracy of AR navigation was significantly higher than that of the FH method (SMD=-1.01; 95% CI -1.47 to -0.55; P<.001) and CN method (SMD=-0.46; 95% CI -0.64 to -0.29; P<.001). However, the accuracies of the AR navigation and TG methods were similar (SMD=0.06; 95% CI -0.62 to 0.74; P=.73). CONCLUSIONS The positional deviations of AR-navigated implant placements were within the safety zone, suggesting clinically acceptable accuracy of the AR navigation method. Moreover, the accuracy of AR implant navigation was comparable with that of the highly recommended dental implant-guided surgery method, TG, and superior to that of the conventional FH and CN methods. This review highlights the possibility of using AR navigation as an effective and accurate immersive surgical guide for dental implant placement.
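The standardized mean differences (SMDs) reported above follow the Cohen's d construction: the difference in group means divided by the pooled standard deviation. A minimal sketch of the pooled-SD form (the numbers below are illustrative assumptions, not values from the review):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical comparison: global deviation (mm) of AR navigation vs. freehand placement.
d = cohens_d(mean1=1.18, sd1=0.40, n1=30, mean2=1.70, sd2=0.60, n2=30)
# A negative d here means the first group deviated less, i.e. was more accurate.
```

This sign convention matches the abstract, where negative SMDs favor the AR navigation group.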
Affiliation(s)
- Van Viet Dam
- Department of Implantology, Hanoi National Hospital of Odonto-stomatology, Hanoi, Vietnam; VNU School of Medicine and Pharmacy, Vietnam National University, Hanoi, Vietnam
- Du-Hyeong Lee
- Institute for Translational Research in Dentistry, Kyungpook National University, Daegu, Republic of Korea; Department of Prosthodontics, School of Dentistry, Kyungpook National University, Daegu, Republic of Korea
9
Usevitch DE, Bronheim RS, Reyes MC, Babilonia C, Margalit A, Jain A, Armand M. Review of Enhanced Handheld Surgical Drills. Crit Rev Biomed Eng 2023;51:29-50. [PMID: 37824333 PMCID: PMC10874117 DOI: 10.1615/critrevbiomedeng.2023049106]
Abstract
The handheld drill has been used as a conventional surgical tool for centuries. Alongside the recent successes of surgical robots, the development of new and enhanced medical drills has improved surgeon ability without requiring the high cost and time-consuming setup that plague medical robot systems. This work provides an overview of enhanced handheld surgical drill research, focusing on systems that include some form of image guidance and do not require additional hardware that physically supports or guides drilling. Drill systems are reviewed by main contribution, divided into audio-, visual-, and hardware-enhanced drills. A vision for future work to enhance handheld drilling systems is also discussed.
Affiliation(s)
- David E. Usevitch
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD, United States
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Rachel S. Bronheim
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Miguel C. Reyes
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Carlos Babilonia
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Adam Margalit
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Amit Jain
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
- Mehran Armand
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD, United States
- Department of Orthopedic Surgery, Johns Hopkins University, Baltimore, MD, United States
10
Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022;8(7):203. [PMID: 35877647 PMCID: PMC9318659 DOI: 10.3390/jimaging8070203]
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy that described the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial (n=8) surgeries. For preoperative input data, computed tomography (CT) (n=34) and surface-rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly superimposed directly on the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy on the order of 2–5 mm has been demonstrated in phantom models, several human factors and technical challenges (perception, ease of use, context, interaction, and occlusion) remain to be addressed prior to widespread adoption of OST-HMD-led surgical navigation.
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
- Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
11
Kalaiarasan K, Prathap L, Ayyadurai M, Subhashini P, Tamilselvi T, Avudaiappan T, Infant Raj I, Alemayehu Mamo S, Mezni A. Clinical Application of Augmented Reality in Computerized Skull Base Surgery. Evid Based Complement Alternat Med 2022;2022:1335820. [PMID: 35600956 PMCID: PMC9117015 DOI: 10.1155/2022/1335820]
Abstract
Skull base surgery involves the manipulation of small and complex structures in the domains of otology, rhinology, neurosurgery, and maxillofacial surgery. Critical nerves and vessels lie in close proximity to these structures. Augmented reality is an emerging technology that may transform the skull base approach by supplying vital anatomical and navigational information brought together in a single display. However, awareness and acceptance of the potential of augmented reality systems in the skull base region remain low. This article examines the usefulness of augmented reality systems in skull base surgery and highlights the obstacles that current technology faces and their prospective solutions. A technical perspective on the distinct strategies used in the development of an augmented reality system is also offered. Recent work shows growing interest in augmented reality systems, which may motivate safer and more practical procedures. However, several concerns must be addressed before such systems can be broadly incorporated into routine practice.
Collapse
Affiliation(s)
- K. Kalaiarasan
- Department of Information Technology, M. Kumarasamy College of Engineering, Karur, India
| | - Lavanya Prathap
- Department of Anatomy, Saveetha Dental College and Hospital, Saveetha Institute of Medical and Technical Sciences, Chennai, Tamil Nadu 600077, India
| | - M. Ayyadurai
- SG, Institute of ECE, Saveetha School of Engineering, SIMATS, Chennai, Tamil Nadu 600077, India
| | - P. Subhashini
- Department of Computer Science and Engineering, J.N.N Institute of Engineering, Kannigaipair, Tamil Nadu 601102, India
| | - T. Tamilselvi
- Department of Computer Science and Engineering, Panimalar Institute of Technology, Varadarajapuram, Tamil Nadu 600123, India
| | - T. Avudaiappan
- Computer Science and Engineering, K. Ramakrishnan College of Technology, Trichy 621112, India
| | - I. Infant Raj
- Department of Computer Science and Engineering, K. Ramakrishnan College of Engineering, Trichy, India
| | - Samson Alemayehu Mamo
- Department of Electrical and Computer Engineering, Faculty of Electrical and Biomedical Engineering, Institute of Technology, Hawassa University, Awasa, Ethiopia
| | - Amine Mezni
- Department of Chemistry, College of Science, Taif University, P.O. Box 11099, Taif 21944, Saudi Arabia
| |
Collapse
|
12
|
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103 PMCID: PMC10466024 DOI: 10.1016/j.media.2022.102361] [Citation(s) in RCA: 19] [Impact Index Per Article: 9.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2021] [Revised: 11/17/2021] [Accepted: 01/10/2022] [Indexed: 12/11/2022]
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Collapse
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
| | - P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| |
Collapse
|
13
|
Accuracy of dental implant placement using augmented reality-based navigation, static computer assisted implant surgery, and the free-hand method: An in vitro study. J Dent 2022; 119:104070. [DOI: 10.1016/j.jdent.2022.104070] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2021] [Revised: 02/03/2022] [Accepted: 02/16/2022] [Indexed: 12/17/2022] Open
|
14
|
Augmented Reality Based Transmodiolar Cochlear Implantation. Otol Neurotol 2021; 43:190-198. [PMID: 34855687 DOI: 10.1097/mao.0000000000003437] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
HYPOTHESIS Transmodiolar auditory implantation via the middle ear cavity could be possible using an augmented reality system (ARS). BACKGROUND There is no clear landmark to indicate the cochlear apex or the modiolar axis. The ARS seems to be a promising tool for transmodiolar implantation, combining information from preprocedure computed tomography (CT) images with the real-time video of the surgical field. METHODS Eight human temporal bone resin models were included (five adults and three children). The procedure started with the identification of the modiolar axis on the preprocedure CT images, followed by a 3D reconstruction. Information on modiolar location and navigational guidance was added to the reconstructed model, which was then registered with the surgical video using a point-based approach. Relative movements between the phantom and the microscope were tracked using image feature-based motion tracking. Based on the information provided via the ARS, the surgeon implanted the electrode array inside the modiolus after drilling the helicotrema. Postprocedure CT images were acquired to evaluate the registration error and the implantation accuracy. RESULTS The implantation could be conducted in all cases with a 2D registration error of 0.4 ± 0.24 mm. The mean entry point error was 0.6 ± 1.00 mm and the implant angular error 13.5 ± 8.93 degrees (n = 8), compatible with the procedure requirements. CONCLUSION We developed an image-based ARS to identify the extremities and the axis of the cochlear modiolus on intraprocedure videos. The system yielded submillimetric accuracy for implantation and remained stable throughout the experimental study.
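The accuracy figures quoted above (2D registration error, entry-point error) are, in essence, mean ± SD of Euclidean distances between planned and achieved landmark positions. A minimal stdlib Python sketch of such a metric follows; this is illustrative only, not the authors' code, and the fiducial coordinates are hypothetical:

```python
import math
from statistics import mean, stdev

def euclidean(p, q):
    """Euclidean distance between two points of equal dimension."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def registration_error(planned, achieved):
    """Mean and SD of pairwise point distances, e.g. a fiducial
    registration error or an entry-point error in mm."""
    dists = [euclidean(p, q) for p, q in zip(planned, achieved)]
    return mean(dists), stdev(dists)

# hypothetical fiducial pairs (mm): planned vs. achieved positions
planned = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
achieved = [(0.3, 0.1), (10.2, -0.2), (-0.1, 10.4)]
err_mean, err_sd = registration_error(planned, achieved)
```

The same computation applies unchanged to 3D points; only the tuple length differs.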
Collapse
|
15
|
Zhang F, Zhang S, Sun L, Zhan W, Sun L. Research on registration and navigation technology of augmented reality for ex-vivo hepatectomy. Int J Comput Assist Radiol Surg 2021; 17:147-155. [PMID: 34800225 DOI: 10.1007/s11548-021-02531-w] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/02/2021] [Accepted: 10/27/2021] [Indexed: 11/29/2022]
Abstract
PURPOSE The application of augmented reality technology to partial hepatectomy has high practical significance, but existing augmented reality navigation systems have major drawbacks in their display and registration methods, resulting in low precision. The augmented reality surgical navigation system proposed in this study improves on both aspects and can significantly improve surgical accuracy. METHODS Using an optical see-through head-mounted display for image display spares doctors from having to mentally reconstruct the patient's two-dimensional image information, reducing their cognitive burden. In the registration process, the biomechanical properties of the liver are introduced, and a non-rigid registration method based on biomechanics is proposed and realized with a meshless algorithm. In addition, this study uses a moving grid algorithm and validates the approach experimentally on ex-vivo pig liver. RESULTS The mark-based interactive registration error is 4.21 ± 1.6 mm; after taking the biomechanical properties of the liver into account, the registration error is reduced to -0.153 ± 0.398 mm. The cutting error of the liver model is 0.159 ± 0.292 mm. In addition, with the aid of the proposed navigation system, the ex-vivo pig liver cutting experiment was completed with an error of -1.164 ± 0.576 mm. CONCLUSIONS As a proof-of-concept study, the augmented reality navigation system proposed here improves on traditional image-guided surgery in its display and registration methods, and its feasibility is verified by ex-vivo pig liver experiments. The navigation system therefore offers meaningful guidance for clinical surgery.
Collapse
Affiliation(s)
- Fengfeng Zhang
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou 215006, China; Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou 215123, China
| | - Shi Zhang
- College of Mechanical and Engineering, Harbin Engineering University, Harbin, 150001, China
| | - Long Sun
- College of Mechanical and Engineering, Harbin Engineering University, Harbin, 150001, China
| | - Wei Zhan
- The First Affiliated Hospital of Soochow University, Suzhou, 215006, China
| | - Lining Sun
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou 215006, China; Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou 215123, China
| |
Collapse
|
16
|
Donovan SK, Herstein JJ, Prober CG, Kolars JC, Gordon JA, Boyers P, Gold J, Davies HD. Expansion of simulation and extended reality for undergraduate health professions education: A call to action. J Interprof Educ Pract 2021; 24:100436. [PMID: 36567809 PMCID: PMC9765302 DOI: 10.1016/j.xjep.2021.100436] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/26/2021] [Accepted: 04/27/2021] [Indexed: 12/27/2022]
Abstract
In the spring of 2020, the COVID-19 pandemic limited access for many health professions students to clinical settings amid concerns about availability of appropriate personal protective equipment as well as the desire to limit exposure in these high-risk settings. Furthermore, the pandemic led to a need to cancel clinics and inpatient rotations, with a major impact on training for health professions and interprofessional health delivery, the long-term effects of which are currently unknown. While problematic, this also presents an opportunity to reflect on challenges facing the traditional clinical training paradigm in a rapidly changing and complex health care system and develop sustainable, high-quality competency-based educational models that incorporate rapidly progressing technologies. We call for pilot studies to explore specific simulation-based inpatient and outpatient clinical rotations for professional and interprofessional training.
Collapse
|
17
|
Penczek J, Boynton PA, Beams R, Sriram RD. Measurement Challenges for Medical Image Display Devices. J Digit Imaging 2021; 34:458-472. [PMID: 33846889 DOI: 10.1007/s10278-021-00438-1] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/08/2020] [Revised: 01/27/2021] [Accepted: 02/24/2021] [Indexed: 12/25/2022] Open
Abstract
Visual information is a critical component in the evaluation and communication of patient medical information. As display technologies have evolved, the medical community has sought to take advantage of advances in wider color gamuts, greater display portability, and more immersive imagery. These image quality enhancements have shown improvements in the quality of healthcare through greater efficiency, higher diagnostic accuracy, added functionality, enhanced training, and better health records. However, the display technology advances typically introduce greater complexity in the image workflow and display evaluation. This paper highlights some of the optical measurement challenges created by these new display technologies and offers possible pathways to address them.
Collapse
Affiliation(s)
- J Penczek
- National Institute of Standards and Technology, Boulder, CO 80305, USA; University of Colorado, Boulder, CO, USA
| | - P A Boynton
- National Institute of Standards and Technology, Gaithersburg, MD, 20899, USA
| | - R Beams
- Food and Drug Administration, Silver Spring, MD, 20993, USA
| | - R D Sriram
- National Institute of Standards and Technology, Gaithersburg, MD, 20899, USA
| |
Collapse
|
18
|
Bari H, Wadhwani S, Dasari BVM. Role of artificial intelligence in hepatobiliary and pancreatic surgery. World J Gastrointest Surg 2021; 13:7-18. [PMID: 33552391 PMCID: PMC7830072 DOI: 10.4240/wjgs.v13.i1.7] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/28/2020] [Revised: 12/08/2020] [Accepted: 12/17/2020] [Indexed: 02/06/2023] Open
Abstract
Over the past decade, enhanced preoperative imaging and visualization, improved delineation of the complex anatomical structures of the liver and pancreas, and intra-operative technological advances have helped deliver liver and pancreatic surgery with increased safety and better postoperative outcomes. Artificial intelligence (AI) has a major role to play in 3D visualization, virtual simulation, and augmented reality, which support the training of surgeons and the future delivery of conventional, laparoscopic, and robotic hepatobiliary and pancreatic (HPB) surgery; artificial neural networks and machine learning have the potential to revolutionize individualized patient care during preoperative imaging and postoperative surveillance. In this paper, we reviewed the existing evidence and outlined the potential for applying AI in the perioperative care of patients undergoing HPB surgery.
Collapse
Affiliation(s)
- Hassaan Bari
- Department of HPB and Liver Transplantation Surgery, Queen Elizabeth Hospital, Birmingham B15 2TH, United Kingdom
| | - Sharan Wadhwani
- Department of Radiology, Queen Elizabeth Hospital, Birmingham B15 2TH, United Kingdom
| | - Bobby V M Dasari
- Department of HPB and Liver Transplantation Surgery, Queen Elizabeth Hospital, Birmingham B15 2TH, United Kingdom
| |
Collapse
|
19
|
Mondal SB, Achilefu S. Virtual and Augmented Reality Technologies in Molecular and Anatomical Imaging. Mol Imaging 2021. [DOI: 10.1016/b978-0-12-816386-3.00066-1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
|
20
|
Lungu AJ, Swinkels W, Claesen L, Tu P, Egger J, Chen X. A review on the applications of virtual reality, augmented reality and mixed reality in surgical simulation: an extension to different kinds of surgery. Expert Rev Med Devices 2020; 18:47-62. [PMID: 33283563 DOI: 10.1080/17434440.2021.1860750] [Citation(s) in RCA: 51] [Impact Index Per Article: 12.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Background: Research suggests that the apprenticeship model, which is the gold standard for training surgical residents, is becoming obsolete. For that reason, there is a continuing effort toward the development of high-fidelity surgical simulators to replace the apprenticeship model. Applying Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) in surgical simulators increases the fidelity, level of immersion and overall experience of these simulators. Areas covered: The objective of this review is to provide a comprehensive overview of the application of VR, AR and MR for distinct surgical disciplines, including maxillofacial surgery and neurosurgery. The current developments in these areas, as well as potential future directions, are discussed. Expert opinion: The key components for incorporating VR into surgical simulators are visual and haptic rendering. These components ensure that the user is completely immersed in the virtual environment and can interact in the same way as in the physical world. The key components for the application of AR and MR into surgical simulators include the tracking system as well as the visual rendering. The advantages of these surgical simulators are the ability to perform user evaluations and increase the training frequency of surgical residents.
Collapse
Affiliation(s)
- Abel J Lungu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
| | - Wout Swinkels
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
| | - Luc Claesen
- Computational Sensing Systems, Department of Engineering Technology, Hasselt University, Diepenbeek, Belgium
| | - Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
| | - Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria; Department of Oral & Maxillofacial Surgery, Medical University of Graz, Graz, Austria; The Laboratory of Computer Algorithms for Medicine, Medical University of Graz, Graz, Austria
| | - Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
| |
Collapse
|
21
|
Virtual Reality Simulation and Augmented Reality-Guided Surgery for Total Maxillectomy: A Case Report. Appl Sci (Basel) 2020. [DOI: 10.3390/app10186288] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/25/2022]
Abstract
With the improvement in computer graphics and sensors, technologies like virtual reality (VR) and augmented reality (AR) have created new possibilities for developing diagnostic and surgical techniques in the field of surgery. VR and AR are the latest technological modalities that have been integrated into clinical practice and medical education, and are rapidly emerging as powerful tools in the field of maxillofacial surgery. In this report, we describe a case of total maxillectomy and orbital floor reconstruction in a patient with malignant fibrous histiocytoma of the maxilla, with preoperative planning via VR simulation and AR-guided surgery. Future developments in VR and AR technologies will increase their utility and effectiveness in the field of surgery.
Collapse
|
22
|
Video-based augmented reality combining CT-scan and instrument position data to microscope view in middle ear surgery. Sci Rep 2020; 10:6767. [PMID: 32317726 PMCID: PMC7174368 DOI: 10.1038/s41598-020-63839-2] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2019] [Accepted: 03/26/2020] [Indexed: 11/27/2022] Open
Abstract
The aim of the study was to develop and assess the performance of a video-based augmented reality system, combining preoperative computed tomography (CT) and real-time microscopic video, as the first crucial step to keyhole middle ear procedures through a tympanic membrane puncture. Six different artificial human temporal bones were included in this prospective study. Six stainless steel fiducial markers were glued on the periphery of the eardrum, and a high-resolution CT-scan of the temporal bone was obtained. Virtual endoscopy of the middle ear based on this CT-scan was conducted on Osirix software. Virtual endoscopy image was registered to the microscope-based video of the intact tympanic membrane based on fiducial markers and a homography transformation was applied during microscope movements. These movements were tracked using Speeded-Up Robust Features (SURF) method. Simultaneously, a micro-surgical instrument was identified and tracked using a Kalman filter. The 3D position of the instrument was extracted by solving a three-point perspective framework. For evaluation, the instrument was introduced through the tympanic membrane and ink droplets were injected on three middle ear structures. An average initial registration accuracy of 0.21 ± 0.10 mm (n = 3) was achieved with a slow propagation error during tracking (0.04 ± 0.07 mm). The estimated surgical instrument tip position error was 0.33 ± 0.22 mm. The target structures’ localization accuracy was 0.52 ± 0.15 mm. The submillimetric accuracy of our system without tracker is compatible with ear surgery.
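The homography step described above maps each fiducial from the CT-derived virtual-endoscopy image into the moving microscope view. A minimal sketch of applying a 3×3 homography to a 2D point is shown below; this is plain illustrative Python, and the matrix (a simple scale-plus-translation expressed as a homography) is invented, not data from the study:

```python
def apply_homography(H, pt):
    """Map a 2D point through a 3x3 homography given as row-major
    nested lists, using homogeneous coordinates."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]          # homogeneous scale
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w    # projected x
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w    # projected y
    return u, v

# hypothetical transform: 2x scale plus translation by (5, -3)
H = [[2.0, 0.0, 5.0],
     [0.0, 2.0, -3.0],
     [0.0, 0.0, 1.0]]
u, v = apply_homography(H, (1.0, 1.0))
```

In practice the matrix itself would be estimated from at least four fiducial correspondences (e.g. with a least-squares solver) and re-estimated as the tracked features move.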
Collapse
|
23
|
Yao J, Zeng W, Zhou S, Cheng J, Huang C, Tang W. Augmented Reality Technology Could Be an Alternative Method to Treat Craniomaxillofacial Foreign Bodies: A Comparative Study Between Augmented Reality Technology and Navigation Technology. J Oral Maxillofac Surg 2020; 78:578-587. [DOI: 10.1016/j.joms.2019.11.019] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2019] [Revised: 11/06/2019] [Accepted: 11/10/2019] [Indexed: 12/11/2022]
|
24
|
Chang F, Laguna B, Uribe J, Vu L, Zapala MA, Devincent C, Courtier J. Evaluating the Performance of Augmented Reality in Displaying Magnetic Resonance Imaging-Derived Three-Dimensional Holographic Models. J Med Imaging Radiat Sci 2019; 51:95-102. [PMID: 31862176 DOI: 10.1016/j.jmir.2019.10.006] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2019] [Revised: 08/29/2019] [Accepted: 10/23/2019] [Indexed: 10/25/2022]
Abstract
INTRODUCTION/BACKGROUND Establishing accuracy and precision of magnetic resonance (MR)-derived augmented reality (AR) models is critical before clinical utilization, particularly in preoperative planning. We investigate the performance of an AR application in representing and displaying MR-derived three-dimensional holographic models. METHODS Thirty gold standard (GS) measurements were obtained on a magnetic resonance imaging (MRI) phantom (six interfiducial distances and five configurations). Four MRI pulse sequences were obtained for each of the five configurations, and distances measured in Picture Archiving and Communication System (PACS). Digital imaging and communications in medicine files were translated into three-dimensional models and then loaded onto a novel AR platform. Measurements were also obtained with the software's AR caliper tool. Significant differences among the three groups (GS, PACS, and AR) were assessed with the Kruskal-Wallis test and nonsample median test. Accuracy analysis of GS vs. AR was performed. Precision (percent deviation) of the AR-based caliper tool was also assessed. RESULTS No statistically significant difference existed between AR and GS measurements (P = .6208). PACS demonstrated mean squared error (MSE) of 0.29%. AR digital caliper demonstrated an MSE of 0.3%. Three-dimensional T2 CUBE AR measurements using the platform's AR caliper tool demonstrated an MSE of 8.6%. Percent deviation of AR software caliper tool ranged between 1.9% and 3.9%. DISCUSSION AR demonstrated a high degree of accuracy in comparison to GS, comparable to PACS-based measurements. AR caliper tool demonstrated overall lower accuracy than with physical calipers, although with MSE <10% and greatest measured difference from GS measuring <5 mm. AR-based caliper demonstrated a high degree of precision. CONCLUSION There was no statistically significant difference between GS measurements and three-dimensional AR measurements in MRI phantom models.
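The abstract reports mean squared error as a percentage and "percent deviation" for the AR caliper tool but does not give its formulas; the definitions below are common ones and should be read as assumptions, not as the authors' exact computation:

```python
from statistics import mean

def mse_percent(measured, gold):
    """Mean squared relative error versus gold-standard values,
    expressed as a percentage (assumed definition)."""
    return 100.0 * mean(((m - g) / g) ** 2 for m, g in zip(measured, gold))

def percent_deviation(repeats):
    """Largest deviation of repeated caliper readings from their mean,
    relative to the mean, in percent (assumed definition)."""
    mu = mean(repeats)
    return 100.0 * max(abs(r - mu) for r in repeats) / mu
```

With these definitions, two readings of 9.0 and 11.0 mm around a 10.0 mm mean give a percent deviation of 10%.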
Collapse
Affiliation(s)
- Frank Chang
- UCSF Department of Radiology and Biomedical Imaging, Masters of Science in Biomedical Imaging Program, San Francisco, California, USA
| | - Ben Laguna
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA
| | - Jesus Uribe
- UCSF School of Medicine, San Francisco, California, USA
| | - Lan Vu
- Division of Pediatric Surgery, UCSF Department of Surgery, San Francisco, California, USA
| | - Matthew A Zapala
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA
| | - Craig Devincent
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA
| | - Jesse Courtier
- UCSF Department of Radiology and Biomedical Imaging, San Francisco, California, USA.
| |
Collapse
|
25
|
Zhou Y, Yoo P, Feng Y, Sankar A, Sadr A, Seibel EJ. Towards AR-assisted visualisation and guidance for imaging of dental decay. Healthc Technol Lett 2019; 6:243-248. [PMID: 32038865 PMCID: PMC6952244 DOI: 10.1049/htl.2019.0082] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/19/2019] [Accepted: 10/02/2019] [Indexed: 12/27/2022] Open
Abstract
Untreated dental decay is the most prevalent dental problem in the world, affecting up to 2.4 billion people and leading to a significant economic and social burden. Early detection can greatly mitigate irreversible effects of dental decay, avoiding the need for expensive restorative treatment that forever disrupts the enamel protective layer of teeth. However, two key challenges exist that make early decay management difficult: unreliable detection and lack of quantitative monitoring during treatment. New optically based imaging through the enamel provides the dentist a safe means to detect, locate, and monitor the healing process. This work explores the use of an augmented reality (AR) headset to improve the workflow of early decay therapy and monitoring. The proposed workflow includes two novel AR-enabled features: (i) in situ visualisation of pre-operative optically based dental images and (ii) augmented guidance for repetitive imaging during therapy monitoring. The workflow is designed to minimise distraction, mitigate hand-eye coordination problems, and help guide monitoring of early decay during therapy in both clinical and mobile environments. The results from quantitative evaluations as well as a formative qualitative user study uncover the potentials of the proposed system and indicate that AR can serve as a promising tool in tooth decay management.
Collapse
Affiliation(s)
- Yaxuan Zhou
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA 98195, USA
- Human Photonics Lab, Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
| | - Paul Yoo
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
| | - Yingru Feng
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
| | - Aditya Sankar
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
| | - Alireza Sadr
- School of Dentistry, University of Washington, Seattle, WA 98195, USA
| | - Eric J. Seibel
- Human Photonics Lab, Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
| |
Collapse
|
26
|
Rahman R, Wood ME, Qian L, Price CL, Johnson AA, Osgood GM. Head-Mounted Display Use in Surgery: A Systematic Review. Surg Innov 2019; 27:88-100. [DOI: 10.1177/1553350619871787] [Citation(s) in RCA: 48] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Purpose. We analyzed the literature to determine (1) the surgically relevant applications for which head-mounted display (HMD) use is reported; (2) the types of HMD most commonly reported; and (3) the surgical specialties in which HMD use is reported. Methods. The PubMed, Embase, Cochrane Library, and Web of Science databases were searched through August 27, 2017, for publications describing HMD use during surgically relevant applications. We identified 120 relevant English-language, non-opinion publications for inclusion. HMD types were categorized as “heads-up” (nontransparent HMD display and direct visualization of the real environment), “see-through” (visualization of the HMD display overlaid on the real environment), or “non–see-through” (visualization of only the nontransparent HMD display). Results. HMDs were used for image guidance and augmented reality (70 publications), data display (63 publications), communication (34 publications), and education/training (18 publications). See-through HMDs were described in 55 publications, heads-up HMDs in 41 publications, and non–see-through HMDs in 27 publications. Google Glass, a see-through HMD, was the most frequently used model, reported in 32 publications. The specialties with the highest frequency of published HMD use were urology (20 publications), neurosurgery (17 publications), and unspecified surgical specialty (20 publications). Conclusion. Image guidance and augmented reality were the most commonly reported applications for which HMDs were used. See-through HMDs were the most commonly reported type used in surgically relevant applications. Urology and neurosurgery were the specialties with greatest published HMD use.
Collapse
Affiliation(s)
- Rafa Rahman
- The Johns Hopkins University, Baltimore, MD, USA
| | | | - Long Qian
- The Johns Hopkins University, Baltimore, MD, USA
| | | | | | | |
Collapse
|
27
|
Beams R, Kim AS, Badano A. Transverse chromatic aberration in virtual reality head-mounted displays. Opt Express 2019; 27:24877-24884. [PMID: 31510369 DOI: 10.1364/oe.27.024877] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/01/2019] [Accepted: 07/08/2019] [Indexed: 06/10/2023]
Abstract
We demonstrate a method for measuring the transverse chromatic aberration (TCA) in a virtual reality head-mounted display. The method relies on acquiring images of a digital bar pattern and measuring the displacement of different color bars. This procedure was used to characterize the TCAs in the Oculus Go, Oculus Rift, Samsung Gear, and HTC Vive. The results show noticeable TCAs for the Oculus devices for angles larger than 5° from the center of the field of view. TCA is less noticeable in the Vive in part due to off-axis monochromatic aberrations. Finally, user measurements were conducted, which were in excellent agreement with the laboratory results.
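The bar-displacement method above reduces to locating each color channel's bar at sub-pixel precision and converting the red-blue offset into visual angle. A stdlib sketch under assumed inputs (1-D intensity profiles and a pixels-per-degree figure, both hypothetical; the paper's actual pipeline is not reproduced here):

```python
def centroid(profile):
    """Intensity-weighted centroid of a 1-D bar profile,
    giving a sub-pixel bar position."""
    total = sum(profile)
    return sum(i * v for i, v in enumerate(profile)) / total

def tca_arcmin(red_profile, blue_profile, pixels_per_degree):
    """Transverse chromatic aberration as the red-blue bar
    displacement, converted from pixels to arcminutes."""
    disp = abs(centroid(red_profile) - centroid(blue_profile))
    return 60.0 * disp / pixels_per_degree
```

A one-pixel red-blue offset on a display resolving 60 pixels per degree would correspond to about 1 arcminute of TCA.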
Collapse
|
28
|
Pellegrino G, Mangano C, Mangano R, Ferri A, Taraschi V, Marchetti C. Augmented reality for dental implantology: a pilot clinical report of two cases. BMC Oral Health 2019; 19:158. [PMID: 31324246 PMCID: PMC6642526 DOI: 10.1186/s12903-019-0853-y] [Citation(s) in RCA: 49] [Impact Index Per Article: 9.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/18/2019] [Accepted: 07/11/2019] [Indexed: 12/28/2022] Open
Abstract
BACKGROUND Despite the limited number of articles dedicated to its use, augmented reality (AR) is an emerging technology with a growing range of applications across medical sectors, including, but not limited to, the maxillofacial and dentistry disciplines. In these specialties, AR technology aims to give the surgeon a better view of the surgical field during an operation; currently, this is achieved by accurately displaying static or dynamic diagnostic images through a visor or dedicated glasses. The objective of this study was to evaluate the feasibility of using a virtual display for dynamic navigation via AR. The secondary outcome was to evaluate whether the use of this technology affects the accuracy of dynamic navigation. CASE PRESENTATION Two patients, both needing implant rehabilitation in the upper premolar area, were treated with flapless surgery. Before the procedure, the implant position was virtually planned for each patient using their previous scans. This plan fed a dynamic navigation system displayed on AR glasses, enabling a computer-aided/image-guided procedure. Dedicated surface-superimposition software was then used to match the planned implant position with the real one obtained from the postoperative scan. Accuracy was evaluated by measuring the deviation between the real and planned implant positions. Both surgeries could proceed using the AR technology as planned. The deviations for the first implant were 0.53 mm at the entry point and 0.50 mm at the apical point; for the second implant, 0.46 mm at the entry point and 0.48 mm at the apical point. The angular deviations were 3.05° and 2.19°, respectively.
CONCLUSIONS From the results of this pilot study, it seems that AR can be useful in dental implantology for displaying dynamic navigation systems. While this technology did not seem to noticeably affect the accuracy of the procedure, specific software applications should further optimize the results.
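The accuracy metrics this case report uses (entry-point deviation, apical deviation, and angular deviation between the planned and actual implant axes) are standard geometry and can be sketched in a few lines. This is a minimal illustration of the metrics, not the authors' navigation software; the function name and interface are assumptions.

```python
import numpy as np

def implant_deviation(planned_entry, planned_apex, actual_entry, actual_apex):
    """Deviation between a planned and an actual implant axis.

    All points are 3-D coordinates in millimetres. Returns
    (entry_deviation_mm, apex_deviation_mm, angular_deviation_deg).
    """
    p_entry = np.asarray(planned_entry, dtype=float)
    p_apex = np.asarray(planned_apex, dtype=float)
    a_entry = np.asarray(actual_entry, dtype=float)
    a_apex = np.asarray(actual_apex, dtype=float)

    entry_dev = np.linalg.norm(a_entry - p_entry)  # Euclidean distance at entry point
    apex_dev = np.linalg.norm(a_apex - p_apex)     # Euclidean distance at apex

    # Angle between the two implant axes (unit entry->apex direction vectors).
    v_p = (p_apex - p_entry) / np.linalg.norm(p_apex - p_entry)
    v_a = (a_apex - a_entry) / np.linalg.norm(a_apex - a_entry)
    angle = np.degrees(np.arccos(np.clip(np.dot(v_p, v_a), -1.0, 1.0)))
    return entry_dev, apex_dev, angle
```

For example, an implant placed parallel to plan but shifted 0.5 mm laterally yields 0.5 mm deviation at both entry and apex with 0° angular deviation.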
Affiliation(s)
- Gerardo Pellegrino
- Oral and Maxillofacial Surgery Unit, DIBINEM, University of Bologna, 125, Via San Vitale 59, 40125, Bologna, Italy.
- Carlo Mangano
- Digital Dentistry Section, University San Raffaele, Milan, Italy
- Agnese Ferri
- Oral and Maxillofacial Surgery Unit, DIBINEM, University of Bologna, 125, Via San Vitale 59, 40125, Bologna, Italy
- Valerio Taraschi
- University of Technology - Sydney, School of Life Sciences, Sydney, Australia
- Claudio Marchetti
- Chief of Oral and Maxillofacial Surgery Unit, DIBINEM, University of Bologna, Bologna, Italy
|
29
|
Current state of the art in the use of augmented reality in dentistry: a systematic review of the literature. BMC Oral Health 2019; 19:135. [PMID: 31286904 PMCID: PMC6613250 DOI: 10.1186/s12903-019-0808-3] [Citation(s) in RCA: 40] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2019] [Accepted: 05/31/2019] [Indexed: 12/29/2022] Open
Abstract
Background The aim of the present systematic review was to screen the literature and describe current applications of augmented reality. Materials and methods The protocol design was structured according to PRISMA-P guidelines and registered in PROSPERO. The following databases were searched: Medline, Ovid, Embase, Cochrane Library, Google Scholar, and the gray literature. Data were extracted, summarized, and collected for qualitative analysis and evaluated for individual risk of bias (R.O.B.) by two independent examiners. Collected data included: year of publication, journal with reviewing system and impact factor, study design, sample size, target of the study, hardware and software used or custom developed, primary outcomes, field of interest, and quantification of displacement error and timing measurements, when available. Qualitative evidence synthesis followed the SPIDER framework. Results From a primary search of 17,652 articles, 33 were included in the review for qualitative synthesis. Sixteen of the selected articles were eligible for quantitative synthesis of heterogeneous data; 12 out of 13 judged the precision at least acceptable, while 3 out of 6 described an increase in operation time of about 1 h. Of the selected studies, 60% (n = 20) refer to a camera-display augmented reality system and 21% (n = 7) to a head-mounted system. The software was self-developed by 7 authors, while the majority used commercially available software. The proposed applications of augmented reality were: oral and maxillofacial surgery (OMS) in 21 studies, restorative dentistry in 5 studies, educational purposes in 4 studies, and orthodontics in 1 study. The majority of the studies were carried out on phantoms (51%), while 11 (33%) were conducted on patients.
Conclusions On the basis of the literature, current development is still insufficient for a full validation process; however, independently developed customized software for augmented reality seems promising for supporting routine procedures, complicated or specific interventions, and education and learning. The oral and maxillofacial area is predominant and the results in precision are promising, while timing is still very controversial: some authors describe up to 60 min of additional preparation time when using augmented reality, while others describe a reduced operating time of 50-100%. Trial registration The following systematic review was registered in PROSPERO with RN: CRD42019120058.
|
30
|
Talaat S, Ghoneima A, Kaboudan A, Talaat W, Ragy N, Bourauel C. Three‐dimensional evaluation of the holographic projection in digital dental model superimposition using HoloLens device. Orthod Craniofac Res 2019; 22 Suppl 1:62-68. [DOI: 10.1111/ocr.12286] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2018] [Accepted: 12/14/2018] [Indexed: 12/01/2022]
Affiliation(s)
- Sameh Talaat
- Department of Orthodontics, College of Dentistry, Future University in Egypt, Cairo, Egypt
- Department of Oral Technology, School of Dentistry, University of Bonn, Bonn, Germany
- Ahmed Ghoneima
- Department of Orthodontics and Oral Facial Genetics, Indiana University School of Dentistry, Indianapolis, Indiana
- Department of Orthodontics, Faculty of Dental Medicine, Al-Azhar University, Cairo, Egypt
- Department of Orthodontics, Hamdan Bin Mohammed College of Dental Medicine, Mohammed Bin Rashid University of Medicine and Health Sciences, Dubai, United Arab Emirates
- Ahmed Kaboudan
- Department of Computer Science, ElShorouk Academy, New Cairo, Egypt
- Department of Research and Development, DigiBrain4, Chicago, Illinois
- Wael Talaat
- Department of Oral and Craniofacial Health Sciences, College of Dental Medicine, University of Sharjah, Sharjah, United Arab Emirates
- Department of Oral and Maxillofacial Surgery, Faculty of Dentistry, Suez Canal University, Ismailia, Egypt
- Nivin Ragy
- Department of Oral Medicine and Radiology, College of Dentistry, Future University in Egypt, Cairo, Egypt
- Christoph Bourauel
- Department of Oral Technology, School of Dentistry, University of Bonn, Bonn, Germany
|
31
|
Hussain R, Lalande A, Guigou C, Bozorg Grayeli A. Contribution of Augmented Reality to Minimally Invasive Computer-Assisted Cranial Base Surgery. IEEE J Biomed Health Inform 2019; 24:2093-2106. [DOI: 10.1109/jbhi.2019.2954003] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
|
32
|
Bosc R, Fitoussi A, Hersant B, Dao TH, Meningaud JP. Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies. Int J Oral Maxillofac Surg 2019; 48:132-139. [DOI: 10.1016/j.ijom.2018.09.010] [Citation(s) in RCA: 35] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/18/2018] [Revised: 09/16/2018] [Accepted: 09/24/2018] [Indexed: 12/30/2022]
|
33
|
Kwon HB, Park YS, Han JS. Augmented reality in dentistry: a current perspective. Acta Odontol Scand 2018; 76:497-503.
Abstract
Augmented reality technology offers virtual information in addition to that of the real environment and thus opens new possibilities in various fields. Medical applications of augmented reality are generally concentrated in surgical disciplines, including neurosurgery, laparoscopic surgery, and plastic surgery. Augmented reality technology is also widely used in medical education and training. In dentistry, oral and maxillofacial surgery is the primary area of use, where dental implant placement and orthognathic surgery are the most frequent applications. Recent technological advancements are enabling new applications in restorative dentistry, orthodontics, and endodontics. This review briefly summarizes the history, definitions, features, and components of augmented reality technology and discusses its applications and future perspectives in dentistry.
Affiliation(s)
- Ho-Beom Kwon
- Department of Prosthodontics, School of Dentistry, Seoul National University and Dental Research Institute, Seoul, Korea
- Young-Seok Park
- Department of Oral Medicine and Oral Diagnosis, School of Dentistry, Seoul National University and Dental Research Institute, Seoul, Korea
- Jung-Suk Han
- Department of Prosthodontics, School of Dentistry, Seoul National University and Dental Research Institute, Seoul, Korea
|
34
|
Meulstee JW, Nijsink J, Schreurs R, Verhamme LM, Xi T, Delye HHK, Borstlap WA, Maal TJJ. Toward Holographic-Guided Surgery. Surg Innov 2018; 26:86-94. [DOI: 10.1177/1553350618799552] [Citation(s) in RCA: 56] [Impact Index Per Article: 9.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The implementation of augmented reality (AR) in image-guided surgery (IGS) can improve surgical interventions by presenting the image data directly on the patient at the correct position and in the actual orientation. This approach can resolve the switching focus problem, which occurs in conventional IGS systems when the surgeon has to look away from the operation field to consult the image data on a 2-dimensional screen. The Microsoft HoloLens, a head-mounted AR display, was combined with an optical navigation system to create an AR-based IGS system. Experiments were performed on a phantom model to determine the accuracy of the complete system and to evaluate the effect of adding AR. The results demonstrated a mean Euclidean distance of 2.3 mm with a maximum error of 3.5 mm for the complete system. Adding AR visualization to a conventional system increased the mean error by 1.6 mm. The introduction of AR in IGS was promising. The presented system provided a solution for the switching focus problem and created a more intuitive guidance system. With a further reduction in the error and more research to optimize the visualization, many surgical applications could benefit from the advantages of AR guidance.
Affiliation(s)
- Johan Nijsink
- Radboud University Medical Center, Nijmegen, Netherlands
- Ruud Schreurs
- Radboud University Medical Center, Nijmegen, Netherlands
- Academic Medical Center, Amsterdam, Netherlands
- Tong Xi
- Radboud University Medical Center, Nijmegen, Netherlands
|
35
|
Recent Development of Augmented Reality in Surgery: A Review. JOURNAL OF HEALTHCARE ENGINEERING 2017; 2017:4574172. [PMID: 29065604 PMCID: PMC5585624 DOI: 10.1155/2017/4574172] [Citation(s) in RCA: 138] [Impact Index Per Article: 19.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/12/2017] [Accepted: 07/03/2017] [Indexed: 12/11/2022]
Abstract
Introduction The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency, safety, and cost, and to enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. Methods We performed a review of available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms “augmented reality” and “surgery.” Results The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. By reading the abstracts, 91 relevant studies were chosen for inclusion, and 11 further references were gathered by cross-referencing, for a total of 102 studies included in this review. Conclusions The present literature suggests increasing interest among surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality is implemented into routine practice.
|
36
|
Zhu M, Chai G, Lin L, Xin Y, Tan A, Bogari M, Zhang Y, Li Q. Effectiveness of a Novel Augmented Reality-Based Navigation System in Treatment of Orbital Hypertelorism. Ann Plast Surg 2016; 77:662-668. [DOI: 10.1097/sap.0000000000000644] [Citation(s) in RCA: 31] [Impact Index Per Article: 3.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
|