1
Shusterman A, Nashef R, Tecco S, Mangano C, Mangano F. Implant Placement using Mixed Reality-Based Dynamic Navigation: a Proof of Concept. J Dent 2024:105256. PMID: 39043329. DOI: 10.1016/j.jdent.2024.105256.
Abstract
OBJECTIVES To present the first clinical application of a novel mixed reality-based dynamic navigation (MR-DN) system in the rehabilitation of a single tooth gap. METHODS The protocol consisted of the following: (1) three-dimensional patient data acquisition using intraoral scanning (IOS) and cone-beam computed tomography (CBCT), (2) implant planning using guided surgery software, (3) holography-guided implant placement using the novel MR-DN system (ANNA®, MARS Dental, Haifa, Israel) and (4) verification of placement accuracy. RESULTS The novel MR-DN system was safe and time-efficient: the surgery took 30 minutes from anaesthesia to suturing. Implant placement accuracy was high, with minimal deviations from the presurgical plan recorded in the three planes of space: the planar (XY) deviation at the entry point was 0.381 mm and the depth (Z) deviation was 0.173 mm, for a 3D entry point distance (En) of 0.417 mm. A 3D apex deviation (An) of 0.193 mm was registered, with an angular difference of 1.852°. CONCLUSIONS This proof-of-concept study demonstrated the clinical feasibility of MR-DN for guided implant placement in single tooth gaps. Further clinical studies on a large sample of patients are needed to confirm these positive preliminary results. STATEMENT OF CLINICAL RELEVANCE The use of MR-DN can change the perspectives of guided dental implant surgery as a possible alternative to the classic static and dynamic guided surgical techniques for the rehabilitation of single tooth gaps.
Affiliation(s)
- Rizan Nashef
- Oral and Maxillofacial Surgery Unit, Shaare Zedek Medical Center, Jerusalem, Israel
- Simona Tecco
- Department of Dental Sciences, San Raffaele University, Milan, Italy
- Carlo Mangano
- Department of Dental Sciences, San Raffaele University, Milan, Italy
- Francesco Mangano
- Department of Pediatric, Preventive Dentistry and Orthodontics, I. M. Sechenov First State Medical University, Moscow, Russian Federation
2
Al Hamad KQ, Said KN, Engelschalk M, Matoug-Elwerfelli M, Gupta N, Eric J, Ali SA, Ali K, Daas H, Abu Alhaija ES. Taxonomic discordance of immersive realities in dentistry: A systematic scoping review. J Dent 2024; 146:105058. PMID: 38729286. DOI: 10.1016/j.jdent.2024.105058.
Abstract
OBJECTIVES This review aimed to map taxonomy frameworks, descriptions, and applications of immersive technologies in the dental literature. DATA The Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines were followed, and the protocol was registered on the Open Science Framework platform (https://doi.org/10.17605/OSF.IO/H6N8M). SOURCES A systematic search was conducted in the MEDLINE (via PubMed), Scopus, and Cochrane Library databases and complemented by a manual search. STUDY SELECTION A total of 84 articles were included, 81 % of them published between 2019 and 2023. Most studies were experimental (62 %), including education (25 %), protocol feasibility (20 %), in vitro (11 %), and cadaver (6 %) studies. Other study types included clinical reports/technique articles (24 %), clinical studies (9 %), technical notes/tips to the reader (4 %), and randomized controlled trials (1 %). Three-quarters of the included studies were published in the oral and maxillofacial surgery (38 %), dental education (26 %), and implant (12 %) disciplines. Methods of display included head-mounted display devices (HMD) (55 %), see-through screens (32 %), 2D screen displays (11 %), and projector displays (2 %). Descriptions of immersive realities were fragmented and inconsistent, lacking a clear taxonomy framework for the umbrella and subset terms, including virtual reality (VR), augmented reality (AR), mixed reality (MR), augmented virtuality (AV), extended reality, and X reality. CONCLUSIONS Immersive reality applications in dentistry are gaining popularity, with a notable surge in the number of publications in the last 5 years. Ambiguities are apparent in the descriptions of immersive realities. A taxonomy framework based on method of display (full or partial) and reality class (VR, AR, or MR) is proposed. CLINICAL SIGNIFICANCE Understanding different reality classes can be perplexing due to their blurred boundaries and conceptual overlap. Immersive technologies offer novel educational and clinical applications in a fast-developing domain; given the current fragmented and inconsistent terminology, a comprehensive taxonomy framework is necessary.
Affiliation(s)
- Khaled Q Al Hamad
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Khalid N Said
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Marcus Engelschalk
- Department of Oral and Maxillofacial Surgery, University Medical Center Hamburg-Eppendorf, Germany
- Nidhi Gupta
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Jelena Eric
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Shaymaa A Ali
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar; Hamad Medical Corporation, Doha, Qatar
- Kamran Ali
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
- Hanin Daas
- College of Dental Medicine, QU Health, Qatar University, Doha, Qatar
3
Shusterman A, Nashef R, Tecco S, Mangano C, Lerner H, Mangano FG. Accuracy of implant placement using a mixed reality-based dynamic navigation system versus static computer-assisted and freehand surgery: An in vitro study. J Dent 2024; 146:105052. PMID: 38734298. DOI: 10.1016/j.jdent.2024.105052.
Abstract
PURPOSE This in vitro study aimed to compare the accuracy of dental implant placement in partially edentulous maxillary models using a mixed reality-based dynamic navigation (MR-DN) system, conventional static computer-assisted implant surgery (s-CAIS) and a freehand (FH) method. METHODS Forty-five partially edentulous models (with teeth missing in positions #15, #16 and #25) were assigned to three groups (15 per group). The same experienced operator performed the model surgeries using the MR-DN system (group 1), s-CAIS (group 2) or FH (group 3). In total, 135 dental implants were placed (45 per group). The primary outcomes were the linear coronal deviation (entry error; En), apical deviation (apex error; Ap), XY and Z deviations, and angular deviation (An) between the planned and actual (post-surgery) positions of the implants in the models. These deviations were computed as the distances between the stereolithographic (STL) files of the planned implants and of the placed implants captured with an intraoral scanner. RESULTS Across the three implant sites, the MR-DN system was significantly more accurate than the FH method (in XY, Z, En, Ap and An) and than s-CAIS (in Z, Ap and An). However, s-CAIS was more accurate than MR-DN in XY, and no difference was found between MR-DN and s-CAIS in En. CONCLUSIONS Within the limits of this study (in vitro design, only partially edentulous models), implant placement accuracy with MR-DN was superior to that of FH and similar to that of s-CAIS. STATEMENT OF CLINICAL RELEVANCE In vitro, MR-DN showed greater accuracy in implant positioning than FH and similar accuracy to s-CAIS; it could therefore represent a new option for the surgeon. However, clinical studies are needed to determine the feasibility of MR-DN.
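The deviation metrics used in this study (entry, apex, planar, depth, and angular errors between planned and placed implants) can be computed directly from the implant axis endpoints. A minimal sketch, with hypothetical function and argument names that are not part of the authors' actual pipeline:

```python
import numpy as np

def implant_deviations(planned_entry, planned_apex, actual_entry, actual_apex):
    """Illustrative deviation metrics between a planned and a placed implant.

    Each argument is a 3D point (x, y, z) in mm; the implant axis runs
    from the coronal entry point to the apex.
    """
    p_entry, p_apex = np.asarray(planned_entry, float), np.asarray(planned_apex, float)
    a_entry, a_apex = np.asarray(actual_entry, float), np.asarray(actual_apex, float)

    en = np.linalg.norm(a_entry - p_entry)        # 3D entry-point deviation (En)
    ap = np.linalg.norm(a_apex - p_apex)          # 3D apex deviation (Ap)
    xy = np.linalg.norm((a_entry - p_entry)[:2])  # planar (XY) entry deviation
    z = abs((a_entry - p_entry)[2])               # depth (Z) entry deviation

    # Angular deviation (An): angle between the two implant axes, in degrees.
    v_p = (p_apex - p_entry) / np.linalg.norm(p_apex - p_entry)
    v_a = (a_apex - a_entry) / np.linalg.norm(a_apex - a_entry)
    an = np.degrees(np.arccos(np.clip(np.dot(v_p, v_a), -1.0, 1.0)))
    return {"En": en, "Ap": ap, "XY": xy, "Z": z, "An": an}
```

For example, a placed implant whose entry and apex are both offset by (3, 4, 0) mm from the plan gives En = Ap = XY = 5 mm, Z = 0 mm and An = 0°.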
Affiliation(s)
- Rizan Nashef
- Oral and Maxillofacial Surgery Unit, Shaare Zedek Medical Center, Jerusalem, Israel
- Simona Tecco
- Department of Dental Sciences, San Raffaele University, Milan, Italy
- Carlo Mangano
- Department of Dental Sciences, San Raffaele University, Milan, Italy
- Henriette Lerner
- Academic Teaching and Research Institution of Johann Wolfgang Goethe University, Frankfurt, Germany
- Francesco Guido Mangano
- Department of Pediatric, Preventive Dentistry and Orthodontics, I. M. Sechenov First State Medical University, Moscow, Russian Federation
4
Rieder M, Remschmidt B, Gsaxner C, Gaessler J, Payer M, Zemann W, Wallner J. Augmented Reality-Guided Extraction of Fully Impacted Lower Third Molars Based on Maxillofacial CBCT Scans. Bioengineering (Basel) 2024; 11:625. PMID: 38927861. PMCID: PMC11200966. DOI: 10.3390/bioengineering11060625.
Abstract
(1) Background: This study aimed to integrate an augmented reality (AR) image-guided surgery (IGS) system, based on preoperative cone beam computed tomography (CBCT) scans, into clinical practice. (2) Methods: In preclinical and clinical surgical setups, an AR-guided visualization system based on Microsoft's HoloLens 2 was assessed for complex lower third molar (LTM) extractions. The system's intraoperative feasibility and usability are described first. Preparation and operating times for each procedure were measured, and the system's usability was rated using the System Usability Scale (SUS). (3) Results: A total of six LTMs (n = 6) were analyzed, two extracted from human cadaver head specimens (n = 2) and four from clinical patients (n = 4). The average preparation time was 166 ± 44 s, while the operation time averaged 21 ± 5.9 min. The overall mean SUS score was 79.1 ± 9.3. When analyzed separately, the usability score categorized the AR-guidance system as "good" in clinical patients and "best imaginable" in human cadaver head procedures. (4) Conclusions: This translational study describes the first successful and functionally stable application of the HoloLens technology for complex LTM extraction in clinical patients. Further research is needed to refine the technology's integration into clinical practice and improve patient outcomes.
Affiliation(s)
- Marcus Rieder
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Bernhard Remschmidt
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria
- Jan Gaessler
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Michael Payer
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Wolfgang Zemann
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Juergen Wallner
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
5
Al-Asali M, Alqutaibi AY, Al-Sarem M, Saeed F. Deep learning-based approach for 3D bone segmentation and prediction of missing tooth region for dental implant planning. Sci Rep 2024; 14:13888. PMID: 38880802. PMCID: PMC11180661. DOI: 10.1038/s41598-024-64609-0.
Abstract
Recent studies have shown that dental implants have high long-term survival rates, indicating their effectiveness compared with other treatments. However, treatment failure remains a concern. Deep learning methods, specifically U-Net models, have been applied effectively to analyze medical and dental images. This study aims to utilize U-Net models to segment bone in regions with missing teeth in cone-beam computed tomography (CBCT) scans and to predict implant positions. The proposed models were applied to a CBCT dataset of Taibah University Dental Hospital (TUDH) patients collected between 2018 and 2023. They were evaluated using different performance metrics and validated by a domain expert. The experimental results demonstrated outstanding bone-segmentation performance in terms of the Dice coefficient, precision, and recall (0.93, 0.94, and 0.93, respectively) with a low volume error (0.01). The proposed models offer promising automated dental implant planning for dental implantologists.
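The evaluation metrics this study reports (Dice coefficient, precision, recall) are standard for binary segmentation masks and follow directly from voxel-wise overlap counts. A minimal sketch, with a hypothetical function name that is not the authors' code:

```python
import numpy as np

def segmentation_scores(pred, truth):
    """Dice coefficient, precision, and recall for binary masks.

    `pred` and `truth` are boolean arrays of the same shape, e.g. voxel
    masks of segmented bone (prediction vs. ground truth).
    """
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.logical_and(pred, truth).sum()   # voxels labeled bone in both
    fp = np.logical_and(pred, ~truth).sum()  # predicted bone, actually background
    fn = np.logical_and(~pred, truth).sum()  # missed bone voxels

    dice = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    recall = tp / (tp + fn) if (tp + fn) else 1.0
    return dice, precision, recall
```

On a real CBCT volume the same call applies unchanged to the full 3D arrays, since the counts are taken over all voxels regardless of shape.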
Affiliation(s)
- Mohammed Al-Asali
- College of Computer Science and Engineering, Taibah University, 42353 Medina, Saudi Arabia
- Ahmed Yaseen Alqutaibi
- Substitutive Dental Sciences Department (Prosthodontics), College of Dentistry, Taibah University, 41311 Al Madinah, Saudi Arabia
- Department of Prosthodontics, College of Dentistry, Ibb University, 70270 Ibb, Yemen
- Mohammed Al-Sarem
- College of Computer Science and Engineering, Taibah University, 42353 Medina, Saudi Arabia
- Department of Computer Science, Sheba Region University, Marib, Yemen
- Faisal Saeed
- College of Computing and Digital Technology, Birmingham City University, Birmingham B4 7XG, UK
6
Shao L, Fu T, Lin Y, Xiao D, Ai D, Zhang T, Fan J, Song H, Yang J. Facial augmented reality based on hierarchical optimization of similarity aspect graph. Comput Methods Programs Biomed 2024; 248:108108. PMID: 38461712. DOI: 10.1016/j.cmpb.2024.108108.
Abstract
BACKGROUND Existing face-matching methods require a point cloud to be drawn on the real face for registration; irregular deformation of the patient's skin introduces many outlier points into this cloud, resulting in low registration accuracy. METHODS This work proposes a non-contact pose estimation method based on hierarchical optimization of a similarity aspect graph. The proposed method constructs a distance-weighted, triangular-constrained similarity measure to describe the similarity between views by automatically identifying 2D and 3D facial feature points. A mutual similarity clustering method is proposed to construct a hierarchical aspect graph with 3D poses as nodes. A Monte Carlo tree search strategy is used to search the hierarchical aspect graph for the optimal pose of the facial 3D model, thereby achieving accurate registration of the facial 3D model to the real face. RESULTS The proposed method was used to conduct accuracy verification experiments on phantoms and volunteers, and was compared with four advanced pose calibration methods. The proposed method obtained average fusion errors of 1.13 ± 0.20 mm and 0.92 ± 0.08 mm in the head phantom and volunteer experiments, respectively, exhibiting the best fusion performance among all comparison methods. CONCLUSIONS Our experiments proved the effectiveness of the proposed pose estimation method for facial augmented reality.
Affiliation(s)
- Long Shao
- School of Computer Science & Technology, Beijing Institute of Technology, Beijing 100081, China
- Tianyu Fu
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Yucong Lin
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Deqiang Xiao
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
- Danni Ai
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
- Tao Zhang
- Department of Stomatology, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100730, China
- Jingfan Fan
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
- Hong Song
- School of Computer Science & Technology, Beijing Institute of Technology, Beijing 100081, China
- Jian Yang
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
7
Li F, Gao Q, Wang N, Greene N, Song T, Dianat O, Azimi E. Mixed reality guided root canal therapy. Healthc Technol Lett 2024; 11:167-178. PMID: 38638496. PMCID: PMC11022218. DOI: 10.1049/htl2.12077.
Abstract
Root canal therapy (RCT) is a widely performed procedure in dentistry, with over 25 million individuals undergoing it annually. The procedure addresses inflammation or infection within the root canal system of affected teeth. However, accurately aligning CT scan information with the patient's tooth has posed challenges, leading to errors in tool positioning and potential negative outcomes. To overcome these challenges, a mixed reality application was developed using an optical see-through head-mounted display (OST-HMD). The application incorporates visual cues, an augmented mirror, and dynamically updated multi-view CT slices to address depth perception issues and achieve accurate tooth localization, comprehensive canal exploration, and prevention of perforation during RCT. A preliminary experimental assessment showed significant improvements in procedural accuracy: positional accuracy improved from 1.4 to 0.4 mm (a gain of more than 70%) using an optical tracker (NDI) and from 2.8 to 2.4 mm using the HMD, achieving submillimeter accuracy with the NDI tracker. Six participants were enrolled in the user study, which found an average displacement on the crown plane of 1.27 ± 0.83 cm, an average depth error of 0.90 ± 0.72 cm, and an average angular deviation of 1.83 ± 0.83°. An error analysis further highlights the impact of HMD spatial localization and head motion on the registration and calibration process. Through seamless integration of CT image information with the patient's tooth, the mixed reality application assists dentists in achieving precise tool placement. This technology has the potential to elevate the quality of root canal procedures, ensuring better accuracy and enhancing overall treatment outcomes.
Affiliation(s)
- Fangjie Li
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, USA
- Qingying Gao
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Nengyu Wang
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Nicholas Greene
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
- Tianyu Song
- Chair for Computer Aided Medical Procedures, Technical University of Munich, Munich, Germany
- Omid Dianat
- School of Dentistry, University of Maryland, Baltimore, Maryland, USA
- Ehsan Azimi
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
8
Engelschalk M, Al Hamad KQ, Mangano R, Smeets R, Molnar TF. Dental implant placement with immersive technologies: A preliminary clinical report of augmented and mixed reality applications. J Prosthet Dent 2024:S0022-3913(24)00141-0. PMID: 38480015. DOI: 10.1016/j.prosdent.2024.02.017.
Abstract
A preliminary clinical report of implant placement with 2 immersive reality technologies is described: augmented reality with a head-mounted display and mixed reality with a tablet PC. Both immersive realities are promising and could facilitate innovative dental applications. However, mixed reality requires further development for clinical optimization.
Affiliation(s)
- Marcus Engelschalk
- Researcher, Department of Oral and Maxillofacial Surgery, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; and Private practice, Munich, Germany
- Khaled Q Al Hamad
- Professor, College of Dental Medicine, Qatar University, QU Health, Doha, Qatar
- Ralf Smeets
- Professor, Department of Oral and Maxillofacial Surgery, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Tamás F Molnar
- Professor, Medical Skill and Innovation Centre, Department of Operational Medicine, Medical School, University of Pécs, Pécs, Hungary
9
Hsu MC, Lin CC, Hsu JT, Yu JH, Huang HL. Effects of an augmented reality aided system on the placement precision of orthodontic miniscrews: A pilot study. J Dent Sci 2024; 19:100-108. PMID: 38303815. PMCID: PMC10829748. DOI: 10.1016/j.jds.2023.05.025.
Abstract
Background/purpose Augmented reality (AR) is gaining popularity in medical applications and may help clinicians achieve improved clinical outcomes. The purpose of this study was to determine the positional and angular errors of orthodontic miniscrew placement using a self-developed AR-aided system. Materials and methods Cone beam computed tomography (CBCT) and printed patient models were used in in vitro experiments. The participants were divided into a control group, which used traditional orthodontic methods, and an AR group, which used the AR-aided system. After the information obtained from the CBCT images and the navigation system was combined on the display device, the AR-aided system indicated the planned miniscrew position to guide the clinicians during placement. Both methods were compared for a senior and a junior dentist, and the position and angle of miniscrew placement were statistically analyzed using Wilcoxon's signed-rank and Mann-Whitney U tests. Results With the AR-aided system, the accuracy of miniscrew placement in the mesiodistal position increased considerably (by 83%) for the senior clinician; for the junior clinician, mesiodistal positional accuracy and angular accuracy increased by approximately 67% and 72%, respectively. With the AR-aided system, the junior clinician achieved a smaller placement position error than the senior clinician. Conclusion The AR-aided system improved the accuracy of miniscrew placement regardless of the clinician's level of experience.
Affiliation(s)
- Meng-Chu Hsu
- School of Dentistry, China Medical University, Taichung, Taiwan
- Chih-Chieh Lin
- Department of Dentistry, China Medical University Hospital, Taichung, Taiwan
- Jui-Ting Hsu
- Department of Biomedical Engineering, China Medical University, Taichung, Taiwan
- Jian-Hong Yu
- School of Dentistry, China Medical University, Taichung, Taiwan
- Department of Dentistry, China Medical University Hospital, Taichung, Taiwan
- Heng-Li Huang
- School of Dentistry, China Medical University, Taichung, Taiwan
- Department of Bioinformatics and Medical Engineering, Asia University, Taichung, Taiwan
10
Tao B, Fan X, Wang F, Chen X, Shen Y, Wu Y. Comparison of the accuracy of dental implant placement using dynamic and augmented reality-based dynamic navigation: An in vitro study. J Dent Sci 2024; 19:196-202. PMID: 38303816. PMCID: PMC10829549. DOI: 10.1016/j.jds.2023.05.006.
Abstract
Background/purpose Augmented reality has gradually been applied in dental implant surgery. However, whether a dynamic navigation system integrated with augmented reality technology further improves accuracy is still unknown. The purpose of this study was to investigate the accuracy of dental implant placement using dynamic navigation and augmented reality-based dynamic navigation systems. Materials and methods Thirty-two cone-beam CT (CBCT) scans from clinical patients were collected and used to generate 64 phantoms that were allocated to the augmented reality-based dynamic navigation (ARDN) group or the conventional dynamic navigation (DN) group. The primary outcomes were global coronal, apical and angular deviations, measured after image fusion. A linear mixed model with a random intercept was used; P < 0.05 was considered statistically significant. Results A total of 242 dental implants were placed in the two groups. The global coronal, apical and angular deviations of the ARDN and DN groups were 1.31 ± 0.67 mm vs. 1.18 ± 0.59 mm, 1.36 ± 0.67 mm vs. 1.39 ± 0.55 mm, and 3.72 ± 2.13° vs. 3.10 ± 1.56°, respectively. No significant differences were found in coronal and apical deviations (P = 0.16 and P = 0.60, respectively), but the DN group had a significantly lower angular deviation than the ARDN group (P = 0.02). Conclusion The augmented reality-based dynamic navigation system achieved accuracy similar to the conventional dynamic navigation system at the coronal and apical points but yielded a higher angular deviation.
Affiliation(s)
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Feng Wang
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
- National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
11
Condino S, Cutolo F, Carbone M, Cercenelli L, Badiali G, Montemurro N, Ferrari V. Registration Sanity Check for AR-guided Surgical Interventions: Experience From Head and Face Surgery. IEEE J Transl Eng Health Med 2023; 12:258-267. PMID: 38410181. PMCID: PMC10896424. DOI: 10.1109/jtehm.2023.3332088.
Abstract
Achieving and maintaining proper image registration accuracy is an open challenge in image-guided surgery. This work explores and assesses the efficacy of a registration sanity check method for augmented reality-guided navigation (AR-RSC), based on visual inspection of virtual 3D models of landmarks. We analyze the sensitivity and specificity of AR-RSC by recruiting 36 subjects to assess the registration accuracy of a set of 114 AR images generated from camera images acquired during an AR-guided orthognathic intervention. Translational or rotational errors of known magnitude, up to ±1.5 mm/±15.5°, were artificially added to the image set to simulate different registration errors. This study analyzes the performance of AR-RSC when varying (1) the virtual models selected for misalignment evaluation (the models of brackets, incisor teeth, and gingival margins in our experiment), (2) the type (translation/rotation) of registration error, and (3) the level of user experience with AR technologies. Results show that: (1) the sensitivity and specificity of AR-RSC depend on the virtual models (globally, a median true positive rate of up to 79.2% was reached with brackets, and a median true negative rate of up to 64.3% with incisor teeth); (2) some error components are more difficult to identify visually; and (3) the level of user experience does not affect the method. In conclusion, the proposed AR-RSC, also tested in the operating room, could represent an efficient method to monitor and optimize registration accuracy during an intervention, but special attention should be paid to the selection of the AR data chosen for visual inspection of the registration accuracy.
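The true positive and true negative rates reported in this study follow the standard definitions of sensitivity and specificity over rater judgments. A minimal sketch, with a hypothetical function name and example counts that are not the study's analysis code or data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """True positive rate (sensitivity) and true negative rate (specificity).

    In the AR-RSC setting, a "positive" is an AR image with a real
    registration error, and a true positive is a rater correctly
    flagging that image as misaligned.
    """
    tpr = tp / (tp + fn)  # share of misaligned images correctly flagged
    tnr = tn / (tn + fp)  # share of well-registered images correctly accepted
    return tpr, tnr
```

For example, 79 hits out of 100 misaligned images and 64 correct acceptances out of 100 well-registered images give a 79% sensitivity and a 64% specificity, comparable in magnitude to the median rates reported for brackets and incisor teeth.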
Affiliation(s)
- Sara Condino
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Marina Carbone
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Laura Cercenelli
- EDIMES Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Nicola Montemurro
- Department of Neurosurgery, Azienda Ospedaliera Universitaria Pisana (AOUP), 56127 Pisa, Italy
- Vincenzo Ferrari
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
12
Fan X, Tao B, Tu P, Shen Y, Wu Y, Chen X. A novel mixed reality-guided dental implant placement navigation system based on virtual-actual registration. Comput Biol Med 2023; 166:107560. [PMID: 37847946 DOI: 10.1016/j.compbiomed.2023.107560] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2023] [Revised: 09/14/2023] [Accepted: 10/10/2023] [Indexed: 10/19/2023]
Abstract
BACKGROUND The key to successful dental implant surgery is to place the implants accurately along the preoperatively planned paths. Surgical navigation systems can significantly improve the safety and accuracy of implantation; however, the surgeon's frequent shifts of view between the surgical site and the computer screen are a drawback. Mixed reality technology is expected to solve this problem: a head-mounted HoloLens device can align the virtual three-dimensional (3D) image with the actual surgical site in the same field of view. METHODS This study utilized mixed reality technology to enhance dental implant surgery navigation. Our first step was reconstructing a virtual 3D model from pre-operative cone-beam CT (CBCT) images. We then obtained the relative positions between objects using the navigation device and the HoloLens camera. Via virtual-actual registration algorithms, the transformation matrices between the HoloLens device and the navigation tracker were acquired through HoloLens-tracker registration, and those between the virtual model and the patient phantom through image-phantom registration. In addition, a surgical drill calibration algorithm yielded the transformation matrices between the surgical drill and the patient phantom. These algorithms allow real-time tracking of the surgical drill's location and orientation relative to the patient phantom under the navigation device. With the aid of the HoloLens 2, virtual 3D images and the actual patient phantom can be aligned accurately, providing surgeons with a clear visualization of the implant path. RESULTS Phantom experiments were conducted using 30 patient phantoms, with a total of 102 dental implants inserted. Comparisons between the actual implant paths and the pre-operatively planned paths showed that our system achieved a coronal deviation of 1.507 ± 0.155 mm, an apical deviation of 1.542 ± 0.143 mm, and an angular deviation of 3.468 ± 0.339°. The deviation was not significantly different from that of navigation-guided dental implant placement but better than that of freehand placement. CONCLUSION Our proposed system integrates the pre-operatively planned dental implant paths with the patient phantom, helping surgeons achieve adequate accuracy in dental implant surgery. Furthermore, this system is expected to be applicable to animal and cadaveric experiments in further studies.
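The registration chain described in the METHODS (HoloLens-tracker registration, image-phantom registration, and drill calibration) amounts to composing rigid transforms. A minimal sketch with 4×4 homogeneous matrices and hypothetical values (the actual matrices come from the system's registrations, not from this code):

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical registration results (identity rotations for readability):
# phantom <- tracker, from the image-phantom / HoloLens-tracker registrations,
# and tracker <- drill, from the surgical drill calibration.
T_phantom_tracker = make_transform(np.eye(3), [0.0, 10.0, 0.0])
T_tracker_drill = make_transform(np.eye(3), [1.0, 2.0, 3.0])

# Composing the chain yields the drill pose in phantom coordinates,
# which is what the real-time navigation display needs.
T_phantom_drill = T_phantom_tracker @ T_tracker_drill
```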
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
13
Remschmidt B, Rieder M, Gsaxner C, Gaessler J, Payer M, Wallner J. Augmented Reality-Guided Apicoectomy Based on Maxillofacial CBCT Scans. Diagnostics (Basel) 2023; 13:3037. [PMID: 37835780 PMCID: PMC10572956 DOI: 10.3390/diagnostics13193037] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/14/2023] [Revised: 09/13/2023] [Accepted: 09/21/2023] [Indexed: 10/15/2023] Open
Abstract
Implementation of augmented reality (AR) image guidance systems using preoperative cone beam computed tomography (CBCT) scans in apicoectomies promises to help surgeons avoid the iatrogenic complications associated with this procedure. This study aims to evaluate the intraoperative feasibility and usability of the HoloLens 2, an established AR image guidance device, in the context of apicoectomies. Three experienced surgeons each carried out four AR-guided apicoectomies on human cadaver head specimens. The preparation and operating times of each procedure were measured, as was the subjective usability of the HoloLens for AR image guidance in apicoectomies using the System Usability Scale (SUS). In total, twelve AR-guided apicoectomies were performed on six human cadaver head specimens (n = 12). The average preparation time was 162 (±34) s, and the surgical procedure itself took on average 9 (±2) min. There was no statistically significant difference between the three surgeons. Quantification of the usability of the HoloLens revealed a mean SUS score of 80.4 (±6.8), indicating an "excellent" usability level. In conclusion, this study suggests the suitability, practicality, and simplicity of AR image guidance systems such as the HoloLens in apicoectomies and advocates their routine implementation.
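The SUS score of 80.4 reported above follows the standard System Usability Scale scoring rule: ten 1-5 Likert items with alternating positive/negative wording, rescaled to 0-100. A minimal sketch of that computation:

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert responses -> a 0-100 usability score.
    Odd-numbered items (positively worded) contribute response - 1;
    even-numbered items (negatively worded) contribute 5 - response."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses, each between 1 and 5")
    # i is 0-based, so even i corresponds to the 1-based odd-numbered items
    total = sum(r - 1 if i % 2 == 0 else 5 - r for i, r in enumerate(responses))
    return total * 2.5  # rescale the 0-40 raw sum to 0-100
```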
Affiliation(s)
- Bernhard Remschmidt
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Marcus Rieder
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria
- Jan Gaessler
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Michael Payer
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Juergen Wallner
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
14
Riad Deglow E, Zubizarreta-Macho Á, González Menéndez H, Lorrio Castro J, Galparsoro Catalán A, Tzironi G, Lobo Galindo AB, Alonso Ezpeleta LÓ, Hernández Montero S. Comparative analysis of two navigation techniques based on augmented reality technology for the orthodontic mini-implants placement. BMC Oral Health 2023; 23:542. [PMID: 37543581 PMCID: PMC10403882 DOI: 10.1186/s12903-023-03261-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/17/2023] [Accepted: 07/28/2023] [Indexed: 08/07/2023] Open
Abstract
To analyze and compare the accuracy and the prevalence of root contact of a conventional freehand technique and two navigation techniques based on augmented reality technology for the placement of orthodontic self-drilling mini-implants. Methods Two hundred and seven orthodontic self-drilling mini-implants were placed using either a conventional freehand technique (FHT) or one of two navigation techniques based on augmented reality technology (AR TOOTH and AR SCREWS). Accuracy across different dental sectors was also analyzed. CBCT and intraoral scans were taken both prior to and following mini-implant placement. Angular and horizontal deviations were then analyzed; these measurements were taken at the coronal entry point and apical endpoint between the planned and placed mini-implants. In addition, any complications resulting from mini-implant placement, such as root perforations, were analyzed across all dental sectors. Results The statistical analysis showed significant differences between study groups with regard to the coronal entry point (p < 0.001), apical endpoint (p < 0.001), and angular deviations (p < 0.001). Furthermore, statistically significant differences were shown between placement sites at the coronal entry point (p < 0.0001) and apical endpoint (p < 0.001). Additionally, eight root perforations were observed in the FHT group, while there were none in the two augmented reality groups. Conclusions Navigation techniques based on augmented reality technology improve the accuracy of orthodontic self-drilling mini-implant placement and result in fewer intraoperative complications compared with the conventional freehand technique. The AR TOOTH technique showed the closest agreement between planned and placed mini-implants, compared with the AR SCREWS and conventional freehand techniques, and both augmented reality techniques showed fewer intraoperative complications than the freehand technique.
Affiliation(s)
- Elena Riad Deglow
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda. Universidad 1, 28691 Villanueva de La Cañada, Madrid, Spain
- Álvaro Zubizarreta-Macho
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda. Universidad 1, 28691 Villanueva de La Cañada, Madrid, Spain
- Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Héctor González Menéndez
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda. Universidad 1, 28691 Villanueva de La Cañada, Madrid, Spain
- Juan Lorrio Castro
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda. Universidad 1, 28691 Villanueva de La Cañada, Madrid, Spain
- Agustín Galparsoro Catalán
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda. Universidad 1, 28691 Villanueva de La Cañada, Madrid, Spain
- Georgia Tzironi
- Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Ana Belén Lobo Galindo
- Department of Surgery, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Sofía Hernández Montero
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X El Sabio University, Avda. Universidad 1, 28691 Villanueva de La Cañada, Madrid, Spain
15
Kim M, Chung M, Shin YG, Kim B. Automatic registration of dental CT and 3D scanned model using deep split jaw and surface curvature. Comput Methods Programs Biomed 2023; 233:107467. [PMID: 36921464 DOI: 10.1016/j.cmpb.2023.107467] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/13/2022] [Revised: 02/07/2023] [Accepted: 03/04/2023] [Indexed: 06/18/2023]
Abstract
BACKGROUND AND OBJECTIVES In the medical field, various image registration applications have been studied. In dentistry, the registration of computed tomography (CT) volume data and 3D optically scanned models is essential for various clinical applications, including orthognathic surgery, implant surgical planning, and augmented reality. Our purpose was to present a fully automatic registration method for dental CT data and 3D scanned models. METHODS We use a 2D convolutional neural network to regress, from the CT data, a curve splitting the maxilla (i.e., upper jaw) and mandible (i.e., lower jaw) and the points specifying the front and back ends of the crown. Using this regressed information, we extract the point cloud and vertices corresponding to the tooth crown from the CT and scanned data, respectively. We introduce a novel metric, called curvature variance of neighbor (CVN), to discriminate between highly fluctuating and smoothly varying regions of the tooth crown. Registration based on CVN enables more accurate fine registration while reducing the effects of metal artifacts. Moreover, the proposed method does not require any preprocessing, such as extracting the iso-surface of the tooth crown from the CT data, thereby significantly reducing the computation time. RESULTS We evaluated the proposed method by comparing it with several promising registration techniques. Our experimental results on three datasets demonstrated that the proposed method exhibited higher registration accuracy (i.e., 2.85, 1.92, and 7.73 times smaller distance errors on the individual datasets) and shorter computation time (i.e., 4.12 times faster registration) than one of the state-of-the-art methods. Moreover, the proposed method worked considerably well for partially scanned data, whereas other methods suffered from the imbalance of information between the CT and scanned data. CONCLUSIONS The proposed method performs fully automatic and highly accurate registration of dental CT data and 3D scanned models, even with severe metal artifacts. In addition, it achieves fast registration because it does not require any preprocessing for iso-surface reconstruction from the CT data.
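The paper's exact CVN formula is not given in this abstract; one plausible reading, the variance of curvature over a vertex's neighborhood (high values flagging the "highly fluctuating" regions the abstract mentions), can be sketched as follows. The data structures and function name here are hypothetical:

```python
import statistics

def curvature_variance_of_neighbors(curvatures, neighbors, vertex):
    """Population variance of curvature over a vertex's neighborhood (vertex included).

    curvatures: dict vertex -> scalar curvature estimate
    neighbors:  dict vertex -> iterable of adjacent vertices
    """
    region = [vertex] + list(neighbors[vertex])
    values = [curvatures[v] for v in region]
    return statistics.pvariance(values)
```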
Affiliation(s)
- Minchang Kim
- Department of Computer Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Republic of Korea
- Minyoung Chung
- School of Software, Soongsil University, 369 Sangdo-Ro, Dongjak-Gu, Seoul 06978, Republic of Korea
- Yeong-Gil Shin
- Department of Computer Science and Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826, Republic of Korea
- Bohyoung Kim
- Division of Biomedical Engineering, Hankuk University of Foreign Studies, 81 Oedae-ro, Mohyeon-myeon, Cheoin-gu, Yongin-si, Gyeonggi-do 17035, Republic of Korea
16
Mangano FG, Admakin O, Lerner H, Mangano C. Artificial Intelligence and Augmented Reality for Guided Implant Surgery Planning: a Proof of Concept. J Dent 2023; 133:104485. [PMID: 36965859 DOI: 10.1016/j.jdent.2023.104485] [Citation(s) in RCA: 20] [Impact Index Per Article: 20.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/03/2023] [Revised: 03/08/2023] [Accepted: 03/13/2023] [Indexed: 03/27/2023] Open
Abstract
PURPOSE To present a novel protocol for authentic three-dimensional (3D) planning of dental implants, using artificial intelligence (AI) and augmented reality (AR). METHODS The novel protocol consists of (1) 3D data acquisition with an intraoral scanner (IOS) and cone-beam computed tomography (CBCT); (2) application of AI for CBCT segmentation to obtain standard tessellation language (STL) models and automatic alignment with the IOS models; (3) loading of selected STL models into the AR system and surgical planning with holograms; (4) surgical guide design with open-source computer-aided design (CAD) software; and (5) surgery on the patient. RESULTS This novel protocol is effective and time-efficient when used for planning simple cases of static guided implant surgery in the partially edentulous patient. The clinician can plan the implants in an authentic 3D environment, without using any radiological guided surgery software. The precision of implant placement appears clinically acceptable, with minor deviations. CONCLUSIONS AI and AR technologies can be successfully used for planning guided implant surgery, providing authentic 3D planning that may replace conventional guided surgery software. However, further clinical studies are needed to validate this protocol. STATEMENT OF CLINICAL RELEVANCE The combined use of AI and AR may change the perspectives of modern guided implant surgery, providing authentic 3D planning that may replace conventional guided surgery software.
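Step (2)'s automatic alignment of CBCT-derived STL models with IOS models is, at its core, a rigid point-set registration problem. A minimal sketch of the classical Kabsch/SVD solution, assuming point correspondences are already known (real pipelines must establish them first, e.g. via feature matching or ICP); this is an illustration, not the protocol's actual algorithm:

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rotation R and translation t mapping corresponding points P onto Q.

    P, Q: (N, 3) arrays with row-wise correspondence, N >= 3 non-collinear points.
    """
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)                # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against an improper reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Qc - R @ Pc
    return R, t
```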
Affiliation(s)
- Francesco Guido Mangano
- Department of Pediatric, Preventive Dentistry and Orthodontics, Sechenov First State Medical University, Moscow, Russian Federation; Honorary Professor in Restorative Dental Sciences, Faculty of Dentistry, The University of Hong Kong, China
- Oleg Admakin
- Department of Pediatric, Preventive Dentistry and Orthodontics, Sechenov First State Medical University, Moscow, Russian Federation
- Henriette Lerner
- Academic Teaching and Research Institution of Johann Wolfgang Goethe University, Frankfurt, Germany
17
Liu L, Wang X, Guan M, Fan Y, Yang Z, Li D, Bai Y, Li H. A mixed reality-based navigation method for dental implant navigation method: A pilot study. Comput Biol Med 2023; 154:106568. [PMID: 36739818 DOI: 10.1016/j.compbiomed.2023.106568] [Citation(s) in RCA: 8] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/24/2022] [Revised: 12/28/2022] [Accepted: 01/22/2023] [Indexed: 01/25/2023]
Abstract
This in vitro study presents the development of a novel mixed reality (MR)-based dental implant navigation method and evaluates its implant accuracy. Data were collected using 3D cone-beam computed tomography. The MR-based navigation system comprised a HoloLens headset, an NDI (Northern Digital Inc.) Polaris optical tracking system, and a computer, together with purpose-built software. Resin models of dentition defects were created for a randomized comparison of the MR-based navigation implantation system (MR group, n = 25) with the conventional free-hand approach (FH group, n = 25). Implant surgery on the models was completed by an oral surgeon. The precision and feasibility of the MR-based navigation method were assessed by calculating the entry, middle, apex, and angular deviations of the implants. The system, including both hardware and software, was successfully developed, and a workflow for the method was established. Three-dimensional (3D) reconstruction and visualization of the surgical instruments, dentition, and jawbone were achieved. The system provided real-time tracking of the implant tools and the jaw model, holographic display via the MR headset, surgical guidance, and visualization of the intraoperative deviation of the implant trajectory from the planned trajectory. The MR-based navigation system was more precise than the free-hand approach for entry deviation (MR: 0.6914 ± 0.2507 mm, FH: 1.571 ± 0.5004 mm, P = 0.000), middle deviation (MR: 0.7156 ± 0.2127 mm, FH: 1.170 ± 0.3448 mm, P = 0.000), and angular deviation (MR: 1.849 ± 0.6120°, FH: 4.933 ± 1.650°, P = 0.000), whereas the difference in apex deviation was not statistically significant (MR: 0.7869 ± 0.2298 mm, FH: 0.9190 ± 0.3319 mm, P = 0.1082).
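Entry, middle/apex, and angular deviations such as those reported in this abstract compare planned and placed implant positions. A minimal sketch of how they can be computed from entry and apex coordinates (hypothetical values, in mm; not the study's own evaluation code):

```python
import math

def implant_deviations(planned_entry, planned_apex, placed_entry, placed_apex):
    """Return (entry deviation, apex deviation, angular deviation in degrees)."""
    entry_dev = math.dist(planned_entry, placed_entry)  # 3D distance at the platform
    apex_dev = math.dist(planned_apex, placed_apex)     # 3D distance at the tip
    u = [a - e for e, a in zip(planned_entry, planned_apex)]  # planned axis
    v = [a - e for e, a in zip(placed_entry, placed_apex)]    # placed axis
    cos_a = sum(ui * vi for ui, vi in zip(u, v)) / (math.hypot(*u) * math.hypot(*v))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))  # clamp for safety
    return entry_dev, apex_dev, angle
```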
Affiliation(s)
- Lin Liu
- Department of Stomatology, The First Medical Center of PLA General Hospital, Beijing, 100853, China
- Xiaoyu Wang
- Department of Stomatology, The First Medical Center of PLA General Hospital, Beijing, 100853, China; Department of Stomatology, PLA Strategic Support Force Special Medical Center, Beijing, 100101, China
- Miaosheng Guan
- Department of Stomatology, The First Medical Center of PLA General Hospital, Beijing, 100853, China; PLA Rocket Force Characteristic Medical Center, Beijing, 100088, China
- Yiping Fan
- Department of Stomatology, The First Medical Center of PLA General Hospital, Beijing, 100853, China
- Zhongliang Yang
- Department of Stomatology, The First Medical Center of PLA General Hospital, Beijing, 100853, China
- Deyu Li
- Beijing Visual 3D Medical Science and Technology Development Co., Ltd., Beijing, 100000, China
- Yuming Bai
- Beijing Visual 3D Medical Science and Technology Development Co., Ltd., Beijing, 100000, China
- Hongbo Li
- Department of Stomatology, The First Medical Center of PLA General Hospital, Beijing, 100853, China
18
Sin M, Cho JH, Lee H, Kim K, Woo HS, Park JM. Development of a Real-Time 6-DOF Motion-Tracking System for Robotic Computer-Assisted Implant Surgery. Sensors (Basel) 2023; 23:2450. [PMID: 36904653 PMCID: PMC10007561 DOI: 10.3390/s23052450] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 01/18/2023] [Revised: 02/10/2023] [Accepted: 02/21/2023] [Indexed: 06/18/2023]
Abstract
In this paper, we investigate a motion-tracking system for robotic computer-assisted implant surgery. Because failure to position the implant accurately may result in significant problems, an accurate real-time motion-tracking system is crucial for avoiding such issues in computer-assisted implant surgery. Essential features of the motion-tracking system are analyzed and classified into four categories: workspace, sampling rate, accuracy, and back-drivability. Based on this analysis, requirements for each category are derived to ensure that the motion-tracking system meets the desired performance criteria. A novel 6-DOF motion-tracking system is proposed that demonstrates high accuracy and back-drivability, making it suitable for use in computer-assisted implant surgery. The experimental results confirm the effectiveness of the proposed system in achieving the essential features required of a motion-tracking system in robotic computer-assisted implant surgery.
Affiliation(s)
- Minki Sin
- Department of Medical Robotics, Korea Institute of Machinery & Materials, Daegu 42994, Republic of Korea
- Jang Ho Cho
- Department of Medical Robotics, Korea Institute of Machinery & Materials, Daegu 42994, Republic of Korea
- Hyukjin Lee
- Department of Medical Robotics, Korea Institute of Machinery & Materials, Daegu 42994, Republic of Korea
- Kiyoung Kim
- Department of Medical Robotics, Korea Institute of Machinery & Materials, Daegu 42994, Republic of Korea
- Hyun Soo Woo
- Department of Medical Robotics, Korea Institute of Machinery & Materials, Daegu 42994, Republic of Korea
- Ji-Man Park
- Department of Prosthodontics & Dental Research Institute, Seoul National University School of Dentistry, Seoul 03080, Republic of Korea
19
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681 DOI: 10.1088/1361-6560/acaf23] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/12/2022] [Accepted: 12/29/2022] [Indexed: 12/31/2022]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. The types of AR visualization are divided into two categories: in situ visualization and non-in situ visualization. The rendered content of AR visualization varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in different surgical fields. However, most AR applications have been evaluated through model and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
20
Leung T, Dam VV, Lee DH. Accuracy of Augmented Reality-Assisted Navigation in Dental Implant Surgery: Systematic Review and Meta-analysis. J Med Internet Res 2023; 25:e42040. [PMID: 36598798 PMCID: PMC9856431 DOI: 10.2196/42040] [Citation(s) in RCA: 16] [Impact Index Per Article: 16.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2022] [Revised: 11/09/2022] [Accepted: 11/25/2022] [Indexed: 11/27/2022] Open
Abstract
BACKGROUND The novel concept of immersive 3D augmented reality (AR) surgical navigation has recently been introduced in the medical field. This method allows surgeons to focus directly on the surgical objective without having to look at a separate monitor. In the dental field, the recently developed AR-assisted dental implant navigation system (AR navigation), which uses innovative image technology to directly visualize and track a presurgical plan over the actual surgical site, has attracted great interest. OBJECTIVE This is the first systematic review and meta-analysis aiming to assess the accuracy of dental implants placed by AR navigation and compare it with that of the widely used implant placement methods, including the freehand method (FH), template-based static guidance (TG), and conventional navigation (CN). METHODS Individual search strategies were used in PubMed (MEDLINE), Scopus, ScienceDirect, Cochrane Library, and Google Scholar to search for articles published until March 21, 2022. This study was performed in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines and registered in the International Prospective Register of Systematic Reviews (PROSPERO) database. Peer-reviewed journal articles evaluating the positional deviations of dental implants placed using AR-assisted implant navigation systems were included. Cohen's d was used to estimate effect sizes and CIs of standardized mean differences (SMDs) between data sets. RESULTS Among the 425 articles retrieved, 15 were considered eligible for narrative review, 8 for single-arm meta-analysis, and 4 for a 2-arm meta-analysis.
The mean lateral, global, depth, and angular deviations of the dental implant placed using AR navigation were 0.90 (95% CI 0.78-1.02) mm, 1.18 (95% CI 0.95-1.41) mm, 0.78 (95% CI 0.48-1.08) mm, and 3.96° (95% CI 3.45°-4.48°), respectively. The accuracy of AR navigation was significantly higher than that of the FH method (SMD=-1.01; 95% CI -1.47 to -0.55; P<.001) and CN method (SMD=-0.46; 95% CI -0.64 to -0.29; P<.001). However, the accuracies of the AR navigation and TG methods were similar (SMD=0.06; 95% CI -0.62 to 0.74; P=.73). CONCLUSIONS The positional deviations of AR-navigated implant placements were within the safety zone, suggesting clinically acceptable accuracy of the AR navigation method. Moreover, the accuracy of AR implant navigation was comparable with that of the highly recommended dental implant-guided surgery method, TG, and superior to that of the conventional FH and CN methods. This review highlights the possibility of using AR navigation as an effective and accurate immersive surgical guide for dental implant placement.
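The SMDs above are Cohen's d values computed with a pooled standard deviation; a minimal sketch of that statistic (illustrative numbers, not the review's data):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference between two groups, using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd
```

A negative d, as in the FH and CN comparisons above, indicates smaller deviations (higher accuracy) in the AR navigation group.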
Affiliation(s)
- Van Viet Dam
- Department of Implantology, Hanoi National Hospital of Odonto-stomatology, Hanoi, Vietnam; VNU School of Medicine and Pharmacy, Vietnam National University, Hanoi, Vietnam
- Du-Hyeong Lee
- Institute for Translational Research in Dentistry, Kyungpook National University, Daegu, Republic of Korea; Department of Prosthodontics, School of Dentistry, Kyungpook National University, Daegu, Republic of Korea
21
Stünkel R, Zeller AN, Bohne T, Böhrnsen F, Wedi E, Raschke D, Kauffmann P. Accuracy of intraoral real-time navigation versus static, CAD/CAM-manufactured pilot drilling guides in dental implant surgery: an in vitro study. Int J Implant Dent 2022; 8:41. [PMID: 36198996 PMCID: PMC9535055 DOI: 10.1186/s40729-022-00430-6] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2022] [Accepted: 07/11/2022] [Indexed: 11/18/2022] Open
Abstract
Background Nowadays, 3D planning and static and dynamic aids play an increasing role in the oral rehabilitation of the masticatory apparatus with dental implants. The aim of this study was to compare the accuracy of implant placement using a 3D-printed pilot drilling guide and an intraoral real-time dynamic navigation system. Methods A total of 60 implants were placed in 12 partially edentulous lower jaw models: 30 with pilot drilling guides and 30 with dynamic navigation (DENACAM®). Implant placement in interdental gaps and free-end situations was also investigated. Accuracy was assessed by cone-beam computed tomography (CBCT). Results Both systems achieved clinically acceptable results, yet the drilling guides achieved more accurate results regarding the offset of the implant base and tip in several spatial dimensions (each p < 0.05). With regard to angulation, real-time navigation was more precise (p = 0.0016): its inaccuracy was 3°, versus 4.6° for the template-guided system. Median horizontal deviation was 0.52 mm at the base and 0.75 mm at the tip using DENACAM®; with the pilot drilling guide, it was 0.34 mm at the base and 0.59 mm at the tip. Regarding angulation, the closer the drill hole was to the system's marker, the better navigation performed; the template did not show this trend (p = 0.0043 and p = 0.0022). Conclusion Considering the limitations of an in vitro study, dynamic navigation can be a reliable and accurate tool for implantation. However, further clinical studies are needed to provide an evidence-based recommendation for use in vivo. Supplementary Information The online version contains supplementary material available at 10.1186/s40729-022-00430-6.
Affiliation(s)
- Robert Stünkel
- Department of Maxillofacial Surgery, Georg August University, Göttingen, Germany
- Alexander-Nicolai Zeller
- Department of Maxillofacial Surgery, Hannover Medical School, Carl-Neuberg-Straße 1, 30625, Hannover, Germany.
- Florian Böhrnsen
- Department of Maxillofacial Surgery, Georg August University, Göttingen, Germany
- Edris Wedi
- Department of Gastroenterology and Gastrointestinal Oncology, Interdisciplinary Endoscopy, University Medical Center, Georg August University, Göttingen, Germany
- David Raschke
- Department of Maxillofacial Surgery, Georg August University, Göttingen, Germany
- Philipp Kauffmann
- Department of Maxillofacial Surgery, Georg August University, Göttingen, Germany
22
Chen F, Xu P, Xie Y, Zhang D, Liao H, Zhao Z. Annotation-guided encoder-decoder network for bone extraction in ultrasound-assisted orthopedic surgery. Comput Biol Med 2022; 148:105813. [PMID: 35849949 DOI: 10.1016/j.compbiomed.2022.105813] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2022] [Revised: 06/05/2022] [Accepted: 07/03/2022] [Indexed: 11/03/2022]
Abstract
Patients and surgeons are usually exposed to substantial ionizing radiation during fluoroscopy-based navigated orthopedic surgery. By comparison, ultrasound-assisted orthopedic surgery can not only decrease the radiation risk but also provide rich navigation information. However, due to artifacts in ultrasound images, extracting bone structure from ultrasound sequences can be a particularly difficult task, which leads to major challenges in ultrasound-assisted orthopedic navigation. In this paper, we propose an annotation-guided encoder-decoder network (AGN) to extract bone structure from radiation-free ultrasound sequences. Specifically, variability in the ultrasound probe's pose changes the ultrasound frame during acquisition of the sequences; therefore, a feature alignment module deployed in the AGN model is used to achieve reliable matching across ultrasound frames. Moreover, inspired by interactive ultrasound analysis, where user-annotated foreground information can help target extraction, our AGN model incorporates annotation information obtained by Siamese networks. Experimental results validated that the AGN model not only produced better bone surface extraction than state-of-the-art methods (IoU: 0.92 vs. 0.88), but also achieved almost real-time extraction at a speed of about 15 frames per second. In addition, the acquired bone surface further provides a radiation-free 3D intraoperative bone structure for intuitive navigation in orthopedic surgery.
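The IoU figure quoted above (0.92 vs. 0.88) is the intersection-over-union between a predicted segmentation mask and the ground truth. A minimal illustration of the metric itself (our sketch, not the paper's code):

```python
def iou(mask_a, mask_b):
    """Intersection-over-union of two binary masks given as 2D lists of 0/1."""
    inter = union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            inter += 1 if (a and b) else 0
            union += 1 if (a or b) else 0
    # Convention: two all-empty masks are identical, so IoU is 1.
    return inter / union if union else 1.0
```

IoU is 1.0 for identical masks and falls toward 0 as the overlap shrinks relative to the union.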
Affiliation(s)
- Fang Chen
- Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, MIIT Key Laboratory of Pattern Analysis and Machine Intelligence, China.
- Peng Xu
- Children's Hospital of Nanjing Medical University, Nanjing, 21106, China.
- Yanting Xie
- Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, MIIT Key Laboratory of Pattern Analysis and Machine Intelligence, China
- Daoqiang Zhang
- Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, MIIT Key Laboratory of Pattern Analysis and Machine Intelligence, China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, China
- Zhe Zhao
- Department of Orthopaedics, Beijing Tsinghua Changgung Hospital, Tsinghua University, China
23
Shao L, Yang S, Fu T, Lin Y, Geng H, Ai D, Fan J, Song H, Zhang T, Yang J. Augmented reality calibration using feature triangulation iteration-based registration for surgical navigation. Comput Biol Med 2022; 148:105826. [PMID: 35810696 DOI: 10.1016/j.compbiomed.2022.105826] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2022] [Revised: 06/24/2022] [Accepted: 07/03/2022] [Indexed: 11/03/2022]
Abstract
BACKGROUND Marker-based augmented reality (AR) calibration methods for surgical navigation often require a second computed tomography scan of the patient, and their clinical application is limited by high manufacturing costs and low accuracy. METHODS This work introduces a novel AR calibration framework that combines a Microsoft HoloLens device with a single-camera registration module for surgical navigation. In this framework, a camera gathers multi-view images of the patient for reconstruction. A shape-feature-matching-based search method is proposed to adjust the size of the reconstructed model. A double-clustering-based 3D point cloud segmentation method and a 3D line segment detection method are also proposed to extract the corner points of the image marker, which serve as its registration data. A feature triangulation iteration-based registration method is proposed to quickly and accurately calibrate the pose relationship between the image marker and the patient in virtual and real space. The registered patient model is wirelessly transmitted to the HoloLens device to display the AR scene. RESULTS The proposed approach was used in accuracy verification experiments on phantoms and volunteers and compared with six advanced AR calibration methods. The proposed method obtained average fusion errors of 0.70 ± 0.16 mm and 0.91 ± 0.13 mm in the phantom and volunteer experiments, respectively, the highest fusion accuracy among all compared methods. A volunteer liver puncture clinical simulation experiment was also conducted to show clinical feasibility. CONCLUSIONS Our experiments proved the effectiveness of the proposed AR calibration method and revealed considerable potential for improving surgical performance.
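At the core of any such calibration is estimating the rigid transform that maps the marker's corner points onto their counterparts in patient space. The paper's feature triangulation iteration method is not reproduced here; as a hedged stand-in, the closed-form 2D least-squares rigid fit below illustrates the underlying point-based registration idea (all names are ours):

```python
import math

def rigid_fit_2d(src, dst):
    """Least-squares rigid (rotation + translation) fit of 2D point sets.

    src and dst are equal-length lists of corresponding (x, y) pairs.
    Returns (theta, tx, ty) such that rotating src by theta and then
    translating by (tx, ty) best maps it onto dst in the least-squares sense.
    """
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    sxx = sxy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x, y, u, v = x - csx, y - csy, u - cdx, v - cdy
        sxx += x * u + y * v        # "dot" accumulator
        sxy += x * v - y * u        # "cross" accumulator
    theta = math.atan2(sxy, sxx)
    # Translation carries the rotated source centroid onto the target centroid.
    tx = cdx - (csx * math.cos(theta) - csy * math.sin(theta))
    ty = cdy - (csx * math.sin(theta) + csy * math.cos(theta))
    return theta, tx, ty
```

The full 3D problem is solved analogously (e.g., via SVD of the cross-covariance matrix), but the 2D case shows the centroid-and-rotation structure in a few lines.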
Affiliation(s)
- Long Shao
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing, 100081, China
- Shuo Yang
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing, 100081, China
- Tianyu Fu
- School of Medical Technology, Beijing Institute of Technology, Beijing, 100081, China.
- Yucong Lin
- School of Medical Technology, Beijing Institute of Technology, Beijing, 100081, China.
- Haixiao Geng
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing, 100081, China
- Danni Ai
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing, 100081, China
- Jingfan Fan
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing, 100081, China
- Hong Song
- School of Computer Science & Technology, Beijing Institute of Technology, Beijing, 100081, China
- Tao Zhang
- Peking Union Medical College Hospital, Department of Oral and Maxillofacial Surgery, Beijing, 100730, China
- Jian Yang
- Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing, 100081, China
24
Dolega-Dolegowski D, Proniewska K, Dolega-Dolegowska M, Pregowska A, Hajto-Bryk J, Trojak M, Chmiel J, Walecki P, Fudalej PS. Application of holography and augmented reality based technology to visualize the internal structure of the dental root - a proof of concept. Head Face Med 2022; 18:12. [PMID: 35382839 PMCID: PMC8981712 DOI: 10.1186/s13005-022-00307-4] [Citation(s) in RCA: 7] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2021] [Accepted: 01/18/2022] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Augmented Reality (AR) blends digital information with the real world. Thanks to cameras, sensors, and displays, it can supplement the physical world with holographic images. Nowadays, applications of AR range from navigated surgery to vehicle navigation. DEVELOPMENT The purpose of this feasibility study was to develop an AR holographic system implementing Vertucci's classification of dental root morphology to facilitate the study of tooth anatomy. It was tailored to run on AR HoloLens 2 (Microsoft) glasses. The 3D tooth models were created in Autodesk Maya and exported to Unity software. The holograms of dental roots can be projected in the natural setting of a dental office. The application displays 3D objects in such a way that they can be rotated, zoomed in and out, and penetrated. The advantage of the proposed approach is that students can learn the 3D internal anatomy of teeth without environmental visual restrictions. CONCLUSIONS It is feasible to visualize internal dental root anatomy with an AR holographic system, and AR holograms appear to be an attractive adjunct for learning root anatomy.
Affiliation(s)
- Agnieszka Pregowska
- Institute of Fundamental Technological Research, Polish Academy of Sciences, Warsaw, Poland
- Jakub Chmiel
- Department of Cardiac and Vascular Diseases, Jagiellonian University Medical College, Krakow, Poland
- Piotr Walecki
- Jagiellonian University Medical College, Kraków, Poland
- Piotr S Fudalej
- Institute of Dentistry and Oral Sciences, Faculty of Medicine and Dentistry, Palacký University Olomouc, Olomouc, Czech Republic; Institute of Dentistry, Jagiellonian University Medical College, Kraków, Poland; Department of Orthodontics and Dentofacial Orthopedics, University of Bern, Bern, Switzerland
25
Augmented reality navigation with real-time tracking for facial repair surgery. Int J Comput Assist Radiol Surg 2022; 17:981-991. [PMID: 35286586 DOI: 10.1007/s11548-022-02589-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/23/2021] [Accepted: 02/26/2022] [Indexed: 11/05/2022]
Abstract
PURPOSE Facial repair surgeries (FRS) require accuracy to navigate the critical anatomy safely and quickly. The purpose of this paper is to develop a method that directly tracks the patient's position using video data acquired from a single camera, achieving noninvasive, real-time, high-accuracy positioning in FRS. METHODS Our method first performs camera calibration and registers the surface segmented from computed tomography to the patient. Then, a two-step constraint algorithm, comprising a feature local constraint and a distance standard deviation constraint, is used to quickly find the optimal feature matching pairs. Finally, the movements of the camera and the patient, decomposed from the image motion matrix, are used to track the camera and the patient, respectively. RESULTS The proposed method achieved RMS fusion errors of 1.44 ± 0.35, 1.50 ± 0.15, and 1.63 ± 0.03 mm in skull phantom, cadaver mandible, and human experiments, respectively; these errors were lower than those of an optical tracking system-based method. Additionally, the proposed method can process video streams at up to 24 frames per second, meeting the real-time requirements of FRS. CONCLUSIONS The proposed method does not rely on tracking markers attached to the patient; it can be executed automatically to maintain a correct augmented reality scene and overcome the loss of positioning accuracy caused by patient movement during surgery.
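The distance standard deviation constraint in the two-step algorithm amounts to rejecting candidate feature matches whose geometry disagrees with the consensus motion. A loose, illustrative Python sketch of that idea (our simplification with invented names, not the authors' algorithm):

```python
import math
import statistics

def filter_matches_by_distance_std(matches, k=1.5):
    """Consensus filter for candidate feature matches.

    matches is a list of ((x1, y1), (x2, y2)) point pairs between two frames.
    Keeps only pairs whose displacement length lies within k standard
    deviations of the mean displacement; matches implying an outlier motion
    are discarded.
    """
    lengths = [math.hypot(b[0] - a[0], b[1] - a[1]) for a, b in matches]
    mu = statistics.fmean(lengths)
    sd = statistics.pstdev(lengths)
    if sd == 0.0:                      # all displacements identical: keep all
        return list(matches)
    return [m for m, L in zip(matches, lengths) if abs(L - mu) <= k * sd]
```

With four matches of unit displacement and one of length 10, the outlier falls outside 1.5 standard deviations and is dropped.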
26
Accuracy of dental implant placement using augmented reality-based navigation, static computer assisted implant surgery, and the free-hand method: An in vitro study. J Dent 2022; 119:104070. [DOI: 10.1016/j.jdent.2022.104070] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2021] [Revised: 02/03/2022] [Accepted: 02/16/2022] [Indexed: 12/17/2022] Open
27
García-Sevilla M, Moreta-Martinez R, García-Mato D, Arenas de Frutos G, Ochandiano S, Navarro-Cuéllar C, Sanjuán de Moreta G, Pascau J. Surgical Navigation, Augmented Reality, and 3D Printing for Hard Palate Adenoid Cystic Carcinoma En-Bloc Resection: Case Report and Literature Review. Front Oncol 2022; 11:741191. [PMID: 35059309 PMCID: PMC8763795 DOI: 10.3389/fonc.2021.741191] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2021] [Accepted: 11/26/2021] [Indexed: 12/18/2022] Open
Abstract
Adenoid Cystic Carcinoma is a rare and aggressive tumor representing less than 1% of head and neck cancers. This malignancy often arises from the minor salivary glands, with the palate as its most common location. Surgical en-bloc resection with clear margins is the primary treatment. However, this location presents a limited line of sight and a high risk of injury, making the surgical procedure challenging. In this context, technologies such as intraoperative navigation can become an effective tool, reducing morbidity and improving the safety and accuracy of the procedure. Although their use is well established in fields such as neurosurgery, their application in maxillofacial surgery has not been widely reported. One reason is the need to rigidly fixate a navigation reference to the patient, which often entails an invasive setup. In this work, we studied three alternative and less invasive setups using optical tracking, 3D printing, and augmented reality. We evaluated their precision in a patient-specific phantom, obtaining errors below 1 mm. The optimal setup was then applied in a clinical case, where the navigation software was used to guide the tumor resection. Points were collected along the surgical margins after resection and compared with the real ones identified in the postoperative CT; distances of less than 2 mm were obtained in 90% of the samples. Moreover, the navigation provided confidence to the surgeons, who could then undertake a less invasive and more conservative approach. The postoperative CT scans showed adequate resection margins and confirmed that the patient remains free of disease after two years of follow-up.
Affiliation(s)
- Mónica García-Sevilla
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Rafael Moreta-Martinez
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- David García-Mato
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
- Gema Arenas de Frutos
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Santiago Ochandiano
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Carlos Navarro-Cuéllar
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Guillermo Sanjuán de Moreta
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain; Servicio de Otorrinolaringología, Hospital General Universitario Gregorio Marañón, Madrid, Spain
- Javier Pascau
- Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Madrid, Spain; Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
28
Riad Deglow E, Toledano Gil S, Zubizarreta-Macho Á, Bufalá Pérez M, Rodríguez Torres P, Tzironi G, Albaladejo Martínez A, López Román A, Hernández Montero S. Influence of the Computer-Aided Static Navigation Technique and Mixed Reality Technology on the Accuracy of the Orthodontic Micro-Screws Placement. An In Vitro Study. J Pers Med 2021; 11:jpm11100964. [PMID: 34683105 PMCID: PMC8539767 DOI: 10.3390/jpm11100964] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2021] [Revised: 09/22/2021] [Accepted: 09/24/2021] [Indexed: 12/03/2022] Open
Abstract
To analyze the effect of a computer-aided static navigation technique and mixed reality technology on the accuracy of orthodontic micro-screw placement. Material and methods: Two hundred and seven orthodontic micro-screws were placed using either a computer-aided static navigation technique (NAV), a mixed reality device (MR), or a conventional freehand technique (FHT). Accuracy across different dental sectors was also analyzed. CBCT and intraoral scans were taken both prior to and following orthodontic micro-screw placement. The deviation angle and horizontal deviation were then analyzed; these measurements were taken at the coronal entry point and apical endpoint between the planned and placed orthodontic micro-screws. In addition, any complications resulting from micro-screw placement, such as root perforations, were analyzed across all dental sectors. Results: The statistical analysis showed significant differences between the study groups at the coronal entry point (p < 0.001). At the apical endpoint, the NAV group differed significantly from the FHT (p < 0.001) and MR (p < 0.001) groups, and the angular deviations of the FHT group differed significantly from those of the NAV (p < 0.001) and MR (p = 0.0011) groups. Dental sectors also differed significantly (p < 0.001). Additionally, twelve root perforations were observed in the FHT group, while there were none in the NAV group. Conclusions: The computer-aided static navigation technique enables more accurate orthodontic micro-screw placement and fewer intraoperative complications compared with mixed reality technology and the conventional freehand technique.
Affiliation(s)
- Elena Riad Deglow
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
- Sergio Toledano Gil
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
- Álvaro Zubizarreta-Macho
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
- Department of Orthodontics, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- María Bufalá Pérez
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
- Paulina Rodríguez Torres
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
- Georgia Tzironi
- Department of Orthodontics, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Alberto Albaladejo Martínez
- Department of Orthodontics, Faculty of Medicine and Dentistry, University of Salamanca, 37008 Salamanca, Spain
- Antonio López Román
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
- Sofía Hernández Montero
- Department of Implant Surgery, Faculty of Health Sciences, Alfonso X el Sabio University, 28691 Madrid, Spain
29
Registration-free workflow for electromagnetic and optical navigation in orbital and craniofacial surgery. Sci Rep 2021; 11:18080. [PMID: 34508161 PMCID: PMC8433137 DOI: 10.1038/s41598-021-97706-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2020] [Accepted: 08/13/2021] [Indexed: 11/25/2022] Open
Abstract
The accuracy of intra-operative navigation is largely dependent on the intra-operative registration procedure. Next to accuracy, important factors to consider for the registration procedure are invasiveness, time consumption, logistical demands, user-dependency, compatibility and radiation exposure. In this study, a workflow is presented that eliminates the need for a registration procedure altogether: registration-free navigation. In the workflow, the maxillary dental model is fused to the pre-operative imaging data using commercially available virtual planning software. A virtual Dynamic Reference Frame on a splint is designed on the patient’s fused maxillary dentition: during surgery, the splint containing the reference frame is positioned on the patient’s dentition. This alleviates the need for any registration procedure, since the position of the reference frame is known from the design. The accuracy of the workflow was evaluated in a cadaver set-up, and compared to bone-anchored fiducial, virtual splint and surface-based registration. The results showed that accuracy of the workflow was greatly dependent on tracking technique used: the workflow was the most accurate with electromagnetic tracking, but the least accurate with optical tracking. Although this method offers a time-efficient, non-invasive, radiation-free automatic alternative for registration, clinical implementation is hampered by the unexplained differences in accuracy between tracking techniques.
30
Huang T, Li R, Li Y, Zhang X, Liao H. Augmented reality-based autostereoscopic surgical visualization system for telesurgery. Int J Comput Assist Radiol Surg 2021; 16:1985-1997. [PMID: 34363583 DOI: 10.1007/s11548-021-02463-5] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2021] [Accepted: 07/15/2021] [Indexed: 10/20/2022]
Abstract
PURPOSE The visualization of remote surgical scenes is key to realizing remote operation of surgical robots. However, current non-endoscopic surgical robot systems lack an effective visualization tool offering sufficient surgical scene information and depth perception. METHODS We propose a novel autostereoscopic surgical visualization system integrating 3D intraoperative scene reconstruction, autostereoscopic 3D display, and augmented reality-based image fusion. The preoperative organ structure and the intraoperative surface point cloud are obtained from medical imaging and an RGB-D camera, respectively, and aligned by an automatic marker-free intraoperative registration algorithm. After registration, preoperative meshes with precalculated illumination and the intraoperative textured point cloud are blended in real time. Finally, the fused image is shown on a 3D autostereoscopic display device to achieve depth perception. RESULTS A prototype of the autostereoscopic surgical visualization system was built. The system had a horizontal image resolution of 1.31 mm, a vertical image resolution of 0.82 mm, an average rendering rate of 33.1 FPS, an average registration rate of 20.5 FPS, and average registration errors of approximately 3 mm. A telesurgical robot prototype based on the 3D autostereoscopic display was also built. Quantitative evaluation experiments showed that our system achieved operational accuracy (1.79 ± 0.87 mm) similar to the conventional system (1.95 ± 0.71 mm), while offering advantages in completion time (34.11% reduction) and path length (35.87% reduction). Post-experimental questionnaires indicated that the system was user-friendly for novices and experts. CONCLUSION We propose a 3D surgical visualization system with augmented instruction and depth perception for telesurgery. The qualitative and quantitative evaluation results illustrate the accuracy and efficiency of the proposed system; it therefore shows great promise in robotic surgery and telesurgery.
Affiliation(s)
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Ruiyang Li
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Yangxi Li
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Xinran Zhang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China.
31
de Sales MC, Florez RM, da Silva Guimaraes J, da Silva Salomão GV, Tedesco TK, Allegrini S. Guided Surgery with 3D Printed Device: A Case Report. J ORAL IMPLANTOL 2021; 47:325-332. [PMID: 32835368 DOI: 10.1563/aaid-joi-d-19-00278] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
Dental surgeons need in-depth knowledge of the bone tissue status and gingival morphology of atrophic maxillae. The aim of this study is to describe preoperative virtual planning of placement of 5 implants and to compare the plan with the actual surgical results. Three-dimensional (3D) planning of rehabilitation using software programs enables surgical guides to be specially designed for the implant site and manufactured using 3D printing. A patient with 5 teeth missing was selected for this study. The patient's maxillary region was scanned with cone-beam computed tomography (CBCT), and a cast model was produced. After virtual planning using ImplantViewer, 5 implants were placed using a printed surgical guide. Two weeks after the surgical procedure, the patient underwent another CBCT scan of the maxilla. Statistically significant differences were detected between the virtually planned positions and the actual positions of the implants, with a mean deviation of 0.36 mm in the cervical region and 0.7 mm in the apical region. The surgical technique used enables more accurate procedures compared with the conventional technique. Implants can be better positioned, with a high level of predictability, reducing both operating time and patient discomfort.
Affiliation(s)
- Rafael Maluza Florez
- Departments of Oral Surgery and Prosthodontics, Santa Cecilia University, Santos, SP, Brazil
- Tamara Kerber Tedesco
- Division of Master and Doctorate in Dentistry, Program of Scientific Dentistry, Ibirapuera University, São Paulo, SP, Brazil
- Sergio Allegrini
- Division of Master and Doctorate in Dentistry, Program of Scientific Dentistry, Ibirapuera University, São Paulo, SP, Brazil
32
Virtual Reality (VR) Simulation and Augmented Reality (AR) Navigation in Orthognathic Surgery: A Case Report. APPLIED SCIENCES-BASEL 2021. [DOI: 10.3390/app11125673] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
VR and AR technology have gradually developed to the point where they can assist operators in the surgical field. In this study, we present a case of VR simulation for preoperative planning and AR navigation applied to orthognathic surgery. The average difference between the preplanned data and the postoperative results was 3.00 mm, with a standard deviation of 1.44 mm. VR simulation can provide great advantages for 3D medical simulation, with accurate manipulation and immersiveness. AR navigation has great potential in medical applications; its advantages include displaying real-time augmented 3D models of patients. Moreover, it is easily applied in the surgical field without complicated 3D simulations or 3D-printed surgical guides.
33
Zhou Z, Yang Z, Jiang S, Ma X, Zhang F, Yan H. Surgical Navigation System for Low-Dose-Rate Brachytherapy Based on Mixed Reality. IEEE COMPUTER GRAPHICS AND APPLICATIONS 2021; 41:113-123. [PMID: 31902757 DOI: 10.1109/mcg.2019.2963657] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
This article presents a personalized mixed reality (MR) surgical assistance system for brachytherapy. Using a novel, modified multi-information fusion method, the system fuses virtual organs and the preoperative plan with the actual patient and tracks the surgical tools in real time. Using the quaternion-based iterative closest point (QICP) algorithm and a hand-eye calibration method, the preoperative plan can be fused onto individual patients. With the electromagnetic (EM) tracker, users can track the surgical tools in real time without multiple CT scans, and doctors can perform the surgery immediately. We performed a series of experiments, including phantom and animal experiments, to test the accuracy and efficiency of the system. In the phantom experiment, the average needle location error was 0.957 mm; in the animal experiments, the needle insertion error was 2.416 mm. All experimental results indicated that the procedure could be applied in further clinical studies.
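Quaternion-based ICP alternates between pairing each point with its nearest neighbor in the target cloud and solving for the rigid transform that minimizes the pairing error. The quaternion solve itself is omitted here; the correspondence-and-error step can be sketched as follows (illustrative, with our own names):

```python
import math

def nearest_correspondences(source, target):
    """One correspondence step of ICP.

    Pairs every source point with its nearest target point (brute force)
    and reports the RMS pairing error. A full QICP iteration would next
    solve for the rigid transform minimizing this error, apply it, and
    re-pair until convergence.
    """
    pairs, sq_sum = [], 0.0
    for p in source:
        best = min(target,
                   key=lambda q: sum((pi - qi) ** 2 for pi, qi in zip(p, q)))
        pairs.append((p, best))
        sq_sum += sum((pi - qi) ** 2 for pi, qi in zip(p, best))
    return pairs, math.sqrt(sq_sum / len(source))
```

Brute-force pairing is O(|source| x |target|); practical systems replace the inner `min` with a k-d tree lookup.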
34
Chen F, Cui X, Han B, Liu J, Zhang X, Liao H. Augmented reality navigation for minimally invasive knee surgery using enhanced arthroscopy. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE 2021; 201:105952. [PMID: 33561710 DOI: 10.1016/j.cmpb.2021.105952] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/21/2020] [Accepted: 01/21/2021] [Indexed: 06/12/2023]
Abstract
PURPOSE During minimally invasive knee surgery, surgeons insert surgical instruments and an arthroscope through small incisions and perform treatment guided by 2D arthroscopic images. However, this 2D arthroscopic navigation faces several problems. Firstly, the guidance information is displayed on a screen away from the surgical area, which makes hand-eye coordination difficult. Secondly, the small incision limits the surgeons to viewing the internal knee structures only through the arthroscopic camera. In addition, arthroscopic images commonly suffer from unclear visualization. METHODS To solve these problems, we proposed a novel in-situ augmented reality navigation system with enhanced arthroscopic information. Firstly, intraoperative anatomical locations were obtained using arthroscopic images and arthroscopy calibration. Secondly, a tissue-properties-based model deformation method was proposed to update the 3D preoperative knee model with the anatomical location information. The updated model was then rendered with a glasses-free real 3D display to achieve a global in-situ augmented reality view. In addition, virtual arthroscopic images were generated from the updated preoperative model to provide anatomical information about the operation area. RESULTS Experimental results demonstrated that the virtual arthroscopic images could reflect the correct structural information with a mean error of 0.32 mm. Compared with 2D arthroscopic navigation, the proposed augmented reality navigation reduced targeting errors by 2.10 mm and 2.70 mm in the knee phantom and in-vitro swine knee experiments, respectively. CONCLUSION Our navigation method is helpful for minimally invasive knee surgery, since it provides both global in-situ information and detailed anatomical information.
Affiliation(s)
- Fang Chen
- Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, MIIT Key Laboratory of Pattern Analysis and Machine Intelligence, Nanjing, China
- Xiwen Cui
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Boxuan Han
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Jia Liu
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Xinran Zhang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, China
35
Maharjan N, Alsadoon A, Prasad PWC, Abdullah S, Rashid TA. A novel visualization system of using augmented reality in knee replacement surgery: Enhanced bidirectional maximum correntropy algorithm. Int J Med Robot 2021; 17:e2223. [PMID: 33421286 DOI: 10.1002/rcs.2223] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/24/2019] [Revised: 08/18/2020] [Accepted: 08/19/2020] [Indexed: 02/06/2023]
Abstract
BACKGROUND AND AIM Image registration and alignment are the main limitations of augmented reality (AR)-based knee replacement surgery. This research aims to decrease the registration error, eliminate outcomes trapped in local minima to improve alignment, handle occlusion and maximize the overlapping parts. METHODOLOGY A markerless image registration method was used for AR-based knee replacement surgery to guide and visualize the surgical operation, while a weighted least squares algorithm was used to enhance stereo camera-based tracking by filling border occlusion in the right-to-left direction and non-border occlusion in the left-to-right direction. RESULTS This study improved video accuracy to an alignment error of 0.57-0.61 mm. Furthermore, with the use of bidirectional (forward and backward) point clouds, the number of iterations in image registration was decreased, which also improved the processing time: video frames were processed at 7.4-11.74 frames per second. CONCLUSIONS The proposed system focuses on overcoming the misalignment caused by patient movement and on enhancing AR visualization during knee replacement surgery. It was reliable and favourable, helping to eliminate alignment error by ascertaining the optimal rigid transformation between two point clouds and removing outliers and non-Gaussian noise. The proposed AR system supports accurate visualization and navigation of knee anatomy such as the femur, tibia, cartilage and blood vessels.
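The abstract does not spell out the enhanced bidirectional maximum correntropy algorithm, but its core idea is Gaussian-kernel weighting of registration residuals, so that outliers and non-Gaussian noise contribute almost nothing, unlike in least squares. A minimal sketch of that weighting step (the function name, kernel width, and toy correspondences are illustrative, not from the paper):

```python
import numpy as np

def correntropy_weights(src, dst, sigma=1.0):
    """Per-correspondence weights under the maximum correntropy criterion.

    Applies a Gaussian kernel to residual distances between matched points:
    inliers get weight ~1, gross outliers are smoothly suppressed. In a full
    registration loop these weights would re-weight each correspondence when
    estimating the rigid transformation.
    """
    residuals = np.linalg.norm(src - dst, axis=1)
    return np.exp(-residuals**2 / (2.0 * sigma**2))

# Three matched 2D points: two good correspondences and one gross outlier.
src = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
dst = np.array([[0.05, 0.0], [1.0, 0.05], [9.0, 0.0]])  # last pair is an outlier

w = correntropy_weights(src, dst, sigma=0.5)
print(w.round(3))   # the outlier's weight collapses toward 0
```

A least-squares estimator would let the outlier pair dominate the fitted transformation; the correntropy weights above effectively remove it.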
Affiliation(s)
- Nitish Maharjan
- School of Computing and Mathematics, Charles Sturt University (CSU), Sydney Campus, Australia
- Abeer Alsadoon
- School of Computing and Mathematics, Charles Sturt University (CSU), Sydney Campus, Australia; School of Computer Data and Mathematical Sciences, University of Western Sydney (UWS), Sydney, Australia; School of Information Technology, Southern Cross University (SCU), Sydney, Australia; Asia Pacific International College (APIC), Information Technology Department, Sydney, Australia; Kent Institute Australia, Sydney, Australia
- P W C Prasad
- School of Computing and Mathematics, Charles Sturt University (CSU), Sydney Campus, Australia
- Salma Abdullah
- Department of Computer Engineering, University of Technology, Baghdad, Iraq
- Tarik A Rashid
- Asia Pacific International College (APIC), Information Technology Department, Sydney, Australia
36
Liu J, Li X, Shen S, Jiang X, Chen W, Li Z. Research on Panoramic Stitching Algorithm of Lateral Cranial Sequence Images in Dental Multifunctional Cone Beam Computed Tomography. SENSORS 2021; 21:s21062200. [PMID: 33801108 PMCID: PMC8004189 DOI: 10.3390/s21062200] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 02/23/2021] [Revised: 03/14/2021] [Accepted: 03/19/2021] [Indexed: 11/16/2022]
Abstract
In the design of dental multifunctional Cone Beam Computed Tomography, a linear scanning strategy not only saves equipment cost but also avoids the need to reposition the patient when acquiring lateral cranial sequence images. To obtain panoramic images, we propose a locally normalized cross-correlation stitching algorithm based on a Gaussian Mixture Model. Firstly, the Block-Matching and 3D filtering algorithm is used to remove quantum and impulse noise according to the characteristics of X-ray images. Then, segmentation of irrelevant regions and extraction of the region of interest are performed with a Gaussian Mixture Model. Next, the locally normalized cross-correlation is used to complete the registration, with a multi-resolution strategy based on the wavelet transform and a Particle Swarm Optimization algorithm. Finally, image fusion is achieved by a weighted smoothing fusion algorithm. The experimental results show that the panoramic images obtained by this method perform well in both subjective visual assessment and objective quality evaluation, and the method can be applied to preoperative diagnosis of clinical dental deformities and postoperative outcome evaluation.
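The normalized cross-correlation score underlying the registration step above can be sketched for a single patch pair as follows. This is a minimal version for illustration: the paper applies it locally within a wavelet multi-resolution search driven by Particle Swarm Optimization, all of which is omitted here:

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized image patches.

    Returns a value in [-1, 1]; 1 indicates a perfect match up to an affine
    intensity change, which makes the score robust to the brightness and
    contrast differences that occur between neighbouring X-ray exposures.
    """
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a**2).sum() * (b**2).sum())
    return float((a * b).sum() / denom)

rng = np.random.default_rng(0)
tile = rng.random((32, 32))

print(ncc(tile, tile))               # identical patches -> 1.0
print(ncc(tile, 0.5 * tile + 0.2))   # brightness/contrast change -> still 1.0
```

In a stitching pipeline, this score would be evaluated over candidate offsets between overlapping strips, and the offset maximizing the (locally computed) NCC would be selected.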
Affiliation(s)
- Junyuan Liu
- Medical Electronics and Information Technology Engineering Research Center, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; (J.L.); (S.S.); (X.J.); (W.C.)
- Xi Li
- Foundation Department, Chongqing Medical and Pharmaceutical College, Chongqing 401331, China
- School of Communication and Information Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
- Siwan Shen
- Medical Electronics and Information Technology Engineering Research Center, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; (J.L.); (S.S.); (X.J.); (W.C.)
- Xiaoming Jiang
- Medical Electronics and Information Technology Engineering Research Center, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; (J.L.); (S.S.); (X.J.); (W.C.)
- Wang Chen
- Medical Electronics and Information Technology Engineering Research Center, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; (J.L.); (S.S.); (X.J.); (W.C.)
- Zhangyong Li
- Medical Electronics and Information Technology Engineering Research Center, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; (J.L.); (S.S.); (X.J.); (W.C.)
- Correspondence:
37
Jorba-García A, González-Barnadas A, Camps-Font O, Figueiredo R, Valmaseda-Castellón E. Accuracy assessment of dynamic computer-aided implant placement: a systematic review and meta-analysis. Clin Oral Investig 2021; 25:2479-2494. [PMID: 33635397 DOI: 10.1007/s00784-021-03833-8] [Citation(s) in RCA: 53] [Impact Index Per Article: 17.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/06/2020] [Accepted: 02/05/2021] [Indexed: 12/18/2022]
Abstract
OBJECTIVES To assess the accuracy of dynamic computer-aided implant surgery (dCAIS) systems when used to place dental implants and to compare their accuracy with static computer-aided implant surgery (sCAIS) systems and freehand implant placement. MATERIALS AND METHODS An electronic search was made to identify all relevant studies reporting on the accuracy of dCAIS systems for dental implant placement. The following PICO question was developed: "In patients or artificial models, is dental implant placement accuracy higher when dCAIS systems are used in comparison with sCAIS systems or with freehand placement?" The main outcome variable was the angular deviation between the central axes of the planned and final implant positions. The data were extracted in descriptive tables, and a meta-analysis of single means was performed to estimate the deviations for each variable using a random-effects model. RESULTS Out of 904 potential articles, the 24 selected studies assessed 9 different dynamic navigation systems. The mean angular and entry 3D global deviations for clinical studies were 3.68° (95% CI: 3.61 to 3.74; I2 = 99.4%) and 1.03 mm (95% CI: 1.01 to 1.04; I2 = 82.4%), respectively. Lower deviation values were reported in in vitro studies (mean angular deviation of 2.01° (95% CI: 1.95 to 2.07; I2 = 99.1%) and mean entry 3D global deviation of 0.46 mm (95% CI: 0.44 to 0.48; I2 = 98.5%)). No significant differences were found between the different dCAIS systems. These systems were significantly more accurate than sCAIS systems (mean difference (MD): -0.86°; 95% CI: -1.35 to -0.36) and freehand implant placement (MD: -4.33°; 95% CI: -5.40 to -3.25). CONCLUSION dCAIS systems allow highly accurate implant placement, with a mean angular deviation of less than 4°. However, a 2-mm safety margin should be applied, since deviations of more than 1 mm were observed. dCAIS systems increase implant placement accuracy compared with freehand placement and also seem to slightly decrease the angular deviation in comparison with sCAIS systems. CLINICAL RELEVANCE The use of dCAIS could reduce the rate of complications, since it allows highly accurate implant placement.
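The "meta-analysis of single means ... using a random-effects model" described above can be sketched with the standard DerSimonian-Laird approach. The study values below are hypothetical, chosen only to resemble the angular-deviation scale reported; this is not the authors' code or data:

```python
import math

def pool_random_effects(means, ses):
    """DerSimonian-Laird random-effects pooling of single means.

    means: per-study estimates; ses: their standard errors.
    Returns (pooled mean, 95% CI half-width).
    """
    w = [1.0 / se**2 for se in ses]                      # fixed-effect weights
    mu_fe = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    q = sum(wi * (m - mu_fe)**2 for wi, m in zip(w, means))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(means) - 1)) / c)          # between-study variance
    w_re = [1.0 / (se**2 + tau2) for se in ses]          # random-effects weights
    mu = sum(wi * m for wi, m in zip(w_re, means)) / sum(w_re)
    half = 1.96 / math.sqrt(sum(w_re))
    return mu, half

# Hypothetical angular deviations (degrees) from three studies
mu, half = pool_random_effects([3.5, 3.9, 3.6], [0.10, 0.15, 0.12])
print(f"pooled: {mu:.2f} +/- {half:.2f} degrees")
```

When the studies disagree more than their standard errors predict (Q above its degrees of freedom), tau² grows and the confidence interval widens accordingly, which is what distinguishes this from a fixed-effect pooling.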
Affiliation(s)
- Adrià Jorba-García
- Faculty of Medicine and Health Sciences, University of Barcelona, Barcelona, Spain
- Albert González-Barnadas
- Faculty of Medicine and Health Sciences, University of Barcelona, Barcelona, Spain; IDIBELL Institute, Barcelona, Spain
- Octavi Camps-Font
- Faculty of Medicine and Health Sciences, University of Barcelona, Barcelona, Spain; IDIBELL Institute, Barcelona, Spain
- Rui Figueiredo
- Faculty of Medicine and Health Sciences, University of Barcelona, Barcelona, Spain; IDIBELL Institute, Barcelona, Spain; Facultat de Medicina i Ciències de la Salut, Campus de Bellvitge, Universitat de Barcelona (UB), Pavelló de Govern, 2a Planta, Despatx 2.9, C/Feixa Llarga s/n, E-08907 L'Hospitalet de Llobregat, Barcelona, Spain
- Eduard Valmaseda-Castellón
- Faculty of Medicine and Health Sciences, University of Barcelona, Barcelona, Spain; IDIBELL Institute, Barcelona, Spain
38
Benmahdjoub M, van Walsum T, van Twisk P, Wolvius EB. Augmented reality in craniomaxillofacial surgery: added value and proposed recommendations through a systematic review of the literature. Int J Oral Maxillofac Surg 2020; 50:969-978. [PMID: 33339731 DOI: 10.1016/j.ijom.2020.11.015] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2019] [Revised: 11/11/2020] [Accepted: 11/18/2020] [Indexed: 10/22/2022]
Abstract
This systematic review provides an overview of augmented reality (AR) and its benefits in craniomaxillofacial surgery in an attempt to answer the question: Is AR beneficial for craniomaxillofacial surgery? This review includes a description of the studies conducted, the systems used and their technical characteristics. The search was performed in four databases: PubMed, Cochrane Library, Embase, and Web of Science. All journal articles published during the past 11 years related to AR, mixed reality, craniomaxillofacial, and surgery were considered in this study. From a total of 7067 articles identified using AR- and surgery-related keywords, 39 articles were finally selected. Based on these articles, a classification of study types, surgery types, devices used, metrics reported, and benefits were collected. The findings of this review indicate that AR could provide various benefits, addressing the challenges of conventional navigation systems, such as hand-eye coordination and depth perception. However, three main concerns were raised while performing this study: (1) it is complicated to aggregate the metrics reported in the articles, (2) it is difficult to obtain statistical value from the current studies, and (3) user evaluation studies are lacking. This article concludes with recommendations for future studies by addressing the latter points.
Affiliation(s)
- M Benmahdjoub
- Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands; Biomedical Imaging Group Rotterdam, Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- T van Walsum
- Biomedical Imaging Group Rotterdam, Department of Radiology and Nuclear Medicine, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- P van Twisk
- Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
- E B Wolvius
- Department of Oral and Maxillofacial Surgery, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands
39
Chen F, Cui X, Liu J, Han B, Zhang X, Zhang D, Liao H. Tissue Structure Updating for In Situ Augmented Reality Navigation Using Calibrated Ultrasound and Two-Level Surface Warping. IEEE Trans Biomed Eng 2020; 67:3211-3222. [PMID: 32175853 DOI: 10.1109/tbme.2020.2979535] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
Abstract
OBJECTIVE In minimally invasive surgery (MIS), in situ augmented reality (AR) navigation systems are usually implemented using a glasses-free 3D display to represent the preoperative tissue structure, and can provide intuitive see-through guidance information. However, due to intraoperative tissue changes, the preoperative tissue structure does not exactly correspond to reality, which influences the precision of in situ AR navigation. To solve this problem, we propose a method to update the tissue structure for in situ AR navigation in such a way as to reflect intraoperative tissue changes. METHODS The proposed tissue structure updating method is based on calibrated ultrasound and two-level surface warping technologies. Firstly, particle filter-based calibration is implemented to calibrate the ultrasound and obtain the intraoperative positions of anatomical points. Secondly, these intraoperative positions are input into the two-level surface warping method to update the preoperative tissue structure. Finally, a glasses-free real 3D display of the updated tissue structure is rendered and superimposed onto the patient by a translucent mirror for in situ AR navigation. RESULTS We validated the proposed method by simulating liver tissue intervention, and achieved a tissue updating accuracy of 92.86%. Furthermore, the targeting error of AR navigation based on the proposed method was evaluated in minimally invasive liver surgery, and the mean targeting error was 1.92 mm. CONCLUSION The results demonstrate that the proposed AR navigation method is effective. SIGNIFICANCE The proposed method can facilitate MIS, as it provides accurate 3D navigation.
40
Budhathoki S, Alsadoon A, Prasad P, Haddad S, Maag A. Augmented reality for narrow area navigation in jaw surgery: Modified tracking by detection volume subtraction algorithm. Int J Med Robot 2020; 16:e2097. [DOI: 10.1002/rcs.2097] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2019] [Revised: 02/19/2020] [Accepted: 02/20/2020] [Indexed: 12/27/2022]
Affiliation(s)
- Srijana Budhathoki
- School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia
- Department of Information Technology, Study Group Australia, Sydney Campus, Australia
- Abeer Alsadoon
- School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia
- Department of Information Technology, Study Group Australia, Sydney Campus, Australia
- P.W.C. Prasad
- School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia
- Department of Information Technology, Study Group Australia, Sydney Campus, Australia
- Sami Haddad
- Department of Oral and Maxillofacial Services, Greater Western Sydney Area Health Services, Sydney, Australia
- Department of Oral and Maxillofacial Services, Central Coast Area Health, Gosford, Australia
- Angelika Maag
- School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Australia
- Department of Information Technology, Study Group Australia, Sydney Campus, Australia
41
Pérez-Pachón L, Poyade M, Lowe T, Gröning F. Image Overlay Surgery Based on Augmented Reality: A Systematic Review. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2020; 1260:175-195. [PMID: 33211313 DOI: 10.1007/978-3-030-47483-6_10] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/18/2022]
Abstract
Augmented Reality (AR) applied to surgical guidance is gaining relevance in clinical practice. AR-based image overlay surgery (i.e. the accurate overlay of patient-specific virtual images onto the body surface) helps surgeons to transfer image data produced during the planning of the surgery (e.g. the correct resection margins of tissue flaps) to the operating room, thus increasing accuracy and reducing surgery times. We systematically reviewed 76 studies published between 2004 and August 2018 to explore which existing tracking and registration methods and technologies allow healthcare professionals and researchers to develop and implement these systems in-house. Most studies used non-invasive markers to automatically track a patient's position, as well as customised algorithms, tracking libraries or software development kits (SDKs) to compute the registration between patient-specific 3D models and the patient's body surface. Few studies combined the use of holographic headsets, SDKs and user-friendly game engines, and described portable and wearable systems that combine tracking, registration, hands-free navigation and direct visibility of the surgical site. Most accuracy tests included a low number of subjects and/or measurements and did not normally explore how these systems affect surgery times and success rates. We highlight the need for more procedure-specific experiments with a sufficient number of subjects and measurements and including data about surgical outcomes and patients' recovery. Validation of systems combining the use of holographic headsets, SDKs and game engines is especially interesting as this approach facilitates an easy development of mobile AR applications and thus the implementation of AR-based image overlay surgery in clinical practice.
Collapse
Affiliation(s)
- Laura Pérez-Pachón
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Matthieu Poyade
- School of Simulation and Visualisation, Glasgow School of Art, Glasgow, UK
- Terry Lowe
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Head and Neck Oncology Unit, Aberdeen Royal Infirmary (NHS Grampian), Aberdeen, UK
- Flora Gröning
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
42
Zhou Y, Yoo P, Feng Y, Sankar A, Sadr A, Seibel EJ. Towards AR-assisted visualisation and guidance for imaging of dental decay. Healthc Technol Lett 2019; 6:243-248. [PMID: 32038865 PMCID: PMC6952244 DOI: 10.1049/htl.2019.0082] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/19/2019] [Accepted: 10/02/2019] [Indexed: 12/27/2022] Open
Abstract
Untreated dental decay is the most prevalent dental problem in the world, affecting up to 2.4 billion people and leading to a significant economic and social burden. Early detection can greatly mitigate the irreversible effects of dental decay, avoiding the need for expensive restorative treatment that forever disrupts the protective enamel layer of teeth. However, two key challenges make early decay management difficult: unreliable detection and a lack of quantitative monitoring during treatment. New optically based imaging through the enamel provides the dentist with a safe means to detect, locate, and monitor the healing process. This work explores the use of an augmented reality (AR) headset to improve the workflow of early decay therapy and monitoring. The proposed workflow includes two novel AR-enabled features: (i) in situ visualisation of pre-operative optically based dental images and (ii) augmented guidance for repetitive imaging during therapy monitoring. The workflow is designed to minimise distraction, mitigate hand-eye coordination problems, and help guide monitoring of early decay during therapy in both clinical and mobile environments. The results from quantitative evaluations as well as a formative qualitative user study uncover the potential of the proposed system and indicate that AR can serve as a promising tool in tooth decay management.
Affiliation(s)
- Yaxuan Zhou
- Department of Electrical and Computer Engineering, University of Washington, Seattle, WA 98195, USA
- Human Photonics Lab, Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
- Paul Yoo
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
- Yingru Feng
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
- Aditya Sankar
- Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
- Alireza Sadr
- School of Dentistry, University of Washington, Seattle, WA 98195, USA
- Eric J. Seibel
- Human Photonics Lab, Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
43
Weidert S, Wang L, Landes J, Sandner P, Suero EM, Navab N, Kammerlander C, Euler E, Heide A. Video‐augmented fluoroscopy for distal interlocking of intramedullary nails decreased radiation exposure and surgical time in a bovine cadaveric setting. Int J Med Robot 2019; 15:e1995. [DOI: 10.1002/rcs.1995] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2018] [Revised: 02/05/2019] [Accepted: 03/06/2019] [Indexed: 11/12/2022]
Affiliation(s)
- Simon Weidert
- Department of General, Trauma and Reconstructive Surgery, Hospital of the University of Munich, Munich, Germany
- Lejing Wang
- Chair for Computer Aided Medical Procedures & Augmented Reality, Technical University of Munich, Munich, Germany
- Juergen Landes
- Klinik für Orthopädie und Unfallchirurgie, Isar Klinikum, Munich, Germany
- Philipp Sandner
- Frankfurt School Blockchain Center, Frankfurt School of Finance & Management, Frankfurt, Germany
- Eduardo M. Suero
- Department of General, Trauma and Reconstructive Surgery, Hospital of the University of Munich, Munich, Germany
- Nassir Navab
- Chair for Computer Aided Medical Procedures & Augmented Reality, Technical University of Munich, Munich, Germany
- Christian Kammerlander
- Department of General, Trauma and Reconstructive Surgery, Hospital of the University of Munich, Munich, Germany
- Ekkehard Euler
- Department of General, Trauma and Reconstructive Surgery, Hospital of the University of Munich, Munich, Germany
- Anna Heide
- Department of General, Trauma and Reconstructive Surgery, Hospital of the University of Munich, Munich, Germany
44
Xiao ZR, Xiong G. Computer-assisted Surgery for Scaphoid Fracture. Curr Med Sci 2018; 38:941-948. [PMID: 30536054 DOI: 10.1007/s11596-018-1968-0] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2018] [Revised: 10/11/2018] [Indexed: 01/09/2023]
Abstract
Computer-assisted surgery (CAS) has significantly improved the accuracy, reliability and outcomes of traumatic, spinal, nerve and many other operations in a less invasive way. The application of CAS to scaphoid fractures remains experimental. Related studies are scarce and most of them are cadaver studies. Intrinsic shortcomings of the registration procedure, scanning and limb immobilization may inevitably result in deviations. Some deviations become more pronounced in operations on small bones (such as the scaphoid), although they are acceptable for spine and other orthopedic surgeries. We reviewed the current literature on the application of CAS to scaphoid surgery and summarized the technical principles, scanning and registration methods, limb immobilization and outcomes. On the basis of these data, we analyzed the limitations of this technique and envisioned its future development.
Collapse
Affiliation(s)
- Zi-Run Xiao
- Department of Hand Surgery, Beijing Jishuitan Hospital, Beijing, 100035, China; Department of Orthopaedic Surgery, the 91st Central Hospital of Chinese People's Liberation Army, Henan, 454000, China
- Ge Xiong
- Department of Hand Surgery, Beijing Jishuitan Hospital, Beijing, 100035, China