1
Necker FN, Cholok DJ, Fischer MJ, Shaheen MS, Gifford K, Januszyk M, Leuze CW, Scholz M, Daniel BL, Momeni A. HoloDIEP: Faster and More Accurate Intraoperative DIEA Perforator Mapping Using a Novel Mixed Reality Tool. J Reconstr Microsurg 2024. PMID: 39038461; DOI: 10.1055/s-0044-1788548.
Abstract
BACKGROUND Microsurgical breast reconstruction using abdominal tissue is a complex procedure, in part due to variable vascular/perforator anatomy. Preoperative computed tomography angiography (CTA) has mitigated this challenge to some degree, yet certain challenges persist. The ability to map perforators with mixed reality has been demonstrated in case studies, but its accuracy has not been studied intraoperatively. Here, we compare the accuracy of "HoloDIEP" in identifying perforator location (vs. Doppler ultrasound) by using holographic 3D models derived from preoperative CTA. METHODS Using a custom application on HoloLens, the deep inferior epigastric artery vascular tree was traced in 15 patients who underwent microsurgical breast reconstruction. Perforator markings were compared against the 3D model in a coordinate system centered on the umbilicus. Holographic- and Doppler-identified markings were compared against the 3D model using a perspective-corrected photo technique, and the duration of perforator mapping was measured for each technique. RESULTS Vascular points in HoloDIEP skin markings were -0.97 ± 6.2 mm (perforators: -0.62 ± 6.13 mm) away from 3D-model ground truth in radial length from the umbilicus, at a true distance of 10.81 ± 6.14 mm (perforators: 11.40 ± 6.15 mm). The absolute difference in radial distance was twice as high for Doppler markings as for Holo-markings (9.71 ± 6.16 mm and 4.02 ± 3.20 mm, respectively). In only half of all cases (7/14) were more than 50% of the Doppler-identified points reasonably close (<30 mm) to the 3D-model ground truth. HoloDIEP was twice as fast as Doppler ultrasound (76.9 s vs. 150.4 s per abdomen). CONCLUSION HoloDIEP allows for faster and more accurate intraoperative perforator mapping than Doppler ultrasound.
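The radial metric described in this abstract (deviation in distance from the umbilicus between a skin marking and the 3D-model ground truth) can be sketched as below. This is a minimal illustration with made-up coordinates, not the study's actual analysis code:

```python
import math

def radial_deviation(marking, ground_truth, umbilicus=(0.0, 0.0)):
    """Signed difference (mm) in radial distance from the umbilicus
    between a skin marking and its 3D-model ground-truth point.
    Negative values mean the marking lies closer to the umbilicus
    than the ground truth. Points are illustrative 2D coordinates."""
    r_mark = math.dist(marking, umbilicus)
    r_true = math.dist(ground_truth, umbilicus)
    return r_mark - r_true

# Illustrative values only (not study data): a marking 3 mm short of a
# ground-truth perforator lying 50 mm from the umbilicus.
print(radial_deviation((47.0, 0.0), (50.0, 0.0)))  # -3.0
```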
Affiliation(s)
- Fabian N Necker
  - Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
  - Digital Anatomy Lab, Faculty of Medicine, Institute of Functional and Clinical Anatomy, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
  - Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- David J Cholok
  - Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Marc J Fischer
  - Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Mohammed S Shaheen
  - Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Kyle Gifford
  - Department of Radiology, 3D and Quantitative Imaging, Stanford University School of Medicine, Stanford, California
- Michael Januszyk
  - Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Christoph W Leuze
  - Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Michael Scholz
  - Digital Anatomy Lab, Faculty of Medicine, Institute of Functional and Clinical Anatomy, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
- Bruce L Daniel
  - Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Arash Momeni
  - Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
2
Czako L, Sufliarsky B, Simko K, Sovis M, Vidova I, Farska J, Lifková M, Hamar T, Galis B. Exploring the Practical Applications of Artificial Intelligence, Deep Learning, and Machine Learning in Maxillofacial Surgery: A Comprehensive Analysis of Published Works. Bioengineering (Basel) 2024; 11:679. PMID: 39061761; PMCID: PMC11274331; DOI: 10.3390/bioengineering11070679.
Abstract
Artificial intelligence (AI), deep learning (DL), and machine learning (ML) are computer, machine, and engineering systems that mimic human intelligence to devise procedures. These technologies also provide opportunities to advance diagnostics and planning in human medicine and dentistry. The purpose of this literature review was to ascertain the applicability and significance of AI and to highlight its uses in maxillofacial surgery. Our primary inclusion criterion was an original paper written in English focusing on the use of AI, DL, or ML in maxillofacial surgery. The sources were PubMed, Scopus, and Web of Science, and the queries were made on 31 December 2023. The search strings used were "artificial intelligence maxillofacial surgery", "machine learning maxillofacial surgery", and "deep learning maxillofacial surgery". Following the removal of duplicates, the remaining search results were screened by three independent operators to minimize the risk of bias. A total of 324 publications from 1992 to 2023 were finally selected. Tallied by year of publication, these show a continuous increase (excluding 2012 and 2013), with R² = 0.9295. Generally, in orthognathic dentistry and maxillofacial surgery, AI and ML have gained popularity over the past few decades. When we added the keywords "planning in maxillofacial surgery" and "planning in orthognathic surgery", the number increased significantly to 7535 publications. The first such publication appeared in 1965, with an increasing trend (excluding 2014-2018) and an R² value of 0.8642. These technologies have been found to be useful in diagnosis and treatment planning in head and neck surgical oncology, cosmetic and aesthetic surgery, and oral pathology. In orthognathic surgery, they have been utilized for diagnosis, treatment planning, assessment of treatment needs, and cephalometric analyses, among other applications.
This review confirms that the current use of AI and ML in maxillofacial surgery is focused mainly on evaluating digital diagnostic methods, especially radiology, treatment plans, and postoperative results. However, as these technologies become integrated into maxillofacial surgery and robotic surgery in the head and neck region, it is expected that they will be gradually utilized to plan and comprehensively evaluate the success of maxillofacial surgeries.
Affiliation(s)
- Ladislav Czako
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
| | - Barbora Sufliarsky
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
| | - Kristian Simko
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
| | - Marek Sovis
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
| | - Ivana Vidova
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
| | - Julia Farska
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
| | - Michaela Lifková
- Department of Stomatology and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava, St. Elisabeth Hospital Bratislava, Heydukova 10, 812 50 Bratislava, Slovakia;
| | - Tomas Hamar
- Institute of Medical Terminology and Foreign Languages, Faculty of Medicine, Comenius University in Bratislava, Moskovska 2, 811 08 Bratislava, Slovakia;
| | - Branislav Galis
- Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Comenius University in Bratislava and University Hospital, Ruzinovska 6, 826 06 Bratislava, Slovakia; (L.C.); (K.S.); (M.S.); (I.V.); (J.F.); (B.G.)
3
Niloy I, Liu RH, Pham NM, Yim CMR. Novel Use of Virtual Reality and Augmented Reality in Temporomandibular Total Joint Replacement Using Stock Prosthesis. J Oral Maxillofac Surg 2024; 82:632-640. PMID: 38442876; DOI: 10.1016/j.joms.2024.02.010.
Abstract
This technical innovation demonstrates the use of ImmersiveTouch virtual reality (VR) and augmented reality (AR)-guided total temporomandibular joint replacement (TJR) using a Biomet stock prosthesis in 2 patients with condylar degeneration. TJR VR planning includes condylar resection, prosthesis selection and positioning, and interference identification. AR provides real-time guidance for osteotomies, placement of prostheses and fixation screws, occlusion verification, and the flexibility to modify the surgical course. Radiographic analysis demonstrated high correspondence between the preoperative plan and the postoperative result. The average differences in the positioning of the condylar and fossa prostheses were 1.252 ± 0.269 mm and 1.393 ± 0.335 mm, respectively. The main challenges include a steep learning curve, intraoperative technical difficulties, added surgical time, and additional costs. In conclusion, this case report demonstrates, as a pilot study, the advantages of implementing AR and VR technology in TJRs using stock prostheses. Further clinical trials are needed before this innovation becomes mainstream practice.
Affiliation(s)
- Injamamul Niloy
  - Department of Oral & Maxillofacial Surgery, Walter Reed National Military Medical Center, Bethesda, MD
- Robert H Liu
  - Department of Oral & Maxillofacial Surgery, Walter Reed National Military Medical Center, Bethesda, MD
- Nikole M Pham
  - Department of Oral & Maxillofacial Surgery, Walter Reed National Military Medical Center, Bethesda, MD
- Chang Min Richard Yim
  - Department of Oral & Maxillofacial Surgery, Rutgers School of Dental Medicine, Newark, NJ
4
Chen Y, Zhong NN, Cao LM, Liu B, Bu LL. Surgical margins in head and neck squamous cell carcinoma: A narrative review. Int J Surg 2024; 110:3680-3700. PMID: 38935830; PMCID: PMC11175762; DOI: 10.1097/js9.0000000000001306.
Abstract
Head and neck squamous cell carcinoma (HNSCC), a prevalent and frequently recurring malignancy, often necessitates surgical intervention. The surgical margin (SM) plays a pivotal role in determining the postoperative treatment strategy and prognostic evaluation of HNSCC. Nonetheless, the process of clinical appraisal and assessment of the SMs remains a complex and indeterminate endeavor, thereby leading to potential difficulties for surgeons in defining the extent of resection. In this regard, we undertake a comprehensive review of the suggested surgical distance in varying circumstances, diverse methods of margin evaluation, and the delicate balance that must be maintained between tissue resection and preservation in head and neck surgical procedures. This review is intended to provide surgeons with pragmatic guidance in selecting the most suitable resection techniques, and in improving patients' quality of life by achieving optimal functional and aesthetic restoration.
Affiliation(s)
- Yang Chen
  - State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology
- Nian-Nian Zhong
  - State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology
- Lei-Ming Cao
  - State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology
- Bing Liu
  - State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology
  - Department of Oral & Maxillofacial – Head Neck Oncology, School & Hospital of Stomatology, Wuhan University, Wuhan, People’s Republic of China
- Lin-Lin Bu
  - State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology
  - Department of Oral & Maxillofacial – Head Neck Oncology, School & Hospital of Stomatology, Wuhan University, Wuhan, People’s Republic of China
5
Martinho FC, Qadir SJ, Griffin IL, Melo MAS, Fay GG. Augmented Reality Head-Mounted Device and Dynamic Navigation System for Postremoval in Maxillary Molars. J Endod 2024; 50:844-851. PMID: 38369102; DOI: 10.1016/j.joen.2024.02.004.
Abstract
INTRODUCTION This study evaluates the feasibility of an augmented reality (AR) head-mounted device (HMD) displaying a dynamic navigation system (DNS) in the surgical site for fiber post removal in maxillary molars and compares it to the DNS technique alone. METHODS Fifty maxillary first molars were divided into 2 groups: AR HMD + DNS (n = 25) and DNS (n = 25). The palatal canal was restored with a RelyX fiber post (3M ESPE) luted with RelyX Unicem (3M ESPE). A core buildup was performed using Paracore (Coltene/Whaledent). Cone-beam computed tomography (CBCT) scans were taken before and after post removal. The drilling trajectory and depth were planned in X-Guide software (X-Nav Technologies, Lansdale, PA). For the AR HMD + DNS group, the AR HMD (Microsoft HoloLens 2) displayed the DNS in the surgical site. The 3-dimensional (3D) deviations (global coronal deviation [GCD] and global apical deviation [GAD]) and angular deflection (AD) were calculated. The number of mishaps and the operating time were recorded. RESULTS The fiber post was removed from all samples (50/50). The AR HMD + DNS was more accurate than DNS, showing significantly lower GCD, GAD, and AD values (P < .05). No mishaps were detected. The AR HMD + DNS was as time-efficient as DNS (P > .05). CONCLUSIONS Within the limitations of this in vitro study, the AR HMD can safely display the DNS in the surgical site for fiber post removal in maxillary molars. The AR HMD improved DNS accuracy. Both AR HMD + DNS and DNS were time-efficient for fiber post removal in maxillary molars.
Affiliation(s)
- Frederico C Martinho
  - Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Syed J Qadir
  - Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Ina L Griffin
  - Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Mary Anne S Melo
  - Division of Operative Dentistry, Department of General Dentistry, University of Maryland, School of Dentistry, Baltimore, Maryland
- Guadalupe G Fay
  - Division of Prosthodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
6
Arensmeyer J, Bedetti B, Schnorr P, Buermann J, Zalepugas D, Schmidt J, Feodorovici P. A System for Mixed-Reality Holographic Overlays of Real-Time Rendered 3D-Reconstructed Imaging Using a Video Pass-through Head-Mounted Display: A Pathway to Future Navigation in Chest Wall Surgery. J Clin Med 2024; 13:2080. PMID: 38610849; PMCID: PMC11012529; DOI: 10.3390/jcm13072080.
Abstract
Background: Three-dimensional reconstructions of state-of-the-art high-resolution imaging are increasingly used for preprocedural assessment in thoracic surgery. They are a promising tool that aims to improve patient-specific treatment planning, for example, for minimally invasive or robotic-assisted lung resections. Increasingly available mixed-reality hardware based on video pass-through technology enables the projection of image data as a hologram onto the patient. We describe the novel method of real-time 3D surgical planning in a mixed-reality setting by presenting three representative cases utilizing volume rendering. Materials: A mixed-reality system was set up using a high-performance workstation driving a video pass-through-based head-mounted display. Image data from computed tomography were imported and volume-rendered in real time to be customized through live editing. The image-based hologram was projected onto the patient, highlighting the regions of interest. Results: Three oncological cases were selected to explore the potential of the mixed-reality system. Two of them presented large tumor masses in the thoracic cavity, while the third presented an unclear lesion of the chest wall. We aligned real-time rendered 3D holographic image data onto the patient, allowing us to investigate the relationship between anatomical structures and their respective body position. Conclusions: Holographic overlay has proven promising for improving preprocedural surgical planning, particularly for complex oncological tasks in the thoracic surgical field. Further studies on outcome-related surgical planning and navigation should therefore be conducted. Ongoing technological progress in extended-reality hardware and intelligent software features will most likely enhance applicability and the range of use in surgical fields in the near future.
Affiliation(s)
- Jan Arensmeyer
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
- Benedetta Bedetti
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Philipp Schnorr
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Jens Buermann
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Donatas Zalepugas
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Joachim Schmidt
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
  - Department of Thoracic Surgery, Helios Hospital Bonn/Rhein-Sieg, 53123 Bonn, Germany
- Philipp Feodorovici
  - Division of Thoracic Surgery, Department of General, Thoracic and Vascular Surgery, University Hospital Bonn, 53127 Bonn, Germany
  - Bonn Surgical Technology Center (BOSTER), University Hospital Bonn, 53227 Bonn, Germany
7
Begagić E, Bečulić H, Pugonja R, Memić Z, Balogun S, Džidić-Krivić A, Milanović E, Salković N, Nuhović A, Skomorac R, Sefo H, Pojskić M. Augmented Reality Integration in Skull Base Neurosurgery: A Systematic Review. Medicina (Kaunas) 2024; 60:335. PMID: 38399622; PMCID: PMC10889940; DOI: 10.3390/medicina60020335.
Abstract
Background and Objectives: To investigate the role of augmented reality (AR) in skull base (SB) neurosurgery. Materials and Methods: Following PRISMA methodology, the PubMed and Scopus databases were searched to extract data on AR integration in SB surgery. Results: The largest share of the 19 included studies (42.1%) was conducted in the United States, with most published within the last five years (77.8%). Studies used phantom skull models (n = 6; 31.6%), human cadavers (n = 3; 15.8%), or human patients (n = 10; 52.6%). Surgical modality was specified in 18 of the 19 studies, with microscopic surgery predominant (n = 10; 52.6%). Most studies used CT as the only data source (n = 9; 47.4%), and optical tracking was the most common tracking modality (n = 9; 47.4%). The target registration error (TRE) ranged from 0.55 to 10.62 mm. Conclusion: Despite variations in TRE values, the studies reported successful outcomes and minimal complications. Challenges such as device practicality and data security were acknowledged, but the use of low-cost AR devices suggests broader feasibility.
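For context on the TRE figures cited in this abstract: target registration error is commonly reported as the root-mean-square distance between registered and ground-truth target positions. A minimal sketch under that common definition, with made-up toy points rather than data from the reviewed studies:

```python
import math

def target_registration_error(registered_points, true_points):
    """Root-mean-square distance (mm) between registered target points
    and their ground-truth positions -- one common way TRE is reported.
    In practice, TRE is measured on targets not used to compute the
    registration itself. Points here are illustrative 3D coordinates."""
    sq_dists = [math.dist(p, q) ** 2
                for p, q in zip(registered_points, true_points)]
    return math.sqrt(sum(sq_dists) / len(sq_dists))

# Toy example: two targets each off by 3 mm along one axis -> TRE = 3.0 mm.
print(target_registration_error(
    [(3.0, 0.0, 0.0), (0.0, 3.0, 0.0)],
    [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]))  # 3.0
```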
Affiliation(s)
- Emir Begagić
  - Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Hakija Bečulić
  - Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
  - Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Ragib Pugonja
  - Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Zlatan Memić
  - Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Simon Balogun
  - Division of Neurosurgery, Department of Surgery, Obafemi Awolowo University Teaching Hospitals Complex, Ilesa Road PMB 5538, Ile-Ife 220282, Nigeria
- Amina Džidić-Krivić
  - Department of Neurology, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Elma Milanović
  - Neurology Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Naida Salković
  - Department of General Medicine, School of Medicine, University of Tuzla, Univerzitetska 1, 75000 Tuzla, Bosnia and Herzegovina
- Adem Nuhović
  - Department of General Medicine, School of Medicine, University of Sarajevo, Univerzitetska 1, 71000 Sarajevo, Bosnia and Herzegovina
- Rasim Skomorac
  - Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
  - Department of Surgery, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Haso Sefo
  - Neurosurgery Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Mirza Pojskić
  - Department of Neurosurgery, University Hospital Marburg, Baldingerstr., 35033 Marburg, Germany
8
Castille J, Remy S, Vermue H, Victor J. The use of virtual reality to assess the bony landmarks at the knee joint - The role of imaging modality and the assessor's experience. Knee 2024; 46:41-51. PMID: 38061164; DOI: 10.1016/j.knee.2023.11.004.
Abstract
BACKGROUND Extended reality technologies such as virtual reality (VR) have gained popularity in orthopedic surgery. The first aim of this study was to assess the precision of VR and other imaging modalities - computed tomography (CT) and magnetic resonance imaging (MRI) - in localizing bony landmarks near the knee joint. Secondly, the impact of the educational level of the assessor - medical master students, orthopedic residents, and orthopedic surgeons - on the precision of landmark localization was analyzed. METHODS We included a total of 77 participants: 62 medical master students, 10 orthopedic residents, and 5 orthopedic surgeons, each analyzing three cadaver legs. Every participant localized a series of sixteen bony landmarks on six different imaging modalities (CT, MRI, 3D-CT, 3D-MRI, VR-CT, VR-MRI). RESULTS Regarding imaging modality, inter- and intra-observer variability were lowest for 3D and VR, higher for MRI (7.6 mm and 6.9 mm, respectively), and highest for CT (9 mm and 8.7 mm, respectively). Regarding the educational level of the assessor, inter- and intra-observer variability in VR were lowest for surgeons (3.2 mm and 3.6 mm, respectively), and higher for residents (5.9 mm and 6.5 mm, respectively) and medical students (5.9 mm and 5.8 mm, respectively). CONCLUSIONS VR can be considered a reliable imaging technique. Localization of landmarks tends to be more precise in VR and on 3D reconstructions than on conventional CT and MRI images. Furthermore, orthopedic surgeons localize landmarks in VR more precisely than orthopedic residents and medical students.
Affiliation(s)
- Jocelyn Castille
  - Faculty of Medicine and Health Sciences, Ghent University, Ghent, Belgium
- Stijn Remy
  - Faculty of Medicine and Health Sciences, Ghent University, Ghent, Belgium
- Hannes Vermue
  - Faculty of Medicine and Health Sciences, Department of Human Structure and Repair, Ghent University, Ghent, Belgium
- Jan Victor
  - Faculty of Medicine and Health Sciences, Department of Human Structure and Repair, Ghent University, Ghent, Belgium
9
Martinho FC, Griffin IL, Price JB, Tordik PA. Augmented Reality and 3-Dimensional Dynamic Navigation System Integration for Osteotomy and Root-end Resection. J Endod 2023; 49:1362-1368. PMID: 37453501; DOI: 10.1016/j.joen.2023.07.007.
Abstract
INTRODUCTION Augmented reality (AR) superimposes high-definition computer-generated virtual content onto the existing environment, providing users with an enhanced perception of reality. This study investigates the feasibility of integrating an AR head-mounted device into a 3-dimensional dynamic navigation system (3D-DNS) for osteotomy and root-end resection (RER). It compares the accuracy and efficiency of AR + 3D-DNS to 3D-DNS for osteotomy and RER. METHODS Seventy-two tooth roots of 3D-printed surgical jaw models were divided into two groups: AR + 3D-DNS (n = 36) and 3D-DNS (n = 36). Cone-beam computed tomography scans were taken pre- and postoperatively. The osteotomy and RER were virtually planned in X-Guide software and delivered under 3D-DNS guidance. For the AR + 3D-DNS group, an AR head-mounted device (Microsoft HoloLens 2) was integrated into the 3D-DNS. The 2D and 3D deviations were calculated. The osteotomy and RER time and the number of procedural mishaps were recorded. RESULTS Osteotomy and RER were completed in all samples (72/72). AR + 3D-DNS was more accurate than 3D-DNS, showing lower 2D- and 3D-deviation values (P < .05). AR + 3D-DNS was also more time-efficient than 3D-DNS (P < .05). There was no significant difference in the number of mishaps (P > .05). CONCLUSIONS Within the limitations of this in vitro study, the integration of an AR head-mounted device into 3D-DNS is feasible for osteotomy and RER. AR improved the accuracy and time efficiency of 3D-DNS in osteotomy and RER. Head-mounted AR has the potential to be safely and reliably integrated into 3D-DNS for endodontic microsurgery.
Affiliation(s)
- Frederico C Martinho
  - Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Ina L Griffin
  - Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
- Jeffery B Price
  - Division of Oral Radiology, Department of Oncology and Diagnostic Sciences, University of Maryland, School of Dentistry, Baltimore, Maryland
- Patricia A Tordik
  - Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland, School of Dentistry, Baltimore, Maryland
10
Kim EJ, Kim JY. The Metaverse for Healthcare: Trends, Applications, and Future Directions of Digital Therapeutics for Urology. Int Neurourol J 2023; 27:S3-12. PMID: 37280754; DOI: 10.5213/inj.2346108.054.
Abstract
In recent years, the emergence of digital therapeutics as a novel approach to managing conditions has garnered significant attention. This approach involves using evidence-based therapeutic interventions that are facilitated by high-quality software programs to treat, manage, or prevent medical conditions. The incorporation of digital therapeutics into the Metaverse has increased the feasibility of their implementation and application in all areas of medical services. In urology, substantial digital therapeutics are being produced and researched, including mobile apps, bladder devices, pelvic floor muscle trainers, smart toilet systems, mixed reality-guided training and surgery, and training and telemedicine for urological consultations. The purpose of this review article is to provide a comprehensive overview of the current impact of the Metaverse on the field of digital therapeutics and identify its current trends, applications, and future perspectives in the field of urology.
Affiliation(s)
- Eun Joung Kim
- Culture Contents Technology Institute, Gachon University, Seongnam, Korea
- Jung Yoon Kim
- Department of Game Media, College of Future Industry, Gachon University, Seongnam, Korea
11
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637] [DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016, until the year of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, nevertheless only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria.
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
12
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. [PMID: 36580681] [DOI: 10.1088/1361-6560/acaf23]
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. The types of AR visualization are divided into two categories: in situ visualization and non-in situ visualization. The rendering contents of AR visualization are varied. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications were evaluated through model experiments and animal experiments, with relatively few clinical experiments, indicating that current AR navigation methods are still in an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as the future development trend. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
13
Rzepka AM, Hussey KJ, Maltz MV, Babin K, Wilcox LM, Culham JC. Familiar size affects perception differently in virtual reality and the real world. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210464. [PMID: 36511414] [PMCID: PMC9745877] [DOI: 10.1098/rstb.2021.0464]
Abstract
The promise of virtual reality (VR) as a tool for perceptual and cognitive research rests on the assumption that perception in virtual environments generalizes to the real world. Here, we conducted two experiments to compare size and distance perception between VR and physical reality (Maltz et al. 2021 J. Vis. 21, 1-18). In experiment 1, we used VR to present dice and Rubik's cubes at their typical sizes or reversed sizes at distances that maintained a constant visual angle. After viewing the stimuli binocularly (to provide vergence and disparity information) or monocularly, participants manually estimated perceived size and distance. Unlike physical reality, where participants relied less on familiar size and more on presented size during binocular versus monocular viewing, in VR participants relied heavily on familiar size regardless of the availability of binocular cues. In experiment 2, we demonstrated that the effects in VR generalized to other stimuli and to a higher quality VR headset. These results suggest that the use of binocular cues and familiar size differs substantially between virtual and physical reality. A deeper understanding of perceptual differences is necessary before assuming that research outcomes from VR will generalize to the real world. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Anna M. Rzepka
- Neuroscience Program, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Kieran J. Hussey
- Neuroscience Program, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Margaret V. Maltz
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Karsten Babin
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Laurie M. Wilcox
- Department of Psychology, York University, Toronto, ON, Canada M3J 1P3
- Jody C. Culham
- Neuroscience Program, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
- Department of Psychology, University of Western Ontario, Western Interdisciplinary Research Building, London, ON, Canada N6A 3K7
14
Cercenelli L, Babini F, Badiali G, Battaglia S, Tarsitano A, Marchetti C, Marcelli E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front Oncol 2022; 11:804748. [PMID: 35071009] [PMCID: PMC8770836] [DOI: 10.3389/fonc.2021.804748]
Abstract
Background Augmented Reality (AR) represents an evolution of navigation-assisted surgery, providing surgeons with a virtual aid contextually merged with the real surgical field. We recently reported a case series of AR-assisted fibular flap harvesting for mandibular reconstruction. However, the registration accuracy between the real and the virtual content needs to be systematically evaluated before widely promoting this tool in clinical practice. In this paper, after describing the AR-based protocol implemented for both tablet and HoloLens 2 smart glasses, we evaluated in a first test session the achievable registration accuracy with the two display solutions, and in a second test session the success rate in executing the AR-guided skin paddle incision task on a 3D-printed leg phantom. Methods From a real computed tomography dataset, 3D virtual models of a human leg, including fibula, arteries, and skin with the planned paddle profile for harvesting, were obtained. All virtual models were imported into Unity software to develop a marker-less AR application suitable for use both via tablet and via HoloLens 2 headset. The registration accuracy for both solutions was verified on a 3D-printed leg phantom obtained from the virtual models, by repeatedly applying the tracking function and computing pose deviations between the AR-projected virtual skin paddle profile and the real one transferred to the phantom via a CAD/CAM cutting guide. The success rate in completing the AR-guided task of skin paddle harvesting was evaluated using CAD/CAM templates positioned on the phantom model surface. Results On average, the marker-less AR protocol showed comparable registration errors (ranging within 1-5 mm) for the tablet-based and HoloLens-based solutions. Registration accuracy seems to be quite sensitive to ambient light conditions. We found a good success rate in completing the AR-guided task within an error margin of 4 mm (97% and 100% for tablet and HoloLens, respectively). All subjects reported greater usability and ergonomics for the HoloLens 2 solution. Conclusions Results revealed that the proposed marker-less AR-based protocol may guarantee a registration error within 1-5 mm for assisting skin paddle harvesting in the clinical setting. Optimal lighting conditions and further improvement of marker-less tracking technologies have the potential to increase the efficiency and precision of this AR-assisted reconstructive surgery.
Affiliation(s)
- Laura Cercenelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Federico Babini
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Giovanni Badiali
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Salvatore Battaglia
- Maxillofacial Surgery Unit, Policlinico San Marco University Hospital, University of Catania, Catania, Italy
- Achille Tarsitano
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Claudio Marchetti
- Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Emanuela Marcelli
- eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
15
VeLight: A 3D virtual reality tool for CT-based anatomy teaching and training. J Vis (Tokyo) 2021. [DOI: 10.1007/s12650-021-00790-y]