1. Hadida Barzilai D, Tejman‐Yarden S, Yogev D, Vazhgovsky O, Nagar N, Sasson L, Sion‐Sarid R, Parmet Y, Goldfarb A, Ilan O. Augmented Reality-Guided Mastoidectomy Simulation: A Randomized Controlled Trial Assessing Surgical Proficiency. Laryngoscope 2025; 135:894-900. PMID: 39315469; PMCID: PMC11725687; DOI: 10.1002/lary.31791.
Abstract
OBJECTIVE: Mastoidectomy surgical training is challenging due to the complex nature of the anatomical structures involved. Traditional training methods based on direct patient care and cadaveric temporal bone training have practical shortcomings. 3D-printed temporal bone models and augmented reality (AR) have emerged as promising solutions, particularly for mastoidectomy surgery, which demands an understanding of intricate anatomical structures. Evidence is needed to explore the potential of AR technology in addressing these training challenges. METHODS: Twenty-one medical students in their clinical clerkship were recruited for this prospective, randomized controlled trial assessing mastoidectomy skills. The participants were randomly assigned to the AR group, which received real-time guidance during drilling on 3D-printed temporal bone models, or to the control group, which received traditional training methods. Skills were assessed on a modified Welling scale and evaluated independently by two senior otologists. RESULTS: The AR group outperformed the control group, with a mean overall drilling score of 19.5 out of 25, compared with the control group's score of 12 (p < 0.01). The AR group was significantly better at defining mastoidectomy margins (p < 0.01), exposing the antrum, preserving the lateral semicircular canal (p < 0.05), sharpening the sinodural angle (p < 0.01), exposing the tegmen and attic, preserving the ossicles (p < 0.01), and thinning and preserving the external auditory canal (p < 0.05). CONCLUSION: AR simulation in mastoidectomy, even in a single session, improved the proficiency of novice surgeons compared with traditional methods. LEVEL OF EVIDENCE: NA. Laryngoscope, 135:894-900, 2025.
Affiliation(s)
- Shai Tejman‐Yarden
- The Engineering Medical Research Lab, Sheba Medical Center, Ramat Gan, Israel
- The Edmond J. Safra International Congenital Heart Center, Sheba Medical Center, Ramat Gan, Israel
- David Yogev
- The Engineering Medical Research Lab, Sheba Medical Center, Ramat Gan, Israel
- Department of Otolaryngology and Head and Neck Surgery, Sheba Medical Center, Tel Hashomer, Israel
- Oliana Vazhgovsky
- The Engineering Medical Research Lab, Sheba Medical Center, Ramat Gan, Israel
- The Edmond J. Safra International Congenital Heart Center, Sheba Medical Center, Ramat Gan, Israel
- Netanel Nagar
- The Engineering Medical Research Lab, Sheba Medical Center, Ramat Gan, Israel
- Lior Sasson
- Cardiothoracic Surgery, Wolfson Medical Center, Tel Aviv University, Holon, Israel
- Yisrael Parmet
- Department of Industrial Engineering and Management, Ben Gurion University, Beer Sheva, Israel
- Abraham Goldfarb
- Department of Otorhinolaryngology and Head and Neck Surgery, Edith Wolfson Medical Center, Holon, Israel
- Ophir Ilan
- Department of Otorhinolaryngology and Head and Neck Surgery, Edith Wolfson Medical Center, Holon, Israel
2. Yang S, Li H, Zhang P, Yan W, Zhao Z, Ding H, Wang G. Automated Volumetric Milling Area Planning for Acoustic Neuroma Surgery via Evolutionary Multi-Objective Optimization. Sensors (Basel) 2025; 25:448. PMID: 39860818; PMCID: PMC11768615; DOI: 10.3390/s25020448.
Abstract
Mastoidectomy is critical in acoustic neuroma surgery, where precise planning of the bone milling area is essential for surgical navigation. The complexity of representing the irregular volumetric area and the presence of high-risk structures (e.g., blood vessels and nerves) complicate this task. In order to determine the bone area to mill using preoperative CT images automatically, we propose an automated planning method using evolutionary multi-objective optimization for safer and more efficient milling plans. High-resolution segmentation of the adjacent risk structures is performed on preoperative CT images with a template-based approach. The maximum milling area is defined based on constraints from the risk structures and tool dimensions. Deformation fields are used to simplify the volumetric area into limited continuous parameters suitable for optimization. Finally, a multi-objective optimization algorithm is used to achieve a Pareto-optimal design. Compared with manual planning on six volumes, our method reduced the potential damage to the scala vestibuli by 29.8%, improved the milling boundary smoothness by 78.3%, and increased target accessibility by 26.4%. Assessment by surgeons confirmed the clinical feasibility of the generated plans. In summary, this study presents a parameterization approach to irregular volumetric regions, enabling automated milling area planning through optimization techniques that ensure safety and feasibility. This method is also adaptable to various volumetric planning scenarios.
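As a toy illustration of the Pareto-optimal planning idea described above (not the authors' algorithm), the sketch below scores randomly generated candidate plans against two competing objectives and keeps the non-dominated set; all objective functions and parameters are invented for the example.

```python
# Minimal illustration of extracting a Pareto-optimal set for two competing
# objectives, loosely mirroring the safety-vs-accessibility trade-off the
# abstract describes. Objectives and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
candidates = rng.uniform(0.0, 1.0, size=(200, 3))  # toy milling-plan parameters

def risk_to_structures(x):
    # Hypothetical stand-in for potential damage to risk structures (lower is better).
    return np.sum((x - 0.2) ** 2)

def inaccessibility(x):
    # Hypothetical stand-in for poor target accessibility (lower is better).
    return np.sum((x - 0.8) ** 2)

objectives = np.array([[risk_to_structures(x), inaccessibility(x)] for x in candidates])

def pareto_front(obj):
    """Return indices of non-dominated points (minimization in every objective)."""
    keep = []
    for i, fi in enumerate(obj):
        dominated = np.any(np.all(obj <= fi, axis=1) & np.any(obj < fi, axis=1))
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(objectives)
print(f"{len(front)} non-dominated plans out of {len(candidates)}")
```

An evolutionary algorithm such as the one used in the paper iteratively refines the candidate set rather than sampling it once, but the dominance criterion it optimizes is the same.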
Affiliation(s)
- Sheng Yang
- School of Biomedical Engineering, Tsinghua University, Shuang Qing Road, Beijing 100084, China
- Haowei Li
- School of Biomedical Engineering, Tsinghua University, Shuang Qing Road, Beijing 100084, China
- Peihai Zhang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Li Tang Road, Beijing 100043, China
- Wenqing Yan
- School of Biomedical Engineering, Tsinghua University, Shuang Qing Road, Beijing 100084, China
- Zhe Zhao
- School of Clinical Medicine, Tsinghua University, Shuang Qing Road, Beijing 100084, China
- Orthopedics & Sports Medicine Center, Beijing Tsinghua Changgung Hospital, Li Tang Road, Beijing 100043, China
- Hui Ding
- School of Biomedical Engineering, Tsinghua University, Shuang Qing Road, Beijing 100084, China
- Guangzhi Wang
- School of Biomedical Engineering, Tsinghua University, Shuang Qing Road, Beijing 100084, China
3. Ye J, Chen Q, Zhong T, Liu J, Gao H. Is Overlain Display a Right Choice for AR Navigation? A Qualitative Study of Head-Mounted Augmented Reality Surgical Navigation on Accuracy for Large-Scale Clinical Deployment. CNS Neurosci Ther 2025; 31:e70217. PMID: 39817491; PMCID: PMC11736426; DOI: 10.1111/cns.70217.
Abstract
BACKGROUND: Over the past two decades, head-mounted augmented reality surgical navigation (HMARSN) systems have been employed in a growing range of surgical specialties, driven by advances in augmented reality technology and by surgeons' desire to overcome drawbacks of conventional surgical navigation systems. Most experimental HMARSN systems currently adopt an overlain display (OD), which superimposes virtual models and planned tool trajectories on the corresponding physical tissues, organs, and lesions in the surgical field. This gives surgeons an intuitive, direct view that supports better hand-eye coordination and avoids attention shift and loss of sight (LOS) during procedures. Yet system accuracy, the most crucial performance indicator of any surgical navigation system, is difficult to ascertain for OD because it is highly subjective and user-dependent. The aim of this study was therefore to review currently available experimental OD HMARSN systems qualitatively, explore how their system accuracy is affected by the overlain display, and determine whether such systems are suited to large-scale clinical deployment. METHOD: We searched PubMed and ScienceDirect with the terms "head mounted augmented reality surgical navigation," which returned 445 records in total. After screening and eligibility assessment, 60 papers were analyzed. We focused on how their accuracies were defined and measured, and on whether those accuracies are stable in clinical practice and competitive with corresponding commercially available systems. RESULTS AND CONCLUSIONS: The primary finding is that the system accuracy of OD HMARSN systems is seriously affected by the transformation between the space of the user's eyes and the surgical field, because measurement of this transformation is heavily individualized and user-dependent. The transformation is also potentially subject to change during surgical procedures, and hence unstable. Therefore, OD HMARSN systems are not suitable for large-scale clinical deployment.
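The user-dependent accuracy problem described above can be pictured as a chain of rigid transforms in which one link is a per-user eye/display calibration; the following sketch uses our own notation, not the authors'.

```latex
% Illustrative error chain for an optical see-through overlay (notation ours).
% A point p_world in the surgical field reaches the user's view through the
% patient registration, the tracked display pose, and a per-user calibration:
\[
  p_{\mathrm{eye}}
  = \underbrace{T_{\mathrm{eye}\leftarrow\mathrm{display}}}_{\text{per-user calibration}}
    \; T_{\mathrm{display}\leftarrow\mathrm{tracker}}
    \; \underbrace{T_{\mathrm{tracker}\leftarrow\mathrm{world}}}_{\text{patient registration}}
    \; p_{\mathrm{world}}
\]
% Any mis-estimation or intraoperative change of the first factor propagates
% directly into the perceived overlay error, which is why OD accuracy is hard
% to measure objectively.
```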
Affiliation(s)
- Jian Ye
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Qingwen Chen
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Tao Zhong
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Jian Liu
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Han Gao
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
4. Shen Y, Wang S, Shen Y, Hu J. The Application of Augmented Reality Technology in Perioperative Visual Guidance: Technological Advances and Innovation Challenges. Sensors (Basel) 2024; 24:7363. PMID: 39599139; PMCID: PMC11598101; DOI: 10.3390/s24227363.
Abstract
In contemporary medical practice, perioperative visual guidance technology has become a critical element in enhancing the precision and safety of surgical procedures. This study provides a comprehensive review of the advancements in the application of Augmented Reality (AR) technology for perioperative visual guidance. This review begins with a retrospective look at the evolution of AR technology, including its initial applications in neurosurgery. It then delves into the technical challenges that AR faces in areas such as image processing, 3D reconstruction, spatial localization, and registration, underscoring the importance of improving the accuracy of AR systems and ensuring their stability and consistency in clinical use. Finally, the review looks forward to how AR technology could be further facilitated in medical applications with the integration of cutting-edge technologies like skin electronic devices and how the incorporation of machine learning could significantly enhance the accuracy of AR visual systems. As technology continues to advance, there is ample reason to believe that AR will be seamlessly integrated into medical practice, ushering the healthcare field into a new "Golden Age".
Affiliation(s)
- Shuyi Wang
- School of Health Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
5. Aweeda M, Adegboye F, Yang SF, Topf MC. Enhancing Surgical Vision: Augmented Reality in Otolaryngology-Head and Neck Surgery. Journal of Medical Extended Reality 2024; 1:124-136. PMID: 39091667; PMCID: PMC11290041; DOI: 10.1089/jmxr.2024.0010.
Abstract
Augmented reality (AR) technology has become widely established in otolaryngology-head and neck surgery. Over the past 20 years, numerous AR systems have been investigated and validated across the subspecialties, both in cadaveric and in live surgical studies. AR displays projected through head-mounted devices, microscopes, and endoscopes, most commonly, have demonstrated utility in preoperative planning, intraoperative guidance, and improvement of surgical decision-making. Specifically, they have demonstrated feasibility in guiding tumor margin resections, identifying critical structures intraoperatively, and displaying patient-specific virtual models derived from preoperative imaging, with millimetric accuracy. This review summarizes both established and emerging AR technologies, detailing how their systems work, what features they offer, and their clinical impact across otolaryngology subspecialties. As AR technology continues to advance, its integration holds promise for enhancing surgical precision, simulation training, and ultimately, improving patient outcomes.
Affiliation(s)
- Marina Aweeda
- Department of Otolaryngology—Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Feyisayo Adegboye
- Department of Otolaryngology—Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Shiayin F. Yang
- Department of Otolaryngology—Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Michael C. Topf
- Department of Otolaryngology—Head and Neck Surgery, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- School of Engineering, Vanderbilt University, Nashville, Tennessee, USA
6. Pojskić M, Bopp MHA, Saß B, Nimsky C. Single-Center Experience in Microsurgical Resection of Acoustic Neurinomas and the Benefit of Microscope-Based Augmented Reality. Medicina (Kaunas) 2024; 60:932. PMID: 38929549; PMCID: PMC11487442; DOI: 10.3390/medicina60060932.
Abstract
Background and Objectives: Microsurgical resection with intraoperative neuromonitoring is the gold standard for acoustic neurinomas (ANs) which are classified as T3 or T4 tumors according to the Hannover Classification. Microscope-based augmented reality (AR) can be beneficial in cerebellopontine angle and lateral skull base surgery, since these are small areas packed with anatomical structures. The technology builds a 3D model automatically, sparing the surgeon the task of mentally transferring 2D microscope images into an imaginary 3D representation, which reduces the possibility of error and provides better orientation in the operative field. Materials and Methods: All patients who underwent surgery for resection of ANs in our department were included in this study. Clinical outcomes in terms of postoperative neurological deficits and complications were evaluated, as well as neuroradiological outcomes for tumor remnants and recurrence. Results: A total of 43 consecutive patients (25 female, median age 60.5 ± 16 years) who underwent resection of ANs via retrosigmoid osteoclastic craniotomy with the use of intraoperative neuromonitoring (22 right-sided, 14 giant tumors, 10 cystic, 7 with hydrocephalus) by a single surgeon were included in this study, with a median follow-up of 41.2 ± 32.2 months. A total of 18 patients underwent subtotal resection, 1 patient partial resection and 24 patients gross total resection. A total of 27 patients underwent resection in the sitting position and the rest in the semi-sitting position. Of 37 patients who had no facial nerve deficit prior to surgery, 19 were intact following surgery, 7 had House-Brackmann (HB) Grade II paresis, 3 HB III, 7 HB IV and 1 HB V. Wound healing deficit with cerebrospinal fluid (CSF) leak occurred in 8 patients (18.6%). Operative time was 317.3 ± 99 min. One patient who had a recurrence and one further patient with partial resection underwent radiotherapy following surgery. A total of 16 patients (37.2%) underwent resection using fiducial-based navigation and microscope-based AR, all in the sitting position. Segmented objects of interest in AR were the sigmoid and transverse sinus, tumor outline, cranial nerves (CN) VII, VIII and V, petrous vein, cochlea and semicircular canals and brain stem. Operative time and clinical outcome did not differ between the AR and the non-AR group. However, use of AR improved orientation in the operative field for craniotomy planning and microsurgical resection by identification of important neurovascular structures. Conclusions: This single-center experience of resection of ANs showed a high rate of gross total (GTR) and subtotal resection (STR) with low recurrence. Use of AR improves intraoperative orientation and facilitates craniotomy planning and AN resection through early improved identification of important anatomical relations to structures of the inner auditory canal, venous sinuses, petrous vein, brain stem and the course of cranial nerves.
Affiliation(s)
- Mirza Pojskić
- Department of Neurosurgery, University of Marburg, 35037 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, 35037 Marburg, Germany
- Marburg Center for Mind, Brain and Behavior (MCMBB), 35032 Marburg, Germany
- Benjamin Saß
- Department of Neurosurgery, University of Marburg, 35037 Marburg, Germany
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, 35037 Marburg, Germany
- Marburg Center for Mind, Brain and Behavior (MCMBB), 35032 Marburg, Germany
7. Begagić E, Bečulić H, Pugonja R, Memić Z, Balogun S, Džidić-Krivić A, Milanović E, Salković N, Nuhović A, Skomorac R, Sefo H, Pojskić M. Augmented Reality Integration in Skull Base Neurosurgery: A Systematic Review. Medicina (Kaunas) 2024; 60:335. PMID: 38399622; PMCID: PMC10889940; DOI: 10.3390/medicina60020335.
Abstract
Background and Objectives: To investigate the role of augmented reality (AR) in skull base (SB) neurosurgery. Materials and Methods: Utilizing PRISMA methodology, the PubMed and Scopus databases were explored to extract data related to AR integration in SB surgery. Results: The largest share of the 19 included studies was conducted in the United States (42.1%), most of them within the last five years (77.8%). Studies used phantom skull models (31.2%, n = 6), human cadavers (15.8%, n = 3), or human patients (52.6%, n = 10). Surgical modality was specified in 18 of the 19 studies, with microscopic surgery predominant (n = 10; 52.6%). Most studies used only CT as the data source (n = 9; 47.4%), and optical tracking was the prevalent tracking modality (n = 9; 47.3%). The Target Registration Error (TRE) spanned from 0.55 to 10.62 mm. Conclusion: Despite variations in TRE values, the studies highlighted successful outcomes and minimal complications. Challenges such as device practicality and data security were acknowledged, but the application of low-cost AR devices suggests broader feasibility.
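The Target Registration Error values pooled above follow the usual convention of measuring residual distance at targets that were not used to drive the registration; one common formulation (ours, for illustration) is shown below.

```latex
% Mean TRE over N target points, with \hat{T} the estimated image-to-patient
% transform, p_i the target positions in image space and q_i their true
% positions in patient space (targets not used as registration fiducials).
\[
  \mathrm{TRE} \;=\; \frac{1}{N}\sum_{i=1}^{N} \bigl\lVert \hat{T}(p_i) - q_i \bigr\rVert_2
\]
```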
Affiliation(s)
- Emir Begagić
- Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Hakija Bečulić
- Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Ragib Pugonja
- Department of Anatomy, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Zlatan Memić
- Department of General Medicine, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Simon Balogun
- Division of Neurosurgery, Department of Surgery, Obafemi Awolowo University Teaching Hospitals Complex, Ilesa Road PMB 5538, Ile-Ife 220282, Nigeria
- Amina Džidić-Krivić
- Department of Neurology, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Elma Milanović
- Neurology Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Naida Salković
- Department of General Medicine, School of Medicine, University of Tuzla, Univerzitetska 1, 75000 Tuzla, Bosnia and Herzegovina
- Adem Nuhović
- Department of General Medicine, School of Medicine, University of Sarajevo, Univerzitetska 1, 71000 Sarajevo, Bosnia and Herzegovina
- Rasim Skomorac
- Department of Neurosurgery, Cantonal Hospital Zenica, Crkvice 67, 72000 Zenica, Bosnia and Herzegovina
- Department of Surgery, School of Medicine, University of Zenica, Travnička 1, 72000 Zenica, Bosnia and Herzegovina
- Haso Sefo
- Neurosurgery Clinic, Clinical Center University of Sarajevo, Bolnička 25, 71000 Sarajevo, Bosnia and Herzegovina
- Mirza Pojskić
- Department of Neurosurgery, University Hospital Marburg, Baldingerstr., 35033 Marburg, Germany
8. Kos TM, Colombo E, Bartels LW, Robe PA, van Doormaal TPC. Evaluation Metrics for Augmented Reality in Neurosurgical Preoperative Planning, Surgical Navigation, and Surgical Treatment Guidance: A Systematic Review. Oper Neurosurg (Hagerstown) 2023. PMID: 38146941; PMCID: PMC11008635; DOI: 10.1227/ons.0000000000001009.
Abstract
BACKGROUND AND OBJECTIVE: Recent years have shown an advancement in the development of augmented reality (AR) technologies for preoperative visualization, surgical navigation, and intraoperative guidance for neurosurgery. However, proving added value for AR in clinical practice is challenging, partly because of a lack of standardized evaluation metrics. We performed a systematic review to provide an overview of the reported evaluation metrics for AR technologies in neurosurgical practice and to establish a foundation for assessment and comparison of such technologies. METHODS: PubMed, Embase, and Cochrane were searched systematically for publications on assessment of AR for cranial neurosurgery on September 22, 2022. The findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS: The systematic search yielded 830 publications; 114 were screened full text, and 80 were included for analysis. Among the included studies, 5% dealt with preoperative visualization using AR, with user perception as the most frequently reported metric. The majority (75%) researched AR technology for surgical navigation, with registration accuracy, clinical outcome, and time measurements as the most frequently reported metrics. In addition, 20% studied the use of AR for intraoperative guidance, with registration accuracy, task outcome, and user perception as the most frequently reported metrics. CONCLUSION: For quality benchmarking of AR technologies in neurosurgery, evaluation metrics should be specific to the risk profile and clinical objectives of the technology. A key focus should be on using validated questionnaires to assess user perception; ensuring clear and unambiguous reporting of registration accuracy, precision, robustness, and system stability; and accurately measuring task performance in clinical studies. We provided an overview suggesting which evaluation metrics to use per AR application and innovation phase, aiming to improve the assessment of added value of AR for neurosurgical practice and to facilitate the integration in the clinical workflow.
Affiliation(s)
- Tessa M Kos
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Elisa Colombo
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- L Wilbert Bartels
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Pierre A Robe
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P C van Doormaal
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
9. Bartella AK, Hoshal SG, Lethaus B, Strong EB. Computer assisted skull base surgery: a contemporary review. Innov Surg Sci 2023; 8:149-157. PMID: 38077490; PMCID: PMC10709692; DOI: 10.1515/iss-2021-0020.
Abstract
Skull base surgery has evolved significantly since Harvey Cushing's first descriptions in the early 1900s. Computer aided surgery (CAS) applications continue to expand; they include virtual surgical planning, augmented and virtual reality, 3D printing of models/cutting guides/implants, surgical navigation, and intraoperative imaging. The authors will review the current skull base CAS literature and propose a computer aided surgical workflow categorizing these applications into 3 phases: 1) Virtual planning, 2) Surgical execution, 3) Intraoperative verification.
Affiliation(s)
- Steven G. Hoshal
- Department of Otolaryngology – Head and Neck Surgery, University of California, Davis, Sacramento, CA, USA
- Bernd Lethaus
- Department of Oral and Maxillofacial Surgery, Leipzig University, Leipzig, Germany
- E. Bradley Strong
- Department of Otolaryngology – Head and Neck Surgery, University of California, Davis, Sacramento, CA, USA
10. Chen JX, Yu S, Ding AS, Lee DJ, Welling DB, Carey JP, Gray ST, Creighton FX. Augmented Reality in Otology/Neurotology: A Scoping Review with Implications for Practice and Education. Laryngoscope 2023; 133:1786-1795. PMID: 36519414; PMCID: PMC10267287; DOI: 10.1002/lary.30515.
Abstract
OBJECTIVE: To determine how augmented reality (AR) has been applied to the field of otology/neurotology, examine trends and gaps in research, and provide an assessment of the future potential of this technology within surgical practice and education. DATA SOURCES: PubMed, EMBASE, and Cochrane Library were assessed from their inceptions through October 2022. A manual bibliography search was also conducted. REVIEW METHODS: A scoping review was conducted and reported according to PRISMA-ScR guidelines. Data from studies describing the application of AR to the field of otology/neurotology were evaluated according to a priori inclusion/exclusion criteria. Exclusion criteria included non-English language articles, abstracts, letters/commentaries, conference papers, and review articles. RESULTS: Eighteen articles covering a diverse range of AR platforms were included. Publication dates spanned from 2007 to 2022, and the rate of publication increased over this time. Six of 18 studies were case series in human patients, while the remainder were proofs of concept in cadaveric, artificial, or animal models. The most common application of AR was for surgical navigation (14 of 18 studies). Computed tomography was the most common source of input data. Few studies noted potential applications to surgical training. CONCLUSION: Interest in the application of AR to otology/neurotology is growing, based on the number of recent publications that use a broad range of hardware, software, and AR platforms. Large gaps in research, such as the need for submillimeter registration error, must be addressed prior to adoption in the operating room and for educational purposes. LEVEL OF EVIDENCE: N/A. Laryngoscope, 133:1786-1795, 2023.
Affiliation(s)
- Jenny X. Chen
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, MD
- Andy S. Ding
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, MD
- Daniel J. Lee
- Department of Otolaryngology–Head and Neck Surgery, Massachusetts Eye and Ear, Boston, MA
- Department of Otolaryngology–Head and Neck Surgery, Harvard Medical School, Boston, MA
- D. Brad Welling
- Department of Otolaryngology–Head and Neck Surgery, Massachusetts Eye and Ear, Boston, MA
- Department of Otolaryngology–Head and Neck Surgery, Harvard Medical School, Boston, MA
- John P. Carey
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, MD
- Stacey T. Gray
- Department of Otolaryngology–Head and Neck Surgery, Massachusetts Eye and Ear, Boston, MA
- Department of Otolaryngology–Head and Neck Surgery, Harvard Medical School, Boston, MA
- Francis X. Creighton
- Department of Otolaryngology–Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, MD
11. Gu W, Knopf J, Cast J, Higgins LD, Knopf D, Unberath M. Nail it! Vision-based drift correction for accurate mixed reality surgical guidance. Int J Comput Assist Radiol Surg 2023. PMID: 37231201; DOI: 10.1007/s11548-023-02950-x.
Abstract
PURPOSE: Mixed reality-guided surgery through head-mounted displays (HMDs) is gaining interest among surgeons. However, precise tracking of HMDs relative to the surgical environment is crucial for successful outcomes. Without fiducial markers, spatial tracking of the HMD suffers from millimeter- to centimeter-scale drift, resulting in misaligned visualization of registered overlays. Methods and workflows capable of automatically correcting for drift after patient registration are essential to assuring accurate execution of surgical plans. METHODS: We present a mixed reality surgical navigation workflow that continuously corrects for drift after patient registration using only image-based methods. We demonstrate its feasibility and capabilities using the Microsoft HoloLens on glenoid pin placement in total shoulder arthroplasty. A phantom study was conducted involving five users, each placing pins on six glenoids of different deformity, followed by a cadaver study by an attending surgeon. RESULTS: In both studies, all users were satisfied with the registration overlay before drilling the pin. Postoperative CT scans showed, on average, 1.5 mm error in entry point deviation and 2.4° error in pin orientation in the phantom study, and 2.5 mm and 1.5° in the cadaver study. A trained user takes around 90 s to complete the workflow. Our method also outperformed HoloLens native tracking in drift correction. CONCLUSION: Our findings suggest that image-based drift correction can provide mixed reality environments precisely aligned with patient anatomy, enabling pin placement with consistently high accuracy. These techniques constitute a next step toward purely image-based mixed reality surgical guidance, without requiring patient markers or external tracking hardware.
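The entry-point and orientation errors reported above can be computed from planned versus achieved pin poses with basic vector arithmetic; a minimal sketch with made-up coordinates follows.

```python
# Sketch: entry-point deviation (mm) and pin orientation error (degrees)
# between a planned and an achieved pin, matching the reported metrics.
# All coordinates below are invented for illustration.
import numpy as np

planned_entry = np.array([10.0, 22.5, -3.0])    # mm, in CT coordinates
achieved_entry = np.array([11.2, 23.1, -2.4])

planned_dir = np.array([0.0, 0.0, 1.0])         # unit vector along planned pin axis
achieved_dir = np.array([0.03, -0.02, 0.999])

entry_deviation = np.linalg.norm(achieved_entry - planned_entry)

cosang = np.dot(planned_dir, achieved_dir) / (
    np.linalg.norm(planned_dir) * np.linalg.norm(achieved_dir)
)
orientation_error_deg = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

print(f"entry-point deviation: {entry_deviation:.2f} mm")
print(f"orientation error: {orientation_error_deg:.2f} deg")
```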
Affiliation(s)
- Wenhao Gu
- Johns Hopkins University, Baltimore, MD, USA
- John Cast
- Johns Hopkins University, Baltimore, MD, USA
- David Knopf
- Arthrex Inc., 1 Arthrex Way, Naples, FL, USA
12. Bounajem MT, Cameron B, Sorensen K, Parr R, Gibby W, Prashant G, Evans JJ, Karsy M. Improved Accuracy and Lowered Learning Curve of Ventricular Targeting Using Augmented Reality-Phantom and Cadaveric Model Testing. Neurosurgery 2023; 92:884-891. PMID: 36562619; DOI: 10.1227/neu.0000000000002293.
Abstract
BACKGROUND: Augmented reality (AR) has demonstrated significant potential in neurosurgical cranial, spine, and teaching applications. External ventricular drain (EVD) placement remains a common procedure, but with error rates in targeting between 10% and 40%. OBJECTIVE: To evaluate the Novarad VisAR guidance system for the placement of EVDs in phantom and cadaveric models. METHODS: Two synthetic ventricular phantom models and a third cadaver model underwent computerized tomography imaging and registration with the VisAR system (Novarad). Root mean square (RMS) error, angular error (γ), and Euclidean distance were measured by multiple methods for various standard EVD placements. RESULTS: Computerized tomography measurements on a phantom model (0.5-mm targets) showed a mean Euclidean distance error of 1.20 ± 0.98 mm and γ of 1.25° ± 1.02°. Eight participants placed EVDs in lateral and occipital burr holes using VisAR in a second phantom anatomic ventricular model (mean RMS: 3.9 ± 1.8 mm, γ: 3.95° ± 1.78°). There were no statistically significant differences in accuracy for postgraduate year level, prior AR experience, prior EVD experience, or experience with video games (P > .05). In comparing EVDs placed with anatomic landmarks vs VisAR navigation in a cadaver, VisAR demonstrated significantly better RMS and γ (7.47 ± 0.94 mm and 7.12° ± 0.97°, respectively; P ≤ .05). CONCLUSION: The novel VisAR system resulted in accurate placement of EVDs with a rapid learning curve, which may improve clinical treatment and patient safety. Future applications of VisAR can be expanded to other cranial procedures.
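For reference, the three accuracy measures named in this abstract are conventionally defined as follows; this is a generic formulation, not necessarily the exact computation used in the study.

```latex
% For N placed vs. intended target points x_i, \hat{x}_i and catheter
% direction vectors d, \hat{d} (conventional definitions):
\[
  \mathrm{RMS} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\lVert x_i - \hat{x}_i\rVert_2^{2}},
  \qquad
  \gamma = \arccos\!\left(\frac{d\cdot\hat{d}}{\lVert d\rVert\,\lVert\hat{d}\rVert}\right),
  \qquad
  e_i = \lVert x_i - \hat{x}_i\rVert_2 .
\]
```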
Affiliation(s)
- Michael T Bounajem
- Department of Neurosurgery, Clinical Neurosciences Center, University of Utah, Salt Lake City, Utah, USA
- Wendell Gibby
- Novarad, Provo, Utah, USA
- Department of Radiology, University of California-San Diego, San Diego, California, USA
- Giyarpuram Prashant
- Department of Neurosurgery, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- James J Evans
- Department of Neurosurgery, Thomas Jefferson University Hospital, Philadelphia, Pennsylvania, USA
- Michael Karsy
- Department of Neurosurgery, Clinical Neurosciences Center, University of Utah, Salt Lake City, Utah, USA
13. Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. PMID: 36706637; DOI: 10.1016/j.media.2023.102757.
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016, until the year of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy including use case, technical methodology for registration and tracking, data sources, visualization as well as validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interactions, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, nevertheless only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, to pave the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
14. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. PMID: 36580681; DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. The types of AR visualization are divided into two categories, in situ visualization and non-in situ visualization, and the rendered content varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in different surgical fields. However, most AR applications were evaluated through model and animal experiments, and there are relatively few clinical experiments, indicating that current AR navigation methods are still at an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
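Among the registration families listed, point-based (paired-point) registration is compact enough to sketch in full; the snippet below shows the standard SVD (Kabsch) solution for the rigid image-to-patient transform from matched fiducials, with toy data and variable names of our own choosing.

```python
# Minimal paired-point rigid registration (Kabsch, no scaling):
# given matched fiducials in image space (src) and patient/tracker space (dst),
# estimate rotation R and translation t minimizing ||R @ src_i + t - dst_i||.
import numpy as np

def rigid_register(src, dst):
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy example: recover a known pose from four fiducials.
rng = np.random.default_rng(1)
src = rng.uniform(-50, 50, size=(4, 3))
angle = np.radians(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([5.0, -2.0, 10.0])
R, t = rigid_register(src, dst)
fre = np.linalg.norm(src @ R.T + t - dst, axis=1).mean()  # fiducial registration error
print(f"mean FRE on toy data: {fre:.2e} mm")
```

Surface-based methods typically iterate this same rigid fit over denser correspondences (as in ICP), while calibration-based methods add a calibrated camera or display model on top of it.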
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
15. Ravindra VM, Tadlock MD, Gurney JM, Kraus KL, Dengler BA, Gordon J, Cooke J, Porensky P, Belverud S, Milton JO, Cardoso M, Carroll CP, Tomlin J, Champagne R, Bell RS, Viers AG, Ikeda DS. Attitudes Toward Neurosurgery Education for the Nonneurosurgeon: A Survey Study and Critical Analysis of U.S. Military Training Techniques and Future Prospects. World Neurosurg 2022; 167:e1335-e1344. PMID: 36103986; DOI: 10.1016/j.wneu.2022.09.033.
Abstract
BACKGROUND: The U.S. military requires medical readiness to support forward-deployed combat operations. Because time and distance to neurosurgical capabilities vary within the deployed trauma system, nonneurosurgeons are required to perform emergent cranial procedures in select cases. It is unclear whether these surgeons have sufficient training in these procedures. METHODS: This quality-improvement study involved a voluntary, anonymized specialty-specific survey of active-duty surgeons about their experience and attitudes toward U.S. military emergency neurosurgical training. RESULTS: Survey responses were received from 104 general surgeons and 26 neurosurgeons. Among general surgeons, 81% have deployed and 53% received training in emergency neurosurgical procedures before deployment. Only 16% of general surgeons reported participating in craniotomy/craniectomy procedures in the last year. Nine general surgeons reported performing an emergency neurosurgical procedure while on deployment/humanitarian mission, and 87% of respondents expressed interest in further predeployment emergency neurosurgery training. Among neurosurgeons, 81% had participated in training nonneurosurgeons and 73% believe that more comprehensive training for nonneurosurgeons before deployment is needed. General surgeons proposed lower procedure minimums for competency for external ventricular drain placement and craniotomy/craniectomy than did neurosurgeons. Only 37% of general surgeons had used mixed/augmented reality in any capacity previously; for combat procedures, most (90%) would prefer using synchronous supervision via high-fidelity video teleconferencing over mixed reality. CONCLUSIONS: These survey results show a gap in readiness for neurosurgical procedures for forward-deployed general surgeons. Capitalizing on capabilities such as mixed/augmented reality would be a force multiplier and a potential means of improving neurosurgical capabilities in the forward-deployed environments.
Affiliation(s)
- Vijay M Ravindra
- Department of Neurosurgery, Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA; Department of Neurosurgery, University of California San Diego, San Diego, California, USA; Department of Neurosurgery, University of Utah, Salt Lake City, Utah, USA
- Matthew D Tadlock
- Department of Surgery, Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA; Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA; 1st Medical Battalion, 1st Marine Logistics Group, Camp Pendleton, California, USA
- Jennifer M Gurney
- U.S. Army Institute of Surgical Research, Joint Base San Antonio, San Antonio, Texas, USA
- Kristin L Kraus
- Department of Neurosurgery, University of Utah, Salt Lake City, Utah, USA
- Bradley A Dengler
- Department of Neurosurgery, Walter Reed National Military Medical Center, Bethesda, Maryland, USA
- Jennifer Gordon
- Department of Surgery, U.S. Naval Hospital Okinawa, Okinawa, Japan
- Jonathon Cooke
- Department of Neurosurgery, Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA
- Paul Porensky
- Department of Neurosurgery, Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA
- Shawn Belverud
- Department of Neurosurgery, Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA
- Jason O Milton
- Department of Neurosurgery, Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA
- Mario Cardoso
- Department of Brain and Spine Surgery, Naval Medical Center, Portsmouth, Virginia, USA
- Christopher P Carroll
- Department of Brain and Spine Surgery, Naval Medical Center, Portsmouth, Virginia, USA
- Jeffrey Tomlin
- Department of Brain and Spine Surgery, Naval Medical Center, Portsmouth, Virginia, USA
- Roland Champagne
- Bioskills Training Center, Naval Medical Readiness Training Command, San Diego, California, USA
- Randy S Bell
- Department of Neurosurgery, Walter Reed National Military Medical Center, Bethesda, Maryland, USA
- Angela G Viers
- Department of Surgery, U.S. Naval Hospital Okinawa, Okinawa, Japan
- Daniel S Ikeda
- Department of Neurosurgery, Walter Reed National Military Medical Center, Bethesda, Maryland, USA
16. Boaro A, Moscolo F, Feletti A, Polizzi G, Nunes S, Siddi F, Broekman M, Sala F. Visualization, navigation, augmentation. The ever-changing perspective of the neurosurgeon. Brain & Spine 2022; 2:100926. PMID: 36248169; PMCID: PMC9560703; DOI: 10.1016/j.bas.2022.100926.
Abstract
Introduction: The evolution of neurosurgery coincides with the evolution of visualization and navigation. Augmented reality technologies, with their ability to bring digital information into the real environment, have the potential to provide a new, revolutionary perspective to the neurosurgeon. Research question: To provide an overview on the historical and technical aspects of visualization and navigation in neurosurgery, and to provide a systematic review on augmented reality (AR) applications in neurosurgery. Material and methods: We provided an overview on the main historical milestones and technical features of visualization and navigation tools in neurosurgery. We systematically searched the PubMed and Scopus databases for AR applications in neurosurgery and specifically discussed their relationship with current visualization and navigation systems, as well as their main limitations. Results: The evolution of visualization in neurosurgery is embodied by four magnification systems: surgical loupes, the endoscope, the surgical microscope and, more recently, the exoscope, each presenting independent features in terms of magnification capabilities, eye-hand coordination and the possibility to implement additional functions. In regard to navigation, two independent systems have been developed: frame-based and frame-less systems. The most frequent application setting for AR is brain surgery (71.6%), specifically neuro-oncology (36.2%) and microscope-based (29.2%), even though in the majority of cases AR applications presented their own visualization supports (66%). Discussion and conclusions: The evolution of visualization and navigation in neurosurgery allowed for the development of more precise instruments; the development and clinical validation of AR applications have the potential to be the next breakthrough, making surgeries safer, as well as improving surgical experience and reducing costs.
Affiliation(s)
- A. Boaro
- Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- F. Moscolo
- Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- A. Feletti
- Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- G.M.V. Polizzi
- Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- S. Nunes
- Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
- F. Siddi
- Department of Neurosurgery, Haaglanden Medical Center, The Hague, Zuid-Holland, the Netherlands
- M.L.D. Broekman
- Department of Neurosurgery, Haaglanden Medical Center, The Hague, Zuid-Holland, the Netherlands
- Department of Neurosurgery, Leiden University Medical Center, Leiden, Zuid-Holland, the Netherlands
- F. Sala
- Section of Neurosurgery, Department of Neurosciences, Biomedicine and Movement Sciences, University of Verona, Italy
17. Ding AS, Lu A, Li Z, Galaiya D, Ishii M, Siewerdsen JH, Taylor RH, Creighton FX. Statistical Shape Model of the Temporal Bone Using Segmentation Propagation. Otol Neurotol 2022; 43:e679-e687. PMID: 35761465; PMCID: PMC10072910; DOI: 10.1097/mao.0000000000003554.
Abstract
HYPOTHESIS: Automated image registration techniques can successfully determine anatomical variation in human temporal bones with statistical shape modeling. BACKGROUND: There is a lack of knowledge about inter-patient anatomical variation in the temporal bone. Statistical shape models (SSMs) provide a powerful method for quantifying variation of anatomical structures in medical images but are time-intensive to manually develop. This study presents SSMs of temporal bone anatomy using automated image-registration techniques. METHODS: Fifty-three cone-beam temporal bone CTs were included for SSM generation. The malleus, incus, stapes, bony labyrinth, and facial nerve were automatically segmented using 3D Slicer and a template-based segmentation propagation technique. Segmentations were then used to construct SSMs using MATLAB. The first three principal components of each SSM were analyzed to describe shape variation. RESULTS: Principal component analysis of middle and inner ear structures revealed novel modes of anatomical variation. The first three principal components for the malleus represented variability in manubrium length (mean: 4.47 mm; ±2-SDs: 4.03-5.03 mm) and rotation about its long axis (±2-SDs: -1.6° to 1.8° posteriorly). The facial nerve exhibits variability in first and second genu angles. The bony labyrinth varies in the angle between the posterior and superior canals (mean: 88.9°; ±2-SDs: 83.7°-95.7°) and cochlear orientation (±2-SDs: -4.0° to 3.0° anterolaterally). CONCLUSIONS: SSMs of temporal bone anatomy can inform surgeons on clinically relevant inter-patient variability. Anatomical variation elucidated by these models can provide novel insight into function and pathophysiology. These models also allow further investigation of anatomical variation based on age, BMI, sex, and geographical location.
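The modeling step described here boils down to principal component analysis over corresponded shape vectors; the sketch below builds such a point-distribution model on synthetic data (array sizes and the random shapes are placeholders, not the paper's segmentations).

```python
# Sketch of a point-distribution statistical shape model: each shape is a
# vector of corresponded landmark coordinates; PCA of the centered data gives
# the principal modes, and new shapes are sampled as mean +/- 2 SD along a mode.
# The synthetic data below stand in for the segmented temporal-bone structures.
import numpy as np

rng = np.random.default_rng(0)
n_shapes, n_landmarks = 53, 400
shapes = rng.normal(size=(n_shapes, n_landmarks * 3))   # rows: flattened (x, y, z) landmarks

mean_shape = shapes.mean(axis=0)
X = shapes - mean_shape
U, S, Vt = np.linalg.svd(X, full_matrices=False)

modes = Vt                                   # principal modes (unit vectors)
stddevs = S / np.sqrt(n_shapes - 1)          # standard deviation captured by each mode
explained = stddevs**2 / np.sum(stddevs**2)

# Model instance: mean shape deformed by +2 SD along the first mode.
plus_two_sd = (mean_shape + 2.0 * stddevs[0] * modes[0]).reshape(n_landmarks, 3)
print(f"first three modes explain {explained[:3].sum():.1%} of variance")
```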
Affiliation(s)
- Andy S. Ding
- Department of Otolaryngology – Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Department of Biomedical Engineering, Johns Hopkins University Whiting School of Engineering, Baltimore, Maryland
- Alexander Lu
- Department of Otolaryngology – Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Department of Biomedical Engineering, Johns Hopkins University Whiting School of Engineering, Baltimore, Maryland
- Zhaoshuo Li
- Department of Computer Science, Johns Hopkins University Whiting School of Engineering, Baltimore, Maryland
- Deepa Galaiya
- Department of Otolaryngology – Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Masaru Ishii
- Department of Otolaryngology – Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Jeffrey H. Siewerdsen
- Department of Biomedical Engineering, Johns Hopkins University Whiting School of Engineering, Baltimore, Maryland
- Department of Computer Science, Johns Hopkins University Whiting School of Engineering, Baltimore, Maryland
- Russell H. Taylor
- Department of Computer Science, Johns Hopkins University Whiting School of Engineering, Baltimore, Maryland
- Francis X. Creighton
- Department of Otolaryngology – Head and Neck Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland
18. Microscope-Based Augmented Reality with Intraoperative Computed Tomography-Based Navigation for Resection of Skull Base Meningiomas in Consecutive Series of 39 Patients. Cancers (Basel) 2022; 14:2302. PMID: 35565431; PMCID: PMC9101634; DOI: 10.3390/cancers14092302.
Abstract
Background: The aim of surgery for skull base meningiomas is maximal resection with minimal damage to the involved cranial nerves and cerebral vessels; thus, implementation of technologies for improved orientation in the surgical field, such as neuronavigation and augmented reality (AR), is of interest. Methods: Included in the study were 39 consecutive patients (13 male, 26 female, mean age 64.08 ± 13.5 years) who underwent surgery for skull base meningiomas using microscope-based AR and automatic patient registration using intraoperative computed tomography (iCT). Results: Most common were olfactory meningiomas (6), cavernous sinus (6) and clinoidal (6) meningiomas, meningiomas of the medial (5) and lateral (5) sphenoid wing and meningiomas of the sphenoidal plane (5), followed by suprasellar (4), falcine (1) and middle fossa (1) meningiomas. There were 26 patients (66.6%) who underwent gross total resection (GTR) of the meningioma. Automatic registration applying iCT resulted in high accuracy (target registration error, 0.82 ± 0.37 mm). The effective radiation dose of the registration iCT scans was 0.58 ± 1.05 mSv. AR facilitated orientation in the resection of skull base meningiomas with encasement of cerebral vessels and compression of the optic chiasm, as well as in reoperations, increasing surgeon comfort. No injuries to critical neurovascular structures occurred. Out of 35 patients who lived to follow-up, 33 could ambulate at their last presentation. Conclusion: A microscope-based AR facilitates surgical orientation for resection of skull base meningiomas. Registration accuracy is very high using automatic registration with intraoperative imaging.
Collapse
|
19
|
Gu W, Shah K, Knopf J, Josewski C, Unberath M. A calibration-free workflow for image-based mixed reality navigation of total shoulder arthroplasty. COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING: IMAGING & VISUALIZATION 2022. [DOI: 10.1080/21681163.2021.2009378] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Affiliation(s)
- Wenhao Gu
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, Maryland, USA
| | - Kinjal Shah
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, Maryland, USA
| | | | | | - Mathias Unberath
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, Maryland, USA
| |
Collapse
|
20
|
Cote DJ, Ruzevick J, Strickland B, Donoho DA, Zada G. Commentary: Three-Dimensional Modeling for Augmented and Virtual Reality–Based Posterior Fossa Approach Selection Training: Technical Overview of Novel Open-Source Materials. Oper Neurosurg (Hagerstown) 2022; 22:e261. [DOI: 10.1227/ons.0000000000000236] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2022] [Accepted: 02/16/2022] [Indexed: 11/19/2022] Open
|
21
|
Steiert C, Behringer SP, Kraus LM, Bissolo M, Demerath T, Beck J, Grauvogel J, Reinacher PC. Augmented reality-assisted craniofacial reconstruction in skull base lesions - an innovative technique for single-step resection and cranioplasty in neurosurgery. Neurosurg Rev 2022; 45:2745-2755. [PMID: 35441994 PMCID: PMC9349131 DOI: 10.1007/s10143-022-01784-6] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2022] [Revised: 03/19/2022] [Accepted: 03/30/2022] [Indexed: 10/31/2022]
Abstract
Defects of the cranial vault often require cosmetic reconstruction with patient-specific implants, particularly in cases of craniofacial involvement. However, fabrication of such implants is time-consuming and expensive, so more rapidly available and more cost-effective alternatives are needed. The current study investigated the feasibility of an augmented reality (AR)-assisted single-step procedure for repairing bony defects involving the facial skeleton and the skull base. In an experimental setting, nine neurosurgeons fabricated AR-assisted and conventionally shaped ("freehand") implants from polymethylmethacrylate (PMMA) on a skull model with a craniofacial bony defect. Deviations of the surface profile from the original model were quantified by volumetry, and the cosmetic results were evaluated using a multicomponent scoring system, each by two blinded neurosurgeons. Handling the AR equipment proved comfortable. The median volume deviating from the surface profile of the original model was low for the AR-assisted implants (6.40 cm³) and significantly reduced compared with the conventionally shaped implants (13.48 cm³). The cosmetic appearance of the AR-assisted implants was rated as very good (median 25.00 of 30 points) and significantly better than that of the conventionally shaped implants (median 14.75 of 30 points). Our experiments demonstrated the strong potential of AR-assisted procedures for single-step reconstruction of craniofacial defects. Although patient-specific implants remain the esthetic gold standard, AR-assisted procedures hold high potential as an immediately and widely available, cost-effective alternative providing excellent cosmetic outcomes.
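The volumetric deviation used above to compare implant shapes can be illustrated, under the assumption of two co-registered binary segmentations, as the volume of their symmetric difference; the short Python sketch below is a hypothetical stand-in, not the study's volumetry software.

```python
import numpy as np

def deviation_volume_cm3(mask_a: np.ndarray, mask_b: np.ndarray, voxel_mm: tuple) -> float:
    """Volume of the symmetric difference between two aligned binary masks, in cm^3.

    mask_a, mask_b : boolean arrays of equal shape (e.g. implant vs. reference surface fill)
    voxel_mm       : voxel spacing in millimetres, e.g. (0.5, 0.5, 0.5)
    """
    diff = np.logical_xor(mask_a, mask_b)            # voxels covered by one mask but not the other
    voxel_volume_mm3 = float(np.prod(voxel_mm))
    return diff.sum() * voxel_volume_mm3 / 1000.0    # mm^3 -> cm^3

# Hypothetical toy masks: a sphere (reference contour) vs. a slightly shifted sphere (implant).
zz, yy, xx = np.mgrid[:80, :80, :80]
ref = (xx - 40)**2 + (yy - 40)**2 + (zz - 40)**2 < 30**2
imp = (xx - 42)**2 + (yy - 40)**2 + (zz - 40)**2 < 30**2
print(f"Deviating volume: {deviation_volume_cm3(ref, imp, (0.5, 0.5, 0.5)):.2f} cm^3")
```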
Collapse
Affiliation(s)
- Christine Steiert
- Department of Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany.
| | - Simon Phillipp Behringer
- Department of Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Luisa Mona Kraus
- Department of Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Marco Bissolo
- Department of Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Theo Demerath
- Department of Neuroradiology, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Juergen Beck
- Department of Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Juergen Grauvogel
- Department of Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Peter Christoph Reinacher
- Department of Stereotactic and Functional Neurosurgery, Medical Center - University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany.,Fraunhofer Institute for Laser Technology, Aachen, Germany
| |
Collapse
|
22
|
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. [PMID: 35168103 PMCID: PMC10466024 DOI: 10.1016/j.media.2022.102361] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2021] [Revised: 11/17/2021] [Accepted: 01/10/2022] [Indexed: 12/11/2022]
Abstract
This article presents a systematic review of optical see-through head-mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy, and human factors of human-computer interaction. In total, 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58), and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception, with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort, and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab, coupled with design that incorporates human-factors considerations to solve clear clinical problems, should ensure that the significant current research efforts succeed.
Collapse
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK.
| | - P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| | - Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
| |
Collapse
|
23
|
Augmented Reality Based Transmodiolar Cochlear Implantation. Otol Neurotol 2021; 43:190-198. [PMID: 34855687 DOI: 10.1097/mao.0000000000003437] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
HYPOTHESIS Transmodiolar auditory implantation via the middle ear cavity could be possible using an augmented reality system (ARS). BACKGROUND There is no clear landmark to indicate the cochlear apex or the modiolar axis. The ARS seems to be a promising tool for transmodiolar implantation because it combines information from preprocedure computed tomography (CT) images with the real-time video of the surgical field. METHODS Eight human temporal bone resin models were included (five adults and three children). The procedure started with identification of the modiolar axis on the preprocedure CT scan, followed by 3D reconstruction of the images. Information on modiolar location and navigational guidance was added to the reconstructed model, which was then registered with the surgical video using a point-based approach. Relative movements between the phantom and the microscope were tracked using image feature-based motion tracking. Based on the information provided via the ARS, the surgeon implanted the electrode array inside the modiolus after drilling the helicotrema. Postprocedure CT images were acquired to evaluate the registration error and the implantation accuracy. RESULTS The implantation could be conducted in all cases with a 2D registration error of 0.4 ± 0.24 mm. The mean entry point error was 0.6 ± 1.00 mm and the implant angular error 13.5 ± 8.93 degrees (n = 8), compatible with the procedure requirements. CONCLUSION We developed an image-based ARS to identify the extremities and the axis of the cochlear modiolus on intraprocedure videos. The system yielded submillimetric accuracy for implantation and remained stable throughout the experimental study.
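One of the reported accuracy metrics, the implant angular error, can be illustrated as the angle between the planned modiolar axis and the achieved implant axis extracted from postprocedure imaging; the sketch below is illustrative only, with hypothetical axis vectors rather than the authors' measurement pipeline.

```python
import numpy as np

def angular_error_deg(planned_axis: np.ndarray, achieved_axis: np.ndarray) -> float:
    """Angle (degrees) between a planned trajectory axis and the achieved implant axis."""
    a = planned_axis / np.linalg.norm(planned_axis)
    b = achieved_axis / np.linalg.norm(achieved_axis)
    cos_theta = np.clip(np.dot(a, b), -1.0, 1.0)   # clip to avoid NaN from rounding
    return float(np.degrees(np.arccos(cos_theta)))

# Hypothetical axes (apex minus base) derived from pre- and post-procedure CT landmarks.
planned = np.array([10.2, -4.1, 7.8])
achieved = np.array([9.6, -3.5, 8.4])
print(f"Implant angular error: {angular_error_deg(planned, achieved):.1f} degrees")
```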
Collapse
|
24
|
Sahovaler A, Chan HHL, Gualtieri T, Daly M, Ferrari M, Vannelli C, Eu D, Manojlovic-Kolarski M, Orzell S, Taboni S, de Almeida JR, Goldstein DP, Deganello A, Nicolai P, Gilbert RW, Irish JC. Augmented Reality and Intraoperative Navigation in Sinonasal Malignancies: A Preclinical Study. Front Oncol 2021; 11:723509. [PMID: 34790568 PMCID: PMC8591179 DOI: 10.3389/fonc.2021.723509] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/10/2021] [Accepted: 10/12/2021] [Indexed: 11/13/2022] Open
Abstract
Objective To report the first use of a novel projected augmented reality (AR) system in open sinonasal tumor resections in preclinical models and to compare the AR approach with an advanced intraoperative navigation (IN) system. Methods Four tumor models were created. Five head and neck surgeons participated in the study, performing virtual osteotomies. Unguided, AR, IN, and AR + IN simulations were performed, and statistical comparisons between approaches were obtained. Intratumoral cut rate was the main outcome. The groups were also compared in terms of the percentage of intratumoral, close, adequate, and excessive distances from the tumor. Data from a wearable gaze-tracking headset and NASA Task Load Index questionnaire results were also analyzed. Results A total of 335 cuts were simulated. Intratumoral cuts were observed in 20.7%, 9.4%, 1.2%, and 0% of the unguided, AR, IN, and AR + IN simulations, respectively (p < 0.0001). AR was superior to the unguided approach in univariate and multivariate models. The percentage of time spent looking at the screen during the procedures was 55.5% for the unguided approach and 0%, 78.5%, and 61.8% for AR, IN, and AR + IN, respectively (p < 0.001). The combined approach significantly reduced screen time compared with the IN procedure alone. Conclusion We reported the use of a novel AR system for oncological resections in open sinonasal approaches, with improved margin delineation compared with unguided techniques. AR mitigated the gaze-toggling drawback of IN. Further refinements of the AR system are needed before translating our experience to clinical practice.
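The comparison of intratumoral cut rates across the four guidance conditions could, for example, be tested with a chi-square test on a contingency table, as sketched below; the per-group counts are hypothetical reconstructions for illustration (the abstract reports only percentages), not the authors' actual data or analysis.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of intratumoral vs. non-intratumoral cuts per guidance condition,
# chosen to total 335 cuts and roughly match the reported percentages.
table = np.array([
    [17, 65],   # unguided  (~20.7% intratumoral)
    [8, 77],    # AR        (~9.4%)
    [1, 83],    # IN        (~1.2%)
    [0, 84],    # AR + IN   (0%)
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.2e}")
```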
Collapse
Affiliation(s)
- Axel Sahovaler
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Harley H L Chan
- Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Tommaso Gualtieri
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada.,Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST Spedali Civili di Brescia, Brescia, Italy
| | - Michael Daly
- Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Marco Ferrari
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada.,Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST Spedali Civili di Brescia, Brescia, Italy.,Section of Otorhinolaryngology-Head and Neck Surgery, University of Padua-Azienda Ospedaliera di Padova, Padua, Italy
| | - Claire Vannelli
- Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Donovan Eu
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| | - Mirko Manojlovic-Kolarski
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Susannah Orzell
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Stefano Taboni
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada.,Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST Spedali Civili di Brescia, Brescia, Italy.,Section of Otorhinolaryngology-Head and Neck Surgery, University of Padua-Azienda Ospedaliera di Padova, Padua, Italy
| | - John R de Almeida
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - David P Goldstein
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Alberto Deganello
- Unit of Otorhinolaryngology-Head and Neck Surgery, University of Brescia-ASST Spedali Civili di Brescia, Brescia, Italy
| | - Piero Nicolai
- Section of Otorhinolaryngology-Head and Neck Surgery, University of Padua-Azienda Ospedaliera di Padova, Padua, Italy
| | - Ralph W Gilbert
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada
| | - Jonathan C Irish
- Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University Health Network, Toronto, ON, Canada.,Guided Therapeutics (GTx) Program, Techna Institute, University Health Network, Toronto, ON, Canada
| |
Collapse
|
25
|
Patient-specific virtual and mixed reality for immersive, experiential anatomy education and for surgical planning in temporal bone surgery. Auris Nasus Larynx 2021; 48:1081-1091. [PMID: 34059399 DOI: 10.1016/j.anl.2021.03.009] [Citation(s) in RCA: 18] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2020] [Revised: 02/13/2021] [Accepted: 03/16/2021] [Indexed: 11/20/2022]
Abstract
OBJECTIVE The recent development of extended reality technology has attracted interest in medicine. We explored the use of patient-specific virtual reality (VR) and mixed reality (MR) temporal bone models for anatomical teaching, preoperative surgical planning, and intraoperative surgical reference. METHODS VR and MR temporal bone models were created and visualized on a head-mounted display (HMD) and an MR headset, respectively, using a novel webservice that allows users to convert computed tomography images to VR and MR images without specific programming knowledge. Eleven otorhinolaryngology trainees and specialists were asked to manipulate the healthy VR temporal bone model and to assess its validity by filling out a questionnaire. Additionally, VR and MR pathological models of a petrous apex cholesteatoma were used for preoperative surgical planning and for referencing the anatomy during surgery. RESULTS Most participants viewed the VR model favorably and considered the HMD superior to a flat computer screen. 91% of the participants agreed or somewhat agreed that VR through an HMD is cost-effective. In addition, the VR pathological model was used for planning and sharing the surgical approach during a preoperative surgical conference. The MR headset was worn intraoperatively to clarify the relationship between the pathological lesion and vital anatomical structures. CONCLUSION Regardless of the participants' training level in otorhinolaryngology or VR experience, all participants agreed that the VR temporal bone model is useful for anatomical education. Furthermore, the creation of patient-specific VR and MR models using the webservice, and their pre- and intraoperative use, indicates the potential of this innovative adjunctive surgical instrument.
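The core of a CT-to-VR/MR model conversion service of the kind described above is typically threshold-based segmentation followed by iso-surface extraction; the sketch below, using scikit-image's marching cubes on a synthetic volume, is a minimal hypothetical illustration and not the authors' webservice.

```python
import numpy as np
from skimage import measure

def ct_to_mesh(volume_hu: np.ndarray, spacing_mm: tuple, threshold_hu: float = 300.0):
    """Extract a bone iso-surface mesh (vertices, faces) from a CT volume in Hounsfield units.

    A rough stand-in for the kind of CT-to-3D-model conversion described above; real
    pipelines add smoothing, decimation, and export (e.g. to OBJ/glTF) for the headset.
    """
    verts, faces, normals, values = measure.marching_cubes(
        volume_hu, level=threshold_hu, spacing=spacing_mm)
    return verts, faces

# Hypothetical CT volume: a hollow "bony" shell embedded in soft-tissue-like values.
zz, yy, xx = np.mgrid[:64, :64, :64]
r = np.sqrt((xx - 32)**2 + (yy - 32)**2 + (zz - 32)**2)
volume = np.where((r > 20) & (r < 25), 700.0, 40.0)   # ~bone HU vs. ~soft-tissue HU
verts, faces = ct_to_mesh(volume, spacing_mm=(0.6, 0.6, 0.6))
print(f"{len(verts)} vertices, {len(faces)} triangles")
```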
Collapse
|
26
|
Gu W, Shah K, Knopf J, Navab N, Unberath M. Feasibility of image-based augmented reality guidance of total shoulder arthroplasty using microsoft HoloLens 1. COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING: IMAGING & VISUALIZATION 2021. [DOI: 10.1080/21681163.2020.1835556] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/09/2023]
Affiliation(s)
- Wenhao Gu
- Johns Hopkins University, Baltimore, USA
| | | | | | | | | |
Collapse
|
27
|
Augmented Reality, Virtual Reality and Artificial Intelligence in Orthopedic Surgery: A Systematic Review. APPLIED SCIENCES-BASEL 2021. [DOI: 10.3390/app11073253] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
Abstract
Background: The application of virtual and augmented reality technologies to orthopaedic surgery training and practice aims to increase the safety and accuracy of procedures and to reduce complications and costs. The purpose of this systematic review is to summarise the present literature on this topic while providing a detailed analysis of current flaws and benefits. Methods: A comprehensive search of the PubMed, Cochrane, CINAHL, and Embase databases was conducted from inception to February 2021. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were used to improve the reporting of the review. The Cochrane Risk of Bias Tool and the Methodological Index for Non-Randomized Studies (MINORS) were used to assess the quality and potential bias of the included randomized and non-randomized controlled trials, respectively. Results: Virtual reality has proven revolutionary for both resident training and preoperative planning. Thanks to augmented reality, orthopaedic surgeons can carry out procedures faster and more accurately, improving overall safety. Artificial intelligence (AI) is a promising technology with vast potential, but its use in orthopaedic surgery is currently limited to preoperative diagnosis. Conclusions: Extended reality technologies have the potential to reform orthopaedic training and practice, providing an opportunity for growth towards a patient-centred approach.
Collapse
|
28
|
Liu PR, Lu L, Zhang JY, Huo TT, Liu SX, Ye ZW. Application of Artificial Intelligence in Medicine: An Overview. Curr Med Sci 2021; 41:1105-1115. [PMID: 34874486 PMCID: PMC8648557 DOI: 10.1007/s11596-021-2474-3] [Citation(s) in RCA: 75] [Impact Index Per Article: 18.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/04/2020] [Accepted: 12/01/2020] [Indexed: 02/06/2023]
Abstract
Artificial intelligence (AI) is a new technical discipline that uses computer technology to research and develop theories, methods, techniques, and application systems for simulating, extending, and expanding human intelligence. With the assistance of new AI technology, the traditional medical environment has changed considerably. For example, diagnosis based on radiological, pathological, endoscopic, ultrasonographic, and biochemical examinations has become more accurate while requiring less human workload. Medical treatment throughout the perioperative period, including preoperative preparation, the surgical period, and postoperative recovery, has been significantly enhanced, with better surgical outcomes. In addition, AI technology has played a crucial role in drug production, medical management, and medical education, taking them in new directions. The purpose of this review is to introduce the application of AI in medicine and to provide an outlook on future trends.
Collapse
Affiliation(s)
- Peng-ran Liu
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
| | - Lin Lu
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
| | - Jia-yao Zhang
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
| | - Tong-tong Huo
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
| | - Song-xiang Liu
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
| | - Zhe-wei Ye
- Department of Orthopedics, Union Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, 430022 China
| |
Collapse
|
29
|
Timonen T, Iso-Mustajärvi M, Linder P, Lehtimäki A, Löppönen H, Elomaa AP, Dietz A. Virtual reality improves the accuracy of simulated preoperative planning in temporal bones: a feasibility and validation study. Eur Arch Otorhinolaryngol 2020; 278:2795-2806. [PMID: 32964264 PMCID: PMC8266780 DOI: 10.1007/s00405-020-06360-6] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2020] [Accepted: 09/08/2020] [Indexed: 11/26/2022]
Abstract
PURPOSE Consumer-grade virtual reality (VR) has recently enabled various medical applications, but more evidence supporting their validity is needed. We investigated the accuracy of simulated surgical planning of temporal bones in a VR environment and compared it with conventional cross-sectional image viewing in a picture archiving and communication system (PACS) interface. METHODS Five experienced otologic surgeons measured significant anatomic structures and fiducials on five fresh-frozen cadaveric temporal bones in VR and in cross-sectional viewing. Primary image data were acquired by computed tomography. In total, 275 anatomical landmark measurements and 250 measurements of the distances between fiducials were obtained with both methods. Distance measurements between the fiducials were confirmed by physical measurement with a Vernier caliper. The experts evaluated the subjective validity of both methods in a 5-point Likert-scale qualitative survey. RESULTS A strong correlation based on the intraclass correlation coefficient was found between the methods for both the anatomical (r > 0.900) and fiducial measurements (r > 0.916). Two-tailed paired t-tests and Bland-Altman plots demonstrated high equivalence between VR and cross-sectional viewing, with mean differences of 1.9% (p = 0.396) and 0.472 mm (p = 0.065) for anatomical and fiducial measurements, respectively. Gross measurement errors due to misidentification of fiducials occurred more frequently in cross-sectional viewing. The mean face and content validity ratings for VR were significantly better than those for cross-sectional viewing (total mean score 4.11 vs 3.39, p < 0.001). CONCLUSION Our study supports the good accuracy and reliability of a VR environment for simulated surgical planning of temporal bones compared with conventional cross-sectional visualization.
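The agreement analysis described above (a paired t-test plus Bland-Altman limits of agreement between VR and cross-sectional measurements) can be sketched as follows; the paired measurements are simulated and purely hypothetical.

```python
import numpy as np
from scipy import stats

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Mean difference (bias) and 95% limits of agreement between two measurement methods."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired distance measurements (mm): VR vs. cross-sectional (PACS) viewing.
rng = np.random.default_rng(1)
truth = rng.uniform(5, 40, 50)
vr = truth + rng.normal(0.2, 0.6, 50)
pacs = truth + rng.normal(-0.3, 0.8, 50)

t_stat, p_value = stats.ttest_rel(vr, pacs)
bias, lo, hi = bland_altman(vr, pacs)
print(f"paired t-test: t={t_stat:.2f}, p={p_value:.3f}")
print(f"Bland-Altman bias={bias:.2f} mm, LoA=[{lo:.2f}, {hi:.2f}] mm")
```

Bland-Altman analysis complements the t-test by showing not just whether the mean difference is near zero but how widely individual VR and PACS measurements of the same distance can disagree.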
Collapse
Affiliation(s)
- Tomi Timonen
- Department of Otorhinolaryngology, Kuopio University Hospital, Puijonlaaksontie 2, PL 100, 70210, Kuopio, Finland.
- School of Medicine, Institute of Clinical Medicine, University of Eastern Finland, Kuopio, Finland.
| | - Matti Iso-Mustajärvi
- Department of Otorhinolaryngology, Kuopio University Hospital, Puijonlaaksontie 2, PL 100, 70210, Kuopio, Finland
- Microsurgery Centre of Eastern Finland, Kuopio, Finland
| | - Pia Linder
- Department of Otorhinolaryngology, Kuopio University Hospital, Puijonlaaksontie 2, PL 100, 70210, Kuopio, Finland
| | - Antti Lehtimäki
- Department of Radiology, Kuopio University Hospital, Kuopio, Finland
| | - Heikki Löppönen
- Department of Otorhinolaryngology, Kuopio University Hospital, Puijonlaaksontie 2, PL 100, 70210, Kuopio, Finland
- School of Medicine, Institute of Clinical Medicine, University of Eastern Finland, Kuopio, Finland
| | | | - Aarno Dietz
- Department of Otorhinolaryngology, Kuopio University Hospital, Puijonlaaksontie 2, PL 100, 70210, Kuopio, Finland
- School of Medicine, Institute of Clinical Medicine, University of Eastern Finland, Kuopio, Finland
| |
Collapse
|