1
Wang C, Guo L, Zhu J, Zhu L, Li C, Zhu H, Song A, Lu L, Teng GJ, Navab N, Jiang Z. Review of robotic systems for thoracoabdominal puncture interventional surgery. APL Bioeng 2024; 8:021501. [PMID: 38572313; PMCID: PMC10987197; DOI: 10.1063/5.0180494]
Abstract
Cancer, with its high morbidity and mortality, is one of the major burdens threatening human health globally. Interventional procedures via percutaneous puncture have been widely adopted by physicians because of their minimally invasive approach. However, traditional manual puncture intervention depends on personal experience and faces challenges in terms of puncture precision, learning curve, safety, and efficacy. The development of puncture interventional surgery robot (PISR) systems could alleviate these problems to a certain extent. This paper reviews the current status and prospects of PISR systems for thoracic and abdominal applications. The key technologies involved, including spatial registration, positioning navigation, puncture guidance feedback, respiratory motion compensation, and motion control, are discussed in detail.
Affiliation(s)
- Cheng Wang
- Hanglok-Tech Co. Ltd., Hengqin 519000, People's Republic of China
- Li Guo
- Hanglok-Tech Co. Ltd., Hengqin 519000, People's Republic of China
- Lifeng Zhu
- State Key Laboratory of Digital Medical Engineering, Jiangsu Key Lab of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, People's Republic of China
- Chichi Li
- School of Computer Science and Engineering, Macau University of Science and Technology, Macau, 999078, People's Republic of China
- Haidong Zhu
- Center of Interventional Radiology and Vascular Surgery, Department of Radiology, Zhongda Hospital, Medical School, Southeast University, Nanjing 210009, People's Republic of China
- Aiguo Song
- State Key Laboratory of Digital Medical Engineering, Jiangsu Key Lab of Remote Measurement and Control, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, People's Republic of China
- Gao-Jun Teng
- Center of Interventional Radiology and Vascular Surgery, Department of Radiology, Zhongda Hospital, Medical School, Southeast University, Nanjing 210009, People's Republic of China
- Zhongliang Jiang
- Computer Aided Medical Procedures, Technical University of Munich, Munich 80333, Germany
2
Díez-Montiel A, Pose-Díez-de-la-Lastra A, González-Álvarez A, Salmerón JI, Pascau J, Ochandiano S. Tablet-based augmented reality and 3D printed templates in fully guided microtia reconstruction: a clinical workflow. 3D Print Med 2024; 10:17. [PMID: 38819536; PMCID: PMC11140883; DOI: 10.1186/s41205-024-00213-2]
Abstract
BACKGROUND Microtia is a congenital malformation of the auricle that affects approximately 4 of every 10,000 live newborns. Radiographic film paper is traditionally employed to trace the structures of the contralateral healthy ear in two dimensions, in a quasi-artistic manner. Anatomical points provide linear and angular measurements. However, this technique is time-consuming, subjective, and greatly dependent on surgeon expertise, making it susceptible to shape errors and misplacement. METHODS We present an innovative clinical workflow that combines 3D printing and augmented reality (AR) to increase the objectivity and reproducibility of these procedures. Specifically, we introduce patient-specific 3D cutting templates and remodeling molds to carve and construct the cartilaginous framework that will form the new ear. Moreover, we developed an in-house AR application compatible with any commercial Android tablet. It precisely guides the positioning of the new ear during surgery, ensuring symmetrical alignment with the healthy one and avoiding time-consuming intraoperative linear or angular measurements. Our solution was evaluated in one case, first in controlled experiments in a simulation scenario and finally during surgery. RESULTS Overall, the ears placed in the simulation scenario had a mean absolute deviation of 2.2 ± 1.7 mm with respect to the reference plan. During the surgical intervention, the reconstructed ear was 3.1 mm longer and 1.3 mm wider than the ideal plan and had a positioning error of 2.7 ± 2.4 mm relative to the contralateral side. Note that in this case, additional morphometric variations were induced by inflammation and other issues intended to be addressed in a subsequent stage of surgery, which are independent of our proposed solution. CONCLUSIONS In this work we propose an innovative workflow that combines 3D printing and AR to improve ear reconstruction and positioning in microtia correction procedures. Our implementation in the surgical workflow showed good accuracy, empowering surgeons to attain consistent and objective outcomes.
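The tablet-based guidance described here boils down to a simple geometric step: the healthy ear's landmarks are reflected across the midsagittal plane to obtain the symmetric target position for the reconstructed ear. A minimal sketch of that reflection (the plane, coordinates, and function name are hypothetical, not the authors' implementation):

```python
import numpy as np

def mirror_across_plane(points, plane_point, plane_normal):
    """Reflect 3D landmarks across a plane given by a point and a normal."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    p = np.asarray(points, dtype=float) - np.asarray(plane_point, dtype=float)
    # Reflection: subtract twice the component along the plane normal.
    return np.asarray(plane_point, dtype=float) + p - 2.0 * np.outer(p @ n, n)

# Hypothetical healthy-ear landmarks (mm) with the midsagittal plane at x = 0:
healthy = np.array([[60.0, 10.0, 5.0], [62.0, -15.0, 8.0]])
target = mirror_across_plane(healthy, plane_point=[0.0, 0.0, 0.0],
                             plane_normal=[1.0, 0.0, 0.0])
# target flips the x coordinate: [[-60, 10, 5], [-62, -15, 8]]
```

Reflecting twice recovers the original points, which makes for a cheap self-check of the registration chain.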
Affiliation(s)
- Alberto Díez-Montiel
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, 28007, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, 28007, Spain
- Alicia Pose-Díez-de-la-Lastra
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, 28007, Spain
- Departamento de Bioingeniería, Universidad Carlos III de Madrid, Leganés, 28911, Spain
- Alba González-Álvarez
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, 28007, Spain
- Departamento de Bioingeniería, Universidad Carlos III de Madrid, Leganés, 28911, Spain
- José I Salmerón
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, 28007, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, 28007, Spain
- Javier Pascau
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, 28007, Spain
- Departamento de Bioingeniería, Universidad Carlos III de Madrid, Leganés, 28911, Spain
- Santiago Ochandiano
- Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, 28007, Spain
- Servicio de Cirugía Oral y Maxilofacial, Hospital General Universitario Gregorio Marañón, Madrid, 28007, Spain
3
Tang ZN, Hu LH, Yu Y, Zhang WB, Peng X. Mixed Reality Combined with Surgical Navigation in Resection of Micro- and Mini-Tumors of the Parotid Gland: A Pilot Study. Laryngoscope 2024; 134:1670-1678. [PMID: 37819631; DOI: 10.1002/lary.31104]
Abstract
OBJECTIVE This study aimed to evaluate the feasibility and outcomes of mixed reality combined with surgical navigation (MRSN) in the resection of parotid micro- and mini-tumors. METHODS Eighteen patients who underwent parotid tumor resection between December 2020 and November 2022 were included. Six patients were enrolled in the MRSN group, in which surgeons performed the surgery with the help of MRSN technology. The surgical procedure included virtual planning, data transfer between mixed reality and surgical navigation, and tumor localization and resection assisted by surgical navigation in a mixed reality environment. Twelve patients were enrolled in the control group, in which intraoperative tumor localization and resection were performed according to the surgeon's experience. Total surgery time and intraoperative bleeding were recorded, and perioperative complications were recorded during follow-up. RESULTS The mean surgery time of the MRSN group (76.7 ± 14.0 min) and the control group (65.4 ± 21.3 min) showed no significant difference (p = 0.220), nor did the intraoperative bleeding of the MRSN group (16.0 ± 8.0 mL) and the control group (16.7 ± 6.6 mL) (p = 0.825). No patient in the MRSN group experienced any complication, whereas one patient in the control group suffered temporary facial paralysis. The mean deviation between the virtually marked and the intraoperative actual outermost point of the tumor was 3.03 ± 0.83 mm. CONCLUSION MRSN technology enables real-time three-dimensional visualization of the tumor and has the potential to enhance the safety and accuracy of resection of micro- and mini-tumors of the parotid gland. LEVEL OF EVIDENCE 4 Laryngoscope, 134:1670-1678, 2024.
Affiliation(s)
- Zu-Nan Tang
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China
- Lei-Hao Hu
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China
- Yao Yu
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China
- Wen-Bo Zhang
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China
- Xin Peng
- Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China
4
Qi Z, Jin H, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. The Feasibility and Accuracy of Holographic Navigation with Laser Crosshair Simulator Registration on a Mixed-Reality Display. Sensors (Basel) 2024; 24:896. [PMID: 38339612; PMCID: PMC10857152; DOI: 10.3390/s24030896]
Abstract
Addressing conventional neurosurgical navigation systems' high costs and complexity, this study explores the feasibility and accuracy of a simplified, cost-effective mixed reality navigation (MRN) system based on a laser crosshair simulator (LCS). A new automatic registration method was developed, featuring coplanar laser emitters and a recognizable target pattern. The workflow was integrated into Microsoft's HoloLens-2 for practical application. The study assessed the system's precision by utilizing life-sized 3D-printed head phantoms based on computed tomography (CT) or magnetic resonance imaging (MRI) data from 19 patients (female/male: 7/12, average age: 54.4 ± 18.5 years) with intracranial lesions. Six to seven CT/MRI-visible scalp markers were used as reference points per case. The LCS-MRN's accuracy was evaluated through landmark-based and lesion-based analyses, using metrics such as target registration error (TRE) and Dice similarity coefficient (DSC). The system demonstrated immersive capabilities for observing intracranial structures across all cases. Analysis of 124 landmarks showed a TRE of 3.0 ± 0.5 mm, consistent across various surgical positions. The DSC of 0.83 ± 0.12 correlated significantly with lesion volume (Spearman rho = 0.813, p < 0.001). Therefore, the LCS-MRN system is a viable tool for neurosurgical planning, highlighting its low user dependency, cost-efficiency, and accuracy, with prospects for future clinical application enhancements.
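The two accuracy metrics reported here, target registration error (TRE) and the Dice similarity coefficient (DSC), are straightforward to compute once corresponding landmarks and binary lesion masks are available. A minimal sketch with made-up numbers (not the study's data; function names are ours):

```python
import numpy as np

def target_registration_error(registered, reference):
    """Per-landmark Euclidean distance (mm) between registered and true points."""
    diff = np.asarray(registered, dtype=float) - np.asarray(reference, dtype=float)
    return np.linalg.norm(diff, axis=1)

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks (1.0 = identical)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / total if total else 1.0

# Illustrative landmark pairs (mm) and flattened lesion masks:
tre = target_registration_error([[0, 0, 0], [10, 0, 0]], [[0, 3, 4], [10, 0, 2]])
print(tre)        # [5. 2.]  -> mean TRE 3.5 mm
dsc = dice_coefficient([1, 1, 1, 0], [1, 1, 0, 0])
print(dsc)        # 0.8
```

Reporting the per-landmark distribution rather than only the mean makes results comparable across surgical positions, as done in the study.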
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Haitao Jin
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- NCO School, Army Medical University, Shijiazhuang 050081, China
- Qun Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Ruochu Xiong
- Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, Kanazawa 920-8641, Japan
- Shiyu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Minghang Liu
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xinyu Ding
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Medical School of Chinese PLA, Beijing 100853, China
- Xiaolei Chen
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Jiashu Zhang
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
5
Kos TM, Colombo E, Bartels LW, Robe PA, van Doormaal TPC. Evaluation Metrics for Augmented Reality in Neurosurgical Preoperative Planning, Surgical Navigation, and Surgical Treatment Guidance: A Systematic Review. Oper Neurosurg (Hagerstown) 2023; 26:01787389-990000000-01007. [PMID: 38146941; PMCID: PMC11008635; DOI: 10.1227/ons.0000000000001009]
Abstract
BACKGROUND AND OBJECTIVE Recent years have seen an advancement in the development of augmented reality (AR) technologies for preoperative visualization, surgical navigation, and intraoperative guidance in neurosurgery. However, proving added value for AR in clinical practice is challenging, partly because of a lack of standardized evaluation metrics. We performed a systematic review to provide an overview of the reported evaluation metrics for AR technologies in neurosurgical practice and to establish a foundation for the assessment and comparison of such technologies. METHODS PubMed, Embase, and Cochrane were searched systematically for publications on the assessment of AR for cranial neurosurgery on September 22, 2022. The findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS The systematic search yielded 830 publications; 114 were screened in full text, and 80 were included for analysis. Among the included studies, 5% dealt with preoperative visualization using AR, with user perception as the most frequently reported metric. The majority (75%) researched AR technology for surgical navigation, with registration accuracy, clinical outcome, and time measurements as the most frequently reported metrics. In addition, 20% studied the use of AR for intraoperative guidance, with registration accuracy, task outcome, and user perception as the most frequently reported metrics. CONCLUSION For quality benchmarking of AR technologies in neurosurgery, evaluation metrics should be specific to the risk profile and clinical objectives of the technology. A key focus should be on using validated questionnaires to assess user perception; ensuring clear and unambiguous reporting of registration accuracy, precision, robustness, and system stability; and accurately measuring task performance in clinical studies. We provide an overview suggesting which evaluation metrics to use per AR application and innovation phase, aiming to improve the assessment of the added value of AR for neurosurgical practice and to facilitate its integration into the clinical workflow.
Affiliation(s)
- Tessa M. Kos
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Elisa Colombo
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- L. Wilbert Bartels
- Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Pierre A. Robe
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P. C. van Doormaal
- Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
6
Schneider M, Kunz C, Wirtz CR, Mathis-Ullrich F, Pala A, Hlavac M. Augmented Reality-Assisted versus Freehand Ventriculostomy in a Head Model. J Neurol Surg A Cent Eur Neurosurg 2023; 84:562-569. [PMID: 37402395; DOI: 10.1055/s-0042-1759827]
Abstract
BACKGROUND Ventriculostomy (VST) is a frequent neurosurgical procedure. Freehand catheter placement represents current standard practice; however, multiple attempts are often required. We present augmented reality (AR) headset-guided VST with in-house developed head models. We conducted a proof-of-concept study in which we tested both AR-guided and freehand VST, with repeated AR punctures to investigate whether a learning curve can be derived. METHODS Five custom-made 3D-printed head models, each holding an anatomically different ventricular system, were filled with agarose gel. Eleven surgeons placed two AR-guided and two freehand ventricular drains per head. A subgroup of four surgeons each performed a total of three series of AR-guided punctures to test for a learning curve. A Microsoft HoloLens served as the hardware platform; the marker-based tracking did not require rigid head fixation. Catheter tip position was evaluated on computed tomography scans. RESULTS Marker tracking, image segmentation, and holographic display worked satisfactorily. In freehand VST, a success rate of 72.7% was achieved, which was higher than under AR guidance (68.2%; difference not statistically significant). Repeated AR-guided punctures increased the success rate from 65 to 95%, suggesting a steep learning curve. Overall user experience showed positive feedback. CONCLUSIONS We achieved promising results that encourage continued development and technical improvement. However, several more developmental steps have to be taken before application in humans can be considered. In the future, AR headset-based holograms have the potential to serve as a compact navigational aid inside and outside the operating room.
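The non-significance claim for the two success rates can be checked with a Fisher's exact test on the 2 × 2 success table. The counts below are inferred from the reported rates (11 surgeons × 2 drains = 22 attempts per arm, so 72.7% ≈ 16/22 and 68.2% ≈ 15/22) and are illustrative only; this is a stdlib sketch, not the authors' analysis:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def p_table(k):  # hypergeometric probability that the top-left cell equals k
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    # Sum probabilities of all tables at most as likely as the observed one.
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs * (1 + 1e-9))

# Inferred counts: freehand 16/22 successes vs. AR-guided 15/22.
p = fisher_exact_two_sided(16, 6, 15, 7)
print(round(p, 3))  # well above 0.05: no significant difference
```

With samples this small, even the observed 4.5-point gap in success rate carries essentially no evidence of a real difference, which matches the abstract's statement.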
Affiliation(s)
- Max Schneider
- Department of Neurosurgery, University Hospital Ulm, Ulm, Germany
- Christian Kunz
- Institute for Anthropomatics and Robotics - Health Robotics and Automation (HERA), KIT, Karlsruhe, Germany
- Franziska Mathis-Ullrich
- Institute for Anthropomatics and Robotics - Health Robotics and Automation (HERA), KIT, Karlsruhe, Germany
- Andrej Pala
- Department of Neurosurgery, University Hospital Ulm, Ulm, Germany
- Michal Hlavac
- Department of Neurosurgery, University Hospital Ulm, Ulm, Germany
7
Hey G, Guyot M, Carter A, Lucke-Wold B. Augmented Reality in Neurosurgery: A New Paradigm for Training. Medicina (Kaunas) 2023; 59:1721. [PMID: 37893439; PMCID: PMC10608758; DOI: 10.3390/medicina59101721]
Abstract
Augmented reality (AR) involves the overlay of computer-generated images onto the user's real-world visual field to modify or enhance the user's visual experience. With respect to neurosurgery, AR integrates preoperative and intraoperative imaging data to create an enriched surgical experience that has been shown to improve surgical planning, refine neuronavigation, and reduce operation time. In addition, AR has the potential to serve as a valuable training tool for neurosurgeons in a way that minimizes patient risk while facilitating comprehensive training opportunities. The increased use of AR in neurosurgery over the past decade has led to innovative research endeavors aiming to develop novel, more efficient AR systems while also improving and refining present ones. In this review, we provide a concise overview of AR, detail current and emerging uses of AR in neurosurgery and neurosurgical training, discuss the limitations of AR, and provide future research directions. Following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), 386 articles were initially identified. Two independent reviewers (GH and AC) assessed article eligibility for inclusion, and 31 articles are included in this review. The literature search included original (retrospective and prospective) articles and case reports published in English between 2013 and 2023. AR assistance has shown promise within neuro-oncology, spinal neurosurgery, neurovascular surgery, skull-base surgery, and pediatric neurosurgery. Intraoperative use of AR was found to primarily assist with surgical planning and neuronavigation. Similarly, AR assistance for neurosurgical training focused primarily on surgical planning and neuronavigation. However, studies included in this review utilize small sample sizes and remain largely in the preliminary phase. Thus, future research must be conducted to further refine AR systems before widespread intraoperative and educational use.
Affiliation(s)
- Grace Hey
- College of Medicine, University of Florida, Gainesville, FL 32610, USA
- Michael Guyot
- College of Medicine, University of Florida, Gainesville, FL 32610, USA
- Ashley Carter
- Eastern Virginia Medical School, Norfolk, VA 23507, USA
- Brandon Lucke-Wold
- Department of Neurosurgery, University of Florida, Gainesville, FL 32610, USA
8
Li CR, Shen CC, Yang MY, Tsuei YS, Lee CH. Intraoperative Augmented Reality in Microsurgery for Intracranial Arteriovenous Malformation: A Case Report and Literature Review. Brain Sci 2023; 13:653. [PMID: 37190618; DOI: 10.3390/brainsci13040653]
Abstract
BACKGROUND Intracranial arteriovenous malformations (AVMs) are lesions containing complex vessels that lack a buffering capillary architecture, which can result in hemorrhagic cerebrovascular accidents (CVAs). Intraoperative navigation can improve resection rates and functional preservation in patients with lesions in eloquent areas, but current systems have limitations that can distract the operator. Augmented reality (AR) surgical technology can reduce these distractions and provide real-time information on vascular morphology and location. METHODS In this case report, an adult patient was admitted to the emergency department after a fall, and diagnostic imaging revealed a Spetzler-Martin grade I AVM in the right parietal region with evidence of rupture. The patient underwent stereotactic microsurgical resection with assistance from augmented reality technology, which allowed a hologram of the angioarchitecture to be projected onto the cortical surface, aiding recognition of the angiographic anatomy during surgery. RESULTS The patient's postoperative recovery was smooth. At the 6-month follow-up, the patient remained in stable condition, with complete relief from his previous symptoms. The follow-up examination also revealed complete obliteration of the AVM without any remaining pathological vascular structure. CONCLUSIONS AR-assisted microsurgery makes both the dissection and resection steps safer and more delicate. With the many innovations occurring in AR today, this novel technique is likely to be increasingly adopted in both surgical applications and education. Although certain limitations exist, the technique may become more efficient and precise as the underlying technology continues to develop.
Affiliation(s)
- Chi-Ruei Li
- Department of Neurosurgery, Neurological Institute, Taichung Veterans General Hospital, Taichung 40705, Taiwan
- Chiung-Chyi Shen
- Department of Neurosurgery, Neurological Institute, Taichung Veterans General Hospital, Taichung 40705, Taiwan
- Meng-Yin Yang
- Department of Neurosurgery, Neurological Institute, Taichung Veterans General Hospital, Taichung 40705, Taiwan
- Yuang-Seng Tsuei
- Department of Neurosurgery, Neurological Institute, Taichung Veterans General Hospital, Taichung 40705, Taiwan
- Chung-Hsin Lee
- Department of Neurosurgery, Neurological Institute, Taichung Veterans General Hospital, Taichung 40705, Taiwan
9
Goto Y, Kawaguchi A, Inoue Y, Nakamura Y, Oyama Y, Tomioka A, Higuchi F, Uno T, Shojima M, Kin T, Shin M. Efficacy of a Novel Augmented Reality Navigation System Using 3D Computer Graphic Modeling in Endoscopic Transsphenoidal Surgery for Sellar and Parasellar Tumors. Cancers (Basel) 2023; 15:2148. [PMID: 37046809; PMCID: PMC10093001; DOI: 10.3390/cancers15072148]
Abstract
In endoscopic transsphenoidal skull base surgery, knowledge of tumor location on imaging and the anatomic structures is required simultaneously. However, it is often difficult to accurately reconstruct the endoscopic vision of the surgical field from the pre-surgical radiographic images because the lesion remarkably displaces the geography of normal anatomic structures. We created a precise three-dimensional computer graphic model from preoperative radiographic data that was then superimposed on a visual image of the actual surgical field and displayed on a video monitor during endoscopic transsphenoidal surgery. We evaluated the efficacy of this augmented reality (AR) navigation system in 15 consecutive patients with sellar and parasellar tumors. The average score overall was 4.7 [95% confidence interval: 4.58-4.82], which indicates that the AR navigation system was as useful as or more useful than conventional navigation in certain patients. In two patients, AR navigation was assessed as less useful than conventional navigation because perception of the depth of the lesion was more difficult. The developed system was more useful than conventional navigation for facilitating an immediate three-dimensional understanding of the lesion and surrounding structures.
Affiliation(s)
- Yoshiaki Goto
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Ai Kawaguchi
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Yuki Inoue
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Yuki Nakamura
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Yuta Oyama
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Arisa Tomioka
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Fumi Higuchi
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Takeshi Uno
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Masaaki Shojima
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
- Taichi Kin
- Department of Neurosurgery, University of Tokyo Hospital, 7-3-1 Hongo, Bunkyo-ku, Tokyo 133-8655, Japan
- Masahiro Shin
- Department of Neurosurgery, University of Teikyo Hospital, 2-11-1 Kaga, Itabashi-ku, Tokyo 179-8606, Japan
10
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637; DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main driver of the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 until the end of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability, and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature, paving the way for novel, innovative directions and translation into the medical routine.
Collapse
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria.
| | - Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
| | - Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
| | - Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
| | - Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
| | - Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
| | - Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
| |
Collapse
|
11
|
Van Gestel F, Frantz T, Buyck F, Geens W, Neuville Q, Bruneau M, Jansen B, Scheerlinck T, Vandemeulebroucke J, Duerinck J. Neuro-oncological augmented reality planning for intracranial tumor resection. Front Neurol 2023; 14:1104571. [PMID: 36998774 PMCID: PMC10043492 DOI: 10.3389/fneur.2023.1104571] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2022] [Accepted: 02/14/2023] [Indexed: 03/18/2023] Open
Abstract
BACKGROUND Before starting surgery for the resection of an intracranial tumor, its outlines are typically marked on the skin of the patient. This allows planning of the optimal skin incision, craniotomy, and angle of approach. Conventionally, the surgeon determines tumor borders using neuronavigation with a tracked pointer. However, interpretation errors can lead to important deviations, especially for deep-seated tumors, potentially resulting in a suboptimal approach with incomplete exposure. Augmented reality (AR) allows the tumor and critical structures to be displayed directly on the patient, which can simplify and improve surgical preparation. METHODS We developed an AR-based workflow for intracranial tumor resection planning, deployed on the Microsoft HoloLens 2, which exploits the built-in infrared camera for tracking the patient. We initially performed a phantom study to assess the accuracy of registration and tracking. Following this, we evaluated the AR-based planning step in a prospective clinical study of patients undergoing resection of a brain tumor. This planning step was performed by 12 surgeons and trainees with varying degrees of experience. After patient registration, tumor outlines were marked on the patient's skin by different investigators, using a conventional neuronavigation system and an AR-based system consecutively. Their performance in both registration and delineation was measured in terms of accuracy and duration and compared. RESULTS During phantom testing, registration errors remained below 2.0 mm and 2.0° for both AR-based navigation and conventional neuronavigation, with no significant difference between the two systems. In the prospective clinical trial, 20 patients underwent tumor resection planning. Registration accuracy was independent of user experience for both AR-based navigation and the commercial neuronavigation system. AR-guided tumor delineation was deemed superior in 65% of cases, equally good in 30%, and inferior in 5% when compared with the conventional navigation system. The overall planning time (AR = 119 ± 44 s, conventional = 187 ± 56 s) was significantly reduced through adoption of the AR workflow (p < 0.001), with an average time reduction of 39%. CONCLUSION By providing a more intuitive visualization of relevant data to the surgeon, AR navigation offers an accurate method for tumor resection planning that is quicker than conventional neuronavigation. Further research should focus on intraoperative implementations.
Collapse
Affiliation(s)
- Frederick Van Gestel
- Department of Neurosurgery, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- Research Group Center for Neurosciences (C4N-NEUR), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- *Correspondence: Frederick Van Gestel
| | - Taylor Frantz
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- IMEC, Leuven, Belgium
| | - Felix Buyck
- Department of Neurosurgery, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| | - Wietse Geens
- Department of Neurosurgery, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| | - Quentin Neuville
- Department of Neurosurgery, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- Research Group Center for Neurosciences (C4N-NEUR), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| | - Michaël Bruneau
- Department of Neurosurgery, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| | - Bart Jansen
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- IMEC, Leuven, Belgium
| | - Thierry Scheerlinck
- Department of Orthopedic Surgery and Traumatology, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- Research Group Beeldvorming en Fysische Wetenschappen (BEFY-ORTHO), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| | - Jef Vandemeulebroucke
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- IMEC, Leuven, Belgium
- Department of Radiology, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| | - Johnny Duerinck
- Department of Neurosurgery, Universitair Ziekenhuis Brussel (UZ Brussel), Vrije Universiteit Brussel (VUB), Brussels, Belgium
- Research Group Center for Neurosciences (C4N-NEUR), Vrije Universiteit Brussel (VUB), Brussels, Belgium
| |
Collapse
|
12
|
Elarjani T, Lu VM, Berry KM, Eichberg DG, Ivan ME, Komotar RJ, Luther EM. Commentary: Invention of an Online Interactive Virtual Neurosurgery Simulator With Audiovisual Capture for Tactile Feedback. Oper Neurosurg (Hagerstown) 2023; 24:e232-e233. [PMID: 36701687 DOI: 10.1227/ons.0000000000000568] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2022] [Accepted: 10/06/2022] [Indexed: 01/27/2023] Open
Affiliation(s)
- Turki Elarjani
- Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
| | | | | | | | | | | | | |
Collapse
|
13
|
Head-Mounted Augmented Reality in the Planning of Cerebrovascular Neurosurgical Procedures: A Single-Center Initial Experience. World Neurosurg 2023; 171:e693-e706. [PMID: 36566980 DOI: 10.1016/j.wneu.2022.12.086] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2022] [Revised: 12/17/2022] [Accepted: 12/19/2022] [Indexed: 12/24/2022]
Abstract
BACKGROUND Augmented reality (AR) technology has played an increasing role in cerebrovascular neurosurgery over the last 2 decades. Hence, we aim to evaluate the technical and educational value of head-mounted AR in cerebrovascular procedures. METHODS This is a single-center retrospective study of patients who underwent open surgery for cranial and spinal cerebrovascular lesions between April and August 2022. In all cases, the Medivis Surgical AR platform and HoloLens 2 were used for preoperative and intraoperative (preincision) planning. Surgical plan adjustments due to the use of head-mounted AR and the subjective educational value of the tool were recorded. RESULTS A total of 33 patients and 35 cerebrovascular neurosurgical procedures were analyzed. Procedures included 12 intracranial aneurysm clippings, 6 brain and 1 spinal arteriovenous malformation resections, 2 cranial dural arteriovenous fistula obliterations, 3 carotid endarterectomies, 2 extracranial-intracranial direct bypasses, 2 encephaloduroangiosynostoses for Moyamoya disease, 1 biopsy of the superficial temporal artery, 2 microvascular decompressions, 2 cavernoma resections, 1 combined intracranial aneurysm clipping and encephaloduroangiosynostosis for Moyamoya disease, and 1 percutaneous feeder catheterization for arteriovenous malformation embolization. Minor changes in the surgical plan were recorded in 16 of 35 procedures (45.7%). Subjective educational value was scored as "very helpful" for cranial and spinal arteriovenous malformations and carotid endarterectomies; "helpful" for intracranial aneurysms, dural arteriovenous fistulas, direct bypasses, encephaloduroangiosynostosis, and superficial temporal artery biopsy; and "not helpful" for cavernoma resection and microvascular decompression.
CONCLUSIONS Head-mounted AR can be used in cerebrovascular neurosurgery as an adjunctive tool that might influence surgical strategy, enable 3-dimensional understanding of complex anatomy, and provide great educational value in selected cases.
Collapse
|
14
|
Use of Mixed Reality in Neuro-Oncology: A Single Centre Experience. Life (Basel) 2023; 13:life13020398. [PMID: 36836755 PMCID: PMC9965132 DOI: 10.3390/life13020398] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2022] [Revised: 01/25/2023] [Accepted: 01/29/2023] [Indexed: 02/04/2023] Open
Abstract
(1) Background: Intra-operative neuronavigation is currently an essential component of most neurosurgical operations. Recent progress in mixed reality (MR) technology has attempted to overcome the disadvantages of conventional neuronavigation systems. We present our experience using the HoloLens 2 in neuro-oncology for both intra- and extra-axial tumours. (2) Results: We describe our experience with three patients who underwent tumour resection. We evaluated surgeon experience and the accuracy of the superimposed 3D image for tumour localisation against standard neuronavigation, both pre- and intra-operatively. Surgeon training and set-up for the HoloLens 2 were short and easy. The process of image overlay was relatively straightforward for the three cases. Registration in the prone position, often difficult with a conventional neuronavigation system, was easily accomplished with the HoloLens 2. (3) Conclusion: Although certain limitations were identified, the authors feel that this system is a feasible alternative device for intra-operative visualization of neurosurgical pathology. Further studies are being planned to assess its accuracy and suitability across various surgical disciplines.
Collapse
|
15
|
Encarnacion Ramirez M, Ramirez Pena I, Barrientos Castillo RE, Sufianov A, Goncharov E, Soriano Sanchez JA, Colome-Hidalgo M, Nurmukhametov R, Cerda Céspedes JR, Montemurro N. Development of a 3D Printed Brain Model with Vasculature for Neurosurgical Procedure Visualisation and Training. Biomedicines 2023; 11:biomedicines11020330. [PMID: 36830866 PMCID: PMC9953411 DOI: 10.3390/biomedicines11020330] [Citation(s) in RCA: 8] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/10/2022] [Revised: 01/18/2023] [Accepted: 01/22/2023] [Indexed: 01/26/2023] Open
Abstract
BACKGROUND Simulation-based techniques using three-dimensional models are gaining popularity in neurosurgical training. Most pre-existing models are expensive, so we felt a need to develop a real-life model using 3D printing technology to train in endoscopic third ventriculostomy. METHODS The brain model was made using a 3D-printed resin mold from patient-specific MRI data. The mold was filled with Ecoflex™ 00-10 silicone mixed with Silc Pig® pigment additives to replicate the color and consistency of brain tissue. The dura mater was made from quick-drying silicone paste admixed with gray dye. The blood vessels were made from a silicone 3D-printed mold based on magnetic resonance imaging. Liquid containing paprika oleoresin dye was used to simulate blood and was pumped through the vessels to simulate pulsatile motion. RESULTS Seven residents and eight senior neurosurgeons were recruited to test our model. The participants reported that the size and anatomy of the elements were very similar to real structures. The model was helpful for training neuroendoscopic 3D perception and navigation. CONCLUSIONS We developed an endoscopic third ventriculostomy training model using 3D printing technology that provides anatomical precision and a realistic simulation. We hope our model can provide an indispensable tool for young neurosurgeons to gain operative experience without exposing patients to risk.
Collapse
Affiliation(s)
| | | | | | - Albert Sufianov
- Department of Neurosurgery, First Moscow State Medical University (Sechenov University), 121359 Moscow, Russia
| | - Evgeniy Goncharov
- Traumatology and Orthopedics Center, Central Clinical Hospital of the Russian Academy of Sciences, 121359 Moscow, Russia
| | - Jose A. Soriano Sanchez
- Instituto Soriano de Cirugía de Columna Mínimamente Invasiva at ABC Hospital, Neurological Center, Santa Fe Campus, Mexico City 05100, Mexico
| | - Manuel Colome-Hidalgo
- Instituto de Investigación en Salud, Universidad Autònoma de Santo Domingo, Santo Domingo 10014, Dominican Republic
| | | | | | - Nicola Montemurro
- Department of Neurosurgery, Azienda Ospedaliera Universitaria Pisana (AOUP), University of Pisa, 56100 Pisa, Italy
- Correspondence:
| |
Collapse
|
16
|
Portnoy Y, Koren J, Khoury A, Factor S, Dadia S, Ran Y, Benady A. Three-dimensional technologies in presurgical planning of bone surgeries: current evidence and future perspectives. Int J Surg 2023; 109:3-10. [PMID: 36799780 PMCID: PMC10389328 DOI: 10.1097/js9.0000000000000201] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/14/2022] [Accepted: 11/20/2022] [Indexed: 02/18/2023]
Abstract
BACKGROUND The recent development of three-dimensional (3D) technologies introduces a novel set of opportunities to the medical field in general, and specifically to surgery. The preoperative phase has proven to be a critical factor in surgical success. Utilization of 3D technologies has the potential to improve preoperative planning and overall surgical outcomes. In this narrative review article, the authors describe existing clinical data pertaining to the current use of 3D printing, virtual reality, and augmented reality in the preoperative phase of bone surgery. METHODS The methodology included a keyword-based literature search in PubMed and Google Scholar for original articles published between 2014 and 2022. After excluding studies performed in disciplines other than bone surgery, data from 61 studies across five surgical disciplines were processed for inclusion in this narrative review. RESULTS Among the mentioned technologies, 3D printing is currently the most advanced in terms of clinical use, predominantly creating anatomical models and patient-specific instruments that provide high-quality operative preparation. Virtual reality allows a surgical plan to be set and the procedure to be further simulated via a 2D screen or head-mounted display. Augmented reality is found to be useful for surgical simulation upon 3D printed anatomical models or virtual phantoms. CONCLUSIONS Overall, 3D technologies are gradually becoming an integral part of a surgeon's preoperative toolbox, allowing for increased surgical accuracy and reduction of operation time, mainly in complex and unique surgical cases. This may eventually lead to improved surgical outcomes, thereby optimizing the personalized surgical approach.
Collapse
Affiliation(s)
- Yotam Portnoy
- First Faculty of Medicine, Charles University in Prague, Prague, Czechia
| | - Jonathan Koren
- First Faculty of Medicine, Charles University in Prague, Prague, Czechia
| | - Amal Khoury
- Sackler School of Medicine, Tel Aviv University
- Division of Orthopaedic Surgery
| | - Shai Factor
- Sackler School of Medicine, Tel Aviv University
- Division of Orthopaedic Surgery
| | - Solomon Dadia
- Sackler School of Medicine, Tel Aviv University
- Levin Center of 3D Printing and Surgical Innovation
- National Unit of Orthopedic Oncology
| | - Yuval Ran
- Sackler School of Medicine, Tel Aviv University
- Office of the Deputy Medical Manager, Tel Aviv Medical Center, Tel Aviv, Israel
| | - Amit Benady
- Sackler School of Medicine, Tel Aviv University
- Division of Orthopaedic Surgery
- Levin Center of 3D Printing and Surgical Innovation
| |
Collapse
|
17
|
Khan T, Biehl JT, Andrews EG, Babichenko D. A systematic comparison of the accuracy of monocular RGB tracking and LiDAR for neuronavigation. Healthc Technol Lett 2022; 9:91-101. [PMID: 36514478 PMCID: PMC9731545 DOI: 10.1049/htl2.12036] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/15/2022] [Revised: 09/02/2022] [Accepted: 09/05/2022] [Indexed: 12/16/2022] Open
Abstract
With the advent of augmented reality (AR), the use of AR-guided systems in the field of medicine has gained traction. However, wide-scale adoption of these systems requires highly accurate and reliable tracking. In this work, tracking solutions on two technology platforms, LiDAR and Vuforia, are developed and rigorously tested for a catheter placement neurological procedure. A total of 900 experiments are performed for each technology across various combinations of catheter lengths and insertion trajectories. This analysis shows that the LiDAR platform outperformed Vuforia, the state-of-the-art in monocular RGB tracking solutions, with 75% less radial distance error and 26% less angle deviation error. Results provide key insights into the value and utility of LiDAR-based tracking in AR guidance systems.
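The two error metrics reported in this abstract (radial distance error and angle deviation) can be computed from a planned and a tracked needle trajectory. The sketch below is an illustrative reimplementation, not the authors' code; the point-and-direction representation of a trajectory is an assumption.

```python
import math

def radial_error(planned_tip, tracked_tip):
    # Euclidean distance between planned and tracked needle-tip positions (mm)
    return math.dist(planned_tip, tracked_tip)

def angle_deviation(planned_dir, tracked_dir):
    # Angle (degrees) between the planned and tracked needle axes
    dot = sum(p * t for p, t in zip(planned_dir, tracked_dir))
    n1 = math.sqrt(sum(p * p for p in planned_dir))
    n2 = math.sqrt(sum(t * t for t in tracked_dir))
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for numerical safety
    return math.degrees(math.acos(cos_a))

# Example: tip off by a 3-4-5 triangle, axis tilted 45 degrees
print(radial_error((0, 0, 0), (3, 4, 0)))       # 5.0
print(angle_deviation((0, 0, 1), (0, 1, 1)))    # 45.0
```

Aggregating these per-trial metrics over the 900 runs per platform would yield the relative-error comparison the study reports.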
Collapse
Affiliation(s)
- Talha Khan
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
| | - Jacob T. Biehl
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
| | - Edward G. Andrews
- Department of Neurological Surgery, School of Medicine, University of Pittsburgh, Pittsburgh, PA, USA
| | - Dmitriy Babichenko
- School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA
| |
Collapse
|
18
|
Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022; 8:jimaging8070203. [PMID: 35877647 PMCID: PMC9318659 DOI: 10.3390/jimaging8070203] [Citation(s) in RCA: 9] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2022] [Revised: 07/15/2022] [Accepted: 07/18/2022] [Indexed: 02/01/2023] Open
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy that described the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial surgeries (n=8). For preoperative input data, computed tomography (CT) (n=34), and surface rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly directly superimposed with the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy in the order of 2–5 mm has been demonstrated in phantom models, several human factors and technical challenges—perception, ease of use, context, interaction, and occlusion—remain to be addressed prior to widespread adoption of OST-HMD led surgical navigation.
Collapse
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; (N.R.G.); (G.A.W.)
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Correspondence:
| | - Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; (N.R.G.); (G.A.W.)
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
| | - Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada; (N.R.G.); (G.A.W.)
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
| |
Collapse
|
19
|
Mulita F, Verras GI, Anagnostopoulos CN, Kotis K. A Smarter Health through the Internet of Surgical Things. SENSORS (BASEL, SWITZERLAND) 2022; 22:s22124577. [PMID: 35746359 PMCID: PMC9231158 DOI: 10.3390/s22124577] [Citation(s) in RCA: 34] [Impact Index Per Article: 17.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/03/2022] [Revised: 06/10/2022] [Accepted: 06/14/2022] [Indexed: 05/14/2023]
Abstract
(1) Background: In the last few years, technological developments in the surgical field have been rapid and are continuously evolving. One of the most revolutionary breakthroughs was the introduction of the IoT concept within surgical practice. Our systematic review aims to summarize the most important studies evaluating the IoT concept within surgical practice, focusing on telesurgery and surgical telementoring. (2) Methods: We conducted a systematic review of the current literature, focusing on the Internet of Surgical Things in telesurgery and telementoring. Forty-eight (48) studies were included in this review. As secondary research questions, we also included brief overviews of the use of IoT in image-guided surgery and patient telemonitoring, systematically analyzing fourteen (14) and nineteen (19) studies, respectively. (3) Results: Data from 219 patients and 757 healthcare professionals were quantitatively analyzed. Study designs were primarily observational or based on model development. Palpable advantages of IoT incorporation mainly include fewer surgical hours, access to high-quality treatment, and safer and more effective surgical education. Despite the described technological advances and the proposed benefits of the systems presented, there are still identifiable gaps in the literature that need to be explored further in a systematic manner. (4) Conclusions: The use of the IoT concept within the surgery domain is widely incorporated but under-investigated. Advantages have become palpable over the past decade, yet further research is warranted.
Collapse
Affiliation(s)
- Francesk Mulita
- Intelligent Systems Lab, Department of Cultural Technology and Communication, University of the Aegean, 81100 Mytilene, Greece;
- Department of Surgery, General University Hospital of Patras, 26504 Rio, Greece;
- Correspondence: (F.M.); (K.K.); Tel.: +30-6974822712 (K.K.)
| | | | | | - Konstantinos Kotis
- Intelligent Systems Lab, Department of Cultural Technology and Communication, University of the Aegean, 81100 Mytilene, Greece;
- Correspondence: (F.M.); (K.K.); Tel.: +30-6974822712 (K.K.)
| |
Collapse
|
20
|
Yoo H, Sim T. Automated Machine Learning (AutoML)-based Surface Registration Methodology for Image-guided Surgical Navigation System. Med Phys 2022; 49:4845-4860. [PMID: 35543150 DOI: 10.1002/mp.15696] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/20/2021] [Revised: 04/05/2022] [Accepted: 04/19/2022] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND While the surface registration technique has the advantages of being relatively safe and quick to perform, it generally suffers from low accuracy. PURPOSE This research proposes automated machine learning (AutoML)-based surface registration to improve the accuracy of image-guided surgical navigation systems. METHODS In the proposed approach, a neural network model first extracts, from the facial information obtained by computerized tomography (CT), a new point cloud that matches the facial information acquired by a passive probe of an optical tracking system (OTS). The target registration error (TRE), representing the accuracy of surface registration, is then calculated by applying the iterative closest point (ICP) algorithm to the newly extracted point cloud and the OTS information. In this process, the hyperparameters of the neural network model and the ICP algorithm are automatically optimized using Bayesian optimization with Expected Improvement to yield improved registration accuracy. RESULTS Using the proposed surface registration methodology, the average TRE for targets located in the sinus space and nasal cavity of the soft phantoms is (0.939 ± 0.375) mm, a 57.8% improvement over the average TRE of (2.227 ± 0.193) mm calculated by the conventional surface registration method (p < 0.01). The performance of the proposed methodology is further evaluated, and the average TREs computed by the proposed methodology and the conventional method are (0.767 ± 0.132) mm and (2.615 ± 0.378) mm, respectively. Additionally, the clinical applicability of AutoML-based surface registration is demonstrated for one healthy adult. CONCLUSION Our findings showed that registration accuracy could be improved while maintaining the advantages of the surface registration technique.
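The target registration error (TRE) reported in this abstract is, at its core, the distance between targets mapped through the estimated rigid transform and their ground-truth positions in the fixed (CT) frame. A minimal sketch of that final measurement step, assuming a row-major 3×3 rotation matrix and a translation vector (this is not the paper's AutoML/ICP pipeline, only the error metric it optimizes):

```python
import math

def apply_rigid(R, t, p):
    # Apply rotation matrix R (3x3, row-major) and translation t to point p
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

def tre(R, t, moving_targets, fixed_targets):
    # Mean Euclidean distance (mm) between transformed targets and their
    # ground-truth positions in the fixed frame
    dists = [math.dist(apply_rigid(R, t, m), f)
             for m, f in zip(moving_targets, fixed_targets)]
    return sum(dists) / len(dists)

# Example: identity rotation with a 1 mm offset along z leaves a 1.0 mm TRE
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(tre(I, (0, 0, 1), [(0, 0, 0), (1, 0, 0)], [(0, 0, 0), (1, 0, 0)]))  # 1.0
```

In the study's framing, the hyperparameter search minimizes this quantity over the combined neural-network and ICP settings.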
Collapse
Affiliation(s)
- Hakje Yoo
- Korea University Research Institute for Medical Bigdata Science, College of Medicine, Korea University, 73 Goryeodae-ro, Seongbuk-gu, Seoul, 02841, Republic of Korea
| | - Taeyong Sim
- Department of Artificial Intelligence, Sejong University, 209, Neungdong-ro, Gwangjin-gu, Seoul, 05006, Republic of Korea
| |
Collapse
|
21
|
Jean WC. Virtual and Augmented Reality in Neurosurgery: The Evolution of its Application and Study Designs. World Neurosurg 2022; 161:459-464. [PMID: 35505566 DOI: 10.1016/j.wneu.2021.08.150] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2021] [Revised: 08/30/2021] [Accepted: 08/31/2021] [Indexed: 10/18/2022]
Abstract
BACKGROUND As the art of neurosurgery evolves in the 21st century, more emphasis is placed on minimally invasive techniques, which require technical precision. Simultaneously, the reduction in training hours continues, and teachers of neurosurgery face "double jeopardy": harder skills to teach and less time to teach them. Mixed reality appears as the neurosurgical educator's natural ally. Virtual reality facilitates the learning of spatial relationships and permits rehearsal of skills, while augmented reality can make procedures safer and more efficient. Little wonder, then, that the body of literature on mixed reality in neurosurgery has grown exponentially. METHODS Publications involving virtual and augmented reality in neurosurgery were examined. A total of 414 papers were included, categorized according to study design, and analyzed. RESULTS Half of the papers were published within the last 3 years alone. Whereas most of the earlier publications involved experiments in virtual reality simulation and the efficacy of skills acquisition, many of the more recent publications are proof-of-concept studies. This attests to the evolution of mixed reality in neurosurgery: as the technology advances, neurosurgeons are finding more applications, both in training and in clinical practice. CONCLUSIONS With parallel advancements in Internet speed and artificial intelligence, the utilization of mixed reality will permeate neurosurgery. From solving staffing problems in global neurosurgery, to mitigating the deleterious effects of duty-hour reductions, to improving individual operations, mixed reality will have a positive effect on many aspects of neurosurgery.
Collapse
Affiliation(s)
- Walter C Jean
- Division of Neurological Surgery, Lehigh Valley Health Network, Allentown, Pennsylvania, USA; Department of Neurosurgery and Brain Repair, University of South Florida Morsani College of Medicine, Tampa, Florida, USA.
| |
Collapse
|
22
|
Uhl C, Hatzl J, Meisenbacher K, Zimmer L, Hartmann N, Böckler D. Mixed-Reality-Assisted Puncture of the Common Femoral Artery in a Phantom Model. J Imaging 2022; 8:jimaging8020047. [PMID: 35200749 PMCID: PMC8874567 DOI: 10.3390/jimaging8020047] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2021] [Revised: 02/12/2022] [Accepted: 02/14/2022] [Indexed: 12/15/2022] Open
Abstract
Percutaneous femoral arterial access is daily practice in a variety of medical specialties and enables physicians worldwide to perform endovascular interventions. The reported incidence of complications of percutaneous femoral arterial access is 3–18%; these complications often result from a suboptimal puncture location due to insufficient visualization of the target vessel. The purpose of this proof-of-concept study was to evaluate the feasibility and positional error of mixed-reality (MR)-assisted puncture of the common femoral artery in a phantom model using a commercially available navigation system. In total, 15 MR-assisted punctures were performed. Cone-beam computed tomography angiography (CTA) was performed after each puncture to quantify the positional error of needle placement in the axial and sagittal planes. Technical success was achieved in 14/15 cases (93.3%), with a median axial positional error of 1.0 mm (IQR 1.3) and a median sagittal positional error of 1.1 mm (IQR 1.6). The median duration of the registration process and needle insertion was 2 min (IQR 1.0). MR-assisted puncture of the common femoral artery is feasible with acceptable positional errors in a phantom model. Future studies should aim to measure and reduce the positional error resulting from MR registration.
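The median and interquartile range used to summarize positional error in this study are robust summary statistics that the Python standard library computes directly. A purely illustrative sketch with made-up sample values (not the study's data):

```python
import statistics

def median_iqr(values):
    # Median and interquartile range (Q3 - Q1) of a sample, using the
    # exclusive quartile method (statistics.quantiles default)
    q1, _, q3 = statistics.quantiles(values, n=4)
    return statistics.median(values), q3 - q1

# Hypothetical per-puncture axial errors in mm
m, iqr = median_iqr([1, 2, 3, 4, 5])
print(m, iqr)  # 3 3.0
```

Reporting median (IQR) rather than mean (SD), as done here, keeps a single outlying puncture from dominating the summary of a small sample.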
23
Visualization, navigation, augmentation. The ever-changing perspective of the neurosurgeon. BRAIN AND SPINE 2022; 2:100926. [PMID: 36248169 PMCID: PMC9560703 DOI: 10.1016/j.bas.2022.100926] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/06/2022] [Revised: 07/23/2022] [Accepted: 08/10/2022] [Indexed: 11/22/2022]
24
Eichberg DG, Ivan ME, Di L, Shah AH, Luther EM, Lu VM, Komotar RJ, Urakov TM. Augmented Reality for Enhancing Image-Guided Neurosurgery: Superimposing the Future onto the Present. World Neurosurg 2021; 157:235-236. [PMID: 34929765 DOI: 10.1016/j.wneu.2021.09.126] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Affiliation(s)
- Daniel G Eichberg — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
- Michael E Ivan — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA; Sylvester Comprehensive Cancer Center, Miami, Florida, USA
- Long Di — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
- Ashish H Shah — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
- Evan M Luther — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
- Victor M Lu — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA
- Ricardo J Komotar — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA; Sylvester Comprehensive Cancer Center, Miami, Florida, USA
- Timur M Urakov — Department of Neurological Surgery, University of Miami Miller School of Medicine, Miami, Florida, USA