1
Madani S, Sayadi A, Turcotte R, Cecere R, Aoude A, Hooshiar A. A universal calibration framework for mixed-reality assisted surgery. Comput Methods Programs Biomed 2025;259:108470. PMID: 39602987. DOI: 10.1016/j.cmpb.2024.108470. Received 2023-12-31; revised 2024-09-06; accepted 2024-10-18.
Abstract
BACKGROUND: Mixed-reality-assisted surgery has become increasingly prominent, offering real-time 3D visualization of target anatomy such as tumors. These systems help translate preoperative 3D surgical plans to the patient's body intraoperatively and allow interactive modifications based on the patient's real-time condition. However, achieving sub-millimetre accuracy in mixed-reality (MR) visualization and interaction is crucial to mitigate device-related risks and enhance surgical precision.
OBJECTIVE: Given the critical role of camera calibration in hologram-to-patient-anatomy registration, this study aims to develop a new device-agnostic and robust calibration method capable of sub-millimetre accuracy, addressing the prevalent uncertainties in MR camera-to-world calibration.
METHODS: We leveraged the precision of surgical navigation systems (NAV) to solve the hand-eye calibration problem, thereby localizing the MR camera within a navigated surgical scene. The proposed calibration method was integrated into a representative surgical system and rigorously tested across various 2D and 3D camera trajectories that simulate surgeon head movements.
RESULTS: The calibration method demonstrated positional errors as low as 0.2 mm in spatial trajectories, with a standard error also of 0.2 mm, underscoring its robustness against camera motion. This performance meets the accuracy and stability requirements essential for surgical applications.
CONCLUSION: The proposed fiducial-based hand-eye calibration method brings the accuracy and reliability of surgical navigation systems to MR camera systems used intraoperatively. This integration enables high-precision surgical navigation, which is critical for enhancing surgical outcomes in mixed-reality-assisted procedures.
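The abstract does not spell out the authors' solver, but the hand-eye problem it refers to is classically posed as AX = XB, where A are MR-camera motions, B the corresponding NAV-tracked motions, and X the unknown camera-to-marker transform. As illustration only, here is a minimal numpy sketch in the Park-Martin style (rotation from a Kabsch fit on rotation vectors, translation by linear least squares); all function names and the synthetic-data setup are this note's own, not the paper's:

```python
import numpy as np

def _skew(v):
    """Cross-product matrix of a 3-vector."""
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

def rotvec_to_mat(r):
    """Rodrigues formula: rotation vector -> 3x3 rotation matrix."""
    th = np.linalg.norm(r)
    if th < 1e-12:
        return np.eye(3)
    K = _skew(r / th)
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def mat_to_rotvec(R):
    """Inverse of Rodrigues (angle assumed in (0, pi))."""
    th = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    if th < 1e-8:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return th * axis / (2 * np.sin(th))

def solve_hand_eye(As, Bs):
    """Solve A_i X = X B_i for the 4x4 transform X (Park-Martin style).

    Rotation: conjugation preserves the rotation angle, so the rotation
    vectors satisfy alpha_i = R_X beta_i; fit R_X by Kabsch/SVD.
    Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked least squares.
    """
    H = sum(np.outer(mat_to_rotvec(B[:3, :3]), mat_to_rotvec(A[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(H)
    Rx = Vt.T @ U.T
    if np.linalg.det(Rx) < 0:          # guard against reflections
        Vt[-1] *= -1
        Rx = Vt.T @ U.T
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, t
    return X
```

On noise-free synthetic pose pairs generated as A = X B X⁻¹, this recovers X to machine precision; real NAV/MR data would of course require the outlier handling and fiducial design the paper addresses.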
Affiliation(s)
- Sepehr Madani
  - Surgical Performance Enhancement and Robotics (SuPER) Centre, Department of Surgery, McGill University, 1650 Cedar Avenue, Montreal, QC H3G 1A4, Canada
- Amir Sayadi
  - Surgical Performance Enhancement and Robotics (SuPER) Centre, Department of Surgery, McGill University, 1650 Cedar Avenue, Montreal, QC H3G 1A4, Canada
- Robert Turcotte
  - Division of Orthopedic Surgery, Department of Surgery, McGill University, 1650 Cedar Avenue, Montreal, QC H3G 1A4, Canada
- Renzo Cecere
  - Division of Cardiac Surgery, Department of Surgery, McGill University, 1001 Decarie Blvd., Montreal, QC H4A 3J1, Canada
- Ahmed Aoude
  - Division of Orthopedic Surgery, Department of Surgery, McGill University, 1650 Cedar Avenue, Montreal, QC H3G 1A4, Canada
- Amir Hooshiar
  - Surgical Performance Enhancement and Robotics (SuPER) Centre, Department of Surgery, McGill University, 1650 Cedar Avenue, Montreal, QC H3G 1A4, Canada
2
Wang F, Dong J, Xu Y, Jin J, Xu Y, Yan X, Liu Z, Zhao H, Zhang J, Wang N, Hu X, Gao X, Xu L, Yang C, Ma S, Du J, Hu Y, Ji H, Hu S. Turning attention to tumor-host interface and focus on the peritumoral heterogeneity of glioblastoma. Nat Commun 2024;15:10885. PMID: 39738017. DOI: 10.1038/s41467-024-55243-5. Received 2024-02-28; accepted 2024-12-04. Open access.
Abstract
Approximately 90% of glioblastoma recurrences occur in the peritumoral brain zone (PBZ), yet the spatial heterogeneity of the PBZ is not well studied. In this study, two PBZ tissue samples and one tumor tissue sample are obtained from each patient, guided by preoperative imaging. We assess the microenvironment and the characteristics of infiltrating immune and tumor cells using various techniques. Our data indicate that the PBZ contains one or more regions with higher cerebral blood flow, which we collectively name the "higher cerebral blood flow interface" (HBI). The HBI exhibits more neovascularization than the "lower cerebral blood flow interface" (LBI) and tends to show greater infiltration of macrophages and T lymphocytes. There are also more tumor cells in the HBI than in the LBI, with substantial differences in the gene expression profiles of these tumor cells. The HBI may therefore be the key target area of PBZ-directed therapy after surgical resection.
Affiliation(s)
- Fang Wang
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
  - Department of Neurosurgery, The First Affiliated Hospital of Zhengzhou University, Zhengzhou, Henan, China
- Jiawei Dong
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
  - Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, Heilongjiang, China
- Yuyun Xu
  - Department of Radiology, Zhejiang Provincial People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Jiaqi Jin
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Yan Xu
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Xiuwei Yan
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Zhihui Liu
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Hongtao Zhao
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Jiheng Zhang
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Nan Wang
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Xueyan Hu
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Xin Gao
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Lei Xu
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Chengyun Yang
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Shuai Ma
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Jianyang Du
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Ying Hu
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
  - School of Life Science and Technology, Harbin Institute of Technology, Harbin, Heilongjiang, China
- Hang Ji
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
  - Department of Neurosurgery, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Shaoshan Hu
  - Cancer Center, Department of Neurosurgery, Zhejiang Provincial People's Hospital, Affiliated People's Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
3
Li J, Zhou Z, Yang J, Pepe A, Gsaxner C, Luijten G, Qu C, Zhang T, Chen X, Li W, Wodzinski M, Friedrich P, Xie K, Jin Y, Ambigapathy N, Nasca E, Solak N, Melito GM, Vu VD, Memon AR, Schlachta C, De Ribaupierre S, Patel R, Eagleson R, Chen X, Mächler H, Kirschke JS, de la Rosa E, Christ PF, Li HB, Ellis DG, Aizenberg MR, Gatidis S, Küstner T, Shusharina N, Heller N, Andrearczyk V, Depeursinge A, Hatt M, Sekuboyina A, Löffler MT, Liebl H, Dorent R, Vercauteren T, Shapey J, Kujawa A, Cornelissen S, Langenhuizen P, Ben-Hamadou A, Rekik A, Pujades S, Boyer E, Bolelli F, Grana C, Lumetti L, Salehi H, Ma J, Zhang Y, Gharleghi R, Beier S, Sowmya A, Garza-Villarreal EA, Balducci T, Angeles-Valdez D, Souza R, Rittner L, Frayne R, Ji Y, Ferrari V, Chatterjee S, Dubost F, Schreiber S, Mattern H, Speck O, Haehn D, John C, Nürnberger A, Pedrosa J, Ferreira C, Aresta G, Cunha A, Campilho A, Suter Y, Garcia J, Lalande A, Vandenbossche V, Van Oevelen A, Duquesne K, Mekhzoum H, Vandemeulebroucke J, Audenaert E, Krebs C, van Leeuwen T, Vereecke E, Heidemeyer H, Röhrig R, Hölzle F, Badeli V, Krieger K, Gunzer M, Chen J, van Meegdenburg T, Dada A, Balzer M, Fragemann J, Jonske F, Rempe M, Malorodov S, Bahnsen FH, Seibold C, Jaus A, Marinov Z, Jaeger PF, Stiefelhagen R, Santos AS, Lindo M, Ferreira A, Alves V, Kamp M, Abourayya A, Nensa F, Hörst F, Brehmer A, Heine L, Hanusrichter Y, Weßling M, Dudda M, Podleska LE, Fink MA, Keyl J, Tserpes K, Kim MS, Elhabian S, Lamecker H, Zukić D, Paniagua B, Wachinger C, Urschler M, Duong L, Wasserthal J, Hoyer PF, Basu O, Maal T, Witjes MJH, Schiele G, Chang TC, Ahmadi SA, Luo P, Menze B, Reyes M, Deserno TM, Davatzikos C, Puladi B, Fua P, Yuille AL, Kleesiek J, Egger J. MedShapeNet - a large-scale dataset of 3D medical shapes for computer vision. BIOMED ENG-BIOMED TE 2024:bmt-2024-0396. 
PMID: 39733351. DOI: 10.1515/bmt-2024-0396. Received 2024-08-22; accepted 2024-09-21.
Abstract
OBJECTIVES: Shape is commonly used to describe objects. State-of-the-art algorithms in medical imaging predominantly diverge from computer vision, where voxel grids, meshes, point clouds, and implicit surface models are used. This divergence is evident from the growing popularity of ShapeNet (51,300 models) and Princeton ModelNet (127,915 models). However, a large collection of anatomical shapes (e.g., bones, organs, vessels) and 3D models of surgical instruments has been missing.
METHODS: We present MedShapeNet to translate data-driven vision algorithms to medical applications and to adapt state-of-the-art vision algorithms to medical problems. As a unique feature, the majority of shapes are modeled directly on the imaging data of real patients. We present use cases in brain tumor classification, skull reconstruction, multi-class anatomy completion, education, and 3D printing.
RESULTS: To date, MedShapeNet includes 23 datasets with more than 100,000 shapes paired with annotations (ground truth). The data are freely accessible via a web interface and a Python application programming interface and can be used for discriminative, reconstructive, and variational benchmarks as well as various applications in virtual, augmented, or mixed reality and 3D printing.
CONCLUSIONS: MedShapeNet contains medical shapes of anatomy and surgical instruments and will continue to collect data for benchmarks and applications. The project page is: https://medshapenet.ikim.nrw/.
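The abstract contrasts voxel grids, meshes, point clouds, and implicit surfaces as shape representations but does not show the dataset's Python API, so no MedShapeNet calls are reproduced here. As a generic, self-contained illustration of moving between two of those representations, here is a minimal numpy sketch that rasterizes a point cloud into a boolean occupancy voxel grid; `voxelize_points` and its parameters are this note's own hypothetical helper, not part of any published API:

```python
import numpy as np

def voxelize_points(points, resolution=32):
    """Convert an (N, 3) point cloud into a boolean occupancy voxel grid.

    The cloud is uniformly scaled so its longest axis spans `resolution`
    cells (aspect ratio preserved); each point marks one cell occupied.
    """
    points = np.asarray(points, dtype=float)
    mn = points.min(axis=0)
    # uniform scale from the longest bounding-box edge (guard against
    # a degenerate cloud where all points coincide)
    scale = max((points.max(axis=0) - mn).max(), 1e-12)
    # map each point to an integer cell index in [0, resolution - 1]
    idx = np.floor((points - mn) / scale * (resolution - 1e-9)).astype(int)
    idx = np.clip(idx, 0, resolution - 1)
    grid = np.zeros((resolution,) * 3, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid
```

Such occupancy grids are the usual input for voxel-based 3D CNN benchmarks, while the original point clouds or meshes feed point-based and implicit-surface models.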
Affiliation(s)
- Jianning Li
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
| | - Zongwei Zhou
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA
| | - Jiancheng Yang
- Computer Vision Laboratory, Swiss Federal Institute of Technology Lausanne (EPFL), Lausanne, Switzerland
| | - Antonio Pepe
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
| | - Christina Gsaxner
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
| | - Gijs Luijten
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
- Center for Virtual and Extended Reality in Medicine (ZvRM), University Hospital Essen, University Medicine Essen, Essen, Germany
| | - Chongyu Qu
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA
| | - Tiezheng Zhang
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA
| | - Xiaoxi Chen
- Department of Radiology, Renji Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China
| | - Wenxuan Li
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA
| | - Marek Wodzinski
- Department of Measurement and Electronics, AGH University of Science and Technology, Krakow, Poland
- Information Systems Institute, University of Applied Sciences Western Switzerland (HES-SO Valais), Sierre, Switzerland
| | - Paul Friedrich
- Center for Medical Image Analysis & Navigation (CIAN), Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
| | - Kangxian Xie
- Department of Computer Science and Engineering, University at Buffalo, SUNY, NY, 14260, USA
| | - Yuan Jin
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
- Research Center for Connected Healthcare Big Data, ZhejiangLab, Hangzhou, Zhejiang, China
| | - Narmada Ambigapathy
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Enrico Nasca
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Naida Solak
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
| | - Gian Marco Melito
- Institute of Mechanics, Graz University of Technology, Graz, Austria
| | - Viet Duc Vu
- Department of Diagnostic and Interventional Radiology, University Hospital Giessen, Justus-Liebig-University Giessen, Giessen, Germany
| | - Afaque R Memon
- Department of Mechanical Engineering, Mehran University of Engineering and Technology, Jamshoro, Sindh, Pakistan
- Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
| | - Christopher Schlachta
- Canadian Surgical Technologies & Advanced Robotics (CSTAR), University Hospital, London, Canada
| | - Sandrine De Ribaupierre
- Canadian Surgical Technologies & Advanced Robotics (CSTAR), University Hospital, London, Canada
| | - Rajnikant Patel
- Canadian Surgical Technologies & Advanced Robotics (CSTAR), University Hospital, London, Canada
| | - Roy Eagleson
- Canadian Surgical Technologies & Advanced Robotics (CSTAR), University Hospital, London, Canada
| | - Xiaojun Chen
- State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Institute of Biomedical Manufacturing and Life Quality Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China
- Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China
| | - Heinrich Mächler
- Department of Cardiac Surgery, Medical University Graz, Graz, Austria
| | - Jan Stefan Kirschke
- Geschäftsführender Oberarzt Abteilung für Interventionelle und Diagnostische Neuroradiologie, Universitätsklinikum der Technischen Universität München, München, Germany
| | - Ezequiel de la Rosa
- icometrix, Leuven, Belgium
- Department of Informatics, Technical University of Munich, Garching bei München, Germany
| | | | - Hongwei Bran Li
- Department of Quantitative Biomedicine, University of Zurich, Zurich, Switzerland
| | - David G Ellis
- Department of Neurosurgery, University of Nebraska Medical Center, Omaha, NE, USA
| | - Michele R Aizenberg
- Department of Neurosurgery, University of Nebraska Medical Center, Omaha, NE, USA
| | - Sergios Gatidis
- University Hospital of Tuebingen Diagnostic and Interventional Radiology Medical Image and Data Analysis (MIDAS.lab), Tuebingen, Germany
| | - Thomas Küstner
- University Hospital of Tuebingen Diagnostic and Interventional Radiology Medical Image and Data Analysis (MIDAS.lab), Tuebingen, Germany
| | - Nadya Shusharina
- Division of Radiation Biophysics, Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA, USA
| | | | - Vincent Andrearczyk
- Institute of Informatics, HES-SO Valais-Wallis University of Applied Sciences and Arts Western Switzerland, Sierre, Switzerland
| | - Adrien Depeursinge
- Institute of Informatics, HES-SO Valais-Wallis University of Applied Sciences and Arts Western Switzerland, Sierre, Switzerland
- Department of Nuclear Medicine and Molecular Imaging, Lausanne University Hospital (CHUV), Lausanne, Switzerland
| | - Mathieu Hatt
- LaTIM, INSERM UMR 1101, Univ Brest, Brest, France
| | - Anjany Sekuboyina
- Department of Informatics, Technical University of Munich, Garching bei München, Germany
| | | | - Hans Liebl
- Department of Neuroradiology, Klinikum Rechts der Isar, Munich, Germany
| | - Reuben Dorent
- King's College London, Strand, London, UK
- Department of Neurosurgery, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
| | | | | | | | - Stefan Cornelissen
- Elisabeth-TweeSteden Hospital, Tilburg, Netherlands
- Video Coding & Architectures Research Group, Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, Netherlands
| | - Patrick Langenhuizen
- Elisabeth-TweeSteden Hospital, Tilburg, Netherlands
- Video Coding & Architectures Research Group, Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, Netherlands
| | - Achraf Ben-Hamadou
- Centre de Recherche en Numérique de Sfax, Laboratory of Signals, Systems, Artificial Intelligence and Networks, Sfax, Tunisia
- Udini, Aix-en-Provence, France
| | - Ahmed Rekik
- Centre de Recherche en Numérique de Sfax, Laboratory of Signals, Systems, Artificial Intelligence and Networks, Sfax, Tunisia
- Udini, Aix-en-Provence, France
| | - Sergi Pujades
- Inria, Université Grenoble Alpes, CNRS, Grenoble, France
| | - Edmond Boyer
- Inria, Université Grenoble Alpes, CNRS, Grenoble, France
| | - Federico Bolelli
- "Enzo Ferrari" Department of Engineering, University of Modena and Reggio Emilia, Modena, Italy
| | - Costantino Grana
- "Enzo Ferrari" Department of Engineering, University of Modena and Reggio Emilia, Modena, Italy
| | - Luca Lumetti
- "Enzo Ferrari" Department of Engineering, University of Modena and Reggio Emilia, Modena, Italy
| | - Hamidreza Salehi
- Department of Artificial Intelligence in Medical Sciences, Faculty of Advanced Technologies in Medicine, Iran University of Medical Sciences, Tehran, Iran
| | - Jun Ma
- Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, ON, Canada
- Peter Munk Cardiac Centre, University Health Network, Toronto, ON, Canada
- Vector Institute, Toronto, ON, Canada
| | - Yao Zhang
- Shanghai AI Laboratory, Shanghai, People's Republic of China
| | - Ramtin Gharleghi
- School of Mechanical and Manufacturing Engineering, UNSW, Sydney, NSW, Australia
| | - Susann Beier
- School of Mechanical and Manufacturing Engineering, UNSW, Sydney, NSW, Australia
| | - Arcot Sowmya
- School of Computer Science and Engineering, UNSW, Sydney, NSW, Australia
| | | | - Thania Balducci
- Institute of Neurobiology, Universidad Nacional Autónoma de México Campus Juriquilla, Querétaro, Mexico
| | - Diego Angeles-Valdez
- Institute of Neurobiology, Universidad Nacional Autónoma de México Campus Juriquilla, Querétaro, Mexico
- Department of Biomedical Sciences of Cells and Systems, Cognitive Neuroscience Center, University Medical Center Groningen, University of Groningen, Groningen, Netherlands
| | - Roberto Souza
- Advanced Imaging and Artificial Intelligence Lab, Electrical and Software Engineering Department, The Hotchkiss Brain Institute, University of Calgary, Calgary, Canada
| | - Leticia Rittner
- Medical Image Computing Lab, School of Electrical and Computer Engineering (FEEC), University of Campinas, Campinas, Brazil
| | - Richard Frayne
- Radiology and Clinical Neurosciences Departments, The Hotchkiss Brain Institute, University of Calgary, Calgary, Canada
- Seaman Family MR Research Centre, Foothills Medical Center, Calgary, Canada
| | - Yuanfeng Ji
- University of Hongkong, Pok Fu Lam, Hong Kong, People's Republic of China
| | - Vincenzo Ferrari
- EndoCAS Center, Department of Translational Research and of New Surgical and Medical Technologies, University of Pisa, Pisa, Italy
- Dipartimento di Ingegneria dell'Informazione, University of Pisa, Pisa, Italy
| | - Soumick Chatterjee
- Data and Knowledge Engineering Group, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Genomics Research Centre, Human Technopole, Milan, Italy
| | | | - Stefanie Schreiber
- German Centre for Neurodegenerative Disease, Magdeburg, Germany
- Centre for Behavioural Brain Sciences, Magdeburg, Germany
- Department of Neurology, Medical Faculty, University Hospital of Magdeburg, Magdeburg, Germany
| | - Hendrik Mattern
- German Centre for Neurodegenerative Disease, Magdeburg, Germany
- Centre for Behavioural Brain Sciences, Magdeburg, Germany
- Department of Biomedical Magnetic Resonance, Otto von Guericke University Magdeburg, Magdeburg, Germany
| | - Oliver Speck
- German Centre for Neurodegenerative Disease, Magdeburg, Germany
- Centre for Behavioural Brain Sciences, Magdeburg, Germany
- Department of Biomedical Magnetic Resonance, Otto von Guericke University Magdeburg, Magdeburg, Germany
| | - Daniel Haehn
- University of Massachusetts Boston, Boston, MA, USA
| | | | - Andreas Nürnberger
- Centre for Behavioural Brain Sciences, Magdeburg, Germany
- Data and Knowledge Engineering Group, Faculty of Computer Science, Otto von Guericke University Magdeburg, Magdeburg, Germany
| | - João Pedrosa
- Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, Portugal
- Faculty of Engineering, University of Porto (FEUP), Porto, Portugal
| | - Carlos Ferreira
- Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, Portugal
- Faculty of Engineering, University of Porto (FEUP), Porto, Portugal
| | - Guilherme Aresta
- Christian Doppler Lab for Artificial Intelligence in Retina, Department of Ophthalmology and Optometry, Medical University of Vienna, Vienna, Austria
| | - António Cunha
- Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, Portugal
- Universidade of Trás-os-Montes and Alto Douro (UTAD), Vila Real, Portugal
| | - Aurélio Campilho
- Institute for Systems and Computer Engineering, Technology and Science (INESC TEC), Porto, Portugal
- Faculty of Engineering, University of Porto (FEUP), Porto, Portugal
| | - Yannick Suter
- ARTORG Center for Biomedical Engineering Research, University of Bern, Bern, Switzerland
| | - Jose Garcia
- Center for Biomedical Image Computing and Analytics (CBICA), Perelman School of Medicine, University of Pennsylvania, Philadelphia, USA
| | - Alain Lalande
- ICMUB Laboratory, Faculty of Medicine, CNRS UMR 6302, University of Burgundy, Dijon, France
- Medical Imaging Department, University Hospital of Dijon, Dijon, France
| | | | - Aline Van Oevelen
- Department of Human Structure and Repair, Ghent University, Ghent, Belgium
| | - Kate Duquesne
- Department of Human Structure and Repair, Ghent University, Ghent, Belgium
| | - Hamza Mekhzoum
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussels, Belgium
| | - Jef Vandemeulebroucke
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussels, Belgium
| | - Emmanuel Audenaert
- Department of Human Structure and Repair, Ghent University, Ghent, Belgium
| | - Claudia Krebs
- Department of Cellular and Physiological Sciences, Life Sciences Centre, University of British Columbia, Vancouver, BC, Canada
| | - Timo van Leeuwen
- Department of Development & Regeneration, KU Leuven Campus Kulak, Kortrijk, Belgium
| | - Evie Vereecke
- Department of Development & Regeneration, KU Leuven Campus Kulak, Kortrijk, Belgium
| | - Hauke Heidemeyer
- Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
| | - Rainer Röhrig
- Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
| | - Frank Hölzle
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
| | - Vahid Badeli
- Institute of Fundamentals and Theory in Electrical Engineering, Graz University of Technology, Graz, Austria
| | - Kathrin Krieger
- Leibniz-Institut für Analytische Wissenschaften-ISAS-e.V., Dortmund, Germany
| | - Matthias Gunzer
- Leibniz-Institut für Analytische Wissenschaften-ISAS-e.V., Dortmund, Germany
- Institute for Experimental Immunology and Imaging, University Hospital, University Duisburg-Essen, Essen, Germany
| | - Jianxu Chen
- Leibniz-Institut für Analytische Wissenschaften-ISAS-e.V., Dortmund, Germany
| | - Timo van Meegdenburg
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Faculty of Statistics, Technical University Dortmund, Dortmund, Germany
| | - Amin Dada
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Miriam Balzer
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Jana Fragemann
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Frederic Jonske
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Moritz Rempe
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Stanislav Malorodov
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Fin H Bahnsen
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Constantin Seibold
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Alexander Jaus
- Computer Vision for Human-Computer Interaction Lab, Department of Informatics, Karlsruhe Institute of Technology, Karlsruhe, Germany
| | - Zdravko Marinov
- Computer Vision for Human-Computer Interaction Lab, Department of Informatics, Karlsruhe Institute of Technology, Karlsruhe, Germany
| | - Paul F Jaeger
- German Cancer Research Center (DKFZ) Heidelberg, Interactive Machine Learning Group, Heidelberg, Germany
- Helmholtz Imaging, DKFZ Heidelberg, Heidelberg, Germany
| | - Rainer Stiefelhagen
- Computer Vision for Human-Computer Interaction Lab, Department of Informatics, Karlsruhe Institute of Technology, Karlsruhe, Germany
| | - Ana Sofia Santos
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Center Algoritmi, LASI, University of Minho, Braga, Portugal
| | - Mariana Lindo
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Center Algoritmi, LASI, University of Minho, Braga, Portugal
| | - André Ferreira
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Center Algoritmi, LASI, University of Minho, Braga, Portugal
| | - Victor Alves
- Center Algoritmi, LASI, University of Minho, Braga, Portugal
| | - Michael Kamp
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
- Institute for Neuroinformatics, Ruhr University Bochum, Bochum, Germany
- Department of Data Science & AI, Monash University, Clayton, VIC, Australia
| | - Amr Abourayya
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Institute for Neuroinformatics, Ruhr University Bochum, Bochum, Germany
| | - Felix Nensa
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen (AöR), Essen, Germany
| | - Fabian Hörst
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
| | - Alexander Brehmer
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Lukas Heine
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
| | - Yannik Hanusrichter
- Department of Tumour Orthopaedics and Revision Arthroplasty, Orthopaedic Hospital Volmarstein, Wetter, Germany
- Center for Musculoskeletal Surgery, University Hospital of Essen, Essen, Germany
| | - Martin Weßling
- Department of Tumour Orthopaedics and Revision Arthroplasty, Orthopaedic Hospital Volmarstein, Wetter, Germany
- Center for Musculoskeletal Surgery, University Hospital of Essen, Essen, Germany
| | - Marcel Dudda
- Department of Trauma, Hand and Reconstructive Surgery, University Hospital Essen, Essen, Germany
- Department of Orthopaedics and Trauma Surgery, BG-Klinikum Duisburg, University of Duisburg-Essen, Essen, Germany
| | - Lars E Podleska
- Department of Tumor Orthopedics and Sarcoma Surgery, University Hospital Essen (AöR), Essen, Germany
| | - Matthias A Fink
- Clinic for Diagnostic and Interventional Radiology, University Hospital Heidelberg, Heidelberg, Germany
| | - Julius Keyl
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
| | - Konstantinos Tserpes
- Department of Informatics and Telematics, Harokopio University of Athens, Tavros, Greece
| | - Moon-Sung Kim
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Institute of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen (AöR), Essen, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
| | - Shireen Elhabian
- Scientific Computing and Imaging Institute, University of Utah, Salt Lake City, USA
| | | | - Dženan Zukić
- Medical Computing, Kitware Inc., Carrboro, NC, USA
| | | | - Christian Wachinger
- Lab for Artificial Intelligence in Medical Imaging, Department of Radiology, Technical University Munich, Munich, Germany
| | - Martin Urschler
- Institute for Medical Informatics, Statistics and Documentation, Medical University Graz, Graz, Austria
| | - Luc Duong
- Department of Software and IT Engineering, Ecole de Technologie Superieure, Montreal, Quebec, Canada
| | - Jakob Wasserthal
- Clinic of Radiology & Nuclear Medicine, University Hospital Basel, Basel, Switzerland
| | - Peter F Hoyer
- Pediatric Clinic II, University Children's Hospital Essen, University Duisburg-Essen, Essen, Germany
| | - Oliver Basu
- Pediatric Clinic III, University Children's Hospital Essen, University Duisburg-Essen, Essen, Germany
- Center for Virtual and Extended Reality in Medicine (ZvRM), University Hospital Essen, University Medicine Essen, Essen, Germany
| | - Thomas Maal
- Radboudumc 3D-Lab, Department of Oral and Maxillofacial Surgery, Radboud University Nijmegen Medical Centre, Nijmegen, The Netherlands
| | - Max J H Witjes
- 3D Lab, Department of Oral and Maxillofacial Surgery, University Medical Center Groningen, Groningen, the Netherlands
| | - Gregor Schiele
- Intelligent Embedded Systems Lab, University of Duisburg-Essen, Bismarckstraße 90, 47057 Duisburg, Germany
| | | | | | - Ping Luo
- The University of Hong Kong, Pok Fu Lam, Hong Kong, People's Republic of China
| | - Bjoern Menze
- Department of Quantitative Biomedicine, University of Zurich, Zurich, Switzerland
| | - Mauricio Reyes
- ARTORG Center for Biomedical Engineering Research, University of Bern, Bern, Switzerland
- Department of Radiation Oncology, University Hospital Bern, University of Bern, Bern, Switzerland
| | - Thomas M Deserno
- Peter L. Reichertz Institute for Medical Informatics of TU Braunschweig and Hannover Medical School, Braunschweig, Germany
| | - Christos Davatzikos
- Center for Biomedical Image Computing and Analytics, Penn Neurodegeneration Genomics Center, University of Pennsylvania, Philadelphia, PA, USA; and Center for AI and Data Science for Integrated Diagnostics, University of Pennsylvania, Philadelphia, PA, USA
| | - Behrus Puladi
- Department of Oral and Maxillofacial Surgery, University Hospital RWTH Aachen, Aachen, Germany
- Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
| | - Pascal Fua
- Computer Vision Laboratory, Swiss Federal Institute of Technology Lausanne (EPFL), Lausanne, Switzerland
| | - Alan L Yuille
- Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA
| | - Jens Kleesiek
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- German Cancer Consortium (DKTK), Partner Site Essen, Essen, Germany
- Department of Physics, TU Dortmund University, Dortmund, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
| | - Jan Egger
- Institute for Artificial Intelligence in Medicine (IKIM), University Hospital Essen (AöR), Essen, Germany
- Institute of Computer Graphics and Vision (ICG), Graz University of Technology, Graz, Austria
- Computer Algorithms for Medicine Laboratory (Cafe), Graz, Austria
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
- Center for Virtual and Extended Reality in Medicine (ZvRM), University Hospital Essen, University Medicine Essen, Essen, Germany
| |
|
4
|
Jung H, Raythatha J, Moghadam A, Jin G, Mao J, Hsu J, Kim J. RibMR - A Mixed Reality Visualization System for Rib Fracture Localization in Surgical Stabilization of Rib Fractures: Phantom, Preclinical, and Clinical Studies. JOURNAL OF IMAGING INFORMATICS IN MEDICINE 2024:10.1007/s10278-024-01332-2. [PMID: 39707113 DOI: 10.1007/s10278-024-01332-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/18/2023] [Revised: 11/04/2024] [Accepted: 11/04/2024] [Indexed: 12/23/2024]
Abstract
In surgical stabilization of rib fractures (SSRF), the current standard relies on preoperative CT imaging and often incorporates ultrasound (US) imaging. As an alternative, mixed reality (MR) technology holds promise for improving rib fracture localization. This study presents an MR-based visualization system designed for SSRF in a clinical setting. We developed RibMR - a visualization system using an MR head-mounted display that projects a patient-specific 3D hologram onto the patient. RibMR enables the localization of rib fractures in relation to the patient's anatomy. We conducted a phantom study using a human mannequin, a preclinical study with two healthy patients, and a clinical study with two patients to evaluate RibMR and compare it to US practice. RibMR localized rib fractures with an average accuracy of 0.38 ± 0.21 cm in the phantom, 3.75 ± 2.45 cm in the preclinical, and 1.47 ± 1.33 cm in the clinical studies. RibMR took an average time of 4.42 ± 0.98 minutes in the phantom, 8.03 ± 3.67 minutes in the preclinical, and 8.76 ± 0.65 minutes in the clinical studies. Compared to US, RibMR located more fractures, including fractures occluded by other structures, with higher accuracy, faster speed, and an improved localization rate. All participating surgeons provided positive feedback regarding accuracy, visualization quality, and usability. RibMR enabled accurate and time-efficient localization of rib fractures and performed better than US. RibMR is a promising alternative to US for localizing rib fractures in SSRF.
Affiliation(s)
- Hoijoon Jung
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
| | - Jineel Raythatha
- Trauma Service, Westmead Hospital, Westmead, NSW, 2145, Australia
- Department of Surgery, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
| | - Alireza Moghadam
- Trauma Service, Westmead Hospital, Westmead, NSW, 2145, Australia
- Department of Surgery, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
| | - Ge Jin
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
| | - Jiawei Mao
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia
| | - Jeremy Hsu
- Trauma Service, Westmead Hospital, Westmead, NSW, 2145, Australia
- Department of Surgery, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
| | - Jinman Kim
- Biomedical Data Analysis and Visualisation (BDAV) Lab, School of Computer Science, The University of Sydney, Camperdown, NSW, 2050, Australia.
| |
|
5
|
Han Z, Dou Q. A review on organ deformation modeling approaches for reliable surgical navigation using augmented reality. Comput Assist Surg (Abingdon) 2024; 29:2357164. [PMID: 39253945 DOI: 10.1080/24699322.2024.2357164] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 09/11/2024] Open
Abstract
Augmented Reality (AR) holds the potential to revolutionize surgical procedures by allowing surgeons to visualize critical structures within the patient's body. This is achieved by superimposing preoperative organ models onto the actual anatomy. Challenges arise from dynamic deformations of organs during surgery, making preoperative models inadequate for faithfully representing intraoperative anatomy. To enable reliable navigation in augmented surgery, modeling intraoperative deformation to obtain an accurate alignment of the preoperative organ model with the intraoperative anatomy is indispensable. Despite the various methods proposed to model intraoperative organ deformation, there are still few literature reviews that systematically categorize and summarize these approaches. This review aims to fill this gap by providing a comprehensive, technically oriented overview of modeling methods for intraoperative organ deformation in augmented reality in surgery. Through a systematic search and screening process, 112 closely relevant papers were included in this review. By presenting the current status of organ deformation modeling methods and their clinical applications, this review seeks to enhance the understanding of organ deformation modeling in AR-guided surgery and to discuss potential topics for future advancements.
Affiliation(s)
- Zheng Han
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
| | - Qi Dou
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
| |
|
6
|
Zhang H, Killeen BD, Ku Y, Seenivasan L, Zhao Y, Liu M, Yang Y, Gu S, Martin‐Gomez A, Taylor, Osgood G, Unberath M. StraightTrack: Towards mixed reality navigation system for percutaneous K-wire insertion. Healthc Technol Lett 2024; 11:355-364. [PMID: 39720744 PMCID: PMC11665788 DOI: 10.1049/htl2.12103] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/05/2024] [Accepted: 11/25/2024] [Indexed: 12/26/2024] Open
Abstract
In percutaneous pelvic trauma surgery, accurate placement of Kirschner wires (K-wires) is crucial to ensure effective fracture fixation and avoid complications due to breaching the cortical bone along an unsuitable trajectory. Surgical navigation via mixed reality (MR) can help achieve precise wire placement in a low-profile form factor. Current approaches in this domain are as yet unsuitable for real-world deployment because they fall short of guaranteeing accurate visual feedback due to uncontrolled bending of the wire. To ensure accurate feedback, StraightTrack, an MR navigation system designed for percutaneous wire placement in complex anatomy, is introduced. StraightTrack features a marker body equipped with a rigid access cannula that mitigates wire bending due to interactions with soft tissue and a covered bony surface. Integrated with an optical see-through head-mounted display capable of tracking the cannula body, StraightTrack offers real-time 3D visualization and guidance without external trackers, which are prone to losing line-of-sight. In phantom experiments with two experienced orthopedic surgeons, StraightTrack improves wire placement accuracy, achieving the ideal trajectory within 5.26 ± 2.29 mm and 2.88 ± 1.49°, compared to over 12.08 mm and 4.07° for comparable methods. As MR navigation systems continue to mature, StraightTrack realizes their potential for internal fracture fixation and other percutaneous orthopedic procedures.
Affiliation(s)
- Han Zhang
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | | | - Yu‐Chun Ku
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | | | - Yuxuan Zhao
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | - Mingxu Liu
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | - Yue Yang
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | - Suxi Gu
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | | | - Taylor
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| | - Greg Osgood
- Department of Orthopaedic Surgery, Johns Hopkins Medicine, Baltimore, Maryland, USA
| | - Mathias Unberath
- Department of Computer Science, Johns Hopkins University, Baltimore, Maryland, USA
| |
|
7
|
Gao L, Zhang H, Xu Y, Dong Y, Sheng L, Fan Y, Qin C, Gu W. Mixed reality-assisted versus landmark-guided spinal puncture in elderly patients: protocol for a stratified randomized controlled trial. Trials 2024; 25:780. [PMID: 39558217 PMCID: PMC11575154 DOI: 10.1186/s13063-024-08628-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2024] [Accepted: 11/11/2024] [Indexed: 11/20/2024] Open
Abstract
BACKGROUND Performing spinal anesthesia in elderly patients with spine degeneration is challenging for novice practitioners. This stratified randomized controlled trial aims to compare the effectiveness of mixed reality-assisted spinal puncture (MRasp) with that of landmark-guided spinal puncture (LGsp) performed by novice practitioners in elderly patients. METHODS This prospective, single-center, stratified, blocked, parallel randomized controlled trial will include 168 patients (aged ≥ 65 years) scheduled for elective surgery involving spinal anesthesia. All spinal punctures will be performed by anesthesiology interns and residents trained at Huadong Hospital. Patients will be randomly assigned to the MRasp group (n = 84) or the LGsp group (n = 84). Based on each intern/resident's experience in spinal puncture, participants will be stratified into three clusters: the primary, intermediate, and advanced groups. The primary outcome will be the comparison of the rate of successful first-attempt needle insertion between the MRasp group and the LGsp group. Secondary outcomes will include the number of needle insertion attempts, the number of redirection attempts, the number of passes, the rate of successful first needle pass, the spinal puncture time, the total procedure time, and the incidence of perioperative complications. A stratified subgroup analysis will also be conducted for interns/residents at different experience levels. DISCUSSION The findings from this trial will establish the effectiveness of MRasp performed by novice practitioners in elderly patients. This trial may provide experimental evidence for exploring an effective visualization technology to assist in spinal puncture. TRIAL REGISTRATION Chinese Clinical Trials Registry ChiCTR2300075291. Registered on August 31, 2023. https://www.chictr.org.cn/bin/project/edit?pid=189622 .
Affiliation(s)
- Lei Gao
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China
| | - Haichao Zhang
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China
| | - Yidi Xu
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China
| | - Yanjun Dong
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China
| | - Lu Sheng
- Department of Urology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China
| | - Yongqian Fan
- Department of Orthopedics, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China
| | - Chunhui Qin
- Department of Pain Management, Yueyang Hospital of Integrated Traditional Chinese and Western Medicine Affiliated to Shanghai University of Traditional Chinese Medicine, Shanghai, 200437, China.
| | - Weidong Gu
- Department of Anesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, 200040, China.
| |
|
8
|
Lippert M, Dumont KA, Birkeland S, Nainamalai V, Solvin H, Suther KR, Bendz B, Elle OJ, Brun H. Cardiac anatomic digital twins: findings from a single national centre. EUROPEAN HEART JOURNAL. DIGITAL HEALTH 2024; 5:725-734. [PMID: 39563912 PMCID: PMC11570384 DOI: 10.1093/ehjdh/ztae070] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 03/26/2024] [Revised: 07/11/2024] [Accepted: 07/30/2024] [Indexed: 11/21/2024]
Abstract
Aims New three-dimensional cardiac visualization technologies are increasingly employed for anatomic digital twins in pre-operative planning. However, the role and influence of extended reality (virtual, augmented, or mixed) within heart team settings remain unclear. We aimed to assess the impact of mixed reality visualization of the intracardiac anatomy on surgical decision-making in patients with complex heart defects. Methods and results Between September 2020 and December 2022, we recruited 50 patients, generated anatomic digital twins, and visualized them in mixed reality. These anatomic digital twins were presented to the heart team after initial decisions were made using standard visualization methods. Changes in the surgical strategy were recorded. Additionally, heart team members rated their mixed reality experience through a questionnaire, and post-operative outcomes were registered. Anatomic digital twins changed the initially decided surgical strategy in 68% of cases. While artificial intelligence facilitated the rapid creation of anatomic digital twins, manual corrections were always necessary. Conclusion In conclusion, mixed reality anatomic digital twins added information to standard visualization methods and significantly influenced surgical planning, with evidence that these strategies can be implemented safely without additional risk.
Affiliation(s)
- Matthias Lippert
- The Intervention Centre, Division for Technology and Innovation, Oslo University Hospital, Rikshospitalet, PO Box 4950 Nydalen, Oslo 0424, Norway
- Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, Oslo 0450, Norway
| | - Karl-Andreas Dumont
- Department of Cardiothoracic Surgery, Oslo University Hospital, Oslo, Norway
| | - Sigurd Birkeland
- Department of Cardiothoracic Surgery, Oslo University Hospital, Oslo, Norway
| | - Varatharajan Nainamalai
- The Intervention Centre, Division for Technology and Innovation, Oslo University Hospital, Rikshospitalet, PO Box 4950 Nydalen, Oslo 0424, Norway
| | - Håvard Solvin
- The Intervention Centre, Division for Technology and Innovation, Oslo University Hospital, Rikshospitalet, PO Box 4950 Nydalen, Oslo 0424, Norway
- Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, Oslo 0450, Norway
| | - Kathrine Rydén Suther
- Department of Radiology, Division of Radiology and Nuclear Medicine, Oslo University Hospital, Rikshospitalet, Oslo, Norway
| | - Bjørn Bendz
- Institute of Clinical Medicine, University of Oslo, Kirkeveien 166, Oslo 0450, Norway
- Department of Cardiology, Oslo University Hospital, Oslo, Norway
| | - Ole Jakob Elle
- The Intervention Centre, Division for Technology and Innovation, Oslo University Hospital, Rikshospitalet, PO Box 4950 Nydalen, Oslo 0424, Norway
- Department of Informatics, University of Oslo, Oslo, Norway
| | - Henrik Brun
- The Intervention Centre, Division for Technology and Innovation, Oslo University Hospital, Rikshospitalet, PO Box 4950 Nydalen, Oslo 0424, Norway
- Department for Pediatric Cardiology, Oslo University Hospital, Oslo, Norway
| |
|
9
|
Franzò M, Marinozzi F, Finti A, Lattao M, Trabassi D, Castiglia SF, Serrao M, Bini F. Mixed Reality-Based Smart Occupational Therapy Personalized Protocol for Cerebellar Ataxic Patients. Brain Sci 2024; 14:1023. [PMID: 39452035 PMCID: PMC11506775 DOI: 10.3390/brainsci14101023] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2024] [Revised: 10/09/2024] [Accepted: 10/15/2024] [Indexed: 10/26/2024] Open
Abstract
BACKGROUND Occupational therapy (OT) is an essential component of patient care, and it is especially beneficial when focused on meaningful activities. For ataxic patients, traditional procedures are currently the most efficient, although without specific guidelines or suggestions for virtual reality integration. In this context, this study proposes Hybrid Smart Rehabilitation (HSR) based on mixed reality (MR) as an aid in overcoming the limitations of traditional OT procedures. METHODS MR-HSR is designed specifically for ataxic patients and developed in Unity with the Holographic Remoting setting for run-time intervention on the scene. The subject reaches a book and grabs it with their hand inside a holographic guide with audio-visual feedback. Hand trajectories acquired from eight ataxic patients and eight healthy subjects were compared, and new variables were analyzed to evaluate performance. The Trust in Automation questionnaire was administered to assess the patients' opinions. RESULTS Patients confirmed their trust in the developer and in the improvement that this system can bring to their rehabilitation. Differences in the "total time" and "sway area" of the trajectory were statistically significant and, together with the deviation of the trajectory from the main axis of the guide (not statistically significant), made it possible to build a classifier. CONCLUSIONS The patient-specific MR-HSR can be considered an integrative tool for assessing the subject's condition by analyzing new quantitative variables which, if matched to the Scale for the Assessment and Rating of Ataxia (SARA), could form the basis of a new index to assess the progression of ataxia.
Affiliation(s)
- Michela Franzò
- Department of Medico-Surgical Sciences and Biotechnologies, Sapienza University of Rome, 00196 Rome, Italy; (M.F.); (D.T.); (S.F.C.); (M.S.)
| | - Franco Marinozzi
- Department of Mechanical and Aerospace Engineering, Sapienza University of Rome, 00185 Rome, Italy; (F.M.); (A.F.); (M.L.)
| | - Alessia Finti
- Department of Mechanical and Aerospace Engineering, Sapienza University of Rome, 00185 Rome, Italy; (F.M.); (A.F.); (M.L.)
| | - Marco Lattao
- Department of Mechanical and Aerospace Engineering, Sapienza University of Rome, 00185 Rome, Italy; (F.M.); (A.F.); (M.L.)
| | - Dante Trabassi
- Department of Medico-Surgical Sciences and Biotechnologies, Sapienza University of Rome, 00196 Rome, Italy; (M.F.); (D.T.); (S.F.C.); (M.S.)
| | - Stefano Filippo Castiglia
- Department of Medico-Surgical Sciences and Biotechnologies, Sapienza University of Rome, 00196 Rome, Italy; (M.F.); (D.T.); (S.F.C.); (M.S.)
- Department of Brain and Behavioral Sciences, University of Pavia, 27100 Pavia, Italy
| | - Mariano Serrao
- Department of Medico-Surgical Sciences and Biotechnologies, Sapienza University of Rome, 00196 Rome, Italy; (M.F.); (D.T.); (S.F.C.); (M.S.)
| | - Fabiano Bini
- Department of Mechanical and Aerospace Engineering, Sapienza University of Rome, 00185 Rome, Italy; (F.M.); (A.F.); (M.L.)
| |
|
10
|
Bochet Q, Raoul G, Lauwers L, Nicot R. Augmented reality in implantology: Virtual surgical checklist and augmented implant placement. JOURNAL OF STOMATOLOGY, ORAL AND MAXILLOFACIAL SURGERY 2024; 125:101813. [PMID: 38452901 DOI: 10.1016/j.jormas.2024.101813] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/05/2023] [Revised: 02/17/2024] [Accepted: 03/05/2024] [Indexed: 03/09/2024]
Abstract
OBJECTIVES The aim of the present study was to create a pedagogical checklist for the implant surgical protocol, combined with augmented reality (AR)-guided freehand surgery, for inexperienced surgeons using a head-mounted display (HMD) with tracking. METHODS The anatomical model of a patient with two missing mandibular teeth requiring conventional single-tooth implants was selected. The computed tomography (CT) scans were extracted and imported into segmentation and implant planning software. A patient-specific dental splint supported a 3D-printed QR code through an intermediate strut. A checklist was generated to guide the surgical procedure. After tracking, the AR-HMD projected the virtual pre-surgical plan (inferior alveolar nerve (IAN), implant axis, implant location) onto the real 3D-printed anatomical models. The entire drilling sequence was performed on 3D-printed anatomical models according to the manufacturer's recommendations. After the implant surgical procedure, CT of the 3D-printed models was performed to compare the actual and simulated implant placements. All procedures in the study were performed in accordance with the Declaration of Helsinki. RESULTS In total, two implants were placed in a 3D-printed anatomical model of a female patient who required implant rehabilitation for dental agenesis at the second mandibular premolar positions (#35 and #45). Superimposition of the actual and simulated implants showed high concordance between them. CONCLUSION AR in education offers crucial surgical information to novice surgeons in real time. However, the benefits provided by AR in clinical and educational implantology must be demonstrated in further studies involving larger numbers of patients, surgeons and apprentices.
Affiliation(s)
- Quentin Bochet
- Univ. Lille, CHU Lille, Department of Oral and Maxillofacial Surgery, Lille F-59000, France
| | - Gwénaël Raoul
- Univ. Lille, CHU Lille, INSERM, Department of Oral and Maxillo-Facial Surgery, U1008 - Advanced Drug Delivery Systems, Lille F-59000, France
| | - Ludovic Lauwers
- Univ. Lille, CHU Lille, Department of Oral and Maxillofacial Surgery, URL 2694 - METRICS, Lille F-59000, France
| | - Romain Nicot
- Univ. Lille, CHU Lille, INSERM, Department of Oral and Maxillo-Facial Surgery, U1008 - Advanced Drug Delivery Systems, Lille F-59000, France; CNRS, Centrale Lille, Univ. Lille, UMR 9013 - LaMcube - Laboratoire de Mécanique, Multiphysique, Multiéchelle, Lille F-59000, France.
| |
|
11
|
Egger J, Gsaxner C, Luijten G, Chen J, Chen X, Bian J, Kleesiek J, Puladi B. Is the Apple Vision Pro the Ultimate Display? A First Perspective and Survey on Entering the Wonderland of Precision Medicine. JMIR Serious Games 2024; 12:e52785. [PMID: 39292499 PMCID: PMC11447423 DOI: 10.2196/52785] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2023] [Revised: 03/26/2024] [Accepted: 07/02/2024] [Indexed: 09/19/2024] Open
Abstract
At the Worldwide Developers Conference in June 2023, Apple introduced the Vision Pro. The Apple Vision Pro (AVP) is a mixed reality headset; more specifically, it is a virtual reality device with an additional video see-through capability. The video see-through capability turns the AVP into an augmented reality (AR) device. The AR feature is enabled by streaming the real world via cameras onto the (virtual reality) screens in front of the user's eyes. This is, of course, not unique and is similar to other devices, such as the Varjo XR-3 (Varjo Technologies Oy). Nevertheless, the AVP has some interesting features, such as an inside-out screen that can show the headset wearer's eyes to "outsiders," and a button on the top, called the "digital crown," that allows a seamless blend of digital content with the user's physical space by turning it. In addition, it is untethered, except for the cable to the battery, which makes the headset more agile compared to the Varjo XR-3. This could actually come closer to "The Ultimate Display," which Ivan Sutherland had already sketched in 1965. After a great response from the media and social networks to the release, we were able to test and review the new AVP ourselves in March 2024. Drawing on an expert survey with 13 of our colleagues conducted after testing the AVP in our institute, this Viewpoint explores whether the AVP can overcome the clinical challenges that AR still faces in the medical domain; we also go beyond this and discuss whether the AVP could support clinicians in essential tasks, allowing them to spend more time with their patients.
Affiliation(s)
- Jan Egger
- Institute for Artificial Intelligence in Medicine, Essen University Hospital (AöR), Essen, Germany
- Center for Virtual and Extended Reality in Medicine (ZvRM), Essen University Hospital (AöR), Essen, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
| | - Christina Gsaxner
- Institute for Artificial Intelligence in Medicine, Essen University Hospital (AöR), Essen, Germany
- Department of Oral and Maxillofacial Surgery & Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
- Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
- Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria
| | - Gijs Luijten
- Institute for Artificial Intelligence in Medicine, Essen University Hospital (AöR), Essen, Germany
- Institute of Computer Graphics and Vision, Graz University of Technology, Graz, Austria
| | - Jianxu Chen
- Leibniz-Institut für Analytische Wissenschaften (ISAS), Dortmund, Germany
| | - Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Institute of Medical Robotic, Shanghai Jiao Tong University, Shanghai, China
| | - Jiang Bian
- Health Outcomes and Biomedical Informatics, College of Medicine, University of Florida, Gainesville, FL, United States
| | - Jens Kleesiek
- Institute for Artificial Intelligence in Medicine, Essen University Hospital (AöR), Essen, Germany
- Cancer Research Center Cologne Essen (CCCE), University Medicine Essen (AöR), Essen, Germany
- German Cancer Consortium (DKTK), Partner Site Essen, Essen, Germany
- Department of Physics, TU Dortmund University, Dortmund, Germany
| | - Behrus Puladi
- Department of Oral and Maxillofacial Surgery & Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
- Institute of Medical Informatics, University Hospital RWTH Aachen, Aachen, Germany
| |
|
12
|
Kihara T, Keller A, Ogawa T, Armand M, Martin-Gomez A. Evaluating the feasibility of using augmented reality for tooth preparation. J Dent 2024; 148:105217. [PMID: 38944264 DOI: 10.1016/j.jdent.2024.105217] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2024] [Revised: 06/12/2024] [Accepted: 06/27/2024] [Indexed: 07/01/2024] Open
Abstract
OBJECTIVES Tooth preparation is complicated because it requires the preparation of an abutment while simultaneously predicting the ideal shape of the tooth. This study aimed to develop and evaluate a system using augmented reality (AR) head-mounted displays (HMDs) that provides dynamic navigation capabilities for tooth preparation. METHODS The proposed system utilizes optical see-through HMDs to overlay digital information onto the real world and enrich the user's environment. By integrating tracking algorithms and three-dimensional modeling, the system provides real-time visualization and navigation capabilities during tooth preparation using two different visualization techniques. The experimental setup involved a comprehensive analysis of the distance to the surface and cross-sectional angles between the ideal and prepared teeth using three scenarios: traditional (without AR), overlay (AR-assisted visualization of the ideal prepared tooth), and cross-sectional (AR-assisted visualization with cross-sectional views and angular displays). RESULTS A user study (N = 24) revealed that the cross-sectional approach was more effective for angle adjustment and reduced the occurrence of over-reduction. Additional questionnaires revealed that the AR-assisted approaches were perceived as less difficult, with the cross-sectional approach excelling in terms of performance. CONCLUSIONS Visualization and navigation using cross-sectional approaches have the potential to support safer tooth preparation with less over-reduction than traditional and overlay approaches. The angular displays provided by the cross-sectional approach are considered helpful for tooth preparation. CLINICAL SIGNIFICANCE The AR navigation system can assist dentists during tooth preparation and has the potential to enhance the accuracy and safety of prosthodontic treatment.
Affiliation(s)
- Takuya Kihara
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400 N. Charles Street, Baltimore, MD 21218, USA; Department of Fixed Prosthodontics, School of Dental Medicine, Tsurumi University, 2-1-3 Tsurumi, Tsurumi-ku, Yokohama, Kanagawa, 734-8501, Japan.
- Andreas Keller
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400 N. Charles Street, Baltimore, MD 21218, USA; Department of Computer Science, Technical University of Munich, Munich, Germany
- Takumi Ogawa
- Department of Fixed Prosthodontics, School of Dental Medicine, Tsurumi University, 2-1-3 Tsurumi, Tsurumi-ku, Yokohama, Kanagawa, 734-8501, Japan
- Mehran Armand
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400 N. Charles Street, Baltimore, MD 21218, USA; Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA; Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA; Department of Orthopaedic Surgery, Johns Hopkins University, Baltimore, MD, USA
- Alejandro Martin-Gomez
- Biomechanical- and Image-Guided Surgical Systems (BIGSS), Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Hackerman Hall, 3400 N. Charles Street, Baltimore, MD 21218, USA; Department of Computer Science, Johns Hopkins University, Baltimore, MD, USA; The Malone Center for Engineering in Healthcare, Johns Hopkins University, Baltimore, MD, USA
|
13
|
Janssen A, Jiang Y, Dumont AS, Khan P. Aspiration of Subdural Hygroma Using Augmented Reality Neuronavigation: A Case Report. Cureus 2024; 16:e69070. [PMID: 39391441 PMCID: PMC11465702 DOI: 10.7759/cureus.69070] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/09/2024] [Indexed: 10/12/2024] Open
Abstract
Augmented reality (AR) is emerging as a key technology in neurosurgery. Projecting three-dimensional (3D) anatomic models onto the surgical field provides unique operative information to make procedures safer and more efficient. A small footprint, rapid registration AR system was used for bedside guidance during aspiration of a subdural hygroma. A 77-year-old male presented for resection of a suprasellar tumor and subsequently developed a large bilateral subdural hygroma. We performed the aspiration of the hygroma at the bedside using AR guidance. The AR system allowed for precise needle placement during the aspiration. The aim of this report was to demonstrate the clinical feasibility of integrating a novel AR system into the clinical workflow of a bedside procedure. As AR continues to expand in the field, the benefits of this technology for various procedures will become more evident.
Affiliation(s)
- Andrew Janssen
- Neurosurgery, Tulane University School of Medicine, New Orleans, USA
- Yinghua Jiang
- Neurosurgery, Tulane University School of Medicine, New Orleans, USA
- Aaron S Dumont
- Neurosurgery, Tulane University School of Medicine, New Orleans, USA
- Pervez Khan
- Neurosurgery, Tulane University School of Medicine, New Orleans, USA
|
14
|
Wang X, Yang C, Liu Z, Zhang J, Xue C, Xing L, Zheng Y, Geng C, Yin X. R-MFE-TCN: A correlation prediction model between body surface and tumor during respiratory movement. Med Phys 2024; 51:6075-6089. [PMID: 38801342 DOI: 10.1002/mp.17183] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2024] [Revised: 04/30/2024] [Accepted: 05/11/2024] [Indexed: 05/29/2024] Open
Abstract
BACKGROUND 2D CT image-guided radiofrequency ablation (RFA) is a minimally invasive treatment that can destroy liver tumors without removing them. However, CT images provide only limited static information, and the tumor moves with the patient's respiration. Accurately locating the tumor while the patient breathes freely therefore remains an urgent problem. PURPOSE The purpose of this study is to propose a respiratory correlation prediction model for a mixed reality surgical assistance system, the Riemannian and Multivariate Feature Enhanced Temporal Convolutional Network (R-MFE-TCN), and to achieve accurate respiratory correlation prediction. METHODS The model adopts a respiration-oriented Riemannian information enhancement strategy to expand the diversity of the dataset. A new Multivariate Feature Enhancement (MFE) module is proposed to retain respiratory information so that the network can fully exploit the correlation between internal and external data: a dual-channel structure retains multivariate respiratory features, and multi-headed self-attention captures periodic peak-to-valley information in the respiratory signal. This information significantly improves the prediction performance of the network. Particle swarm optimization (PSO) is used for hyperparameter tuning. In the experiment, internal and external respiratory motion trajectories of seven patients were obtained from the dataset, with the first six patients used as the training set; the respiratory signal was sampled at 21 Hz. RESULTS Extensive experiments on the dataset demonstrate the good performance of this method, which improves prediction accuracy while remaining robust. The method reduces delay deviation under long prediction windows: at 400 ms, the average RMSE and MAE are 0.0453 mm and 0.0361 mm, respectively, better than competing methods.
CONCLUSION The R-MFE-TCN can be extended to respiratory correlation prediction in different clinical situations, meeting the accuracy requirements for respiratory delay prediction in surgical assistance.
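The RMSE and MAE figures quoted above are standard trajectory-error metrics; as an illustrative sketch on synthetic data (not the authors' code), they can be computed as:

```python
import numpy as np

def rmse_mae(predicted, actual):
    """Root-mean-square error and mean absolute error between two
    1-D trajectories (e.g. tumor displacement in mm per sample)."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    err = predicted - actual
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    return rmse, mae

# Toy data: one breathing cycle sampled at 21 Hz, with a predictor
# that carries a constant 0.05 mm bias.
actual = np.sin(np.linspace(0, 2 * np.pi, 21))
predicted = actual + 0.05
rmse, mae = rmse_mae(predicted, actual)   # both 0.05 for a constant bias
```

For a constant offset the two metrics coincide; they diverge once the error varies sample to sample, with RMSE penalizing large excursions more heavily.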
Affiliation(s)
- Xuehu Wang
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Chang Yang
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Ziqi Liu
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Jushuo Zhang
- College of Electronic and Information Engineering, Hebei University, Baoding, China
- Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Chao Xue
- Senior Department of Orthopedics, the Fourth Medical Center of PLA General Hospital, Beijing, China
- Lihong Xing
- Affiliated Hospital of Hebei University, Baoding, China
- Yongchang Zheng
- Department of Liver Surgery, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College (CAMS & PUMC), Beijing, China
- Chen Geng
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- Xiaoping Yin
- Affiliated Hospital of Hebei University, Baoding, China
|
15
|
Gao L, Xu Y, Zhang X, Jiang Z, Wu J, Dong Y, Li M, Jin L, Qiu J, You L, Qin C, Gu W. Comparison of Mixed Reality-Assisted Spinal Puncture with Landmark-Guided Spinal Puncture by Novice Practitioners: A Pilot Study. J Pain Res 2024; 17:2701-2712. [PMID: 39165722 PMCID: PMC11334921 DOI: 10.2147/jpr.s470285] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/22/2024] [Accepted: 08/13/2024] [Indexed: 08/22/2024] Open
Abstract
Background Performing spinal anaesthesia in elderly patients with ligament calcification or hyperostosis is challenging for novice practitioners. This pilot study aimed to compare the effectiveness of mixed reality-assisted spinal puncture (MRasp) with that of landmark-guided spinal puncture (LGsp) by novice practitioners in elderly patients. Methods In this pilot study, 36 patients (aged ≥65 years) scheduled for elective surgery under spinal anaesthesia by anaesthesiology residents were included. Patients were randomly assigned to the MRasp group (n = 18) or the LGsp group (n = 18). The outcomes included the number of needle insertion attempts, redirection attempts, passes, the rate of successful first-attempt needle insertion, the rate of successful first needle pass, the spinal puncture time, the total procedure time, and the incidence of perioperative complications. Results The median number of needle insertion attempts was significantly fewer in the MRasp group than in the LGsp group (1.0 vs 2.0, P = 0.023). The proportion of patients with successful first-attempt needle insertion was 72.2% in the MRasp group and 44.4% in the LGsp group (P = 0.176). The incidence of perioperative complications did not significantly differ between the two groups. Conclusion This pilot study found that novice practitioners made significantly fewer needle insertion attempts in the MRasp group compared to the LGsp group when performing spinal anaesthesia on elderly patients. A future randomized controlled trial (RCT) is warranted to validate its effectiveness. Trial Registration This trial was registered at https://www.chictr.org.cn/showproj.html?proj=178960 (ChiCTR-IPR-2300068520). Public title: Mixed reality-assisted versus landmark-guided spinal puncture in elderly patients: a randomized controlled pilot study. Principal investigator: Lei Gao. The registration date was February 22, 2023. The date of the first participant enrolment was February 27, 2023.
Affiliation(s)
- Lei Gao
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Yidi Xu
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Xixue Zhang
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Zhaoshun Jiang
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Jiajun Wu
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Yanjun Dong
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Ming Li
- Department of Radiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Liang Jin
- Department of Radiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Jianjian Qiu
- Department of Radiation Oncology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Lijue You
- Department of Computer Center, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Chunhui Qin
- Department of Pain Management, Yueyang Integrated Traditional Chinese Medicine and Western Medicine Hospital Affiliated to Shanghai University of Traditional Chinese Medicine, Shanghai, People’s Republic of China
- Weidong Gu
- Department of Anaesthesiology, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
- Shanghai Key Laboratory of Clinical Geriatric Medicine, Huadong Hospital Affiliated to Fudan University, Shanghai, People’s Republic of China
|
16
|
Finos K, Datta S, Sedrakyan A, Milsom JW, Pua BB. Mixed reality in interventional radiology: a focus on first clinical use of XR90 augmented reality-based visualization and navigation platform. Expert Rev Med Devices 2024; 21:679-688. [PMID: 39054630 DOI: 10.1080/17434440.2024.2379925] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/31/2024] [Accepted: 06/28/2024] [Indexed: 07/27/2024]
Abstract
INTRODUCTION Augmented reality (AR) and virtual reality (VR) are emerging tools in interventional radiology (IR), enhancing IR education, preprocedural planning, and intraprocedural guidance. AREAS COVERED This review identifies current applications of AR/VR in IR, with a focus on studies that assess the clinical impact of AR/VR. We outline the relevant technology and assess current limitations and future directions in this space. We found that the use of AR in IR lags behind other surgical fields, and the majority of the data exists in case series or small-scale studies. Educational use of AR/VR improves learning of anatomy and procedure steps and shortens procedural learning curves. Preprocedural use of AR/VR decreases procedure times, especially in complex procedures. Intraprocedural AR for live tracking is accurate to within 5 mm in live patients and to within 0.75 mm in phantoms, offering decreased procedure time and radiation exposure. Challenges include cost, ergonomics, rapid segmentation, and organ motion. EXPERT OPINION The use of AR/VR in interventional radiology may lead to safer and more efficient procedures. However, more data from larger studies are needed to better understand where AR/VR confers the most benefit in interventional radiology clinical practice.
Affiliation(s)
- Kyle Finos
- Division of Interventional Radiology, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Sanjit Datta
- Division of Interventional Radiology, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Art Sedrakyan
- Population Health Science, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Jeffrey W Milsom
- Division of Colorectal Surgery, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
- Bradley B Pua
- Division of Interventional Radiology, New York Presbyterian Hospital/Weill Cornell Medicine, New York, USA
|
17
|
Ai X, Santamaria V, Agrawal SK. Characterizing the Effects of Adding Virtual and Augmented Reality in Robot-Assisted Training. IEEE Trans Neural Syst Rehabil Eng 2024; 32:2709-2718. [PMID: 39042524 PMCID: PMC11324333 DOI: 10.1109/tnsre.2024.3432661] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 07/25/2024]
Abstract
Extended reality (XR) technology combines physical reality with computer-synthesized virtuality to deliver immersive experiences to users. Virtual reality (VR) and augmented reality (AR) are two subdomains within XR with different immersion levels. Both have the potential to be combined with robot-assisted training protocols to maximize postural control improvement. In this study, we conducted a randomized controlled experiment with sixty-three healthy subjects to compare the effectiveness of robot-assisted posture training combined with VR or AR against robotic training alone. A robotic Trunk Support Trainer (TruST) was employed to deliver assistive force at the trunk as subjects moved beyond their stability limits during training. Our results showed that both VR and AR significantly enhanced the training outcomes of the TruST intervention. However, the VR group experienced more simulator sickness than the AR group, suggesting that AR is better suited for sitting posture training in conjunction with the TruST intervention. Our findings highlight the added value of XR to robot-assisted training and provide novel insights into the differences between AR and VR when integrated into a robotic training protocol. In addition, we developed a custom XR application well suited to the requirements of the TruST intervention. Our approach can be extended to other studies to develop novel XR-enhanced robotic training platforms.
|
18
|
Qi Z, Corr F, Grimm D, Nimsky C, Bopp MHA. Extended Reality-Based Head-Mounted Displays for Surgical Education: A Ten-Year Systematic Review. Bioengineering (Basel) 2024; 11:741. [PMID: 39199699 PMCID: PMC11351461 DOI: 10.3390/bioengineering11080741] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/01/2024] [Revised: 07/17/2024] [Accepted: 07/19/2024] [Indexed: 09/01/2024] Open
Abstract
Surgical education demands extensive knowledge and skill acquisition within limited time frames, often constrained by reduced training opportunities and high-pressure environments. This review evaluates the effectiveness of extended reality-based head-mounted display (ExR-HMD) technology in surgical education, examining its impact on educational outcomes and exploring its strengths and limitations. Data from PubMed, Cochrane Library, Web of Science, ScienceDirect, Scopus, ACM Digital Library, IEEE Xplore, WorldCat, and Google Scholar (Year: 2014-2024) were synthesized. After screening, 32 studies comparing ExR-HMD and traditional surgical training methods for medical students or residents were identified. Quality and bias were assessed using the Medical Education Research Study Quality Instrument, Newcastle-Ottawa Scale-Education, and Cochrane Risk of Bias Tools. Results indicate that ExR-HMD offers benefits such as increased immersion, spatial awareness, and interaction, and supports motor skill acquisition theory and constructivist educational theories. However, challenges such as system fidelity, operational inconvenience, and physical discomfort were noted. Nearly half the studies reported outcomes comparable or superior to traditional methods, emphasizing the importance of social interaction. Limitations include study heterogeneity and the restriction to English-language publications. ExR-HMD shows promise but needs deeper integration of educational theory and social interaction. Future research should address technical and economic barriers to global accessibility.
Affiliation(s)
- Ziyu Qi
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; (F.C.); (D.G.); (C.N.)
- Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Felix Corr
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; (F.C.); (D.G.); (C.N.)
- Dustin Grimm
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; (F.C.); (D.G.); (C.N.)
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; (F.C.); (D.G.); (C.N.)
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
- Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany; (F.C.); (D.G.); (C.N.)
- Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
|
19
|
Rock JP, Schultz L, Dempsey R, Cohen J. Integration of Mixed Reality Technology Into a Global Neurosurgery Bootcamp. Cureus 2024; 16:e63888. [PMID: 39100053 PMCID: PMC11298066 DOI: 10.7759/cureus.63888] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/04/2024] [Indexed: 08/06/2024] Open
Abstract
International bootcamps are important for providing access to advanced education and training for physicians around the world. In countries where resources are scarce, opportunities for exposure to advanced training and the latest technologies are limited. We set out to evaluate the educational value of integrating augmented reality (AR) into the curriculum of a global neurosurgery bootcamp. AR was integrated into this year's neurosurgical bootcamp in Hanoi, Vietnam, organized by the Foundation for International Education in Neurological Surgery (FIENS). Participants had not previously experienced this technology as a surgical adjunct. A study was conducted to evaluate how AR impacts the surgical approach to a cranial tumor for bootcamp participants with limited neurosurgical experience. Without the use of AR, the majority of participants (66%) chose the incorrect surgical approach to a frontal tumor. However, after using AR to visualize the lesion in 3D, all participants chose the correct surgical approach. Additionally, participants were more precise when planning with AR, as the distance from the skull insertion point to the tumor was significantly shorter with AR than without AR. This study demonstrated the potential of AR to improve education and enhance the experience trainees have at international bootcamps. Importantly, it is our hope that industry involvement in these global initiatives continues to grow, as it is critical for trainees in developing countries to be exposed to common as well as emerging medical technologies.
Affiliation(s)
- Jack P Rock
- Neurological Surgery, Henry Ford Health System, Detroit, USA
- Lonni Schultz
- Neurological Surgery, Henry Ford Health System, Detroit, USA
- Robert Dempsey
- Neurological Surgery, University of Wisconsin School of Medicine and Public Health, Madison, USA
|
20
|
Rieder M, Remschmidt B, Gsaxner C, Gaessler J, Payer M, Zemann W, Wallner J. Augmented Reality-Guided Extraction of Fully Impacted Lower Third Molars Based on Maxillofacial CBCT Scans. Bioengineering (Basel) 2024; 11:625. [PMID: 38927861 PMCID: PMC11200966 DOI: 10.3390/bioengineering11060625] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2024] [Revised: 06/07/2024] [Accepted: 06/16/2024] [Indexed: 06/28/2024] Open
Abstract
(1) Background: This study aimed to integrate an augmented reality (AR) image-guided surgery (IGS) system, based on preoperative cone beam computed tomography (CBCT) scans, into clinical practice. (2) Methods: In preclinical and clinical surgical setups, an AR-guided visualization system based on Microsoft's HoloLens 2 was assessed for complex lower third molar (LTM) extractions. The system's potential intraoperative feasibility and usability are described first. Preparation and operating times for each procedure were measured, as was the system's usability, using the System Usability Scale (SUS). (3) Results: A total of six LTMs (n = 6) were analyzed, two extracted from human cadaver head specimens (n = 2) and four from clinical patients (n = 4). The average preparation time was 166 ± 44 s, while the operation time averaged 21 ± 5.9 min. The overall mean SUS score was 79.1 ± 9.3. When analyzed separately, the usability score categorized the AR-guidance system as "good" in clinical patients and "best imaginable" in human cadaver head procedures. (4) Conclusions: This translational study analyzed the first successful and functionally stable application of the HoloLens technology for complex LTM extraction in clinical patients. Further research is needed to refine the technology's integration into clinical practice to improve patient outcomes.
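The System Usability Scale score reported above follows a fixed published formula: ten items rated 1-5, odd-numbered (positively worded) items contribute rating minus 1, even-numbered items contribute 5 minus rating, and the sum is scaled by 2.5 to give 0-100. A minimal sketch, independent of this study's own tooling:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    item ratings, each on a 1-5 Likert scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings in the range 1-5")
    # responses[0::2] are items 1,3,5,7,9 (positively worded);
    # responses[1::2] are items 2,4,6,8,10 (negatively worded).
    total = sum(r - 1 for r in responses[0::2]) + \
            sum(5 - r for r in responses[1::2])
    return total * 2.5

# Example: a fairly positive respondent scores 85.0.
score = sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2])
```

A mean of 79.1, as in this study, sits in the commonly used "good" adjective band of the scale.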
Affiliation(s)
- Marcus Rieder
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Bernhard Remschmidt
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria
- Jan Gaessler
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Michael Payer
- Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Wolfgang Zemann
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
- Juergen Wallner
- Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
|
21
|
Hunt R, Scarpace L, Rock J. Integration of Augmented Reality Into Glioma Resection Surgery: A Case Report. Cureus 2024; 16:e53573. [PMID: 38445166 PMCID: PMC10914376 DOI: 10.7759/cureus.53573] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 02/03/2024] [Indexed: 03/07/2024] Open
Abstract
Augmented reality (AR) is an exciting technology that has garnered considerable attention in the field of neurosurgery. Despite this, clinical use of this technology is still in its infancy. An area of great potential for this technology is the ability to display 3D anatomy overlaid with the patient to assist with presurgical and intraoperative decision-making. A 39-year-old woman presented with headaches and was experiencing what was described as a whooshing sound. MRI revealed the presence of a large left frontal mass involving the genu of the corpus callosum, with heterogeneous enhancement and central hemorrhagic necrosis, confirmed to be a glioma. She underwent a craniotomy with intraoperative MRI for resection. An augmented reality system was used to superimpose 3D holographic anatomy onto the patient's head for surgical planning. This report highlights a new AR technology and its immediate application to cranial neurosurgery. It is critical to document new uses of this technology as the field continues to integrate AR as well as other next-generation technologies into practice.
Affiliation(s)
- Rachel Hunt
- Neurosurgery, Henry Ford Health System, Detroit, USA
- Lisa Scarpace
- Neurosurgery, Henry Ford Health System, Detroit, USA
- Jack Rock
- Neurosurgery, Henry Ford Health System, Detroit, USA
|
22
|
Kantak PA, Bartlett S, Chaker A, Harmon S, Mansour T, Pawloski J, Telemi E, Yeo H, Winslow S, Cohen J, Scarpace L, Robin A, Rock JP. Augmented Reality Registration System for Visualization of Skull Landmarks. World Neurosurg 2024; 182:e369-e376. [PMID: 38013107 DOI: 10.1016/j.wneu.2023.11.110] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2023] [Revised: 11/21/2023] [Accepted: 11/22/2023] [Indexed: 11/29/2023]
Abstract
BACKGROUND Augmented reality (AR) is an emerging technology in neurosurgery with the potential to become a strategic tool in the delivery of care and education for trainees. Advances in technology have demonstrated promising use for improving visualization and spatial awareness of critical neuroanatomic structures. In this report, we employ a novel AR registration system for the visualization and targeting of skull landmarks. METHODS A markerless AR system was used to register 3-dimensional reconstructions of suture lines onto the head via a head-mounted display. Participants were required to identify craniometric points with and without AR assistance. Targeting error was measured as the Euclidean distance between the user-defined location and the true craniometric point on the subjects' heads. RESULTS All participants successfully registered 3-dimensional reconstructions onto the subjects' heads. Targeting accuracy was significantly improved with AR (3.59 ± 1.29 mm). Across all target points, AR increased accuracy by an average of 19.96 ± 3.80 mm. Posttest surveys revealed that participants felt the technology increased their confidence in identifying landmarks (4.6/5) and that the technology will be useful for clinical care (4.2/5). CONCLUSIONS While several areas of improvement and innovation can further enhance the use of AR in neurosurgery, this report demonstrates the feasibility of a markerless headset-based AR system for visualizing craniometric points on the skull. As the technology continues to advance, AR is expected to play an increasingly significant role in neurosurgery, transforming how surgeries are performed and improving patient care.
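Targeting error of the kind reported here is simply the Euclidean distance between each user-defined point and the corresponding ground-truth landmark, summarized as mean ± SD. An illustrative sketch with hypothetical coordinates (not the study's pipeline):

```python
import math

def targeting_errors(user_points, true_points):
    """Euclidean distance (mm) between each user-defined point and
    the corresponding true craniometric point (3D coordinates)."""
    return [math.dist(u, t) for u, t in zip(user_points, true_points)]

def mean_sd(values):
    """Mean and sample standard deviation of a list of errors."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    return m, var ** 0.5

# Hypothetical user picks vs. ground-truth landmarks (mm coordinates).
user = [(10.0, 0.0, 0.0), (0.0, 22.0, 0.0), (0.0, 0.0, 33.0)]
true = [(13.0, 4.0, 0.0), (0.0, 20.0, 0.0), (0.0, 0.0, 30.0)]
errors = targeting_errors(user, true)   # [5.0, 2.0, 3.0]
m, sd = mean_sd(errors)
```

`math.dist` (Python 3.8+) computes the straight-line distance directly; the mean ± SD pair matches the form in which the study reports its 3.59 ± 1.29 mm result.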
Affiliation(s)
- Pranish A Kantak
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Seamus Bartlett
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Anisse Chaker
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Samuel Harmon
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Tarek Mansour
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Jacob Pawloski
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Edvin Telemi
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Heegook Yeo
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Samantha Winslow
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Lisa Scarpace
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Adam Robin
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA
- Jack P Rock
- Department of Neurological Surgery, Henry Ford Hospital, Detroit, Michigan, USA.
|
23
|
Qi Z, Jin H, Wang Q, Gan Z, Xiong R, Zhang S, Liu M, Wang J, Ding X, Chen X, Zhang J, Nimsky C, Bopp MHA. The Feasibility and Accuracy of Holographic Navigation with Laser Crosshair Simulator Registration on a Mixed-Reality Display. SENSORS (BASEL, SWITZERLAND) 2024; 24:896. [PMID: 38339612 PMCID: PMC10857152 DOI: 10.3390/s24030896] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/04/2024] [Revised: 01/21/2024] [Accepted: 01/23/2024] [Indexed: 02/12/2024]
Abstract
Addressing conventional neurosurgical navigation systems' high costs and complexity, this study explores the feasibility and accuracy of a simplified, cost-effective mixed reality navigation (MRN) system based on a laser crosshair simulator (LCS). A new automatic registration method was developed, featuring coplanar laser emitters and a recognizable target pattern. The workflow was integrated into Microsoft's HoloLens-2 for practical application. The study assessed the system's precision by utilizing life-sized 3D-printed head phantoms based on computed tomography (CT) or magnetic resonance imaging (MRI) data from 19 patients (female/male: 7/12, average age: 54.4 ± 18.5 years) with intracranial lesions. Six to seven CT/MRI-visible scalp markers were used as reference points per case. The LCS-MRN's accuracy was evaluated through landmark-based and lesion-based analyses, using metrics such as target registration error (TRE) and Dice similarity coefficient (DSC). The system demonstrated immersive capabilities for observing intracranial structures across all cases. Analysis of 124 landmarks showed a TRE of 3.0 ± 0.5 mm, consistent across various surgical positions. The DSC of 0.83 ± 0.12 correlated significantly with lesion volume (Spearman rho = 0.813, p < 0.001). Therefore, the LCS-MRN system is a viable tool for neurosurgical planning, highlighting its low user dependency, cost-efficiency, and accuracy, with prospects for future clinical application enhancements.
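Both accuracy metrics quoted above have standard definitions: target registration error (TRE) is the post-registration Euclidean distance at each landmark, and the Dice similarity coefficient (DSC) measures volumetric overlap of two binary segmentations. An illustrative NumPy sketch on synthetic data (not the authors' implementation):

```python
import numpy as np

def target_registration_error(mapped, reference):
    """Per-landmark Euclidean distance (mm) after registration."""
    return np.linalg.norm(np.asarray(mapped) - np.asarray(reference), axis=1)

def dice_coefficient(a, b):
    """Dice similarity coefficient of two boolean volumes."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Synthetic landmarks with known offsets of 3 mm and 5 mm.
mapped = np.array([[1.0, 2.0, 3.0], [4.0, 4.0, 0.0]])
reference = np.array([[1.0, 2.0, 6.0], [0.0, 1.0, 0.0]])
tre = target_registration_error(mapped, reference)   # [3.0, 5.0]

# Two partially overlapping binary volumes: 600 voxels each,
# 200 voxels shared, so DSC = 2*200 / 1200 = 1/3.
vol_a = np.zeros((10, 10, 10), bool); vol_a[:6] = True
vol_b = np.zeros((10, 10, 10), bool); vol_b[4:] = True
dsc = dice_coefficient(vol_a, vol_b)
```

TRE summarizes point-level registration fidelity (the study's 3.0 ± 0.5 mm), while DSC captures how well the projected lesion hologram overlaps the true lesion volume (the study's 0.83 ± 0.12).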
Affiliation(s)
- Ziyu Qi
  - Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Haitao Jin
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
  - NCO School, Army Medical University, Shijiazhuang 050081, China
- Qun Wang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Ruochu Xiong
  - Department of Neurosurgery, Division of Medicine, Graduate School of Medical Sciences, Kanazawa University, Takara-machi 13-1, Kanazawa 920-8641, Japan
- Shiyu Zhang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Minghang Liu
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Xinyu Ding
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Xiaolei Chen
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Jiashu Zhang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Christopher Nimsky
  - Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
  - Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Miriam H. A. Bopp
  - Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
  - Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany

24
Tao B, Fan X, Wang F, Chen X, Shen Y, Wu Y. Comparison of the accuracy of dental implant placement using dynamic and augmented reality-based dynamic navigation: An in vitro study. J Dent Sci 2024; 19:196-202. [PMID: 38303816] [PMCID: PMC10829549] [DOI: 10.1016/j.jds.2023.05.006]
Abstract
Background/purpose Augmented reality has been gradually applied in dental implant surgery. However, whether integrating augmented reality technology into a dynamic navigation system further improves accuracy remains unknown. The purpose of this study was to investigate the accuracy of dental implant placement using dynamic navigation and augmented reality-based dynamic navigation systems. Materials and methods Thirty-two cone-beam CT (CBCT) scans from clinical patients were collected and used to generate 64 phantoms that were allocated to the augmented reality-based dynamic navigation (ARDN) group or the conventional dynamic navigation (DN) group. The primary outcomes were global coronal, apical and angular deviations, measured after image fusion. A linear mixed model with a random intercept was used. A P value < 0.05 was considered to indicate statistical significance. Results A total of 242 dental implants were placed in the two groups. The global coronal, apical and angular deviations of the ARDN and DN groups were 1.31 ± 0.67 mm vs. 1.18 ± 0.59 mm, 1.36 ± 0.67 mm vs. 1.39 ± 0.55 mm, and 3.72 ± 2.13° vs. 3.1 ± 1.56°, respectively. No significant differences were found with regard to coronal and apical deviations (P = 0.16 and 0.60, respectively), but the DN group had a significantly lower angular deviation than the ARDN group (P = 0.02). Conclusion The augmented reality-based dynamic navigation system yielded accuracy similar to the conventional dynamic navigation system at the coronal and apical points, but with a higher angular deviation.
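The three outcome measures compared in this abstract (global coronal, apical, and angular deviation) are conventionally computed from the planned and placed implant entry and apex points. A hedged sketch of those definitions, with made-up coordinates purely for illustration (not the study's data):

```python
import numpy as np

def implant_deviations(plan_entry, plan_apex, placed_entry, placed_apex):
    """Coronal (entry-point) and apical point deviations in mm, plus the
    angular deviation in degrees between the planned and placed implant axes."""
    plan_entry, plan_apex = np.asarray(plan_entry, float), np.asarray(plan_apex, float)
    placed_entry, placed_apex = np.asarray(placed_entry, float), np.asarray(placed_apex, float)
    coronal = float(np.linalg.norm(placed_entry - plan_entry))
    apical = float(np.linalg.norm(placed_apex - plan_apex))
    v_plan = plan_apex - plan_entry          # planned implant axis
    v_placed = placed_apex - placed_entry    # placed implant axis
    cosang = np.dot(v_plan, v_placed) / (np.linalg.norm(v_plan) * np.linalg.norm(v_placed))
    angular = float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return coronal, apical, angular

# Illustrative numbers only: a 10 mm implant whose apex drifts 1 mm sideways.
cor, api, ang = implant_deviations([0, 0, 0], [0, 0, -10], [0, 0, 0], [1, 0, -10])
# cor = 0.0 mm, api = 1.0 mm, ang ≈ 5.7°
```

The same three quantities recur in the Fan et al. and Martinho et al. entries below, which makes them a useful common yardstick when comparing the studies in this list.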
Affiliation(s)
- Baoxin Tao
  - Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
  - National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xingqi Fan
  - Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Feng Wang
  - Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
  - National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Xiaojun Chen
  - Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
  - Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
  - National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China
- Yiqun Wu
  - Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, College of Stomatology, Shanghai Jiao Tong University, Shanghai, China
  - National Center for Stomatology, National Clinical Research Center for Oral Diseases, Shanghai Key Laboratory of Stomatology, Shanghai Research Institute of Stomatology, Shanghai, China

25
Vidal-Sicart S, Goñi E, Cebrecos I, Rioja ME, Perissinotti A, Sampol C, Vidal O, Saavedra-Pérez D, Ferrer A, Martí C, Ferrer Rebolleda J, García Velloso MJ, Orozco-Cortés J, Díaz-Feijóo B, Niñerola-Baizán A, Valdés Olmos RA. Continuous innovation in precision radio-guided surgery. Rev Esp Med Nucl Imagen Mol 2024; 43:39-54. [PMID: 37963516] [DOI: 10.1016/j.remnie.2023.11.001]
Abstract
Since its origins, nuclear medicine has faced technological changes that required it to modify operating modes and adapt protocols. In the field of radioguided surgery, the incorporation of preoperative scintigraphic imaging and intraoperative detection with the gamma probe gave sentinel lymph node biopsy a definitive boost, making it a standard procedure for melanoma and breast cancer. The various technological innovations and the consequent adaptation of protocols combine the disruptive with the gradual: obvious examples are the introduction of SPECT/CT in the preoperative field and drop-in probes in the intraoperative field. Other innovative aspects with possible application in radio-guided surgery are based on artificial intelligence, navigation, and telecare.
Affiliation(s)
- Sergi Vidal-Sicart
  - Servicio de Medicina Nuclear, Hospital Clínic Barcelona, Barcelona, Spain
  - Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- Elena Goñi
  - Servicio de Medicina Nuclear, Hospital Universitario de Navarra, Pamplona, Spain
- Isaac Cebrecos
  - Instituto Clínic de Ginecología, Obstetricia y Neonatología (ICGON), Hospital Clínic Barcelona, Barcelona, Spain
- Andrés Perissinotti
  - Servicio de Medicina Nuclear, Hospital Clínic Barcelona, Barcelona, Spain
  - Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
  - Centro de Investigación Biomédica en Red de Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), ISCIII, Madrid, Spain
- Catalina Sampol
  - Servicio de Medicina Nuclear, Hospital Universitario Son Espases, Palma de Mallorca, Spain
- Oscar Vidal
  - Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
  - Cirugía General y Digestiva, ICMDiM, Hospital Clínic de Barcelona, Barcelona, Spain
  - Departamento de Cirugía, Universitat de Barcelona, Barcelona, Spain
- David Saavedra-Pérez
  - Cirugía General y Digestiva, ICMDiM, Hospital Clínic de Barcelona, Barcelona, Spain
- Ada Ferrer
  - Servicio de Cirugía Maxilofacial, Hospital Clínic Barcelona, Barcelona, Spain
- Carles Martí
  - Servicio de Cirugía Maxilofacial, Hospital Clínic Barcelona, Barcelona, Spain
- José Ferrer Rebolleda
  - Servicio Medicina Nuclear Ascires, Hospital General Universitario de Valencia, Valencia, Spain
- Jhon Orozco-Cortés
  - Servicio de Medicina Nuclear, Hospital Clínico Universitario de Valencia, Valencia, Spain
- Berta Díaz-Feijóo
  - Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
  - Instituto Clínic de Ginecología, Obstetricia y Neonatología (ICGON), Hospital Clínic Barcelona, Barcelona, Spain
  - Departamento de Cirugía, Universitat de Barcelona, Barcelona, Spain
- Aida Niñerola-Baizán
  - Servicio de Medicina Nuclear, Hospital Clínic Barcelona, Barcelona, Spain
  - Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
  - Centro de Investigación Biomédica en Red de Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), ISCIII, Madrid, Spain
  - Departamento de Biomedicina, Facultad de Medicina, Universitat de Barcelona, Barcelona, Spain
- Renato Alfredo Valdés Olmos
  - Department of Radiology, Section of Nuclear Medicine & Interventional Molecular Imaging Laboratory, Leiden University Medical Center, Leiden, The Netherlands

26
Olexa J, Trang A, Kim K, Rakovec M, Saadon J, Parker W. Augmented Reality-Assisted Placement of Ommaya Reservoir for Cyst Aspiration: A Case Report. Cureus 2024; 16:e52383. [PMID: 38371146] [PMCID: PMC10870692] [DOI: 10.7759/cureus.52383]
Abstract
Image guidance technologies can significantly improve the accuracy and safety of intracranial catheter insertions. Augmented reality (AR) allows surgeons to visualize 3D information overlaid onto a patient's head. As such, AR has emerged as a novel image guidance technology that offers unique advantages when navigating intracranial targets. A 71-year-old woman with a history of brain metastasis from breast cancer and prior resection surgery and chemotherapy presented with altered mental status and generalized weakness worse on her left side. Magnetic resonance imaging (MRI) demonstrated right frontotemporoparietal edema with a contrast-enhancing mass. MR perfusion confirmed an active tumor with an enlarging right temporal pole cyst. A cyst aspiration was performed via Ommaya reservoir placement. Neuro-navigation (BrainLab, Munich, Germany) and AR navigation were used to plan the trajectory from the temporal gyrus to the cyst. Post-operative computed tomography (CT) demonstrated good placement of the reservoir, reconstitution of the temporal horn of the lateral ventricle with decreased external mass effect, and no areas of hemorrhage. AR has tremendous potential in the field of neurosurgery for improving the accuracy and safety of procedures. This case demonstrates an encouraging application of AR and can serve as an example to drive expanded clinical use of this technology.
Affiliation(s)
- Joshua Olexa
  - Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Annie Trang
  - Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Kevin Kim
  - Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Maureen Rakovec
  - Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Jordan Saadon
  - Neurosurgery, University of Maryland School of Medicine, Baltimore, USA
- Whitney Parker
  - Neurosurgery, University of Maryland School of Medicine, Baltimore, USA

27
Tătaru OS, Ferro M, Marchioni M, Veccia A, Coman O, Lasorsa F, Brescia A, Crocetto F, Barone B, Catellani M, Lazar A, Petrisor M, Vartolomei MD, Lucarelli G, Antonelli A, Schips L, Autorino R, Rocco B, Azamfirei L. HoloLens® platform for healthcare professionals simulation training, teaching, and its urological applications: an up-to-date review. Ther Adv Urol 2024; 16:17562872241297554. [PMID: 39654822] [PMCID: PMC11626676] [DOI: 10.1177/17562872241297554]
Abstract
Advances in devices and software have put mixed reality at the forefront of teaching medical personnel. The Microsoft® HoloLens 2® offers a unique 3D visualization of a hologram in a physical, real environment and allows urologists to interact with it. This review provides a state-of-the-art analysis of the applications of the HoloLens® in medical and healthcare teaching through simulation designed for medical students, nurses, and residents, especially in urology. Our objective was to comprehensively analyze studies in the PubMed/Medline database from January 2016 to April 2023. Identified articles that researched the Microsoft HoloLens and described feasibility and teaching outcomes in medicine, with an emphasis on urological healthcare, were included. The qualitative analysis identifies an increasing use of the HoloLens in teaching settings covering a wide range of medical sciences (anatomy, anatomic pathology, biochemistry, pharmacogenomics, clinical skills, emergency medicine and nurse education, imaging), and above all urology applications (urological procedures and techniques, skill improvement, perception of complex renal tumors, accuracy of calyx puncture guidance in percutaneous nephrolithotomy, and targeted biopsy of the prostate) stand to benefit most. The future potential of HoloLens technology in teaching is immense. So far, studies have focused on feasibility, applicability, perception, comparisons with traditional methods, and limitations. Moving forward, research should also prioritize the development of applications specifically for urology. This will require validation of needs and the creation of adequate protocols to standardize future research efforts.
Affiliation(s)
- Octavian Sabin Tătaru
  - Department of Simulation Applied in Medicine, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Matteo Ferro
  - Istituto Europeo di Oncologia, IRCCS—Istituto di Ricovero e Cura a Carattere Scientifico, via Ripamonti 435, Milano, Italy
  - Università degli Studi di Milano, Milan, Italy
- Michele Marchioni
  - Department of Medical, Oral and Biotechnological Sciences, G. d’Annunzio University of Chieti, Urology Unit, “SS. Annunziata” Hospital, Chieti, Italy
  - Department of Urology, ASL Abruzzo 2, Chieti, Italy
- Alessandro Veccia
  - Department of Urology, University of Verona, Azienda Ospedaliera Universitaria Integrata of Verona, Verona, Italy
- Oana Coman
  - Department of Simulation Applied in Medicine, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Francesco Lasorsa
  - Department of Emergency and Organ Transplantation, Urology, Andrology and Kidney Transplantation Unit, University of Bari, Bari, Italy
- Antonio Brescia
  - Department of Urology, European Institute of Oncology, IRCCS, Milan, Italy
  - Università degli Studi di Milano, Milan, Italy
- Felice Crocetto
  - Department of Neurosciences and Reproductive Sciences and Odontostomatology, University of Naples Federico II, Naples, Italy
- Biagio Barone
  - Department of Surgical Sciences, Urology Unit, AORN Sant’Anna e San Sebastiano, Caserta, Italy
- Michele Catellani
  - Department of Urology, European Institute of Oncology, IRCCS, Milan, Italy
  - Università degli Studi di Milano, Milan, Italy
- Alexandra Lazar
  - Department of Anesthesia and Intensive Care, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Marius Petrisor
  - Department of Simulation Applied in Medicine, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania
- Giuseppe Lucarelli
  - Department of Emergency and Organ Transplantation, Urology, Andrology and Kidney Transplantation Unit, University of Bari, Bari, Italy
- Alessandro Antonelli
  - Department of Urology, Azienda Ospedaliera Universitaria Integrata of Verona, University of Verona, Verona, Italy
- Luigi Schips
  - Department of Medical, Oral and Biotechnological Sciences, G. d’Annunzio University of Chieti, Urology Unit, “SS. Annunziata” Hospital, Chieti, Italy
  - Department of Urology, ASL Abruzzo 2, Chieti, Italy
- Riccardo Autorino
  - Department of Urology, Rush University Medical Center, Chicago, IL, USA
- Bernardo Rocco
  - Unit of Urology, Department of Health Science, University of Milan, ASST Santi Paolo and Carlo, Milan, Italy
  - Matteo Ferro is also affiliated to Unit of Urology, Department of Health Science, University of Milan, ASST Santi Paolo and Carlo, Milan, Italy
  - Bernardo Rocco is also affiliated to U.O.C. Clinica Urologica, Dipartimento Universitario di Medicina e Chirurgia Traslazionale, Fondazione Policlinico Universitario, IRCCS, Rome, Italy; Università Cattolica del Sacro Cuore, Milan, Italy
  - Giuseppe Lucarelli is also affiliated to Department of Precision and Regenerative Medicine and Ionian Area Urology, Andrology and Kidney Transplantation Unit, Aldo Moro University of Bari, Bari, Italy
- Leonard Azamfirei
  - Department of Anesthesia and Intensive Care, George Emil Palade University of Medicine, Pharmacy, Science, and Technology of Targu Mures, Targu Mures, Romania

28
Kos TM, Colombo E, Bartels LW, Robe PA, van Doormaal TPC. Evaluation Metrics for Augmented Reality in Neurosurgical Preoperative Planning, Surgical Navigation, and Surgical Treatment Guidance: A Systematic Review. Oper Neurosurg (Hagerstown) 2023. [PMID: 38146941] [PMCID: PMC11008635] [DOI: 10.1227/ons.0000000000001009]
Abstract
BACKGROUND AND OBJECTIVE Recent years have shown an advancement in the development of augmented reality (AR) technologies for preoperative visualization, surgical navigation, and intraoperative guidance for neurosurgery. However, proving added value for AR in clinical practice is challenging, partly because of a lack of standardized evaluation metrics. We performed a systematic review to provide an overview of the reported evaluation metrics for AR technologies in neurosurgical practice and to establish a foundation for assessment and comparison of such technologies. METHODS PubMed, Embase, and Cochrane were searched systematically for publications on assessment of AR for cranial neurosurgery on September 22, 2022. The findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. RESULTS The systematic search yielded 830 publications; 114 were screened full text, and 80 were included for analysis. Among the included studies, 5% dealt with preoperative visualization using AR, with user perception as the most frequently reported metric. The majority (75%) researched AR technology for surgical navigation, with registration accuracy, clinical outcome, and time measurements as the most frequently reported metrics. In addition, 20% studied the use of AR for intraoperative guidance, with registration accuracy, task outcome, and user perception as the most frequently reported metrics. CONCLUSION For quality benchmarking of AR technologies in neurosurgery, evaluation metrics should be specific to the risk profile and clinical objectives of the technology. A key focus should be on using validated questionnaires to assess user perception; ensuring clear and unambiguous reporting of registration accuracy, precision, robustness, and system stability; and accurately measuring task performance in clinical studies. We provided an overview suggesting which evaluation metrics to use per AR application and innovation phase, aiming to improve the assessment of the added value of AR for neurosurgical practice and to facilitate its integration into the clinical workflow.
Affiliation(s)
- Tessa M Kos
  - Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
  - Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Elisa Colombo
  - Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
- L Wilbert Bartels
  - Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
- Pierre A Robe
  - Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands
- Tristan P C van Doormaal
  - Department of Neurosurgery, Clinical Neuroscience Center, Universitätsspital Zürich, Zurich, Switzerland
  - Department of Neurosurgery, University Medical Center Utrecht, Utrecht, The Netherlands

29
Qi Z, Bopp MHA, Nimsky C, Chen X, Xu X, Wang Q, Gan Z, Zhang S, Wang J, Jin H, Zhang J. A Novel Registration Method for a Mixed Reality Navigation System Based on a Laser Crosshair Simulator: A Technical Note. Bioengineering (Basel) 2023; 10:1290. [PMID: 38002414] [PMCID: PMC10669875] [DOI: 10.3390/bioengineering10111290]
Abstract
Mixed Reality Navigation (MRN) is pivotal in augmented reality-assisted intelligent neurosurgical interventions. However, existing MRN registration methods face challenges in concurrently achieving low user dependency, high accuracy, and clinical applicability. This study proposes and evaluates a novel registration method based on a laser crosshair simulator, evaluating its feasibility and accuracy. A novel registration method employing a laser crosshair simulator was introduced, designed to replicate the scanner frame's position on the patient. The system autonomously calculates the transformation, mapping coordinates from the tracking space to the reference image space. A mathematical model and workflow for registration were designed, and a Universal Windows Platform (UWP) application was developed on HoloLens-2. Finally, a head phantom was used to measure the system's target registration error (TRE). The proposed method was successfully implemented, obviating the need for user interactions with virtual objects during the registration process. Regarding accuracy, the average deviation was 3.7 ± 1.7 mm. This method shows encouraging results in efficiency and intuitiveness and marks a valuable advancement in low-cost, easy-to-use MRN systems. The potential for enhancing accuracy and adaptability in intervention procedures positions this approach as promising for improving surgical outcomes.
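Mapping coordinates from tracking space to reference image space, as described above, amounts to composing rigid homogeneous transforms. A generic sketch follows; the frame names and numeric values are illustrative assumptions, not the paper's calibration values:

```python
import numpy as np

def make_transform(rotation, translation):
    """Assemble a 4x4 homogeneous rigid transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def apply_transform(T, point):
    """Map a 3D point through a homogeneous transform."""
    return (T @ np.append(np.asarray(point, float), 1.0))[:3]

# Hypothetical chain: tracking space -> simulator frame -> reference image space.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])                      # 90° rotation about z
T_sim_track = make_transform(Rz90, [0.0, 0.0, 0.0])      # tracking -> simulator
T_img_sim = make_transform(np.eye(3), [5.0, 0.0, 0.0])   # simulator -> image (5 mm shift)
T_img_track = T_img_sim @ T_sim_track                    # compose right-to-left
p_img = apply_transform(T_img_track, [1.0, 0.0, 0.0])    # -> array([5., 1., 0.])
```

Because the transforms compose right-to-left, the full tracking-to-image mapping is a single matrix product that can be precomputed once per registration and applied to every tracked point thereafter.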
Affiliation(s)
- Ziyu Qi
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
- Miriam H. A. Bopp
  - Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
  - Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Christopher Nimsky
  - Department of Neurosurgery, University of Marburg, Baldingerstrasse, 35043 Marburg, Germany
  - Center for Mind, Brain and Behavior (CMBB), 35043 Marburg, Germany
- Xiaolei Chen
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Xinghua Xu
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Qun Wang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
- Zhichao Gan
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Shiyu Zhang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Jingyue Wang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
- Haitao Jin
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China
  - Medical School of Chinese PLA, Beijing 100853, China
  - NCO School, Army Medical University, Shijiazhuang 050081, China
- Jiashu Zhang
  - Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing 100853, China

30
Fan X, Tao B, Tu P, Shen Y, Wu Y, Chen X. A novel mixed reality-guided dental implant placement navigation system based on virtual-actual registration. Comput Biol Med 2023; 166:107560. [PMID: 37847946] [DOI: 10.1016/j.compbiomed.2023.107560]
Abstract
BACKGROUND The key to successful dental implant surgery is to place the implants accurately along the pre-operatively planned paths. The application of surgical navigation systems can significantly improve the safety and accuracy of implantation. However, surgeons must frequently shift their view between the surgical site and the computer screen; mixed-reality technology, delivered through a wearable HoloLens device, is expected to solve this problem by aligning the virtual three-dimensional (3D) image with the actual surgical site in the same field of view. METHODS This study utilized mixed reality technology to enhance dental implant surgery navigation. Our first step was reconstructing a virtual 3D model from pre-operative cone-beam CT (CBCT) images. We then obtained the relative positions between objects using the navigation device and the HoloLens camera. Via virtual-actual registration algorithms, the transformation matrices between the HoloLens device and the navigation tracker were acquired through HoloLens-tracker registration, and the transformation matrices between the virtual model and the patient phantom through image-phantom registration. In addition, a surgical drill calibration algorithm provided the transformation matrices between the surgical drill and the patient phantom. These algorithms allow real-time tracking of the surgical drill's location and orientation relative to the patient phantom under the navigation device. With the aid of the HoloLens 2, virtual 3D images and actual patient phantoms can be aligned accurately, providing surgeons with a clear visualization of the implant path. RESULTS Phantom experiments were conducted using 30 patient phantoms, with a total of 102 dental implants inserted. Comparisons between the actual implant paths and the pre-operatively planned paths showed that our system achieved a coronal deviation of 1.507 ± 0.155 mm, an apical deviation of 1.542 ± 0.143 mm, and an angular deviation of 3.468 ± 0.339°. These deviations were not significantly different from those of navigation-guided dental implant placement and were better than those of freehand placement. CONCLUSION Our proposed system integrates the pre-operatively planned dental implant paths with the patient phantom, helping surgeons achieve adequate accuracy in dental implant surgery. Furthermore, this system is expected to be applicable to animal and cadaveric experiments in further studies.
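Paired-point rigid registration of the kind used for image-to-phantom alignment is commonly solved in closed form with the Kabsch/Umeyama SVD method. The following is a generic sketch on synthetic points (the point cloud and pose are fabricated for the demonstration), not the authors' implementation:

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (R, t) mapping source points onto target
    points (Kabsch method, rotation + translation, no scaling)."""
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Synthetic check: move a point cloud by a known pose, then recover that pose.
rng = np.random.default_rng(0)
pts = rng.normal(size=(6, 3))                    # six fiducial-like points
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([2.0, -1.0, 0.5])
moved = pts @ R_true.T + t_true
R_est, t_est = rigid_register(pts, moved)
residual = np.linalg.norm(moved - (pts @ R_est.T + t_est), axis=1).max()
```

With exact correspondences the residual is at numerical precision; with real fiducial measurements the leftover residual is the fiducial registration error, which is why studies in this list report TRE at separate target points instead.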
Collapse
Affiliation(s)
- Xingqi Fan: Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baoxin Tao: Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Puxun Tu: Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen: Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yiqun Wu: Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xiaojun Chen: Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
31
Martinho FC, Griffin IL, Price JB, Tordik PA. Augmented Reality and 3-Dimensional Dynamic Navigation System Integration for Osteotomy and Root-end Resection. J Endod 2023; 49:1362-1368. [PMID: 37453501 DOI: 10.1016/j.joen.2023.07.007] [Citation(s) in RCA: 11] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/06/2023] [Revised: 07/03/2023] [Accepted: 07/05/2023] [Indexed: 07/18/2023]
Abstract
INTRODUCTION Augmented reality (AR) superimposes high-definition computer-generated virtual content onto the existing environment, providing users with an enhanced perception of reality. This study investigates the feasibility of integrating an AR head-mounted device into a 3-dimensional dynamic navigation system (3D-DNS) for osteotomy and root-end resection (RER), and compares the accuracy and efficiency of AR + 3D-DNS with 3D-DNS alone. METHODS Seventy-two tooth roots of 3D-printed surgical jaw models were divided into two groups: AR + 3D-DNS (n = 36) and 3D-DNS (n = 36). Cone-beam computed tomography scans were taken pre- and postoperatively. The osteotomy and RER were virtually planned on X-guide software and delivered under 3D-DNS guidance. For the AR + 3D-DNS group, an AR head-mounted device (Microsoft HoloLens 2) was integrated into the 3D-DNS. The 2D and 3D deviations were calculated, and the osteotomy and RER time and the number of procedural mishaps were recorded. RESULTS Osteotomy and RER were completed in all samples (72/72). AR + 3D-DNS was more accurate than 3D-DNS, showing lower 2D and 3D deviation values (P < .05), and was more time-efficient (P < .05). There was no significant difference in the number of mishaps (P > .05). CONCLUSIONS Within the limitations of this in vitro study, integrating an AR head-mounted device into 3D-DNS is feasible for osteotomy and RER. AR improved the accuracy and time efficiency of 3D-DNS, and head-mounted AR has the potential to be safely and reliably integrated into 3D-DNS for endodontic microsurgery.
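Accuracy studies like this one (and the implant study above) typically report a positional tip deviation and an angular deviation between the planned and achieved paths. A minimal sketch of those two metrics is shown below; the function name and the example input values are illustrative, not taken from the paper.

```python
import numpy as np

def path_deviation(planned_tip, actual_tip, planned_axis, actual_axis):
    """Positional tip deviation (same units as input) and angular deviation
    (degrees) between a planned and an achieved instrument path."""
    tip_dev = np.linalg.norm(np.asarray(actual_tip, float) - np.asarray(planned_tip, float))
    a = np.asarray(planned_axis, float)
    b = np.asarray(actual_axis, float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    ang_dev = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))  # clip guards rounding
    return tip_dev, ang_dev

# Illustrative case: tip off by 0.5 units, axis tilted by 3 degrees.
tilt = np.radians(3.0)
tip, ang = path_deviation([0, 0, 0], [0.5, 0, 0],
                          [0, 0, 1], [0, np.sin(tilt), np.cos(tilt)])
```

The clip around the cosine matters in practice: numerically, nearly parallel unit vectors can produce a dot product marginally above 1, which would make `arccos` return NaN.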
Affiliation(s)
- Frederico C Martinho: Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland School of Dentistry, Baltimore, Maryland
- Ina L Griffin: Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland School of Dentistry, Baltimore, Maryland
- Jeffery B Price: Division of Oral Radiology, Department of Oncology and Diagnostic Sciences, University of Maryland School of Dentistry, Baltimore, Maryland
- Patricia A Tordik: Division of Endodontics, Department of Advanced Oral Sciences and Therapeutics, University of Maryland School of Dentistry, Baltimore, Maryland
32
Remschmidt B, Rieder M, Gsaxner C, Gaessler J, Payer M, Wallner J. Augmented Reality-Guided Apicoectomy Based on Maxillofacial CBCT Scans. Diagnostics (Basel) 2023; 13:3037. [PMID: 37835780 PMCID: PMC10572956 DOI: 10.3390/diagnostics13193037] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/14/2023] [Revised: 09/13/2023] [Accepted: 09/21/2023] [Indexed: 10/15/2023] Open
Abstract
Implementation of augmented reality (AR) image guidance systems using preoperative cone beam computed tomography (CBCT) scans in apicoectomies promises to help surgeons avoid the iatrogenic complications associated with this procedure. This study evaluates the intraoperative feasibility and usability of HoloLens 2, an established AR image guidance device, in the context of apicoectomies. Three experienced surgeons each carried out four AR-guided apicoectomies on human cadaver head specimens. The preparation and operating times of each procedure were measured, and the subjective usability of HoloLens for AR image guidance in apicoectomies was assessed with the System Usability Scale (SUS). In total, twelve AR-guided apicoectomies were performed on six human cadaver head specimens (n = 12). The average preparation time was 162 (±34) s, and the surgical procedure itself took on average 9 (±2) min, with no statistically significant difference between the three surgeons. Quantification of the usability of HoloLens yielded a mean SUS score of 80.4 (±6.8), an "excellent" usability level. In conclusion, this study suggests the suitability, practicality, and simplicity of AR image guidance systems such as the HoloLens in apicoectomies and advocates their routine implementation.
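The SUS score of 80.4 reported here comes from the standard 10-item questionnaire; the scoring rule below is the standard SUS formula, not code from the paper, and the example responses are hypothetical.

```python
def sus_score(responses):
    """System Usability Scale: 10 items, each rated 1-5.
    Odd-numbered items contribute (r - 1), even-numbered items (5 - r);
    the sum is scaled by 2.5 to give a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i=0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical respondent: 4s on odd items, 2s on even items -> 3 points each -> 75.0
score = sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2])
```

A mean score above roughly 80, as in this study, is conventionally read as "excellent" on the SUS adjective scale.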
Affiliation(s)
- Bernhard Remschmidt: Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria; Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Marcus Rieder: Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria; Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Christina Gsaxner: Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria
- Jan Gaessler: Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria; Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Michael Payer: Division of Oral Surgery and Orthodontics, Department of Dental Medicine and Oral Health, Medical University of Graz, 8010 Graz, Austria
- Juergen Wallner: Division of Oral and Maxillofacial Surgery, Department of Dental Medicine and Oral Health, Medical University of Graz, 8036 Graz, Austria
33
Tu M, Jung H, Moghadam A, Raythatha J, Hsu J, Kim J. Exploring the Performance of Geometry-Based Markerless Registration in a Simulated Surgical Environment: A Comparative Study of Registration Algorithms in Medical Augmented Reality. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-4. [PMID: 38083251 DOI: 10.1109/embc40787.2023.10341197] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/18/2023]
Abstract
Augmented reality (AR) has been utilized in multiple applications in the medical field, such as augmenting computed tomography (CT) images onto the patient's body during surgery. One of the challenges in its utilization, however, is registering the pre-operative CT images to the patient's body accurately. The current registration process requires the prior attachment of tracking markers and their localization within the body and the CT images. This process can be cumbersome, error-prone, and dependent on the surgeon's experience; moreover, medical instruments, drapes, or the body itself may occlude the markers. In light of these limitations, markerless registration algorithms have the potential to aid the registration process in the clinical setting. While these algorithms have been used successfully in other sectors, such as multimedia, they have not yet been thoroughly investigated in a clinical setting, especially in surgery, where patient positioning and the surgical environment present more challenging cases. In this paper, we benchmarked and evaluated the performance of 6 state-of-the-art markerless registration algorithms from the multimedia sector by registering a CT image onto a whole-body phantom dataset acquired from a simulated surgical environment. We also analyzed the suitability of these algorithms for the surgical setting and discussed their potential for the advancement of AR-assisted surgery. Clinical Relevance: Our study provides insight into the potential of AR-assisted surgery and helps practitioners choose the most suitable registration algorithm for their needs, improving patient outcomes, reducing the risk of surgical errors, and shortening preoperative planning time.
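The abstract does not name the six benchmarked algorithms, but most rigid registration pipelines (marker-based or markerless, e.g. each iteration of ICP) share the same least-squares core: the Kabsch/SVD solution for the rotation and translation aligning two corresponded point sets. A self-contained sketch of that core, offered as generic background rather than the paper's method:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q (both N x 3),
    the alignment step at the heart of many registration pipelines."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection solutions
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Recover a known 90-degree rotation about z plus a translation:
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
P = np.random.default_rng(0).random((50, 3))  # synthetic "CT" point cloud
Q = P @ R_true.T + t_true                     # the same cloud seen in the "phantom" frame
R, t = kabsch(P, Q)
```

Markerless methods differ mainly in how they establish the correspondences fed into this step (geometric features, learned descriptors, nearest neighbors), which is exactly where the occlusion and patient-positioning challenges described above arise.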
34
Ochi Y, Yanai S, Yoshino Y, Sawada M, Sakate S, Kanno K, Andou M. Clinical use of mixed reality for laparoscopic myomectomy. Int J Gynaecol Obstet 2023. [PMID: 36965106 DOI: 10.1002/ijgo.14765] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/03/2022] [Revised: 01/17/2023] [Accepted: 03/13/2023] [Indexed: 03/27/2023]
35
Zari G, Condino S, Cutolo F, Ferrari V. Magic Leap 1 versus Microsoft HoloLens 2 for the Visualization of 3D Content Obtained from Radiological Images. Sensors (Basel) 2023; 23:3040. [PMID: 36991751 PMCID: PMC10054537 DOI: 10.3390/s23063040] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 01/24/2023] [Revised: 03/01/2023] [Accepted: 03/08/2023] [Indexed: 06/19/2023]
Abstract
The adoption of extended reality solutions is growing rapidly in the healthcare world. Augmented reality (AR) and virtual reality (VR) interfaces can bring advantages to various medical-health sectors; it is thus not surprising that the medical mixed reality (MR) market is among the fastest-growing ones. The present study reports a comparison between two of the most popular MR head-mounted displays, Magic Leap 1 and Microsoft HoloLens 2, for the visualization of 3D medical imaging data. We evaluate the functionalities and performance of both devices through a user study in which surgeons and residents assessed the visualization of 3D computer-generated anatomical models. The digital content was obtained through a dedicated medical imaging suite (Verima imaging suite) developed by the Italian start-up Witapp s.r.l. According to our performance analysis in terms of frame rate, there are no significant differences between the two devices. The surgical staff expressed a clear preference for Magic Leap 1, particularly for its better visualization quality and ease of interaction with the 3D virtual content. Nonetheless, even though the questionnaire results were slightly more positive for Magic Leap 1, the spatial understanding of the 3D anatomical model, in terms of depth relations and spatial arrangement, was positively evaluated for both devices.
Affiliation(s)
- Giulia Zari: Information Engineering Department, University of Pisa, Via Girolamo Caruso 16, 56122 Pisa, Italy
- Sara Condino: Information Engineering Department, University of Pisa, Via Girolamo Caruso 16, 56122 Pisa, Italy; EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo: Information Engineering Department, University of Pisa, Via Girolamo Caruso 16, 56122 Pisa, Italy; EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy
- Vincenzo Ferrari: Information Engineering Department, University of Pisa, Via Girolamo Caruso 16, 56122 Pisa, Italy; EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56126 Pisa, Italy