1. Lewis KO, Popov V, Fatima SS. From static web to metaverse: reinventing medical education in the post-pandemic era. Ann Med 2024;56:2305694. PMID: 38261592; PMCID: PMC10810636; DOI: 10.1080/07853890.2024.2305694. Open access.
Abstract
The advancement of computer technology in the 1960s and the World Wide Web in the 1990s laid the groundwork for substantial, simultaneous change in many facets of our lives, including medicine, health care, and medical education. The traditional didactic approach has shifted towards more dynamic and interactive methods, leveraging technologies such as simulation tools, virtual reality, and online platforms. At the forefront is a remarkable evolution that has revolutionized how medical knowledge is accessed, disseminated, and integrated into pedagogical practices. The COVID-19 pandemic also led to rapid, large-scale adoption of e-learning and digital resources in medical education because of widespread lockdowns, social distancing measures, and the closure of medical schools and healthcare training programs. This review paper examines the evolution of medical education from the Flexnerian era to the modern digital age, closely examining the influence of the evolving WWW and its shift from Education 1.0 to Education 4.0. This evolution has been further accentuated by the transition from the static landscapes of Web 2D to the immersive realms of Web 3D, especially considering the growing notion of the metaverse. The metaverse is an interconnected, virtual shared space that combines virtual reality (VR), augmented reality (AR), and mixed reality (MR), creating fertile ground for simulation-based training, collaborative learning, and experiential skill acquisition for competency development. This review covers the multifaceted applications of the metaverse in medical education, outlining both its benefits and challenges, and highlights, through case studies and examples, the innovative potential of the metaverse as a platform for immersive learning experiences. Moreover, the review addresses the role of emerging technologies in shaping the post-pandemic future of medical education, culminating in a series of recommendations tailored for medical institutions aiming to capitalize on these revolutionary changes.
Affiliation(s)
- Kadriye O. Lewis
- Children’s Mercy Kansas City, Department of Pediatrics, UMKC School of Medicine, Kansas City, MO, USA
- Vitaliy Popov
- Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Syeda Sadia Fatima
- Department of Biological and Biomedical Sciences, The Aga Khan University, Karachi, Pakistan
2. Herzog I, Mendiratta D, Para A, Berg A, Kaushal N, Vives M. Assessing the potential role of ChatGPT in spine surgery research. J Exp Orthop 2024;11:e12057. PMID: 38873173; PMCID: PMC11170336; DOI: 10.1002/jeo2.12057. Open access.
Abstract
Purpose Since its release in November 2022, Chat Generative Pre-Trained Transformer 3.5 (ChatGPT), a complex machine learning model, has garnered more than 100 million users worldwide. The aim of this study was to determine how well ChatGPT can generate novel systematic review ideas on topics within spine surgery. Methods ChatGPT was instructed to give ten novel systematic review ideas for each of five popular topics in the spine surgery literature: microdiscectomy, laminectomy, spinal fusion, kyphoplasty and disc replacement. A comprehensive literature search was conducted in PubMed, CINAHL, EMBASE and Cochrane. The numbers of nonsystematic review articles and of systematic review papers that had been published on each ChatGPT-generated idea were recorded. Results Overall, ChatGPT had a 68% accuracy rate in creating novel systematic review ideas. More specifically, the accuracy rates were 80%, 80%, 40%, 70% and 70% for microdiscectomy, laminectomy, spinal fusion, kyphoplasty and disc replacement, respectively. However, at a rate of 32%, ChatGPT generated ideas for which no nonsystematic review articles had been published. The success rates of generating novel systematic review ideas for which nonsystematic reviews had also been published were 71.4%, 50%, 22.2%, 50%, 62.5% and 51.2% for microdiscectomy, laminectomy, spinal fusion, kyphoplasty, disc replacement and overall, respectively. Conclusions ChatGPT generated novel systematic review ideas at an overall rate of 68%. When used under the supervision of an experienced spine specialist, ChatGPT can help identify knowledge gaps in spine research that warrant further investigation. The technology can be erroneous and lacks intrinsic logic, so it should never be used in isolation. Level of Evidence Not applicable.
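The two rates reported in this abstract (overall novelty, and novelty backed by existing nonsystematic literature) amount to simple bookkeeping over per-idea publication counts. A minimal sketch of that bookkeeping, using purely illustrative counts rather than the study's data:

```python
def review_idea_rates(ideas):
    """ideas: list of (n_systematic, n_nonsystematic) publication counts
    found for each generated systematic-review idea."""
    novel = [i for i in ideas if i[0] == 0]      # no prior systematic review exists
    supported = [i for i in novel if i[1] > 0]   # novel AND nonsystematic work exists
    return len(novel) / len(ideas), len(supported) / len(ideas)

# Illustrative toy counts (not the study data): 5 ideas, 3 novel, 2 of those supported.
print(review_idea_rates([(0, 4), (0, 0), (1, 7), (0, 2), (2, 3)]))  # (0.6, 0.4)
```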
Affiliation(s)
- Isabel Herzog
- Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Ashok Para
- Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Ari Berg
- Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Neil Kaushal
- Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Michael Vives
- Rutgers New Jersey Medical School, Newark, New Jersey, USA
3. Mor E, Tejman-Yarden S, Mor-Hadar D, Assaf D, Eifer M, Nagar N, Vazhgovsky O, Duffield J, Henderson MA, Speakman D, Snow H, Gyorki DE. 3D-SARC: A Pilot Study Testing the Use of a 3D Augmented-Reality Model with Conventional Imaging as a Preoperative Assessment Tool for Surgical Resection of Retroperitoneal Sarcoma. Ann Surg Oncol 2024. PMID: 38898325; DOI: 10.1245/s10434-024-15634-w.
Abstract
BACKGROUND Retroperitoneal sarcomas (RPSs) present a surgical challenge, with complex anatomic relationships to organs and vascular structures. This pilot study investigated the role of three-dimensional (3D) augmented reality (3DAR) compared with standard imaging in preoperative planning and resection strategies. METHODS Thirteen patients who underwent surgical resection of their RPS were selected based on the location of their tumor (right, left, pelvis). From the patients' preoperative computed tomography (CT) scans, 3DAR models were created using the D2P program and projected through augmented-reality (AR) glasses (HoloLens). The 3DAR models were evaluated by three experienced sarcoma surgeons and compared with the baseline two-dimensional (2D) contrast-enhanced CT scans. RESULTS Three members of the surgical team evaluated 13 models of retroperitoneal sarcomas, resulting in a total of 26 responses. When asked whether the 3DAR better prepared the surgeon for the planned surgical resection, 10 responses favored the 3DAR, 5 favored the 2D CT scans, and 11 showed no difference (p = 0.074). According to 15 (57.7%) of the 26 responses, the 3DAR offered additional value over standard imaging in preoperative planning (median score 4; range, 1-5). The median stated likelihood that the surgeons would consult the 3DAR was 5 (range, 2-5) for the preoperative setting and 3 (range, 1-5) for the intraoperative setting. CONCLUSIONS This pilot study suggests that the use of 3DAR may provide additional value over current standard imaging in preoperative planning for surgical resection of RPS, and the technology merits further study.
Affiliation(s)
- Eyal Mor
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- The Surgical Oncology Unit - Division of Surgery, Sheba Medical Center, Tel Hashomer, Affiliated with the Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel
- Faculty of Medicine, Tel Aviv University, Ramat Aviv, Tel Aviv, Israel
- Shai Tejman-Yarden
- Faculty of Medicine, Tel Aviv University, Ramat Aviv, Tel Aviv, Israel
- The Edmond J. Safra International Congenital Heart Center, Sheba Medical Center, Ramat Gan, Israel
- The Engineering Medical Research Lab, Sheba Medical Center, Ramat Gan, Israel
- Danielle Mor-Hadar
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Faculty of Medicine, Tel Aviv University, Ramat Aviv, Tel Aviv, Israel
- Dan Assaf
- The Surgical Oncology Unit - Division of Surgery, Sheba Medical Center, Tel Hashomer, Affiliated with the Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel
- Faculty of Medicine, Tel Aviv University, Ramat Aviv, Tel Aviv, Israel
- Michal Eifer
- Faculty of Medicine, Tel Aviv University, Ramat Aviv, Tel Aviv, Israel
- Cancer Imaging, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Netanel Nagar
- Industrial Design Department, Shenkar College of Engineering, Design and Art, Ramat-Gan, Israel
- Oliana Vazhgovsky
- The Engineering Medical Research Lab, Sheba Medical Center, Ramat Gan, Israel
- Jaime Duffield
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Michael A Henderson
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Sir Peter MacCallum Department of Oncology, University of Melbourne, Melbourne, VIC, Australia
- David Speakman
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Sir Peter MacCallum Department of Oncology, University of Melbourne, Melbourne, VIC, Australia
- Hayden Snow
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Sir Peter MacCallum Department of Oncology, University of Melbourne, Melbourne, VIC, Australia
- David E Gyorki
- Division of Cancer Surgery, Peter MacCallum Cancer Centre, Melbourne, VIC, Australia
- Sir Peter MacCallum Department of Oncology, University of Melbourne, Melbourne, VIC, Australia
4. Sullivan J, Skladman R, Varagur K, Tenenbaum E, Sacks JL, Martin C, Gordon T, Murphy J, Moritz WR, Sacks JM. From Augmented to Virtual Reality in Plastic Surgery: Blazing the Trail to a New Frontier. J Reconstr Microsurg 2024;40:398-406. PMID: 37884060; DOI: 10.1055/a-2199-3870.
Abstract
BACKGROUND Augmented reality (AR) and virtual reality (VR), together termed mixed reality, have shown promise in the care of operative patients. Currently, AR and VR have well-known applications in craniofacial surgery, specifically in preoperative planning. However, the application of AR/VR technology to other reconstructive challenges has not been widely adopted. Thus, the purpose of this investigation is to outline the current applications of AR and VR in the operative setting. METHODS The literature pertaining to the use of AR/VR technology in the operative setting was examined. Emphasis was placed on the use of mixed reality technology in surgical subspecialties, including plastic surgery, oral and maxillofacial surgery, colorectal surgery, neurosurgery, otolaryngology, and orthopaedic surgery. RESULTS Presently, mixed reality is widely used in the care of patients requiring complex reconstruction of the craniomaxillofacial skeleton for pre- and intraoperative planning. For upper extremity amputees, there is evidence that VR may be efficacious in the treatment of phantom limb pain. Furthermore, VR has untapped potential as a cost-effective tool for microsurgical education and for training residents on techniques in surgical and nonsurgical aesthetic treatment. Mixed reality also has utility in breast reconstruction for preoperative planning, mapping perforators, and decreasing operative time. VR has well-documented applications in the planning of deep inferior epigastric perforator flaps by creating three-dimensional immersive simulations based on a patient's preoperative computed tomography angiogram. CONCLUSION The benefits of AR and VR are numerous for both patients and surgeons. VR has been shown to increase surgical precision and decrease operative time, and it is effective for patient-specific rehearsal, which uses the patient's exact anatomical data to rehearse a procedure before performing it on the actual patient. Taken together, AR/VR technology can improve patient outcomes, decrease operative times, and lower the burden of care on both patients and health care institutions.
Affiliation(s)
- Janessa Sullivan
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Rachel Skladman
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Kaamya Varagur
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Elijah Tenenbaum
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Jacob L Sacks
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Cameron Martin
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Terry Gordon
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- John Murphy
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- William R Moritz
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
- Justin M Sacks
- Division of Plastic and Reconstructive Surgery, Department of Surgery, Washington University in St. Louis School of Medicine, St. Louis, Missouri
5. Saccenti L, Bessy H, Ben Jedidia B, Longere B, Tortolano L, Derbel H, Luciani A, Kobeiter H, Grandpierre T, Tacher V. Performance Comparison of Augmented Reality Versus Ultrasound Guidance for Puncture: A Phantom Study. Cardiovasc Intervent Radiol 2024. PMID: 38710797; DOI: 10.1007/s00270-024-03727-8.
Abstract
PURPOSE Augmented reality (AR) is an innovative approach that could assist percutaneous procedures; by directly seeing "through" a phantom, targeting a lesion might be more intuitive than using ultrasound (US). The objective of this study was to compare the performance of experienced interventional radiologists and untrained operators in soft tissue lesion puncture using AR guidance and standard US guidance. MATERIALS AND METHODS Three trained interventional radiologists with 5-10 years of experience and three untrained operators performed punctures of five targets in an abdominal phantom, with US guidance and AR guidance. Correct targeting, accuracy (defined as the Euclidean distance between the needle tip and the center of the target), planning time, and puncture time were documented. RESULTS Accuracy was higher for the trained group than the untrained group using US guidance (1 mm versus 4 mm, p = 0.001), but not when using AR guidance (4 mm vs. 4 mm, p = 0.76). With all operators combined, no significant difference was found in accuracy between US and AR guidance (2 mm vs. 4 mm, p = 0.09), but planning time and puncture time were significantly shorter using AR (15.1 s vs. 74 s, p < 0.001, and 16.1 s vs. 59 s, p < 0.001, respectively). CONCLUSION Untrained and trained operators obtained comparable accuracy in percutaneous punctures when using AR guidance, whereas US performance was better in the experienced group. With all operators combined, accuracy was similar between US and AR guidance, but planning and puncture times were shorter with AR guidance.
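The accuracy metric used in this study, the Euclidean distance between the needle tip and the target center, is straightforward to compute from tracked 3D coordinates. A minimal sketch with made-up coordinates (not data from the study):

```python
import math

def puncture_accuracy_mm(tip, target):
    """Euclidean distance (mm) between needle-tip and target-center coordinates."""
    return math.dist(tip, target)

# Made-up 3D positions in millimetres: a tip that is 2 mm off in both x and y.
print(round(puncture_accuracy_mm((10.0, 22.0, 5.0), (12.0, 24.0, 5.0)), 2))  # 2.83
```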
Affiliation(s)
- Laetitia Saccenti
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Hugo Bessy
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Benjamin Longere
- Department of Cardiovascular Radiology, CHU Lille, 59000 Lille, France
- Haytham Derbel
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Alain Luciani
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Hicham Kobeiter
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
- Thierry Grandpierre
- Ecole superieure d'ingenieurs en electrotechnique et electronique, ESIEE Paris, Noisy Le Grand, France
- Vania Tacher
- Imagerie Medicale, Hopital Henri Mondor, Creteil, France
- Henri Mondor's Institute of Biomedical Research, Inserm, U955 Team N°18, Creteil, France
6. Pojskić M, Bopp M, Saß B, Nimsky C. Single-Center Experience of Resection of 120 Cases of Intradural Spinal Tumors. World Neurosurg 2024. PMID: 38642835; DOI: 10.1016/j.wneu.2024.04.071.
Abstract
BACKGROUND Our study presents a single-center experience of resection of intradural spinal tumors with or without intraoperative computed tomography-based registration and microscope-based augmented reality (AR). Microscope-based AR was recently described for improved orientation in the operative field in spine surgery, using superimposed images of segmented structures of interest in a two-dimensional or three-dimensional mode. METHODS All patients who underwent surgery for resection of intradural spinal tumors at our department were retrospectively included in the study. Clinical outcomes in terms of postoperative neurologic deficits and complications were evaluated, as were neuroradiologic outcomes for tumor remnants and recurrence. RESULTS 112 patients (57 female, 55 male; median age 55.8 ± 17.8 years) who underwent 120 surgeries for resection of intradural spinal tumors with the use of intraoperative neuromonitoring were included, with a median follow-up of 39 ± 34.4 months. Nine patients died during follow-up for reasons unrelated to surgery. The most common tumors were meningiomas (n = 41), schwannomas (n = 37), myxopapillary ependymomas (n = 12), ependymomas (n = 10), and others (n = 20). Tumors were located in the thoracic spine (n = 46), lumbar spine (n = 39), cervical spine (n = 32), lumbosacral spine (n = 1), thoracic and lumbar spine (n = 1), and cervical, thoracic, and lumbar spine (n = 1). Four biopsies, 10 partial resections, 13 subtotal resections, and 93 gross total resections were performed. Laminectomy was the most common approach. In 79 cases, patients had neurologic deficits before surgery, with ataxia and paraparesis the most common. After surgery, 67 patients were unchanged, 49 improved, and 4 worsened. Operative time, extent of resection, clinical outcome, and complication rate did not differ between the AR and non-AR groups. However, the use of AR improved orientation in the operative field by identification of important neurovascular structures. CONCLUSIONS High rates of gross total resection with favorable neurologic outcomes in most patients, as well as low recurrence rates with comparable complication rates, were noted in our single-center experience. AR improved intraoperative orientation and increased surgeons' comfort by enabling early identification of important anatomic structures, although clinical and radiologic outcomes did not differ between the AR and non-AR groups.
Affiliation(s)
- Mirza Pojskić
- Department of Neurosurgery, University of Marburg, Marburg, Germany
- Miriam Bopp
- Department of Neurosurgery, University of Marburg, Marburg, Germany; Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
- Benjamin Saß
- Department of Neurosurgery, University of Marburg, Marburg, Germany
- Christopher Nimsky
- Department of Neurosurgery, University of Marburg, Marburg, Germany; Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
7. Pressman SM, Borna S, Gomez-Cabello CA, Haider SA, Haider C, Forte AJ. AI and Ethics: A Systematic Review of the Ethical Considerations of Large Language Model Use in Surgery Research. Healthcare (Basel) 2024;12:825. PMID: 38667587; PMCID: PMC11050155; DOI: 10.3390/healthcare12080825. Open access.
Abstract
INTRODUCTION As large language models receive greater attention in medical research, investigation of their ethical considerations is warranted. This review explores the surgery literature to identify ethical concerns surrounding these artificial intelligence models and to evaluate how autonomy, beneficence, nonmaleficence, and justice are represented within these discussions, providing insights to guide further research and practice. METHODS A systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Five electronic databases were searched in October 2023. Eligible studies included surgery-related articles that focused on large language models and contained adequate ethical discussion. Study details, including specialty and ethical concerns, were collected. RESULTS The literature search yielded 1179 articles, with 53 meeting the inclusion criteria. Plastic surgery, orthopedic surgery, and neurosurgery were the most represented surgical specialties. Autonomy was the most explicitly cited ethical principle. The most frequently discussed ethical concern was accuracy (n = 45, 84.9%), followed by bias, patient confidentiality, and responsibility. CONCLUSION The ethical implications of using large language models in surgery are complex and evolving. The integration of these models into surgery necessitates continuous ethical discourse to ensure responsible and ethical use, balancing technological advancement with human dignity and safety.
Affiliation(s)
- Sahar Borna
- Division of Plastic Surgery, Mayo Clinic, Jacksonville, FL 32224, USA
- Syed A. Haider
- Division of Plastic Surgery, Mayo Clinic, Jacksonville, FL 32224, USA
- Clifton Haider
- Department of Physiology and Biomedical Engineering, Mayo Clinic, Rochester, MN 55905, USA
- Antonio J. Forte
- Division of Plastic Surgery, Mayo Clinic, Jacksonville, FL 32224, USA
- Center for Digital Health, Mayo Clinic, Rochester, MN 55905, USA
8. Kanno H, Handa K, Murotani M, Ozawa H. A Novel Intraoperative CT Navigation System for Spinal Fusion Surgery in Lumbar Degenerative Disease: Accuracy and Safety of Pedicle Screw Placement. J Clin Med 2024;13:2105. PMID: 38610870; PMCID: PMC11012415; DOI: 10.3390/jcm13072105. Open access.
Abstract
Background: In recent years, intraoperative computed tomography (CT) navigation has become widely used for the insertion of pedicle screws in spinal fusion surgery. However, conventional intraoperative CT navigation may be impaired by infrared interference between the infrared camera and surgical instruments, which can lead to the misplacement of pedicle screws. Recently, a novel intraoperative CT navigation system, NextAR, has been developed. It uses a small infrared camera mounted on surgical instruments within the surgical field. NextAR navigation can minimize the problem of infrared interference and is expected to improve the accuracy of pedicle screw placement. Methods: This study investigated the accuracy of pedicle screw insertion under NextAR navigation in spinal fusion surgery for lumbar degenerative diseases. The accuracy of pedicle screw placement was evaluated in 15 consecutive patients using a CT grading scale. Results: Screw perforation occurred in only 1 of the 70 screws inserted (1.4%): a single grade 1 perforation within 2 mm, with no perforations larger than 2 mm. There were no reoperations or neurological complications due to screw misplacement. Conclusions: NextAR navigation can provide high accuracy for pedicle screw insertion and help ensure safe spinal fusion surgery for lumbar degenerative diseases.
Affiliation(s)
- Haruo Kanno
- Department of Orthopaedic Surgery, Tohoku Medical and Pharmaceutical University, Sendai 983-8536, Japan
- Kyoichi Handa
- Department of Orthopaedic Surgery, Tohoku Medical and Pharmaceutical University, Sendai 983-8536, Japan
- Motoki Murotani
- Department of Orthopaedic Surgery, Tohoku Medical and Pharmaceutical University, Sendai 983-8536, Japan
- Hiroshi Ozawa
- Department of Orthopaedic Surgery, Tohoku Medical and Pharmaceutical University, Sendai 983-8536, Japan
9. Bcharah G, Gupta N, Panico N, Winspear S, Bagley A, Turnow M, D'Amico R, Ukachukwu AEK. Innovations in Spine Surgery: A Narrative Review of Current Integrative Technologies. World Neurosurg 2024;184:127-136. PMID: 38159609; DOI: 10.1016/j.wneu.2023.12.124.
Abstract
Neurosurgical technologies have become increasingly adaptive, featuring real-time and patient-specific guidance in preoperative, intraoperative, and postoperative settings. This review offers insight into how these integrative innovations compare with conventional approaches in spine surgery, focusing on machine learning (ML), artificial intelligence (AI), augmented reality and virtual reality, and spinal navigation systems. Data on technology applications, diagnostic and procedural accuracy, intraoperative times, radiation exposures, postoperative outcomes, and costs were extracted and compared with conventional methods to assess their advantages and limitations. Preoperatively, augmented reality and virtual reality have applications in surgical training and planning that are more immersive, case specific, and risk-free, and they have been shown to enhance accuracy and reduce complications. ML algorithms have demonstrated high accuracy in predicting surgical candidacy (up to 92.1%) and tailoring personalized treatments based on patient-specific variables. Intraoperatively, advantages include more accurate pedicle screw insertion (96%-99% with ML), enhanced visualization, reduced radiation exposure (49 μSv with O-arm navigation vs. 556 μSv with fluoroscopy), increased efficiency, and potential for fewer intraoperative complications compared with conventional approaches. Postoperatively, certain ML and AI models have outperformed conventional methods in predicting all postoperative complications in a cohort of more than 6000 patients, as well as the variables contributing to in-hospital and 90-day mortality. However, these technologies come with limitations, such as longer operative times with navigation (up to 35.6% longer), dependency on datasets, costs, limited accessibility, a steep learning curve, and inherent software malfunctions. As these technologies advance, continuing to assess their efficacy and limitations will be crucial to their successful integration within spine surgery.
Affiliation(s)
- George Bcharah
- Mayo Clinic Alix School of Medicine, Scottsdale, Arizona, USA
- Nithin Gupta
- Campbell University School of Osteopathic Medicine, Lillington, North Carolina, USA
- Nicholas Panico
- Lake Erie College of Osteopathic Medicine, Erie, Pennsylvania, USA
- Spencer Winspear
- Campbell University School of Osteopathic Medicine, Lillington, North Carolina, USA
- Austin Bagley
- Campbell University School of Osteopathic Medicine, Lillington, North Carolina, USA
- Morgan Turnow
- Kentucky College of Osteopathic Medicine, Pikeville, Kentucky, USA
- Randy D'Amico
- Department of Neurosurgery, Lenox Hill Hospital, New York, New York, USA
- Alvan-Emeka K Ukachukwu
- Department of Neurosurgery, Duke University, Durham, North Carolina, USA; Duke Global Neurosurgery and Neurology, Durham, North Carolina, USA
10. Judy BF, Menta A, Pak HL, Azad TD, Witham TF. Augmented Reality and Virtual Reality in Spine Surgery: A Comprehensive Review. Neurosurg Clin N Am 2024;35:207-216. PMID: 38423736; DOI: 10.1016/j.nec.2023.11.010.
Abstract
Augmented reality (AR) and virtual reality (VR) are powerful technologies with proven utility and tremendous potential. Spine surgery, in particular, may benefit from these developing technologies for resident training, preoperative education for patients, surgical planning and execution, and patient rehabilitation. In this review, the history, current applications, challenges, and future of AR/VR in spine surgery are examined.
Affiliation(s)
- Brendan F Judy
- Department of Neurosurgery, Johns Hopkins Hospital, Johns Hopkins University School of Medicine, 1800 Orleans Street, 6007 Zayed Tower, Baltimore, MD 21287, USA
- Arjun Menta
- Department of Neurosurgery, Johns Hopkins Hospital, Johns Hopkins University School of Medicine, 1800 Orleans Street, 6007 Zayed Tower, Baltimore, MD 21287, USA
- Ho Lim Pak
- Department of Neurosurgery, Johns Hopkins Hospital, Johns Hopkins University School of Medicine, 1800 Orleans Street, 6007 Zayed Tower, Baltimore, MD 21287, USA
- Tej D Azad
- Department of Neurosurgery, Johns Hopkins Hospital, Johns Hopkins University School of Medicine, 1800 Orleans Street, 6007 Zayed Tower, Baltimore, MD 21287, USA
- Timothy F Witham
- Department of Neurosurgery, Johns Hopkins Hospital, Johns Hopkins University School of Medicine, 1800 Orleans Street, 6007 Zayed Tower, Baltimore, MD 21287, USA
|
11
Jang Y, Lim S, Lee S, Je LG, Kim T, Joo S, Seo J, Lee D, Koh JC. Clinical Application of an Augmented Reality Navigation System for Transforaminal Epidural Injection: A Randomized Controlled Trial. J Clin Med 2024; 13:1992. [PMID: 38610758] [PMCID: PMC11012780] [DOI: 10.3390/jcm13071992]
Abstract
Objectives: Augmented reality (AR) navigation systems are emerging to simplify and enhance the precision of medical procedures. Lumbosacral transforaminal epidural injection is a commonly performed procedure for the treatment and diagnosis of radiculopathy. Accurate needle placement while avoiding critical structures remains a challenge. For this purpose, we conducted a randomized controlled trial for our augmented reality navigation system. Methods: This randomized controlled study involved 28 patients, split between a traditional C-arm guided group (control) and an AR navigation guided group (AR-NAVI), to compare procedure efficiency and radiation exposure. The AR-NAVI group used a real-time tracking system displaying spinal structure and needle position on an AR head-mounted display. The procedural time and C-arm usage (radiation exposure) were measured. Results: All patients underwent successful procedures without complications. The AR-NAVI group demonstrated significantly reduced times and C-arm usage for needle entry to the target point (58.57 ± 33.31 vs. 124.91 ± 41.14, p < 0.001 and 3.79 ± 1.97 vs. 8.86 ± 3.94, p < 0.001). Conclusions: The use of the AR navigation system significantly improved procedure efficiency and safety by reducing time and radiation exposure, suggesting a promising direction for future enhancements and validation.
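The between-group results above (e.g., 58.57 ± 33.31 vs. 124.91 ± 41.14, p < 0.001) can be checked from the summary statistics alone with Welch's unequal-variance t-test. The sketch below is illustrative only: it assumes an even 14/14 split of the 28 patients, and the `welch_t` helper is not from the paper.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's unequal-variance t statistic and degrees of freedom,
    computed from per-group summary statistics."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2            # squared standard errors
    t = (mean1 - mean2) / math.sqrt(v1 + v2)          # t statistic
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Control (C-arm) vs. AR-NAVI needle-entry times from the abstract,
# assuming 14 patients per arm:
t, df = welch_t(124.91, 41.14, 14, 58.57, 33.31, 14)
```

With these inputs the statistic comes out around t ≈ 4.7 on roughly 25 degrees of freedom, which is consistent with the reported p < 0.001.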
Affiliation(s)
- Yookyung Jang
- Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Sunghwan Lim
- Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
- Sunhee Lee
- Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Lee Gyeong Je
- Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Taesan Kim
- Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
- Subin Joo
- Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu 42994, Republic of Korea
- Joonho Seo
- Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu 42994, Republic of Korea
- Deukhee Lee
- Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
- Jae Chul Koh
- Department of Anesthesiology and Pain Medicine, Korea University College of Medicine, Seoul 02841, Republic of Korea
12
Junga A, Schmidle P, Pielage L, Schulze H, Hätscher O, Ständer S, Marschall B, Braun SA. New horizons in dermatological education: Skin cancer screening with virtual reality. J Eur Acad Dermatol Venereol 2024. [PMID: 38497674] [DOI: 10.1111/jdv.19960]
Abstract
BACKGROUND Technological advances in the field of virtual reality (VR) offer new opportunities in many areas of life, including medical education. The University of Münster has been using VR scenarios in the education of medical students for several years, especially for situations that are difficult to reproduce in reality (e.g., brain death). Due to the consistently positive feedback from students, a dermatological VR scenario for skin cancer screening was developed. OBJECTIVES To present and provide a first evaluation of the skin cancer screening VR scenario, determining how students rated its technical implementation overall and how their subjective competence to perform a skin cancer screening changed over the course of the teaching unit (theory seminar, VR scenario, theoretical debriefing). METHODS Students (n = 140) participating in the curricular pilot project during the 2023 summer term were surveyed throughout the teaching unit using several established questionnaires (System Usability Scale, Simulation Task-Load-Index, Realism and Presence Questionnaire) as well as additional questions on cybersickness and subjective learning. RESULTS (i) The use of VR is technically feasible, (ii) students evaluate the VR scenario as a useful curricular supplement, and (iii) from the students' subjective perspective, a good learning outcome is achieved. Although preparation and follow-up appear to be important for overall learning, the greatest increase in subjective competence to perform a skin cancer screening is achieved by the VR scenario. CONCLUSIONS Technically feasible and positively evaluated by students, VR can already be a useful addition to dermatology education, although costs are still high. As a visual discipline, dermatology offers special opportunities to create VR scenarios of situations that are not always available or comfortable for patients in reality.
Additionally, VR scenarios guarantee the same conditions for all students, which is essential for a high-quality education.
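Of the questionnaires named in the methods above, the System Usability Scale has a fixed published scoring rule: odd-numbered items are positively worded and contribute (response − 1), even-numbered items contribute (5 − response), and the summed contributions are scaled by 2.5 to a 0-100 range. A minimal scorer, written as an illustration rather than taken from the study:

```python
def sus_score(responses):
    """Score a 10-item System Usability Scale questionnaire.

    responses: list of 10 integers in 1..5, in item order.
    Odd items (1, 3, 5, 7, 9) contribute (response - 1);
    even items contribute (5 - response). The sum is scaled
    by 2.5, yielding a score from 0 to 100.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even index = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# A best-possible response pattern (5 on positive items, 1 on negative)
# scores the maximum of 100:
sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # -> 100.0
```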
Affiliation(s)
- Anna Junga
- Institute of Education and Student Affairs, University of Münster, Münster, Germany
- Department of Urology, Stiftungsklinikum PROSELIS, Recklinghausen, Germany
- Paul Schmidle
- Department of Dermatology, Medical Faculty, University of Münster, Münster, Germany
- Leon Pielage
- Institute for Geoinformatics, University of Münster, Münster, Germany
- Henriette Schulze
- Institute of Education and Student Affairs, University of Münster, Münster, Germany
- Ole Hätscher
- Institute of Education and Student Affairs, University of Münster, Münster, Germany
- Department of Psychology, University of Münster, Münster, Germany
- Sonja Ständer
- Department of Dermatology, Medical Faculty, University of Münster, Münster, Germany
- Bernhard Marschall
- Institute of Education and Student Affairs, University of Münster, Münster, Germany
- Stephan Alexander Braun
- Department of Dermatology, Medical Faculty, University of Münster, Münster, Germany
- Department of Dermatology, Medical Faculty, Heinrich-Heine University, Düsseldorf, Germany
13
Da Mutten R, Zanier O, Theiler S, Ryu SJ, Regli L, Serra C, Staartjes VE. Whole Spine Segmentation Using Object Detection and Semantic Segmentation. Neurospine 2024; 21:57-67. [PMID: 38317546] [PMCID: PMC10992645] [DOI: 10.14245/ns.2347178.589]
Abstract
OBJECTIVE Virtual and augmented reality have enjoyed increased attention in spine surgery. Preoperative planning, pedicle screw placement, and surgical training are among the most studied use cases. Identifying osseous structures is a key aspect of navigating a 3-dimensional virtual reconstruction. To automate the otherwise time-consuming process of labeling vertebrae on each slice individually, we propose a fully automated pipeline that performs segmentation on computed tomography (CT) and can form the basis for further virtual or augmented reality applications and radiomic analysis. METHODS Based on a large public dataset of annotated vertebral CT scans, we first trained a YOLOv8m (You-Only-Look-Once algorithm, version 8, size medium) to detect each vertebra individually. On the resulting cropped images, a 2D U-Net was developed and externally validated on 2 different public datasets. RESULTS Two hundred fourteen CT scans (cervical, thoracic, or lumbar spine) were used for model training, and 40 scans were used for external validation. Vertebra recognition achieved a mAP50 (mean average precision with a Jaccard threshold of 0.5) of over 0.84, and the segmentation algorithm attained a mean Dice score of 0.75 ± 0.14 at internal validation and 0.77 ± 0.12 and 0.82 ± 0.14 at external validation, respectively. CONCLUSION We propose a 2-stage approach consisting of single-vertebra labeling by an object detection algorithm followed by semantic segmentation. In our externally validated pilot study, we demonstrate robust performance for our object detection network in identifying individual vertebrae, as well as for our segmentation model in precisely delineating the bony structures.
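The two evaluation metrics reported here have standard definitions: the Dice score is 2|A∩B| / (|A| + |B|), and the Jaccard index (the overlap threshold behind mAP50) is |A∩B| / |A∪B|. A minimal NumPy sketch of both, for illustration only and not the authors' evaluation code:

```python
import numpy as np

def dice(pred, target):
    """Dice similarity coefficient between two boolean masks."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum())

def jaccard(pred, target):
    """Jaccard index (intersection over union) between two boolean masks."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return intersection / union

# Two toy 2x2 masks overlapping in one pixel:
a = [[1, 1], [0, 0]]
b = [[1, 0], [0, 0]]
# dice(a, b) -> 2/3, jaccard(a, b) -> 1/2
```

Note that Dice is always at least as large as Jaccard on the same pair of masks, which is why the two numbers are not directly comparable across papers.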
Affiliation(s)
- Raffaele Da Mutten
- Machine Intelligence in Clinical Neuroscience (MICN) Laboratory, Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zürich, University of Zürich, Zürich, Switzerland
- Olivier Zanier
- Machine Intelligence in Clinical Neuroscience (MICN) Laboratory, Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zürich, University of Zürich, Zürich, Switzerland
- Sven Theiler
- Machine Intelligence in Clinical Neuroscience (MICN) Laboratory, Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zürich, University of Zürich, Zürich, Switzerland
- Seung-Jun Ryu
- Department of Neurosurgery, Daejeon Eulji University Hospital, Eulji University Medical School, Daejeon, Korea
- Luca Regli
- Machine Intelligence in Clinical Neuroscience (MICN) Laboratory, Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zürich, University of Zürich, Zürich, Switzerland
- Carlo Serra
- Machine Intelligence in Clinical Neuroscience (MICN) Laboratory, Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zürich, University of Zürich, Zürich, Switzerland
- Victor E. Staartjes
- Machine Intelligence in Clinical Neuroscience (MICN) Laboratory, Department of Neurosurgery, Clinical Neuroscience Center, University Hospital Zürich, University of Zürich, Zürich, Switzerland
14
Zaki MM, Joshi RS, Joseph JR, Saadeh YS, Kashlan ON, Godzik J, Uribe JS, Park P. Virtual Reality-Enabled Resident Education of Lateral-Access Spine Surgery. World Neurosurg 2024; 183:e401-e407. [PMID: 38143034] [DOI: 10.1016/j.wneu.2023.12.108]
Abstract
OBJECTIVE Lateral-access spine surgery has many benefits, but adoption has been limited by a steep learning curve. Virtual reality (VR) is gaining popularity and lends itself as a useful tool in enhancing neurosurgical resident education. We thus sought to assess whether VR-based simulation could enhance the training of neurosurgery residents in lateral spine surgery. METHODS Neurosurgery residents completed a VR-based lateral spine module on lateral patient positioning and performing lateral lumbar interbody fusion using the PrecisionOS VR system on the Meta Quest 2 headset. Simulation occurred once every other week, for a total of 3 simulations over 6 weeks. Pre- and postintervention surveys as well as intrasimulation performance metrics were assessed over time. RESULTS The majority of resident participants showed improvement in performance scores, including an automated PrecisionOS precision score, the number of radiographs used within the simulation, and time to completion. All participants showed improvement in comfort with anatomic landmarks for lateral access surgery, confidence performing lateral surgery without direct supervision, and assessing fluoroscopy in spine surgery for hardware placement and image interpretation. Participant perception of the utility of VR as an educational tool also improved. CONCLUSIONS VR-based simulation enhanced neurosurgical residents' understanding of lateral access surgery. Immersive surgical simulation resulted in improved resident confidence with surgical technique and workflow, perceived improvement in anatomical knowledge, and improved simulation performance scores. Trainee perceptions of virtual simulation and training as a curriculum supplement also improved following completion of VR training.
Affiliation(s)
- Mark M Zaki
- Department of Neurosurgery, University of Michigan, Ann Arbor, Michigan, USA
- Rushikesh S Joshi
- Department of Neurosurgery, University of Michigan, Ann Arbor, Michigan, USA
- Jacob R Joseph
- Department of Neurosurgery, University of Michigan, Ann Arbor, Michigan, USA
- Yamaan S Saadeh
- Department of Neurosurgery, University of Michigan, Ann Arbor, Michigan, USA
- Osama N Kashlan
- Department of Neurosurgery, University of Michigan, Ann Arbor, Michigan, USA
- Jakub Godzik
- Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama, USA
- Juan S Uribe
- Department of Neurosurgery, Barrow Neurological Institute, Phoenix, Arizona, USA
- Paul Park
- Department of Neurosurgery, Semmes-Murphey Neurologic and Spine Institute, University of Tennessee, Memphis, Tennessee, USA
15
Bian D, Lin Z, Lu H, Zhong Q, Wang K, Tang X, Zang J. The application of extended reality technology-assisted intraoperative navigation in orthopedic surgery. Front Surg 2024; 11:1336703. [PMID: 38375409] [PMCID: PMC10875025] [DOI: 10.3389/fsurg.2024.1336703]
Abstract
Extended reality (XR) technology refers to any situation where real-world objects are enhanced with computer technology, including virtual reality, augmented reality, and mixed reality. Augmented reality and mixed reality technologies have been widely applied in orthopedic clinical practice, including in teaching, preoperative planning, intraoperative navigation, and surgical outcome evaluation. The primary goal of this narrative review is to summarize the effectiveness and superiority of XR-technology-assisted intraoperative navigation in the fields of trauma, joint, spine, and bone tumor surgery, as well as to discuss the current shortcomings in intraoperative navigation applications. We reviewed titles of more than 200 studies obtained from PubMed with the following search terms: extended reality, mixed reality, augmented reality, virtual reality, intraoperative navigation, and orthopedic surgery; of those 200 studies, 69 related papers were selected for abstract review. Finally, the full text of 55 studies was analyzed and reviewed. They were classified into four groups-trauma, joint, spine, and bone tumor surgery-according to their content. Most of the studies we reviewed showed that XR-technology-assisted intraoperative navigation can effectively improve the accuracy of implant placement, such as that of screws and prostheses; reduce postoperative complications caused by inaccurate implantation; facilitate the achievement of tumor-free surgical margins; shorten the surgical duration; reduce radiation exposure for patients and surgeons; and minimize further damage caused by the need for visual exposure during surgery. It can also provide richer and more efficient intraoperative communication, thereby facilitating academic exchange, medical assistance, and the implementation of remote healthcare.
Affiliation(s)
- Dongxiao Bian
- Department of Musculoskeletal Tumor, Peking University People’s Hospital, Beijing, China
- Zhipeng Lin
- State Key Laboratory of Virtual Reality Technology and Systems, Beihang University, Beijing, China
- Hao Lu
- Traumatic Orthopedic Department, Peking University People’s Hospital, Beijing, China
- Qunjie Zhong
- Arthritis Clinic and Research Center, Peking University People’s Hospital, Beijing, China
- Kaifeng Wang
- Spinal Surgery Department, Peking University People’s Hospital, Beijing, China
- Xiaodong Tang
- Department of Musculoskeletal Tumor, Peking University People’s Hospital, Beijing, China
- Jie Zang
- Department of Musculoskeletal Tumor, Peking University People’s Hospital, Beijing, China
16
Adida S, Legarreta AD, Hudson JS, McCarthy D, Andrews E, Shanahan R, Taori S, Lavadi RS, Buell TJ, Hamilton DK, Agarwal N, Gerszten PC. Machine Learning in Spine Surgery: A Narrative Review. Neurosurgery 2024; 94:53-64. [PMID: 37930259] [DOI: 10.1227/neu.0000000000002660]
Abstract
Artificial intelligence and machine learning (ML) offer potentially revolutionary advances in their application to the field of spine surgery. Within the past 5 years, novel applications of ML have assisted in surgical decision-making, intraoperative imaging and navigation, and optimization of clinical outcomes. ML has the capacity to address many different clinical needs and improve diagnostic and surgical techniques. This review discusses current applications of ML in the context of spine surgery by breaking down its implementation preoperatively, intraoperatively, and postoperatively. Ethical considerations and challenges in ML implementation must be addressed to maximally benefit patients, spine surgeons, and the healthcare system. Areas for future research in augmented reality and mixed reality, along with limitations in generalizability and bias, are also highlighted.
Affiliation(s)
- Samuel Adida
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Andrew D Legarreta
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Joseph S Hudson
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- David McCarthy
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Edward Andrews
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Regan Shanahan
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Suchet Taori
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Raj Swaroop Lavadi
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Thomas J Buell
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- D Kojo Hamilton
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Nitin Agarwal
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
- Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania, USA
- Peter C Gerszten
- Department of Neurosurgery, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, USA
17
Theivendrampillai S, Yang B, Little M, Blick C. Targeted augmented reality-guided transperineal prostate biopsies study: initial experience. Ther Adv Urol 2024; 16:17562872241232582. [PMID: 38464882] [PMCID: PMC10924555] [DOI: 10.1177/17562872241232582]
Abstract
Background Transperineal biopsy of magnetic resonance imaging (MRI)-detected prostate lesions is now the established technique used in prostate cancer (CaP) diagnostics. Virtual Surgery Intelligence (VSI) Holomedicine by Apoqlar (Hamburg, Germany) is a mixed reality (MR)/augmented reality (AR) software platform that runs on the HoloLens II system (Microsoft, Redmond, USA). Multiparametric prostate MRI images were converted into 3D holograms and added into a MR space, enabling visualization of a 3D hologram and image-assisted prostate biopsy. Objective The Targeted Augmented Reality-GuidEd Transperineal (TARGET) study investigated the feasibility of performing AR-guided prostate biopsies in a MR framework, using the VSI platform in patients with MRI-detected prostate lesions. Methods MRI scans from ten patients with a clinical suspicion of CaP (Prostate Imaging-Reporting and Data System, PI-RADS 4/5) were uploaded to the VSI HoloLens system. Two MR/AR-guided prostate biopsies were then acquired using the PrecisionPoint Freehand transperineal biopsy system. Cognitive fusion biopsies were performed as standard of care following the MR/AR-guided prostate biopsies. Results All 10 patients successfully underwent MR/AR-guided prostate biopsy after 3D MR images were overlaid on the patient's body. Prostatic tissue was obtained in all MR/AR-guided specimens. Seven patients (70%) had matching histology in both the standard and MR/AR-guided biopsies; the remaining three had ISUP (International Society of Urological Pathology) Grade 2 CaP. There were no immediate complications. Conclusion We believe this is a world first. The initial feasibility data from the TARGET study demonstrated that an MR/AR-guided prostate biopsy utilizing the VSI Holomedicine system is a viable option in CaP diagnostics. The next stage in development is to combine AR images with real-time needle insertion and to provide further data to formally appraise the sensitivity and specificity of the technique.
Collapse
Affiliation(s)
| | - Bob Yang
- Royal Berkshire Hospital, Reading, Berkshire, UK
| | - Mark Little
- Royal Berkshire Hospital, Reading, Berkshire, UK
| | - Christopher Blick
- Department of Urology, Royal Berkshire Hospital, Craven Road, Reading, Berkshire RG1 5AN, UK
| |
Collapse
|
18
Combalia A, Sanchez-Vives MV, Donegan T. Immersive virtual reality in orthopaedics-a narrative review. Int Orthop 2024; 48:21-30. [PMID: 37566225] [PMCID: PMC10766717] [DOI: 10.1007/s00264-023-05911-w]
Abstract
PURPOSE This narrative review explores the applications and benefits of immersive virtual reality (VR) in orthopaedics, with a focus on surgical training, patient functional recovery, and pain management. METHODS The review examines existing literature and research studies on immersive VR in orthopaedics, analyzing both experimental and clinical studies. RESULTS Immersive VR provides a realistic simulation environment for orthopaedic surgery training, enhancing surgical skills, reducing errors, and improving overall performance. In post-surgical recovery and rehabilitation, immersive VR environments can facilitate motor learning and functional recovery through virtual embodiment, motor imagery during action observation, and virtual training. Additionally, VR-based functional recovery programs can improve patient adherence and outcomes. Moreover, VR has the potential to revolutionize pain management, offering a non-invasive, drug-free alternative. Virtual reality analgesia acts through a variety of means, including engaging and diverting patients' attention, reducing anxiety, and applying specific virtual-body transformations. CONCLUSION Immersive virtual reality holds significant promise in orthopaedics, demonstrating potential for improved surgical training, patient functional recovery, and pain management, but further research is needed to fully exploit the benefits of VR technology in these areas.
Affiliation(s)
- A Combalia
- Departament de Cirurgia i Especialitats Medicoquirúrgiques, Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona (UB), c. Casanova, 143, 08036, Barcelona, Spain
- Servei de Cirurgia Ortopèdica i Traumatologia, Hospital Clínic de Barcelona, Universitat de Barcelona (UB), c. Villarroel, 170, 08036, Barcelona, Spain
- Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona (UB), c. Casanova, 143, 08036, Barcelona, Spain
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), c. Villarroel, 170, 08036, Barcelona, Spain
- M V Sanchez-Vives
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), c. Villarroel, 170, 08036, Barcelona, Spain
- Institución Catalana de Investigación y Estudios Avanzados (ICREA), Passeig de Lluís Companys, 23, 08010, Barcelona, Spain
- T Donegan
- Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), c. Villarroel, 170, 08036, Barcelona, Spain
19
Azad TD, Warman A, Tracz JA, Hughes LP, Judy BF, Witham TF. Augmented reality in spine surgery - past, present, and future. Spine J 2024; 24:1-13. [PMID: 37660893] [DOI: 10.1016/j.spinee.2023.08.015]
Abstract
BACKGROUND CONTEXT Augmented reality (AR) is increasingly recognized as a valuable tool in spine surgery. Here we provide an overview of the key developments and technological milestones that have laid the foundation for AR applications in this field. We also assess the quality of existing studies on AR systems in spine surgery and explore potential future applications. PURPOSE The purpose of this narrative review is to examine the role of AR in spine surgery. It aims to highlight the evolution of AR technology in this context, evaluate the existing body of research, and outline potential future directions for integrating AR into spine surgery. STUDY DESIGN Narrative review. METHODS We conducted a thorough literature search to identify studies and developments related to AR in spine surgery. Relevant articles, reports, and technological advancements were analyzed to establish the historical context and current state of AR in this field. RESULTS The review identifies significant milestones in the development of AR technology for spine surgery. It discusses the growing body of research and highlights the strengths and weaknesses of existing investigations. Additionally, it presents insights into the potential for AR to enhance spine surgical education and speculates on future applications. CONCLUSIONS Augmented reality has emerged as a promising adjunct in spine surgery, with notable advancements and research efforts. The integration of AR into the spine surgery operating room holds promise, as does its potential to revolutionize surgical education. Future applications of AR in spine surgery may include real-time navigation, enhanced visualization, and improved patient outcomes. Continued development and evaluation of AR technology are essential for its successful implementation in this specialized surgical field.
Affiliation(s)
- Tej D Azad
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Anmol Warman
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Jovanna A Tracz
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Liam P Hughes
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Brendan F Judy
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
- Timothy F Witham
- Department of Neurosurgery, Johns Hopkins University School of Medicine, 600 N. Wolfe St, Meyer 7-109, Baltimore, MD 21287, USA
20
Schwendner M, Ille S, Wostrack M, Meyer B. Evaluating a cutting-edge augmented reality-supported navigation system for spinal instrumentation. Eur Spine J 2024; 33:282-288. [PMID: 37962688] [DOI: 10.1007/s00586-023-08011-w]
Abstract
OBJECTIVE Dorsal instrumentation using pedicle screws is a standard treatment for multiple spinal pathologies, such as trauma, infection, or degenerative indications. Intraoperative three-dimensional (3D) imaging and navigated pedicle screw placement are used at multiple centers. For the present study, we evaluated a new navigation system enabling augmented reality (AR)-supported pedicle screw placement while integrating navigation cameras into the reference array and drill guide. The present study aimed to evaluate its clinical application regarding safety, efficacy, and accuracy. METHODS A total of 20 patients were operated on between 06/2021 and 01/2022 using the new technique for intraoperative navigation. Intraoperative data, with a focus on accuracy and patient safety, including patient outcome, were analyzed. The accuracy of pedicle screw placement was evaluated by intraoperative CT imaging. RESULTS A median of 8 (4-18) pedicle screws were placed in each case. Percutaneous instrumentation was performed in 14 patients (70%). The duration of pedicle screw placement (scan-to-scan time) was 56 ± 26 (30-107) min. Intraoperative screw revision was necessary for 3 of 180 pedicle screws (1.7%). No major complications occurred intraoperatively; one case of delay due to software issues and one case of difficult screw placement were reported. CONCLUSION The present study's results support the use of this AR-supported system for navigated pedicle screw placement during dorsal instrumentation in clinical routine. It provides a reliable and safe tool for 3D imaging-based pedicle screw placement, requires only a minimal intraoperative setup, and provides new opportunities by integrating AR.
Affiliation(s)
- Maximilian Schwendner
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
- TUM Neuroimaging Center, School of Medicine, Klinikum Rechts der Isar, Technical University of Munich, Munich, Germany
- Sebastian Ille
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
- TUM Neuroimaging Center, School of Medicine, Klinikum Rechts der Isar, Technical University of Munich, Munich, Germany
- Maria Wostrack
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
- Bernhard Meyer
- Department of Neurosurgery, Klinikum Rechts der Isar, Technische Universität München, Ismaninger Str. 22, 81675, Munich, Germany
21
Stucki J, Dastgir R, Baur DA, Quereshy FA. The use of virtual reality and augmented reality in oral and maxillofacial surgery: A narrative review. Oral Surg Oral Med Oral Pathol Oral Radiol 2024; 137:12-18. [PMID: 37723007] [DOI: 10.1016/j.oooo.2023.07.001]
Abstract
OBJECTIVE The purpose of this article is to review the current uses of virtual reality (VR) and augmented reality (AR) in oral and maxillofacial surgery. We discuss the use of VR/AR in educational training, surgical planning, advances in hardware and software, and the implementation of VR/AR in this field. STUDY DESIGN A comprehensive retrospective search of PubMed, Web of Science, Embase, and the Cochrane Library was conducted. The search identified 313 English-language articles from the last 10 years. RESULTS A total of 38 articles were selected after a meticulous review of the aims, objectives, and methodology by 2 independent reviewers. CONCLUSIONS VR/AR technology offers significant potential in various aspects, including student education, resident evaluation, surgical planning, and overall surgical implementation. However, its widespread adoption in practice is hindered by factors such as the need for further research, cost concerns, unfamiliarity among current educators, and the necessity for technological improvement. Furthermore, residency programs hold a unique position to influence the future of oral and maxillofacial surgery. As VR/AR has demonstrated substantial benefits in resident education and other applications, residency programs have much to gain by integrating these emerging technologies into their curricula.
Affiliation(s)
- Jacob Stucki
- Resident, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
- Ramtin Dastgir
- Research Fellow, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
- Dale A Baur
- Professor and Chair, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
- Faisal A Quereshy
- Professor and Program Director, Department of Oral and Maxillofacial Surgery, Case Western Reserve University, Cleveland, OH, USA
22
Groh J, Schramm S, Renner N, Krause J, Perl M. [Innovative 3D imaging]. Unfallchirurgie (Heidelberg) 2023; 126:921-927. [PMID: 37851089] [DOI: 10.1007/s00113-023-01372-3]
Abstract
Intraoperative 2D fluoroscopy is often performed for control of reduction and implant position. However, it does not always provide the detail needed to reliably detect articular steps or malreduction. Over the last few years, intraoperative 3D imaging has been established and further developed. Multiple studies demonstrate an advantage and better intraoperative control through 3D imaging, for example at the upper ankle, the proximal tibia, and the distal radius; the rates of intraoperative revision with digital volume tomography (DVT) are between 20% and 30%. Technical advancements, such as metal artifact reduction, automated plane setting, automated screw detection, and robotic DVT devices, facilitate intraoperative handling, shorten surgical time, and provide improved image quality. By processing the data sets as an immersive, computer-simulated image in the sense of augmented reality (AR), increased precision can be achieved intraoperatively while reducing radiation exposure. The implementation of these systems is associated with costs, which are offset by savings from avoided revisions; adequate reimbursement, however, is still lacking at present. Intraoperative 3D imaging represents an important tool for intraoperative control. The current evidence argues for routine use of 3D procedures, especially in the articular region, and the indications are becoming increasingly broad. Technical innovations such as robotics and AR have significantly improved 3D devices in recent years and offer high potential for integration into the operating room.
Affiliation(s)
- J Groh
- Klinik für Unfallchirurgie und Orthopädie, Universitätsklinikum Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg, Krankenhausstraße 12, 91054, Erlangen, Germany
- S Schramm
- Klinik für Unfallchirurgie und Orthopädie, Universitätsklinikum Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg, Krankenhausstraße 12, 91054, Erlangen, Germany
- N Renner
- Klinik für Unfallchirurgie und Orthopädie, Universitätsklinikum Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg, Krankenhausstraße 12, 91054, Erlangen, Germany
- J Krause
- Klinik für Unfallchirurgie und Orthopädie, Universitätsklinikum Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg, Krankenhausstraße 12, 91054, Erlangen, Germany
- M Perl
- Klinik für Unfallchirurgie und Orthopädie, Universitätsklinikum Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg, Krankenhausstraße 12, 91054, Erlangen, Germany
23
Huang X, Liu X, Zhu B, Hou X, Hai B, Li S, Yu D, Zheng W, Li R, Pan J, Yao Y, Dai Z, Zeng H. Evaluation of Augmented Reality Surgical Navigation in Percutaneous Endoscopic Lumbar Discectomy: Clinical Study. Bioengineering (Basel) 2023; 10:1297. [PMID: 38002421] [PMCID: PMC10669401] [DOI: 10.3390/bioengineering10111297]
Abstract
BACKGROUND The puncture procedure in percutaneous endoscopic lumbar discectomy (PELD) is performed without direct visualization, and the learning curve for PELD is steep. METHODS An augmented reality surgical navigation (ARSN) system was designed and utilized in PELD. The system possesses three core functionalities: augmented reality (AR) radiograph overlay, real-time AR puncture needle tracking, and AR navigation. We conducted a prospective randomized controlled trial to evaluate its feasibility and effectiveness. A total of 20 patients with lumbar disc herniation treated with PELD were analyzed. Of these, 10 patients were treated under the guidance of ARSN (ARSN group). The remaining 10 patients were treated using C-arm fluoroscopy guidance (control group). RESULTS The AR radiographs and AR puncture needle were successfully superimposed on the intraoperative videos. The anteroposterior and lateral AR tracking distance errors were 1.55 ± 0.17 mm and 1.78 ± 0.21 mm, respectively. The ARSN group exhibited a significant reduction in both the number of puncture attempts (2.0 ± 0.4 vs. 6.9 ± 0.5, p < 0.001) and the number of fluoroscopies (10.6 ± 0.9 vs. 18.5 ± 1.6, p < 0.001) compared with the control group. Complications were not observed in either group. CONCLUSIONS The results indicate that the clinical application of the ARSN system in PELD is effective and feasible.
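The between-group differences reported above can be checked from the summary statistics alone. The sketch below is an illustrative recomputation, not the authors' analysis code; it assumes a Welch's t-test on the puncture-attempt counts (2.0 ± 0.4 vs. 6.9 ± 0.5, n = 10 per group, as stated in the abstract).

```python
import math

def welch_t(n1, m1, s1, n2, m2, s2):
    """Welch's t statistic and degrees of freedom from per-group summary stats."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2  # squared standard errors of each mean
    t = (m2 - m1) / math.sqrt(v1 + v2)   # t statistic for the mean difference
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Puncture attempts: ARSN group 2.0 +/- 0.4 vs. control group 6.9 +/- 0.5, n = 10 each
t, df = welch_t(10, 2.0, 0.4, 10, 6.9, 0.5)
print(f"t = {t:.1f}, df = {df:.1f}")
```

A t statistic this large (roughly 24 on about 17 degrees of freedom) corresponds to a p-value far below 0.001, in line with the highly significant differences the abstract reports.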
Affiliation(s)
- Xin Huang
- Pain Medicine Center, Peking University Third Hospital, Beijing 100191, China
- Xiaoguang Liu
- Pain Medicine Center, Peking University Third Hospital, Beijing 100191, China
- Department of Orthopedics, Peking University Third Hospital, Beijing 100191, China
- Bin Zhu
- Department of Orthopedics, Beijing Friendship Hospital, Beijing 100052, China
- Xiangyu Hou
- Department of Orthopedics, Peking University Third Hospital, Beijing 100191, China
- Bao Hai
- Department of Orthopedics, Peking University Third Hospital, Beijing 100191, China
- Shuiqing Li
- Pain Medicine Center, Peking University Third Hospital, Beijing 100191, China
- Dongfang Yu
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Wenhao Zheng
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Ranyang Li
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Junjun Pan
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Youjie Yao
- Smart Learning Institute, Beijing Normal University, Beijing 100875, China
- Zailin Dai
- Smart Learning Institute, Beijing Normal University, Beijing 100875, China
- Haijun Zeng
- Smart Learning Institute, Beijing Normal University, Beijing 100875, China
24
Lyuksemburg V, Abou-Hanna J, Marshall JS, Bramlet MT, Waltz AL, Pieta Keller SM, Dwyer A, Orcutt ST. Virtual Reality for Preoperative Planning in Complex Surgical Oncology: A Single-Center Experience. J Surg Res 2023; 291:546-556. [PMID: 37540972] [DOI: 10.1016/j.jss.2023.07.001]
Abstract
INTRODUCTION Virtual reality models (VRM) are three-dimensional (3D) simulations built from two-dimensional (2D) images, creating a more accurate mental representation of patient-specific anatomy. METHODS Patients who underwent complex oncologic resections between April 2018 and April 2019 whose operations differed from preoperative plans were retrospectively identified. Virtual reality modeling was performed based on preoperative 2D images to assess the feasibility of using this technology to create models. Preoperative plans made based upon 2D imaging versus VRM were compared to the final operations performed. Once the use of VRM to create preoperative plans was deemed feasible, individuals undergoing complex oncologic resections whose operative plans were difficult to define preoperatively were enrolled prospectively from July 2019 to December 2021. Preoperative plans made based upon 2D imaging and VRM by both the operating surgeon and a consulting surgeon were compared to the operation performed. Confidence in each operative plan was also measured. RESULTS Twenty patients were identified, seven retrospective and 13 prospective, with tumors of the liver, pancreas, retroperitoneum, stomach, and soft tissue. Retrospectively, VRM could not be created in one patient because of a poor-quality 2D image; the remainder (86%) were successfully created and examined. Virtual reality modeling more clearly defined the extent of resection in 50% of successful cases. Prospectively, all VRM were successfully performed. The concordance of the operative plan with VRM was higher than with 2D imaging (92% versus 54% for the operating surgeon and 69% versus 23% for the consulting surgeon). Confidence in the operative plan after VRM compared to 2D imaging also increased for both surgeons (by 15% and 8% for the operating and consulting surgeons, respectively). CONCLUSIONS Virtual reality modeling is feasible and may improve preoperative planning compared to 2D imaging. Further investigation is warranted.
Affiliation(s)
- Vadim Lyuksemburg
- Department of Surgery, University of Illinois College of Medicine at Peoria, Peoria, Illinois
- Jameil Abou-Hanna
- Department of Surgery, University of Illinois College of Medicine at Peoria, Peoria, Illinois
- J Stephen Marshall
- Department of Surgery, University of Illinois College of Medicine at Peoria, Peoria, Illinois
- Matthew T Bramlet
- Department of Pediatrics, University of Illinois College of Medicine at Peoria, Peoria, Illinois
- Alexa L Waltz
- Jump Trading Simulation & Education Center, OSF HealthCare, Peoria, Illinois
- Anthony Dwyer
- Department of Surgery, University of Illinois College of Medicine at Peoria, Peoria, Illinois
- Sonia T Orcutt
- Department of Surgery, University of Illinois College of Medicine at Peoria, Peoria, Illinois
25
Khan T, Zhu TS, Downes T, Cheng L, Kass NM, Andrews EG, Biehl JT. Understanding Effects of Visual Feedback Delay in AR on Fine Motor Surgical Tasks. IEEE Trans Vis Comput Graph 2023; 29:4697-4707. [PMID: 37788206] [DOI: 10.1109/tvcg.2023.3320214]
Abstract
Latency is a pervasive issue in various systems that can significantly impact motor performance and user perception. In medical settings, latency can hinder surgeons' ability to quickly correct movements, resulting in an experience that does not align with user expectations and standards of care. Despite numerous studies reporting on the negative effects of latency, there is still a gap in understanding how it impacts the use of augmented reality (AR) in medical settings. This study aims to address this gap by examining how latency impacts motor task performance and subjective perceptions, such as cognitive load, on two display types: a monitor display, traditionally used inside an operating room (OR), and a Microsoft HoloLens 2 display. Our findings indicate that both the level of latency and the display type impact motor performance, and higher latencies on the HoloLens result in relatively poor performance. However, cognitive load was found to be unrelated to display type or latency, but was dependent on the surgeon's training level. Surgeons did not compromise accuracy to gain more speed and were generally well aware of the latency in the system irrespective of their performance on the task. Our study provides valuable insights into acceptable thresholds of latency for AR displays and proposes design implications for the successful implementation and use of AR in surgical settings.
26
de Marinis R, Marigi EM, Atwan Y, Yang L, Oeding JF, Gupta P, Pareek A, Sanchez-Sotelo J, Sperling JW. Current clinical applications of artificial intelligence in shoulder surgery: what the busy shoulder surgeon needs to know and what's coming next. JSES Rev Rep Tech 2023; 3:447-453. [PMID: 37928999] [PMCID: PMC10625013] [DOI: 10.1016/j.xrrt.2023.07.008]
Abstract
Background Artificial intelligence (AI) is a continuously expanding field with the potential to transform a variety of industries, including health care, by providing automation, efficiency, precision, accuracy, and decision-making support for simple and complex tasks. Basic knowledge of the key features as well as the limitations of AI is paramount to understanding current developments in this field and to successfully applying them to shoulder surgery. The purpose of the present review is to provide an overview of AI within orthopedics and shoulder surgery, exploring current and forthcoming AI applications. Methods The PubMed and Scopus databases were searched to provide a narrative review of the most relevant literature on AI applications in shoulder surgery. Results Despite the enormous clinical and research potential of AI, orthopedic surgery has been a relatively late adopter of AI technologies. Image evaluation, surgical planning, decision-making support, and facilitating patient evaluations over time are some of the current areas of development with enormous opportunities to improve surgical practice, research, and education. Furthermore, the advancement of AI-driven strategies has the potential to create a more efficient medical system that may reduce the overall cost of delivering and implementing quality health care for patients with shoulder pathology. Conclusion AI is an expanding field with the potential for broad clinical and research applications in orthopedic surgery. Many challenges still need to be addressed to fully leverage the potential of AI in clinical practice and research, such as privacy issues, data ownership, and the external validation of proposed models.
Affiliation(s)
- Rodrigo de Marinis
- Department of Orthopedic Surgery, Mayo Clinic, Rochester, MN, USA
- Department of Orthopedic Surgery, Pontificia Universidad Católica de Chile, Santiago, Chile
- Shoulder and Elbow Unit, Hospital Dr. Sótero del Rio, Santiago, Chile
- Erick M. Marigi
- Department of Orthopedic Surgery, Mayo Clinic, Rochester, MN, USA
- Yousif Atwan
- Department of Orthopedic Surgery, Mayo Clinic, Rochester, MN, USA
- Linjun Yang
- Orthopedic Surgery Artificial Intelligence Lab (OSAIL), Mayo Clinic, Rochester, MN, USA
- Jacob F. Oeding
- Department of Orthopedic Surgery, Mayo Clinic, Rochester, MN, USA
- Puneet Gupta
- Department of Orthopaedic Surgery, George Washington University School of Medicine and Health Sciences, Washington, DC, USA
- Ayoosh Pareek
- Department of Orthopaedic Surgery, Hospital for Special Surgery, New York, NY, USA
- John W. Sperling
- Department of Orthopedic Surgery, Mayo Clinic, Rochester, MN, USA
27
Foley D, Hardacker P, McCarthy M. Emerging Technologies within Spine Surgery. Life (Basel) 2023; 13:2028. [PMID: 37895410] [PMCID: PMC10608700] [DOI: 10.3390/life13102028]
Abstract
New innovations within spine surgery continue to propel the field forward. These technologies improve surgeons' understanding of their patients and allow them to optimize treatment planning both in the operating room and in the clinic. Implants and surgeon practice habits also continue to evolve secondary to emerging biomaterials and device designs. With ongoing advancements, patients can expect enhanced preoperative decision-making, improved outcomes, and better intraoperative execution. These changes may also reduce many of the most common complications following spine surgery, lowering morbidity, mortality, and the need for reoperation. This article reviews some of these technological advancements and how they are projected to impact the field. As the field continues to advance, it is vital that practitioners remain knowledgeable of these changes in order to provide the most effective treatment possible.
Affiliation(s)
- David Foley
- Department of Orthopaedic Surgery, Indiana University School of Medicine, Indianapolis, IN 46202, USA
- Pierce Hardacker
- Indiana University School of Medicine, Indianapolis, IN 46202, USA
28
Laskay NMB, George JA, Knowlin L, Chang TP, Johnston JM, Godzik J. Optimizing Surgical Performance Using Preoperative Virtual Reality Planning: A Systematic Review. World J Surg 2023; 47:2367-2377. [PMID: 37204439] [DOI: 10.1007/s00268-023-07064-8]
Abstract
BACKGROUND Surgery is often a complex process that requires detailed 3-dimensional anatomical knowledge and rigorous interplay between team members to attain ideal operational efficiency, or "flow." Virtual reality (VR) represents a technology by which to rehearse complex plans and communicate precise steps to a surgical team prior to entering the operating room. The objective of this study was to evaluate the use of VR for preoperative surgical team planning and interdisciplinary communication across all surgical specialties. METHODS A systematic review of the literature was performed examining existing research on VR use for preoperative surgical team planning and interdisciplinary communication across all surgical fields in order to optimize surgical efficiency. The MEDLINE, Scopus, and CINAHL databases were searched from inception to July 31, 2022 using standardized search clauses. A qualitative data synthesis was performed with particular attention to preoperative planning, surgical efficiency optimization, and interdisciplinary collaboration/communication techniques determined a priori. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. All included studies were appraised for quality using the Medical Education Research Study Quality Instrument (MERSQI). RESULTS One thousand and ninety-three non-duplicated articles with abstract and full-text availability were identified. Thirteen articles that examined preoperative VR-based planning techniques for optimization of surgical efficiency and/or interdisciplinary communication fulfilled the inclusion and exclusion criteria. These studies had low-to-medium methodological quality, with a mean MERSQI score of 10.04 out of 18 (standard deviation 3.61). CONCLUSIONS This review demonstrates that time spent rehearsing and visualizing patient-specific anatomical relationships in VR may improve operative efficiency and communication across multiple surgical specialties.
Affiliation(s)
- Nicholas M B Laskay
- Department of Neurosurgery, University of Alabama at Birmingham, 1060 Faculty Office Tower, 1720 2nd Avenue South, Birmingham, AL, 35294-3410, USA
- Jordan A George
- Heersink School of Medicine, University of Alabama at Birmingham, Birmingham, AL, USA
- Laquanda Knowlin
- Department of Surgery, Children's Hospital Los Angeles, Los Angeles, CA, USA
- Todd P Chang
- Division of Emergency and Transport Medicine, Children's Hospital Los Angeles, Los Angeles, CA, USA
- James M Johnston
- Department of Neurosurgery, University of Alabama at Birmingham, 1060 Faculty Office Tower, 1720 2nd Avenue South, Birmingham, AL, 35294-3410, USA
- Jakub Godzik
- Department of Neurosurgery, University of Alabama at Birmingham, 1060 Faculty Office Tower, 1720 2nd Avenue South, Birmingham, AL, 35294-3410, USA
29
Huang X, Liu X, Zhu B, Hou X, Hai B, Yu D, Zheng W, Li R, Pan J, Yao Y, Dai Z, Zeng H. Augmented Reality Surgical Navigation in Minimally Invasive Spine Surgery: A Preclinical Study. Bioengineering (Basel) 2023; 10:1094. [PMID: 37760196] [PMCID: PMC10525156] [DOI: 10.3390/bioengineering10091094]
Abstract
BACKGROUND In minimally invasive spine surgery (MISS), the surgeon cannot directly see the patient's internal anatomical structures; the implementation of augmented reality (AR) technology may solve this problem. METHODS We combined AR, artificial intelligence, and optical tracking to enhance the augmented reality minimally invasive spine surgery (AR-MISS) system. The system has three functions: AR radiograph superimposition, real-time AR puncture needle tracking, and AR intraoperative navigation. The three functions of the system were evaluated in beagle animal experiments. RESULTS The AR radiographs were successfully superimposed on the real intraoperative videos. The anteroposterior (AP) and lateral errors of the superimposed AR radiographs were 0.74 ± 0.21 mm and 1.13 ± 0.40 mm, respectively. The puncture needles could be tracked by the AR-MISS system in real time. The AP and lateral errors of the real-time AR needle tracking were 1.26 ± 0.20 mm and 1.22 ± 0.25 mm, respectively. With the help of the AR radiographs and AR puncture needles, the puncture procedure could be guided visually by the system in real time. The AP and lateral errors of AR-guided puncture were 2.47 ± 0.86 mm and 2.85 ± 1.17 mm, respectively. CONCLUSIONS The results indicate that the AR-MISS system is accurate and applicable.
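Accuracy figures of the "mean ± SD in mm" form above are conventionally obtained by measuring the offset between each projected virtual landmark and its ground-truth position, then summarizing with the sample standard deviation. A minimal sketch with made-up offset values for illustration (not the study's raw data):

```python
import math

def mean_sd(values):
    """Mean and sample standard deviation (ddof = 1) of a list of offsets."""
    n = len(values)
    mean = sum(values) / n
    # Sample variance divides by n - 1 (Bessel's correction)
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, math.sqrt(var)

# Hypothetical anteroposterior overlay offsets in millimeters
ap_offsets = [0.52, 0.71, 0.88, 0.64, 0.95]
mean, sd = mean_sd(ap_offsets)
print(f"AP error: {mean:.2f} +/- {sd:.2f} mm")
```

In a real evaluation each offset would come from many landmarks across many trials; the summary format is the same regardless of sample size.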
Affiliation(s)
- Xin Huang
- Pain Medicine Center, Peking University Third Hospital, Beijing 100191, China
- Xiaoguang Liu
- Pain Medicine Center, Peking University Third Hospital, Beijing 100191, China
- Department of Orthopedics, Peking University Third Hospital, Beijing 100191, China
- Bin Zhu
- Department of Orthopedics, Beijing Friendship Hospital, Beijing 100052, China
- Xiangyu Hou
- Department of Orthopedics, Peking University Third Hospital, Beijing 100191, China
- Bao Hai
- Department of Orthopedics, Peking University Third Hospital, Beijing 100191, China
- Dongfang Yu
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Wenhao Zheng
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Ranyang Li
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Junjun Pan
- State Key Laboratory of Virtual Reality Technology and Systems, Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100191, China
- Youjie Yao
- Smart Learning Institute, Beijing Normal University, Beijing 100875, China
- Zailin Dai
- Smart Learning Institute, Beijing Normal University, Beijing 100875, China
- Haijun Zeng
- Smart Learning Institute, Beijing Normal University, Beijing 100875, China
30
Raffa G, Spiriev T, Zoia C, Aldea CC, Bartek Jr J, Bauer M, Ben-Shalom N, Belo D, Drosos E, Freyschlag CF, Kaprovoy S, Lepic M, Lippa L, Rabiei K, Schwake M, Stengel FC, Stienen MN, Gandía-González ML. The use of advanced technology for preoperative planning in cranial surgery - A survey by the EANS Young Neurosurgeons Committee. Brain Spine 2023; 3:102665. [PMID: 38021023] [PMCID: PMC10668051] [DOI: 10.1016/j.bas.2023.102665]
Abstract
Introduction Technological advancements have provided several preoperative tools allowing for precise preoperative planning in cranial neurosurgery, aiming to increase the efficacy and safety of surgery. However, little data are available regarding whether and how young neurosurgeons are trained in using such technologies, how often they use them in clinical practice, and how valuable they consider them. Research question We asked how frequently these technologies are used during training and clinical practice, and how their perceived value can be qualitatively assessed. Materials and methods The Young Neurosurgeons' Committee (YNC) of the European Association of Neurosurgical Societies (EANS) distributed a 14-item survey among young neurosurgeons between June 1st and August 31st, 2022. Results A total of 441 responses were collected. Most respondents (42.34%) received "formal" training during their residency. Planning techniques were used mainly in neuro-oncology (90.86%), and 3D visualization of patients' DICOM datasets using open-source software was the most frequently used technique (>20 times/month for 20.34% of respondents). Software for 3D visualization of patients' DICOM datasets was also rated the most valuable technology, especially for planning the surgical approach (42.03%). Conversely, simulation based on augmented/mixed/virtual reality was considered the least valuable tool, being rated below sufficiency by 39.7% of respondents. Discussion and conclusion Training in the use of preoperative planning technologies in cranial neurosurgery is provided by neurosurgical residency programs. Software for 3D visualization of DICOM datasets is the most valued and most used tool, especially in neuro-oncology. Interestingly, simulation tools based on augmented/virtual/mixed reality are considered less valuable and are therefore used less than other technologies.
Affiliation(s)
- Giovanni Raffa
- Division of Neurosurgery, BIOMORF Department, University of Messina, Messina, Italy
- Toma Spiriev
- Department of Neurosurgery, Acibadem CityClinic Tokuda Hospital Sofia, Bulgaria
- Cesare Zoia
- Neurosurgery Unit, Fondazione IRCCS Policlinico San Matteo, Pavia, Italy
- Cristina C. Aldea
- Department of Neurosurgery, Cluj County Emergency Hospital, University of Medicine and Pharmacy Iuliu Hatieganu, Cluj-Napoca, Romania
- Jiri Bartek Jr
- Department of Clinical Neuroscience, Karolinska Institutet and Department of Neurosurgery, Karolinska University Hospital, Stockholm, Sweden
- Department of Neurosurgery, Rigshospitalet, Copenhagen, Denmark
- Marlies Bauer
- Department of Neurosurgery, Medical University of Innsbruck, Innsbruck, Austria
- Netanel Ben-Shalom
- Department of Neurosurgery, Rabin Medical Center, Belinson Campus, Petah Tikva, Israel
- Diogo Belo
- Neurosurgery Department, Centro Hospitalar Lisboa Norte (CHLN), Lisbon, Portugal
- Stanislav Kaprovoy
- Burdenko Neurosurgical Center, Department of Spinal and Peripheral Nerve Surgery, Department of International Affairs, Moscow, Russia
- Milan Lepic
- Clinic for Neurosurgery, Military Medical Academy, Belgrade, Serbia
- Laura Lippa
- Department of Neurosurgery, ASST Ospedale Niguarda, Milano, Italy
- Katrin Rabiei
- Institution of Neuroscience & Physiology, Sahlgrenska Academy, Gothenburg, Sweden
- Art Clinic Hospitals, Gothenburg, Sweden
- Michael Schwake
- Department of Neurosurgery, University Hospital Muenster, Germany
- Felix C. Stengel
- Department of Neurosurgery and Spine Center of Eastern Switzerland, Cantonal Hospital St.Gallen, St.Gallen, Switzerland
- Martin N. Stienen
- Department of Neurosurgery and Spine Center of Eastern Switzerland, Cantonal Hospital St.Gallen, St.Gallen, Switzerland
- Maria L. Gandía-González
- Department of Neurosurgery, Hospital Universitario La Paz, Idipaz, Madrid, Spain
- Autonomous University of Madrid, Spain
31
Taghian A, Abo-Zahhad M, Sayed MS, Abd El-Malek AH. Virtual and augmented reality in biomedical engineering. Biomed Eng Online 2023; 22:76. [PMID: 37525193] [PMCID: PMC10391968] [DOI: 10.1186/s12938-023-01138-3]
Abstract
BACKGROUND Extended reality technology is expected to become widespread, with people adopting virtual reality (VR) and augmented reality (AR) in daily life, hobbies, entertainment, and work. Medical augmented reality has evolved with applications ranging from medical education to image-guided surgery. Most research focuses on clinical applications, with the majority devoted to surgery or intervention, followed by rehabilitation and treatment; numerous studies have also examined augmented reality in medical education and training. METHODS A scoping review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria using the databases Semantic Scholar, Web of Science, Scopus, IEEE Xplore, and ScienceDirect; a manual search was also carried out in Google Scholar. Studies from the previous 14 years (2009 to 2023) are presented in detail and classified into the following categories: (1) AR and VR in surgery (MR in neurosurgery; spine surgery; oral and maxillofacial surgery; AR-enhanced human-robot interaction); (2) AR and VR in medical education (medical training; schools and curriculum; XR in biomedicine); (3) AR and VR for rehabilitation (stroke rehabilitation during COVID-19; cancer and VR); and (4) millimeter-wave and MIMO systems for AR and VR. RESULTS In total, 77 publications were selected based on the inclusion criteria. Four distinct application groups were differentiated: AR and VR in surgery (N = 21), AR and VR in medical education (N = 30), AR and VR for rehabilitation (N = 15), and millimeter-wave and MIMO systems for AR and VR (N = 7), where N is the number of cited studies. The majority of research is devoted to medical training and education, with surgical or interventional applications second, followed by rehabilitation, therapy, and other clinical applications; the application of XR in MIMO systems has also been the subject of numerous studies. CONCLUSION Examples of these diverse fields of application are presented in this review: (1) AR and VR in surgery; (2) AR and VR in medical education; (3) AR and VR for rehabilitation; and (4) millimeter-wave and MIMO systems for AR and VR.
Affiliation(s)
- Aya Taghian
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Mohammed Abo-Zahhad
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Department of Electrical Engineering, Assiut University, Assiut, Egypt
- Mohammed S Sayed
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
- Department of Electronics and Communications Engineering, Zagazig University, Zagazig, Ash Sharqia, Egypt
- Ahmed H Abd El-Malek
- Department of Electronics and Communications Engineering, Egypt-Japan University of Science and Technology, New Borg El-Arab City, Alexandria, Egypt
32
Dong J, Wang F, Xu Y, Gao X, Zhao H, Zhang J, Wang N, Liu Z, Yan X, Jin J, Ji H, Cheng R, Wang L, Qiu Z, Hu S. Using mixed reality technique combines multimodal imaging signatures to adjuvant glioma photodynamic therapy. Front Med (Lausanne) 2023; 10:1171819. [PMID: 37534312 PMCID: PMC10392826 DOI: 10.3389/fmed.2023.1171819]
Abstract
Background Photodynamic therapy (PDT) promotes significant tumor regression and extends the lifetime of patients, but its actual operation often relies on the subjective judgment of experienced neurosurgeons; patients can benefit more from precise targeting of PDT's key operating zones. Methods We used magnetic resonance imaging scans to create 3D digital models of patient anatomy. Multiple images were aligned and merged in STL format, and neurosurgeons used a HoloLens to import the reconstructions and assist in PDT execution. Immunohistochemistry was used to explore the association between hyperperfusion sites in glioma PDT and patient survival. Results We constructed satisfactory 3D visualizations of glioma models and accurately localized the hyperperfused areas of the tumor. Tumor tissue taken from these areas was rich in CD31, VEGFA, and EGFR, which are associated with poor prognosis in glioma patients. We report the first study using mixed reality technology combined with PDT in the treatment of glioma. Based on this model, neurosurgeons can focus PDT on the hyperperfused area of the glioma, with a direct benefit expected for patients. Conclusion Using mixed reality to combine multimodal imaging signatures for adjuvant glioma PDT can better exploit the vascular sealing effect of PDT on glioma.
Affiliation(s)
- Jiawei Dong
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Fang Wang
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Yuyun Xu
- Cancer Center, Department of Radiology, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Xin Gao
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Hongtao Zhao
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Jiheng Zhang
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Nan Wang
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Zhihui Liu
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Xiuwei Yan
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Jiaqi Jin
- Department of Neurosurgery, The Second Affiliated Hospital of Harbin Medical University, Harbin, China
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Hang Ji
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
- Ruiqi Cheng
- Heilongjiang Tuomeng Technology Co., Ltd, Harbin, China
- Lihai Wang
- College of Engineering and Technology, Northeast Forestry University, Harbin, China
- Zhaowen Qiu
- College of Information and Computer Engineering, Northeast Forestry University, Harbin, China
- Shaoshan Hu
- Cancer Center, Department of Neurosurgery, Zhejiang Provincial People’s Hospital, Affiliated People’s Hospital, Hangzhou Medical College, Hangzhou, Zhejiang, China
33
Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023; 23:6202. [PMID: 37448050 DOI: 10.3390/s23136202]
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), most devices remain focused on improving end-effector dexterity and precision and on improving access to minimally invasive surgery. This paper provides a systematic review of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions with increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while proposing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, supporting the prospect of surgical clearance for ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis
- School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
- Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
34
Wang J, Liu J, Wu L, Tao L, Liu X, Wang Z, Xiong Y. Accuracy of Femoral Tunnel Localization With Mixed Reality Technology-Assisted Single-Bundle ACL Reconstruction. Orthop J Sports Med 2023; 11:23259671231184399. [PMID: 37457048 PMCID: PMC10338724 DOI: 10.1177/23259671231184399]
Abstract
Background It is clinically challenging to accurately drill femoral and tibial tunnels to reconstruct the anterior cruciate ligament (ACL). Mixed reality (MR) technology, a further development of virtual reality technology, presents virtual scene information in real time and establishes an interactive feedback loop among the real world, the virtual world, and the user. Purpose/Hypothesis The purpose of this study was to investigate the structural and early clinical outcomes of ACL reconstruction assisted by MR technology. It was hypothesized that MR technology would improve the accuracy of tunnel localization. Study Design Cohort study; Level of evidence, 3. Methods Included were 44 patients at a single institution who underwent arthroscopic single-bundle ACL reconstruction between June 2020 and March 2022. Reconstruction with the aid of MR technology was performed in 21 patients (MR group), and conventional arthroscopic reconstruction was performed in 23 patients. Postoperatively, parameters related to bone tunnel positioning were compared by computed tomography imaging with 3-dimensional (3D) reconstruction, and 12-month postoperative clinical outcomes were assessed with the Lysholm and International Knee Documentation Committee scores. Results There was no statistically significant difference in projection angles in the coronal, axial, or sagittal plane between the preoperative virtually created tunnel guide pin and the actual tunnel (P > .05 for all). In the MR group, the center of the femoral tunnel exit was closer to the apex of the lateral femoral condyle along the proximal-distal axis (14.07 ± 4.12 vs 17.49 ± 6.24 mm for the conventional group; P < .05), and the graft bending angle was lower (117.71° ± 8.08° vs 127.81° ± 11.91° for the conventional group; P < .05). The scatterplot of the femoral tunnel location distribution showed that the entrance and exit points in the MR group were more concentrated and closer to the ideal location of the preoperative design than in the conventional group. Patients in both groups had significant preoperative-to-postoperative improvement in outcome scores (P < .001 for all), with no significant difference between groups. Conclusion ACL reconstruction with the aid of MR technology allowed for more accurate positioning and orientation of the femoral tunnel during surgery when compared with conventional reconstruction.
Affiliation(s)
- Jingkun Wang, Jun Liu, Liming Wu, Lun Tao, Xiangdong Liu, Ziming Wang, Yan Xiong
- Department of Orthopaedics, Daping Hospital, Army Medical University, Chongqing, China
35
Dirrichs T, Tietz E, Rüffer A, Hanten J, Nguyen TD, Dethlefsen E, Kuhl CK. Photon-counting versus Dual-Source CT of Congenital Heart Defects in Neonates and Infants: Initial Experience. Radiology 2023; 307:e223088. [PMID: 37219443 DOI: 10.1148/radiol.223088]
Abstract
Background Photon-counting CT (PCCT) has been shown to improve cardiovascular CT imaging in adults; data in neonates, infants, and young children under the age of 3 years are missing. Purpose To compare the image quality and radiation dose of ultrahigh-pitch PCCT with those of ultrahigh-pitch dual-source CT (DSCT) in children suspected of having congenital heart defects. Materials and Methods This is a prospective analysis of existing clinical CT studies in children suspected of having congenital heart defects who underwent contrast-enhanced PCCT or DSCT of the heart and thoracic aorta between January 2019 and October 2022. CT dose index and dose-length product were used to calculate effective radiation dose. Signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated by standardized region-of-interest analysis, and SNR and CNR dose ratios were derived. Visual image quality was assessed by four independent readers on a five-point scale: 5, excellent or absent; 4, good or minimal; 3, moderate; 2, limited or substantial; and 1, poor or massive. Results Contrast-enhanced PCCT (n = 30) or DSCT (n = 84) was performed in 113 children (55 female and 58 male participants; median age, 66 days [IQR, 15-270]; median height, 56 cm [IQR, 52-67]; median weight, 4.5 kg [IQR, 3.4-7.1]). A diagnostic image quality score of at least 3 was obtained in 29 of 30 examinations (97%) with PCCT versus 65 of 84 (77%) with DSCT. Mean overall image quality ratings were higher for PCCT than for DSCT (4.17 vs 3.16; P < .001), as were SNR (46.3 ± 16.3 vs 29.9 ± 15.3; P = .007) and CNR (62.0 ± 50.3 vs 37.2 ± 20.8; P = .001). Mean effective radiation doses were similar for PCCT and DSCT (0.50 mSv vs 0.52 mSv; P = .47). Conclusion At a similar radiation dose, PCCT offers a higher SNR and CNR and thus better cardiovascular imaging quality than DSCT in children suspected of having congenital heart defects. © RSNA, 2023.
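The SNR, CNR, and effective-dose figures quoted in this entry follow standard CT definitions. As a minimal illustrative sketch (the function names, Hounsfield-unit values, and conversion coefficient below are assumptions for demonstration, not values from the paper):

```python
def snr(roi_mean_hu: float, noise_sd_hu: float) -> float:
    """Signal-to-noise ratio: mean ROI attenuation (HU) over image noise (SD of HU)."""
    return roi_mean_hu / noise_sd_hu

def cnr(vessel_mean_hu: float, background_mean_hu: float, noise_sd_hu: float) -> float:
    """Contrast-to-noise ratio: vessel-to-background attenuation difference over noise."""
    return (vessel_mean_hu - background_mean_hu) / noise_sd_hu

def effective_dose_msv(dlp_mgy_cm: float, k: float) -> float:
    """Effective dose estimated as dose-length product (DLP, mGy*cm) times an
    age- and body-region-specific conversion coefficient k (mSv per mGy*cm)."""
    return dlp_mgy_cm * k

# Made-up example: 300 HU vessel ROI, 50 HU background ROI, 10 HU noise SD.
print(snr(300, 10))      # 30.0
print(cnr(300, 50, 10))  # 25.0
```

The SNR/CNR "dose ratios" mentioned in the abstract would then simply divide these quantities by the effective dose of the scan.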
Affiliation(s)
- Timm Dirrichs, Eric Tietz, André Rüffer, Jens Hanten, Thai Duy Nguyen, Ebba Dethlefsen, Christiane K Kuhl
- From the Department of Diagnostic and Interventional Radiology (T.D., E.T., E.D., C.K.K.), Department of Pediatric Heart Surgery (A.R., T.D.N.), and Department of Pediatric Cardiology (J.H.), RWTH Aachen University Hospital, Pauwelsstr 30, 52074 Aachen, Germany
36
Li H, Zhang P, Wang G, Liu H, Yang X, Wang G, Sun Z. Real-Time Navigation with Guide Template for Pedicle Screw Placement Using an Augmented Reality Head-Mounted Device: A Proof-of-Concept Study. Indian J Orthop 2023; 57:776-781. [PMID: 37128571 PMCID: PMC10147887 DOI: 10.1007/s43465-023-00859-w]
Abstract
Objective This study aims to explore real-time navigation with a guide template using an augmented reality head-mounted device (ARHMD) for pedicle screw placement. Methods The spatial coordinate relationships between augmented reality images and real objects were established through a custom-made guide template, and registration and tracking were completed using an ARHMD. The feasibility and accuracy of this method were verified by pedicle screw placement in 2 lumbar models. The accuracy of pedicle screw placement was assessed according to the Gertzbein-Robbins grading scale, and navigation errors were estimated by measuring the deviations of the entry point and trajectory angle. Results A total of 20 pedicle K-wires were successfully placed into L1-L5 in 2 lumbar models, with an average time of 11.5 min per model and 69 s per screw. The overall K-wire placement accuracy was 100% (20 screws). The navigation error was 2.77 ± 0.82 mm for the deviation of the entry point and 3.03° ± 0.94° for the deviation of the trajectory angle. Conclusions The application of an ARHMD combined with a guide template for pedicle screw placement is a promising navigation approach.
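The two navigation-error measures reported in this entry, entry-point deviation and trajectory-angle deviation, have straightforward geometric definitions. A hedged Python sketch (the study's exact measurement procedure on the models is not specified here; point and vector values are illustrative):

```python
import math

def entry_point_deviation(planned: tuple, actual: tuple) -> float:
    """Euclidean distance (mm) between planned and actual screw entry points."""
    return math.dist(planned, actual)

def trajectory_angle_deviation(v_planned: tuple, v_actual: tuple) -> float:
    """Angle (degrees) between planned and actual trajectory direction vectors."""
    dot = sum(a * b for a, b in zip(v_planned, v_actual))
    cos_theta = dot / (math.hypot(*v_planned) * math.hypot(*v_actual))
    # Clamp to [-1, 1] to guard against floating-point rounding before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

print(entry_point_deviation((0, 0, 0), (3, 4, 0)))                 # 5.0
print(round(trajectory_angle_deviation((1, 0, 0), (1, 1, 0)), 1))  # 45.0
```

Averaging these per-screw deviations over all 20 K-wires would yield summary errors of the form reported above (mean ± SD).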
Affiliation(s)
- Haowei Li
- Tsinghua University School of Medicine, Beijing 100091, China
- Peihai Zhang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing 102218, China
- Guangzhi Wang
- Tsinghua University School of Medicine, Beijing 100091, China
- Huiting Liu
- Peking Union Medical College Hospital, Beijing 100730, China
- Xuejun Yang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing 102218, China
- Guihuai Wang
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing 102218, China
- Zhenxing Sun
- Department of Neurosurgery, Beijing Tsinghua Changgung Hospital, Tsinghua University, Beijing 102218, China
37
Ruggiero F, Cercenelli L, Emiliani N, Badiali G, Bevini M, Zucchelli M, Marcelli E, Tarsitano A. Preclinical Application of Augmented Reality in Pediatric Craniofacial Surgery: An Accuracy Study. J Clin Med 2023; 12:2693. [PMID: 37048777 PMCID: PMC10095377 DOI: 10.3390/jcm12072693]
Abstract
Background: Augmented reality (AR) allows virtual information to be overlaid on and integrated with the real environment: the camera of the AR device reads the object and integrates the virtual data. AR has been widely applied to the medical and surgical sciences in recent years and has the potential to enhance intraoperative navigation. Materials and methods: In this study, the authors assess the accuracy of AR guidance using the commercial HoloLens 2 head-mounted display (HMD) in pediatric craniofacial surgery. Fronto-orbital remodeling (FOR) was selected as the test procedure (specifically, the frontal osteotomy and the nasal osteotomy). Six people (three surgeons and three engineers) were recruited to perform the osteotomies on a 3D-printed stereolithographic model under AR guidance. By means of calibrated CAD/CAM cutting guides with different grooves, the accuracy of the performed osteotomies was measured at tolerance levels of ±1.5 mm, ±1 mm, and ±0.5 mm. Results: With the HoloLens 2, the majority of participants successfully traced the trajectories of the frontal and nasal osteotomies within ±1.5 mm. Additionally, 80% achieved ±1 mm for the nasal osteotomy and 52% for the frontal osteotomy, while 61% achieved ±0.5 mm for the nasal osteotomy and 33% for the frontal osteotomy. Conclusions: Despite this being an in vitro study, the authors report encouraging results for the prospective use of AR on actual patients.
Affiliation(s)
- Federica Ruggiero
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Maxillo-Facial Surgery Unit, AUSL Bologna, 40124 Bologna, Italy
- Laura Cercenelli
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Nicolas Emiliani
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Mirko Bevini
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Mino Zucchelli
- Pediatric Neurosurgery, IRCCS Istituto delle Scienze Neurologiche di Bologna, Via Altura 3, 40138 Bologna, Italy
- Emanuela Marcelli
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
38
Kwon H, Park JY. The Role and Future of Endoscopic Spine Surgery: A Narrative Review. Neurospine 2023; 20:43-55. [PMID: 37016853 PMCID: PMC10080412 DOI: 10.14245/ns.2346236.118]
Abstract
Many types of surgery are changing from conventional to minimally invasive techniques. Techniques in spine surgery have also changed, with endoscopic spine surgery (ESS) becoming a major surgical technique. Although ESS has advantages such as less soft tissue dissection and damage to normal structures, reduced blood loss, less epidural scarring, shorter hospital stay, and earlier functional recovery, it cannot replace all spine surgery techniques. ESS was first used for discectomy in the lumbar spine, but its range has expanded to cover the entire spine, including the cervical and thoracic spine. With improvements in ESS instruments (optics, endoscopes, endoscopic drills and shavers, irrigation pumps, and multiportal endoscopy), the limitations of ESS have gradually decreased, and it can be applied to more spine pathologies. ESS currently incorporates new technologies, such as navigation, augmented and virtual reality, robotics, and 3-dimensional and ultraresolution visualization, to innovate and improve outcomes. In this article, we review the history and current status of ESS and discuss future goals and possibilities for ESS through comparisons with conventional surgical techniques.
Affiliation(s)
- Hyungjoo Kwon
- Department of Neurosurgery, Nowon Eulji Medical Center, Eulji University School of Medicine, Seoul, Korea
- Jeong-Yoon Park
- Department of Neurosurgery, Spine and Spinal Cord Institute, Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul, Korea
39
Augmented Reality in HBP surgery. Technology at your fingertips. Cir Esp 2023; 101:312-318. [PMID: 36781048 DOI: 10.1016/j.cireng.2023.02.004]
Abstract
Augmented reality is a technology that opens new possibilities in surgery. We present our experience in a hepatobiliary-pancreatic surgery unit in terms of preoperative planning, intraoperative support, and teaching. For surgical planning, we have used 3D CT and MRI reconstructions to evaluate complex cases, which has made interpretation of the anatomy more precise and planning of the technique simpler. Intraoperatively, the technology allows remote holographic connection between specialists, the substitution of physical elements with virtual ones, and the use of virtual consultation models and surgical guides. In teaching, new possibilities include sharing live video of surgery with the support of virtual elements for better student understanding. As the experience has been satisfactory, augmented reality could be applied in the future to improve the results of hepatobiliary-pancreatic surgery.
40
Human body donation and surgical training: a narrative review with global perspectives. Anat Sci Int 2023; 98:1-11. [PMID: 36227535 PMCID: PMC9845172 DOI: 10.1007/s12565-022-00689-0]
Abstract
The utilization of human material in surgical simulation training is well established as an effective teaching method. Despite the value of donor-based surgical simulation training, its application may be hampered by difficulties in accessing donated bodies. Therefore, the aim of this review is to assess body donation and body acquisition practices with regard to surgical simulation training programs around the world. The results highlight discrepancies in body donation practices and surgical simulation programs among continents and countries: the utilization of donor bodies in surgical simulation appears to mirror body donation practices. In countries that rely mostly or exclusively upon unclaimed bodies or executed criminals, there are scant reports of donor-based surgical simulation programs; in countries where willed-body donation is the principal source of human material, there tend to be many surgical simulation programs that incorporate human material in surgical training. This review suggests that, in anatomical and surgical education, active willed-body donation programs, as opposed to the utilization of unclaimed bodies, positively correspond with the development of beneficial donor-based surgical simulation programs. Likewise, donor-based surgical simulation training programs may influence the perpetuation of willed-body donations.
41
Jun EK, Lim S, Seo J, Lee KH, Lee JH, Lee D, Koh JC. Augmented Reality-Assisted Navigation System for Transforaminal Epidural Injection. J Pain Res 2023; 16:921-931. [PMID: 36960464 PMCID: PMC10029754 DOI: 10.2147/jpr.s400955]
Abstract
Purpose Multiple studies have attempted to demonstrate the benefits of augmented reality (AR)-assisted navigation systems in surgery. Lumbosacral transforaminal epidural injection is an effective treatment commonly used in patients with radiculopathy due to spinal degenerative pathologies. However, few studies have applied AR-assisted navigation systems to this procedure. This study aimed to investigate the safety and effectiveness of an AR-assisted navigation system for transforaminal epidural injection. Patients and Methods Using a real-time tracking system and a wireless network connection to a head-mounted display, computed tomography images of the spine and the planned path of the spinal needle to the target were visualized on a torso phantom that simulated respiratory movement. From L1/L2 to L5/S1, needle insertions were performed using the AR-assisted system on the left side of the phantom and the conventional method on the right side. Results The procedure duration was approximately three times shorter, and fewer radiographs were required, in the experimental group than in the control group. The distance from the needle tips to the planned target areas showed no significant difference between the two groups (AR group 1.7 ± 2.3 mm, control group 3.2 ± 2.8 mm, P = 0.067). Conclusion An AR-assisted navigation system may be used to reduce the time required for spinal interventions and to ensure the safety of patients and physicians with respect to radiation exposure. Further studies are essential to apply AR-assisted navigation systems to spine interventions.
Affiliation(s)
- Eun Kyung Jun
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Sunghwan Lim
- Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul, Korea
- Joonho Seo
- Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu, Korea
- Kae Hong Lee
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Jae Hee Lee
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Deukhee Lee
- Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul, Korea
- Correspondence: Deukhee Lee, Center for Bionics, Korea Institute of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul, 136-791, Republic of Korea, Tel +82-2-958-5633, Fax +82-2-920-2275
- Jae Chul Koh
- Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea
- Correspondence: Jae Chul Koh, Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, 73, Goryeodae-ro, Seongbuk-gu, Seoul, 02841, Korea, Tel +82-2-920-5632, Fax +82-2-920-2275
42
Iwanaga J, Muo EC, Tabira Y, Watanabe K, Tubbs SJ, D'Antoni AV, Rajaram-Gilkes M, Loukas M, Khalil MK, Tubbs RS. Who really needs a Metaverse in anatomy education? A review with preliminary survey results. Clin Anat 2023; 36:77-82. [PMID: 36087277 DOI: 10.1002/ca.23949]
Abstract
The term Metaverse ("meta" meaning beyond, transcendence, or virtuality, and "verse" meaning universe or world) denotes a "virtual reality space" for anatomy teaching. To ascertain how many anatomists are familiar with or are using this adjunct in teaching, we conducted a short survey at the 2022 annual meeting of the American Association of Clinical Anatomists (AACA). Interestingly, only six respondents (9.4%) had used a Metaverse for teaching anatomy. Moreover, the vast majority of attendees were anatomy educators or basic science faculty, not practicing physicians/surgeons or other actively practicing health care professionals; a group in which this technology has been used much more commonly. The present manuscript was authored by anatomy educators, practicing physicians, and other actively practicing health care professionals with backgrounds in diverse medical fields, that is, anatomists, medical doctors, physician assistants, dentists, occupational therapists, physical therapists, chiropractors, veterinarians, and medical students. Many of these authors have used or have been exposed to a Metaverse in the clinical realm. Therefore, the aim of this paper is to better understand who is knowledgeable about a Metaverse and its use in anatomy education, and to provide ways forward for using such technology in this discipline.
Affiliation(s)
- Joe Iwanaga
- Department of Neurosurgery, Tulane Center for Clinical Neurosciences, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Neurology, Tulane Center for Clinical Neurosciences, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Structural & Cellular Biology, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Oral and Maxillofacial Anatomy, Graduate School of Medical and Dental Sciences, Tokyo Medical and Dental University, Tokyo, Japan
- Edward C Muo
- Tulane University School of Medicine, New Orleans, Louisiana, USA
- Yoko Tabira
- Division of Gross and Clinical Anatomy, Department of Anatomy, Kurume University School of Medicine, Fukuoka, Japan
- Koichi Watanabe
- Division of Gross and Clinical Anatomy, Department of Anatomy, Kurume University School of Medicine, Fukuoka, Japan
- Susan J Tubbs
- Department of Neurosurgery, Tulane Center for Clinical Neurosciences, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Anthony V D'Antoni
- Physician Assistant Program, Wagner College, Staten Island, New York, USA
- Division of Anatomy, Department of Radiology, Weill Cornell Medicine, New York, New York, USA
- Mathangi Rajaram-Gilkes
- Anatomical Sciences, Department of Medical Education, Geisinger Commonwealth School of Medicine, Scranton, Pennsylvania, USA
- Marios Loukas
- Department of Anatomical Sciences, St. George's University, St. George's, Grenada
- Mohammed K Khalil
- Biomedical Sciences, University of South Carolina, School of Medicine Greenville, Greenville, South Carolina, USA
- R Shane Tubbs
- Department of Neurosurgery, Tulane Center for Clinical Neurosciences, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Neurology, Tulane Center for Clinical Neurosciences, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Structural & Cellular Biology, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Anatomical Sciences, St. George's University, St. George's, Grenada
- Department of Otorhinolaryngology, Graduate School of Medical Sciences, Kyushu University, Fukuoka, Japan
- Department of Surgery, Tulane University School of Medicine, New Orleans, Louisiana, USA
- Department of Neurosurgery and Ochsner Neuroscience Institute, Ochsner Health System, New Orleans, Louisiana, USA
- University of Queensland, Brisbane, Australia
43
Berger MF, Winter R, Tuca AC, Michelitsch B, Schenkenfelder B, Hartmann R, Giretzlehner M, Reishofer G, Kamolz LP, Lumenta DB. Workflow assessment of an augmented reality application for planning of perforator flaps in plastic reconstructive surgery: Game or game changer? Digit Health 2023; 9:20552076231173554. [PMID: 37179745 PMCID: PMC10170605 DOI: 10.1177/20552076231173554]
Abstract
Objective Worldwide financial investment in research and development for medical technology is rising, yet many of the resulting systems lack usability and clinical readiness. We evaluated an augmented reality (AR) setup under development for preoperative perforator vessel mapping for elective autologous breast reconstruction. Methods In this grant-supported research pilot, we used magnetic resonance angiography data (MR-A) of the trunk and superimposed the scans on the corresponding patients with hands-free AR goggles to identify regions of interest for surgical planning. Perforator location was assessed using MR-A imaging (MR-A projection) and Doppler ultrasound data (3D distance) and confirmed intraoperatively in all cases. We evaluated usability (System Usability Scale, SUS) and data transfer load, and documented personnel hours for software development, correlation of image data, and processing duration to clinical readiness (time from MR-A to AR projections per scan). Results All perforator locations were confirmed intraoperatively, and we found a strong correlation between MR-A projection and 3D distance measurements (Spearman r = 0.894). The overall usability (SUS) was 67 ± 10 (moderate to good). The presented setup for AR projections took 173 min to clinical readiness (availability on the AR device per patient). Conclusion In this pilot, we calculated development investments based on project-approved grant-funded personnel hours, with a moderate to good usability outcome subject to some limitations: assessment was based on one-time testing with no previous training, a time lag of AR visualizations on the body, and difficulties in spatial AR orientation.
The use of AR systems can provide new opportunities for future surgical planning, but has more potential for educational (e.g., patient information) or training purposes of medical under- and postgraduates (spatial recognition of imaging data associated with anatomical structures and operative planning). We expect future usability improvements with refined user interfaces, faster AR hardware and artificial intelligence-enhanced visualization techniques.
Affiliation(s)
- Matthias Fabian Berger
- Research Unit for Digital Surgery, Division of Plastic, Aesthetic and Reconstructive Surgery, Department of Surgery, Medical University of Graz, Graz, Austria
- Raimund Winter
- Research Unit for Digital Surgery, Division of Plastic, Aesthetic and Reconstructive Surgery, Department of Surgery, Medical University of Graz, Graz, Austria
- Alexandru-Cristian Tuca
- Research Unit for Digital Surgery, Division of Plastic, Aesthetic and Reconstructive Surgery, Department of Surgery, Medical University of Graz, Graz, Austria
- Birgit Michelitsch
- Research Unit for Digital Surgery, Division of Plastic, Aesthetic and Reconstructive Surgery, Department of Surgery, Medical University of Graz, Graz, Austria
- Gernot Reishofer
- Radiology Lab, Department of Radiology, Medical University of Graz, Graz, Austria
- Lars-Peter Kamolz
- Research Unit for Digital Surgery, Division of Plastic, Aesthetic and Reconstructive Surgery, Department of Surgery, Medical University of Graz, Graz, Austria
- David Benjamin Lumenta
- Research Unit for Digital Surgery, Division of Plastic, Aesthetic and Reconstructive Surgery, Department of Surgery, Medical University of Graz, Graz, Austria
|
44
|
Chen X, Sakai D, Fukuoka H, Shirai R, Ebina K, Shibuya S, Sase K, Tsujita T, Abe T, Oka K, Konno A. Basic Experiments Toward Mixed Reality Dynamic Navigation for Laparoscopic Surgery. Journal of Robotics and Mechatronics 2022. [DOI: 10.20965/jrm.2022.p1253]
Abstract
Laparoscopic surgery is a minimally invasive procedure performed by viewing endoscopic camera images. However, the limited field of view of endoscopic cameras makes laparoscopic surgery difficult. To provide more visual information during laparoscopic surgeries, augmented reality (AR) surgical navigation systems have been developed to visualize the positional relationship between the surgical field and organs based on preoperative medical images of a patient. However, because earlier studies relied on preoperative medical images, the navigation became inaccurate as the surgery progressed and the organs were displaced and deformed. To solve this problem, we propose a mixed reality (MR) surgical navigation system in which surgical instruments are tracked by a motion capture (Mocap) system, contact between the instruments and organs is evaluated, and the resulting organ deformation is simulated and visualized. This paper describes a method for the numerical calculation of the deformation of a soft body. Then, the basic technology of MR and projection mapping is presented for MR surgical navigation. The accuracy of the simulated and visualized deformations is evaluated through basic experiments using a soft rectangular cuboid object.
45
Ceccariglia F, Cercenelli L, Badiali G, Marcelli E, Tarsitano A. Application of Augmented Reality to Maxillary Resections: A Three-Dimensional Approach to Maxillofacial Oncologic Surgery. J Pers Med 2022; 12:2047. [PMID: 36556268 PMCID: PMC9785494 DOI: 10.3390/jpm12122047]
Abstract
Although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only now have technological and scientific advances made them suitable for revolutionizing clinical care and medical settings through the provision of advanced features and improved healthcare services. Over the past fifteen years, tools and applications using augmented reality (AR) have been designed and tested in the context of various surgical and medical disciplines, including maxillofacial surgery. The purpose of this paper is to show how a marker-less AR guidance system using the Microsoft® HoloLens 2 can be applied in mandibular and maxillary demolition surgery to guide maxillary osteotomies. We describe three mandibular and maxillary oncologic resections performed during 2021 with AR support. In these three patients, we applied a marker-less tracking method based on recognition of the patient's facial profile. The surgeon, wearing HoloLens 2 smart glasses, could see the virtual surgical planning superimposed on the patient's anatomy. We showed that performing osteotomies under AR guidance is feasible and viable, as demonstrated by comparison with osteotomies performed using CAD-CAM cutting guides. This technology has advantages and disadvantages; however, further research is needed to improve the stability and robustness of the marker-less tracking method applied to patient face recognition.
Affiliation(s)
- Francesco Ceccariglia
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Correspondence: Tel.: +39-051-2144197
- Laura Cercenelli
- eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Emanuela Marcelli
- eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
|
46
|
Current and Emerging Approaches for Spine Tumor Treatment. Int J Mol Sci 2022; 23:15680. [PMID: 36555324 PMCID: PMC9779730 DOI: 10.3390/ijms232415680]
Abstract
Spine tumors represent a significant social and medical problem, affecting the quality of life of thousands of patients and imposing a burden on healthcare systems worldwide. Encompassing a wide range of diseases, spine tumors require prompt multidisciplinary treatment strategies, being mainly approached through chemotherapy, radiotherapy, and surgical interventions, either alone or in various combinations. However, these conventional tactics exhibit a series of drawbacks (e.g., multidrug resistance, tumor recurrence, systemic adverse effects, invasiveness, formation of large bone defects) which limit their application and efficacy. Therefore, recent research focused on finding better treatment alternatives by utilizing modern technologies to overcome the challenges associated with conventional treatments. In this context, the present paper aims to describe the types of spine tumors and the most common current treatment alternatives, further detailing the recent developments in anticancer nanoformulations, personalized implants, and enhanced surgical techniques.
47
Rong K, Wu X, Xia Q, Chen J, Fei T, Li X, Jiang W. A Systematic Study to Compare the Precise Implantation of Hololens 2 Assisted with Acetabular Prosthesis for Total Hip Replacement. J Biomater Tiss Eng 2022. [DOI: 10.1166/jbt.2022.3212]
Abstract
This study aims to evaluate the accuracy of HoloLens 2-assisted implantation of the acetabular prosthesis in total hip replacement. A total of 80 orthopaedic doctors from our hospital were enrolled in this systematic study and divided into four groups according to their experience in treating orthopaedic patients and whether HoloLens 2 assistance was used: an experienced group with HoloLens 2, an experienced group without HoloLens 2, an inexperienced group with HoloLens 2, and an inexperienced group without HoloLens 2. The abduction angle, the anteversion angle, and the offsets in the abduction and anteversion angles were recorded for the four groups and used to evaluate the accuracy of HoloLens 2-assisted implantation of the acetabular prosthesis. All data were collected and analyzed. The results show a significant difference in outcomes between the experienced groups with and without HoloLens 2, and between the inexperienced groups with and without HoloLens 2; no other pairwise comparison showed a significant difference. HoloLens 2 assistance can improve the accuracy of acetabular prosthesis implantation in total hip replacement.
Affiliation(s)
- Ke Rong
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
- Xuhua Wu
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Qingquan Xia
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Jie Chen
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
- Teng Fei
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Xujun Li
- Department of Orthopedics, Minhang Hospital, Fudan University, Shanghai, 201199, China
- Weimin Jiang
- Department of Orthopedics, The First Affiliated Hospital of Soochow University, Soochow, 215006, China
|
48
|
Choi JY, Park SM, Kim HJ, Yeom JS. Recent Updates on Minimally Invasive Spine Surgery: Techniques, Technologies, and Indications. Asian Spine J 2022; 16:1013-1021. [PMID: 36573300 PMCID: PMC9827213 DOI: 10.31616/asj.2022.0436]
Abstract
A number of minimally invasive spine surgery (MISS) techniques have been developed to address the drawbacks of open spine surgery. Their advantages include small skin incisions, reduced tissue damage, quick recovery, and short hospital stays, while clinical outcomes remain comparable to those of open surgery. Initially, however, MISS was indicated for only a limited range of spinal illnesses; its indications have been expanding owing to mechanical and technological advances in medical equipment. Thus, this review presents the various MISS techniques developed to date, their surgical indications and techniques, and their advantages and disadvantages.
Affiliation(s)
- Jun-Young Choi
- Spine Center and Department of Orthopaedic Surgery, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam, Korea
- Sang-Min Park
- Spine Center and Department of Orthopaedic Surgery, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam, Korea
- Corresponding author: Sang-Min Park, Spine Center and Department of Orthopaedic Surgery, Seoul National University Bundang Hospital, Seoul National University College of Medicine, 82 Gumi-ro 173beon-gil, Bundang-gu, Seongnam 13620, Korea, Tel: +82-31-787-7208, Fax: +82-31-787-4056
- Ho-Joong Kim
- Spine Center and Department of Orthopaedic Surgery, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam, Korea
- Jin S. Yeom
- Spine Center and Department of Orthopaedic Surgery, Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam, Korea
|
49
|
Tigchelaar SS, Medress ZA, Quon J, Dang P, Barbery D, Bobrow A, Kin C, Louis R, Desai A. Augmented Reality Neuronavigation for En Bloc Resection of Spinal Column Lesions. World Neurosurg 2022; 167:102-110. [PMID: 36096393 DOI: 10.1016/j.wneu.2022.08.143]
Abstract
BACKGROUND Primary tumors involving the spine are relatively rare, but their resection is surgically challenging and carries high patient morbidity. En bloc resection of these tumors necessitates large exposures and wide tumor margins and poses risks to functionally relevant anatomical structures. Augmented reality neuronavigation (ARNV) represents a paradigm shift in neuronavigation, allowing on-demand visualization of 3D navigation data in real time directly in line with the operative field. METHODS Here, we describe the first application of ARNV to perform distal sacrococcygectomies for the en bloc removal of sacral and retrorectal lesions involving the coccyx in 2 patients, as well as a thoracic 9-11 laminectomy with costotransversectomy for en bloc removal of a schwannoma in a third patient. RESULTS In our experience, ARNV allowed our teams to minimize the length of the incision, reduce the extent of bony resection, and enhance visualization of critical adjacent anatomy. All tumors were resected en bloc, and the patients recovered well postoperatively, with no known complications. Pathologic analysis confirmed the en bloc removal of these lesions with negative margins. CONCLUSIONS We conclude that ARNV is an effective strategy for the precise, en bloc removal of spinal lesions, including both sacrococcygeal tumors involving the retrorectal space and thoracic schwannomas.
Affiliation(s)
- Seth S Tigchelaar
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Zachary A Medress
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Jennifer Quon
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
- Phuong Dang
- Surgical Theater, Inc., Cleveland, Ohio, USA
- Cindy Kin
- Department of Surgery, Stanford University Medical Center, Stanford, California, USA
- Robert Louis
- The Brain and Spine Center, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA; Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian Newport Beach, Newport Beach, California, USA
- Atman Desai
- Department of Neurosurgery, Stanford University Medical Center, Stanford, California, USA
|
50
|
Wang G, Badal A, Jia X, Maltz JS, Mueller K, Myers KJ, Niu C, Vannier M, Yan P, Yu Z, Zeng R. Development of metaverse for intelligent healthcare. Nat Mach Intell 2022; 4:922-929. [PMID: 36935774 PMCID: PMC10015955 DOI: 10.1038/s42256-022-00549-6]
Abstract
The metaverse integrates physical and virtual realities, enabling humans and their avatars to interact in an environment supported by technologies such as high-speed internet, virtual reality, augmented reality, mixed and extended reality, blockchain, digital twins and artificial intelligence (AI), all enriched by effectively unlimited data. The metaverse recently emerged as social media and entertainment platforms, but extension to healthcare could have a profound impact on clinical practice and human health. As a group of academic, industrial, clinical and regulatory researchers, we identify unique opportunities for metaverse approaches in the healthcare domain. A metaverse of 'medical technology and AI' (MeTAI) can facilitate the development, prototyping, evaluation, regulation, translation and refinement of AI-based medical practice, especially medical imaging-guided diagnosis and therapy. Here, we present metaverse use cases, including virtual comparative scanning, raw data sharing, augmented regulatory science and metaversed medical intervention. We discuss relevant issues on the ecosystem of the MeTAI metaverse including privacy, security and disparity. We also identify specific action items for coordinated efforts to build the MeTAI metaverse for improved healthcare quality, accessibility, cost-effectiveness and patient satisfaction.
Affiliation(s)
- Ge Wang
- Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA
- Andreu Badal
- Division of Imaging, Diagnostics and Software Reliability, OSEL, CDRH, US Food and Drug Administration, Silver Spring, MD, USA
- Xun Jia
- Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD, USA
- Jonathan S. Maltz
- Molecular Imaging and Computed Tomography, GE Healthcare, Waukesha, WI, USA
- Klaus Mueller
- Computer Science Department, Stony Brook University, Stony Brook, NY, USA
- Chuang Niu
- Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA
- Pingkun Yan
- Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA
- Zhou Yu
- Canon Medical Research USA, Vernon Hills, IL, USA
- Rongping Zeng
- Division of Imaging, Diagnostics and Software Reliability, OSEL, CDRH, US Food and Drug Administration, Silver Spring, MD, USA