51. Hersh A, Mahapatra S, Weber-Levine C, Awosika T, Theodore JN, Zakaria HM, Liu A, Witham TF, Theodore N. Augmented Reality in Spine Surgery: A Narrative Review. HSS J 2021; 17:351-358. [PMID: 34539277] [PMCID: PMC8436352] [DOI: 10.1177/15563316211028595]
Abstract
Augmented reality (AR) navigation refers to novel technologies that superimpose images, such as radiographs and navigation pathways, onto a view of the operative field. The development of AR navigation has focused on improving the safety and efficacy of neurosurgical and orthopedic procedures. In this review, the authors focus on 3 types of AR technology used in spine surgery: AR surgical navigation, microscope-mediated heads-up display, and AR head-mounted displays. Microscope AR and head-mounted displays offer the advantage of reducing attention shift and line-of-sight interruptions inherent in traditional navigation systems. With the U.S. Food and Drug Administration's recent clearance of the XVision AR system (Augmedics, Arlington Heights, IL), the adoption and refinement of AR technology by spine surgeons will only accelerate.
Affiliation(s)
- Andrew Hersh: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Smruti Mahapatra: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Carly Weber-Levine: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Tolulope Awosika: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Hesham M Zakaria: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Ann Liu: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Timothy F Witham: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Nicholas Theodore: Department of Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA
52. Augmented reality-navigated pedicle screw placement: a cadaveric pilot study. Eur Spine J 2021; 30:3731-3737. [PMID: 34350487] [DOI: 10.1007/s00586-021-06950-w]
Abstract
PURPOSE Augmented reality (AR) is an emerging technology with great potential for surgical navigation through its ability to provide 3D holographic projection of otherwise hidden anatomical information. This pilot cadaver study investigated the feasibility and accuracy of one of the first holographic navigation techniques for lumbar pedicle screw placement. METHODS Lumbar computed tomography (CT) scans of two cadaver specimens and their reconstructed 3D models were used for pedicle screw trajectory planning. Planned trajectories and 3D models were subsequently uploaded to an AR head-mounted device. Randomly, k-wires were placed either into the left or the right pedicle of a vertebra (L1-5) with or without AR navigation (by holographic projection of the planned trajectory). CT scans were subsequently performed to assess the accuracy of both techniques. RESULTS A total of 18 k-wires could be placed (8 navigated, 10 freehand) by two experienced spine surgeons. In two vertebrae, the AR navigation was aborted because the registration of the preoperative plan with the intraoperative anatomy was imprecise due to a technical failure. The average differences of the screw entry points between planning and execution were 4.74 ± 2.37 mm in the freehand technique and 5.99 ± 3.60 mm in the AR-navigated technique (p = 0.39). The average deviation from the planned trajectories was 11.21° ± 7.64° in the freehand technique and 5.88° ± 3.69° in the AR-navigated technique (p = 0.09). CONCLUSION This pilot study demonstrates improved angular precision in one of the first AR-navigated pedicle screw placement studies worldwide. Technical shortcomings need to be eliminated before potential clinical applications.
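The two deviation metrics reported in entry 52 (entry-point offset in mm and angular deviation in degrees between planned and executed trajectories) reduce to a few lines of linear algebra. The sketch below is illustrative only: the arrays are placeholder values, not data from the study, and Welch's t-test is used here merely as one plausible way to compare the freehand and AR-navigated groups, since the study does not state which test was applied.

```python
import numpy as np
from scipy import stats

def entry_point_offset(planned_entry, executed_entry):
    """Euclidean distance (mm) between planned and executed entry points."""
    return float(np.linalg.norm(np.asarray(executed_entry) - np.asarray(planned_entry)))

def angular_deviation(planned_dir, executed_dir):
    """Angle (degrees) between planned and executed trajectory directions."""
    a = np.asarray(planned_dir, dtype=float)
    b = np.asarray(executed_dir, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Placeholder per-screw angular deviations (degrees) for the two techniques.
freehand_deg = np.array([14.2, 3.5, 18.9, 7.1, 12.4, 6.0, 21.3, 9.8, 11.0, 8.0])
ar_navigated_deg = np.array([4.1, 7.9, 2.3, 6.6, 9.4, 5.0, 8.2, 3.5])

# Welch's t-test (unequal variances) as one way to compare the groups.
t_stat, p_value = stats.ttest_ind(freehand_deg, ar_navigated_deg, equal_var=False)
print(f"freehand: {freehand_deg.mean():.2f} ± {freehand_deg.std(ddof=1):.2f} deg")
print(f"AR-navigated: {ar_navigated_deg.mean():.2f} ± {ar_navigated_deg.std(ddof=1):.2f} deg")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```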
53. Yanni DS, Ozgur BM, Louis RG, Shekhtman Y, Iyer RR, Boddapati V, Iyer A, Patel PD, Jani R, Cummock M, Herur-Raman A, Dang P, Goldstein IM, Brant-Zawadzki M, Steineke T, Lenke LG. Real-time navigation guidance with intraoperative CT imaging for pedicle screw placement using an augmented reality head-mounted display: a proof-of-concept study. Neurosurg Focus 2021; 51:E11. [PMID: 34333483] [DOI: 10.3171/2021.5.focus21209]
Abstract
OBJECTIVE Augmented reality (AR) has the potential to improve the accuracy and efficiency of instrumentation placement in spinal fusion surgery, increasing patient safety and outcomes, optimizing ergonomics in the surgical suite, and ultimately lowering procedural costs. The authors sought to describe the use of a commercial prototype Spine AR platform (SpineAR) that provides a commercial AR head-mounted display (ARHMD) user interface for navigation-guided spine surgery incorporating real-time navigation images from intraoperative imaging with a 3D-reconstructed model in the surgeon's field of view, and to assess screw placement accuracy via this method. METHODS Pedicle screw placement accuracy was assessed and compared with literature-reported data of the freehand (FH) technique. Accuracy with SpineAR was also compared between participants of varying spine surgical experience. Eleven operators without prior experience with AR-assisted pedicle screw placement took part in the study: 5 attending neurosurgeons and 6 trainees (1 neurosurgical fellow, 1 senior orthopedic resident, 3 neurosurgical residents, and 1 medical student). Commercially available 3D-printed lumbar spine models were utilized as surrogates of human anatomy. Among the operators, a total of 192 screws were instrumented bilaterally from L2-5 using SpineAR in 24 lumbar spine models. All but one trainee also inserted 8 screws using the FH method. In addition to accuracy scoring using the Gertzbein-Robbins grading scale, axial trajectory was assessed, and user feedback on experience with SpineAR was collected. RESULTS Based on the Gertzbein-Robbins grading scale, the overall screw placement accuracy using SpineAR among all users was 98.4% (192 screws). Accuracy for attendings and trainees was 99.1% (112 screws) and 97.5% (80 screws), respectively. Accuracy rates were higher compared with literature-reported lumbar screw placement accuracy using FH for attendings (99.1% vs 94.32%; p = 0.0212) and all users (98.4% vs 94.32%; p = 0.0099). The percentage of total inserted screws with a minimum of 5° medial angulation was 100%. No differences were observed between attendings and trainees or between the two methods. User feedback on SpineAR was generally positive. CONCLUSIONS Screw placement was feasible and accurate using SpineAR, an ARHMD platform with real-time navigation guidance that provided a favorable surgeon-user experience.
Affiliation(s)
- Daniel S Yanni: Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian, Newport Beach; Disc Comfort, Inc., Newport Beach, California
- Burak M Ozgur: Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian, Newport Beach
- Robert G Louis: Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian, Newport Beach
- Yevgenia Shekhtman: Neuroscience Institute, Hackensack Meridian JFK Medical Center, Edison
- Rajiv R Iyer: Department of Orthopedic Surgery, Columbia University
- Asha Iyer: Neuroscience Institute, Hackensack Meridian JFK Medical Center, Edison
- Purvee D Patel: Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
- Raja Jani: Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
- Matthew Cummock: Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
- Aalap Herur-Raman: George Washington University School of Medicine, Washington, DC
- Ira M Goldstein: Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
- Michael Brant-Zawadzki: Pickup Family Neurosciences Institute, Hoag Memorial Hospital Presbyterian, Newport Beach
- Thomas Steineke: Neuroscience Institute, Hackensack Meridian JFK Medical Center, Edison
- Lawrence G Lenke: Department of Orthopedic Surgery, Columbia University; Department of Neurological Surgery, NewYork-Presbyterian/Allen Hospital, New York, New York
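The accuracy comparisons in entry 53 (e.g., 99.1% of 112 screws graded acceptable versus a literature freehand benchmark of 94.32%) are proportion-versus-reference comparisons. The study does not specify which test produced its p-values, so the sketch below shows one common way to run such a comparison, an exact binomial test in SciPy (version 1.7 or later); the counts are taken from the abstract for illustration, and the resulting p-value need not match the published one.

```python
from scipy.stats import binomtest

# Counts from the abstract: attendings placed 112 screws with 99.1% graded
# clinically acceptable; the literature freehand benchmark is 94.32%.
n_screws = 112
n_acceptable = round(0.991 * n_screws)   # ~111 acceptable screws
benchmark = 0.9432                        # literature freehand accuracy

# One-sided exact binomial test: is observed accuracy greater than the benchmark?
result = binomtest(k=n_acceptable, n=n_screws, p=benchmark, alternative="greater")
print(f"observed accuracy: {n_acceptable}/{n_screws} = {n_acceptable / n_screws:.3f}")
print(f"p-value vs {benchmark:.4f} benchmark: {result.pvalue:.4f}")
```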
54. Ivan ME, Eichberg DG, Di L, Shah AH, Luther EM, Lu VM, Komotar RJ, Urakov TM. Augmented reality head-mounted display-based incision planning in cranial neurosurgery: a prospective pilot study. Neurosurg Focus 2021; 51:E3. [PMID: 34333466] [DOI: 10.3171/2021.5.focus20735]
Abstract
OBJECTIVE Monitor and wand-based neuronavigation stations (MWBNSs) for frameless intraoperative neuronavigation are routinely used in cranial neurosurgery. However, they are temporally and spatially cumbersome; the OR must be arranged around the MWBNS, at least one hand must be used to manipulate the MWBNS wand (interrupting a bimanual surgical technique), and the surgical workflow is interrupted as the surgeon stops to "check the navigation" on a remote monitor. Thus, there is need for continuous, real-time, hands-free, neuronavigation solutions. Augmented reality (AR) is poised to streamline these issues. The authors present the first reported prospective pilot study investigating the feasibility of using the OpenSight application with an AR head-mounted display to map out the borders of tumors in patients undergoing elective craniotomy for tumor resection, and to compare the degree of correspondence with MWBNS tracing. METHODS Eleven consecutive patients undergoing elective craniotomy for brain tumor resection were prospectively identified and underwent circumferential tumor border tracing at the time of incision planning by a surgeon wearing HoloLens AR glasses running the commercially available OpenSight application registered to the patient and preoperative MRI. Then, the same patient underwent circumferential tumor border tracing using the StealthStation S8 MWBNS. Postoperatively, both tumor border tracings were compared by two blinded board-certified neurosurgeons and rated as having an excellent, adequate, or poor correspondence degree based on a subjective sense of the overlap. Objective overlap area measurements were also determined. RESULTS Eleven patients undergoing craniotomy were included in the study. Five patient procedures were rated as having an excellent correspondence degree, 5 had an adequate correspondence degree, and 1 had poor correspondence. Both raters agreed on the rating in all cases. AR tracing was possible in all cases. CONCLUSIONS In this small pilot study, the authors found that AR was implementable in the workflow of a neurosurgery OR, and was a feasible method of preoperative tumor border identification for incision planning. Future studies are needed to identify strategies to improve and optimize AR accuracy.
Affiliation(s)
- Michael E Ivan: Department of Neurological Surgery, University of Miami Miller School of Medicine; Sylvester Comprehensive Cancer Center, Miami, Florida
- Daniel G Eichberg: Department of Neurological Surgery, University of Miami Miller School of Medicine
- Long Di: Department of Neurological Surgery, University of Miami Miller School of Medicine
- Ashish H Shah: Department of Neurological Surgery, University of Miami Miller School of Medicine
- Evan M Luther: Department of Neurological Surgery, University of Miami Miller School of Medicine
- Victor M Lu: Department of Neurological Surgery, University of Miami Miller School of Medicine
- Ricardo J Komotar: Department of Neurological Surgery, University of Miami Miller School of Medicine; Sylvester Comprehensive Cancer Center, Miami, Florida
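Entry 54 supplements its subjective correspondence ratings with objective overlap-area measurements between the AR-traced and MWBNS-traced tumor borders. The paper does not state how overlap was quantified; one standard option for two closed 2D tracings is a Dice coefficient on rasterized masks, sketched below with invented placeholder masks rather than study data.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two boolean masks rasterized on the same grid."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Placeholder example: two slightly offset circular tracings on a 200 x 200 mm grid.
yy, xx = np.mgrid[0:200, 0:200]
ar_trace = (xx - 100) ** 2 + (yy - 100) ** 2 <= 40 ** 2      # AR-traced border (filled)
mwbns_trace = (xx - 104) ** 2 + (yy - 102) ** 2 <= 40 ** 2   # navigation-traced border (filled)

print(f"Dice overlap: {dice_coefficient(ar_trace, mwbns_trace):.3f}")
```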
55. Hu X, Baena FRY, Cutolo F. Head-Mounted Augmented Reality Platform for Markerless Orthopaedic Navigation. IEEE J Biomed Health Inform 2021; 26:910-921. [PMID: 34115600] [DOI: 10.1109/jbhi.2021.3088442]
Abstract
Visual augmented reality (AR) has the potential to improve the accuracy, efficiency and reproducibility of computer-assisted orthopaedic surgery (CAOS). AR head-mounted displays (HMDs) further allow non-eye-shift target observation and an egocentric view. Recently, a markerless tracking and registration (MTR) algorithm was proposed to avoid the artificial markers that are conventionally pinned into the target anatomy for tracking, as their use prolongs surgical workflow, introduces human-induced errors, and necessitates additional surgical invasion in patients. However, such an MTR-based method has neither been explored for surgical applications nor integrated into current AR HMDs, making ergonomic HMD-based markerless AR CAOS navigation hard to achieve. To these aims, we present a versatile, device-agnostic and accurate HMD-based AR platform. Our software platform, supporting both video see-through (VST) and optical see-through (OST) modes, integrates two proposed fast calibration procedures using a specially designed calibration tool. According to the camera-based evaluation, our AR platform achieves a display error of 6.31 ± 2.55 arcmin for VST and 7.72 ± 3.73 arcmin for OST. A proof-of-concept markerless surgical navigation system to assist in femoral bone drilling was then developed based on the platform and Microsoft HoloLens 1. According to the user study, both VST and OST markerless navigation systems are reliable, with the OST system providing the best usability. The measured navigation error is 4.90 ± 1.04 mm, 5.96 ± 2.22° for the VST system and 4.36 ± 0.80 mm, 5.65 ± 1.42° for the OST system.
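Entry 55 reports display error in arcminutes, i.e., the angular misalignment between a rendered virtual feature and its real counterpart as seen from the user's viewpoint. Converting a lateral offset at a known viewing distance into arcminutes is a one-line trigonometric step; the sketch below uses placeholder numbers, not the paper's calibration data.

```python
import math

def misalignment_arcmin(offset_mm: float, viewing_distance_mm: float) -> float:
    """Angular overlay error in arcminutes for a lateral offset at a given distance."""
    angle_deg = math.degrees(math.atan2(offset_mm, viewing_distance_mm))
    return angle_deg * 60.0

# Placeholder example: a 1.0 mm overlay offset observed at a 500 mm working distance.
print(f"{misalignment_arcmin(1.0, 500.0):.2f} arcmin")   # roughly 6.9 arcmin
```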
56. Augmented reality in the operating room: a clinical feasibility study. BMC Musculoskelet Disord 2021; 22:451. [PMID: 34006234] [PMCID: PMC8132365] [DOI: 10.1186/s12891-021-04339-w]
Abstract
Background Augmented Reality (AR) is a rapidly emerging technology finding growing acceptance and application in different fields of surgery. Various studies have been performed evaluating the precision and accuracy of AR-guided navigation. This study investigates the feasibility of a commercially available AR head-mounted device during orthopedic surgery. Methods Thirteen orthopedic surgeons from a Swiss university clinic performed 25 orthopedic surgical procedures wearing a holographic AR headset (HoloLens, Microsoft, Redmond, WA, USA) providing complementary three-dimensional, patient-specific anatomic information. The surgeon's experience of using the device during surgery was recorded using a standardized 58-item questionnaire grading different aspects on a 100-point scale with anchor statements. Results Surgeons were generally satisfied with image quality (85 ± 17 points) and accuracy of the virtual objects (84 ± 19 points). Wearing the AR device was rated as fairly comfortable (79 ± 13 points). Functionality of voice commands (68 ± 20 points) and gestures (66 ± 20 points) provided less favorable results. The greatest potential in the use of the AR device was found for surgical correction of deformities (87 ± 15 points). Overall, surgeons were satisfied with the application of this novel technology (78 ± 20 points) and future access to it was demanded (75 ± 22 points). Conclusion AR is a rapidly evolving technology with large potential in different surgical settings, offering the opportunity to provide a compact, low-cost alternative requiring a minimum of infrastructure compared to conventional navigation systems. While surgeons were generally satisfied with the image quality of the head-mounted AR device tested here, some technical and ergonomic shortcomings were pointed out. This study serves as a proof of concept for the use of an AR head-mounted device in a real-world sterile setting in orthopedic surgery. Supplementary Information The online version contains supplementary material available at 10.1186/s12891-021-04339-w.
57. Ha J, Parekh P, Gamble D, Masters J, Jun P, Hester T, Daniels T, Halai M. Opportunities and challenges of using augmented reality and heads-up display in orthopaedic surgery: A narrative review. J Clin Orthop Trauma 2021; 18:209-215. [PMID: 34026489] [PMCID: PMC8131920] [DOI: 10.1016/j.jcot.2021.04.031]
Abstract
BACKGROUND & AIM Utilization of augmented reality (AR) and heads-up displays (HUD) to aid orthopaedic surgery has the potential to benefit surgeons and patients alike through improved accuracy, safety, and educational benefits. With the COVID-19 pandemic, the opportunity for adoption of novel technology is more relevant. The aims are to assess the technology available, to understand the current evidence regarding the benefit, and to consider challenges to implementation in clinical practice. METHODS & RESULTS PRISMA guidelines were used to filter the literature. Of 1004 articles returned, the following exclusion criteria were applied: 1) reviews/commentaries; 2) unrelated to orthopaedic surgery; 3) use of other AR wearables beyond visual aids; leaving 42 papers for review. This review illustrates benefits including enhanced accuracy and reduced time of surgery, reduced radiation exposure, and educational benefits. CONCLUSION Whilst there are obstacles to overcome, there are already reports of the technology being used. As with all novel technologies, a greater understanding of the learning curve is crucial, in addition to shielding our patients from this learning curve. Improvements in usability and implementing surgeons' specific needs should increase uptake.
Affiliation(s)
- Joon Ha: Queen Elizabeth Hospital, London, UK (corresponding author)
- James Masters: Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), UK
- Peter Jun: University of Alberta, Edmonton, Canada
- Mansur Halai: St Michael's Hospital, University of Toronto, Canada
58. Sugahara K, Koyachi M, Koyama Y, Sugimoto M, Matsunaga S, Odaka K, Abe S, Katakura A. Mixed reality and three dimensional printed models for resection of maxillary tumor: a case report. Quant Imaging Med Surg 2021; 11:2187-2194. [PMID: 33936998] [DOI: 10.21037/qims-20-597]
Abstract
In the field of oral and maxillofacial surgery, many institutions have recently begun using three-dimensional printers to create three-dimensional models and mixed reality applications for a variety of diseases. Here, we report a case in which a patient-specific model was fabricated with a three-dimensional printer from virtual surgical planning data, an application was designed for the Microsoft HoloLens, and mixed reality surgical support was used during the procedure, allowing the benign maxillary tumor and its neighboring three-dimensional structures to be appreciated while the resection was performed.
Affiliation(s)
- Keisuke Sugahara: Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan; Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
- Masahide Koyachi: Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Yu Koyama: Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan
- Maki Sugimoto: Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan; Okinaga Research Institute Innovation Lab, Teikyo University, Tokyo, Japan
- Satoru Matsunaga: Oral Health Science Center, Tokyo Dental College, Tokyo, Japan; Department of Anatomy, Tokyo Dental College, Tokyo, Japan
- Kento Odaka: Department of Oral and Maxillofacial Radiology, Tokyo Dental College, Tokyo, Japan
- Shinichi Abe: Department of Anatomy, Tokyo Dental College, Tokyo, Japan
- Akira Katakura: Department of Oral Pathobiological Science and Surgery, Tokyo Dental College, Tokyo, Japan; Oral Health Science Center, Tokyo Dental College, Tokyo, Japan
59. Tu P, Gao Y, Lungu AJ, Li D, Wang H, Chen X. Augmented reality based navigation for distal interlocking of intramedullary nails utilizing Microsoft HoloLens 2. Comput Biol Med 2021; 133:104402. [PMID: 33895460] [DOI: 10.1016/j.compbiomed.2021.104402]
Abstract
BACKGROUND AND OBJECTIVE The distal interlocking of intramedullary nail remains a technically demanding procedure. Existing augmented reality based solutions still suffer from hand-eye coordination problem, prolonged operation time, and inadequate resolution. In this study, an augmented reality based navigation system for distal interlocking of intramedullary nail is developed using Microsoft HoloLens 2, the state-of-the-art optical see-through head-mounted display. METHODS A customized registration cube is designed to assist surgeons with better depth perception when performing registration procedures. During drilling, surgeons can obtain accurate and in-situ visualization of intramedullary nail and drilling path, and dynamic navigation is enabled. An intraoperative warning system is proposed to provide intuitive feedback of real-time deviations and electromagnetic disturbances. RESULTS The preclinical phantom experiment showed that the reprojection errors along the X, Y, and Z axes were 1.55 ± 0.27 mm, 1.71 ± 0.40 mm, and 2.84 ± 0.78 mm, respectively. The end-to-end evaluation method indicated the distance error was 1.61 ± 0.44 mm, and the 3D angle error was 1.46 ± 0.46°. A cadaver experiment was also conducted to evaluate the feasibility of the system. CONCLUSION Our system has potential advantages over the 2D-screen based navigation system and the pointing device based navigation system in terms of accuracy and time consumption, and has tremendous application prospects.
Affiliation(s)
- Puxun Tu: School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yao Gao: School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Abel J Lungu: School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Dongyuan Li: School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Huixiang Wang: Department of Orthopedics, Shanghai Jiao Tong University Affiliated Sixth People's Hospital, Shanghai, China
- Xiaojun Chen: School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China
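The per-axis reprojection errors and the 3D angle error reported in entry 59 reduce to component-wise statistics on paired points and an angle between two direction vectors. A minimal sketch with placeholder measurements (not the study's data) is given below.

```python
import numpy as np

def per_axis_error_stats(planned_pts, measured_pts):
    """Mean and SD of absolute error along X, Y, Z for paired 3D points (mm)."""
    diff = np.abs(np.asarray(measured_pts) - np.asarray(planned_pts))
    return diff.mean(axis=0), diff.std(axis=0, ddof=1)

def axis_angle_error_deg(planned_axis, drilled_axis):
    """3D angle (degrees) between the planned and drilled axis directions."""
    a = planned_axis / np.linalg.norm(planned_axis)
    b = drilled_axis / np.linalg.norm(drilled_axis)
    return float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))

# Placeholder paired points (mm) and axes, for illustration only.
planned = np.array([[10.0, 20.0, 30.0], [12.0, 18.0, 33.0], [11.0, 21.0, 29.0]])
measured = np.array([[11.4, 21.8, 32.5], [13.2, 19.9, 36.1], [12.8, 22.4, 31.9]])
mean_err, sd_err = per_axis_error_stats(planned, measured)
print("per-axis error (mm):", np.round(mean_err, 2), "+/-", np.round(sd_err, 2))
print("angle error (deg):", round(axis_angle_error_deg(np.array([0, 0, 1.0]),
                                                       np.array([0.02, 0.03, 1.0])), 2))
```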
60. Effect of marker position and size on the registration accuracy of HoloLens in a non-clinical setting with implications for high-precision surgical tasks. Int J Comput Assist Radiol Surg 2021; 16:955-966. [PMID: 33856643] [PMCID: PMC8166698] [DOI: 10.1007/s11548-021-02354-9]
Abstract
Purpose Emerging holographic headsets can be used to register patient-specific virtual models obtained from medical scans with the patient’s body. Maximising accuracy of the virtual models’ inclination angle and position (ideally, ≤ 2° and ≤ 2 mm, respectively, as in currently approved navigation systems) is vital for this application to be useful. This study investigated the accuracy with which a holographic headset registers virtual models with real-world features based on the position and size of image markers. Methods HoloLens® and the image-pattern-recognition tool Vuforia Engine™ were used to overlay a 5-cm-radius virtual hexagon on a monitor’s surface in a predefined position. The headset’s camera detection of an image marker (displayed on the monitor) triggered the rendering of the virtual hexagon on the headset’s lenses. 4 × 4, 8 × 8 and 12 × 12 cm image markers displayed at nine different positions were used. In total, the position and dimensions of 114 virtual hexagons were measured on photographs captured by the headset’s camera. Results Some image marker positions and the smallest image marker (4 × 4 cm) led to larger errors in the perceived dimensions of the virtual models than other image marker positions and larger markers (8 × 8 and 12 × 12 cm). ≤ 2° and ≤ 2 mm errors were found in 70.7% and 76% of cases, respectively. Conclusion Errors obtained in a non-negligible percentage of cases are not acceptable for certain surgical tasks (e.g. the identification of correct trajectories of surgical instruments). Achieving sufficient accuracy with image marker sizes that meet surgical needs and regardless of image marker position remains a challenge. Supplementary Information The online version contains supplementary material available at 10.1007/s11548-021-02354-9.
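Entry 60 summarizes accuracy as the percentage of measurements falling within the ≤ 2° and ≤ 2 mm thresholds of currently approved navigation systems. Computing such acceptance rates from raw inclination and position errors is straightforward; the sketch below uses invented placeholder errors, not the study's 114 measurements.

```python
import numpy as np

def acceptance_rate(errors, threshold):
    """Fraction of error values at or below the clinical threshold."""
    errors = np.asarray(errors, dtype=float)
    return float(np.mean(errors <= threshold))

# Placeholder measurement errors for a set of rendered hexagons.
angle_errors_deg = np.array([0.8, 1.5, 2.4, 1.1, 3.0, 0.6, 1.9, 2.1])
position_errors_mm = np.array([1.2, 0.9, 2.6, 1.8, 1.4, 2.2, 0.7, 1.6])

print(f"within 2 deg: {acceptance_rate(angle_errors_deg, 2.0):.1%}")
print(f"within 2 mm:  {acceptance_rate(position_errors_mm, 2.0):.1%}")
```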
61. Sakai D, Joyce K, Sugimoto M, Horikita N, Hiyama A, Sato M, Devitt A, Watanabe M. Augmented, virtual and mixed reality in spinal surgery: A real-world experience. J Orthop Surg (Hong Kong) 2021; 28:2309499020952698. [PMID: 32909902] [DOI: 10.1177/2309499020952698]
Abstract
This review aims to identify the role of augmented, virtual or mixed reality (AR, VR or MR) technologies in setting of spinal surgery. The authors address the challenges surrounding the implementation of this technology in the operating room. A technical standpoint addresses the efficacy of these imaging modalities based on the current literature in the field. Ultimately, these technologies must be cost-effective to ensure widespread adoption. This may be achieved through reduced surgical times and decreased incidence of post-operative complications and revisions while maintaining equivalent safety profile to alternative surgical approaches. While current studies focus mainly on the successful placement of pedicle screws via AR-guided instrumentation, a wider scope of procedures may be assisted using AR, VR or MR technology once efficacy and safety have been validated. These emerging technologies offer a significant advantage in the guidance of complex procedures that require high precision and accuracy using minimally invasive interventions.
Affiliation(s)
- Daisuke Sakai: Department of Orthopaedic Surgery, Surgical Science, Tokai University School of Medicine, Isehara, Kanagawa, Japan
- Kieran Joyce: SFI Research Centre for Medical Devices, National University of Ireland, Galway, Ireland; Department of Orthopaedic Surgery, School of Medicine, National University of Ireland, Galway, Ireland
- Maki Sugimoto: Innovation Lab, Teikyo University Okinaga Research Institute, Tokyo, Japan
- Natsumi Horikita: Department of Orthopaedic Surgery, Surgical Science, Tokai University School of Medicine, Isehara, Kanagawa, Japan
- Akihiko Hiyama: Department of Orthopaedic Surgery, Surgical Science, Tokai University School of Medicine, Isehara, Kanagawa, Japan
- Masato Sato: Department of Orthopaedic Surgery, Surgical Science, Tokai University School of Medicine, Isehara, Kanagawa, Japan
- Aiden Devitt: Department of Orthopaedic Surgery, School of Medicine, National University of Ireland, Galway, Ireland
- Masahiko Watanabe: Department of Orthopaedic Surgery, Surgical Science, Tokai University School of Medicine, Isehara, Kanagawa, Japan
62. Casari FA, Navab N, Hruby LA, Kriechling P, Nakamura R, Tori R, de Lourdes Dos Santos Nunes F, Queiroz MC, Fürnstahl P, Farshad M. Augmented Reality in Orthopedic Surgery Is Emerging from Proof of Concept Towards Clinical Studies: a Literature Review Explaining the Technology and Current State of the Art. Curr Rev Musculoskelet Med 2021; 14:192-203. [PMID: 33544367] [PMCID: PMC7990993] [DOI: 10.1007/s12178-021-09699-3]
Abstract
PURPOSE OF REVIEW Augmented reality (AR) is becoming increasingly popular in modern-day medicine. Computer-driven tools are progressively integrated into clinical and surgical procedures. The purpose of this review was to provide a comprehensive overview of the current technology and its challenges based on recent literature mainly focusing on clinical, cadaver, and innovative sawbone studies in the field of orthopedic surgery. The most relevant literature was selected according to clinical and innovational relevance and is summarized. RECENT FINDINGS Augmented reality applications in orthopedic surgery are increasingly reported. In this review, we summarize basic principles of AR including data preparation, visualization, and registration/tracking and present recently published clinical applications in the area of spine, osteotomies, arthroplasty, trauma, and orthopedic oncology. Higher accuracy in surgical execution, reduction of radiation exposure, and decreased surgery time are major findings presented in the literature. In light of the tremendous progress of technological developments in modern-day medicine and emerging numbers of research groups working on the implementation of AR in routine clinical procedures, we expect the AR technology soon to be implemented as standard devices in orthopedic surgery.
Affiliation(s)
- Fabio A Casari: Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; ROCS, Research in Orthopedic Computer Science, Balgrist Campus, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Nassir Navab: Computer Aided Medical Procedures (CAMP), Technische Universität München, Munich, Germany; Computer Aided Medical Procedures (CAMP), Johns Hopkins University, Baltimore, MD, USA
- Laura A Hruby: Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Department of Orthopaedics and Trauma Surgery, Medical University of Vienna, Vienna, Austria
- Philipp Kriechling: Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Ricardo Nakamura: Computer Engineering and Digital Systems Department, Escola Politécnica, Universidade de São Paulo, São Paulo, SP, Brazil
- Romero Tori: Computer Engineering and Digital Systems Department, Escola Politécnica, Universidade de São Paulo, São Paulo, SP, Brazil
- Marcelo C Queiroz: Orthopedics and Traumatology Department, Faculty of Medical Sciences of Santa Casa de Sao Paulo, Sao Paulo, SP, Brazil
- Philipp Fürnstahl: ROCS, Research in Orthopedic Computer Science, Balgrist Campus, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
- Mazda Farshad: Department of Orthopedic Surgery, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
63. Burström G, Persson O, Edström E, Elmi-Terander A. Augmented reality navigation in spine surgery: a systematic review. Acta Neurochir (Wien) 2021; 163:843-852. [PMID: 33506289] [PMCID: PMC7886712] [DOI: 10.1007/s00701-021-04708-3]
Abstract
BACKGROUND Conventional spinal navigation solutions have been criticized for having a negative impact on time in the operating room and workflow. AR navigation could potentially alleviate some of these concerns while retaining the benefits of navigated spine surgery. The objective of this study is to summarize the current evidence for using augmented reality (AR) navigation in spine surgery. METHODS We performed a systematic review to explore the current evidence for using AR navigation in spine surgery. PubMed and Web of Science were searched from database inception to November 27, 2020, for data on the AR navigation solutions; the reported efficacy of the systems; and their impact on workflow, radiation, and cost-benefit relationships. RESULTS In this systematic review, 28 studies were included in the final analysis. The main findings were superior workflow and non-inferior accuracy when comparing AR to free-hand (FH) or conventional surgical navigation techniques. A limited number of studies indicated decreased use of radiation. There were no studies reporting mortality, morbidity, or cost-benefit relationships. CONCLUSIONS AR provides a meaningful addition to FH surgery and traditional navigation methods for spine surgery. However, the current evidence base is limited and prospective studies on clinical outcomes and cost-benefit relationships are needed.
64. Fotouhi J, Mehrfard A, Song T, Johnson A, Osgood G, Unberath M, Armand M, Navab N. Development and Pre-Clinical Analysis of Spatiotemporal-Aware Augmented Reality in Orthopedic Interventions. IEEE Trans Med Imaging 2021; 40:765-778. [PMID: 33166252] [PMCID: PMC8317976] [DOI: 10.1109/tmi.2020.3037013]
Abstract
Suboptimal interaction with patient data and challenges in mastering 3D anatomy based on ill-posed 2D interventional images are essential concerns in image-guided therapies. Augmented reality (AR) has been introduced in the operating rooms in the last decade; however, in image-guided interventions, it has often only been considered as a visualization device improving traditional workflows. As a consequence, the technology is gaining minimum maturity that it requires to redefine new procedures, user interfaces, and interactions. The main contribution of this paper is to reveal how exemplary workflows are redefined by taking full advantage of head-mounted displays when entirely co-registered with the imaging system at all times. The awareness of the system from the geometric and physical characteristics of X-ray imaging allows the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for placing K-wire in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared the results with the outcomes from baseline standard operative and non-immersive AR procedures, which had yielded errors of [4.61 mm, 4.76°, 4.77°] and [5.13 mm, 1.78°, 1.43°], respectively, for wire placement, and abduction and anteversion during THA. We hope that our holistic approach towards improving the interface of surgery not only augments the surgeon's capabilities but also augments the surgical team's experience in carrying out an effective intervention with reduced complications, and provides novel approaches for documenting procedures for training purposes.
65. Augmented Reality Based Surgical Navigation of Complex Pelvic Osteotomies—A Feasibility Study on Cadavers. Appl Sci (Basel) 2021. [DOI: 10.3390/app11031228]
Abstract
Augmented reality (AR)-based surgical navigation may offer new possibilities for safe and accurate surgical execution of complex osteotomies. In this study we investigated the feasibility of navigating the periacetabular osteotomy of Ganz (PAO), known as one of the most complex orthopedic interventions, on two cadaveric pelves under realistic operating room conditions. Preoperative planning was conducted on computed tomography (CT)-reconstructed 3D models using an in-house developed software, which allowed creating cutting plane objects for planning of the osteotomies and reorientation of the acetabular fragment. An AR application was developed comprising point-based registration, motion compensation and guidance for osteotomies as well as fragment reorientation. Navigation accuracy was evaluated on CT-reconstructed 3D models, resulting in an error of 10.8 mm for osteotomy starting points and 5.4° for osteotomy directions. The reorientation errors were 6.7°, 7.0° and 0.9° for the x-, y- and z-axis, respectively. Average postoperative error of LCE angle was 4.5°. Our study demonstrated that the AR-based execution of complex osteotomies is feasible. Fragment realignment navigation needs further improvement, although it is more accurate than the state of the art in PAO surgery.
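Entry 65 reports osteotomy errors as a starting-point distance, an angle between planned and executed cutting-plane directions, and per-axis (x, y, z) errors of the fragment reorientation. Assuming planned and achieved orientations are available as rotations, the per-axis errors can be read off as Euler angles of the relative rotation; the sketch below (SciPy Rotation, placeholder values) illustrates this decomposition and is not the authors' implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def plane_angle_error_deg(planned_normal, executed_normal):
    """Angle (degrees) between planned and executed cutting-plane normals."""
    a = planned_normal / np.linalg.norm(planned_normal)
    b = executed_normal / np.linalg.norm(executed_normal)
    return float(np.degrees(np.arccos(np.clip(abs(np.dot(a, b)), -1.0, 1.0))))

def reorientation_errors_deg(planned_rot: R, achieved_rot: R):
    """Per-axis (x, y, z) Euler angles of the rotation between plan and result."""
    relative = achieved_rot * planned_rot.inv()
    return relative.as_euler("xyz", degrees=True)

# Placeholder rotations: planned vs achieved acetabular fragment reorientation.
planned = R.from_euler("xyz", [25.0, 10.0, 5.0], degrees=True)
achieved = R.from_euler("xyz", [31.0, 17.0, 5.8], degrees=True)

print("plane angle error (deg):",
      round(plane_angle_error_deg(np.array([0, 0, 1.0]), np.array([0.05, 0.08, 1.0])), 1))
print("per-axis reorientation error (deg):",
      np.round(reorientation_errors_deg(planned, achieved), 1))
```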
66. Manni F, Mamprin M, Holthuizen R, Shan C, Burström G, Elmi-Terander A, Edström E, Zinger S, de With PHN. Multi-view 3D skin feature recognition and localization for patient tracking in spinal surgery applications. Biomed Eng Online 2021; 20:6. [PMID: 33413426] [PMCID: PMC7792004] [DOI: 10.1186/s12938-020-00843-7]
Abstract
BACKGROUND Minimally invasive spine surgery is dependent on accurate navigation. Computer-assisted navigation is increasingly used in minimally invasive surgery (MIS), but current solutions require the use of reference markers in the surgical field for both patient and instruments tracking. PURPOSE To improve reliability and facilitate clinical workflow, this study proposes a new marker-free tracking framework based on skin feature recognition. METHODS Maximally Stable Extremal Regions (MSER) and Speeded Up Robust Feature (SURF) algorithms are applied for skin feature detection. The proposed tracking framework is based on a multi-camera setup for obtaining multi-view acquisitions of the surgical area. Features can then be accurately detected using MSER and SURF and afterward localized by triangulation. The triangulation error is used for assessing the localization quality in 3D. RESULTS The framework was tested on a cadaver dataset and in eight clinical cases. The detected features for the entire patient datasets were found to have an overall triangulation error of 0.207 mm for MSER and 0.204 mm for SURF. The localization accuracy was compared to a system with conventional markers, serving as a ground truth. An average accuracy of 0.627 and 0.622 mm was achieved for MSER and SURF, respectively. CONCLUSIONS This study demonstrates that skin feature localization for patient tracking in a surgical setting is feasible. The technology shows promising results in terms of detected features and localization accuracy. In the future, the framework may be further improved by exploiting extended feature processing using modern optical imaging techniques for clinical applications where patient tracking is crucial.
Affiliation(s)
- Francesca Manni: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Marco Mamprin: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Caifeng Shan: Shandong University of Science and Technology, Qingdao, China
- Gustav Burström: Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
- Adrian Elmi-Terander: Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
- Erik Edström: Department of Neurosurgery, Karolinska University Hospital and Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
- Svitlana Zinger: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Peter H N de With: Department of Electrical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
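Entry 66 combines MSER and SURF feature detection with multi-camera triangulation for marker-free patient tracking. The sketch below shows the general pattern with OpenCV: detect features in two synchronized views, pair them, and triangulate. It is a simplified outline under stated assumptions (the image paths, projection matrices, and the naive pairing step are placeholders), not the authors' implementation; note that SURF lives in the non-free opencv-contrib build, so MSER from the main OpenCV package is used here.

```python
import cv2
import numpy as np

def detect_keypoints(gray):
    """Detect skin-feature candidates with MSER; centroids serve as point features."""
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)
    return np.array([r.mean(axis=0) for r in regions], dtype=np.float64)

def triangulate(P1, P2, pts1, pts2):
    """Triangulate matched 2D points (N x 2) from two views into 3D (N x 3)."""
    homog = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # 4 x N homogeneous
    return (homog[:3] / homog[3]).T

# Placeholder inputs: two synchronized views and their 3x4 projection matrices.
img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)
assert img1 is not None and img2 is not None, "replace with real image paths"
P1 = np.hstack([np.eye(3), np.zeros((3, 1))]).astype(np.float64)
P2 = np.hstack([np.eye(3), np.array([[-50.0], [0.0], [0.0]])]).astype(np.float64)

kp1 = detect_keypoints(img1)
kp2 = detect_keypoints(img2)
# A real system matches features across views (e.g., descriptor matching);
# here the first few detections are simply paired to keep the sketch short.
n = min(len(kp1), len(kp2), 10)
assert n > 0, "no features detected in one of the views"
points_3d = triangulate(P1, P2, kp1[:n], kp2[:n])
print(points_3d.shape)
```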
67. Yuk FJ, Maragkos GA, Sato K, Steinberger J. Current innovation in virtual and augmented reality in spine surgery. Ann Transl Med 2021; 9:94. [PMID: 33553387] [PMCID: PMC7859743] [DOI: 10.21037/atm-20-1132]
Abstract
In spinal surgery, outcomes are directly related both to patient and procedure selection, as well as the accuracy and precision of instrumentation placed. Poorly placed instrumentation can lead to spinal cord, nerve root or vascular injury. Traditionally, spine surgery was performed by open methods and placement of instrumentation under direct visualization. However, minimally invasive surgery (MIS) has seen substantial advances in spine, with an ever-increasing range of indications and procedures. For these reasons, novel methods to visualize anatomy and precisely guide surgery, such as intraoperative navigation, are extremely useful in this field. In this review, we present the recent advances and innovations utilizing simulation methods in spine surgery. The application of these techniques is still relatively new, however quickly being integrated in and outside the operating room. These include virtual reality (VR) (where the entire simulation is virtual), mixed reality (MR) (a combination of virtual and physical components), and augmented reality (AR) (the superimposition of a virtual component onto physical reality). VR and MR have primarily found applications in a teaching and preparatory role, while AR is mainly applied in hands-on surgical settings. The present review attempts to provide an overview of the latest advances and applications of these methods in the neurosurgical spine setting.
Affiliation(s)
- Frank J Yuk: Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Georgios A Maragkos: Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
- Kosuke Sato: Hospital for Special Surgery, New York, NY, USA
- Jeremy Steinberger: Department of Neurosurgery, Mount Sinai Hospital, Icahn School of Medicine at Mount Sinai, New York, NY, USA
68. Kim Y, Groombridge C, Romero L, Clare S, Fitzgerald MC. Decision Support Capabilities of Telemedicine in Emergency Prehospital Care: Systematic Review. J Med Internet Res 2020; 22:e18959. [PMID: 33289672] [PMCID: PMC7755537] [DOI: 10.2196/18959]
Abstract
Background Telemedicine offers a unique opportunity to improve coordination and administration for urgent patient care remotely. In an emergency setting, it has been used to support first responders by providing telephone or video consultation with specialists at hospitals and through the exchange of prehospital patient information. This technological solution is evolving rapidly, yet there is a concern that it is being implemented without a demonstrated clinical need and effectiveness as well as without a thorough economic evaluation. Objective Our objective is to systematically review whether the clinical outcomes achieved, as reported in the literature, favor telemedicine decision support for medical interventions during prehospital care. Methods This systematic review included peer-reviewed journal articles. Searches of 7 databases and relevant reviews were conducted. Eligibility criteria consisted of studies that covered telemedicine as data- and information-sharing and two-way teleconsultation platforms, with the objective of supporting medical decisions (eg, diagnosis, treatment, and receiving hospital decision) in a prehospital emergency setting. Simulation studies and studies that included pediatric populations were excluded. The procedures in this review followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement. The Risk Of Bias In Non-randomised Studies–of Interventions (ROBINS-I) tool was used for the assessment of risk of bias. The results were synthesized based on predefined aspects of medical decisions that are made in a prehospital setting, which include diagnostic decision support, receiving facility decisions, and medical directions for treatment. All data extractions were done by at least two reviewers independently. Results Out of 42 full-text reviews, 7 were found eligible. Diagnostic support and medical direction and decision for treatments were often reported. A key finding of this review was the high agreement between prehospital diagnoses via telemedicine and final in-hospital diagnoses, as supported by quantitative evidence. However, a majority of the articles described the clinical value of having access to remote experts without robust quantitative data. Most telemedicine solutions were evaluated within a feasibility or short-term preliminary study. In general, the results were positive for telemedicine use; however, biases, due to preintervention confounding factors and a lack of documentation on quality assurance and protocol for telemedicine activation, make it difficult to determine the direct effect on patient outcomes. Conclusions The information-sharing capacity of telemedicine enables access to remote experts to support medical decision making on scene or in prolonged field care. The influence of human and technology factors on patient care is poorly understood and documented.
Affiliation(s)
- Yesul Kim: National Trauma Research Institute, Melbourne, Australia; Monash University, Melbourne, Australia; Trauma Services, Alfred Health, Melbourne, Australia
- Christopher Groombridge: National Trauma Research Institute, Melbourne, Australia; Monash University, Melbourne, Australia; Trauma Services, Alfred Health, Melbourne, Australia
- Lorena Romero: The Ian Potter Library, Alfred Health, Melbourne, Australia
- Steven Clare: National Trauma Research Institute, Melbourne, Australia; Trauma Services, Alfred Health, Melbourne, Australia
- Mark Christopher Fitzgerald: National Trauma Research Institute, Melbourne, Australia; Monash University, Melbourne, Australia; Trauma Services, Alfred Health, Melbourne, Australia
69. McKnight RR, Pean CA, Buck JS, Hwang JS, Hsu JR, Pierrie SN. Virtual Reality and Augmented Reality-Translating Surgical Training into Surgical Technique. Curr Rev Musculoskelet Med 2020; 13:663-674. [PMID: 32779019] [PMCID: PMC7661680] [DOI: 10.1007/s12178-020-09667-3]
Abstract
PURPOSE OF REVIEW As immersive learning outside of the operating room is increasingly recognized as a valuable method of surgical training, virtual reality (VR) and augmented reality (AR) are increasingly utilized in orthopedic surgical training. This article reviews the evolving nature of these training tools and provides examples of their use and efficacy. The practical and ethical implications of incorporating this technology and its impact on both orthopedic surgeons and their patients are also discussed. RECENT FINDINGS Head-mounted displays (HMDs) represent a possible adjunct to surgical accuracy and education. While the hardware is advanced, there is still much work to be done in developing software that allows for seamless, reliable, useful integration into clinical practice and training. Surgical training is changing: AR and VR will become mainstays of future training efforts. More evidence is needed to determine which training technology translates to improved clinical performance. Volatility within the HMD industry will likely delay advances in surgical training.
Affiliation(s)
- R Randall McKnight: Department of Orthopaedic Surgery, Atrium Health Musculoskeletal Institute, 1001 Blythe Blvd, Charlotte, NC, 28203, USA
- Christian A Pean: Department of Orthopedic Surgery, NYU Langone Health, New York, NY, USA
- J Stewart Buck: Department of Orthopaedic Surgery, Atrium Health Musculoskeletal Institute, 1001 Blythe Blvd, Charlotte, NC, 28203, USA
- John S Hwang: Department of Orthopedic Surgery, Mount Carmel, Columbus, OH, USA; Department of Orthopedic Surgery, Orthopedic ONE, Columbus, OH, USA
- Joseph R Hsu: Department of Orthopaedic Surgery, Atrium Health Musculoskeletal Institute, 1001 Blythe Blvd, Charlotte, NC, 28203, USA
- Sarah N Pierrie: Department of Orthopaedics and Center for the Intrepid, San Antonio Military Medical Center, Fort Sam Houston, TX, USA
70. Heinrich F, Schwenderling L, Joeres F, Lawonn K, Hansen C. Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion. IEEE Trans Vis Comput Graph 2020; 26:3568-3575. [PMID: 33006930] [DOI: 10.1109/tvcg.2020.3023637]
Abstract
Augmented reality (AR) may be a useful technique to overcome issues of conventionally used navigation systems supporting medical needle insertions, like increased mental workload and complicated hand-eye coordination. Previous research primarily focused on the development of AR navigation systems designed for specific display devices, but differences between the employed methods have not been investigated before. To this end, a user study involving a needle insertion task was conducted comparing different AR display techniques with a monitor-based approach as baseline condition for the visualization of navigation information. A video see-through stationary display, an optical see-through head-mounted display and a spatial AR projector-camera system were investigated in this comparison. Results suggest advantages of using projected navigation information in terms of lower task completion time, lower angular deviation and affirmative subjective participant feedback. Techniques requiring an intermediate view on screens, i.e., the stationary display and the baseline condition, showed less favorable results. Thus, benefits of providing AR navigation information compared with a conventionally used method could be identified. Significant results in the objective measures, as well as an identification of advantages and disadvantages of the individual display techniques, contribute to the development and design of improved needle navigation systems.
71. Baba S, Kawaguchi K, Itamoto K, Watanabe T, Hayashida M, Mae T, Nakashima Y, Kato G. Use of an inertial measurement unit sensor in pedicle screw placement improves trajectory accuracy. PLoS One 2020; 15:e0242512. [PMID: 33196657] [PMCID: PMC7668595] [DOI: 10.1371/journal.pone.0242512]
Abstract
Ascertaining the accuracy of pedicle screw (PS) trajectories is important, as PS malpositioning can cause critical complications. We aimed to determine the angle range over which estimation is unreliable; to build a low-cost PS placement support system that uses an inertial measurement unit (IMU) to enable monitoring of surgical tools and PS trajectories; and to determine the situations where IMU support would be most beneficial. In the PS insertion experiments, we used cadaver samples that included lumbar porcine spines. Computed tomography images obtained before and after PS insertion were reviewed. Offsets between the planned and implanted PS trajectories in the freehand and IMU-assisted groups were analyzed. PS cortical bone breaches were classified according to the Gertzbein and Robbins criteria (GRC). Additional experiments with head-down tilted samples were performed, in which we expected decreased rostro-caudal rotational accuracy of the PS based on the angle estimation results. Evaluation of PS trajectory accuracy revealed no significant advantage of IMU-assisted rostro-caudal rotational accuracy versus freehand accuracy. According to the GRC, IMU assistance significantly increased the rate of clinically acceptable PS positions (RoCA) compared with the freehand technique. In the head-down tilted sample experiments, IMU assistance provided increased accuracy in both rostro-caudal and medial rotation when compared with the freehand technique. In the freehand group, RoCA was significantly decreased in samples with rostral tilting relative to that in samples without. However, in the IMU-assisted group, no significant difference in RoCA between the samples with and without head-down tilting was observed. Even when the planned PS medial and/or rostro-caudal rotational angle was relatively large and difficult to reproduce manually, IMU support helped maintain PS trajectory accuracy and positioning safety. IMU assistance in PS placement was more beneficial, especially for larger rostro-caudal and/or medial rotational pedicle angles.
Collapse
Affiliation(s)
- Satoshi Baba
- Department of Spine Surgery, Saga Medical Center, Koseikan, Saga, Japan
- Trauma Center, Saga Medical Center, Koseikan, Saga, Japan
- Department of Orthopedic Surgery, Kyushu University Graduate School of Medical Sciences, Fukuoka, Japan
| | - Kenichi Kawaguchi
- Department of Orthopedic Surgery, Kyushu University Graduate School of Medical Sciences, Fukuoka, Japan
| | - Kazuhito Itamoto
- Department of Small Animal Clinical Science, Joint Faculty of Veterinary Medicine, Yamaguchi University, Yamaguchi, Japan
| | - Takeshi Watanabe
- Department of Orthopedic Surgery, Watanabe Orthopedic Hospital, Itoshima, Fukuoka, Japan
| | - Mitsumasa Hayashida
- Department of Spine Surgery, Saga Medical Center, Koseikan, Saga, Japan
- Trauma Center, Saga Medical Center, Koseikan, Saga, Japan
- Department of Orthopedic Surgery, Kyushu University Graduate School of Medical Sciences, Fukuoka, Japan
| | - Takao Mae
- Trauma Center, Saga Medical Center, Koseikan, Saga, Japan
- Department of Orthopedic Surgery, Saga Medical Center, Koseikan, Saga, Japan
| | - Yasuharu Nakashima
- Department of Orthopedic Surgery, Kyushu University Graduate School of Medical Sciences, Fukuoka, Japan
| | - Go Kato
- Department of Spine Surgery, Saga Medical Center, Koseikan, Saga, Japan
- Trauma Center, Saga Medical Center, Koseikan, Saga, Japan
- Department of Orthopedic Surgery, Fukuoka Red Cross Hospital, Fukuoka, Japan
| |
Collapse
|
72
|
Value of the surgeon's sightline on hologram registration and targeting in mixed reality. Int J Comput Assist Radiol Surg 2020; 15:2027-2039. [PMID: 32984934 PMCID: PMC7671978 DOI: 10.1007/s11548-020-02263-3] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/28/2020] [Accepted: 09/14/2020] [Indexed: 12/12/2022]
Abstract
Purpose Mixed reality (MR) is being evaluated as a visual tool for surgical navigation. Current literature presents unclear results on intraoperative accuracy using the Microsoft HoloLens 1®. This study aims to assess the impact of the surgeon's sightline in an inside-out, marker-based MR navigation system for open surgery. Methods Surgeons at Akershus University Hospital tested this system. A custom-made phantom containing 18 wire target crosses within its inner walls was used. A CT scan was obtained in order to segment all wire targets into a single 3D model (hologram). An in-house software application (CTrue), developed for the Microsoft HoloLens 1, uploaded the 3D models and automatically registered the 3D model with the phantom. Based on the surgeon's sightline while registering and targeting (free sightline, F, or strictly perpendicular sightline, P), four scenarios were defined (FF, PF, FP, PP). Target error distance (TED) was obtained in three working axes (X, Y, Z).
Results Six surgeons (5 male, aged 29–62) were enrolled. A total of 864 measurements were collected across the 4 scenarios, each tested twice. Scenario PP showed the smallest TED in the X, Y and Z axes (mean = 2.98 ± 1.33 mm, 2.28 ± 1.45 mm and 2.78 ± 1.91 mm, respectively). Scenario FF showed the largest TED in the X, Y and Z axes (mean = 10.03 ± 3.19 mm, 6.36 ± 3.36 mm and 16.11 ± 8.91 mm, respectively). Multiple comparison tests, grouped by scenario and axis, showed that the majority of scenario comparisons had significantly different TED values (p < 0.05). The Y-axis always presented the smallest TED regardless of the scenario tested. Conclusion A strictly perpendicular working sightline in relation to the 3D model achieves the best accuracy. Shortcomings in this technology as an intraoperative visual cue can be overcome by sightline correction. Incidentally, this is the preferred working angle for open surgery.
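For illustration only, per-axis TED values of this kind can be summarized as the mean ± SD of the absolute differences between reference and measured target positions; the data below are invented and the function is a generic sketch, not the study's analysis.

```python
import numpy as np

def ted_per_axis(targets_mm, measured_mm):
    """Mean and SD of the absolute per-axis target error distance (TED)."""
    err = np.abs(np.asarray(measured_mm, float) - np.asarray(targets_mm, float))
    return err.mean(axis=0), err.std(axis=0, ddof=1)

# Hypothetical data: 4 targets, columns are the X, Y, Z axes (mm).
targets = np.array([[10.0, 20.0, 5.0],
                    [12.0, 18.0, 7.0],
                    [ 9.0, 22.0, 6.0],
                    [11.0, 21.0, 4.0]])
measured = targets + np.array([[2.5, 1.8, 3.0],
                               [3.1, 2.2, 2.4],
                               [2.8, 1.5, 3.3],
                               [3.4, 2.6, 2.1]])
mean_ted, sd_ted = ted_per_axis(targets, measured)
print(mean_ted, sd_ted)
```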
Collapse
|
73
|
Chang M, Canseco JA, Nicholson KJ, Patel N, Vaccaro AR. The Role of Machine Learning in Spine Surgery: The Future Is Now. Front Surg 2020; 7:54. [PMID: 32974382 PMCID: PMC7472375 DOI: 10.3389/fsurg.2020.00054] [Citation(s) in RCA: 49] [Impact Index Per Article: 12.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/14/2020] [Accepted: 07/13/2020] [Indexed: 12/12/2022] Open
Abstract
The recent influx of machine learning-centered investigations in the spine surgery literature has led to increased enthusiasm about the prospect of using artificial intelligence to create clinical decision support tools, optimize postoperative outcomes, and improve technologies used in the operating room. However, the methodology underlying machine learning in spine research is often overlooked, as the subject matter is quite novel and may be foreign to practicing spine surgeons. Improper application of machine learning is a significant bioethics challenge, given the potential consequences of over- or underestimating the results of such studies for clinical decision-making. Proper peer review of these publications requires a baseline familiarity with the language of machine learning and with how it differs from classical statistical analysis. This narrative review first introduces the overall field of machine learning and its role in artificial intelligence, and defines basic terminology. Common modalities for applying machine learning, including classification and regression decision trees, support vector machines, and artificial neural networks, are then examined in the context of examples gathered from the spine literature. Lastly, the ethical challenges associated with adapting machine learning for research related to patient care, as well as future perspectives on the potential use of machine learning in spine surgery, are discussed.
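As a toy illustration of the classification-tree modality mentioned above (not an example drawn from the reviewed studies), the sketch below fits a scikit-learn decision tree to a synthetic cohort; every feature, value, and outcome rule is invented for demonstration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic cohort: age, BMI, preoperative ODI score (all hypothetical).
n = 500
X = np.column_stack([
    rng.normal(60, 12, n),   # age (years)
    rng.normal(28, 5, n),    # BMI
    rng.normal(45, 15, n),   # preoperative ODI
])
# Invented rule generating a binary "good outcome" label, plus 10% label noise.
y = ((X[:, 2] < 50) & (X[:, 1] < 32)).astype(int)
y ^= rng.random(n) < 0.1

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```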
Collapse
Affiliation(s)
- Michael Chang
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| | - Jose A. Canseco
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| | | | - Neil Patel
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| | - Alexander R. Vaccaro
- Department of Orthopaedic Surgery, Thomas Jefferson University, Philadelphia, PA, United States
- Rothman Orthopaedic Institute, Philadelphia, PA, United States
| |
Collapse
|
74
|
Sun Q, Mai Y, Yang R, Ji T, Jiang X, Chen X. Fast and accurate online calibration of optical see-through head-mounted display for AR-based surgical navigation using Microsoft HoloLens. Int J Comput Assist Radiol Surg 2020; 15:1907-1919. [DOI: 10.1007/s11548-020-02246-4] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2020] [Accepted: 08/07/2020] [Indexed: 11/24/2022]
|
75
|
Salmas M, Fiska A, Vassiou A, Demesticha T, Paraskevas G, Protogerou V, Chytas D. Letter to the Editor Regarding "Enhancing Reality: A Systematic Review of Augmented Reality in Neuronavigation and Education". World Neurosurg 2020; 140:430-431. [PMID: 32797957 DOI: 10.1016/j.wneu.2020.04.213] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/26/2020] [Accepted: 04/27/2020] [Indexed: 11/29/2022]
Affiliation(s)
- Marios Salmas
- Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
| | - Aliki Fiska
- Department of Anatomy, School of Medicine, Democritus University of Thrace, Alexandroupolis, Greece
| | - Aikaterini Vassiou
- Department of Anatomy, Faculty of Medicine, University of Thessaly, Larissa, Greece
| | - Theano Demesticha
- Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
| | - Georgios Paraskevas
- Department of Anatomy and Surgical Anatomy, Faculty of Medicine, Aristotle University of Thessaloniki, Thessaloniki, Greece
| | - Vassilios Protogerou
- Department of Anatomy, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
| | - Dimitrios Chytas
- Department of Anatomy, School of Medicine, European University of Cyprus, Engomi, Nicosia, Cyprus; 2nd Orthopedic Department, School of Medicine, National and Kapodistrian University of Athens, Nea Ionia, Greece.
| |
Collapse
|
76
|
Manni F, Elmi-Terander A, Burström G, Persson O, Edström E, Holthuizen R, Shan C, Zinger S, van der Sommen F, de With PHN. Towards Optical Imaging for Spine Tracking without Markers in Navigated Spine Surgery. SENSORS (BASEL, SWITZERLAND) 2020; 20:E3641. [PMID: 32610555 PMCID: PMC7374436 DOI: 10.3390/s20133641] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/19/2020] [Revised: 06/13/2020] [Accepted: 06/22/2020] [Indexed: 12/18/2022]
Abstract
Surgical navigation systems are increasingly used for complex spine procedures to avoid neurovascular injuries and minimize the risk of reoperation. Accurate patient tracking is one of the prerequisites for optimal motion compensation and navigation. Most current optical tracking systems use dynamic reference frames (DRFs) attached to the spine for patient movement tracking. However, the spine itself is subject to intrinsic movements that can impact the accuracy of the navigation system. In this study, we aimed to detect actual patient spine features in different image views captured by the optical cameras of an augmented reality surgical navigation (ARSN) system. Using optical images from open spinal surgery cases, acquired by two gray-scale cameras, spinal landmarks were identified and matched across camera views. A computer vision framework was created for preprocessing the spine images and for detecting and matching local invariant image regions. We compared four feature detection algorithms, Speeded-Up Robust Features (SURF), Maximally Stable Extremal Regions (MSER), Features from Accelerated Segment Test (FAST), and Oriented FAST and Rotated BRIEF (ORB), to identify the best approach. The framework was validated in 23 patients, and the 3D triangulation error of the matched features was < 0.5 mm. The findings thus indicate that spine feature detection can be used for accurate tracking in navigated surgery.
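As an illustration of the ORB branch of such a pipeline (not the authors' framework), the sketch below detects and matches ORB keypoints between two synthetic gray-scale "views" with OpenCV; in practice the inputs would be the calibrated camera images, and the matched coordinates would then be triangulated to 3D.

```python
import numpy as np
import cv2

# Synthetic stand-ins for two gray-scale camera views: a textured image and a shifted copy.
rng = np.random.default_rng(0)
img_left = (rng.random((480, 640)) * 255).astype(np.uint8)
img_right = np.roll(img_left, shift=15, axis=1)  # crude simulation of a second viewpoint

orb = cv2.ORB_create(nfeatures=1000)
kp_l, des_l = orb.detectAndCompute(img_left, None)
kp_r, des_r = orb.detectAndCompute(img_right, None)

# Brute-force matching with Hamming distance (suited to binary ORB descriptors).
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_l, des_r), key=lambda m: m.distance)

# Matched 2D coordinates; with calibrated projection matrices these could be
# triangulated to 3D (e.g. with cv2.triangulatePoints) to assess tracking accuracy.
pts_l = np.float32([kp_l[m.queryIdx].pt for m in matches[:50]])
pts_r = np.float32([kp_r[m.trainIdx].pt for m in matches[:50]])
print(f"{len(matches)} matches; best Hamming distance {matches[0].distance:.0f}")
```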
Collapse
Affiliation(s)
- Francesca Manni
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; (S.Z.); (F.v.d.S.); (P.H.N.d.W.)
| | - Adrian Elmi-Terander
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm SE-171 46, Sweden & Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden; (A.E.-T.); (G.B.); (O.P.); (E.E.)
| | - Gustav Burström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm SE-171 46, Sweden & Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden; (A.E.-T.); (G.B.); (O.P.); (E.E.)
| | - Oscar Persson
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm SE-171 46, Sweden & Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden; (A.E.-T.); (G.B.); (O.P.); (E.E.)
| | - Erik Edström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm SE-171 46, Sweden & Department of Neurosurgery, Karolinska University Hospital, SE-171 46 Stockholm, Sweden; (A.E.-T.); (G.B.); (O.P.); (E.E.)
| | | | - Caifeng Shan
- Philips Research, High Tech Campus 36, 5656 AE Eindhoven, The Netherlands;
| | - Svitlana Zinger
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; (S.Z.); (F.v.d.S.); (P.H.N.d.W.)
| | - Fons van der Sommen
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; (S.Z.); (F.v.d.S.); (P.H.N.d.W.)
| | - Peter H. N. de With
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; (S.Z.); (F.v.d.S.); (P.H.N.d.W.)
| |
Collapse
|
77
|
Gibby J, Cvetko S, Javan R, Parr R, Gibby W. Use of augmented reality for image-guided spine procedures. EUROPEAN SPINE JOURNAL : OFFICIAL PUBLICATION OF THE EUROPEAN SPINE SOCIETY, THE EUROPEAN SPINAL DEFORMITY SOCIETY, AND THE EUROPEAN SECTION OF THE CERVICAL SPINE RESEARCH SOCIETY 2020; 29:1823-1832. [DOI: 10.1007/s00586-020-06495-4] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/22/2019] [Revised: 04/07/2020] [Accepted: 05/31/2020] [Indexed: 12/14/2022]
|
78
|
Overview of Minimally Invasive Spine Surgery. World Neurosurg 2020; 142:43-56. [PMID: 32544619 DOI: 10.1016/j.wneu.2020.06.043] [Citation(s) in RCA: 36] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/06/2020] [Revised: 06/02/2020] [Accepted: 06/04/2020] [Indexed: 12/21/2022]
Abstract
Minimally invasive spine surgery (MISS) has continued to evolve over the past few decades, with significant advancements in technology and technical skills. From endonasal cervical approaches to extreme lateral lumbar interbody fusions, MISS has demonstrated its usefulness across all practice areas of the spine, with unique points of access that avoid pertinent neurovascular structures. The field of adult spinal deformity has also recognized the importance of minimally invasive techniques, given their ability to limit complications and to provide adequate sagittal alignment correction and improvements in patients' functional status. Although MISS has continued to make significant progress clinically, consideration must also be given to its economic impact and to the learning curve surgeons experience in adding these procedures to their armamentarium. This review examines current innovations in MISS, as well as the economic impact and future directions of the field.
Collapse
|
79
|
Dennler C, Jaberg L, Spirig J, Agten C, Götschi T, Fürnstahl P, Farshad M. Augmented reality-based navigation increases precision of pedicle screw insertion. J Orthop Surg Res 2020; 15:174. [PMID: 32410636 PMCID: PMC7227090 DOI: 10.1186/s13018-020-01690-x] [Citation(s) in RCA: 46] [Impact Index Per Article: 11.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 10/01/2019] [Accepted: 04/29/2020] [Indexed: 02/07/2023] Open
Abstract
Background Precise insertion of pedicle screws is important to avoid injury to closely adjacent neurovascular structures. The standard method for the insertion of pedicle screws is based on anatomical landmarks (free-hand technique). Head-mounted augmented reality (AR) devices can be used to guide instrumentation and implant placement in spinal surgery. This study evaluates the feasibility and precision of AR technology for improving the precision of pedicle screw insertion compared with the current standard technique. Methods Two board-certified orthopedic surgeons specialized in spine surgery and two novice surgeons were each instructed to drill pilot holes for 40 pedicle screws in eighty lumbar vertebra sawbones models embedded in an agar-based gel. One hundred and sixty pedicles were randomized into two groups: the standard free-hand technique (FH) and the augmented reality technique (AR). A 3D model of the vertebral body was superimposed onto the real anatomy via the AR headset. Half of the pedicles were drilled using the FH method and the other half using the AR method. Results The average minimal distance of the drill axis to the pedicle wall (MAPW) was similar in both groups for expert surgeons (FH 4.8 ± 1.0 mm vs. AR 5.0 ± 1.4 mm, p = 0.389) but differed for novice surgeons (FH 3.4 ± 1.8 mm vs. AR 4.2 ± 1.8 mm, p = 0.044). Expert surgeons showed no primary drill pedicle perforations (PDPP) in either the FH or the AR group. Novices showed 3 perforations (7.5%) in the FH group and one perforation (2.5%) in the AR group (p > 0.005). Experts showed no statistically significant difference in average secondary screw pedicle perforations (SSPP) between the AR- and FH-set 6-, 7-, and 8-mm screws (p > 0.05). Novices showed significant differences in SSPP between most groups: 6-mm screws, 18 (45%) vs. 7 (17.5%), p = 0.006; 7-mm screws, 20 (50%) vs. 10 (25%), p = 0.013; and 8-mm screws, 22 (55%) vs. 15 (37.5%), p = 0.053, in the FH and AR groups, respectively. In novices, the average optimal medio-lateral convergent angle (oMLCA) was 3.23° (SD 4.90) and 0.62° (SD 4.56) for the FH- and AR-set screws (p = 0.017), respectively. Novices also drilled with higher precision with respect to the cranio-caudal inclination angle (CCIA) category (p = 0.04) with AR. Conclusion In this study, the additional anatomical information provided by the AR headset, superimposed onto real-world anatomy, improved the precision of drilling pilot holes for pedicle screws in a laboratory setting and decreased the effect of surgeon experience. Further technical development and validation studies are currently being performed to investigate potential clinical benefits of the AR-based navigation approach described herein.
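The MAPW metric is essentially the minimum point-to-line distance between the pedicle wall and the drill axis. A small numpy sketch under that assumption follows; the wall points and axis are hypothetical, not the study's segmented anatomy.

```python
import numpy as np

def min_axis_to_wall_distance(entry_point, direction, wall_points):
    """Minimum distance (mm) from a set of pedicle-wall points to the drill axis.

    The drill axis is the infinite line through `entry_point` along `direction`.
    """
    p0 = np.asarray(entry_point, float)
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    v = np.asarray(wall_points, float) - p0              # vectors from entry point to wall points
    distances = np.linalg.norm(np.cross(v, d), axis=1)   # perpendicular distances to the line
    return distances.min()

# Hypothetical example: drill axis along +y, wall points sampled around it.
wall = np.array([[4.8, 10.0, 0.5], [5.2, 20.0, -0.4], [-5.0, 15.0, 0.2]])
print(min_axis_to_wall_distance([0, 0, 0], [0, 1, 0], wall))  # ~4.8 mm
```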
Collapse
Affiliation(s)
- Cyrill Dennler
- Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland.
| | - Laurenz Jaberg
- Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - José Spirig
- Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
| | - Christoph Agten
- Department of Radiology, University Hospital Balgrist, University of Zürich, Zurich, Switzerland
| | - Tobias Götschi
- Laboratory for Biomechanics, University Hospital Balgrist, University of Zürich, Zurich, Switzerland
| | - Philipp Fürnstahl
- Computer Assisted Research and Development Group, University Hospital Balgrist, University of Zürich, Zurich, Switzerland
| | - Mazda Farshad
- Spine Division, University Hospital Balgrist, University of Zürich, Forchstrasse 340, 8008, Zurich, Switzerland
| |
Collapse
|
80
|
Burström G, Balicki M, Patriciu A, Kyne S, Popovic A, Holthuizen R, Homan R, Skulason H, Persson O, Edström E, Elmi-Terander A. Feasibility and accuracy of a robotic guidance system for navigated spine surgery in a hybrid operating room: a cadaver study. Sci Rep 2020; 10:7522. [PMID: 32371880 PMCID: PMC7200720 DOI: 10.1038/s41598-020-64462-x] [Citation(s) in RCA: 20] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/02/2019] [Accepted: 04/15/2020] [Indexed: 12/11/2022] Open
Abstract
The combination of navigation and robotics in spine surgery has the potential to accurately identify and maintain the bone entry position and planned trajectory. The goal of this study was to examine the feasibility, accuracy and efficacy of a new robot-guided system for semi-automated, minimally invasive pedicle screw placement. A custom robotic arm was integrated into a hybrid operating room (OR) equipped with an augmented reality surgical navigation system (ARSN). The robot was mounted on the OR table and used to assist in placing Jamshidi needles in 113 pedicles in four cadavers. The ARSN system was used for planning screw paths and directing the robot. The robot arm autonomously aligned with the planned screw trajectory, and the surgeon inserted the Jamshidi needle into the pedicle. Accuracy measurements were performed on verification cone beam computed tomography (CBCT) scans with the planned paths superimposed. To provide a clinical grading according to the Gertzbein scale, pedicle screw diameters were simulated on the placed Jamshidi needles. A technical accuracy at the bone entry point of 0.48 ± 0.44 mm and 0.68 ± 0.58 mm was achieved in the axial and sagittal views, respectively. The corresponding angular errors were 0.94 ± 0.83° and 0.87 ± 0.82°. The accuracy was statistically superior (p < 0.001) to that of ARSN without robotic assistance. Simulated pedicle screw grading resulted in a clinical accuracy of 100%. This study demonstrates that the use of a semi-automated surgical robot for pedicle screw placement provides an accuracy well above what is clinically acceptable.
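Simulated clinical grading of this kind typically maps the maximal cortical breach of the simulated screw to Gertzbein(-Robbins) grades in 2 mm steps; the helper below is a generic sketch of that mapping with invented breach values, not the grading software used in the study.

```python
def gertzbein_grade(breach_mm: float) -> str:
    """Map a maximal pedicle cortical breach (mm) to a Gertzbein-Robbins grade.

    Grades A/B (breach < 2 mm) are usually considered clinically acceptable.
    """
    if breach_mm <= 0:
        return "A"   # screw fully contained within the pedicle
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"

simulated_breaches = [0.0, 0.0, 1.2, 0.0, 2.5]          # hypothetical values (mm)
grades = [gertzbein_grade(b) for b in simulated_breaches]
acceptable = sum(g in ("A", "B") for g in grades) / len(grades)
print(grades, f"clinical accuracy: {acceptable:.0%}")
```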
Collapse
Affiliation(s)
- Gustav Burström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden.
- Department of Neurosurgery, Karolinska University Hospital, Stockholm, Sweden.
| | | | | | - Sean Kyne
- Philips Research North America, Cambridge, USA
| | | | - Ronald Holthuizen
- Department of Image Guided Therapy Systems, Philips Healthcare, Best, the Netherlands
| | - Robert Homan
- Department of Image Guided Therapy Systems, Philips Healthcare, Best, the Netherlands
| | - Halldor Skulason
- Department of Neurosurgery, Landspitali University Hospital, Reykjavik, Iceland
| | - Oscar Persson
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Department of Neurosurgery, Karolinska University Hospital, Stockholm, Sweden
| | - Erik Edström
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Department of Neurosurgery, Karolinska University Hospital, Stockholm, Sweden
| | - Adrian Elmi-Terander
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- Department of Neurosurgery, Karolinska University Hospital, Stockholm, Sweden
| |
Collapse
|
81
|
Abstract
STUDY DESIGN A prospective, case-based, observational study. OBJECTIVES To investigate how microscope-based augmented reality (AR) support can be utilized in various types of spine surgery. METHODS In 42 spinal procedures (12 intra- and 8 extradural tumors, 7 other intradural lesions, 11 degenerative cases, 2 infections, and 2 deformities), AR was implemented using operating microscope head-up displays (HUDs). Intraoperative low-dose computed tomography was used for automatic registration. Nonlinear image registration was applied to integrate multimodal preoperative images. Target and risk structures displayed by AR were defined in preoperative images by automatic anatomical mapping and additional manual segmentation. RESULTS AR could be successfully applied in all 42 cases. Low-dose protocols ensured low radiation exposure for registration scanning (effective dose: cervical 0.29 ± 0.17 mSv, thoracic 3.40 ± 2.38 mSv, lumbar 3.05 ± 0.89 mSv). A low registration error (0.87 ± 0.28 mm) resulted in a reliable AR representation with close matching of visualized objects and reality, distinctly supporting anatomical orientation in the surgical field. Flexible AR visualization, applying either the microscope HUD or video superimposition and including the ability to selectively activate objects of interest as well as different display modes, allowed smooth integration into the surgical workflow without disturbing the actual procedure. On average, 7.1 ± 4.6 objects were displayed, reliably visualizing target and risk structures. CONCLUSIONS Microscope-based AR can be applied successfully to various kinds of spinal procedures. AR improves anatomical orientation in the surgical field, supporting the surgeon, and offers a potential tool for education.
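Automatic registration of this kind ultimately reduces to estimating a rigid transform between corresponding point sets and reporting the residual error. The sketch below shows a generic Kabsch/SVD solution with made-up landmarks; it is not the vendor's registration algorithm.

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (R, t) mapping source points onto target points."""
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

def registration_error_mm(source, target, R, t):
    """Root-mean-square residual distance after applying the transform."""
    residual = (np.asarray(source, float) @ R.T + t) - np.asarray(target, float)
    return np.sqrt((np.linalg.norm(residual, axis=1) ** 2).mean())

# Hypothetical landmarks: target = rotated + translated source with small noise.
rng = np.random.default_rng(1)
src = rng.uniform(0, 50, size=(6, 3))
angle = np.radians(5)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
tgt = src @ R_true.T + np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.3, (6, 3))
R, t = rigid_register(src, tgt)
print(f"registration error: {registration_error_mm(src, tgt, R, t):.2f} mm")
```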
Collapse
Affiliation(s)
- Barbara Carl
- Department of Neurosurgery, University Marburg, Marburg, Germany
| | - Miriam Bopp
- Department of Neurosurgery, University Marburg, Marburg, Germany
- Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
| | - Benjamin Saß
- Department of Neurosurgery, University Marburg, Marburg, Germany
| | - Mirza Pojskic
- Department of Neurosurgery, University Marburg, Marburg, Germany
| | | | - Christopher Nimsky
- Department of Neurosurgery, University Marburg, Marburg, Germany
- Marburg Center for Mind, Brain and Behavior (MCMBB), Marburg, Germany
| |
Collapse
|
82
|
Müller F, Roner S, Liebmann F, Spirig JM, Fürnstahl P, Farshad M. Augmented reality navigation for spinal pedicle screw instrumentation using intraoperative 3D imaging. Spine J 2020; 20:621-628. [PMID: 31669611 DOI: 10.1016/j.spinee.2019.10.012] [Citation(s) in RCA: 51] [Impact Index Per Article: 12.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/06/2019] [Revised: 10/19/2019] [Accepted: 10/21/2019] [Indexed: 02/03/2023]
Abstract
BACKGROUND CONTEXT Due to recent developments in augmented reality with head-mounted devices, holograms of a surgical plan can be displayed directly in the surgeon's field of view. To the best of our knowledge, three-dimensional (3D) intraoperative fluoroscopy has not been explored for use with holographic navigation by head-mounted devices in spine surgery. PURPOSE To evaluate the surgical accuracy of holographic pedicle screw navigation by head-mounted device using 3D intraoperative fluoroscopy. STUDY DESIGN In this experimental cadaver study, the accuracy of surgical navigation using a head-mounted device was compared with navigation using a state-of-the-art pose-tracking system. METHODS Three lumbar cadaver spines were embedded in nontransparent agar gel, leaving only commonly visible anatomy in sight. Intraoperative registration of the preoperative plan was achieved by 3D fluoroscopy and fiducial markers attached to the lumbar vertebrae. Trackable custom-made drill sleeve guides enabled real-time navigation. In total, 20 K-wires were navigated into lumbar pedicles using AR navigation and 10 K-wires using the state-of-the-art pose-tracking system. 3D models obtained from postexperimental CT scans were used to measure surgical accuracy. MF is the founder and a shareholder of Incremed AG, a Balgrist University Hospital start-up focusing on the development of innovative techniques for surgical execution. The other authors declare no conflict of interest concerning the contents of this study. No external funding was received for this study. RESULTS No significant difference in accuracy was measured between AR-navigated drillings and the gold standard with the pose-tracking system, with mean translational errors between entry points (3D vector distance; p=.85) of 3.4±1.6 mm compared with 3.2±2.0 mm, and mean angular errors between trajectories (3D angle; p=.30) of 4.3°±2.3° compared with 3.5°±1.4°. CONCLUSIONS Holographic navigation using a head-mounted device achieves accuracy comparable to the gold standard of high-end pose-tracking systems. CLINICAL SIGNIFICANCE These promising results could lead to a new form of surgical navigation with minimal infrastructure requirements but must now be confirmed in clinical studies.
Collapse
Affiliation(s)
- Fabio Müller
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland.
| | - Simon Roner
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Florentin Liebmann
- Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Lengghalde 5, 8008 Zurich, Switzerland; Laboratory for Orthopedic Biomechanics, ETH Zurich, Forchstrasse 328, 8008 Zurich, Switzerland
| | - José M Spirig
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| | - Philipp Fürnstahl
- Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Lengghalde 5, 8008 Zurich, Switzerland
| | - Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
| |
Collapse
|
83
|
Vadalà G, De Salvatore S, Ambrosio L, Russo F, Papalia R, Denaro V. Robotic Spine Surgery and Augmented Reality Systems: A State of the Art. Neurospine 2020; 17:88-100. [PMID: 32252158 PMCID: PMC7136092 DOI: 10.14245/ns.2040060.030] [Citation(s) in RCA: 44] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2020] [Accepted: 02/24/2020] [Indexed: 12/26/2022] Open
Abstract
Instrumented spine procedures have been performed for decades to treat a wide variety of spinal disorders. New technologies have been employed to obtain a high degree of precision, to minimize the risk of damage to neurovascular structures and to diminish harmful exposure of patients and the operative team to ionizing radiation. Robotic spine surgery comprises 3 major categories: telesurgical robotic systems, robotic-assisted navigation (RAN) and virtual and augmented reality (AR) systems, including AR and virtual reality (VR). Telesurgical systems encompass devices that can be operated from a remote command station, allowing surgery to be performed via instruments manipulated by the robot. RAN technologies, on the other hand, are characterized by robotic guidance of surgeon-operated instruments based on real-time imaging. Virtual AR systems are able to show images directly on special visors and screens, allowing the surgeon to visualize information about the patient and the procedure (e.g., anatomical landmarks, screw direction and inclination, and distance from neurological and vascular structures). The aim of this review is to focus on the current state of the art of robotics and AR in spine surgery and on the perspectives of these emerging technologies, which hold promise for future applications.
Collapse
Affiliation(s)
- Gianluca Vadalà
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
| | - Sergio De Salvatore
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
| | - Luca Ambrosio
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
| | - Fabrizio Russo
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
| | - Rocco Papalia
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
| | - Vincenzo Denaro
- Department of Orthopaedic and Trauma Surgery, Campus Bio-Medico University of Rome, Rome, Italy
| |
Collapse
|
84
|
Jud L, Fotouhi J, Andronic O, Aichmair A, Osgood G, Navab N, Farshad M. Applicability of augmented reality in orthopedic surgery - A systematic review. BMC Musculoskelet Disord 2020; 21:103. [PMID: 32061248 PMCID: PMC7023780 DOI: 10.1186/s12891-020-3110-2] [Citation(s) in RCA: 77] [Impact Index Per Article: 19.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/12/2019] [Accepted: 02/03/2020] [Indexed: 02/07/2023] Open
Abstract
BACKGROUND Computer-assisted solutions are continuously changing surgical practice. One of the most disruptive technologies among computer-integrated surgical techniques is Augmented Reality (AR). While Augmented Reality is increasingly used in several medical specialties, its potential benefit in orthopedic surgery is not yet clear. The purpose of this article is to provide a systematic review of the current state of knowledge and the applicability of AR in orthopedic surgery. METHODS A systematic search of three databases ("PubMed", "Cochrane Library" and "Web of Science") was performed. The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines and was registered in the International Prospective Register of Systematic Reviews (PROSPERO). RESULTS 31 studies and reports were included and classified into the following categories: Instrument/Implant Placement, Osteotomies, Tumor Surgery, Trauma, and Surgical Training and Education. Quality assessment could be performed in 18 studies. Among the clinical studies, there were six case series with an average score of 90% and one case report, which scored 81%, according to the Joanna Briggs Institute Critical Appraisal Checklist (JBI CAC). The 11 cadaveric studies scored 81% according to the QUACS scale (Quality Appraisal for Cadaveric Studies). CONCLUSION This manuscript provides (1) a summary of the current state of knowledge and research on Augmented Reality in orthopedic surgery presented in the literature, and (2) a discussion by the authors presenting the key remarks required for seamless integration of Augmented Reality into future surgical practice. TRIAL REGISTRATION PROSPERO registration number: CRD42019128569.
Collapse
Affiliation(s)
- Lukas Jud
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
| | - Javad Fotouhi
- Computer Aided Medical Procedure, Johns Hopkins University, 3400 N Charles Street, Baltimore, 21210 USA
| | - Octavian Andronic
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
| | - Alexander Aichmair
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
| | - Greg Osgood
- Johns Hopkins Hospital, Department of Orthopedics Surgery, 1800 Orleans Street, Baltimore, 21287 USA
| | - Nassir Navab
- Computer Aided Medical Procedure, Johns Hopkins University, 3400 N Charles Street, Baltimore, 21210 USA
- Computer Aided Medical Procedure, Technical University of Munich, Boltzmannstrasse 3, 85748 Munich, Germany
| | - Mazda Farshad
- Department of Orthopedics, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zürich, Switzerland
| |
Collapse
|
85
|
Park BJ, Hunt SJ, Martin C, Nadolski GJ, Wood BJ, Gade TP. Augmented and Mixed Reality: Technologies for Enhancing the Future of IR. J Vasc Interv Radiol 2020; 31:1074-1082. [PMID: 32061520 DOI: 10.1016/j.jvir.2019.09.020] [Citation(s) in RCA: 52] [Impact Index Per Article: 13.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/03/2019] [Revised: 08/01/2019] [Accepted: 09/20/2019] [Indexed: 10/25/2022] Open
Abstract
Augmented and mixed reality are emerging interactive and display technologies. These technologies are able to merge virtual objects, in either 2 or 3 dimensions, with the real world. Image guidance is the cornerstone of interventional radiology. With augmented or mixed reality, medical imaging can be more readily accessible or displayed in actual 3-dimensional space during procedures to enhance guidance, at times when this information is most needed. In this review, the current state of these technologies is addressed followed by a fundamental overview of their inner workings and challenges with 3-dimensional visualization. Finally, current and potential future applications in interventional radiology are highlighted.
Collapse
Affiliation(s)
- Brian J Park
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104.
| | - Stephen J Hunt
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
| | - Charles Martin
- Department of Interventional Radiology, Cleveland Clinic, Cleveland, Ohio
| | - Gregory J Nadolski
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
| | - Bradford J Wood
- Interventional Radiology, National Institutes of Health, Bethesda, Maryland
| | - Terence P Gade
- Department of Interventional Radiology, Hospital of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA 19104
| |
Collapse
|
86
|
Chen L, Zhang F, Zhan W, Gan M, Sun L. Optimization of virtual and real registration technology based on augmented reality in a surgical navigation system. Biomed Eng Online 2020; 19:1. [PMID: 31915014 PMCID: PMC6950982 DOI: 10.1186/s12938-019-0745-z] [Citation(s) in RCA: 34] [Impact Index Per Article: 8.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Accepted: 12/30/2019] [Indexed: 12/19/2022] Open
Abstract
Background Surgical navigation systems have become essential tools that enable doctors to perform complex operations accurately and safely. However, the traditional navigation interface is intended only for two-dimensional observation and does not display the full spatial information of the lesion area. Moreover, the image navigation interface is separated from the operating area, so the doctor needs to switch the field of vision between the screen and the patient's lesion area. In this paper, augmented reality (AR) technology was applied to spinal surgery to provide more intuitive information to surgeons. The accuracy of virtual and real registration was improved through research on AR technology. During the operation, the doctor could observe the AR image and the true shape of the internal spine through the skin. Methods To improve the accuracy of virtual and real registration, a registration technique based on an improved identification method and a robot-assisted method was proposed. The experimental method was optimized using the improved identification method. X-ray images were used to verify the effectiveness of the punctures performed by the robot. Results The final experimental results show that the average accuracy of virtual and real registration based on the general identification method was 9.73 ± 0.46 mm (range 8.90–10.23 mm). The average accuracy of virtual and real registration based on the improved identification method was 3.54 ± 0.13 mm (range 3.36–3.73 mm). Compared with the general identification method, the accuracy was improved by approximately 65%. The highest accuracy of virtual and real registration based on the robot-assisted method was 2.39 mm, an improvement of approximately 28.5% over the improved identification method. Conclusion The experimental results show that the two optimized methods are highly effective. The proposed AR navigation system has high accuracy and stability. This system may have value in future spinal surgeries.
Collapse
Affiliation(s)
- Long Chen
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China
| | - Fengfeng Zhang
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China. .,Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215123, China.
| | - Wei Zhan
- Department of Radiation Oncology, The First Affiliated Hospital of Soochow University, Suzhou, China
| | - Minfeng Gan
- Department of Radiation Oncology, The First Affiliated Hospital of Soochow University, Suzhou, China
| | - Lining Sun
- School of Mechanical and Electrical Engineering, Soochow University, Suzhou, 215006, China.,Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou, 215123, China
| |
Collapse
|
87
|
Lohre R, Wang JC, Lewandrowski KU, Goel DP. Virtual reality in spinal endoscopy: a paradigm shift in education to support spine surgeons. JOURNAL OF SPINE SURGERY 2020; 6:S208-S223. [PMID: 32195429 DOI: 10.21037/jss.2019.11.16] [Citation(s) in RCA: 27] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [Key Words] [Subscribe] [Scholar Register] [Indexed: 12/17/2022]
Abstract
Background Minimally invasive spine surgery (MISS) and endoscopic spine surgery have continually evolving indications in the cervical, thoracic, and lumbar spine. Endoscopic spine surgery entails treatment of disc disease, stenosis, spondylolisthesis, radiculopathy, and deformity. MISS involves complex motor skills in regions of variable anatomy. Simulator use has been proposed to aid in training and skill retention, preoperative planning, and intraoperative guidance. Methods A systematic review of five databases was performed for publications pertaining to the use of virtual (VR), augmented (AR), and mixed (MR) reality in MISS and spinal endoscopic surgery. Qualitative data analysis was undertaken with a focus on study design, quality, and reported outcomes. Study quality was assessed using the Medical Education Research Study Quality Instrument (MERSQI) score, and level of evidence (LoE) was graded with a modified Oxford Centre for Evidence-Based Medicine (OCEBM) level for simulation in medicine. Results Thirty-eight studies were retained for data collection. Studies were of intervention-control, clinical application, and pilot or cross-sectional design. The identified articles illustrated use of VR, AR, and MR across all study designs. Procedures included pedicle cannulation and screw insertion, vertebroplasty, kyphoplasty, percutaneous transforaminal endoscopic discectomy (PTED), lumbar puncture and facet injection, transvertebral anterior cervical foraminotomy (TVACF), and posterior cervical laminoforaminotomy. The overall MERSQI score was low to medium (M = 9.71, SD = 2.60; range 4.5–13.5), and the LoE was predominantly low given the number of purely descriptive articles or low-quality randomized studies. Conclusions The current scope of VR, AR, and MR surgical simulators in MISS and spinal endoscopic surgery is described. Studies demonstrate improvement in technical skill and patient outcomes at short-term follow-up. Despite this, overall study quality and levels of evidence remain low. Cohesive study design and reporting, with a focus on transfer validity in training scenarios and patient-derived outcome measures in clinical studies, are required to further advance the field.
Collapse
Affiliation(s)
- Ryan Lohre
- Department of Orthopaedics, University of British Columbia, Vancouver, BC, USA
| | - Jeffrey C Wang
- USC Spine Center, Keck School of Medicine at University of Southern California, Los Angeles, USA
| | - Kai-Uwe Lewandrowski
- Center for Advanced Spine Care of Southern Arizona and Surgical Institute of Tucson, Tucson, AZ, USA.,Department of Neurosurgery, UNIRIO, Rio de Janeiro, Brazil
| | - Danny P Goel
- Department of Orthopaedics, University of British Columbia, Vancouver, BC, Canada
| |
Collapse
|
88
|
Pérez-Pachón L, Poyade M, Lowe T, Gröning F. Image Overlay Surgery Based on Augmented Reality: A Systematic Review. ADVANCES IN EXPERIMENTAL MEDICINE AND BIOLOGY 2020; 1260:175-195. [PMID: 33211313 DOI: 10.1007/978-3-030-47483-6_10] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/18/2022]
Abstract
Augmented Reality (AR) applied to surgical guidance is gaining relevance in clinical practice. AR-based image overlay surgery (i.e. the accurate overlay of patient-specific virtual images onto the body surface) helps surgeons to transfer image data produced during the planning of the surgery (e.g. the correct resection margins of tissue flaps) to the operating room, thus increasing accuracy and reducing surgery times. We systematically reviewed 76 studies published between 2004 and August 2018 to explore which existing tracking and registration methods and technologies allow healthcare professionals and researchers to develop and implement these systems in-house. Most studies used non-invasive markers to automatically track a patient's position, as well as customised algorithms, tracking libraries or software development kits (SDKs) to compute the registration between patient-specific 3D models and the patient's body surface. Few studies combined the use of holographic headsets, SDKs and user-friendly game engines, and described portable and wearable systems that combine tracking, registration, hands-free navigation and direct visibility of the surgical site. Most accuracy tests included a low number of subjects and/or measurements and did not normally explore how these systems affect surgery times and success rates. We highlight the need for more procedure-specific experiments with a sufficient number of subjects and measurements and including data about surgical outcomes and patients' recovery. Validation of systems combining the use of holographic headsets, SDKs and game engines is especially interesting as this approach facilitates an easy development of mobile AR applications and thus the implementation of AR-based image overlay surgery in clinical practice.
Collapse
Affiliation(s)
- Laura Pérez-Pachón
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK.
| | - Matthieu Poyade
- School of Simulation and Visualisation, Glasgow School of Art, Glasgow, UK
| | - Terry Lowe
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
- Head and Neck Oncology Unit, Aberdeen Royal Infirmary (NHS Grampian), Aberdeen, UK
| | - Flora Gröning
- School of Medicine, Medical Sciences and Nutrition, University of Aberdeen, Aberdeen, UK
| |
Collapse
|
89
|
Laverdière C, Corban J, Khoury J, Ge SM, Schupbach J, Harvey EJ, Reindl R, Martineau PA. Augmented reality in orthopaedics. Bone Joint J 2019; 101-B:1479-1488. [DOI: 10.1302/0301-620x.101b12.bjj-2019-0315.r1] [Citation(s) in RCA: 36] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 12/21/2022]
Abstract
Aims Computer-based applications are increasingly being used by orthopaedic surgeons in their clinical practice. With the integration of technology in surgery, augmented reality (AR) may become an important tool for surgeons in the future. By superimposing a digital image on a user’s view of the physical world, this technology shows great promise in orthopaedics. The aim of this review is to investigate the current and potential uses of AR in orthopaedics. Materials and Methods A systematic review of the PubMed, MEDLINE, and Embase databases up to January 2019 using the keywords ‘orthopaedic’ OR ‘orthopedic AND augmented reality’ was performed by two independent reviewers. Results A total of 41 publications were included after screening. Applications were divided by subspecialty: spine (n = 15), trauma (n = 16), arthroplasty (n = 3), oncology (n = 3), and sports (n = 4). Out of these, 12 were clinical in nature. AR-based technologies have a wide variety of applications, including direct visualization of radiological images by overlaying them on the patient and intraoperative guidance using preoperative plans projected onto real anatomy, enabling hands-free real-time access to operating room resources, and promoting telemedicine and education. Conclusion There is an increasing interest in AR among orthopaedic surgeons. Although studies show similar or better outcomes with AR compared with traditional techniques, many challenges need to be addressed before this technology is ready for widespread use. Cite this article: Bone Joint J 2019;101-B:1479–1488
Collapse
Affiliation(s)
- Carl Laverdière
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Jason Corban
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Jason Khoury
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Susan Mengxiao Ge
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Justin Schupbach
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Edward J. Harvey
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Rudy Reindl
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| | - Paul A. Martineau
- Department of Orthopedic Surgery, McGill University Health Centre, Montreal, Canada
| |
Collapse
|
90
|
Heinrich F, Schwenderling L, Becker M, Skalej M, Hansen C. HoloInjection: augmented reality support for CT-guided spinal needle injections. Healthc Technol Lett 2019; 6:165-171. [PMID: 32038851 PMCID: PMC6942927 DOI: 10.1049/htl.2019.0062] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2019] [Accepted: 10/02/2019] [Indexed: 12/02/2022] Open
Abstract
The correct placement of needles is decisive for the success of many minimally invasive interventions and therapies. These needle insertions are usually guided only by radiological imaging and can benefit from additional navigation support. Augmented reality (AR) is a promising tool for conveniently providing the needed information and may thus overcome the limitations of existing approaches. To this end, a prototypical AR application was developed to guide the insertion of needles to spinal targets using the Microsoft HoloLens mixed reality glasses. An attempt was made to measure the system's registration accuracy, and three guidance visualisation concepts were evaluated in a comparison study with respect to achievable in-plane and out-of-plane needle orientation errors. Results suggested high registration accuracy and showed that the AR prototype is suitable for reducing out-of-plane orientation errors. Limitations, such as comparatively high in-plane orientation errors, effects of the viewing position and missing image slices, indicate potential for improvement that needs to be addressed before the application is transferred to clinical trials.
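In-plane and out-of-plane orientation errors in CT-guided punctures are commonly defined relative to the axial scan plane: out-of-plane is the needle's tilt out of that plane, in-plane the angular error of its projection within it. The sketch below illustrates one such decomposition under these assumptions; it is not the paper's evaluation code.

```python
import numpy as np

def orientation_errors_deg(planned_dir, actual_dir, plane_normal=(0.0, 0.0, 1.0)):
    """Split the needle orientation error into in-plane and out-of-plane parts.

    The scan plane is defined by `plane_normal` (default: axial plane, normal = z).
    """
    n = np.asarray(plane_normal, float); n /= np.linalg.norm(n)
    p = np.asarray(planned_dir, float); p /= np.linalg.norm(p)
    a = np.asarray(actual_dir, float); a /= np.linalg.norm(a)

    # Out-of-plane: difference of the elevation angles relative to the plane.
    out_of_plane = np.degrees(np.arcsin(np.clip(a @ n, -1, 1))
                              - np.arcsin(np.clip(p @ n, -1, 1)))

    # In-plane: angle between the two directions projected into the plane.
    p_proj = p - (p @ n) * n
    a_proj = a - (a @ n) * n
    p_proj /= np.linalg.norm(p_proj)
    a_proj /= np.linalg.norm(a_proj)
    in_plane = np.degrees(np.arccos(np.clip(p_proj @ a_proj, -1, 1)))
    return in_plane, abs(out_of_plane)

# Hypothetical planned vs. performed needle directions (axial plane = x-y).
print(orientation_errors_deg([1.0, 0.2, 0.0], [1.0, 0.3, 0.1]))  # ~ (5.4, 5.5) degrees
```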
Collapse
Affiliation(s)
- Florian Heinrich
- Faculty of Computer Science, University of Magdeburg, Universitätsplatz 2, 39106 Magdeburg, Germany.,Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany
| | - Luisa Schwenderling
- Faculty of Computer Science, University of Magdeburg, Universitätsplatz 2, 39106 Magdeburg, Germany.,Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany
| | - Mathias Becker
- Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany.,Department of Neuroradiology, University Hospital Magdeburg, Leipziger Strasse 44, 39120 Magdeburg, Germany
| | - Martin Skalej
- Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany.,Department of Neuroradiology, University Hospital Magdeburg, Leipziger Strasse 44, 39120 Magdeburg, Germany
| | - Christian Hansen
- Faculty of Computer Science, University of Magdeburg, Universitätsplatz 2, 39106 Magdeburg, Germany.,Research Campus STIMULATE, Sandtorstrasse 23, 39106 Magdeburg, Germany
| |
Collapse
|
91
|
Carl B, Bopp M, Saß B, Pojskic M, Nimsky C. Augmented reality in intradural spinal tumor surgery. Acta Neurochir (Wien) 2019; 161:2181-2193. [PMID: 31300886 DOI: 10.1007/s00701-019-04005-0] [Citation(s) in RCA: 47] [Impact Index Per Article: 9.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/06/2019] [Accepted: 07/05/2019] [Indexed: 02/07/2023]
Abstract
BACKGROUND Microscope-based augmented reality (AR) is commonly used in cranial surgery; however, until recently, this technique had not been implemented for spinal surgery. We prospectively investigated how AR can be applied to intradural spinal tumor surgery. METHODS For ten patients with intradural spinal tumors (ependymoma, glioma, hemangioblastoma, meningioma, and metastasis), AR was provided by the head-up displays (HUDs) of operating microscopes. User-independent automatic AR registration was established by low-dose intraoperative computed tomography. The objects visualized by AR were segmented in preoperative imaging data; non-linear image registration was applied to account for spine flexibility. RESULTS In all cases, AR supported surgery by visualizing the tumor outline and other relevant surrounding structures. The overall AR registration error was 0.72 ± 0.24 mm (mean ± standard deviation), and a close match between the visible tumor outline and the AR visualization was observed in all cases. Registration scanning resulted in a low effective dose of 0.22 ± 0.16 mSv for cervical and 1.68 ± 0.61 mSv for thoracic lesions. The mean HUD AR usage in relation to microscope time was 51.6 ± 36.7%. The HUD was switched off and on again between 2 and 17 times (mean 5.7 ± 4.4 times). Independent of the status of the HUD, the AR visualization was displayed on monitors throughout surgery. CONCLUSIONS Microscope-based AR can be reliably applied to intradural spinal tumor surgery. Automatic AR registration ensures high precision and provides an intuitive visualization of the extent of the tumor and the surrounding structures. Given this setting, all advanced multi-modality options of cranial AR can also be applied to spinal surgery.
Collapse
|
92
|
Yoo JS, Patel DS, Hrynewycz NM, Brundage TS, Singh K. The utility of virtual reality and augmented reality in spine surgery. ANNALS OF TRANSLATIONAL MEDICINE 2019; 7:S171. [PMID: 31624737 DOI: 10.21037/atm.2019.06.38] [Citation(s) in RCA: 36] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/11/2022]
Abstract
As advances in surgical techniques continue to accumulate, it becomes increasingly important to assess and research the technologies applied to spine surgery in order to increase surgical accuracy, decrease overall length of surgery, and minimize radiation exposure. Augmented reality (AR) and virtual reality (VR) have shown promising results with regard to their applicability beyond their current functions. At present, VR has generally been applied in a teaching and preparatory role, while AR has been utilized in surgical settings. The following review therefore provides an overview of both virtual reality and augmented reality, followed by a discussion of their current applications and future directions.
Collapse
Affiliation(s)
- Joon S Yoo
- Department of Orthopaedic Surgery, Rush University Medical Center, Chicago, IL, USA
| | - Dillon S Patel
- Department of Orthopaedic Surgery, Rush University Medical Center, Chicago, IL, USA
| | - Nadia M Hrynewycz
- Department of Orthopaedic Surgery, Rush University Medical Center, Chicago, IL, USA
| | - Thomas S Brundage
- Department of Orthopaedic Surgery, Rush University Medical Center, Chicago, IL, USA
| | - Kern Singh
- Department of Orthopaedic Surgery, Rush University Medical Center, Chicago, IL, USA
| |
Collapse
|
93
|
Augmented and Virtual Reality Instrument Tracking for Minimally Invasive Spine Surgery: A Feasibility and Accuracy Study. Spine (Phila Pa 1976) 2019; 44:1097-1104. [PMID: 30830046 DOI: 10.1097/brs.0000000000003006] [Citation(s) in RCA: 61] [Impact Index Per Article: 12.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/01/2023]
Abstract
STUDY DESIGN Cadaveric animal laboratory study. OBJECTIVE To evaluate the feasibility and accuracy of pedicle cannulation using an augmented reality surgical navigation (ARSN) system with automatic instrument tracking, yielding feedback on the instrument position in relation to deep anatomy. SUMMARY OF BACKGROUND DATA Minimally invasive spine surgery (MISS) has the potential to reduce surgical exposure, resulting in shorter hospital stays and lower blood loss and infection rates compared with open surgery, but has the drawback of limiting the surgeon's visual feedback regarding deep anatomy. MISS is mainly performed using image-guided 2D fluoroscopy, thus exposing the staff to ionizing radiation. METHODS A hybrid operating room (OR) equipped with a robotic C-arm with integrated optical cameras for augmented reality instrument navigation was used. In two pig cadavers, cone beam computed tomography (CBCT) scans were performed, a 3D model was generated, and pedicle screw insertions were planned. Seventy-eight insertions were performed. Technical accuracy was assessed on post-insertion CBCTs by measuring the distance between the navigated device and the corresponding pre-planned path as well as the angular deviations. Drilling and hammering into the pedicle were also compared. Navigation time was measured. An independent reviewer assessed simulated clinical accuracy according to Gertzbein. RESULTS The technical accuracy was 1.7 ± 1.0 mm at the bone entry point and 2.0 ± 1.3 mm at the device tip. The angular deviation was 1.7 ± 1.7° in the axial and 1.6 ± 1.2° in the sagittal plane. Navigation time per insertion was 195 ± 93 seconds. There was no difference in accuracy between hammering and drilling into the pedicle. The clinical accuracy was 97.4% to 100%, depending on the screw size considered for placement. No ionizing radiation was used during navigation. CONCLUSION ARSN with instrument tracking for MISS is feasible, accurate, and radiation-free during navigation. LEVEL OF EVIDENCE 3.
94
Auloge P, Cazzato RL, Ramamurthy N, de Marini P, Rousseau C, Garnon J, Charles YP, Steib JP, Gangi A. Augmented reality and artificial intelligence-based navigation during percutaneous vertebroplasty: a pilot randomised clinical trial. Eur Spine J 2019; 29:1580-1589. [DOI: 10.1007/s00586-019-06054-6] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Received: 09/29/2018] [Revised: 05/29/2019] [Accepted: 06/26/2019] [Indexed: 12/24/2022]
95
Chytas D, Malahias MA, Nikolaou VS. Augmented Reality in Orthopedics: Current State and Future Directions. Front Surg 2019; 6:38. [PMID: 31316995 PMCID: PMC6610425 DOI: 10.3389/fsurg.2019.00038] [Citation(s) in RCA: 24] [Impact Index Per Article: 4.8] [Received: 02/01/2019] [Accepted: 06/12/2019] [Indexed: 12/29/2022]
Abstract
Augmented reality (AR) comprises special hardware and software used to present computer-processed imaging data to the surgeon in real time, so that real-life objects are combined with computer-generated images. AR technology has recently gained increasing interest in surgical practice. Preclinical research has provided substantial evidence that AR might be a useful tool for intra-operative guidance and decision-making. AR has been applied to a wide spectrum of orthopedic procedures, such as tumor resection, fracture fixation, arthroscopy, and component alignment in total joint arthroplasty. The present study aimed to summarize the current state of AR applications in orthopedics at the preclinical and clinical levels, providing future directions and perspectives concerning the potential further benefits of this technology.
Affiliation(s)
- Dimitrios Chytas, 2nd Orthopaedic Department, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
- Vasileios S Nikolaou, 2nd Orthopaedic Department, School of Medicine, National and Kapodistrian University of Athens, Athens, Greece
96
Accuracy assessment for the co-registration between optical and VIVE head-mounted display tracking. Int J Comput Assist Radiol Surg 2019; 14:1207-1215. [PMID: 31069642 DOI: 10.1007/s11548-019-01992-4] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Received: 02/06/2019] [Accepted: 04/25/2019] [Indexed: 10/26/2022]
Abstract
PURPOSE We report on the development and accuracy assessment of a hybrid tracking system that integrates optical spatial tracking into a video pass-through head-mounted display. METHODS The hybrid system uses a dual-tracked co-calibration apparatus to provide a co-registration between the origins of an optical dynamic reference frame and the VIVE Pro controller through a point-based registration. This registration provides the location of optically tracked tools with respect to the VIVE controller's origin and thus the VIVE's tracking system. RESULTS The positional accuracy was assessed using a CNC machine to collect a grid of points with 25 samples per location. The positional trueness and precision for the hybrid tracking system were [Formula: see text] and [Formula: see text], respectively. The rotational accuracy was assessed through inserting a stylus tracked by all three systems into a hemispherical phantom with cylindrical openings at known angles and collecting 25 samples per cylinder for each system. The rotational trueness and precision for the hybrid tracking system were [Formula: see text] and [Formula: see text], respectively. The difference in position and rotational trueness between the OTS and the hybrid tracking system was [Formula: see text] and [Formula: see text], respectively. CONCLUSIONS We developed a hybrid tracking system that allows the pose of optically tracked surgical instruments to be known within a first-person HMD visualization system, achieving submillimeter accuracy. This research validated the positional and rotational accuracy of the hybrid tracking system and subsequently the optical tracking and VIVE tracking systems. This work provides a method to determine the position of an optically tracked surgical tool with a surgically acceptable accuracy within a low-cost commercial-grade video pass-through HMD. The hybrid tracking system provides the foundation for the continued development of virtual reality or augmented virtuality surgical navigation systems for training or practicing surgical techniques.
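The co-registration described above is, at its core, a point-based rigid registration between two tracker coordinate frames. The sketch below is an illustration under stated assumptions, not the published implementation (the fiducial coordinates are made up); it shows the standard SVD-based (Kabsch) solution together with the fiducial registration error commonly used to summarize such a registration.

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) such that R @ src_i + t approximates dst_i.
    src, dst: (N, 3) arrays of corresponding fiducial points."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """Root-mean-square residual after applying the estimated transform."""
    residuals = (R @ np.asarray(src, dtype=float).T).T + t - np.asarray(dst, dtype=float)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

# Hypothetical co-calibration points (mm) of the same divots seen by both systems.
optical_pts = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], dtype=float)
vive_pts = optical_pts + np.array([100.0, -20.0, 5.0])  # pure translation, for illustration only

R, t = rigid_registration(optical_pts, vive_pts)
print("estimated translation:", np.round(t, 3))
print("FRE (mm):", fiducial_registration_error(optical_pts, vive_pts, R, t))
```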
97
Liebmann F, Roner S, von Atzigen M, Scaramuzza D, Sutter R, Snedeker J, Farshad M, Fürnstahl P. Pedicle screw navigation using surface digitization on the Microsoft HoloLens. Int J Comput Assist Radiol Surg 2019; 14:1157-1165. [PMID: 30993519 DOI: 10.1007/s11548-019-01973-7] [Citation(s) in RCA: 90] [Impact Index Per Article: 18.0] [Received: 01/30/2019] [Accepted: 04/04/2019] [Indexed: 12/24/2022]
Abstract
PURPOSE In spinal fusion surgery, imprecise placement of pedicle screws can result in poor surgical outcome or may seriously harm a patient. Patient-specific instruments and optical systems have been proposed to improve precision through surgical navigation compared with freehand insertion. However, existing solutions are expensive and cannot provide in situ visualizations. Recent technological advances have enabled the production of more powerful and precise optical see-through head-mounted displays for the mass market. The purpose of this laboratory study was to evaluate whether such a device is sufficiently precise for the navigation of lumbar pedicle screw placement. METHODS A novel navigation method, tailored to run on the Microsoft HoloLens, was developed. It comprises capturing the intraoperatively reachable surface of vertebrae to achieve registration, and tool tracking with real-time visualizations, without the need for intraoperative imaging. For both surface sampling and navigation, 3D-printable parts equipped with fiducial markers were employed. Accuracy was evaluated within a self-built setup based on two phantoms of the lumbar spine. Computed tomography (CT) scans of the phantoms were acquired to carry out preoperative planning of screw trajectories in 3D. A surgeon placed the guiding wire for the pedicle screw bilaterally on ten vertebrae guided by the navigation method. Postoperative CT scans were acquired to compare trajectory orientation (3D angle) and screw insertion points (3D distance) with respect to the planning. RESULTS The mean errors between planned and executed screw insertion were [Formula: see text] for the screw trajectory orientation and 2.77±1.46 mm for the insertion points. The mean time required for surface digitization was 125±27 s. CONCLUSIONS These first promising results under laboratory conditions indicate that precise lumbar pedicle screw insertion can be achieved by combining the HoloLens with our proposed navigation method. As a next step, cadaver experiments need to be performed to confirm the precision on real patient anatomy.
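The registration step described above aligns a sparse set of digitized surface points with the preoperative CT model. The following is a rough sketch of that general idea using a simple ICP loop (nearest-neighbour matching plus an SVD rigid fit per iteration); it is not the authors' implementation, and the surface data are placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Rigid transform (R, t) minimizing sum ||R @ src_i + t - dst_i||^2 (SVD / Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - src_c).T @ (dst - dst_c))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

def icp(digitized, model_surface, iterations=50, tol=1e-6):
    """Align sparse digitized points to a denser model point cloud by iterating
    nearest-neighbour matching and rigid fitting; returns the accumulated (R, t)."""
    tree = cKDTree(model_surface)
    src = np.array(digitized, dtype=float)
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err, err = np.inf, np.inf
    for _ in range(iterations):
        dist, idx = tree.query(src)                  # closest model point for each sample
        R, t = best_fit_transform(src, model_surface[idx])
        src = (R @ src.T).T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = float(dist.mean())
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, err

# Placeholder data: a dense "vertebra surface" point cloud from the CT model and a
# handful of intraoperatively digitized points, offset by a known small translation.
rng = np.random.default_rng(0)
model = rng.random((2000, 3)) * 40.0                 # mm
samples = model[rng.choice(len(model), size=30, replace=False)] + np.array([1.5, -1.0, 0.5])

R, t, residual = icp(samples, model)
print(f"mean nearest-surface residual after registration: {residual:.2f} mm")
```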
Affiliation(s)
- Florentin Liebmann, Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Simon Roner, Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland; Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Marco von Atzigen, Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Davide Scaramuzza, Department of Informatics, University of Zurich, Zurich, Switzerland; Department of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Reto Sutter, Radiology Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Jess Snedeker, Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland; Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Mazda Farshad, Orthopaedic Department, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Philipp Fürnstahl, Computer Assisted Research and Development Group, Balgrist University Hospital, University of Zurich, Forchstrasse 340, 8008 Zurich, Switzerland
98
Aaskov J, Kawchuk GN, Hamaluik KD, Boulanger P, Hartvigsen J. X-ray vision: the accuracy and repeatability of a technology that allows clinicians to see spinal X-rays superimposed on a person's back. PeerJ 2019; 7:e6333. [PMID: 30783566 PMCID: PMC6377589 DOI: 10.7717/peerj.6333] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Received: 09/24/2018] [Accepted: 12/21/2018] [Indexed: 12/03/2022]
Abstract
Objective Since the discovery of ionizing radiation, clinicians have evaluated X-ray images separately from the patient. The objective of this study was to investigate the accuracy and repeatability of a new technology that seeks to resolve this historic limitation by projecting anatomically correct X-ray images onto a person's skin. Methods A total of 13 participants enrolled in the study, each having a pre-existing anteroposterior lumbar X-ray. Each participant's image was uploaded into the HoloLens mixed reality system which, when worn, allowed a single examiner to view a participant's own X-ray superimposed on the participant's back. The projected image was topographically corrected using depth information obtained by the HoloLens system and then aligned via existing anatomic landmarks. Using this superimposed image, vertebral levels were identified and validated against spinous process locations obtained by ultrasound. This process was repeated 1-5 days later. The projection of each vertebra was deemed to be "on-target" if it fell within the known morphological dimensions of the spinous process for that specific vertebral level. Results The projection system created on-target projections with respect to individual vertebral levels 73% of the time, with no significant difference seen between testing sessions. The average repeatability for all vertebral levels between testing sessions was 77%. Conclusion These data suggest that projecting X-rays directly onto the skin is sufficiently accurate and repeatable to identify underlying anatomy and, as such, has the potential to place radiological evaluation within the patient context. Future opportunities to improve this procedure will focus on mitigating potential sources of error.
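As a toy illustration of the bookkeeping behind the accuracy and repeatability figures, the sketch below classifies each projected level as "on-target" when its offset from the ultrasound-verified spinous process lies within an assumed per-level tolerance, and reads repeatability as simple percent agreement between sessions. All thresholds and offsets are hypothetical, and this is only one plausible reading of the study's definitions.

```python
import numpy as np

# Assumed per-level tolerance in mm (illustrative values only, not from the study).
tolerance_mm = {"L1": 6.0, "L2": 6.0, "L3": 6.5, "L4": 7.0, "L5": 7.5}

def on_target(offsets_mm):
    """Map per-level projection offsets to True/False on-target calls."""
    return {lvl: offsets_mm[lvl] <= tol for lvl, tol in tolerance_mm.items()}

def percent_true(flags):
    """Percentage of levels classified as on-target."""
    return 100.0 * float(np.mean(list(flags.values())))

# Hypothetical offsets (mm) for one participant in two sessions, days apart.
session1 = {"L1": 4.1, "L2": 5.9, "L3": 8.2, "L4": 3.3, "L5": 7.0}
session2 = {"L1": 5.0, "L2": 6.4, "L3": 7.9, "L4": 2.8, "L5": 6.9}

calls1, calls2 = on_target(session1), on_target(session2)
accuracy = (percent_true(calls1) + percent_true(calls2)) / 2.0
agreement = 100.0 * float(np.mean([calls1[lvl] == calls2[lvl] for lvl in tolerance_mm]))
print(f"on-target rate: {accuracy:.0f}%  between-session agreement: {agreement:.0f}%")
```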
Affiliation(s)
- Jacob Aaskov, Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark
- Gregory N Kawchuk, Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark; Physical Therapy, University of Alberta, Edmonton, AB, Canada; Nordic Institute of Chiropractic and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark
- Jan Hartvigsen, Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark; Nordic Institute of Chiropractic and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark