1
Bonanni M, Russo G, De Siati M, Tomao F, Massaro G, Benedetto D, Longoni M, Matteucci A, Maffi V, Mariano EG, Di Luozzo M, Chiricolo G, Maisano F, Sangiorgi GM. Holographic mixed reality for planning transcatheter aortic valve replacement. Int J Cardiol 2024;412:132330. [PMID: 38964558; DOI: 10.1016/j.ijcard.2024.132330]
Abstract
BACKGROUND Using three-dimensional (3D) modalities for optimal pre-procedural planning in transcatheter aortic valve replacement (TAVR) is critical for procedural success. However, current methods rely on visualizing images on a two-dimensional screen, using shading and colors to create the illusion of 3D, which can impede accurate comprehension of the true anatomical structures. In contrast, new Mixed Reality (MxR) based software enables accurate 3D visualization, image manipulation, and quantification of measurements. AIMS The study aims to evaluate the feasibility, reproducibility, and accuracy of aortic valve complex dimensions measured with a new holographic MxR software (ARTICOR®, Artiness srl, Milano, Italy) compared with a widely used software for pre-operative sizing and planning (3mensio Medical Imaging BV, Bilthoven, The Netherlands). METHODS This retrospective, observational, double-center study enrolled 100 patients with severe aortic stenosis who underwent cardiac computed tomography (CCT) before TAVR. The CCT datasets of volumetric aortic valve images were analyzed with 3mensio and with the newly introduced MxR-based software. RESULTS Ninety-eight percent of the CCT datasets were successfully converted into holographic models. Agreement between the two software systems was higher for linear metrics (short, long, and average diameter) and lower for area, perimeter, and annulus-to-coronary-ostia distance measurements. Notably, the annulus area, annular perimeter, left ventricular outflow tract (LVOT) area, and LVOT perimeter were consistently and significantly smaller with the MxR-based software than with 3mensio. Excellent interobserver reliability was demonstrated for most measurements, especially direct linear measurements. CONCLUSIONS Linear measurements of the aortic valve complex obtained with MxR-based software are reproducible and consistent with standard CCT dataset analysis in 3mensio.
MxR-based software could therefore represent an accurate tool for pre-procedural TAVR planning.
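The agreement analysis described above is typically reported as a bias and 95% limits of agreement between paired measurements from the two systems. A minimal, self-contained sketch of that computation (not the study's actual code; all measurement values below are hypothetical):

```python
# Bland-Altman-style agreement between paired annulus measurements from
# two software systems. Data are hypothetical, for illustration only.
import statistics

def bland_altman(a, b):
    """Return (bias, lower 95% LoA, upper 95% LoA) for paired readings a, b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical mean-diameter readings (mm) from the two systems
mxr    = [23.1, 25.4, 21.8, 24.0, 22.5]
mensio = [23.4, 25.6, 22.1, 24.3, 22.9]
bias, lo, hi = bland_altman(mxr, mensio)
print(f"bias = {bias:.2f} mm, 95% LoA = [{lo:.2f}, {hi:.2f}] mm")
```

A systematically negative bias, as in this toy example, would correspond to the abstract's finding that MxR measurements were consistently smaller.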
Affiliation(s)
- Michela Bonanni: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Giulio Russo: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Matteo De Siati: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Flavia Tomao: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Gianluca Massaro: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Daniela Benedetto: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Matteo Longoni: Heart Valve Center, Cardio-Thoracic-Vascular Department, IRCCS San Raffaele Scientific Institute, Milan 20132, Italy
- Andrea Matteucci: Department of System and Experimental Medicine, University of Rome "Tor Vergata", 00133 Rome, Italy
- Valerio Maffi: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Enrica Giuliana Mariano: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Marco Di Luozzo: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Gaetano Chiricolo: Department of Biomedicine and Prevention, University of Rome "Tor Vergata", 00133 Rome, Italy
- Francesco Maisano: Heart Valve Center, Cardio-Thoracic-Vascular Department, IRCCS San Raffaele Scientific Institute, Milan 20132, Italy
2
Tohi Y, Okazoe H, Mitamura K, Osaki Y, Tanaka K, Matsuoka Y, Nishiyama Y, Kanenishi K, Sugimoto M. Successful laparoscopic retroperitoneal tumor resection using mixed reality and guiding marker techniques. IJU Case Rep 2024;7:320-323. [PMID: 38966773; PMCID: PMC11221929; DOI: 10.1002/iju5.12735]
Abstract
Introduction Small tumors may be difficult to identify visually and require preoperative effort to locate. Recent advances in mixed reality technology have improved surgical accuracy across various specialties. Here, we present the application of mixed reality-assisted surgery and a guiding marker in a case of small retroperitoneal metastasis of uterine cancer. Case presentation A 67-year-old woman with a history of uterine cancer had a retroperitoneal metastasis in the lateroconal fascia near the right diaphragm, measuring 2 cm and infiltrating the peritoneum. We performed precise surgical planning using the preoperative mixed reality software "Holoeyes" on a HoloLens 2 head-mounted display. Novel techniques, including ultrasonography-guided placement of a guiding marker and strategic port-site placement facilitated by the HoloLens 2, ensured accurate tumor identification and laparoscopic resection with minimal blood loss and no intraoperative complications. Conclusion Mixed reality-assisted surgery combined with a guiding marker effectively enhanced the precision of retroperitoneal tumor resection.
Affiliation(s)
- Yoichiro Tohi: Department of Urology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Homare Okazoe: Department of Urology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Katsuya Mitamura: Department of Radiology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Yu Osaki: Department of Urology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Kenichi Tanaka: Department of Radiology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Yuki Matsuoka: Department of Urology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Yoshihiro Nishiyama: Department of Radiology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Kenji Kanenishi: Department of Perinatology and Gynecology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
- Mikio Sugimoto: Department of Urology, Faculty of Medicine, Kagawa University, Kita-gun, Kagawa, Japan
3
Hamza H, Al-Ansari A, Navkar NV. Technologies Used for Telementoring in Open Surgery: A Scoping Review. Telemed J E Health 2024;30:1810-1824. [PMID: 38546446; DOI: 10.1089/tmj.2023.0669]
Abstract
Background: Telementoring technologies enable a remote mentor to guide a mentee in real time during surgical procedures. This addresses challenges such as lack of expertise and limited surgical training and education opportunities in remote locations. This review aims to provide a comprehensive account of these technologies tailored for open surgery. Methods: A comprehensive scoping review of the scientific literature was conducted using the PubMed, ScienceDirect, ACM Digital Library, and IEEE Xplore databases. Broad and inclusive searches were performed to identify articles reporting telementoring or teleguidance technologies in open surgery. Results: Screening of the search results yielded 43 articles describing surgical telementoring for the open approach. The studies were categorized by type of open surgery (surgical specialty, surgical procedure, and stage of clinical trial), telementoring technology used (information transferred between mentor and mentee, devices used for rendering the information), and assessment of the technology (experience level of mentor and mentee, study design, and assessment criteria). The majority of the telementoring technologies focused on trauma-related surgeries, and mixed reality headsets were commonly used to render information (telestrations, surgical tools, or hand gestures) to the mentee. These technologies were primarily assessed on high-fidelity synthetic phantoms. Conclusions: Despite longer operative times, these telementoring technologies demonstrated clinical viability during open surgeries through improved performance and confidence of the mentee. In general, the use of immersive devices and annotations appears promising, although further clinical trials will be required to thoroughly assess their benefits.
Affiliation(s)
- Hawa Hamza: Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Nikhil V Navkar: Department of Surgery, Hamad Medical Corporation, Doha, Qatar
4
Wang X, Yang C, Liu Z, Zhang J, Xue C, Xing L, Zheng Y, Geng C, Yin X. R-MFE-TCN: A correlation prediction model between body surface and tumor during respiratory movement. Med Phys 2024. [PMID: 38801342; DOI: 10.1002/mp.17183]
Abstract
BACKGROUND 2D CT image-guided radiofrequency ablation (RFA) is a promising minimally invasive treatment that can destroy liver tumors without removing them. However, CT images provide only limited static information, and the tumor moves with the patient's respiration. Accurately locating tumors under free-breathing conditions is therefore an urgent, unsolved problem. PURPOSE The purpose of this study is to propose a respiratory correlation prediction model for a mixed reality surgical assistance system, the Riemannian and Multivariate Feature Enhanced Temporal Convolutional Network (R-MFE-TCN), and to achieve accurate respiratory correlation prediction. METHODS The model adopts a respiration-oriented Riemannian information enhancement strategy to expand the diversity of the dataset. A new Multivariate Feature Enhancement (MFE) module is proposed to retain respiratory information so that the network can fully exploit the correlation between internal and external data: a dual-channel structure retains multivariate respiratory features, and multi-headed self-attention captures periodic peak-to-valley information of the respiratory signal. This information significantly improves the prediction performance of the network. The PSO algorithm is used for hyperparameter optimization. In the experiments, internal and external respiratory motion trajectories of seven patients were obtained from the dataset, with the first six patients selected as the training set. The respiratory signal sampling frequency was 21 Hz.
RESULTS Extensive experiments on the dataset demonstrate the good performance of this method, which improves prediction accuracy while remaining robust. The method reduces delay deviation under long-window prediction: at a 400 ms horizon, the average RMSE and MAE are 0.0453 and 0.0361 mm, respectively, better than competing methods. CONCLUSION The R-MFE-TCN can be extended to respiratory correlation prediction in different clinical situations, meeting the accuracy requirements for respiratory delay prediction in surgical assistance.
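The building block of any TCN, including the R-MFE-TCN named above, is the dilated causal convolution: each output depends only on current and past samples, with the dilation widening the receptive field. A minimal illustrative sketch (the paper's model additionally uses Riemannian augmentation, MFE modules, and self-attention, none of which is reproduced here):

```python
# Dilated causal 1-D convolution: y[t] = sum_k w[k] * x[t - k*dilation],
# zero-padded on the left so no future sample leaks into the output.
def causal_dilated_conv1d(x, kernel, dilation):
    out = []
    for t in range(len(x)):
        s = 0.0
        for k, w in enumerate(kernel):
            idx = t - k * dilation
            if idx >= 0:          # indices before the signal start are zero
                s += w * x[idx]
        out.append(s)
    return out

signal = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0]   # toy respiratory trace
print(causal_dilated_conv1d(signal, [0.5, 0.5], dilation=2))
```

Stacking such layers with geometrically increasing dilations (1, 2, 4, ...) is what lets a TCN cover long prediction windows such as the 400 ms horizon reported above.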
Affiliation(s)
- Xuehu Wang: College of Electronic and Information Engineering, Hebei University, Baoding, China; Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Chang Yang: College of Electronic and Information Engineering, Hebei University, Baoding, China; Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Ziqi Liu: College of Electronic and Information Engineering, Hebei University, Baoding, China; Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Jushuo Zhang: College of Electronic and Information Engineering, Hebei University, Baoding, China; Key Laboratory of Digital Medical Engineering of Hebei Province, Baoding, China
- Chao Xue: Senior Department of Orthopedics, the Fourth Medical Center of PLA General Hospital, Beijing, China
- Lihong Xing: Affiliated Hospital of Hebei University, Baoding, China
- Yongchang Zheng: Department of Liver Surgery, Peking Union Medical College Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College (CAMS & PUMC), Beijing, China
- Chen Geng: Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- Xiaoping Yin: Affiliated Hospital of Hebei University, Baoding, China
5
Dino MJS, Dion KW, Abadir PM, Budhathoki C, Huang CM, Padula WV, Himmelfarb CRD, Davidson PM. The impact of a mixed reality technology-driven health enhancing physical activity program among community-dwelling older adults: a study protocol. Front Public Health 2024;12:1383407. [PMID: 38807990; PMCID: PMC11130374; DOI: 10.3389/fpubh.2024.1383407]
Abstract
Background Physical inactivity and a sedentary lifestyle among community-dwelling older adults pose a greater risk of progressive physical and cognitive decline. Mixed reality technology-driven health enhancing physical activities, such as the use of virtual coaches, provide an emerging and promising way to support a healthy lifestyle, but their impact is not yet well understood. Methods and analysis An observational explanatory sequential mixed-method research design was conceptualized to examine the potential impact of a user-preferred mixed reality technology-driven health enhancing physical activity program directed toward purposively selected community-dwelling older adults in two senior centers in the Philippines. The quantitative components of the study will be a discrete choice experiment and a quasi-experimental study. A total of 128 older adults (64 per center) will be recruited via posters at community senior centers and will undergo additional screening or health-record review by a certified gerontologist to ensure safety and proper fit. Treatments (live coaching with video-based exercise versus mixed reality technology-driven exercise) will be assigned to each of the two senior center sites for the quasi-experiment. Participants from the experimental group will take part in the discrete choice experiment, modeling, and usability evaluations. Finally, a qualitative sample of participants (n = 6) will be purposively selected from the experimental group as key informants. Discussion This study protocol will examine the health impact of a promising mixed reality program for health promotion among older adults. The study uses a human-centered mixed-method research design for technology development and evaluation in the context of developing nations. Clinical trial registration: ClinicalTrials.gov, identifier NCT06136468.
Affiliation(s)
- Michael Joseph S. Dino: School of Nursing, Johns Hopkins University, Baltimore, MD, United States; Research, Development, and Innovation Center, Our Lady of Fatima University, Valenzuela, Philippines; Sigma Theta Tau, International Honor Society in Nursing, Indianapolis, IN, United States
- Kenneth W. Dion: School of Nursing, Johns Hopkins University, Baltimore, MD, United States; Sigma Theta Tau, International Honor Society in Nursing, Indianapolis, IN, United States
- Peter M. Abadir: School of Medicine, Johns Hopkins University, Baltimore, MD, United States
- Chakra Budhathoki: School of Nursing, Johns Hopkins University, Baltimore, MD, United States; Sigma Theta Tau, International Honor Society in Nursing, Indianapolis, IN, United States
- Chien-Ming Huang: Department of Computer Science, Johns Hopkins University, Baltimore, MD, United States
- William V. Padula: Department of Pharmaceutical and Health Economics, University of Southern California School of Pharmacy, Los Angeles, CA, United States
- Cheryl R. Dennison Himmelfarb: School of Nursing, Johns Hopkins University, Baltimore, MD, United States; Sigma Theta Tau, International Honor Society in Nursing, Indianapolis, IN, United States
- Patricia M. Davidson: School of Nursing, Johns Hopkins University, Baltimore, MD, United States; Sigma Theta Tau, International Honor Society in Nursing, Indianapolis, IN, United States; Office of the Vice Chancellor and President, University of Wollongong, Wollongong, NSW, Australia
6
Heining SM, Raykov V, Wolff O, Alkadhi H, Pape HC, Wanner GA. Augmented reality-based surgical navigation of pelvic screw placement: an ex-vivo experimental feasibility study. Patient Saf Surg 2024;18:3. [PMID: 38229102; DOI: 10.1186/s13037-023-00385-6]
Abstract
BACKGROUND Minimally invasive surgical treatment of pelvic trauma requires a significant level of surgical training and technical expertise. Novel imaging and navigation technologies have always driven surgical technique, and with head-mounted displays now commercially available, assessment of such Augmented Reality (AR) devices in a specific surgical setting is appropriate. METHODS In this ex-vivo feasibility study, an AR-based surgical navigation system was assessed in a specific clinical scenario with standard pelvic and acetabular screw pathways. The system has the following components: an optical see-through head-mounted display, specifically designed modular AR software, and surgical tool tracking using pose estimation with synthetic square markers. RESULTS The success rate for entry point navigation was 93.8%, the overall translational deviation of drill pathways was 3.99 ± 1.77 mm, and the overall rotational deviation of drill pathways was 4.3 ± 1.8°. There was no relevant theoretical screw perforation, as shown by 88.7% Grade 0-1 and 100% Grade 0-2 ratings in our pelvic screw perforation score. Regarding screw length, 103 ± 8% of the planned pathway length was realized successfully. CONCLUSION The novel system assessed in this experimental study provided proof of concept for the feasibility of percutaneous screw placement in the pelvis and could thus easily be adapted to a specific clinical scenario. The system showed performance comparable to other computer-aided solutions while providing specific advantages such as true 3D vision without intraoperative radiation; however, it needs further improvement and must still undergo regulatory approval. Future work includes intraoperative registration and optimized tool tracking.
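The rotational deviation reported above is the angle between the planned and the actually drilled screw axis. A hedged sketch of that computation (the axis vectors here are hypothetical, not study data):

```python
# Angle in degrees between a planned and an actual drill-axis direction,
# computed from the dot product of the two (not necessarily unit) vectors.
import math

def angular_deviation_deg(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))   # clamp for float safety
    return math.degrees(math.acos(c))

planned = (0.0, 0.0, 1.0)
actual  = (0.05, 0.0, 1.0)   # slightly tilted pathway
print(f"{angular_deviation_deg(planned, actual):.2f} deg")
```

The translational deviation would be computed analogously as a point-to-line or point-to-point distance between the planned and realized entry points.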
Affiliation(s)
- Vladislav Raykov: Department of Orthopedics & Traumatology, Landeskrankenhaus Bludenz, Bludenz, Austria
- Oliver Wolff: Hochschule Luzern Technik & Architektur, Luzern, Switzerland
- Hatem Alkadhi: Department of Radiology, University Hospital Zurich, Zurich, Switzerland
- Guido A Wanner: Spine Clinic & Traumatology, Private Hospital Bethanien, Swiss Medical Network, Zurich, Switzerland
7
Liebmann F, von Atzigen M, Stütz D, Wolf J, Zingg L, Suter D, Cavalcanti NA, Leoty L, Esfandiari H, Snedeker JG, Oswald MR, Pollefeys M, Farshad M, Fürnstahl P. Automatic registration with continuous pose updates for marker-less surgical navigation in spine surgery. Med Image Anal 2024;91:103027. [PMID: 37992494; DOI: 10.1016/j.media.2023.103027]
Abstract
Established surgical navigation systems for pedicle screw placement have been proven to be accurate, but still reveal limitations in registration or surgical guidance. Registration of preoperative data to the intraoperative anatomy remains a time-consuming, error-prone task that includes exposure to harmful radiation. Surgical guidance through conventional displays has well-known drawbacks, as information cannot be presented in-situ and from the surgeon's perspective. Consequently, radiation-free and more automatic registration methods with subsequent surgeon-centric navigation feedback are desirable. In this work, we present a marker-less approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner. A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models, which is then refined for each vertebra individually and updated in real time with GPU acceleration while handling surgeon occlusions. Intuitive surgical guidance is provided through integration into an augmented reality-based navigation system. The registration method was verified on a public dataset with a median of 100% successful registrations, a median target registration error of 2.7 mm, a median screw trajectory error of 1.6°, and a median screw entry point error of 2.3 mm. Additionally, the whole pipeline was validated in an ex-vivo surgery, yielding 100% screw accuracy and a median target registration error of 1.0 mm. Our results meet clinical demands and emphasize the potential of RGB-D data for fully automatic registration approaches in combination with augmented reality guidance.
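Target registration error (TRE), the headline metric above, is simply the Euclidean distance between corresponding target points after registration, summarized here by the median. An illustrative sketch with hypothetical landmark coordinates (not the paper's data or code):

```python
# TRE: per-landmark Euclidean distance between registered and ground-truth
# positions (mm), summarized by the median. Points below are hypothetical.
import math
import statistics

def tre_mm(registered, ground_truth):
    return [math.dist(p, q) for p, q in zip(registered, ground_truth)]

reg = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 1.0)]
gt  = [(1.0, 0.0, 0.0), (10.0, 2.0, 0.0), (0.0, 10.0, 0.0)]
errors = tre_mm(reg, gt)
print(f"median TRE = {statistics.median(errors):.1f} mm")
```

The median (rather than the mean) is robust to the occasional badly registered vertebra, which is presumably why the paper reports medians.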
Affiliation(s)
- Florentin Liebmann: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Marco von Atzigen: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Dominik Stütz: Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland
- Julian Wolf: Product Development Group, ETH Zurich, Zurich, Switzerland
- Lukas Zingg: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Daniel Suter: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Nicola A Cavalcanti: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Laura Leoty: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Hooman Esfandiari: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Jess G Snedeker: Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland
- Martin R Oswald: Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Computer Vision Lab, University of Amsterdam, Amsterdam, Netherlands
- Marc Pollefeys: Computer Vision and Geometry Group, ETH Zurich, Zurich, Switzerland; Microsoft Mixed Reality and AI Zurich Lab, Zurich, Switzerland
- Mazda Farshad: Department of Orthopedics, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
- Philipp Fürnstahl: Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland
8
Ramalhinho J, Yoo S, Dowrick T, Koo B, Somasundaram M, Gurusamy K, Hawkes DJ, Davidson B, Blandford A, Clarkson MJ. The value of Augmented Reality in surgery - A usability study on laparoscopic liver surgery. Med Image Anal 2023;90:102943. [PMID: 37703675; PMCID: PMC10958137; DOI: 10.1016/j.media.2023.102943]
Abstract
Augmented Reality (AR) is considered a promising technology for the guidance of laparoscopic liver surgery. By overlaying pre-operative 3D information of the liver and internal blood vessels on the laparoscopic view, surgeons can better understand the location of critical structures. In an effort to enable AR, several authors have focused on developing methods to obtain an accurate alignment between the laparoscopic video image and the pre-operative 3D data of the liver, without assessing the benefit that the resulting overlay can provide during surgery. In this paper, we present a study that aims to assess quantitatively and qualitatively the value of an AR overlay in laparoscopic surgery during a simulated surgical task on a phantom setup. We designed a study in which participants were asked to physically localise pre-operative tumours in a liver phantom under three image guidance conditions: a baseline condition without any image guidance, a condition where the 3D surfaces of the liver are aligned to the video and displayed on a black background, and a condition where video see-through AR is displayed on the laparoscopic video. Using data collected from a cohort of 24 participants, including 12 surgeons, we observe that compared to the baseline, AR decreases the median localisation error of surgeons on non-peripheral targets from 25.8 mm to 9.2 mm. From subjective feedback, we also identify that AR improves usability in the surgical task and increases the perceived confidence of the users. Between the two tested displays, the majority of participants preferred the AR overlay to the navigated view of the 3D surfaces on a separate screen. We conclude that AR has the potential to improve performance and decision making in laparoscopic surgery, and that improvements in overlay alignment accuracy and depth perception should be pursued in the future.
Affiliation(s)
- João Ramalhinho: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Soojeong Yoo: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom; UCL Interaction Centre, University College London, London, United Kingdom
- Thomas Dowrick: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Bongjin Koo: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Murali Somasundaram: Division of Surgery and Interventional Sciences, University College London, London, United Kingdom
- Kurinchi Gurusamy: Division of Surgery and Interventional Sciences, University College London, London, United Kingdom
- David J Hawkes: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
- Brian Davidson: Division of Surgery and Interventional Sciences, University College London, London, United Kingdom
- Ann Blandford: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom; UCL Interaction Centre, University College London, London, United Kingdom
- Matthew J Clarkson: Wellcome EPSRC Centre for Interventional and Surgical Sciences, University College London, London, United Kingdom
9
Fan X, Tao B, Tu P, Shen Y, Wu Y, Chen X. A novel mixed reality-guided dental implant placement navigation system based on virtual-actual registration. Comput Biol Med 2023;166:107560. [PMID: 37847946; DOI: 10.1016/j.compbiomed.2023.107560]
Abstract
BACKGROUND The key to successful dental implant surgery is to place the implants accurately along the preoperatively planned paths. Surgical navigation systems can significantly improve the safety and accuracy of implantation. However, the surgeon's view must frequently shift between the surgical site and the computer screen, a problem mixed reality technology is expected to solve: a HoloLens device can align the virtual three-dimensional (3D) image with the actual surgical site in the same field of view. METHODS This study used mixed reality technology to enhance dental implant surgery navigation. The first step was to reconstruct a virtual 3D model from pre-operative cone-beam CT (CBCT) images. The relative positions between objects were then obtained using the navigation device and the HoloLens camera. Via virtual-actual registration algorithms, the transformation matrices between the HoloLens device and the navigation tracker were acquired through HoloLens-tracker registration, and the transformation matrices between the virtual model and the patient phantom through image-phantom registration. In addition, a surgical drill calibration algorithm provided the transformation matrices between the surgical drill and the patient phantom. These algorithms allow real-time tracking of the surgical drill's location and orientation relative to the patient phantom under the navigation device. With the aid of the HoloLens 2, virtual 3D images and the actual patient phantom can be aligned accurately, providing surgeons with a clear visualization of the implant path. RESULTS Phantom experiments were conducted using 30 patient phantoms, with a total of 102 dental implants inserted.
Comparisons between the actual and pre-operatively planned implant paths showed that our system achieved a coronal deviation of 1.507 ± 0.155 mm, an apical deviation of 1.542 ± 0.143 mm, and an angular deviation of 3.468 ± 0.339°. The deviation was not significantly different from that of navigation-guided dental implant placement and better than that of freehand placement. CONCLUSION Our proposed system integrates the pre-operatively planned dental implant paths with the patient phantom, helping surgeons achieve adequate accuracy compared with traditional dental implant surgery. Furthermore, the system is expected to be applicable to animal and cadaveric experiments in further studies.
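Chaining transformation matrices, as described in the methods above, amounts to multiplying 4x4 homogeneous transforms so that the drill pose can be expressed in the phantom's coordinate frame. A hedged sketch with hypothetical pure-translation transforms (the actual registrations also involve rotations estimated from tracked markers):

```python
# Compose 4x4 homogeneous transforms: T_phantom_drill =
# T_phantom_tracker @ T_tracker_drill. Matrices here are hypothetical
# translations; real registration matrices also carry rotations.
def matmul4(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

T_phantom_tracker = translation(5, 0, 0)   # from image-phantom registration
T_tracker_drill   = translation(0, 2, 0)   # from drill calibration
T_phantom_drill   = matmul4(T_phantom_tracker, T_tracker_drill)
print([row[3] for row in T_phantom_drill[:3]])   # drill origin in phantom frame
```

The same composition, extended with the HoloLens-tracker registration, is what lets the headset render the planned path on the physical phantom.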
Affiliation(s)
- Xingqi Fan
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baoxin Tao
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Puxun Tu
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Yihan Shen
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Yiqun Wu
- Department of Second Dental Center, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Xiaojun Chen
- Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China

10
Verhellen A, Elprama SA, Scheerlinck T, Van Aerschot F, Duerinck J, Van Gestel F, Frantz T, Jansen B, Vandemeulebroucke J, Jacobs A. Exploring technology acceptance of head-mounted device-based augmented reality surgical navigation in orthopaedic surgery. Int J Med Robot 2023:e2585. [PMID: 37830305 DOI: 10.1002/rcs.2585] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/04/2023] [Revised: 09/18/2023] [Accepted: 09/28/2023] [Indexed: 10/14/2023]
Abstract
BACKGROUND This study used the Unified Theory of Acceptance and Use of Technology (UTAUT) to investigate the acceptance of HMD-based AR surgical navigation. METHODS An experiment was conducted in which participants drilled 12 predefined holes using freehand drilling, proprioceptive control, and AR assistance. Technology acceptance was assessed through a survey and non-participant observations. RESULTS Participants' intention to use AR correlated (p < 0.05) with social influence (Spearman's rho (rs) = 0.599), perceived performance improvement (rs = 0.592) and attitude towards AR (rs = 0.542). CONCLUSIONS While most participants acknowledged the potential of AR, they also highlighted persistent barriers to adoption, such as issues related to user-friendliness, time efficiency and device discomfort. To overcome these challenges, future AR surgical navigation systems should focus on enhancing surgical performance while minimising disruptions to workflows and operating times. Engaging orthopaedic surgeons in the development process can facilitate the creation of tailored solutions and accelerate adoption.
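For readers unfamiliar with the statistic reported above, Spearman's rho is simply the Pearson correlation computed on ranks. A small self-contained sketch (the survey responses shown are made up for illustration; real Likert data contain ties, which require average ranks rather than the plain ranking used here):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    No tie correction -- assumes all values within each array are distinct."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical 5-point responses from five participants (illustrative only).
social_influence = np.array([1, 3, 2, 5, 4])
intention_to_use = np.array([2, 3, 1, 5, 4])
print(round(spearman_rho(social_influence, intention_to_use), 3))  # -> 0.9
```

In practice one would use `scipy.stats.spearmanr`, which handles ties and also returns the p-value reported in the study.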
Affiliation(s)
- Thierry Scheerlinck
- Department of Orthopedic Surgery and Traumatology - Research Group BEFY-ORTHO, Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Fiene Van Aerschot
- Department of Orthopedic Surgery and Traumatology - Research Group BEFY-ORTHO, Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Johnny Duerinck
- Department of Neurosurgery - Research Group Center for Neurosciences (C4N-NEUR), Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Frederick Van Gestel
- Department of Neurosurgery - Research Group Center for Neurosciences (C4N-NEUR), Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel, Brussel, Belgium
- Taylor Frantz
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussel, Belgium
- Bart Jansen
- Department of Electronics and Informatics (ETRO), Vrije Universiteit Brussel, Brussel, Belgium
- Jef Vandemeulebroucke
- Department of Radiology - Department of Electronics and Informatics (ETRO), Universitair Ziekenhuis Brussel - Vrije Universiteit Brussel - Imec, Brussel, Belgium
- An Jacobs
- IMEC-SMIT, Vrije Universiteit, Brussel, Belgium

11
Csernátony Z, Manó S, Szabó D, Soósné Horváth H, Kovács ÁÉ, Csámer L. Acetabular Revision with McMinn Cup: Development and Application of a Patient-Specific Targeting Device. Bioengineering (Basel) 2023; 10:1095. [PMID: 37760197 PMCID: PMC10526046 DOI: 10.3390/bioengineering10091095] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/26/2023] [Revised: 09/11/2023] [Accepted: 09/13/2023] [Indexed: 09/29/2023] Open
Abstract
BACKGROUND Surgery for severe periacetabular bone defects (Paprosky ≥ 2B) is a major challenge in current practice. Although solutions are available for this serious clinical problem, each has disadvantages as well as advantages. Our alternative method for reconstructing such extensive defects was a cup with a stem to solve these revision situations. As the instrumentation offered is typically designed for scenarios without a significant bone defect, we developed a unique technique for implantation in cases where reference points are missing. Our hypothesis was that a targeting device designed from a CT scan of the patient's pelvis could facilitate the safe insertion of the guiding wire. METHODS Briefly, our surgical solution consists of a two-step operation. If periacetabular bone loss was found to be significant during revision surgery, all implants were removed, and two titanium marker screws were percutaneously inserted into the anterior iliac crest. Next, a CT scan of the pelvis was performed, applying the metal artifact removal (MAR) algorithm. Based on that scan, the dimensions and positioning of the cup to be inserted were determined, and a patient-specific 3D printed targeting device made of biocompatible material was created to safely insert the guidewire, which is essential to the implantation process. RESULTS The medical, engineering, and technical tasks related to the design, the surgical technique, and experiences from 17 surgical cases between February 2018 and July 2021 are reported. There were no surgical complications in any case. The implant had to be removed for septic reasons (independently of the technique) in a single case, consistent with the septic statistics for this type of surgery. There was no perforation of the linea terminalis of the pelvis attributable to the guiding method. Wound healing was uneventful in all patients, and the implants were fixed securely. Following rehabilitation, the joints were able to bear weight again. After one to four years of follow-up, patient satisfaction was high, and gait function improved markedly in all cases. CONCLUSIONS Our results show that CT-based virtual surgical planning and, based on it, the use of a patient-specific 3D printed aiming device is a reliable method for major hip surgeries with significant bone loss. This technique has also made it possible to perform these operations with minimal X-ray exposure.
Affiliation(s)
- Zoltán Csernátony
- Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Sándor Manó
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Dániel Szabó
- Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Hajnalka Soósné Horváth
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Ágnes Éva Kovács
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary
- Loránd Csámer
- Laboratory of Biomechanics, Department of Orthopaedics and Traumatology, Faculty of Medicine, University of Debrecen, H-4032 Debrecen, Hungary

12
Timóteo R, Pinto D, Martinho M, Gouveia P, Lopes DS, Mavioso C, Cardoso MJ. Skin deformation analysis for pre-operative planning of DIEAP flap reconstruction surgery. Med Eng Phys 2023; 119:104025. [PMID: 37634903 DOI: 10.1016/j.medengphy.2023.104025] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2023] [Revised: 07/11/2023] [Accepted: 07/19/2023] [Indexed: 08/29/2023]
Abstract
Deep inferior epigastric artery perforator (DIEAP) flap reconstruction surgeries can potentially benefit from augmented reality (AR) in the context of surgery planning and outcome improvement. Although three-dimensional (3D) models help visualize and map the perforators, the anchorage of the models to the patient's body during surgery does not account for skin deformation occurring between computed tomography angiography (CTA) data acquisition and the patient's position in surgery. In this work, we compared 3D registrations between the supine arms-down pose (CTA position) and the supine arms-at-90° pose (surgical position), estimating the patient's skin deformation. We processed the data sets of 20 volunteers with a 3D rigid registration tool and performed descriptive statistical analysis and statistical inference. With a root mean square of 2.45 mm and a standard deviation of 2.89 mm, the results include deformation above 3 mm in 30% of cases and above 4 mm in 15%. This pose-induced deformation indicates that 3D surface data from the CTA scan position differ from data acquired in loco at the surgical table. Such results indicate that research should be conducted on constructing accurate 3D models from CTA data for display on the patient, while accounting for projection errors when using AR technology.
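The summary statistics quoted in this abstract (RMS, standard deviation, and the fractions of cases exceeding 3 mm and 4 mm) are straightforward to compute from per-case deformation magnitudes. A sketch with invented stand-in values, not the study data:

```python
import numpy as np

# Hypothetical per-case residual skin deformation magnitudes (mm) after
# rigid registration of the two poses -- stand-ins for the study measurements.
deformation_mm = np.array([1.2, 2.0, 2.5, 3.1, 3.6, 4.2, 1.8, 2.2, 2.9, 4.5])

rms = float(np.sqrt(np.mean(deformation_mm ** 2)))      # root mean square
sd = float(np.std(deformation_mm, ddof=1))              # sample standard deviation
frac_above_3 = float(np.mean(deformation_mm > 3.0))     # fraction of cases > 3 mm
frac_above_4 = float(np.mean(deformation_mm > 4.0))     # fraction of cases > 4 mm

print(f"RMS={rms:.2f} mm, SD={sd:.2f} mm, "
      f">3mm: {frac_above_3:.0%}, >4mm: {frac_above_4:.0%}")
```

With real data, `deformation_mm` would hold the per-point or per-case distances produced by the registration tool.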
Affiliation(s)
- Rafaela Timóteo
- Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa, Portugal; Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal
- David Pinto
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal
- Marta Martinho
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal
- Pedro Gouveia
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal; Faculdade de Medicina de Lisboa, Av. Prof. Egas Moniz MB, 1649-028 Lisboa, Portugal
- Daniel Simões Lopes
- Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa, Portugal; INESC ID, Rua Alves Redol 9, 1000-029 Lisboa, Portugal; ITI/LARSyS, Hub Criativo do Beato, Factory Lisbon, Rua da Manutenção 71, Building F S05, 1900-500 Lisboa, Portugal
- Carlos Mavioso
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal
- Maria João Cardoso
- Breast Unit/Digital Surgery Lab, Champalimaud Clinical Centre/Champalimaud Foundation, Avenida Brasília, 1400-038 Lisboa, Portugal

13
RaviChandran N, Teo ZL, Ting DSW. Artificial intelligence enabled smart digital eye wearables. Curr Opin Ophthalmol 2023; 34:414-421. [PMID: 37527195 DOI: 10.1097/icu.0000000000000985] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 08/03/2023]
Abstract
PURPOSE OF REVIEW Smart eyewear is a head-worn wearable device evolving as the next phase of ubiquitous wearables. Although its applications in healthcare are still being explored, it has the potential to revolutionize teleophthalmology care. This review highlights applications in ophthalmology care and discusses future scope. RECENT FINDINGS Smart eyewear packs advanced sensors, optical displays, and processing capabilities into a wearable form factor. Rapid technological developments and the integration of artificial intelligence are expanding its reach from the consumer space to healthcare applications. This review systematically presents applications in treating and managing eye-related conditions, including remote assessments, real-time monitoring, telehealth consultations, and the facilitation of personalized interventions. Smart eyewear also serves as a low-vision assistive device for the visually impaired and can aid physicians with operational and surgical tasks. SUMMARY Wearables such as smart eyewear collect rich, continuous, objective, individual-specific data that are difficult to obtain in a clinical setting. By leveraging sophisticated data processing and artificial-intelligence-based algorithms, these data can be used to identify at-risk patients, recognize behavioral patterns, and make timely interventions. Smart eyewear promises cost-effective, personalized treatment for vision impairments in an effort to mitigate the global burden of eye-related conditions and aging.
Affiliation(s)
- Zhen Ling Teo
- Singapore National Eye Center, Singapore Eye Research Institute
- Daniel S W Ting
- AI and Digital Innovations
- Singapore National Eye Center, Singapore Eye Research Institute
- Duke-NUS Medical School, National University Singapore, Singapore

14
Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023; 85:102757. [PMID: 36706637 DOI: 10.1016/j.media.2023.102757] [Citation(s) in RCA: 17] [Impact Index Per Article: 17.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/22/2022] [Revised: 01/05/2023] [Accepted: 01/18/2023] [Indexed: 01/22/2023]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main player in the recent boost in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 until the end of 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, where AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few of them propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature and paving the way for novel, innovative directions and translation into the medical routine.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany

15
Davis C, Yoo S, Reissis A, Clarkson MJ, Thompson S. Enhanced Surgeons: Understanding the Design of Augmented Reality Instructions for Keyhole Surgery. PROCEEDINGS. IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES 2023; 2023:123-127. [PMID: 37525696 PMCID: PMC7614851 DOI: 10.1109/vrw58643.2023.00031] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 08/02/2023]
Abstract
It is important to understand how to design AR content for surgical contexts to mitigate the risk of distracting the surgeons. In this work, we test information overlays for AR guidance during keyhole surgery. We performed a preliminary evaluation of a prototype, focusing on the effects of colour, opacity, and information representation. Our work contributes insights into the design of AR guidance in surgery settings and a foundation for future research on visualisation design for surgical AR.
Affiliation(s)
- Christoph Davis
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London (UCL), United Kingdom
- Soojeong Yoo
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London (UCL), United Kingdom
- Athena Reissis
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London (UCL), United Kingdom
- Matthew J. Clarkson
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London (UCL), United Kingdom
- Stephen Thompson
- Wellcome / EPSRC Centre for Interventional and Surgical Sciences, University College London (UCL), United Kingdom

16
Yuan J, Hassan SS, Wu J, Koger CR, Packard RRS, Shi F, Fei B, Ding Y. Extended reality for biomedicine. NATURE REVIEWS. METHODS PRIMERS 2023; 3:15. [PMID: 37051227 PMCID: PMC10088349 DOI: 10.1038/s43586-023-00208-z] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 03/06/2023]
Abstract
Extended reality (XR) refers to an umbrella of methods that allows users to be immersed in a three-dimensional (3D) or a 4D (spatial + temporal) virtual environment to different extents, including virtual reality (VR), augmented reality (AR), and mixed reality (MR). While VR allows a user to be fully immersed in a virtual environment, AR and MR overlay virtual objects over the real physical world. The immersion and interaction of XR provide unparalleled opportunities to extend our world beyond conventional lifestyles. While XR has extensive applications in fields such as entertainment and education, its numerous applications in biomedicine create transformative opportunities in both fundamental research and healthcare. This Primer outlines XR technology from instrumentation to software computation methods, delineating the biomedical applications that have been advanced by state-of-the-art techniques. We further describe the technical advances overcoming current limitations in XR and its applications, providing an entry point for professionals and trainees to thrive in this emerging field.
Affiliation(s)
- Jie Yuan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Sohail S. Hassan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Jiaojiao Wu
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Casey R. Koger
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- René R. Sevag Packard
- Division of Cardiology, Department of Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Ronald Reagan UCLA Medical Center, Los Angeles, CA, United States
- Veterans Affairs West Los Angeles Medical Center, Los Angeles, CA, United States
- Feng Shi
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Baowei Fei
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Department of Radiology, UT Southwestern Medical Center, Dallas, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Yichen Ding
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Hamon Center for Regenerative Science and Medicine, UT Southwestern Medical Center, Dallas, TX, United States

17
Remote Interactive Surgery Platform (RISP): Proof of Concept for an Augmented-Reality-Based Platform for Surgical Telementoring. J Imaging 2023; 9:jimaging9030056. [PMID: 36976107 PMCID: PMC10054087 DOI: 10.3390/jimaging9030056] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/23/2022] [Revised: 02/15/2023] [Accepted: 02/17/2023] [Indexed: 02/26/2023] Open
Abstract
The “Remote Interactive Surgery Platform” (RISP) is an augmented reality (AR)-based platform for surgical telementoring. It builds upon recent advances in mixed reality head-mounted displays (MR-HMD) and associated immersive visualization technologies to assist the surgeon during an operation. It enables interactive, real-time collaboration with a remote consultant by sharing the operating surgeon’s field of view through the Microsoft (MS) HoloLens 2 (HL2). Development of the RISP started during the Medical Augmented Reality Summer School 2021 and is still ongoing. It currently includes features such as three-dimensional annotations, bidirectional voice communication and interactive windows to display radiographs within the sterile field. This manuscript provides an overview of the RISP and preliminary results regarding its annotation accuracy and user experience, measured with ten participants.
18
Birlo M, Edwards PJE, Yoo S, Dromey B, Vasconcelos F, Clarkson MJ, Stoyanov D. CAL-Tutor: A HoloLens 2 Application for Training in Obstetric Sonography and User Motion Data Recording. J Imaging 2022; 9:6. [PMID: 36662104 PMCID: PMC9860994 DOI: 10.3390/jimaging9010006] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2022] [Revised: 11/30/2022] [Accepted: 12/20/2022] [Indexed: 12/30/2022] Open
Abstract
Obstetric ultrasound (US) training teaches the relationship between foetal anatomy and the viewed US slice to enable navigation to standardised anatomical planes (head, abdomen and femur) where diagnostic measurements are taken. This process is difficult to learn and results in considerable inter-operator variability. We propose the CAL-Tutor system for US training based on a US scanner and phantom, where models of both the baby and the US slice are displayed to the trainee in their physical locations using the HoloLens 2. The intention is that AR guidance will shorten the learning curve for US trainees and improve spatial awareness. In addition to the AR guidance, we also record multiple data streams to assess user motion and the learning process. The HoloLens 2 provides eye gaze and head and hand positions, ARToolKit and NDI Aurora tracking give the US probe positions, and an external camera records the overall scene. These data can provide a rich source for further analysis, such as distinguishing expert from novice motion. We have demonstrated the system on a sample of engineers. Feedback suggests that the system helps novice users navigate the US probe to the standard planes. The data capture is successful, and initial data visualisations show that meaningful information about user behaviour can be captured. Initial feedback is encouraging and shows improved user assessment where AR guidance is provided.
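One simple way to turn recorded probe positions like these into a motion metric that can separate experts from novices is total path length (experts typically move the probe over a shorter distance). A hypothetical sketch, not the authors' analysis pipeline:

```python
import numpy as np

def path_length(positions):
    """Total travelled distance of a tracked probe, given N x 3 positions."""
    steps = np.diff(positions, axis=0)          # per-sample displacement vectors
    return float(np.linalg.norm(steps, axis=1).sum())

# Hypothetical probe trajectory samples in metres: two straight moves,
# 0.1 m then 0.2 m, so the total path length should be 0.3 m.
traj = np.array([[0.0, 0.0, 0.0],
                 [0.0, 0.1, 0.0],
                 [0.0, 0.1, 0.2]])
print(path_length(traj))
```

The same reduction applies to the head and hand streams; comparing the resulting scalars across participants is a common first-pass analysis of tracked motion data.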
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK
- Philip J. Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK
- Soojeong Yoo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK
- UCL Interaction Centre (UCLIC), University College London, 66-72 Gower Street, London WC1E 6EA, UK
- Brian Dromey
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK
- UCL EGA Institute for Women’s Health, Medical School Building, 74 Huntley Street, London WC1E 6AU, UK
- Francisco Vasconcelos
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK
- Matthew J. Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), Charles Bell House, 43–45 Foley Street, London W1W 7TY, UK

19
Navab N, Martin-Gomez A, Seibold M, Sommersperger M, Song T, Winkler A, Yu K, Eck U. Medical Augmented Reality: Definition, Principle Components, Domain Modeling, and Design-Development-Validation Process. J Imaging 2022; 9:jimaging9010004. [PMID: 36662102 PMCID: PMC9866223 DOI: 10.3390/jimaging9010004] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/10/2022] [Revised: 12/15/2022] [Accepted: 12/19/2022] [Indexed: 12/28/2022] Open
Abstract
Three decades after the first work on Medical Augmented Reality (MAR) was presented to the international community, and ten years after the deployment of the first MAR solutions into operating rooms, its exact definition, basic components, systematic design, and validation still lack a detailed discussion. This paper defines the basic components of any Augmented Reality (AR) solution and extends them to exemplary Medical Augmented Reality Systems (MARS). We use some of the original MARS applications developed at the Chair for Computer Aided Medical Procedures, and deployed into medical schools for teaching anatomy and into operating rooms for telemedicine and surgical guidance over the last decades, to identify the corresponding basic components. The paper therefore does not discuss all past or existing solutions; it aims only at defining the principle components, discussing the particular domain modeling for MAR and its design-development-validation process, and providing exemplary cases through the past in-house developments of such solutions.
Affiliation(s)
- Nassir Navab
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Alejandro Martin-Gomez
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
- Matthias Seibold
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, CH-8008 Zurich, Switzerland
- Michael Sommersperger
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Tianyu Song
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Alexander Winkler
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- Department of General, Visceral, and Transplant Surgery, Ludwig-Maximilians-University Hospital, DE-80336 Munich, Germany
- Kevin Yu
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany
- medPhoton GmbH, AT-5020 Salzburg, Austria
- Ulrich Eck
- Computer Aided Medical Procedures & Augmented Reality, Technical University Munich, DE-85748 Garching, Germany

20
Palumbo A. Microsoft HoloLens 2 in Medical and Healthcare Context: State of the Art and Future Prospects. SENSORS (BASEL, SWITZERLAND) 2022; 22:s22207709. [PMID: 36298059 PMCID: PMC9611914 DOI: 10.3390/s22207709] [Citation(s) in RCA: 28] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/07/2022] [Revised: 09/29/2022] [Accepted: 10/07/2022] [Indexed: 05/08/2023]
Abstract
In the global context, although virtual reality, augmented reality and mixed reality have been emerging methodologies for several years, only now have technological and scientific advances made them suitable for revolutionizing clinical care and medical contexts through the provision of enhanced functionalities and improved health services. This systematic review provides the state-of-the-art applications of the Microsoft® HoloLens 2 in medical and healthcare contexts. Focusing on the potential of this technology for providing digitally supported clinical care, including but not limited to the COVID-19 pandemic, we considered studies that proved the applicability and feasibility of the HoloLens 2 in medical and healthcare scenarios. The review presents a thorough examination of the studies conducted since 2019, focusing on HoloLens 2 medical sub-field applications, the device functionalities provided to users, the software/platform/framework used, and the study validation. The results highlight the potential and limitations of HoloLens 2-based innovative solutions and bring focus to emerging research topics such as telemedicine, remote control and motor rehabilitation.
Affiliation(s)
- Arrigo Palumbo
- Department of Medical and Surgical Sciences, Magna Græcia University, 88100 Catanzaro, Italy
21
Zhao R, Zhu Z, Shao L, Meng F, Lei Z, Li X, Zhang T. Augmented reality guided in reconstruction of mandibular defect with fibular flap: A cadaver study. J Stomatol Oral Maxillofac Surg 2022; 124:101318. [PMID: 36280109 DOI: 10.1016/j.jormas.2022.10.017]
Abstract
BACKGROUND Augmented reality (AR) navigation has developed in recent years and can overcome some limitations of existing technologies. This study aimed to investigate a novel method of fibula free flap (FFF) osteotomy based on AR technology through a cadaver study. METHODS One mandible, seven fibulas, and seven lower limb specimens underwent computed tomography (CT) examination. We used the professional software Proplan CMF 3.0 to design a defective mandible model and created fourteen virtual reconstruction plans using the fibulas and lower limb specimens. An AR-based intraoperative guidance software prototype was developed using the Unity Real-Time Development Platform, and the virtual plans were transferred into this prototype. We used AR-based surgical navigation to guide the FFF osteotomies and used the resulting fibular segments to reconstruct the defective mandible model. After reconstruction, all segments were scanned by CT. Osteotomy accuracy was evaluated by measuring the length and angular deviation between the virtual plan and the final result. Reconstruction precision was reflected by the volume overlap rate and average surface distance between the planned and obtained reconstructions. RESULTS The length difference, angular deviation, volume overlap rate, and average surface distance of the in vitro group were 1.03±0.68 mm, 5.04±2.61°, 95.35±1.81%, and 1.02±0.27 mm, respectively; those of the in vivo group were 1.18±0.84 mm, 5.45±1.47°, 95.31±2.09%, and 1.22±0.12 mm. CONCLUSIONS Given the favorable results of these cadaver experiments, an AR-guided FFF osteotomy system may become a novel approach to assist FFF osteotomy for the reconstruction of defective mandibles.
22
Schreiter J, Schott D, Schwenderling L, Hansen C, Heinrich F, Joeres F. AR-Supported Supervision of Conditional Autonomous Robots: Considerations for Pedicle Screw Placement in the Future. J Imaging 2022; 8:jimaging8100255. [PMID: 36286350 PMCID: PMC9605344 DOI: 10.3390/jimaging8100255]
Abstract
Robotic assistance is applied in orthopedic interventions for pedicle screw placement (PSP). While current robots do not act autonomously, they are expected to operate with greater autonomy under surgeon supervision in the mid-term. Augmented reality (AR) is a promising means of supporting this supervision and enabling human–robot interaction (HRI). To outline a future scenario for robotic PSP, the current workflow was analyzed through literature review and expert discussion. Based on this, a hypothetical workflow for the intervention was developed, which additionally contains an analysis of the necessary information exchange between human and robot. A video see-through AR prototype was designed and implemented. A robotic arm with an orthopedic drill mock-up simulated the robotic assistance. The AR prototype included a user interface to enable HRI. The interface provides data to facilitate understanding of the robot’s ”intentions”, e.g., patient-specific CT images, the current workflow phase, or the next planned robot motion. Two-dimensional and three-dimensional visualizations illustrated patient-specific medical data and the drilling process. The findings of this work contribute a valuable approach to addressing future clinical needs and highlight the importance of AR support for HRI.
Affiliation(s)
- Josefine Schreiter
- Faculty of Computer Science & Research Campus STIMULATE, University of Magdeburg, 39106 Magdeburg, Germany
- Danny Schott
- Faculty of Computer Science & Research Campus STIMULATE, University of Magdeburg, 39106 Magdeburg, Germany
- Lovis Schwenderling
- Faculty of Computer Science & Research Campus STIMULATE, University of Magdeburg, 39106 Magdeburg, Germany
| | - Christian Hansen
- Faculty of Computer Science & Research Campus STIMULATE, University of Magdeburg, 39106 Magdeburg, Germany
- Florian Heinrich
- Faculty of Computer Science & Research Campus STIMULATE, University of Magdeburg, 39106 Magdeburg, Germany
- Fabian Joeres
- Faculty of Computer Science & Research Campus STIMULATE, University of Magdeburg, 39106 Magdeburg, Germany
- Innovation Center Computer-Assisted Surgery (ICCAS), Faculty of Medicine, Leipzig University, 04103 Leipzig, Germany
23
Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022; 8:jimaging8070203. [PMID: 35877647 PMCID: PMC9318659 DOI: 10.3390/jimaging8070203]
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy that described the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial surgeries (n=8). For preoperative input data, computed tomography (CT) (n=34) and surface rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly superimposed directly on the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy in the order of 2–5 mm has been demonstrated in phantom models, several human factors and technical challenges—perception, ease of use, context, interaction, and occlusion—remain to be addressed prior to widespread adoption of OST-HMD-led surgical navigation.
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
- Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
24
Wong KC, Sun YE, Kumta SM. Review and Future/Potential Application of Mixed Reality Technology in Orthopaedic Oncology. Orthop Res Rev 2022; 14:169-186. [PMID: 35601186 PMCID: PMC9121991 DOI: 10.2147/orr.s360933]
Abstract
In orthopaedic oncology, surgical planning and intraoperative execution errors may result in positive tumor resection margins that increase the risk of local recurrence and adversely affect patients’ survival. Computer navigation and 3D-printed resection guides have been reported to address surgical inaccuracy by replicating the surgical plan in complex cases. However, computer navigation surgery has limitations, including the shift of the surgeon’s attention from the operative field to the navigation monitor and the cost of navigation facilities. Practical concerns with 3D-printed guides are the lack of real-time visual feedback from preoperative images and the lead time in manufacturing the printed objects. Mixed Reality (MR) is a technology that merges the real and virtual worlds to produce new environments with enhanced visualizations, where physical and digital objects coexist and users can interact with both in real time. MR’s unique features of enhanced medical image visualization and interaction with holograms give surgeons real-time, on-demand medical information and remote assistance in their immediate working environment. Early applications of MR technology have been reported in surgical procedures, but its role in orthopaedic oncology is unclear. This review aims to provide orthopaedic tumor surgeons with up-to-date knowledge of the emerging MR technology. The paper presents its essential features and clinical workflow, reviews the current literature and potential clinical applications, and discusses the limitations and future development in orthopaedic oncology. The emerging MR technology adds a new dimension to digital assistive tools, offering a more accessible and less costly alternative in orthopaedic oncology. The MR head-mounted display and hands-free control may achieve clinical point-of-care use inside or outside the operating room and improve service efficiency and patient safety.
However, the lack of accurate hologram-to-patient matching, of an MR platform dedicated to orthopaedic oncology, and of clinical results may hinder its wide adoption. Industry-academic partnerships are essential to advance the technology, with its clinical role determined through future clinical studies.
Video abstract: https://youtu.be/t4hl_Anh_kM
Affiliation(s)
- Kwok Chuen Wong
- Department of Orthopaedics and Traumatology, Prince of Wales Hospital, the Chinese University of Hong Kong, Hong Kong Special Administrative Region, People’s Republic of China
- Correspondence: Kwok Chuen Wong, Department of Orthopaedics and Traumatology, Prince of Wales Hospital, the Chinese University of Hong Kong, Hong Kong Special Administrative Region, People’s Republic of China
- Yan Edgar Sun
- New Territories, Hong Kong Special Administrative Region, People’s Republic of China
- Shekhar Madhukar Kumta
- Department of Orthopaedics and Traumatology, Prince of Wales Hospital, the Chinese University of Hong Kong, Hong Kong Special Administrative Region, People’s Republic of China