1. Lesaunier A, Khlaut J, Dancette C, Tselikas L, Bonnet B, Boeken T. Artificial intelligence in interventional radiology: Current concepts and future trends. Diagn Interv Imaging 2025;106:5-10. [PMID: 39261225] [DOI: 10.1016/j.diii.2024.08.004]
Abstract
While artificial intelligence (AI) is already well established in diagnostic radiology, it is beginning to make its mark in interventional radiology. AI has the potential to dramatically change the daily practice of interventional radiology at several levels. In the preoperative setting, recent advances in deep learning models, particularly foundation models, enable effective management of multimodal data and increased autonomy through their ability to function with minimal supervision. Multimodality is at the heart of patient-tailored management, and in interventional radiology this translates into the development of innovative models for patient selection and outcome prediction. In the perioperative setting, AI is manifesting itself in applications that assist radiologists in image analysis and real-time decision making, thereby improving the efficiency, accuracy, and safety of interventions. In synergy with advances in robotic technologies, AI is laying the groundwork for increased autonomy. From a research perspective, the development of artificial health data, such as AI-based data augmentation, offers an innovative solution to the central issue of data scarcity and promises to stimulate research in this area. This review aims to provide the medical community with the most important current and future applications of AI in interventional radiology.
Affiliation(s)
- Armelle Lesaunier
- Department of Vascular and Oncological Interventional Radiology, Hôpital Européen Georges Pompidou, AP-HP, 75015 Paris, France; Université Paris Cité, Faculté de Médecine, 75006 Paris, France
- Lambros Tselikas
- Gustave Roussy, Département d'Anesthésie, Chirurgie et Interventionnel (DACI), 94805 Villejuif, France; Faculté de Médecine, Paris-Saclay University, 94276 Le Kremlin Bicêtre, France
- Baptiste Bonnet
- Gustave Roussy, Département d'Anesthésie, Chirurgie et Interventionnel (DACI), 94805 Villejuif, France; Faculté de Médecine, Paris-Saclay University, 94276 Le Kremlin Bicêtre, France
- Tom Boeken
- Department of Vascular and Oncological Interventional Radiology, Hôpital Européen Georges Pompidou, AP-HP, 75015 Paris, France; Université Paris Cité, Faculté de Médecine, 75006 Paris, France; HEKA INRIA, INSERM PARCC U 970, 75015 Paris, France
2. Graziani GC, Bocchi M, Gouvêa-e-Silva LF, Fornaziero CC, Fernandes EV. Technologies for Studying and Teaching Human Anatomy: Implications in Academic Education. Medical Science Educator 2024;34:1203-1214. [PMID: 39450022] [PMCID: PMC11496393] [DOI: 10.1007/s40670-024-02079-9]
Abstract
The teaching of human anatomy (HA) constitutes one of the fundamental pillars of the curriculum in biological and healthcare-related programs. It is therefore imperative that the methodology and didactics employed in this discipline equip students in the best possible way. The traditional method of teaching HA involves lectures and practical classes with previously dissected cadaveric specimens and dissection activities. At the same time, new digital technologies connected to the internet are emerging and gaining popularity, notably smartphones, quick response codes, and virtual reality devices, along with the dissemination of complementary imaging methods such as radiography, ultrasonography, magnetic resonance imaging, and computed tomography. From this perspective, the objective of this review is to analyze how each of these new tools integrates into the academic context, in order to diversify the teaching of HA and contribute to a better understanding of HA content during academic training, as well as its clinical applications.
Affiliation(s)
- Gustavo Cunha Graziani
- Universidade Federal de Jataí, BR 364, Km 195, n. 3800, Cidade Universitária, 75801-615 Jataí, Goiás, Brazil
- Mayara Bocchi
- Universidade Federal de Jataí, BR 364, Km 195, n. 3800, Cidade Universitária, 75801-615 Jataí, Goiás, Brazil
- Célia Cristina Fornaziero
- Universidade Estadual de Londrina, Rodovia Celso Garcia Cid, PR 445, Km 380, Campus Universitário, 86057-970 Londrina, Paraná, Brazil
- Eduardo Vignoto Fernandes
- Universidade Federal de Jataí, BR 364, Km 195, n. 3800, Cidade Universitária, 75801-615 Jataí, Goiás, Brazil
3. Dahhan H, Awan OA. Immersive Learning Experiences: How Augmented Reality and Virtual Reality are Shaping the Future of Radiology Education. Acad Radiol 2024:S1076-6332(24)00596-8. [PMID: 39304379] [DOI: 10.1016/j.acra.2024.08.033]
Affiliation(s)
- Hadi Dahhan
- Northwell Plainview, 888 Old Country Rd, Plainview, New York 11803, USA (H.D.)
- Omer A Awan
- University of Maryland School of Medicine, 655 W Baltimore Street, Baltimore, Maryland 21201, USA (O.A.A.)
4. Lee KH, Li M, Varble N, Negussie AH, Kassin MT, Arrichiello A, Carrafiello G, Hazen LA, Wakim PG, Li X, Xu S, Wood BJ. Smartphone Augmented Reality Outperforms Conventional CT Guidance for Composite Ablation Margins in Phantom Models. J Vasc Interv Radiol 2024;35:452-461.e3. [PMID: 37852601] [DOI: 10.1016/j.jvir.2023.10.005]
Abstract
PURPOSE To develop and evaluate a smartphone augmented reality (AR) system for ablation of a large (50-mm) liver tumor, with treatment planning for composite overlapping ablation zones.
MATERIALS AND METHODS A smartphone AR application was developed to display the tumor, probe, projected probe paths, ablated zones, and the real-time percentage of the target tumor volume ablated. Fiducial markers were attached to phantoms and to an ablation probe hub for tracking. The system was evaluated with tissue-mimicking thermochromic phantoms and gel phantoms. Four interventional radiologists each performed 2 trials of 3 probe insertions per trial, using AR guidance versus computed tomography (CT) guidance in 2 gel phantoms. Insertion points and optimal probe paths were predetermined. On Gel Phantom 2, serial ablated zones were saved and continuously displayed after each probe placement/adjustment, enabling feedback and iterative planning. The percentages of tumor ablated with AR guidance versus CT guidance, and with versus without display of recorded ablated zones, were compared among interventional radiologists with pairwise t-tests.
RESULTS Mean percentages of tumor ablated were 36% ± 7 for freehand CT guidance and 47% ± 4 for AR guidance (P = .004). Mean composite percentages of tumor ablated with AR guidance were 43% ± 1 without and 50% ± 2 with display of recorded ablation zones (P = .033). There was no strong correlation between AR-guided percentage of ablation and years of experience (r < 0.5), whereas there was a strong correlation between CT-guided percentage of ablation and years of experience (r > 0.9).
CONCLUSIONS A smartphone AR guidance system for dynamic iterative ablation of a large (50-mm) liver tumor was accurate, performed better than conventional CT guidance, especially for less experienced interventional radiologists, and promoted more standardized performance across experience levels.
Affiliation(s)
- Katerina H Lee
- McGovern Medical School at UTHealth, Houston, Texas; Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Ming Li
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Nicole Varble
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland; Philips Research North America, Cambridge, Massachusetts
- Ayele H Negussie
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Michael T Kassin
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Antonio Arrichiello
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Gianpaolo Carrafiello
- Department of Radiology, Foundation IRCCS Ca' Granda Ospedale Maggiore Policlinico, University of Milan, Milan, Italy
- Lindsey A Hazen
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Paul G Wakim
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, National Institutes of Health, Bethesda, Maryland
- Sheng Xu
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
- Bradford J Wood
- Center for Interventional Oncology, National Institutes of Health, Bethesda, Maryland
5. Averill SL, Gomez EN, Belfi LM, Hui J, Mallak N, Chetlen A. Night and Day, Why Radiologists Need Play. Acad Radiol 2024;31:360-370. [PMID: 38401981] [DOI: 10.1016/j.acra.2023.11.039]
Abstract
What is play, and why does it matter for radiologists? Play can take many forms in the workplace, including organic, managed, task-related, diversionary, and resistive forms, and it may also take the form of authentic self-expression and creation. In this review article, we discuss the benefits of play, including improved problem solving, gaining perspective, and stress reduction, and provide low-tech and high-tech examples of beneficial play for the radiology team in both work and personal contexts.
Affiliation(s)
- Sarah L Averill
- Associate Professor of Oncology and Radiology, Roswell Park Comprehensive Cancer Center, 665 Elm St, Buffalo, New York, USA (S.L.A.)
- Erin N Gomez
- Assistant Professor, Diagnostic Imaging Division, Program Director, Diagnostic Radiology and Molecular Imaging Residencies, Department of Radiology, Johns Hopkins Hospital, Baltimore, Maryland, USA (E.N.G.)
- Lily M Belfi
- Associate Professor of Clinical Radiology, Director of Medical Student Education, Division of Emergency/Musculoskeletal Radiology, Weill Cornell Medicine, 525 East 68th Street, Room F-054, New York, New York, 10065, USA (L.M.B.)
- Jessica Hui
- R3 Radiology Resident, University of Iowa Hospitals and Clinics, Iowa City, Iowa, USA (J.H.)
- Nadine Mallak
- Associate Professor, Department of Diagnostic Radiology, Molecular Imaging & Therapy, Body Imaging, PET/MRI Clinical Director, Oregon Health and Science University, 3181 SW Sam Jackson Park Rd, Portland, Oregon, 97239, USA (N.M.)
- Alison Chetlen
- Professor, Department of Radiology, Division of Breast Imaging, Vice Chair of Education, Department of Radiology, Breast Imaging Division Chief, Academic Radiology Group, Director, 3+5 DR-APPS Accelerated Pathway Program, Penn State College of Medicine, Penn State Health, Hershey Medical Center, 30 Hope Drive, Suite 1800, EC 008, Hershey, Pennsylvania, 17033, USA (A.C.)
6. Lian ME, Yee WG, Yu KL, Wu GY, Yang SM, Tsai HY. Radiation exposure in augmented fluoroscopic bronchoscopy procedures: a comprehensive analysis for patients and physicians. J Radiol Prot 2024;44:011502. [PMID: 38194908] [DOI: 10.1088/1361-6498/ad1cd3]
Abstract
Cancer is a major health challenge that causes millions of deaths worldwide each year, and the incidence of lung cancer has increased. Augmented fluoroscopic bronchoscopy (AFB) procedures, which combine bronchoscopy and fluoroscopy, are crucial for diagnosing and treating lung cancer. However, fluoroscopy exposes patients and physicians to radiation, and the procedure therefore requires careful monitoring. The National Council on Radiation Protection and Measurements and the International Commission on Radiological Protection have emphasised the importance of monitoring patient doses and ensuring occupational radiation safety. The present study evaluated radiation doses during AFB procedures, focusing on patient skin doses, the effective dose, and the personal dose equivalent to the eye lens for physicians. Skin doses were measured using thermoluminescent dosimeters. Peak skin doses were observed on the sides of the patients' arms, particularly on the side closest to the x-ray tube. Differences in procedures and physician experience between the two hospitals involved in this study were investigated. AFB procedures were conducted more efficiently at Hospital A than at Hospital B, resulting in lower effective doses. Cone-beam computed tomography (CT) contributed significantly to patient effective doses because of its higher radiographic parameters. Despite these higher radiographic parameters, AFB procedures resulted in smaller skin doses than did image-guided interventional and CT fluoroscopy procedures. The effective doses differed between the two hospitals owing to workflow differences, with cone-beam CT playing a dominant role. No significant differences in left and right eye Hp(3) values were observed between the hospitals. For both hospitals, the Hp(3) values were below the recommended limits, indicating that radiation monitoring may not be required for AFB procedures. This study provides insights into radiation exposure during AFB procedures with respect to radiation dosimetry and safety for patients and physicians.
Affiliation(s)
- Meng-En Lian
- Institute of Nuclear Engineering and Science, National Tsing Hua University, Hsinchu, Taiwan
- Wong Guang Yee
- Department of Biomedical Engineering and Environmental Sciences, National Tsing Hua University, Hsinchu, Taiwan
- Kai-Lun Yu
- Department of Internal Medicine, National Taiwan University Hospital, Hsin-Chu Branch, Hsinchu, Taiwan
- Graduate Institute of Clinical Medicine, College of Medicine, National Taiwan University, Taipei, Taiwan
- Guan-Yi Wu
- Scientific Research Division, National Synchrotron Radiation Research Center, Hsinchu, Taiwan
- Shun-Mao Yang
- Department of Surgery, National Taiwan University Hospital, Hsin-Chu Branch, Hsinchu, Taiwan
- Hui-Yu Tsai
- Institute of Nuclear Engineering and Science, National Tsing Hua University, Hsinchu, Taiwan
7. Kim S, Jung T, Sohn DK, Chae Y, Kim YA, Kang SH, Park Y, Chang YJ. The Multidomain Metaverse Cancer Care Digital Platform: Development and Usability Study. JMIR Serious Games 2023;11:e46242. [PMID: 38032697] [PMCID: PMC10722376] [DOI: 10.2196/46242]
Abstract
BACKGROUND As cancer treatment methods have diversified and the importance of self-management, which reduces dependence on direct hospital visits, has increased, effective cancer care education and management for health professionals and patients have become necessary. The metaverse is in the spotlight as a digital health tool that allows users to engage in cancer care education and management beyond physical constraints. However, it is difficult to find a multipurpose medical metaverse that can not only be used in the field but also complement current cancer care.
OBJECTIVE This study aimed to develop an integrated metaverse cancer care platform, Dr. Meta, and examine its usability.
METHODS We conducted a multicenter, cross-sectional survey between November and December 2021. A descriptive analysis was performed to examine users' experiences with Dr. Meta. In addition, a supplementary open-ended question asked users for suggestions and improvements regarding the platform.
RESULTS Responses from 70 Korean participants (male: n=19, 27%; female: n=51, 73%) were analyzed. More than half (n=37, 54%) of the participants were satisfied with Dr. Meta, and they found it an interesting and immersive platform (n=50, 72%). Less than half perceived no discomfort when using Dr. Meta (n=34, 49%) and no difficulty in wearing and operating the device (n=30, 43%). Furthermore, more than half (n=50, 72%) of the participants reported that Dr. Meta would help provide non-face-to-face, noncontact services. More than half also wanted to continue using the platform in the future (n=41, 59%) and would recommend it to others (n=42, 60%).
CONCLUSIONS We developed a multidomain metaverse cancer care platform that can support both health professionals and patients in non-face-to-face cancer care. The platform was disseminated and implemented in multiple regional hospitals and showed its potential for successful cancer care.
Affiliation(s)
- Sunghak Kim
- Division of Cancer Control and Policy, National Cancer Center, Goyang, Republic of Korea
- Timothy Jung
- Faculty of Business and Law, Manchester Metropolitan University, Manchester, United Kingdom
- Dae Kyung Sohn
- Center for Colorectal Cancer, National Cancer Center, Goyang, Republic of Korea
- Yoon Chae
- Division of Cancer Control and Policy, National Cancer Center, Goyang, Republic of Korea
- Young Ae Kim
- National Cancer Survivorship Center, National Cancer Center, Goyang, Republic of Korea
- Seung Hyun Kang
- Planning Division, Korea Smart Healthcare Association, Seoul, Republic of Korea
- Yujin Park
- Division of Cancer Control and Policy, National Cancer Center, Goyang, Republic of Korea
- Yoon Jung Chang
- Division of Cancer Control and Policy, National Cancer Center, Goyang, Republic of Korea
8. Tung EL, Matalon SA. Study Smarter: Applying the Science of Learning to Radiology. J Am Coll Radiol 2023;20:1084-1091. [PMID: 37634793] [DOI: 10.1016/j.jacr.2023.04.026]
Abstract
Lifelong learning is critical to a successful career in radiology, but many learners use inefficient or ineffective studying strategies developed from personal experience. Decades of cognitive psychology research have identified several techniques that consistently improve knowledge consolidation and retrieval. To improve their knowledge and ultimately patient care, radiologists should strive to understand and adopt these learning techniques. The first part of this article reviews several evidence-based learning principles, including active retrieval and the testing effect, spaced repetition, interleaving, deliberate practice, and growth mind-set. The second part provides practical suggestions on how to incorporate these principles into radiology learning, both during training and beyond.
Affiliation(s)
- Eric L Tung
- Chief Resident, Department of Radiology, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts
- Shanna A Matalon
- Radiology Residency Associate Program Director, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts. https://twitter.com/ShannaMatalonMD
9. Albano D, Messina C, Gitto S, Chianca V, Sconfienza LM. Bone biopsies guided by augmented reality: a pilot study. Eur Radiol Exp 2023;7:40. [PMID: 37468652] [PMCID: PMC10356701] [DOI: 10.1186/s41747-023-00353-w]
Abstract
PURPOSE To test the technical feasibility of an augmented reality (AR) navigation system to guide bone biopsies.
METHODS We enrolled patients who underwent percutaneous computed tomography (CT)-guided bone biopsy using a novel AR navigation system. Data from prospectively enrolled patients (AR group) were compared with data obtained retrospectively from previous standard CT-guided bone biopsies (control group). We evaluated procedure duration, number of CT passes, patient radiation dose (dose-length product), complications, and specimen adequacy. Technical success was defined as the ability to complete the procedure as planned, reaching the target center. Technical efficacy was assessed by evaluating specimen adequacy.
RESULTS Eight patients (4 males) aged 58 ± 24 years (mean ± standard deviation) were enrolled in the AR group and compared with 8 controls (4 males) aged 60 ± 15 years. No complications were observed. Procedure duration, number of CT passes, and radiation dose were 22 ± 5 min, 4 (median; interquartile range 4-6), and 1,034 ± 672 mGy·cm for the AR group and 23 ± 5 min, 9 (7.75-11.25), and 1,954 ± 993 mGy·cm for controls, respectively. No significant difference was observed for procedure duration (p = 0.878). Conversely, number of CT passes and radiation dose were significantly lower for the AR group (p < 0.001 and p = 0.021, respectively). Technical success and technical efficacy were 100% for both groups.
CONCLUSIONS This AR navigation system is safe, feasible, and effective; it can decrease radiation exposure and the number of CT passes during bone biopsies without increasing procedure duration.
RELEVANCE STATEMENT This AR navigation system is a safe and feasible guide for bone biopsies; it may decrease the number of CT passes and the patient's radiation dose.
KEY POINTS
• This AR navigation system is a safe guide for bone biopsies.
• It decreases the number of CT passes and the patient's radiation exposure.
• Procedure duration was similar to that of standard CT-guided biopsy.
• Technical success was 100%, as the target was reached in all patients.
• Technical efficacy was 100%, as the specimen was adequate in all patients.
Affiliation(s)
- Carmelo Messina
- IRCCS Istituto Ortopedico Galeazzi, Milan, 20161, Italy
- Dipartimento di Scienze Biomediche per la Salute, Università degli Studi di Milano, Milan, 20122, Italy
- Salvatore Gitto
- IRCCS Istituto Ortopedico Galeazzi, Milan, 20161, Italy
- Dipartimento di Scienze Biomediche per la Salute, Università degli Studi di Milano, Milan, 20122, Italy
- Vito Chianca
- Clinica Di Radiologia EOC IIMSI, Lugano, Switzerland
- Ospedale Evangelico Betania, Via Argine 604, Naples, 80147, Italy
- Luca Maria Sconfienza
- IRCCS Istituto Ortopedico Galeazzi, Milan, 20161, Italy
- Dipartimento di Scienze Biomediche per la Salute, Università degli Studi di Milano, Milan, 20122, Italy
10. Kukla P, Maciejewska K, Strojna I, Zapał M, Zwierzchowski G, Bąk B. Extended Reality in Diagnostic Imaging-A Literature Review. Tomography 2023;9:1071-1082. [PMID: 37368540] [DOI: 10.3390/tomography9030088]
Abstract
The utilization of extended reality (ER) has been increasingly explored in the medical field over the past ten years. A comprehensive analysis of scientific publications was conducted to assess the applications of ER in diagnostic imaging, including ultrasound, interventional radiology, and computed tomography. The study also evaluated the use of ER in patient positioning and medical education. Additionally, we explored the potential of ER as a replacement for anesthesia and sedation during examinations. The use of ER technologies in medical education has received increased attention in recent years: the technology allows for a more interactive and engaging educational experience, particularly in anatomy and patient positioning, although one may ask whether the technology and its maintenance costs are worth the investment. The results of the analyzed studies suggest that implementing augmented reality in clinical practice is a positive development that expands the capabilities of diagnostic imaging, education, and positioning. They also suggest that ER has significant potential to improve the accuracy and efficiency of diagnostic imaging procedures and to enhance the patient experience through increased visualization and understanding of medical conditions. Despite these promising advancements, further research is needed to fully realize the potential of ER in the medical field and to address the challenges and limitations associated with its integration into clinical practice.
Affiliation(s)
- Paulina Kukla
- Department of Electroradiology, Poznan University of Medical Sciences, 61-866 Poznan, Poland
- Karolina Maciejewska
- Department of Electroradiology, Poznan University of Medical Sciences, 61-866 Poznan, Poland
- Iga Strojna
- Department of Electroradiology, Poznan University of Medical Sciences, 61-866 Poznan, Poland
- Małgorzata Zapał
- Department of Electroradiology, Poznan University of Medical Sciences, 61-866 Poznan, Poland
- Department of Adult Neurology, Medical University of Gdansk, 80-210 Gdansk, Poland
- Grzegorz Zwierzchowski
- Department of Electroradiology, Poznan University of Medical Sciences, 61-866 Poznan, Poland
- Department of Medical Physics, Greater Poland Cancer Centre, 61-866 Poznan, Poland
- Bartosz Bąk
- Department of Electroradiology, Poznan University of Medical Sciences, 61-866 Poznan, Poland
- Department of Radiotherapy II, Greater Poland Cancer Centre, 61-866 Poznan, Poland
11. England A, Thompson J, Dorey S, Al-Islam S, Long M, Maiorino C, McEntee MF. A comparison of perceived image quality between computer display monitors and augmented reality smart glasses. Radiography (Lond) 2023;29:641-646. [PMID: 37130492] [DOI: 10.1016/j.radi.2023.04.010]
Abstract
INTRODUCTION Augmented reality (AR) smart glasses provide an alternative to standard computer display monitors (CDMs). AR smart glasses may offer an opportunity to improve visualisation during fluoroscopy and interventional radiology (IR) procedures, where it can be difficult to view intra-procedural images on a CDM. The aim of this study was to evaluate radiographers' perception of image quality (IQ) when comparing a CDM with AR smart glasses.
METHODS 38 radiographers attending an international congress evaluated ten fluoroscopic-guided surgery and IR images on both a CDM (1920 × 1200 pixels) and a set of Epson Moverio BT-40 AR smart glasses (1920 × 1080 pixels). Participants provided oral responses to pre-defined IQ questions generated by the study researchers. Summative IQ scores for each participant/image were compared between the CDM and the AR smart glasses.
RESULTS The mean age of the 38 participants was 39 ± 1 years, and 23 (60.5%) required corrective glasses. In terms of generalisability, participants were from 12 different countries, with the largest group (n = 9, 23.7%) from the United Kingdom. For eight of the ten images, the AR smart glasses demonstrated a statistically significant increase in perceived IQ (median [IQR] 2.0 [-1.0 to 7.0] points) compared with the CDM.
CONCLUSION AR smart glasses appear to improve perceived IQ compared with a CDM. AR smart glasses could provide an option for improving the experiences of radiographers involved in image-guided procedures and should be subject to further clinical evaluations.
IMPLICATIONS FOR PRACTICE Opportunities exist to improve perceived IQ for radiographers reviewing fluoroscopy and IR images. AR smart glasses should be further evaluated as a potential means of improving practice when visual attention is split between positioning equipment and image review.
Affiliation(s)
- A England
- Discipline of Medical Imaging & Radiation Therapy, University College Cork, Cork, Ireland
- J Thompson
- University Hospitals of Morecambe Bay NHS Foundation Trust, Barrow-in-Furness, UK
- S Dorey
- Tameside and Glossop Integrated Care NHS Foundation Trust, Tameside, UK
- S Al-Islam
- East Lancashire Hospitals NHS Trust, Blackburn, UK
- M Long
- Discipline of Medical Imaging & Radiation Therapy, University College Cork, Cork, Ireland
- C Maiorino
- Discipline of Medical Imaging & Radiation Therapy, University College Cork, Cork, Ireland
- M F McEntee
- Discipline of Medical Imaging & Radiation Therapy, University College Cork, Cork, Ireland
12. Dinh A, Yin AL, Estrin D, Greenwald P, Fortenko A. Augmented Reality in Real-time Telemedicine and Telementoring: Scoping Review. JMIR Mhealth Uhealth 2023;11:e45464. [PMID: 37071458] [PMCID: PMC10155085] [DOI: 10.2196/45464]
Abstract
BACKGROUND Over the last decade, augmented reality (AR) has emerged in health care as a tool for visualizing data and enhancing simulation learning. AR, which has largely been explored for communication and collaboration in nonhealth contexts, could play a role in shaping future remote medical services and training. This review summarized existing studies implementing AR in real-time telemedicine and telementoring to create a foundation for health care providers and technology developers to understand future opportunities in remote care and education.
OBJECTIVE This review described devices and platforms that use AR for real-time telemedicine and telementoring, the tasks for which AR was implemented, and the ways in which these implementations were evaluated, to identify gaps in research that provide opportunities for further study.
METHODS We searched PubMed, Scopus, Embase, and MEDLINE to identify English-language studies published between January 1, 2012, and October 18, 2022, implementing AR technology in a real-time interaction related to telemedicine or telementoring. The search terms were "augmented reality" OR "AR" AND "remote" OR "telemedicine" OR "telehealth" OR "telementoring." Systematic reviews, meta-analyses, and discussion-based articles were excluded from analysis.
RESULTS A total of 39 articles met the inclusion criteria and were categorized into the themes of patient evaluation, medical intervention, and education. In total, 20 devices and platforms using AR were identified, with common features being the ability for remote users to annotate, display graphics, and display their hands or tools in the local user's view. Common themes across the studies included consultation and procedural education, with surgery, emergency, and hospital medicine being the most represented specialties. Outcomes were most often measured using feedback surveys and interviews. The most common objective measures were time to task completion and performance. Long-term outcome and resource cost measurements were rare. Across the studies, user feedback was consistently positive for perceived efficacy, feasibility, and acceptability. Comparative trials demonstrated that AR-assisted conditions had noninferior reliability and performance and did not consistently extend procedure times compared with in-person controls.
CONCLUSIONS Studies implementing AR in telemedicine and telementoring demonstrated the technology's ability to enhance access to information and facilitate guidance in multiple health care settings. However, AR's role as an alternative to current telecommunication platforms, or even to in-person interactions, remains to be validated, with many disciplines and provider-to-nonprovider uses still lacking robust investigation. Additional studies comparing existing methods may offer more insight into this intersection, but the early stage of technical development and the lack of standardized tools and adoption have hindered the conduct of larger longitudinal and randomized controlled trials. Overall, AR has the potential to complement and advance the capabilities of remote medical care and learning, creating unique opportunities for innovator, provider, and patient involvement.
Collapse
Affiliation(s)
- Alana Dinh
- Medical College, Weill Cornell Medicine, New York, NY, United States
| | - Andrew Lukas Yin
- Department of Internal Medicine, Weill Cornell Medicine, New York, NY, United States
| | - Deborah Estrin
- Department of Computer Science, Cornell Tech, New York, NY, United States
| | - Peter Greenwald
- Emergency Medicine, NewYork-Presbyterian Hospital, New York, NY, United States
| | - Alexander Fortenko
- Emergency Medicine, NewYork-Presbyterian Hospital, New York, NY, United States
| |
Collapse
|
13
|
Ahuja AS, Polascik BW, Doddapaneni D, Byrnes ES, Sridhar J. The Digital Metaverse: Applications in Artificial Intelligence, Medical Education, and Integrative Health. Integr Med Res 2023; 12:100917. [PMID: 36691642 PMCID: PMC9860100 DOI: 10.1016/j.imr.2022.100917] [Citation(s) in RCA: 26] [Impact Index Per Article: 13.0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/18/2022] [Revised: 11/06/2022] [Accepted: 11/16/2022] [Indexed: 12/24/2022] Open
Affiliation(s)
- Abhimanyu S. Ahuja
- Charles E. Schmidt College of Medicine, Florida Atlantic University, Boca Raton, FL, United States of America
| | - Bryce W. Polascik
- Wake Forest University School of Medicine, Winston-Salem, North Carolina, United States of America
| | - Divyesh Doddapaneni
- Department of Internal Medicine, University of Colorado Anschutz Medical Campus, Aurora, Colorado, United States of America
| | - Eamonn S. Byrnes
- Department of Internal Medicine, Orlando Regional Medical Center, Orlando, Florida, United States of America
| | - Jayanth Sridhar
- Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miller School of Medicine, Miami, Florida, United States of America. Corresponding author at: Jayanth Sridhar, Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami, Miller School of Medicine, 900 NW 17th Street, Miami, FL, 33136.
| |
Collapse
|
14
|
Guaraná JB, Aytaç G, Müller AF, Thompson J, Freitas SH, Lee UY, Lozanoff S, Ferrante B. Extended reality veterinary medicine case studies for diagnostic veterinary imaging instruction: Assessing student perceptions and examination performance. Anat Histol Embryol 2023; 52:101-114. [PMID: 36317584 DOI: 10.1111/ahe.12879] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2022] [Revised: 08/07/2022] [Accepted: 08/31/2022] [Indexed: 01/17/2023]
Abstract
Educational technologies in veterinary medicine aim to train veterinarians faster and improve clinical outcomes. The COVID-19 pandemic shifted face-to-face teaching online, exacerbating the need to provide effective education remotely. Among recent technological advances for veterinary medical education, extended reality (XR) is a promising teaching tool. This study aimed to develop a case-resolution approach for radiographic anatomy studies using XR technology and to assess students' achievement of differential diagnostic skills. Learning objectives based on Bloom's taxonomy keywords were used to develop four clinical cases (3 dogs/1 cat) of spinal injuries utilizing CT scans and XR models, which were presented to 22 third-year veterinary medicine students. A quantitative assessment (ASMT) of 7 questions probing 'memorization', 'understanding and application', 'analysis' and 'evaluation' was given before and after contact with XR technology, along with qualitative feedback via a survey. Mean ASMT scores increased during case resolution (pre 51.6% (±37%)/post 60.1% (±34%); p < 0.01), but without significant difference between cases (Kruskal-Wallis H = 2.18, NS). Learning objectives were examined for six questions (Q1-Q6) across cases (C1-4): memorization improved sequentially (Q1-2, 8/8), while understanding and application (Q3-4) showed the greatest improvement (26.7%-76.9%); evaluation and analysis (Q5-6) were mixed, improving (5/8), unchanged (3/8), or declining (1/8). Positive student perceptions suggest that the online delivery of the case studies was well received, stimulating learning in diagnostic imaging and anatomy while developing the visual-spatial skills that aid understanding of cross-sectional images. Therefore, XR technology could be a useful approach to complement radiological instruction in veterinary medicine.
Collapse
Affiliation(s)
- Julia B Guaraná
- Department of Veterinary Medicine, Faculty of Animal Science and Food Engineering, University of São Paulo (USP), São Paulo, Brazil
| | - Güneş Aytaç
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, University of Hawaii (UH), Honolulu, Hawaii, USA
| | - Alois F Müller
- Department of Veterinary Medicine, Faculty of Animal Science and Food Engineering, University of São Paulo (USP), São Paulo, Brazil
| | - Jesse Thompson
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, University of Hawaii (UH), Honolulu, Hawaii, USA
| | - Silvio H Freitas
- Department of Veterinary Medicine, Faculty of Animal Science and Food Engineering, University of São Paulo (USP), São Paulo, Brazil
| | - U-Young Lee
- Department of Anatomy, College of Medicine, The Catholic University of Korea (CUK), Seoul, South Korea
| | - Scott Lozanoff
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, University of Hawaii (UH), Honolulu, Hawaii, USA
| | - Bruno Ferrante
- Department of Veterinary Medicine, Faculty of Animal Science and Food Engineering, University of São Paulo (USP), São Paulo, Brazil.,Veterinary Clinical and Surgery Department of Veterinary School, Federal University of Minas Gerais (UFMG), Belo Horizonte, Minas Gerais, Brazil
| |
Collapse
|
15
|
Advances and Innovations in Ablative Head and Neck Oncologic Surgery Using Mixed Reality Technologies in Personalized Medicine. J Clin Med 2022; 11:jcm11164767. [PMID: 36013006 PMCID: PMC9410374 DOI: 10.3390/jcm11164767] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2022] [Revised: 08/10/2022] [Accepted: 08/12/2022] [Indexed: 11/17/2022] Open
Abstract
The benefit of computer-assisted planning in head and neck ablative and reconstructive surgery has been extensively documented over the last decade. This approach has been proven to offer a more secure surgical procedure. In the treatment of cancer of the head and neck, computer-assisted surgery can be used to visualize and estimate the location and extent of the tumor mass. Nowadays, some software tools even allow the visualization of the structures of interest in a mixed reality environment. However, the precise integration of mixed reality systems into a daily clinical routine is still a challenge. To date, this technology is not yet fully integrated into clinical settings such as the tumor board, surgical planning for head and neck tumors, or medical and surgical education. As a consequence, the handling of these systems is still of an experimental nature, and decision-making based on the presented data is not yet widely used. The aim of this paper is to present a novel, user-friendly 3D planning and mixed reality software and its potential application for ablative and reconstructive head and neck surgery.
Collapse
|
16
|
Multicenter assessment of augmented reality registration methods for image-guided interventions. Radiol Med 2022; 127:857-865. [DOI: 10.1007/s11547-022-01515-3] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2021] [Accepted: 06/13/2022] [Indexed: 10/17/2022]
|
17
|
Sureshkumar H, Xu R, Erukulla N, Wadhwa A, Zhao L. "Snap on" or Not? A Validation on the Measurement Tool in a Virtual Reality Application. J Digit Imaging 2022; 35:692-703. [PMID: 35088186 PMCID: PMC9156653 DOI: 10.1007/s10278-022-00582-2] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2021] [Revised: 12/04/2021] [Accepted: 01/03/2022] [Indexed: 12/15/2022] Open
Abstract
This multi-rater comparison study aims to validate the measurement tool with a "snap" feature option (SNAP ON vs. SNAP OFF) in a virtual reality (VR) application, ImmersiveView v.2.1, against a conventional software, Mimics Innovation Suite v.22 (MIS). It is hypothesized that these measurement tools are equivalent between SNAP ON and SNAP OFF, and when compared to MIS, in terms of basic linear and angular measurements. Six (6) raters conducted a set of 40 linear and 15 angular measurements using CT scan data of three objects (L-block, hand model, and dry skull) with fiducial markers. Intra-rater repeatability and inter-rater reproducibility were assessed via the intraclass correlation coefficient (ICC). Equivalency between each pair of modules (SNAP ON, SNAP OFF, and MIS) was analyzed via Bland-Altman plots and the two one-sided t-tests (TOST) procedure. The ICC for intra-rater repeatability ranged from 0.999 to 1.000, and for inter-rater reproducibility from 0.998 to 1.000, which suggests a high degree of intra- and inter-rater reliability. The Bland-Altman plots demonstrated that measurements acquired from SNAP ON, SNAP OFF, and MIS were equivalent. The TOST procedure showed that the measurements through all three modules are equivalent within a ±0.2 mm interval for distance and a ±0.3° interval for angular measurements. The measurement tool with the "snap" feature in a newly developed VR application (ImmersiveView v.2.1) has been validated through a multi-rater comparison study. In terms of linear and angular measurements, this VR application, whether the "snap" feature was on or off, was equivalent to the control software (MIS) under the conditions of this study. Strong reliability, both intra-rater repeatability and inter-rater reproducibility, has been found.
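The equivalence analysis above rests on the two one-sided t-tests (TOST) procedure. The sketch below is an illustrative, simplified paired TOST using a normal approximation in place of the t distribution; the measurement data, function name, and sample size are assumptions for demonstration, not the study's actual code or data — only the ±0.2 mm margin comes from the abstract.

```python
import math
from statistics import mean, stdev

def tost_paired(a, b, margin):
    """Paired TOST: p-value for the claim that |mean(a - b)| < margin.

    Uses a normal approximation to the t distribution -- adequate for a
    sketch, not a substitute for the study's statistical analysis.
    """
    d = [x - y for x, y in zip(a, b)]           # paired differences
    se = stdev(d) / math.sqrt(len(d))           # standard error of the mean
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF
    p_lower = 1.0 - phi((mean(d) + margin) / se)  # H0: mean diff <= -margin
    p_upper = phi((mean(d) - margin) / se)        # H0: mean diff >= +margin
    return max(p_lower, p_upper)  # equivalence claimed when this p < alpha

# Hypothetical linear measurements (mm) of the same object in two modules
vr  = [10.00, 10.02, 9.98, 10.01, 9.99, 10.03, 9.97, 10.00]
mis = [10.01, 10.00, 9.99, 10.02, 9.98, 10.01, 9.99, 10.01]
p = tost_paired(vr, mis, margin=0.2)  # +/- 0.2 mm, as in the study
```

With these illustrative numbers the TOST p-value falls far below 0.05, so the two sets of measurements would be declared equivalent within ±0.2 mm; shifting one set by 0.5 mm would fail the test.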
Collapse
Affiliation(s)
- Haarisudhan Sureshkumar
- Department of Biomedical Engineering, University of Illinois at Chicago, 851 S Morgan St, Chicago, IL, 60607, USA
| | - Ruidi Xu
- Department of Biomedical Engineering, University of Illinois at Chicago, 851 S Morgan St, Chicago, IL, 60607, USA
| | - Nikith Erukulla
- Department of Biomedical Engineering, University of Illinois at Chicago, 851 S Morgan St, Chicago, IL, 60607, USA
| | - Aditi Wadhwa
- Department of Biomedical Engineering, University of Illinois at Chicago, 851 S Morgan St, Chicago, IL, 60607, USA
| | - Linping Zhao
- Virtual Surgical Simulation Laboratory, Division of Plastic, Reconstructive and Cosmetic Surgery, University of Illinois at Chicago, 811 S. Paulina St, Chicago, IL, 60612, USA.
- Shriners Hospitals for Children at Chicago, 2211 N. Oak Park Ave, Chicago, IL, 60707, USA.
| |
Collapse
|
18
|
Mikami BS, Hynd TE, Lee UY, DeMeo J, Thompson JD, Sokiranski R, Doll S, Lozanoff S. Extended reality visualization of medical museum specimens: Online presentation of conjoined twins curated by Dr. Jacob Henle between 1844-1852. TRANSLATIONAL RESEARCH IN ANATOMY 2022; 27:100171. [PMID: 36133355 PMCID: PMC9489256 DOI: 10.1016/j.tria.2022.100171] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Background The purpose of this study is to characterize a full-term conjoined twins' cadaver curated by Dr. Jacob Henle sometime between 1844 and 1852 and demonstrate digital distribution of an old and rare medical museum specimen using an extended reality (XR) model workflow. Methods The cadaver (Preparation 296) is in the Department of Anatomy and Cell Biology at the University of Heidelberg. An XR display workflow comprises image capture, segmentation, and visualization using CT/MR scans derived from the cadaver. Online radiology presentation to medical students focuses on diagnostic characteristics of anatomical systems depicted with XR models. Results Developmental defects in Preparation 296 include duplicated supradiaphragmatic structures and abnormal osteological features. Subdiaphragmatically, the gut is continuous on the right, but terminates at the distal esophagus on the left. One large liver occupies the abdomen with one spleen located on the left side. Observations suggest duplication of the primitive streak and separate notochords rostrally. Duplication occurs near the yolk sac and involves midgut formation while secondary midline fusion of the upper extremities and ribs likely results from the proximity of the embryos during development. Medical students access the model with device agnostic software during the curricular topic "Human Body Plan" that includes embryology concepts covering mechanisms of twinning. Conclusions The workflow enables ease-of-access XR visualizations of an old and rare museum specimen. This study also demonstrates digital distribution and utilization of XR models applicable to embryology education.
Collapse
Affiliation(s)
- Brandi S. Mikami
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, Honolulu, HI, 96813, USA
| | - Thomas E. Hynd
- Department of Biology, James Madison University, Harrisonburg, VA, 22807, USA
| | - U-Young Lee
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, Honolulu, HI, 96813, USA
- Department of Anatomy, Catholic University of Korea, Seoul, KR, 06591, South Korea
| | - J. DeMeo
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, Honolulu, HI, 96813, USA
| | - Jesse D. Thompson
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, Honolulu, HI, 96813, USA
| | - Roman Sokiranski
- Department of Anatomy & Cell Biology, Medical University of Varna, Varna, BG-9002, Bulgaria
| | - Sara Doll
- Department of Anatomy and Cell Biology, University of Heidelberg, Heidelberg, DE-69120, Germany
| | - Scott Lozanoff
- Department of Anatomy, Biochemistry & Physiology, John A. Burns School of Medicine, Honolulu, HI, 96813, USA
| |
Collapse
|
19
|
Balthazar P, Harri P, Prater A, Heilbrun ME, Mullins ME, Safdar N. Development and Implementation of an Integrated Imaging Informatics Track for Radiology Residents: Our 3-Year Experience. Acad Radiol 2022; 29 Suppl 5:S58-S64. [PMID: 33303347 DOI: 10.1016/j.acra.2020.11.015] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2020] [Revised: 11/01/2020] [Accepted: 11/09/2020] [Indexed: 12/30/2022]
Abstract
RATIONALE AND OBJECTIVES Imaging Informatics is an emerging and fast-evolving field that encompasses the management of information during all steps of the imaging value chain. With many information technology tools being essential to the radiologist's day-to-day work, there is an increasing need for qualified professionals with clinical background, technology expertise, and leadership skills. To address this need, we describe our experience in the development and implementation of an Integrated Imaging Informatics Track (I3T) for radiology residents at our institution. MATERIALS AND METHODS The I3T was created by a resident-driven initiative funded by an intradepartmental resident grant. Its curriculum is delivered through a combination of monthly small group discussions, operational meetings, recommended readings, lectures, and early exposure to the National Imaging Informatics Course. The track is steered and managed by the I3T Committee, including trainees and faculty advisors. Up to two first-year residents are selected annually based on their curriculum vitae and an interest application. Successful completion of the program requires submission of a capstone project and at least one academic deliverable (national meeting presentation, poster, exhibit, manuscript and/or grant). RESULTS In our three-year experience, the seven I3T radiology residents have reported a total of 58 scholarly activities related to Imaging Informatics. I3T residents have assumed leadership roles within our organization and nationally. All residents have successfully carried out their clinical responsibilities. CONCLUSION We have developed and implemented an I3T for radiology residents at our institution. These residents have been successful in their clinical, scholarly, and leadership pursuits.
Collapse
Affiliation(s)
- Patricia Balthazar
- Department of Radiology, Emory University School of Medicine, Atlanta, Georgia.
| | - Peter Harri
- Department of Radiology, Emory University School of Medicine, Atlanta, Georgia
| | - Adam Prater
- Department of Radiology, Emory University School of Medicine, Atlanta, Georgia
| | - Marta E Heilbrun
- Department of Radiology, Emory University School of Medicine, Atlanta, Georgia
| | - Mark E Mullins
- Department of Radiology, Emory University School of Medicine, Atlanta, Georgia
| | - Nabile Safdar
- Department of Radiology, Emory University School of Medicine, Atlanta, Georgia
| |
Collapse
|
20
|
Lee JJ, Klepcha M, Wong M, Dang PN, Sadrameli SS, Britz GW. The First Pilot Study of an Interactive, 360° Augmented Reality Visualization Platform for Neurosurgical Patient Education: A Case Series. Oper Neurosurg (Hagerstown) 2022; 23:53-59. [DOI: 10.1227/ons.0000000000000186] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2021] [Accepted: 01/09/2022] [Indexed: 11/19/2022] Open
|
21
|
Thermal Ablation of Liver Tumors Guided by Augmented Reality: An Initial Clinical Experience. Cancers (Basel) 2022; 14:cancers14051312. [PMID: 35267620 PMCID: PMC8909771 DOI: 10.3390/cancers14051312] [Citation(s) in RCA: 21] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/21/2022] [Revised: 02/25/2022] [Accepted: 02/27/2022] [Indexed: 02/06/2023] Open
Abstract
Background: Over the last two decades, augmented reality (AR) has been used as a visualization tool in many medical fields in order to increase precision, limit the radiation dose, and decrease the variability among operators. Here, we report the first in vivo study of a novel AR system for the guidance of percutaneous interventional oncology procedures. Methods: Eight patients with 15 liver tumors (0.7-3.0 cm; mean 1.56 ± 0.55) underwent percutaneous thermal ablations using AR guidance (i.e., the Endosight system). Prior to the intervention, the patients were evaluated with US and CT. The targeted nodules were segmented and three-dimensionally (3D) reconstructed from CT images, and the probe trajectory to the target was defined. The procedures were guided solely by AR, with the position of the probe tip subsequently confirmed by conventional imaging. The primary endpoints were the targeting accuracy, the system setup time, and the targeting time (i.e., from target visualization to correct needle insertion). Technical success was also evaluated and validated by co-registration software. Upon completion, the operators were assessed for cybersickness or other symptoms related to the use of AR. Results: Rapid system setup and procedural targeting times were noted (setup: mean 14.3 min, range 12.0-17.2 min; targeting: mean 4.3 min, range 3.2-5.7 min). The high targeting accuracy (mean 3.4 mm; range 2.6-4.2 mm) was accompanied by technical success in all 15 lesions (i.e., complete ablation of the tumor, with 13/15 lesions having a >90% 5-mm periablational margin). No intra/periprocedural complications or operator cybersickness were observed. Conclusions: AR guidance is highly accurate and allows for the confident performance of percutaneous thermal ablations.
Collapse
|
22
|
Guérinot C, Marcon V, Godard C, Blanc T, Verdier H, Planchon G, Raimondi F, Boddaert N, Alonso M, Sailor K, Lledo PM, Hajj B, El Beheiry M, Masson JB. New Approach to Accelerated Image Annotation by Leveraging Virtual Reality and Cloud Computing. FRONTIERS IN BIOINFORMATICS 2022; 1:777101. [PMID: 36303792 PMCID: PMC9580868 DOI: 10.3389/fbinf.2021.777101] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/14/2021] [Accepted: 12/15/2021] [Indexed: 01/02/2023] Open
Abstract
Three-dimensional imaging is at the core of medical imaging and is becoming a standard in biological research. As a result, there is an increasing need to visualize, analyze and interact with data in a natural three-dimensional context. By combining stereoscopy and motion tracking, commercial virtual reality (VR) headsets provide a solution to this critical visualization challenge by allowing users to view volumetric image stacks in a highly intuitive fashion. While optimizing the visualization and interaction process in VR remains an active topic, one of the most pressing issues is how to utilize VR for annotation and analysis of data. Annotating data is often a required step for training machine learning algorithms, and enhancing the ability to annotate complex three-dimensional data is particularly valuable in biological research, where newly acquired data may come in limited quantities. Similarly, medical data annotation is often time-consuming and requires expert knowledge to identify structures of interest correctly. Moreover, simultaneous data analysis and visualization in VR is computationally demanding. Here, we introduce a new procedure to visualize, interact with, annotate and analyze data by combining VR with cloud computing. VR is leveraged to provide natural interactions with volumetric representations of experimental imaging data. In parallel, cloud computing performs costly computations to accelerate the data annotation with minimal input required from the user. We demonstrate multiple proof-of-concept applications of our approach on volumetric fluorescent microscopy images of mouse neurons and tumor or organ annotations in medical images.
Collapse
Affiliation(s)
- Corentin Guérinot
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
- Sorbonne Université, Collège Doctoral, Paris, France
| | - Valentin Marcon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| | - Charlotte Godard
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
| | - Thomas Blanc
- Sorbonne Université, Collège Doctoral, Paris, France
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, CNRS UMR168, Paris, France
| | - Hippolyte Verdier
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Histopathology and Bio-Imaging Group, Sanofi R&D, Vitry-Sur-Seine, France
- Université de Paris, UFR de Physique, Paris, France
| | - Guillaume Planchon
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| | - Francesca Raimondi
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
- Unité Médicochirurgicale de Cardiologie Congénitale et Pédiatrique, Centre de Référence des Malformations Cardiaques Congénitales Complexes M3C, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
| | - Nathalie Boddaert
- Pediatric Radiology Unit, Hôpital Universitaire Necker-Enfants Malades, Université de Paris, Paris, France
- UMR-1163 Institut Imagine, Hôpital Universitaire Necker-Enfants Malades, AP-HP, Paris, France
| | - Mariana Alonso
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
| | - Kurt Sailor
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
| | - Pierre-Marie Lledo
- Perception and Memory Unit, CNRS UMR3571, Institut Pasteur, Paris, France
| | - Bassam Hajj
- Sorbonne Université, Collège Doctoral, Paris, France
- École Doctorale Physique en Île-de-France, PSL University, Paris, France
| | - Mohamed El Beheiry
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| | - Jean-Baptiste Masson
- Decision and Bayesian Computation, USR 3756 (C3BI/DBC) & Neuroscience Department CNRS UMR 3751, Université de Paris, Institut Pasteur, Paris, France
| |
Collapse
|
23
|
Muff JL, Heye T, Thieringer FM, Brantner P. Clinical acceptance of advanced visualization methods: a comparison study of 3D-print, virtual reality glasses, and 3D-display. 3D Print Med 2022; 8:5. [PMID: 35094166 PMCID: PMC8801110 DOI: 10.1186/s41205-022-00133-z] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2021] [Accepted: 01/17/2022] [Indexed: 01/12/2023] Open
Abstract
BACKGROUND To compare different methods of three-dimensional representation, namely 3D-Print, Virtual Reality (VR)-Glasses, and 3D-Display, regarding the understanding of the pathology, accuracy of details, quality of the anatomical representation, and technical operability, and to assess possible changes in treatment across different disciplines and levels of professional experience. METHODS Interviews were conducted with twenty physicians from the disciplines of cardiology, oral and maxillofacial surgery, orthopedic surgery, and radiology between 2018 and 2020 at the University Hospital of Basel. They were all presented with three different three-dimensional clinical cases derived from CT data from their area of expertise, one case for each method. During this, the physicians' feedback was recorded on a pencil-and-paper questionnaire. RESULTS Concerning the understanding of the pathology and quality of the anatomical representation, VR-Glasses were rated best in three out of four disciplines and two out of three levels of professional experience. Regarding the accuracy of details, 3D-Display was rated best in three out of four disciplines and all levels of professional experience. As to operability, 3D-Display was consistently rated best in all levels of professional experience and all disciplines. A possible change in treatment was reported using 3D-Print by 33%, VR-Glasses by 44%, and 3D-Display by 33% of participants. Physicians with professional experience of more than ten years reported no change in treatment using any method. CONCLUSIONS 3D-Print, VR-Glasses, and 3D-Displays are very well accepted, and a relevant percentage of participants with less than ten years of professional work experience could imagine a possible change in treatment using any of these three-dimensional methods. 
Our findings challenge scientists, technicians, and physicians to further develop these methods to improve the three-dimensional understanding of pathologies and to add value to the education of young and inexperienced physicians.
Collapse
Affiliation(s)
- Julian Louis Muff
- Department of Radiology and Nuclear Medicine, University Hospital Basel, Basel, Switzerland.
| | - Tobias Heye
- Department of Radiology and Nuclear Medicine, University Hospital Basel, Basel, Switzerland
| | - Florian Markus Thieringer
- Department of Oral and Cranio-Maxillofacial Surgery and 3D Print Lab, University Hospital Basel, Basel, Switzerland
- Department of Biomedical Engineering, University Hospital Basel, Basel, Switzerland
| | - Philipp Brantner
- Department of Radiology, Gesundheitszentrum Fricktal, Rheinfelden, Switzerland
| |
Collapse
|
24
|
Reponen J, Niinimäki J. Emergence of teleradiology, PACS, and other radiology IT solutions in Acta Radiologica. Acta Radiol 2021; 62:1525-1533. [PMID: 34637341 DOI: 10.1177/02841851211051003] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
For this historical review, we searched a database containing all the articles published in Acta Radiologica during its 100-year history to find those on the use of information technology (IT) in radiology. After reading the full texts, we selected the presented articles according to major radiology IT domains such as teleradiology, picture archiving and communication systems, image processing, image analysis, and computer-aided diagnostics in order to describe the development as it appeared in the journal. Publications generally follow IT megatrends, but because the contents of Acta Radiologica are mainly clinically oriented, some technology achievements appear later than they do in journals discussing mainly imaging informatics topics.
Collapse
Affiliation(s)
- Jarmo Reponen
- Research Unit of Medical Imaging, Physics and Technology, University of Oulu, Oulu, Finland
- Medical Research Center Oulu, Oulu University Hospital and University of Oulu, Oulu, Finland
| | - Jaakko Niinimäki
- Research Unit of Medical Imaging, Physics and Technology, University of Oulu, Oulu, Finland
- Department of Diagnostic Radiology, Oulu University Hospital, Oulu, Finland
| |
Collapse
|
25
|
Prabhu SP. 3D Modeling and Advanced Visualization of the Pediatric Brain, Neck, and Spine. Magn Reson Imaging Clin N Am 2021; 29:655-666. [PMID: 34717852 DOI: 10.1016/j.mric.2021.06.014] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/24/2023]
Abstract
The ready availability of advanced visualization tools on picture archiving and communication systems workstations, or even on standard laptops through server-based or cloud-based solutions, has enabled greater adoption of these techniques. We describe how radiologists can tailor imaging techniques for optimal 3D reconstructions and provide a brief overview of the standard and newer "on-screen" techniques. We also describe the process of creating 3D printed models for surgical simulation and education, with examples from the authors' institution and the existing literature. Finally, the review highlights current uses and potential future use cases for virtual reality and augmented reality applications in a pediatric neuroimaging setting.
Affiliation(s)
- Sanjay P Prabhu
- Neuroradiology Division, Department of Radiology, Boston Children's Hospital, Harvard Medical School, SIMPeds3D Print, 300 Longwood Avenue, Boston, MA 02115, USA.
26
López-Ojeda W, Hurley RA. Extended-Reality Technologies: An Overview of Emerging Applications in Medical Education and Clinical Care. J Neuropsychiatry Clin Neurosci 2021; 33:A4-177. [PMID: 34289698 DOI: 10.1176/appi.neuropsych.21030067]
Affiliation(s)
- Wilfredo López-Ojeda
- Veterans Affairs Mid-Atlantic Mental Illness Research, Education, and Clinical Center, and Research and Academic Affairs Service Line, W.G. Hefner Veterans Affairs Medical Center, Salisbury, N.C. (López-Ojeda, Hurley); Department of Psychiatry and Behavioral Medicine, Wake Forest School of Medicine, Winston-Salem, N.C. (López-Ojeda); Departments of Psychiatry and Radiology, Wake Forest School of Medicine, Winston-Salem, N.C. (Hurley); and Menninger Department of Psychiatry and Behavioral Sciences, Baylor College of Medicine, Houston (Hurley)
- Robin A Hurley
- Veterans Affairs Mid-Atlantic Mental Illness Research, Education, and Clinical Center, and Research and Academic Affairs Service Line, W.G. Hefner Veterans Affairs Medical Center, Salisbury, N.C. (López-Ojeda, Hurley); Department of Psychiatry and Behavioral Medicine, Wake Forest School of Medicine, Winston-Salem, N.C. (López-Ojeda); Departments of Psychiatry and Radiology, Wake Forest School of Medicine, Winston-Salem, N.C. (Hurley); and Menninger Department of Psychiatry and Behavioral Sciences, Baylor College of Medicine, Houston (Hurley)
27
Mialhe C, Chaudhuri A, Raffort J, Lareyre F. Feasibility of the Application of Holographic Augmented Reality in Endovascular Surgery Using Microsoft HoloLens Head-Mounted Display. Ann Vasc Surg 2021; 76:597-598. [PMID: 34182109 DOI: 10.1016/j.avsg.2021.05.010]
Abstract
OBJECTIVES Advances in virtual, augmented (AR), and mixed reality have led to the development of wearable technologies, including head-mounted displays (HMDs). The aim of this study was to investigate the feasibility of using an HMD during endovascular surgery. METHODS We propose an adaptation of an AR-HMD using the Microsoft HoloLens. Software was developed to enable visualization of the vascular system during endovascular procedures. A video was produced to present an overview of the device and show its use in real conditions. RESULTS The device allowed successful visualization of perioperative angiography during peripheral angioplasty, carotid angioplasty, and endovascular aortic aneurysm repair. The device was operated by voice command, preserving the sterility of the environment. CONCLUSION This video illustrates the feasibility of holographic AR during endovascular interventions and opens perspectives for the use of artificial intelligence-derived tools in image-guided surgery.
Affiliation(s)
- Claude Mialhe
- Cardiovascular Surgery Unit, Cardio Thoracic Centre of Monaco, Monaco
- Arindam Chaudhuri
- Bedfordshire-Milton Keynes Vascular Centre, Bedfordshire Hospitals NHS Foundation Trust, Bedford, UK
- Juliette Raffort
- Université Côte d'Azur, Nice, France; Clinical Chemistry Laboratory, University Hospital of Nice, Nice, France
- Fabien Lareyre
- Université Côte d'Azur, Nice, France; Department of Vascular Surgery, Hospital of Antibes-Juan-les-Pins, France.
28
Wachter A, Kost J, Nahm W. Simulation-Based Estimation of the Number of Cameras Required for 3D Reconstruction in a Narrow-Baseline Multi-Camera Setup. J Imaging 2021; 7:jimaging7050087. [PMID: 34460683 PMCID: PMC8321353 DOI: 10.3390/jimaging7050087]
Abstract
Graphical visualization systems are a common clinical tool for displaying digital images and three-dimensional volumetric data. These systems provide a broad spectrum of information to support physicians in their clinical routine. For example, the field of radiology enjoys unrestricted options for interaction with the data, since information is pre-recorded and available entirely in digital form. However, some fields, such as microsurgery, do not benefit from this yet. Microscopes, endoscopes, and laparoscopes show the surgical site as it is. To allow free data manipulation and information fusion, 3D digitization of surgical sites is required. We aimed to find the number of cameras needed to add this functionality to surgical microscopes. For this, we performed in silico simulations of the 3D reconstruction of representative models of microsurgical sites with different numbers of cameras in narrow-baseline setups. Our results show that eight independent camera views are preferable, while at least four are necessary for a digital surgical site. In most cases, eight cameras allow the reconstruction of over 99% of the visible part; with four cameras, over 95% can still be achieved. This answers one of the key questions for the development of a prototype microscope. In the future, such a system could provide functionality that is unattainable today.
29
Long DJ, Li M, De Ruiter QMB, Hecht R, Li X, Varble N, Blain M, Kassin MT, Sharma KV, Sarin S, Krishnasamy VP, Pritchard WF, Karanian JW, Wood BJ, Xu S. Comparison of Smartphone Augmented Reality, Smartglasses Augmented Reality, and 3D CBCT-guided Fluoroscopy Navigation for Percutaneous Needle Insertion: A Phantom Study. Cardiovasc Intervent Radiol 2021; 44:774-781. [PMID: 33409547 DOI: 10.1007/s00270-020-02760-7]
Abstract
PURPOSE To compare needle placement performance using an augmented reality (AR) navigation platform implemented on smartphone or smartglasses devices to that of CBCT-guided fluoroscopy in a phantom. MATERIALS AND METHODS An AR application was developed to display a planned percutaneous needle trajectory on the smartphone (iPhone7) and smartglasses (HoloLens1) devices in real time. Two AR-guided needle placement systems and CBCT-guided fluoroscopy with navigation software (XperGuide, Philips) were compared using an anthropomorphic phantom (CIRS, Norfolk, VA). Six interventional radiologists each performed 18 independent needle placements using smartphone (n = 6), smartglasses (n = 6), and XperGuide (n = 6) guidance. Placement error was defined as the distance from the needle tip to the target center. Placement time was recorded. For XperGuide, dose-area product (DAP, mGy*cm2) and fluoroscopy time (sec) were recorded. Statistical comparisons were made using a two-way repeated measures ANOVA. RESULTS The placement error using the smartphone, smartglasses, or XperGuide was similar (3.98 ± 1.68 mm, 5.18 ± 3.84 mm, 4.13 ± 2.38 mm, respectively, p = 0.11). Compared to CBCT-guided fluoroscopy, the smartphone and smartglasses reduced placement time by 38% (p = 0.02) and 55% (p = 0.001), respectively. The DAP for insertion using XperGuide was 3086 ± 2920 mGy*cm2, and no intra-procedural radiation was required for augmented reality. CONCLUSIONS Smartphone- and smartglasses-based augmented reality reduced needle placement time and radiation exposure while maintaining placement accuracy compared to a clinically validated needle navigation platform.
Affiliation(s)
- Dilara J Long
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Ming Li
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Quirina M B De Ruiter
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Rachel Hecht
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Xiaobai Li
- Biostatistics and Clinical Epidemiology Service, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Nicole Varble
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA; Philips Research of North America, Cambridge, MA, 02141, USA
- Maxime Blain
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Michael T Kassin
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Karun V Sharma
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, Washington, DC, USA
- Shawn Sarin
- Department of Interventional Radiology, George Washington University Hospital, Washington, DC, USA
- Venkatesh P Krishnasamy
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- William F Pritchard
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- John W Karanian
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Bradford J Wood
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
- Sheng Xu
- Center for Interventional Oncology, Radiology and Imaging Sciences, Clinical Center, National Institutes of Health, Bethesda, MD, 20892, USA
30
Andrews CM, Henry AB, Soriano IM, Southworth MK, Silva JR. Registration Techniques for Clinical Applications of Three-Dimensional Augmented Reality Devices. IEEE J Transl Eng Health Med 2020; 9:4900214. [PMID: 33489483 PMCID: PMC7819530 DOI: 10.1109/jtehm.2020.3045642]
Abstract
Many clinical procedures would benefit from direct and intuitive real-time visualization of anatomy, surgical plans, or other information crucial to the procedure. Three-dimensional augmented reality (3D-AR) is an emerging technology that has the potential to assist physicians with spatial reasoning during clinical interventions. The most intriguing applications of 3D-AR involve visualizations of anatomy or surgical plans that appear directly on the patient. However, commercially available 3D-AR devices have spatial localization errors that are too large for many clinical procedures. For this reason, a variety of approaches for improving 3D-AR registration accuracy have been explored. The focus of this review is on the methods, accuracy, and clinical applications of registering 3D-AR devices with the clinical environment. The works cited represent a variety of approaches for registering holograms to patients, including manual registration, computer vision-based registration, and registrations that incorporate external tracking systems. Evaluations of user accuracy when performing clinically relevant tasks suggest that accuracies of approximately 2 mm are feasible. 3D-AR device limitations due to the vergence-accommodation conflict or other factors attributable to the headset hardware add on the order of 1.5 mm of error compared to conventional guidance. Continued improvements to 3D-AR hardware will decrease these sources of error.
Affiliation(s)
- Christopher M. Andrews
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA
- SentiAR, Inc., St. Louis, MO 63108, USA
- Jonathan R. Silva
- Department of Biomedical Engineering, McKelvey School of Engineering, Washington University in St. Louis, St. Louis, MO 63130, USA