1.
Volmer B, Liu JS, Matthews B, Bornkessel-Schlesewsky I, Feiner S, Thomas BH. Multi-Level Precues for Guiding Tasks Within and Between Workspaces in Spatial Augmented Reality. IEEE Trans Vis Comput Graph 2023; 29:4449-4459. PMID: 37874709. DOI: 10.1109/tvcg.2023.3320246.
Abstract
We explore Spatial Augmented Reality (SAR) precues (predictive cues) for procedural tasks within and between workspaces and for visualizing multiple upcoming steps in advance. We designed precues based on several factors: cue type, color transparency, and multi-level (number of precues). Precues were evaluated in a procedural task requiring the user to press buttons in three surrounding workspaces. Participants performed fastest in conditions where tasks were linked with line cues with different levels of color transparency. Precue performance was also affected by whether the next task was in the same workspace or a different one.
2.
Foroughi P, Demir A, Hossbach M, Rajan P, Yarmolenko P, Vellody R, Cleary K, Sharma K. In situ guidance for MRI interventions using projected feedback. Int J Comput Assist Radiol Surg 2023. PMID: 37072658. DOI: 10.1007/s11548-023-02897-z.
Abstract
PURPOSE To develop and evaluate an augmented reality instrument guidance system for MRI-guided needle placement procedures such as musculoskeletal biopsy and arthrography. Our system guides the physician to insert a needle toward a target while looking at the insertion site, without requiring special headgear. METHODS The system comprises a pair of stereo cameras, a projector, and a computational unit with a touch screen. All components are designed to be used within the MRI suite (Zone 4). Multi-modality fiducial markers called VisiMARKERs, detectable in both MRI and camera images, facilitate automatic registration after the initial scan. The navigation feedback is projected directly onto the intervention site, allowing the interventionalist to keep their focus on the insertion site rather than on a secondary monitor, which is often not in front of them. RESULTS We evaluated the feasibility and accuracy of this system on custom-built shoulder phantoms. Two radiologists used the system to select targets and entry points on initial MRIs of these phantoms over three sessions. They performed 80 needle insertions following the projected guidance. The system targeting error was 1.09 mm, and the overall error was 2.29 mm. CONCLUSION We demonstrated both the feasibility and the accuracy of this MRI navigation system. The system operated without any problems inside the MRI suite, close to the MRI bore. The two radiologists were able to easily follow the guidance and place the needle close to the target without any intermediate imaging.
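The abstract reports automatic registration from the VisiMARKER fiducials but does not describe the algorithm. A common choice for paired-point fiducial registration is the least-squares rigid (Kabsch/SVD) fit; the sketch below is illustrative, not the paper's implementation, and the function names are my own.

```python
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid transform (R, t) with R @ moving_i + t ~ fixed_i,
    computed by the Kabsch/SVD method over paired fiducial positions."""
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)
    mu_f, mu_m = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - mu_m).T @ (fixed - mu_f)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_f - R @ mu_m
    return R, t

def fiducial_registration_error(fixed, moving, R, t):
    """RMS residual distance of the fiducials after alignment."""
    residual = (R @ np.asarray(moving, dtype=float).T).T + t - fixed
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))
```

With noiseless synthetic fiducials the recovered transform matches the ground truth and the residual error is numerically zero; with real marker detections, the residual gives a quick sanity check on registration quality.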
Affiliation(s)
- Alican Demir
- Clear Guide Medical Inc., Baltimore, MD, 21211, USA
3.
Franson D, Dupuis A, Gulani V, Griswold M, Seiberlich N. A System for Real-Time, Online Mixed-Reality Visualization of Cardiac Magnetic Resonance Images. J Imaging 2021; 7:274. PMID: 34940741. PMCID: PMC8709155. DOI: 10.3390/jimaging7120274.
Abstract
Image-guided cardiovascular interventions are rapidly evolving procedures that necessitate imaging systems capable of rapid data acquisition and low-latency image reconstruction and visualization. Compared to alternative modalities, Magnetic Resonance Imaging (MRI) is attractive for guidance in complex interventional settings thanks to its excellent soft-tissue contrast and large fields of view without exposure to ionizing radiation. However, most clinically deployed MRI sequences and visualization pipelines exhibit poor latency characteristics, and spatial integration of complex anatomy and device orientation can be challenging on conventional 2D displays. This work demonstrates a proof-of-concept system linking real-time cardiac MR image acquisition, online low-latency reconstruction, and a stereoscopic display to support further development in real-time MR-guided intervention. Data are acquired using an undersampled, radial trajectory and reconstructed via parallelized through-time radial generalized autocalibrating partially parallel acquisition (GRAPPA) implemented on graphics processing units. Images are rendered for display in a stereoscopic mixed-reality head-mounted display. The system is successfully tested by imaging standard cardiac views in healthy volunteers. Datasets comprising one slice (46 ms), two slices (92 ms), and three slices (138 ms) are collected, with the acquisition time of each listed in parentheses. Images are displayed with latencies of 42 ms/frame or less for all three conditions. Volumetric data are acquired at one volume per heartbeat with acquisition times of 467 ms and 588 ms when 8 and 12 partitions are acquired, respectively. Volumes are displayed with a latency of 286 ms or less. The faster-than-acquisition latencies for both planar and volumetric display enable real-time 3D visualization of the heart.
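The "faster-than-acquisition" claim is easy to check from the numbers reported in the abstract. The sketch below compares each condition's display latency (the abstract gives upper bounds) against its acquisition time and the implied frame rate; the data layout and function names are my own illustration.

```python
# Per-condition (acquisition time, display latency upper bound) in ms,
# taken from the abstract; the dict structure is illustrative.
planar = {"1 slice": (46, 42), "2 slices": (92, 42), "3 slices": (138, 42)}
volumetric = {"8 partitions": (467, 286), "12 partitions": (588, 286)}

def keeps_up(acq_ms, disp_ms):
    """Display keeps up when its latency does not exceed the acquisition time."""
    return disp_ms <= acq_ms

def frame_rate_hz(acq_ms):
    """Acquisition frame rate implied by the per-frame acquisition time."""
    return 1000.0 / acq_ms

all_ok = all(keeps_up(a, d) for a, d in {**planar, **volumetric}.values())
```

For the fastest planar condition, `frame_rate_hz(46)` is roughly 21.7 Hz, and every display latency stays below its acquisition time, consistent with the abstract's real-time claim.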
Affiliation(s)
- Dominique Franson
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH 44106, USA
- Andrew Dupuis
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH 44106, USA
- Vikas Gulani
- Department of Radiology, University of Michigan, Ann Arbor, MI 48109, USA
- Mark Griswold
- Department of Biomedical Engineering, Case Western Reserve University, Cleveland, OH 44106, USA; Department of Radiology, Case Western Reserve University, Cleveland, OH 44106, USA
- Nicole Seiberlich
- Department of Radiology, University of Michigan, Ann Arbor, MI 48109, USA
4.
Velazco-Garcia JD, Navkar NV, Balakrishnan S, Younes G, Abi-Nahed J, Al-Rumaihi K, Darweesh A, Elakkad MSM, Al-Ansari A, Christoforou EG, Karkoub M, Leiss EL, Tsiamyrtzis P, Tsekos NV. Evaluation of how users interface with holographic augmented reality surgical scenes: Interactive planning of MR-guided prostate biopsies. Int J Med Robot 2021; 17:e2290. PMID: 34060214. DOI: 10.1002/rcs.2290.
Abstract
BACKGROUND User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot-assisted MR-guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment. METHOD End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. The information from the system to the operator was rendered on HoloLens as an output interface. A joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system. RESULTS The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan MRgPBx in less time. The joystick efficiently captured the operator's commands for manipulating the augmented environment representing the state of the MRgPBx system. CONCLUSIONS The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment within the context of MRgPBx planning.
Affiliation(s)
- Nikhil V Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Georges Younes
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Adham Darweesh
- Department of Clinical Imaging, Hamad Medical Corporation, Doha, Qatar
- Mansour Karkoub
- Department of Mechanical Engineering, Texas A&M University-Qatar, Doha, Qatar
- Ernst L Leiss
- Department of Computer Science, University of Houston, Houston, Texas, USA
- Nikolaos V Tsekos
- Department of Computer Science, University of Houston, Houston, Texas, USA
5.
Reich CM, Sattler B, Jochimsen TH, Unger M, Melzer L, Landgraf L, Barthel H, Sabri O, Melzer A. Practical setting and potential applications of interventions guided by PET/MRI. Q J Nucl Med Mol Imaging 2020; 65:43-50. PMID: 33300750. DOI: 10.23736/s1824-4785.20.03293-8.
Abstract
Multimodality imaging has emerged from a vision thirty years ago to routine clinical use today. Positron emission tomography (PET)/magnetic resonance imaging (MRI) is still relatively new in this arena and particularly suitable for clinical research and technical development. PET/MRI guidance for interventions opens up opportunities for novel treatments but at the same time demands that certain technical and organizational requirements be fulfilled. In this work, we aimed to demonstrate a practical setting and potential applications of PET/MRI guidance of interventional procedures. The superior quantitative physiologic information of PET, the various unique imaging characteristics of MRI, and the reduced radiation exposure are the most relevant advantages of this technique. As a noninvasive interventional tool, focused ultrasound (FUS) ablation of tumor cells would benefit from PET/MRI for diagnostics, treatment planning, and intervention. Yet technical limitations might impede preclinical research, given that PET/MRI sites are per se not designed as interventional suites. Nonetheless, several approaches have been offered in the past years to upgrade MRI suites for interventional purposes. Taking advantage of state-of-the-art and easy-to-use technology, it is possible to create a supporting infrastructure that is suitable for broad preclinical adoption. Several aspects are to be addressed, including remote control of the imaging system, display of the imaging results, communication technology, and implementation of additional devices such as a FUS platform and an MR-compatible robotic system for positioning of the FUS equipment. Feasibility could be demonstrated with an exemplary experimental setup for interventional PET/MRI. Most PET/MRI sites could allow for interventions with just a few add-ons and modifications, such as communication, in-room image display, and systems control. By unlocking this feature and driving preclinical research in interventional PET/MRI, translation of the protocol and methodology into clinical settings seems feasible.
Affiliation(s)
- C Martin Reich
- Innovation Center Computer Assisted Surgery, University of Leipzig, Leipzig, Germany
- Bernhard Sattler
- Department of Nuclear Medicine, University Hospital Leipzig, Leipzig, Germany
- Thies H Jochimsen
- Department of Nuclear Medicine, University Hospital Leipzig, Leipzig, Germany
- Michael Unger
- Innovation Center Computer Assisted Surgery, University of Leipzig, Leipzig, Germany
- Leon Melzer
- Innovation Center Computer Assisted Surgery, University of Leipzig, Leipzig, Germany
- Lisa Landgraf
- Innovation Center Computer Assisted Surgery, University of Leipzig, Leipzig, Germany
- Henryk Barthel
- Department of Nuclear Medicine, University Hospital Leipzig, Leipzig, Germany
- Osama Sabri
- Department of Nuclear Medicine, University Hospital Leipzig, Leipzig, Germany
- Andreas Melzer
- Innovation Center Computer Assisted Surgery, University of Leipzig, Leipzig, Germany; Institute for Medical Science and Technology (IMSaT), University of Dundee, Scotland, UK
6.
Heinrich F, Schwenderling L, Joeres F, Lawonn K, Hansen C. Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion. IEEE Trans Vis Comput Graph 2020; 26:3568-3575. PMID: 33006930. DOI: 10.1109/tvcg.2020.3023637.
Abstract
Augmented reality (AR) may be a useful technique to overcome issues of conventionally used navigation systems supporting medical needle insertions, such as increased mental workload and complicated hand-eye coordination. Previous research primarily focused on the development of AR navigation systems designed for specific display devices, but differences between the employed methods had not been investigated. To this end, a user study involving a needle insertion task was conducted, comparing different AR display techniques with a monitor-based approach as the baseline condition for the visualization of navigation information. A video see-through stationary display, an optical see-through head-mounted display, and a spatial AR projector-camera system were investigated in this comparison. Results suggest advantages of using projected navigation information in terms of lower task completion time, lower angular deviation, and affirmative subjective participant feedback. Techniques requiring an intermediate view on screens, i.e., the stationary display and the baseline condition, showed less favorable results. Thus, benefits of providing AR navigation information compared to a conventionally used method could be identified. Significant results on objective measures, together with the identified advantages and disadvantages of individual display techniques, contribute to the development and design of improved needle navigation systems.
7.
Mamone V, Di Fonzo M, Esposito N, Ferrari M, Ferrari V. Monitoring Wound Healing With Contactless Measurements and Augmented Reality. IEEE J Transl Eng Health Med 2020; 8:2700412. PMID: 32373400. PMCID: PMC7198047. DOI: 10.1109/jtehm.2020.2983156.
Abstract
Objective: This work presents a device for non-invasive wound parameters assessment, designed to overcome the drawbacks of traditional methods, which are mostly rough, inaccurate, and painful for the patient. The device estimates the morphological parameters of the wound and provides augmented reality (AR) visual feedback on the wound healing status by projecting the wound border acquired during the last examination, thus improving doctor-patient communication. Methods: An accurate 3D model of the wound is created by stereophotogrammetry and refined through self-organizing maps. The 3D model is used to estimate physical parameters for wound healing assessment and integrates AR functionalities based on a miniaturized projector. The physical parameter estimation functionalities are evaluated in terms of precision, accuracy, inter-operator variability, and repeatability, whereas AR wound border projection is evaluated in terms of accuracy on the same phantom. Results: The accuracy and precision of the device are respectively 2% and 1.2% for linear parameters, and 1.7% and 1.3% for area and volume. The AR projection shows an error distance <1 mm. No statistical difference was found between the measurements of different operators. Conclusion: The device has proven to be an objective and non-operator-dependent tool for assessing the morphological parameters of the wound. Comparison with non-contact devices shows improved accuracy, offering reliable and rigorous measurements. Clinical Impact: Chronic wounds represent a significant health problem with high recurrence rates due to the ageing of the population and diseases such as diabetes and obesity. The device presented in this work provides an easy-to-use non-invasive tool to obtain useful information for treatment.
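The abstract attributes the 3D wound model to stereophotogrammetry without giving camera parameters. As a reminder of the underlying geometry, depth for a rectified stereo pair follows Z = f * B / d (focal length f in pixels, baseline B, disparity d). The sketch below uses assumed, illustrative values that are not from the paper.

```python
def depth_from_disparity(f_px, baseline_mm, disparity_px):
    """Depth of a matched point for a rectified stereo pair: Z = f * B / d.
    f_px: focal length in pixels, baseline_mm: camera separation in mm,
    disparity_px: horizontal offset of the match between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_mm / disparity_px

# Assumed example values (hypothetical): f = 1400 px, B = 60 mm, d = 120 px
z_mm = depth_from_disparity(1400.0, 60.0, 120.0)  # 700.0 mm
```

Triangulating many such matches yields the point cloud that a refinement step (self-organizing maps, in the paper's pipeline) would then smooth into the wound surface model.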
Affiliation(s)
- Virginia Mamone
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy; EndoCAS Center for Computer-Assisted Surgery, 56124 Pisa, Italy
- Miriam Di Fonzo
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Mauro Ferrari
- EndoCAS Center for Computer-Assisted Surgery, 56124 Pisa, Italy; Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, 56122 Pisa, Italy
- Vincenzo Ferrari
- Department of Information Engineering, University of Pisa, 56122 Pisa, Italy; EndoCAS Center for Computer-Assisted Surgery, 56124 Pisa, Italy
8.
Gerup J, Soerensen CB, Dieckmann P. Augmented reality and mixed reality for healthcare education beyond surgery: an integrative review. Int J Med Educ 2020; 11:1-18. PMID: 31955150. PMCID: PMC7246121. DOI: 10.5116/ijme.5e01.eb1a.
Abstract
OBJECTIVES This study aimed to review and synthesize the current research and state of augmented reality (AR) and mixed reality (MR) and the applications developed for healthcare education beyond surgery. METHODS An integrative review was conducted on all relevant material, drawing on different data sources, including the databases of PubMed, PsycINFO, and ERIC from January 2013 to September 2018. Inductive content analysis and qualitative synthesis were performed. Additionally, the quality of the studies was assessed with different structured tools. RESULTS Twenty-six studies were included. Studies based on both AR and MR involved established applications in 27% of all cases (n=6), the rest being prototypes. The most frequently studied subjects were related to anatomy and anesthesia (n=13). All studies showed several healthcare educational benefits of AR and MR, which significantly outperformed traditional learning approaches in 11 studies examining various outcomes. Studies had low-to-medium quality overall, with a MERSQI mean of 12.26 (SD=2.63), while the single qualitative study had high quality. CONCLUSIONS This review suggests progress in learning approaches based on AR and MR for various medical subjects, moving the research base away from feasibility studies on prototypes. Yet the limited validity of study conclusions, the heterogeneity of research designs, and widely varied reporting challenge the transferability of the findings of the included studies. Future studies should examine suitable research designs and instructional objectives achievable by AR- and MR-based applications to strengthen the evidence base, making it relevant for medical educators and institutions to apply the technologies.
Affiliation(s)
- Jaris Gerup
- School of Medical Sciences, University of Copenhagen, Denmark
- Peter Dieckmann
- Copenhagen Academy of Medical Education and Simulation (CAMES), Center for Human Resources, Herlev and Gentofte Hospital, Denmark
9.
Mewes A, Heinrich F, Hensen B, Wacker F, Lawonn K, Hansen C. Concepts for augmented reality visualisation to support needle guidance inside the MRI. Healthc Technol Lett 2018; 5:172-176. PMID: 30464849. PMCID: PMC6222244. DOI: 10.1049/htl.2018.5076.
Abstract
During MRI-guided interventions, navigation support is often separated from the operating field on displays, which impedes the interpretation of positions and orientations of instruments inside the patient's body as well as hand–eye coordination. To overcome these issues, projector-based augmented reality can be used to support needle guidance inside the MRI bore directly in the operating field. The authors present two visualisation concepts for needle navigation aids which were compared in an accuracy and usability study with eight participants, four of whom were experienced radiologists. The results show that both concepts are equally accurate (2.0 ± 0.6 mm and 1.7 ± 0.5 mm), useful and easy to use, with clear visual feedback about the state and success of the needle puncture. For easier clinical applicability, dynamic projection on moving surfaces and organ movement tracking are needed. For now, tests with patients under respiratory arrest are feasible.
Affiliation(s)
- André Mewes
- Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Germany; Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, Germany
- Florian Heinrich
- Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Germany; Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, Germany
- Bennet Hensen
- Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, Germany; Institute of Diagnostic and Interventional Radiology, Hanover Medical School, Germany
- Frank Wacker
- Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, Germany; Institute of Diagnostic and Interventional Radiology, Hanover Medical School, Germany
- Kai Lawonn
- Faculty of Computer Science, University of Koblenz-Landau, Germany
- Christian Hansen
- Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Germany; Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, Germany