1. Kunz JM, Maloca P, Allemann A, Fasler D, Soysal S, Däster S, Kraljević M, Syeda G, Weixler B, Nebiker C, Ochs V, Droeser R, Walker HL, Bolli M, Müller B, Cattin P, Staubli SM. Assessment of resectability of pancreatic cancer using novel immersive high-performance virtual reality rendering of abdominal computed tomography and magnetic resonance imaging. Int J Comput Assist Radiol Surg 2024; 19:1677-1687. [PMID: 38252362] [PMCID: PMC11365822] [DOI: 10.1007/s11548-023-03048-0]
Abstract
PURPOSE: Virtual reality (VR) allows for an immersive and interactive analysis of imaging data such as computed tomography (CT) and magnetic resonance imaging (MRI). The aim of this study was to assess the comprehensibility of VR anatomy and its value in assessing the resectability of pancreatic ductal adenocarcinoma (PDAC).
METHODS: This study assessed exposure to VR anatomy and evaluated the potential role of VR in assessing the resectability of PDAC. First, volumetric abdominal CT and MRI data were displayed in an immersive VR environment, and volunteering physicians were asked to identify anatomical landmarks in VR. In the second stage, experienced clinicians were asked to identify vascular involvement in a total of 12 CT and MRI scans displaying PDAC (2 resectable, 2 borderline resectable, and 2 locally advanced tumours per modality). Results were compared with standard 2D PACS viewing.
RESULTS: In VR visualisation of CT and MRI, the abdominal anatomical landmarks were recognised by all participants, except the pancreas (30/34) in VR CT and the splenic artery (31/34) and common hepatic artery (18/34) in VR MRI. In VR CT, resectable, borderline resectable, and locally advanced PDAC were correctly identified in 22/24, 20/24, and 19/24 scans, respectively; in VR MRI, the corresponding figures were 19/24, 19/24, and 21/24 scans. Interobserver agreement, measured by Fleiss κ, was 0.7 for CT and 0.4 for MRI (p < 0.001). Scans were assessed significantly more accurately in VR CT than in standard 2D PACS CT, with a median of 5.5 (IQR 4.75-6) versus 3 (IQR 2-3) of 6 scans assessed correctly (p < 0.001).
CONCLUSION: VR-enhanced visualisation of abdominal CT and MRI scan data provides intuitive handling and understanding of anatomy, may allow more accurate staging of PDAC, and could thus become a valuable adjunct in PDAC resectability assessment in the future.
Affiliation(s)
- Julia Madlaina Kunz: Faculty of Medicine, University of Basel, Klingelbergstrasse 61, 4056 Basel, Switzerland
- Peter Maloca: Institute of Molecular and Clinical Ophthalmology Basel (IOB), Mittlere Strasse 91, 4031 Basel, Switzerland; Department of Ophthalmology, University of Basel, 4031 Basel, Switzerland; Moorfields Eye Hospital, NHS Foundation Trust, London, EC1V 2PD, UK
- Andreas Allemann: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- David Fasler: Department of Radiology, St. Claraspital Basel, Kleinriehenstrasse 30, 4058 Basel, Switzerland
- Savas Soysal: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- Silvio Däster: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- Marko Kraljević: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- Gulbahar Syeda: Department of HPB Surgery and Liver Transplantation, Royal Free Hospital, NHS Foundation Trust, Pond Street, London, NW3 2Q, UK
- Benjamin Weixler: Department of General, Visceral and Vascular Surgery, Charité Campus Benjamin Franklin, Hindenburgdamm 20, 12203 Berlin, Germany
- Christian Nebiker: Surgical Department, Cantonal Hospital Aarau, Tellstrasse 25, 5001 Aarau, Switzerland
- Vincent Ochs: Department of Biomedical Engineering, University of Basel, Hegenheimermattweg 167c, 4123 Allschwil, Switzerland
- Raoul Droeser: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- Martin Bolli: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- Beat Müller: Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland
- Philippe Cattin: Department of Biomedical Engineering, University of Basel, Hegenheimermattweg 167c, 4123 Allschwil, Switzerland
- Sebastian Manuel Staubli: Faculty of Medicine, University of Basel, Klingelbergstrasse 61, 4056 Basel, Switzerland; Clarunis, University Center for Gastrointestinal and Liver Diseases, 4002 Basel, Switzerland; Department of HPB Surgery and Liver Transplantation, Royal Free Hospital, NHS Foundation Trust, Pond Street, London, NW3 2Q, UK
2. Bueckle A, Qing C, Luley S, Kumar Y, Pandey N, Börner K. The HRA Organ Gallery Affords Immersive Superpowers for Building and Exploring the Human Reference Atlas with Virtual Reality. bioRxiv [Preprint] 2023:2023.02.13.528002. [PMID: 36824790] [PMCID: PMC9949060] [DOI: 10.1101/2023.02.13.528002]
Abstract
The Human Reference Atlas (HRA, https://humanatlas.io), funded by the NIH Human Biomolecular Atlas Program (HuBMAP, https://commonfund.nih.gov/hubmap) and other projects, engages 17 international consortia to create a spatial reference of the healthy adult human body at single-cell resolution. The specimen, biological structure, and spatial data that define the HRA are disparate in nature and benefit from a visually explicit method of data integration. Virtual reality (VR) offers unique means to enable users to explore complex data structures in a three-dimensional (3D) immersive environment. On a 2D desktop application, the 3D spatiality and real-world size of the atlas's 3D reference organs are hard to understand. If viewed in VR, the spatiality of the organs and tissue blocks mapped to the HRA can be explored at their true size and in a way that goes beyond traditional 2D user interfaces. Added 2D and 3D visualizations can then provide data-rich context. In this paper, we present the HRA Organ Gallery, a VR application to explore the atlas in an integrated VR environment. Presently, the HRA Organ Gallery features 55 3D reference organs and 1,203 mapped tissue blocks from 292 demographically diverse donors and 15 providers, linking to 5,000+ datasets; it also features prototype visualizations of cell type distributions and 3D protein structures. We outline our plans to support two biological use cases: on-ramping novice and expert users to HuBMAP data available via the Data Portal (https://portal.hubmapconsortium.org), and quality assurance/quality control (QA/QC) for HRA data providers. Code and onboarding materials are available at https://github.com/cns-iu/ccf-organ-vr-gallery#readme.
Affiliation(s)
- Andreas Bueckle: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47408, USA
- Catherine Qing: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47408, USA; Department of Humanities & Sciences, Stanford University, Stanford, CA 94305, USA
- Shefali Luley: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47408, USA
- Yash Kumar: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47408, USA
- Naval Pandey: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47408, USA
- Katy Börner: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN 47408, USA
3. Bueckle A, Qing C, Luley S, Kumar Y, Pandey N, Börner K. The HRA Organ Gallery affords immersive superpowers for building and exploring the Human Reference Atlas with virtual reality. Front Bioinform 2023; 3:1162723. [PMID: 37181487] [PMCID: PMC10174312] [DOI: 10.3389/fbinf.2023.1162723]
Abstract
The Human Reference Atlas (HRA, https://humanatlas.io), funded by the NIH Human Biomolecular Atlas Program (HuBMAP, https://commonfund.nih.gov/hubmap) and other projects, engages 17 international consortia to create a spatial reference of the healthy adult human body at single-cell resolution. The specimen, biological structure, and spatial data that define the HRA are disparate in nature and benefit from a visually explicit method of data integration. Virtual reality (VR) offers unique means to enable users to explore complex data structures in a three-dimensional (3D) immersive environment. On a 2D desktop application, the 3D spatiality and real-world size of the atlas's 3D reference organs are hard to understand. If viewed in VR, the spatiality of the organs and tissue blocks mapped to the HRA can be explored at their true size and in a way that goes beyond traditional 2D user interfaces. Added 2D and 3D visualizations can then provide data-rich context. In this paper, we present the HRA Organ Gallery, a VR application to explore the atlas in an integrated VR environment. Presently, the HRA Organ Gallery features 55 3D reference organs and 1,203 mapped tissue blocks from 292 demographically diverse donors and 15 providers, linking to 6,000+ datasets; it also features prototype visualizations of cell type distributions and 3D protein structures. We outline our plans to support two biological use cases: on-ramping novice and expert users to HuBMAP data available via the Data Portal (https://portal.hubmapconsortium.org), and quality assurance/quality control (QA/QC) for HRA data providers. Code and onboarding materials are available at https://github.com/cns-iu/hra-organ-gallery-in-vr.
Affiliation(s)
- Andreas Bueckle: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, United States
- Catherine Qing: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, United States; Department of Humanities and Sciences, Stanford University, Stanford, CA, United States
- Shefali Luley: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, United States
- Yash Kumar: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, United States
- Naval Pandey: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, United States
- Katy Börner: Department of Intelligent Systems Engineering, Luddy School of Informatics, Computing, and Engineering, Indiana University, Bloomington, IN, United States
4. Lau CW, Qu Z, Draper D, Quan R, Braytee A, Bluff A, Zhang D, Johnston A, Kennedy PJ, Simoff S, Nguyen QV, Catchpoole D. Virtual reality for the observation of oncology models (VROOM): immersive analytics for oncology patient cohorts. Sci Rep 2022; 12:11337. [PMID: 35790803] [PMCID: PMC9256599] [DOI: 10.1038/s41598-022-15548-1]
Abstract
The significant advancement of inexpensive and portable virtual reality (VR) and augmented reality devices has re-energised research in the immersive analytics field. An immersive environment differs from a traditional 2D display used to analyse 3D data in that it provides a unified environment supporting immersion in a 3D scene, gestural interaction, haptic feedback, and spatial audio. Genomic data analysis has been used in oncology to better understand the relationship between genetic profile, cancer type, and treatment option. This paper proposes a novel immersive analytics tool for cancer patient cohorts in a virtual reality environment: Virtual Reality for the Observation of Oncology Models (VROOM). We utilise immersive technologies to analyse the gene expression and clinical data of a cohort of cancer patients. Various machine learning algorithms and visualisation methods have also been deployed in VR to enhance the data interrogation process. This is supported by established 2D visual analytics and graphical methods from bioinformatics, such as scatter plots, descriptive statistics, linear regression, box plots, and heatmaps. Our approach allows clinicians to interrogate information that is familiar and meaningful to them while providing immersive analytics capabilities for making new discoveries toward personalised medicine.
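The 2D analytics the abstract lists (descriptive statistics, linear regression over a scatter plot) can be sketched with plain NumPy. This is an illustrative sketch on a hypothetical toy expression matrix, not the VROOM implementation:

```python
import numpy as np

# Toy stand-in for a cohort's gene-expression matrix: 6 patients x 4 genes.
# (Hypothetical random data for illustration; the paper uses real cohort data.)
rng = np.random.default_rng(0)
expression = rng.normal(loc=5.0, scale=1.0, size=(6, 4))

# Descriptive statistics per gene, as would back a summary panel.
gene_means = expression.mean(axis=0)
gene_stds = expression.std(axis=0)

# Least-squares linear regression of one gene's expression against
# another, the kind of fit drawn as an overlay on a scatter plot.
x, y = expression[:, 0], expression[:, 1]
slope, intercept = np.polyfit(x, y, 1)
predicted = slope * x + intercept
```

In an immersive tool these values would feed the in-VR scatter plot and summary views rather than a 2D chart library.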
Affiliation(s)
- Chng Wei Lau: School of Computer, Data and Mathematical Sciences, Western Sydney University, Parramatta, Australia
- Zhonglin Qu: School of Computer, Data and Mathematical Sciences, Western Sydney University, Parramatta, Australia
- Rosa Quan: School of Psychology, Western Sydney University, Penrith, Australia
- Ali Braytee: School of Computer Science, University of Technology Sydney, Ultimo, Australia
- Dongmo Zhang: School of Computer, Data and Mathematical Sciences, Western Sydney University, Parramatta, Australia
- Andrew Johnston: School of Computer Science, University of Technology Sydney, Ultimo, Australia
- Paul J Kennedy: School of Computer Science, University of Technology Sydney, Ultimo, Australia
- Simeon Simoff: MARCS Institute and School of Computer, Data and Mathematical Sciences, Western Sydney University, Parramatta, Australia
- Quang Vinh Nguyen: MARCS Institute and School of Computer, Data and Mathematical Sciences, Western Sydney University, Parramatta, Australia
- Daniel Catchpoole: School of Computer, Data and Mathematical Sciences, Western Sydney University, Parramatta, Australia; School of Computer Science, University of Technology Sydney, Ultimo, Australia; Biospecimen Research Services, Children's Cancer Research Unit, The Kids Research Institute, The Children's Hospital at Westmead, Westmead, Australia
5. Jani G, Johnson A. Virtual reality and its transformation in forensic education and research practices. J Vis Commun Med 2021; 45:18-25. [PMID: 34493128] [DOI: 10.1080/17453054.2021.1971516]
Abstract
Documentation and evidence analysis are major components of forensic investigation; hence, two-dimensional (2D) photographs along with three-dimensional (3D) models and data are used to accomplish this task. Data generated through 3D scanning and photogrammetry are generally visualised on a computer screen. However, spatial details are lost when 3D data are visualised on 2D computer screens. Virtual reality (VR) is an immersive technology that allows a user to visualise 3D information by immersing themselves in the scene. In forensics, VR was initially introduced for visualising crime scenes and plotting distances within them; however, the technology has wider applications across forensics and in court presentation. This short communication outlines the concept of VR and its potential in the field of forensics.
Affiliation(s)
- Gargi Jani: Laboratory of Forensic Odontology, School of Forensic Science, National Forensic Sciences University, Gujarat, India
- Abraham Johnson: Laboratory of Forensic Odontology, School of Forensic Science, National Forensic Sciences University, Gujarat, India
6. Wheeler G, Deng S, Toussaint N, Pushparajah K, Schnabel JA, Simpson JM, Gomez A. Virtual interaction and visualisation of 3D medical imaging data with VTK and Unity. Healthc Technol Lett 2018; 5:148-153. [PMID: 30800321] [PMCID: PMC6372083] [DOI: 10.1049/htl.2018.5064]
Abstract
The authors present a method to interconnect the Visualisation Toolkit (VTK) and Unity. This integration enables them to exploit the visualisation capabilities of VTK with Unity's widespread support of virtual, augmented, and mixed reality displays, and interaction and manipulation devices, for the development of medical image applications for virtual environments. The proposed method utilises OpenGL context sharing between Unity and VTK to render VTK objects into the Unity scene via a Unity native plugin. The proposed method is demonstrated in a simple Unity application that performs VTK volume rendering to display thoracic computed tomography and cardiac magnetic resonance images. Quantitative measurements of the achieved frame rates show that this approach provides over 90 fps using standard hardware, which is suitable for current augmented reality/virtual reality display devices.
Affiliation(s)
- Gavin Wheeler: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK
- Shujie Deng: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK
- Nicolas Toussaint: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK
- Kuberan Pushparajah: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK; Department of Congenital Heart Disease, Evelina London Children's Hospital, London, UK
- Julia A. Schnabel: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK
- John M. Simpson: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK; Department of Congenital Heart Disease, Evelina London Children's Hospital, London, UK
- Alberto Gomez: School of Imaging Sciences & Biomedical Engineering, King's College London, London, UK
7. Taswell SK, Veeramacheneni T, Taswell C. BrainWatch software for interactive exploration of brain scans in 3D virtual reality systems. Annu Int Conf IEEE Eng Med Biol Soc 2017; 2017:3704-3707. [PMID: 29060703] [DOI: 10.1109/embc.2017.8037662]
Abstract
The ability to view medical images as 3D objects that can be explored interactively has become possible with the advent of rapidly emerging virtual reality (VR) technologies. In the past, VR has been used as an educational tool for learning anatomy, a visualization tool for assisting surgery, and a therapeutic tool for rehabilitating patients with motor disorders. However, these older systems were either expensive to build or difficult to acquire and use. Exploiting the arrival of affordable consumer devices such as the Oculus Rift, we have developed a software application called BrainWatch for VR-ready computers to enable 3D visualization and interactive exploration of DICOM data sets, focusing on PET and MRI brain scans. BrainWatch provides a unique set of three approaches for interacting with the virtual object, which we have named the observatory scenario with an external camera, the planetarium scenario with an internal camera, and the voyager scenario with a mobile camera. A live interactive demo of BrainWatch VR with the Oculus Rift CV1 will be available for conference attendees to experience at EMBC 2017.
8. The forensic holodeck: an immersive display for forensic crime scene reconstructions. Forensic Sci Med Pathol 2014; 10:623-6. [DOI: 10.1007/s12024-014-9605-0]