1
Hulsen T. Aplicaciones del metaverso en medicina y atención sanitaria [Applications of the metaverse in medicine and healthcare]. Adv Lab Med 2024; 5:166-172. [PMID: 38939208; PMCID: PMC11206190; DOI: 10.1515/almed-2024-0004]
Abstract
The metaverse is a virtual world, still under development, that allows people to interact with each other and with digital objects in a more immersive way. This innovative tool brings together three major technological trends: telepresence, the digital twin, and blockchain. Telepresence allows people to "meet" virtually even when they are in different places. The digital twin is a virtual, digital equivalent of a patient, a medical device or even a hospital. Finally, blockchain can be used by patients to store their personal medical records securely. In medicine, the metaverse could have several applications: (1) virtual medical consultations; (2) medical education and training; (3) patient education; (4) medical research; (5) drug development; (6) therapy and support; (7) laboratory medicine. The metaverse would enable more personalized, efficient and accessible healthcare, improving clinical outcomes and reducing healthcare costs. However, implementing the metaverse in medicine and healthcare will require careful consideration of ethical and privacy issues, as well as technical, social and legal aspects. Overall, the future of the metaverse in medicine looks promising, although new metaverse-specific legislation is needed to overcome its potential drawbacks.
Affiliation(s)
- Tim Hulsen
- Data Science & AI Engineering, Philips, High Tech Campus 34, 5656 AE Eindhoven, The Netherlands
2
Hulsen T. Applications of the metaverse in medicine and healthcare. Adv Lab Med 2024; 5:159-165. [PMID: 38939198; PMCID: PMC11206184; DOI: 10.1515/almed-2023-0124]
Abstract
The metaverse is a virtual world that is being developed to allow people to interact with each other and with digital objects in a more immersive way. It involves the convergence of three major technological trends: telepresence, the digital twin, and blockchain. Telepresence is the ability of people to "be together" in a virtual way while not being close to each other. The digital twin is a virtual, digital equivalent of a patient, a medical device or even a hospital. Blockchain can be used by patients to keep their personal medical records secure. In medicine and healthcare, the metaverse could be used in several ways: (1) virtual medical consultations; (2) medical education and training; (3) patient education; (4) medical research; (5) drug development; (6) therapy and support; (7) laboratory medicine. The metaverse has the potential to enable more personalized, efficient, and accessible healthcare, improving patient outcomes and reducing healthcare costs. However, the implementation of the metaverse in medicine and healthcare will require careful consideration of ethical and privacy concerns, as well as social, technical and regulatory challenges. Overall, the future of the metaverse in healthcare looks bright, but new metaverse-specific laws should be created to help overcome any potential downsides.
Affiliation(s)
- Tim Hulsen
- Data Science & AI Engineering, Philips, Eindhoven, The Netherlands
3
Ng J, Arness D, Gronowski A, Qu Z, Lau CW, Catchpoole D, Nguyen QV. Exocentric and Egocentric Views for Biomedical Data Analytics in Virtual Environments: A Usability Study. J Imaging 2023; 10:3. [PMID: 38248988; PMCID: PMC10817309; DOI: 10.3390/jimaging10010003]
Abstract
Biomedical datasets are usually large and complex, containing biological information about a disease. Computational analytics and the interactive visualisation of such data are essential decision-making tools for disease diagnosis and treatment. Oncology data models were observed in a virtual reality environment to analyse gene expression and clinical data from a cohort of cancer patients. The technology enables a new way to view information from the outside in (exocentric view) and the inside out (egocentric view), which is otherwise not possible on ordinary displays. This paper presents a usability study on the exocentric and egocentric views of biomedical data visualisation in virtual reality and their impact on usability, human behaviour, and perception. Our study revealed that performance time was faster in the exocentric view than in the egocentric view. The exocentric view also received higher ease-of-use scores than the egocentric view. However, the influence of usability on time performance was only evident in the egocentric view. The findings of this study could be used to guide future development and refinement of visualisation tools in virtual reality.
Affiliation(s)
- Jing Ng
- School of Psychology, Western Sydney University, Penrith, NSW 2750, Australia
- David Arness
- School of Psychology, Western Sydney University, Penrith, NSW 2750, Australia
- Ashlee Gronowski
- School of Psychology, Western Sydney University, Penrith, NSW 2750, Australia
- Zhonglin Qu
- School of Computer, Data and Mathematical Sciences, Western Sydney University, Penrith, NSW 2751, Australia
- Chng Wei Lau
- School of Computer, Data and Mathematical Sciences, Western Sydney University, Penrith, NSW 2751, Australia
- Daniel Catchpoole
- Tumour Bank, Children’s Cancer Research Unit, Kids Research, The Children’s Hospital at Westmead, Westmead, NSW 2145, Australia
- School of Computer Science, Faculty of Engineering and IT, The University of Technology Sydney, Ultimo, NSW 2007, Australia
- Quang Vinh Nguyen
- School of Computer, Data and Mathematical Sciences and MARCS Institute, Western Sydney University, Penrith, NSW 2751, Australia
4
Kaizu K, Takahashi K. Technologies for whole-cell modeling: Genome-wide reconstruction of a cell in silico. Dev Growth Differ 2023; 65:554-564. [PMID: 37856476; DOI: 10.1111/dgd.12897]
Abstract
With advances in high-throughput, large-scale in vivo measurement and genome modification techniques at the single-nucleotide level, there is an increasing demand for the development of new technologies for the flexible design and control of cellular systems. Computer-aided design is a powerful tool to design new cells. Whole-cell modeling aims to integrate various cellular subsystems, determine their interactions and cooperative mechanisms, and predict comprehensive cellular behaviors by computational simulations on a genome-wide scale. It has been applied to prokaryotes, yeasts, and higher eukaryotic cells, and utilized in a wide range of applications, including production of valuable substances, drug discovery, and controlled differentiation. Whole-cell modeling, consisting of several thousand elements with diverse scales and properties, requires innovative model construction, simulation, and analysis techniques. Furthermore, whole-cell modeling has been extended to multiple scales, including high-resolution modeling at the single-nucleotide and single-amino acid levels and multicellular modeling of tissues and organs. This review presents an overview of the current state of whole-cell modeling, discusses the novel computational and experimental technologies driving it, and introduces further developments toward multihierarchical modeling on a whole-genome scale.
Affiliation(s)
- Kazunari Kaizu
- RIKEN Center for Biosystems Dynamics Research, Osaka, Japan
5
Kim K, Yang H, Lee J, Lee WG. Metaverse Wearables for Immersive Digital Healthcare: A Review. Adv Sci (Weinh) 2023; 10:e2303234. [PMID: 37740417; PMCID: PMC10625124; DOI: 10.1002/advs.202303234]
Abstract
The recent exponential growth of metaverse technology has been instrumental in reshaping a myriad of sectors, not least digital healthcare. This comprehensive review critically examines the landscape and future applications of metaverse wearables toward immersive digital healthcare. The key technologies and advancements that have spearheaded the metamorphosis of metaverse wearables are categorized, encapsulating all-encompassed extended reality, such as virtual reality, augmented reality, mixed reality, and other haptic feedback systems. Moreover, the fundamentals of their deployment in assistive healthcare (especially for rehabilitation), medical and nursing education, and remote patient management and treatment are investigated. The potential benefits of integrating metaverse wearables into healthcare paradigms are multifold, encompassing improved patient prognosis, enhanced accessibility to high-quality care, and high standards of practitioner instruction. Nevertheless, these technologies are not without their inherent challenges and untapped opportunities, which span privacy protection, data safeguarding, and innovation in artificial intelligence. In summary, future research trajectories and potential advancements to circumvent these hurdles are also discussed, further augmenting the incorporation of metaverse wearables within healthcare infrastructures in the post-pandemic era.
Affiliation(s)
- Kisoo Kim
- Intelligent Optical Module Research Center, Korea Photonics Technology Institute (KOPTI), Gwangju 61007, Republic of Korea
- Hyosill Yang
- Department of Nursing, College of Nursing Science, Kyung Hee University, Seoul 02447, Republic of Korea
- Jihun Lee
- Department of Mechanical Engineering, College of Engineering, Kyung Hee University, Yongin 17104, Republic of Korea
- Won Gu Lee
- Department of Mechanical Engineering, College of Engineering, Kyung Hee University, Yongin 17104, Republic of Korea
6
Yuan J, Hassan SS, Wu J, Koger CR, Packard RRS, Shi F, Fei B, Ding Y. Extended reality for biomedicine. Nat Rev Methods Primers 2023; 3:15. [PMID: 37051227; PMCID: PMC10088349; DOI: 10.1038/s43586-023-00208-z]
Abstract
Extended reality (XR) refers to an umbrella of methods that allows users to be immersed in a three-dimensional (3D) or a 4D (spatial + temporal) virtual environment to different extents, including virtual reality (VR), augmented reality (AR), and mixed reality (MR). While VR allows a user to be fully immersed in a virtual environment, AR and MR overlay virtual objects over the real physical world. The immersion and interaction of XR provide unparalleled opportunities to extend our world beyond conventional lifestyles. While XR has extensive applications in fields such as entertainment and education, its numerous applications in biomedicine create transformative opportunities in both fundamental research and healthcare. This Primer outlines XR technology from instrumentation to software computation methods, delineating the biomedical applications that have been advanced by state-of-the-art techniques. We further describe the technical advances overcoming current limitations in XR and its applications, providing an entry point for professionals and trainees to thrive in this emerging field.
Affiliation(s)
- Jie Yuan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Sohail S. Hassan
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Jiaojiao Wu
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Casey R. Koger
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- René R. Sevag Packard
- Division of Cardiology, Department of Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, United States
- Ronald Reagan UCLA Medical Center, Los Angeles, CA, United States
- Veterans Affairs West Los Angeles Medical Center, Los Angeles, CA, United States
- Feng Shi
- Department of Research and Development, Shanghai United Imaging Intelligence Co., Ltd., Shanghai, China
- Baowei Fei
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Department of Radiology, UT Southwestern Medical Center, Dallas, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Yichen Ding
- Department of Bioengineering, Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, TX, United States
- Center for Imaging and Surgical Innovation, The University of Texas at Dallas, Richardson, TX, United States
- Hamon Center for Regenerative Science and Medicine, UT Southwestern Medical Center, Dallas, TX, United States
7
Kutak D, Selzer MN, Byska J, Ganuza ML, Barisic I, Kozlikova B, Miao H. Vivern: A Virtual Environment for Multiscale Visualization and Modeling of DNA Nanostructures. IEEE Trans Vis Comput Graph 2022; 28:4825-4838. [PMID: 34437064; DOI: 10.1109/tvcg.2021.3106328]
Abstract
DNA nanostructures offer promising applications, particularly in the biomedical domain, as they can be used for targeted drug delivery, construction of nanorobots, or as a basis for molecular motors. One of the most prominent techniques for assembling these structures is DNA origami. Nowadays, desktop applications are used for the in silico design of such structures. However, as such structures are often spatially complex, their assembly and analysis are complicated. Since virtual reality (VR) was proven to be advantageous for such spatial-related tasks and there are no existing VR solutions focused on this domain, we propose Vivern, a VR application that allows domain experts to design and visually examine DNA origami nanostructures. Our approach presents different abstracted visual representations of the nanostructures, various color schemes, and an ability to place several DNA nanostructures and proteins in one environment, thus allowing for the detailed analysis of complex assemblies. We also present two novel examination tools, the Magic Scale Lens and the DNA Untwister, that allow the experts to visually embed different representations into local regions to preserve the context and support detailed investigation. To showcase the capabilities of our solution, prototypes of novel nanodevices conceptualized by our collaborating experts, such as DNA-protein hybrid structures and DNA origami superstructures, are presented. Finally, the results of two rounds of evaluations are summarized. They demonstrate the advantages of our solution, especially for scenarios where current desktop tools are very limited, while also presenting possible future research directions.
8
Ayoob JC, Ramírez-Lugo JS. Ten simple rules for running a summer research program. PLoS Comput Biol 2022; 18:e1010588. [PMID: 36327228; PMCID: PMC9632878; DOI: 10.1371/journal.pcbi.1010588]
Abstract
To continue to advance the field of computational biology and fill the constantly growing need for new trainees who are well positioned for success, immersive summer research experiences have proven to be effective in preparing students to navigate the challenges that lie ahead in becoming future computational biologists. Here, we describe 10 simple rules for planning, offering, running, and improving a summer research program in computational biology that supports students in honing technical competencies for success in research and developing skills to become successful scientific professionals.
Affiliation(s)
- Joseph C. Ayoob
- Department of Computational and Systems Biology, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania, United States of America
- Juan S. Ramírez-Lugo
- Department of Biology, Universidad de Puerto Rico, Rio Piedras, San Juan, Puerto Rico, United States of America
9
Valades-Cruz CA, Leconte L, Fouche G, Blanc T, Van Hille N, Fournier K, Laurent T, Gallean B, Deslandes F, Hajj B, Faure E, Argelaguet F, Trubuil A, Isenberg T, Masson JB, Salamero J, Kervrann C. Challenges of intracellular visualization using virtual and augmented reality. Front Bioinform 2022; 2:997082. [PMID: 36304296; PMCID: PMC9580941; DOI: 10.3389/fbinf.2022.997082]
Abstract
Microscopy image observation is commonly performed on 2D screens, which limits human capacities to grasp volumetric, complex, and discrete biological dynamics. With the massive production of multidimensional images (3D + time, multi-channel) and derived images (e.g., restored images, segmentation maps, and object tracks), scientists need appropriate visualization and navigation methods to better apprehend the amount of information in their content. New modes of visualization have emerged, including virtual reality (VR)/augmented reality (AR) approaches, which should allow more accurate analysis and exploration of large time series of volumetric images, such as those produced by the latest 3D + time fluorescence microscopy. They include integrated algorithms that allow researchers to interactively explore complex spatiotemporal objects at the scale of single cells or multicellular systems, in an almost real-time manner. In practice, however, immersion of the user within 3D + time microscopy data represents both a paradigm shift in human-image interaction and an acculturation challenge for the community concerned. To promote a broader adoption of these approaches by biologists, further dialogue is needed between the bioimaging community and VR and AR developers.
Affiliation(s)
- Cesar Augusto Valades-Cruz
- SERPICO Project Team, Inria Centre Rennes-Bretagne Atlantique, Rennes, France
- SERPICO/STED Team, UMR144 CNRS Institut Curie, PSL Research University, Sorbonne Universités, Paris, France
- Ludovic Leconte
- SERPICO Project Team, Inria Centre Rennes-Bretagne Atlantique, Rennes, France
- SERPICO/STED Team, UMR144 CNRS Institut Curie, PSL Research University, Sorbonne Universités, Paris, France
- Gwendal Fouche
- SERPICO Project Team, Inria Centre Rennes-Bretagne Atlantique, Rennes, France
- SERPICO/STED Team, UMR144 CNRS Institut Curie, PSL Research University, Sorbonne Universités, Paris, France
- Inria, CNRS, IRISA, University Rennes, Rennes, France
- Thomas Blanc
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, Sorbonne Universités, CNRS UMR168, Paris, France
- Kevin Fournier
- SERPICO Project Team, Inria Centre Rennes-Bretagne Atlantique, Rennes, France
- SERPICO/STED Team, UMR144 CNRS Institut Curie, PSL Research University, Sorbonne Universités, Paris, France
- Inria, CNRS, IRISA, University Rennes, Rennes, France
- Tao Laurent
- LIRMM, Université Montpellier, CNRS, Montpellier, France
- Bassam Hajj
- Laboratoire Physico-Chimie, Institut Curie, PSL Research University, Sorbonne Universités, CNRS UMR168, Paris, France
- Emmanuel Faure
- LIRMM, Université Montpellier, CNRS, Montpellier, France
- Alain Trubuil
- MaIAGE, INRAE, Université Paris-Saclay, Jouy-en-Josas, France
- Jean-Baptiste Masson
- Decision and Bayesian Computation, Neuroscience and Computational Biology Departments, CNRS UMR 3571, Institut Pasteur, Université Paris Cité, Paris, France
- Jean Salamero
- SERPICO Project Team, Inria Centre Rennes-Bretagne Atlantique, Rennes, France
- SERPICO/STED Team, UMR144 CNRS Institut Curie, PSL Research University, Sorbonne Universités, Paris, France
- Charles Kervrann
- SERPICO Project Team, Inria Centre Rennes-Bretagne Atlantique, Rennes, France
- SERPICO/STED Team, UMR144 CNRS Institut Curie, PSL Research University, Sorbonne Universités, Paris, France
- *Correspondence: Charles Kervrann
10
Virtual reality for the observation of oncology models (VROOM): immersive analytics for oncology patient cohorts. Sci Rep 2022; 12:11337. [PMID: 35790803; PMCID: PMC9256599; DOI: 10.1038/s41598-022-15548-1]
Abstract
The significant advancement of inexpensive and portable virtual reality (VR) and augmented reality devices has re-energised research in the immersive analytics field. The immersive environment differs from a traditional 2D display used to analyse 3D data, as it provides a unified environment that supports immersion in a 3D scene, gestural interaction, haptic feedback and spatial audio. Genomic data analysis has been used in oncology to better understand the relationship between genetic profile, cancer type, and treatment option. This paper proposes a novel immersive analytics tool for cancer patient cohorts in a virtual reality environment: Virtual Reality for the Observation of Oncology Models (VROOM). We utilise immersive technologies to analyse the gene expression and clinical data of a cohort of cancer patients. Various machine learning algorithms and visualisation methods have also been deployed in VR to enhance the data interrogation process. This is supported by established 2D visual analytics and graphical methods from bioinformatics, such as scatter plots, descriptive statistics, linear regression, box plots and heatmaps, incorporated into our visualisation. Our approach allows clinicians to interrogate information that is familiar and meaningful to them while providing immersive analytics capabilities to make new discoveries toward personalised medicine.
11
Turhan B, Gümüş ZH. A Brave New World: Virtual Reality and Augmented Reality in Systems Biology. Front Bioinform 2022; 2:873478. [PMID: 35647580; PMCID: PMC9140045; DOI: 10.3389/fbinf.2022.873478]
Abstract
How we interact with computer graphics has not changed significantly since their invention: we still view 2D text and images on a flatscreen. Yet, recent advances in computing technology, internetworked devices and gaming are driving the design and development of new ideas in other modes of human-computer interfaces (HCIs). Virtual Reality (VR) technology uses computers and HCIs to create the feeling of immersion in a three-dimensional (3D) environment that contains interactive objects with a sense of spatial presence, where objects have a spatial location relative to, and independent of, the users. While this virtual environment does not necessarily match the real world, by creating the illusion of reality, it helps users leverage the full range of human sensory capabilities. Similarly, Augmented Reality (AR) superimposes virtual images onto the real world. Because humans learn the physical world through gradual sensory familiarization, these immersive visualizations enable gaining familiarity with biological systems not realizable in the physical world (e.g., allosteric regulatory networks within a protein or biomolecular pathways inside a cell). As VR/AR interfaces are anticipated to see explosive growth in consumer markets, systems biologists will become more immersed in their world. Here we introduce a brief history of VR/AR, their current roles in systems biology, and their advantages and disadvantages in augmenting user abilities. We next argue that in systems biology, VR/AR technologies will be most useful in visually exploring and communicating data, performing virtual experiments, and education/teaching. Finally, we discuss our perspective on future directions for VR/AR in systems biology.
Affiliation(s)
- Berk Turhan
- Department of Genetics and Genomic Sciences, Icahn School of Medicine at Mount Sinai, New York, NY, United States
- Faculty of Natural Sciences and Engineering, Sabancı University, Istanbul, Turkey
- Zeynep H. Gümüş
- Faculty of Natural Sciences and Engineering, Sabancı University, Istanbul, Turkey
- Precision Immunology Institute, Icahn School of Medicine at Mount Sinai, New York, NY, United States
- *Correspondence: Zeynep H. Gümüş
12
A Novel Gesture-Based Control System for Fluorescence Volumetric Data in Virtual Reality. Sensors 2021; 21:8329. [PMID: 34960422; PMCID: PMC8703643; DOI: 10.3390/s21248329]
Abstract
With the development of light microscopy, it is becoming increasingly easy to obtain detailed multicolor fluorescence volumetric data. The need for their appropriate visualization has become an integral part of fluorescence imaging. Virtual reality (VR) technology provides a new way of visualizing multidimensional image data or models so that the entire 3D structure can be intuitively observed, together with different object features or details on or within the object. With the need for imaging advanced volumetric data, demands for the control of virtual object properties are increasing; this happens especially for multicolor objects obtained by fluorescent microscopy. Existing solutions with universal VR controllers or software-based controllers with the need to define sufficient space for the user to manipulate data in VR are not usable in many practical applications. Therefore, we developed a custom gesture-based VR control system with a custom controller connected to the FluoRender visualization environment. A multitouch sensor disk was used for this purpose. Our control system may be a good choice for easier and more comfortable manipulation of virtual objects and their properties, especially using confocal microscopy, which is the most widely used technique for acquiring volumetric fluorescence data so far.
13
Yallapragada VVB, Xu T, Walker SP, Tabirca S, Tangney M. Pepblock Builder VR: An Open-Source Tool for Gaming-Based Bio-Edutainment in Interactive Protein Design. Front Bioeng Biotechnol 2021; 9:674211. [PMID: 34055764; PMCID: PMC8160467; DOI: 10.3389/fbioe.2021.674211]
Abstract
Proteins mediate and perform various fundamental functions of life. This versatility of protein function is an attribute of its 3D structure. In recent years, our understanding of protein 3D structure has been complemented by advances in computational and mathematical tools for protein modelling and protein design. 3D molecular visualisation is an essential part of every protein design and protein modelling workflow. Over the years, stand-alone and web-based molecular visualisation tools have been used to emulate a three-dimensional view on computers. The advent of virtual reality provided the scope for immersive control of molecular visualisation. While these technologies have significantly improved our insights into protein modelling, designing new proteins with a defined function remains a complicated process. Current tools to design proteins lack user interactivity and demand high computational skills. In this work, we present the Pepblock Builder VR, a gaming-based molecular visualisation tool for bio-edutainment and understanding protein design. Simulating the concepts of protein design and incorporating gaming principles into molecular visualisation promotes effective game-based learning. Unlike traditional sequence-based protein design and fragment-based stitching, the Pepblock Builder VR provides a building-block-style environment for complex structure building. This gives users a unique visual structure-building experience. Furthermore, the inclusion of virtual reality in the Pepblock Builder VR brings immersive learning and provides users with a "being there" experience in protein visualisation. The Pepblock Builder VR works both as a stand-alone and a VR-based application, and with a gamified user interface, it aims to expand the horizons of scientific data generation to the masses.
Affiliation(s)
- Venkata V. B. Yallapragada
- Cancer Research @ UCC, University College Cork, Cork, Ireland
- SynBioCentre, University College Cork, Cork, Ireland
- Tianshu Xu
- School of Computer Science and Information Technology, University College Cork, Cork, Ireland
- Sidney P. Walker
- Cancer Research @ UCC, University College Cork, Cork, Ireland
- SynBioCentre, University College Cork, Cork, Ireland
- Sabin Tabirca
- School of Computer Science and Information Technology, University College Cork, Cork, Ireland
- Department of Computer Science, Transylvania University of Braşov, Braşov, Romania
- Mark Tangney
- Cancer Research @ UCC, University College Cork, Cork, Ireland
- SynBioCentre, University College Cork, Cork, Ireland
- APC Microbiome Ireland, University College Cork, Cork, Ireland
- iEd Hub, University College Cork, Cork, Ireland
| |
14
Poronnik P, Sellwood MJ. Bioscience education 2030 and beyond: Where will technology take the curriculum? BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION : A BIMONTHLY PUBLICATION OF THE INTERNATIONAL UNION OF BIOCHEMISTRY AND MOLECULAR BIOLOGY 2020; 48:563-567. [PMID: 32745335 DOI: 10.1002/bmb.21393] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/26/2020] [Accepted: 05/22/2020] [Indexed: 06/11/2023]
Abstract
This brief review explores the ever-increasing role that technological affordances may play in the 21st-century biochemistry and molecular biology curriculum. We consider the need to develop digital and creative fluencies in our students and the importance of creativity and visualization in learning science. The potential of virtual reality (VR) platforms to complement these goals is discussed with a number of examples. Finally, we look to the future to see how VR might fit into the curriculum.
Affiliation(s)
- Philip Poronnik
  - Discipline of Physiology, School of Medical Sciences, The University of Sydney, Camperdown, New South Wales, Australia
- Matthew J Sellwood
  - Discipline of Physiology, School of Medical Sciences, The University of Sydney, Camperdown, New South Wales, Australia
15
Calvelo M, Piñeiro Á, Garcia-Fandino R. An immersive journey to the molecular structure of SARS-CoV-2: Virtual reality in COVID-19. Comput Struct Biotechnol J 2020; 18:2621-2628. [PMID: 32983399 PMCID: PMC7500438 DOI: 10.1016/j.csbj.2020.09.018] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2020] [Revised: 09/11/2020] [Accepted: 09/12/2020] [Indexed: 02/04/2023] Open
Abstract
The era of the explosion of immersive technologies has collided head-on with the coronavirus disease 2019 (COVID-19) global pandemic caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). A proper understanding of the three-dimensional structures that compose the virus, as well as of those involved in the infection process and in treatments, is expected to contribute to the advance of fundamental and applied research against this pandemic, including basic molecular biology studies and drug design. Virtual reality (VR) is a powerful technology for visualizing the biomolecular structures currently being identified for SARS-CoV-2 infection, opening possibilities for significant advances in the understanding of disease-associated mechanisms and thus for boosting new therapies and treatments. The present availability of VR for a large variety of practical applications, together with the increasing ease, quality, and affordability of access to this technology, is transforming the way we interact with digital information. Here, we review the software implementations currently available for VR visualization of SARS-CoV-2 molecular structures, covering a range of virtual environments: CAVEs, desktop software, and cell phone applications, all of them combined with head-mounted devices such as cardboard viewers, the Oculus Rift, or the HTC Vive. We aim to promote and facilitate the use of these emerging technologies in research against COVID-19, increasing knowledge and thereby minimizing risks before large sums are invested in the development of potential treatments.
Affiliation(s)
- Martín Calvelo
  - Centro Singular de Investigación en Química Biolóxica e Materiais Moleculares (CIQUS), Departamento de Química Orgánica, Universidade de Santiago de Compostela, Spain
- Ángel Piñeiro
  - Departamento de Física Aplicada, Facultade de Física, Universidade de Santiago de Compostela, Spain
- Rebeca Garcia-Fandino
  - Centro Singular de Investigación en Química Biolóxica e Materiais Moleculares (CIQUS), Departamento de Química Orgánica, Universidade de Santiago de Compostela, Spain
  - Departamento de Química e Bioquímica, Faculdade de Ciências da Universidade do Porto, Porto, Portugal
16
Javaid M, Haleem A. Virtual reality applications toward medical field. CLINICAL EPIDEMIOLOGY AND GLOBAL HEALTH 2020. [DOI: 10.1016/j.cegh.2019.12.010] [Citation(s) in RCA: 47] [Impact Index Per Article: 11.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022] Open
17
Cassidy KC, Šefčík J, Raghav Y, Chang A, Durrant JD. ProteinVR: Web-based molecular visualization in virtual reality. PLoS Comput Biol 2020; 16:e1007747. [PMID: 32231351 PMCID: PMC7147804 DOI: 10.1371/journal.pcbi.1007747] [Citation(s) in RCA: 22] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2019] [Revised: 04/10/2020] [Accepted: 02/25/2020] [Indexed: 01/21/2023] Open
Abstract
Protein structure determines biological function. Accurately conceptualizing 3D protein/ligand structures is thus vital to scientific research and education. Virtual reality (VR) enables protein visualization in stereoscopic 3D, but many VR molecular-visualization programs are expensive and challenging to use; work only on specific VR headsets; rely on complicated model-preparation software; and/or require the user to install separate programs or plugins. Here we introduce ProteinVR, a web-based application that works on various VR setups and operating systems. ProteinVR displays molecular structures within 3D environments that give useful biological context and allow users to situate themselves in 3D space. Our web-based implementation is ideal for hypothesis generation and education in research and large-classroom settings. We release ProteinVR under the open-source BSD-3-Clause license. A copy of the program is available free of charge from http://durrantlab.com/protein-vr/, and a working version can be accessed at http://durrantlab.com/pvr/. Proteins are microscopic machines that help maintain, defend, and regulate cells. Properly understanding the three-dimensional structures of these machines, as well as the small molecules that interact with them, can advance scientific fields ranging from basic molecular biology to drug discovery. Virtual reality (VR) is a powerful tool for studying protein structures, but many current systems for viewing molecules in VR, though effective, have challenging usability limitations. We have created a new web application called ProteinVR that overcomes these challenges. ProteinVR enables VR molecular visualization in users' browsers, without requiring them to install a separate program or plugin. It runs on a broad range of desktop, laptop, and mobile devices. For users without VR headsets, ProteinVR leverages mobile-device orientation sensors or video-game-style keyboard navigation to provide an immersive experience. We release ProteinVR as open-source software and have posted a working version at http://durrantlab.com/pvr/.
Affiliation(s)
- Kevin C Cassidy
  - Department of Biological Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Jan Šefčík
  - Faculty of Information Technology, Czech Technical University in Prague, Prague, Czech Republic
- Yogindra Raghav
  - Department of Biological Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Alexander Chang
  - Department of Biological Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
- Jacob D Durrant
  - Department of Biological Sciences, University of Pittsburgh, Pittsburgh, Pennsylvania, United States of America
18
BigTop: a three-dimensional virtual reality tool for GWAS visualization. BMC Bioinformatics 2020; 21:39. [PMID: 32005132 PMCID: PMC6995189 DOI: 10.1186/s12859-020-3373-5] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2019] [Accepted: 01/17/2020] [Indexed: 01/10/2023] Open
Abstract
BACKGROUND: Genome-wide association studies (GWAS) are typically visualized using a two-dimensional Manhattan plot, displaying the chromosomal location of SNPs along the x-axis and the negative log-10 of their p-values on the y-axis. This traditional plot provides a broad overview of the results, but offers little opportunity for interaction or expansion of specific regions, and is unable to show additional dimensions of the dataset.

RESULTS: We created BigTop, a visualization framework in virtual reality (VR), designed to render a Manhattan plot in three dimensions, wrapping the graph around the user in a simulated cylindrical room. BigTop uses the z-axis to display the minor allele frequency of each SNP, allowing for the identification of allelic variants of genes. BigTop also offers additional interactivity, allowing users to select any individual SNP and receive expanded information, including SNP name, exact values, and gene location, if applicable. BigTop is built in JavaScript using the React and A-Frame frameworks, and can be rendered using commercially available VR headsets or in a two-dimensional web browser such as Google Chrome. Data is read into BigTop in JSON format and can be provided as either JSON or a tab-separated text file.

CONCLUSIONS: Using the additional dimensions and interactivity options offered through VR, we provide a new, interactive, three-dimensional representation of the traditional Manhattan plot for displaying and exploring GWAS data.
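The axis mapping this abstract describes (chromosomal position on x, negative log-10 p-value on y, minor allele frequency on z) can be sketched in a few lines. The snippet below is purely illustrative, not code from BigTop; the field names in the input records are hypothetical, not BigTop's actual JSON schema.

```python
import math

def manhattan_points(snps):
    """Map GWAS records to (x, y, z) coordinates for a 3D Manhattan plot:
    chromosomal position on x, -log10(p-value) on y, and minor allele
    frequency on z. `snps` is a list of dicts with illustrative field names."""
    return [
        {
            "name": snp["name"],
            "x": snp["position"],
            "y": -math.log10(snp["p_value"]),
            "z": snp["maf"],
        }
        for snp in snps
    ]

# A hypothetical record: a genome-wide-significant SNP with p = 1e-8
# maps to a y value of about 8.
points = manhattan_points(
    [{"name": "rs12345", "position": 1234567, "p_value": 1e-8, "maf": 0.12}]
)
```

The same transform applies whether the input arrives as JSON or as a parsed tab-separated file; only the loading step differs.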
19
Thomas BH. Virtual Reality for Information Visualization Might Just Work This Time. Front Robot AI 2019; 6:84. [PMID: 33501099 PMCID: PMC7806101 DOI: 10.3389/frobt.2019.00084] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/28/2018] [Accepted: 08/21/2019] [Indexed: 11/13/2022] Open
Affiliation(s)
- Bruce H Thomas
  - IVE: Australian Research Centre for Interactive and Virtual Environments, School of Information Technology and Mathematical Sciences, University of South Australia, Adelaide, SA, Australia
20
Puzzarini C, Bloino J, Tasinato N, Barone V. Accuracy and Interpretability: The Devil and the Holy Grail. New Routes across Old Boundaries in Computational Spectroscopy. Chem Rev 2019; 119:8131-8191. [DOI: 10.1021/acs.chemrev.9b00007] [Citation(s) in RCA: 114] [Impact Index Per Article: 22.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/25/2022]
Affiliation(s)
- Cristina Puzzarini
  - Dipartimento di Chimica “Giacomo Ciamician”, Università di Bologna, Via F. Selmi 2, I-40126 Bologna, Italy
- Julien Bloino
  - Scuola Normale Superiore, Piazza dei Cavalieri 7, I-56126 Pisa, Italy
- Nicola Tasinato
  - Scuola Normale Superiore, Piazza dei Cavalieri 7, I-56126 Pisa, Italy
- Vincenzo Barone
  - Scuola Normale Superiore, Piazza dei Cavalieri 7, I-56126 Pisa, Italy