1. Lewis KO, Popov V, Fatima SS. From static web to metaverse: reinventing medical education in the post-pandemic era. Ann Med 2024;56:2305694. [PMID: 38261592] [PMCID: PMC10810636] [DOI: 10.1080/07853890.2024.2305694]
Abstract
The advancement of computer technology in the 1960s and the World Wide Web in the 1990s laid the groundwork for substantial and simultaneous change in many facets of our lives, including medicine, health care, and medical education. The traditional didactic approach has shifted toward more dynamic and interactive methods, leveraging technologies such as simulation tools, virtual reality, and online platforms. At the forefront is a remarkable evolution that has revolutionized how medical knowledge is accessed, disseminated, and integrated into pedagogical practice. The COVID-19 pandemic also drove rapid, large-scale adoption of e-learning and digital resources in medical education because of widespread lockdowns, social-distancing measures, and the closure of medical schools and healthcare training programs. This review examines the evolution of medical education from the Flexnerian era to the modern digital age, closely examining the influence of the evolving WWW and the shift from Education 1.0 to Education 4.0. This evolution has been further accentuated by the transition from the static landscapes of Web 2D to the immersive realms of Web 3D, especially in light of the growing notion of the metaverse. The metaverse, an interconnected virtual shared space that incorporates virtual reality (VR), augmented reality (AR), and mixed reality (MR), creates fertile ground for simulation-based training, collaborative learning, and experiential skill acquisition for competency development. This review covers the multifaceted applications of the metaverse in medical education, outlining both its benefits and its challenges. Through case studies and examples, it highlights the innovative potential of the metaverse as a platform for immersive learning experiences. Moreover, the review addresses the role of emerging technologies in shaping the post-pandemic future of medical education, culminating in a series of recommendations tailored for medical institutions aiming to capitalize on these revolutionary changes.
Affiliation(s)
- Kadriye O. Lewis
- Children's Mercy Kansas City, Department of Pediatrics, UMKC School of Medicine, Kansas City, MO, USA
- Vitaliy Popov
- Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, USA
- Syeda Sadia Fatima
- Department of Biological and Biomedical Sciences, The Aga Khan University, Karachi, Pakistan
2. Levschuk A, Whittal J, Trejos AL, Sirek A. Leveraging Space-Flown Technologies to Deliver Healthcare with Holographic Physical Examinations. Aerosp Med Hum Perform 2024;95:214-218. [PMID: 38486313] [DOI: 10.3357/amhp.6397.2024]
Abstract
INTRODUCTION: Musculoskeletal injuries are among the most common injuries in spaceflight. Physical assessment of an injury is essential for diagnosis and treatment. Unfortunately, when musculoskeletal injuries occur in space, the flight surgeon is limited to two-dimensional videoconferencing and, potentially, observations made by the crew medical officer. To address these limitations, we investigated the feasibility of performing physical examinations on a three-dimensional augmented reality projection using a mixed-reality headset, specifically evaluating a standard shoulder examination. METHODS: A simulated patient interaction was set up between Western University in London, Ontario, Canada, and Huntsville, AL, United States. The exam was performed by a medical student, and a healthy adult man volunteered as the subject of the physical exam. RESULTS: All parts of the standard shoulder physical examination, per the Bates Guide to the Physical Exam, were performed with holoportation. Adaptation was required for palpation and some special tests. DISCUSSION: All parts of the physical exam could be completed. The true-to-anatomy size of the holograms permitted improved inspection of the anatomy compared with traditional videoconferencing. Palpation was completed by instructing the patient to palpate themselves and comment on the findings queried by the examiner. Range-of-motion and special tests for specific pathologies were also completed with some modifications, since the examiner was not present to provide resistance. Future work should aim to improve the graphics, physician communication, and haptic feedback during holoportation.
3. Shabir D, Anjum A, Hamza H, Padhan J, Al-Ansari A, Yaacoub E, Mohammed A, Navkar NV. Development and Evaluation of a Mixed-Reality Tele-ultrasound System. Ultrasound Med Biol 2023;49:1867-1874. [PMID: 37263893] [DOI: 10.1016/j.ultrasmedbio.2023.04.017]
Abstract
OBJECTIVE The objective of this feasibility study was to develop and assess a tele-ultrasound system that enables an expert sonographer (situated at the remote site) to provide real-time guidance to an operator (situated at the imaging site) using a mixed-reality environment. METHODS An architecture and operational workflow were designed, and a prototype was developed that provides guidance in the form of audiovisual cues. The visual cues comprise holograms (of the ultrasound images and ultrasound probe) rendered to the operator through a head-mounted display device. The position and orientation of the ultrasound probe's hologram are remotely controlled by the expert sonographer and guide the placement of the physical ultrasound probe at the imaging site. The prototype's network performance was evaluated, and a user study (with 12 participants) was conducted to assess the operator's ability to align the probe under different guidance modes. RESULTS The network evaluation revealed that the view of the imaging site and the ultrasound images were transferred to the remote site in 233 ± 42 ms and 158 ± 38 ms, respectively. The expert sonographer was able to transfer data on the position and orientation of the ultrasound probe's hologram to the imaging site in 78 ± 13 ms. The user study indicated that the audiovisual cues are sufficient for an operator to position and orient a physical probe for accurate depiction of the targeted tissue (p < 0.001). Translational and rotational probe-placement errors were 1.4 ± 0.6 mm and 5.4 ± 2.2°, respectively. CONCLUSION This work illustrates the feasibility of using a mixed-reality environment for effective communication between an expert sonographer (ultrasound physician) and an operator. Further studies are required to determine its applicability in a clinical tele-ultrasound setting.
Affiliation(s)
- Dehlela Shabir
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Arshak Anjum
- Department of Computer Science and Engineering, Qatar University, Doha, Qatar
- Hawa Hamza
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Elias Yaacoub
- Department of Computer Science and Engineering, Qatar University, Doha, Qatar
- Amr Mohammed
- Department of Computer Science and Engineering, Qatar University, Doha, Qatar
- Nikhil V Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
4. Khorasani M, Abdurahiman N, Padhan J, Zhao H, Al-Ansari A, Becker AT, Navkar N. Preliminary design and evaluation of a generic surgical scope adapter. Int J Med Robot 2023;19:e2475. [PMID: 36288569] [DOI: 10.1002/rcs.2475]
Abstract
BACKGROUND Robotic scope-assistant systems are used to visualise and navigate the operative field during laparoscopic surgery. The objective of this work is to design a surgical scope adapter that enables control of different scope types (zero-degree, angulated, and articulated) and can be connected to any six-degree-of-freedom robotic manipulator for use as a robotic scope-assistant system. METHODS A surgical scope adapter compatible with different camera heads and scope types was designed and prototyped. The technical performance of the scope adapter was evaluated, and a user study was conducted to assess human-in-the-loop control. RESULTS All subjects were able to navigate the simulated operative field. The scope adapter permits continuous motion to explore the operative field as well as intermittent motion to focus accurately on targeted anatomical landmarks. CONCLUSION The modular and generic nature of the surgical scope adapter may enable its use across different minimally invasive surgeries.
Affiliation(s)
- Haoran Zhao
- Department of Electrical and Computer Engineering, University of Houston, Houston, Texas, USA
- Aaron T Becker
- Department of Electrical and Computer Engineering, University of Houston, Houston, Texas, USA
- Nikhil Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
5. Gsaxner C, Li J, Pepe A, Jin Y, Kleesiek J, Schmalstieg D, Egger J. The HoloLens in medicine: A systematic review and taxonomy. Med Image Anal 2023;85:102757. [PMID: 36706637] [DOI: 10.1016/j.media.2023.102757]
Abstract
The HoloLens (Microsoft Corp., Redmond, WA), a head-worn, optically see-through augmented reality (AR) display, is the main driver of the recent surge in medical AR research. In this systematic review, we provide a comprehensive overview of the usage of the first-generation HoloLens within the medical domain, from its release in March 2016 through 2021. We identified 217 relevant publications through a systematic search of the PubMed, Scopus, IEEE Xplore, and SpringerLink databases. We propose a new taxonomy covering use case, technical methodology for registration and tracking, data sources, visualization, and validation and evaluation, and analyze the retrieved publications accordingly. We find that the bulk of research focuses on supporting physicians during interventions, where the HoloLens is promising for procedures usually performed without image guidance. However, the consensus is that accuracy and reliability are still too low to replace conventional guidance systems. Medical students are the second most common target group, for whom AR-enhanced medical simulators emerge as a promising technology. While concerns about human-computer interaction, usability, and perception are frequently mentioned, hardly any concepts to overcome these issues have been proposed. Instead, registration and tracking lie at the core of most reviewed publications, yet only a few propose innovative concepts in this direction. Finally, we find that the validation of HoloLens applications suffers from a lack of standardized and rigorous evaluation protocols. We hope that this review can advance medical AR research by identifying gaps in the current literature and pave the way for novel, innovative directions and translation into routine medical practice.
Affiliation(s)
- Christina Gsaxner
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jianning Li
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Antonio Pepe
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Yuan Jin
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Research Center for Connected Healthcare Big Data, Zhejiang Lab, Hangzhou, 311121 Zhejiang, China
- Jens Kleesiek
- Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
- Dieter Schmalstieg
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; BioTechMed, 8010 Graz, Austria
- Jan Egger
- Institute of Computer Graphics and Vision, Graz University of Technology, 8010 Graz, Austria; Institute of AI in Medicine, University Medicine Essen, 45131 Essen, Germany; BioTechMed, 8010 Graz, Austria; Cancer Research Center Cologne Essen, University Medicine Essen, 45147 Essen, Germany
6. Hamza H, Baez VM, Al-Ansari A, Becker AT, Navkar NV. User interfaces for actuated scope maneuvering in surgical systems: a scoping review. Surg Endosc 2023. [DOI: 10.1007/s00464-023-09981-0]
Abstract
Background
A variety of human-computer interfaces are used by robotic surgical systems to control and actuate camera scopes during minimally invasive surgery. The purpose of this review is to examine the different user interfaces used in both commercial systems and research prototypes.
Methods
A comprehensive scoping review of scientific literature was conducted using PubMed and IEEE Xplore databases to identify user interfaces used in commercial products and research prototypes of robotic surgical systems and robotic scope holders. Papers related to actuated scopes with human–computer interfaces were included. Several aspects of user interfaces for scope manipulation in commercial and research systems were reviewed.
Results
Scope assistance was classified into robotic surgical systems (for multiple port, single port, and natural orifice) and robotic scope holders (for rigid, articulated, and flexible endoscopes). Benefits and drawbacks of control by different user interfaces such as foot, hand, voice, head, eye, and tool tracking were outlined. In the review, it was observed that hand control, with its familiarity and intuitiveness, is the most used interface in commercially available systems. Control by foot, head tracking, and tool tracking are increasingly used to address limitations, such as interruptions to surgical workflow, caused by using a hand interface.
Conclusion
Integrating a combination of different user interfaces for scope manipulation may provide maximum benefit for the surgeons. However, smooth transition between interfaces might pose a challenge while combining controls.
7. Abdurahiman N, Khorasani M, Padhan J, Baez VM, Al-Ansari A, Tsiamyrtzis P, Becker AT, Navkar NV. Scope actuation system for articulated laparoscopes. Surg Endosc 2023;37:2404-2413. [PMID: 36750488] [PMCID: PMC10017632] [DOI: 10.1007/s00464-023-09904-z]
Abstract
BACKGROUND An articulated laparoscope comprises a rigid shaft with an articulated distal end that changes the viewing direction. The articulation provides improved navigation of the operative field in confined spaces, and incorporation of an actuation system can further enhance control of an articulated laparoscope. METHODS A preliminary prototype of a scope actuation system to maneuver an off-the-shelf articulated laparoscope (EndoCAMeleon by Karl Storz, Germany) was developed. A user study was conducted to evaluate this prototype for the surgical paradigm of video-assisted thoracic surgery. In the study, subjects maneuvered an articulated scope under two modes of operation: (a) actuated mode, in which the operating surgeon maneuvers the scope using the developed prototype, and (b) manual mode, in which a surgical assistant directly maneuvers the scope. The actuated mode was further assessed for multiple configurations based on the orientation of the articulated scope at the incision. RESULTS The data show that the actuated mode scored better than the manual mode on all measured performance parameters, including (a) total duration to visualize a marked region, (b) duration for which the scope focus shifted outside a predefined visualization region, and (c) number of times the scope focus shifted outside a predefined visualization region. Among the different configurations tested in the actuated mode, no significant difference was observed. CONCLUSIONS The proposed articulated-scope actuation system facilitates better navigation of an operative field than a human assistant. Moreover, irrespective of the orientation in which the articulated scope's shaft is inserted through the incision, the actuation system can navigate and visualize the operative field.
Affiliation(s)
- Victor M Baez
- Department of Electrical Engineering, University of Houston, Houston, TX, USA
- Aaron T Becker
- Department of Electrical Engineering, University of Houston, Houston, TX, USA
- Nikhil V Navkar
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Department of Surgery, Surgical Research Section, Hamad General Hospital, Hamad Medical Corporation, PO Box 3050, Doha, Qatar
8. Palumbo A. Microsoft HoloLens 2 in Medical and Healthcare Context: State of the Art and Future Prospects. Sensors (Basel) 2022;22:7709. [PMID: 36298059] [PMCID: PMC9611914] [DOI: 10.3390/s22207709]
Abstract
Although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only recent technological and scientific advances have made them suitable for revolutionizing clinical care and medical contexts through enhanced functionalities and improved health services. This systematic review provides the state of the art of Microsoft HoloLens 2 applications in medical and healthcare contexts. Focusing on the potential of this technology for providing digitally supported clinical care, including but not limited to the COVID-19 pandemic, we considered studies that demonstrated the applicability and feasibility of HoloLens 2 in medical and healthcare scenarios. The review thoroughly examines the studies conducted since 2019, focusing on HoloLens 2 medical sub-field applications, the device functionalities provided to users, the software/platform/framework used, and the study validation. The results highlight the potential and limitations of HoloLens 2-based solutions and bring focus to emerging research topics such as telemedicine, remote control, and motor rehabilitation.
Affiliation(s)
- Arrigo Palumbo
- Department of Medical and Surgical Sciences, Magna Græcia University, 88100 Catanzaro, Italy
9. Shabir D, Anbatawi M, Padhan J, Balakrishnan S, Al-Ansari A, Abinahed J, Tsiamyrtzis P, Yaacoub E, Mohammed A, Deng Z, Navkar NV. Evaluation of user-interfaces for controlling movements of virtual minimally invasive surgical instruments. Int J Med Robot 2022;18:e2414. [DOI: 10.1002/rcs.2414]
Affiliation(s)
- Dehlela Shabir
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Malek Anbatawi
- Department of Surgery, Hamad Medical Corporation, Doha, Qatar
- Elias Yaacoub
- Department of Computer Science and Engineering, Qatar University, Doha, Qatar
- Amr Mohammed
- Department of Computer Science and Engineering, Qatar University, Doha, Qatar
- Zhigang Deng
- Department of Computer Science, University of Houston, Houston, Texas, USA
10. Quesada-Olarte J, Carrion RE, Fernandez-Crespo R, Henry GD, Simhan J, Shridharani A, Carrion RE, Hakky TS. Extended Reality-Assisted Surgery as a Surgical Training Tool: Pilot Study Presenting First HoloLens-Assisted Complex Penile Revision Surgery. J Sex Med 2022;19:1580-1586. [PMID: 36088277] [DOI: 10.1016/j.jsxm.2022.07.010]
Abstract
BACKGROUND Extended reality-assisted urologic surgery (XRAS) is a novel technology that superimposes a computer-generated image onto the physician's field of view to integrate common elements of the surgical process in greater detail. The extended reality (XR) interface is generated using optical head-mounted display (OHMD) devices. AIM To present the first case of HoloLens-assisted complex penile revision surgery. METHODS We describe our pilot study of HoloLens-assisted penile revision surgery and present a thorough review of the literature on XRAS technology and innovative OHMD devices. OUTCOMES The ability of XRAS technology to superimpose a computer-generated image of the patient and integrate common elements of the surgical planning process with long-distance experts. RESULTS XRAS is a feasible technology for complex penile surgical planning. CLINICAL TRANSLATION XRAS and OHMD devices are novel technologies applicable to urological surgical training and planning. STRENGTHS AND LIMITATIONS Evidence suggests that the use of OHMD devices is safe and beneficial for surgeons. We intend to pioneer HoloLens technology in the surgical planning of a malfunctioning penile implant due to herniation of the cylinder. This technology has not previously been used in prosthetic surgery, and current data on XRAS are limited. CONCLUSION OHMD devices are effective in the operative setting. Herein, we successfully demonstrate the integration of Microsoft HoloLens 2 into a penile surgical planning process for the first time. Further development and study of this technology are necessary to better characterize XRAS as a training and surgical-planning tool.
11. Doughty M, Ghugre NR, Wright GA. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J Imaging 2022;8:203. [PMID: 35877647] [PMCID: PMC9318659] [DOI: 10.3390/jimaging8070203]
Abstract
We conducted a systematic review of recent literature to understand the current challenges in the use of optical see-through head-mounted displays (OST-HMDs) for augmented reality (AR) assisted surgery. Using Google Scholar, 57 relevant articles from 1 January 2021 through 18 March 2022 were identified. Selected articles were then categorized based on a taxonomy describing the required components of an effective AR-based navigation system: data, processing, overlay, view, and validation. Our findings indicated a focus on orthopedic (n=20) and maxillofacial (n=8) surgeries. For preoperative input data, computed tomography (CT) (n=34) and surface-rendered models (n=39) were most commonly used to represent image information. Virtual content was commonly superimposed directly on the target site (n=47); this was achieved by surface tracking of fiducials (n=30), external tracking (n=16), or manual placement (n=11). Microsoft HoloLens devices (n=24 in 2021, n=7 in 2022) were the most frequently used OST-HMDs; gestures and/or voice (n=32) served as the preferred interaction paradigm. Though promising system accuracy on the order of 2–5 mm has been demonstrated in phantom models, several human-factors and technical challenges (perception, ease of use, context, interaction, and occlusion) remain to be addressed prior to widespread adoption of OST-HMD-led surgical navigation.
Affiliation(s)
- Mitchell Doughty
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Nilesh R. Ghugre
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
- Graham A. Wright
- Department of Medical Biophysics, University of Toronto, Toronto, ON M5S 1A1, Canada
- Schulich Heart Program, Sunnybrook Health Sciences Centre, Toronto, ON M4N 3M5, Canada
- Physical Sciences Platform, Sunnybrook Research Institute, Toronto, ON M4N 3M5, Canada
12. 3D visualization of perianal fistulas using parametric models. Tech Coloproctol 2022;26:291-300. [DOI: 10.1007/s10151-022-02573-5]