76
Steiert C, Behringer SP, Kraus LM, Bissolo M, Demerath T, Beck J, Grauvogel J, Reinacher PC. Augmented reality-assisted craniofacial reconstruction in skull base lesions - an innovative technique for single-step resection and cranioplasty in neurosurgery. Neurosurg Rev 2022; 45:2745-2755. PMID: 35441994; PMCID: PMC9349131; DOI: 10.1007/s10143-022-01784-6.
Abstract
Defects of the cranial vault often require cosmetic reconstruction with patient-specific implants, particularly in cases of craniofacial involvement. However, fabrication takes time and is expensive; therefore, efforts must be made to develop more rapidly available and more cost-effective alternatives. The current study investigated the feasibility of an augmented reality (AR)-assisted single-step procedure for repairing bony defects involving the facial skeleton and the skull base. In an experimental setting, nine neurosurgeons fabricated AR-assisted and conventionally shaped ("freehand") implants from polymethylmethacrylate (PMMA) on a skull model with a craniofacial bony defect. Deviations of the surface profile from the original model were quantified by volumetry, and the cosmetic results were evaluated using a multicomponent scoring system, each assessed by two blinded neurosurgeons. Handling the AR equipment proved to be quite comfortable. The median volume deviating from the surface profile of the original model was low for the AR-assisted implants (6.40 cm3) and significantly reduced in comparison with the conventionally shaped implants (13.48 cm3). The cosmetic appearance of the AR-assisted implants was rated as very good (median 25.00 out of 30 points) and significantly improved in comparison with the conventionally shaped implants (median 14.75 out of 30 points). Our experiments showed outstanding results regarding the possibilities of AR-assisted procedures for single-step reconstruction of craniofacial defects. Although patient-specific implants remain the gold standard in esthetic terms, AR-assisted procedures hold high potential as an immediately and widely available, cost-effective alternative providing excellent cosmetic outcomes.
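The paired design described above (each of the nine surgeons shaping one AR-assisted and one freehand implant) lends itself to a simple nonparametric comparison. The sketch below uses hypothetical per-surgeon deviation volumes and an exact sign test; the abstract does not state which statistical test the authors actually used.

```python
from math import comb

def sign_test_p(n_pos: int, n: int) -> float:
    """Exact two-sided sign test: doubled upper-tail binomial probability."""
    tail = sum(comb(n, k) for k in range(n_pos, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Hypothetical per-surgeon deviation volumes (cm^3), AR-assisted vs freehand
ar = [5.1, 6.4, 7.0, 5.9, 6.8, 6.2, 7.3, 5.5, 6.6]
freehand = [12.0, 13.5, 14.1, 11.8, 15.2, 13.0, 16.4, 12.7, 13.9]
wins = sum(a < f for a, f in zip(ar, freehand))  # AR deviates less in all 9 pairs
p = sign_test_p(wins, len(ar))  # 2 * (1/2)**9 ≈ 0.0039
```

With all nine surgeons producing a smaller deviation under AR guidance, even this conservative test reaches significance at the 0.05 level.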
77
Dolega-Dolegowski D, Proniewska K, Dolega-Dolegowska M, Pregowska A, Hajto-Bryk J, Trojak M, Chmiel J, Walecki P, Fudalej PS. Application of holography and augmented reality based technology to visualize the internal structure of the dental root - a proof of concept. Head Face Med 2022; 18:12. PMID: 35382839; PMCID: PMC8981712; DOI: 10.1186/s13005-022-00307-4.
Abstract
BACKGROUND Augmented Reality (AR) blends digital information with the real world. Thanks to cameras, sensors, and displays, it can supplement the physical world with holographic images. Nowadays, applications of AR range from navigated surgery to vehicle navigation. DEVELOPMENT The purpose of this feasibility study was to develop an AR holographic system implementing Vertucci's classification of dental root morphology to facilitate the study of tooth anatomy. It was tailored to run on the HoloLens 2 (Microsoft) AR glasses. The 3D tooth models were created in Autodesk Maya and exported to the Unity engine. The holograms of dental roots can be projected in the natural setting of a dental office. The application displays 3D objects in such a way that they can be rotated, zoomed in and out, and penetrated. The advantage of the proposed approach is that students can learn the 3D internal anatomy of teeth without environmental visual restrictions. CONCLUSIONS It is feasible to visualize internal dental root anatomy with an AR holographic system. AR holograms seem to be an attractive adjunct for learning root anatomy.
78
Kang YJ, Kang Y. Mixed reality-based online interprofessional education: a case study in South Korea. Korean J Med Educ 2022; 34:63-69. PMID: 35255617; PMCID: PMC8906924; DOI: 10.3946/kjme.2022.220.
Abstract
PURPOSE The purpose of this study was to explore undergraduate medical and nursing students' satisfaction with their mixed reality (MR)-based online interprofessional learning experience in South Korea. METHODS This study used a case study design. A convenience sample of 30 participants (15 third-year medical students and 15 fourth-year nursing students) took part in a 120-minute MR-based online interprofessional education (IPE) session that consisted of visualization of a holographic standardized patient with ischemic stroke, an online interprofessional activity, and debriefing and reflection sessions. Following the MR-based online IPE, data were collected through a modified Satisfaction with Simulation Experience Scale survey and were analyzed using descriptive analyses and independent t-tests. RESULTS Although both medical and nursing students were highly satisfied with the MR-based online interprofessional learning experience, nursing students were significantly more satisfied than medical students. CONCLUSION These results suggest that integrating MR and an online approach through a structured clinical reasoning process in undergraduate health professions programs can serve as an educational strategy to improve clinical reasoning and critical thinking and to promote interprofessional understanding.
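The between-group comparison above rests on an independent two-sample t-test. A minimal sketch with hypothetical 5-point satisfaction ratings (the study's raw scores are not given in the abstract, and the exact t-test variant the authors used is not stated):

```python
import statistics as st

def welch_t(x, y):
    """Welch's t statistic for two independent samples with unequal variances."""
    vx, vy = st.variance(x), st.variance(y)
    return (st.mean(x) - st.mean(y)) / (vx / len(x) + vy / len(y)) ** 0.5

# Hypothetical satisfaction ratings (1-5 scale), 15 students per group
nursing = [4.8, 4.6, 4.9, 4.7, 4.5, 4.8, 4.9, 4.6, 4.7, 4.8, 4.9, 4.5, 4.7, 4.8, 4.6]
medical = [4.2, 4.0, 4.3, 4.1, 4.4, 4.0, 4.2, 4.3, 4.1, 4.0, 4.2, 4.4, 4.1, 4.3, 4.0]
t = welch_t(nursing, medical)  # large positive t: nursing group rated higher
```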
79
Xi N, Chen J, Gama F, Riar M, Hamari J. The challenges of entering the metaverse: An experiment on the effect of extended reality on workload. Inf Syst Front 2022; 25:659-680. PMID: 35194390; PMCID: PMC8852991; DOI: 10.1007/s10796-022-10244-x.
Abstract
Information technologies exist to enable us either to do things we have not done before or to do familiar things more efficiently. The metaverse (i.e. extended reality: XR) enables novel forms of engrossing telepresence, but it may also make mundane tasks easier. Such technologies increasingly facilitate our work, education, healthcare, consumption and entertainment; at the same time, however, the metaverse brings a host of challenges. Therefore, we pose the question of whether XR technologies, specifically Augmented Reality (AR) and Virtual Reality (VR), increase or decrease the difficulty of carrying out everyday tasks. In the current study we conducted a 2 (AR: with vs. without) × 2 (VR: with vs. without) between-subject experiment in which participants faced a shopping-related task (including navigation, movement, hand interaction, information processing, information searching, storing, decision making, and simple calculation) to examine a proposed series of hypotheses. The NASA Task Load Index (NASA-TLX) was used to measure subjective workload when using an XR-mediated information system, including the six sub-dimensions of frustration, performance, effort, physical, mental, and temporal demand. The findings indicate that AR was significantly associated with overall workload, especially mental demand and effort, while VR had no significant effect on any workload sub-dimension. There was a significant interaction effect between AR and VR on physical demand, effort, and overall workload. The results imply that the resources and cost of operating XR-mediated realities are different from, and higher than, those of physical reality.
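For reference, the unweighted ("raw") NASA-TLX overall workload score is simply the mean of the six subscale ratings. A small sketch with hypothetical 0-100 ratings for one participant (the study may have used the weighted, pairwise-comparison variant instead):

```python
def raw_tlx(mental, physical, temporal, performance, effort, frustration):
    """Raw (unweighted) NASA-TLX: mean of the six subscale ratings (0-100)."""
    return (mental + physical + temporal + performance + effort + frustration) / 6

# Hypothetical subscale ratings for one participant in an AR condition
overall = raw_tlx(mental=70, physical=40, temporal=55,
                  performance=30, effort=65, frustration=45)  # ≈ 50.8
```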
80
Zhu LY, Hou JC, Yang L, Liu ZR, Tong W, Bai Y, Zhang YM. Application value of mixed reality in hepatectomy for hepatocellular carcinoma. World J Gastrointest Surg 2022; 14:36-45. PMID: 35126861; PMCID: PMC8790326; DOI: 10.4240/wjgs.v14.i1.36.
Abstract
BACKGROUND As a new digital holographic imaging technology, mixed reality (MR) has unique advantages in delineating liver anatomy and locating tumor lesions. With the popularization of 5G communication technology, MR shows great potential in preoperative planning and intraoperative navigation, making hepatectomy more accurate and safer.
AIM To evaluate the application value of MR technology in hepatectomy for hepatocellular carcinoma (HCC).
METHODS The clinical data of 95 patients who underwent open hepatectomy for HCC between June 2018 and October 2020 at our hospital were analyzed retrospectively. Patients were selected according to predefined inclusion and exclusion criteria. In 38 patients, hepatectomy was assisted by MR (Group A), and the remaining 57 patients underwent traditional hepatectomy without MR (Group B). The perioperative outcomes of the two groups were collected and compared to evaluate the application value of MR in hepatectomy for patients with HCC.
RESULTS We summarized the technical process of MR-assisted hepatectomy in the treatment of HCC. Compared to traditional hepatectomy in Group B, MR-assisted hepatectomy in Group A yielded a shorter operation time (202.86 ± 46.02 min vs 229.52 ± 57.13 min, P = 0.003), less intraoperative bleeding (329.29 ± 97.31 mL vs 398.23 ± 159.61 mL, P = 0.028), and a shorter portal vein occlusion time (17.71 ± 4.16 min vs 21.58 ± 5.24 min, P = 0.019). Group A had lower alanine aminotransferase and higher albumin values on the third day after the operation (119.74 ± 29.08 U/L vs 135.53 ± 36.68 U/L, P = 0.029 and 33.60 ± 3.21 g/L vs 31.80 ± 3.51 g/L, P = 0.014, respectively). Total postoperative complications and hospitalization days in Group A were also significantly lower than in Group B [14 (37.84%) vs 35 (60.34%), P = 0.032 and 12.05 ± 4.04 d vs 13.78 ± 4.13 d, P = 0.049, respectively].
CONCLUSION MR has some application value in three-dimensional visualization of the liver, surgical planning, and intraoperative navigation during hepatectomy, and it significantly improves the perioperative outcomes of hepatectomy for HCC.
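The group comparisons above can be checked approximately from the reported summary statistics alone. The sketch below computes a Student's pooled two-sample t statistic for operation time; the authors' exact test (and hence their exact p-value) is not specified in the abstract, so this is illustrative only.

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Student's two-sample t statistic from group means, SDs, and sizes."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Operation time (min) as reported: Group A (MR, n=38) vs Group B (n=57)
t = pooled_t(202.86, 46.02, 38, 229.52, 57.13, 57)
# |t| ≈ 2.4, above the df=93 two-sided critical value (~1.99),
# consistent with a significant difference between groups
```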
81
Elawady M, Sarhan A, Alshewimy MAM. Toward a mixed reality domain model for time-sensitive applications using IoE infrastructure and edge computing (MRIoEF). J Supercomput 2022; 78:10656-10689. PMID: 35095192; PMCID: PMC8785157; DOI: 10.1007/s11227-022-04307-8.
Abstract
Mixed reality (MR) is one of the technologies with many challenges in the design and implementation phases, especially the problems associated with time-sensitive applications. The main objective of this paper is to introduce a conceptual model that gives MR applications a new layer of interactivity by using Internet of Things/Internet of Everything (IoT/IoE) models, providing an improved quality of experience for end users. The model supports cloud and fog compute layers to offer functionalities that require more processing resources and to reduce latency for time-sensitive applications. The proposed model is validated by demonstrating a prototype applied to a real-time case study and by discussing how to enable standard technologies for the various components of the model. The prototype also shows the applicability of the model, the ease of defining roles, and the coherence of the data and processes found in the most common applications.
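At its simplest, the cloud/fog split described above amounts to routing each task to the tier whose round-trip latency fits the task's deadline. A minimal sketch under assumed, hypothetical latency figures; the paper's actual model and parameters are considerably more elaborate than this:

```python
# Hypothetical round-trip times (ms) for each compute tier
EDGE_RTT_MS = 10   # fog/edge node close to the MR headset
CLOUD_RTT_MS = 80  # remote cloud data center

def dispatch(deadline_ms: float) -> str:
    """Route a task to the edge if the cloud round trip would miss its deadline."""
    return "edge" if deadline_ms < CLOUD_RTT_MS else "cloud"

# e.g. pose update (16 ms), frame-level interaction (33 ms), offline analytics (200 ms)
tiers = [dispatch(d) for d in (16, 33, 200)]  # -> ['edge', 'edge', 'cloud']
```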
82
Mixed reality needle guidance application on smartglasses without pre-procedural CT image import with manually matching coordinate systems. Cardiovasc Intervent Radiol 2022; 45:349-356. PMID: 35022858; DOI: 10.1007/s00270-021-03029-3.
Abstract
PURPOSE To develop and assess the accuracy of a mixed reality (MR) needle guidance application on smartglasses. MATERIALS AND METHODS An MR needle guidance application on HoloLens 2 was developed that dispenses with pre-procedural CT image reconstruction or import by manually matching the spatial and MR coordinate systems. First, the accuracy of the target locations in the image overlay, at 63 points arranged on a 45 × 35 × 21 cm box with needle angles from 0° to 80°, was verified. Needle placement errors from 12 different entry points in a phantom by seven operators (four physicians and three non-physicians) were then compared between MR guidance and the conventional method using protractors, with a linear mixed model. RESULTS The average errors of the target locations and needle angles placed using the MR application were 5.9 ± 2.6 mm and 2.3 ± 1.7°, respectively. The average needle insertion error using MR guidance was slightly smaller than with the conventional method (8.4 ± 4.0 mm vs. 9.6 ± 5.1 mm, p = 0.091), particularly in the out-of-plane approach (9.6 ± 3.5 mm vs. 12.3 ± 4.6 mm, p = 0.003). The procedural time was longer with MR guidance than with the conventional method (412 ± 134 s vs. 219 ± 66 s, p < 0.001). CONCLUSION MR needle guidance without pre-procedural CT image import is feasible when matching coordinate systems, and the accuracy of needle insertion is slightly better than that of the conventional method.
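The two accuracy metrics reported above, tip position error and needle angle error, are plain Euclidean-distance and vector-angle computations. A small sketch with hypothetical planned/actual points and direction vectors:

```python
from math import acos, degrees, sqrt

def position_error(planned, actual):
    """Euclidean distance (mm) between planned and actual needle-tip points."""
    return sqrt(sum((p - a) ** 2 for p, a in zip(planned, actual)))

def angle_error(v1, v2):
    """Angle (degrees) between planned and actual needle direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = sqrt(sum(a * a for a in v1))
    n2 = sqrt(sum(b * b for b in v2))
    return degrees(acos(dot / (n1 * n2)))

e = position_error((0, 0, 0), (3, 4, 0))  # 5.0 mm off target
a = angle_error((0, 0, 1), (0, 1, 1))     # 45 degrees off axis
```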
83
Aslan S, Agrawal A, Alyuz N, Chierichetti R, Durham LM, Manuvinakurike R, Okur E, Sahay S, Sharma S, Sherry J, Raffa G, Nachman L. Exploring Kid Space in the wild: a preliminary study of multimodal and immersive collaborative play-based learning experiences. Educ Technol Res Dev 2022; 70:205-230. PMID: 35035182; PMCID: PMC8741584; DOI: 10.1007/s11423-021-10072-x.
Abstract
Parents recognize the potential benefits of technology for their young children but are wary of too much screen time and its potential deficits in terms of social engagement and physical activity. To address these concerns, the related literature suggests technology usage that blends digital and physical learning experiences. Towards this end, we developed Kid Space, incorporating immersive computing experiences designed to engage children more actively in physical movement and social collaboration during play-based learning. The technology features an animated peer learner, Oscar, who aims to understand and respond to children's actions and utterances using extensive multimodal sensing and sensemaking technologies. To investigate student engagement during Kid Space learning experiences, an exploratory case study was designed using a formative research method with eight first-grade students. Multimodal data (audio and video) along with observational, interview, and questionnaire data were collected and analyzed. The results show that the students demonstrated high levels of engagement, less attention focused on the screen (projected wall), and more physical activity. Beyond these promising results, the study also yielded actionable insights for improving Kid Space in future deployments (e.g., the need for real-time personalization). We plan to incorporate the lessons learned from this preliminary study and deploy Kid Space with real-time personalization features for longer periods with more students.
84
Goharinejad S, Goharinejad S, Hajesmaeel-Gohari S, Bahaadinbeigy K. The usefulness of virtual, augmented, and mixed reality technologies in the diagnosis and treatment of attention deficit hyperactivity disorder in children: an overview of relevant studies. BMC Psychiatry 2022; 22:4. PMID: 34983446; PMCID: PMC8728980; DOI: 10.1186/s12888-021-03632-1.
Abstract
BACKGROUND Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental condition characterized by attention problems, excessive physical activity, and impulsivity. ADHD affects not only patients but also their families. The development and use of technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) for ADHD have increased over recent years. However, little is known about their potential usefulness. This overview aimed to clarify the current knowledge about the use of these three innovative technologies for the diagnosis and treatment of children with ADHD. METHODS This overview was conducted using the PubMed, Web of Science, and Scopus databases up to January 24th, 2021. The following descriptive information was compiled from the identified studies: country, year of publication, sample size, study design, ADHD diagnosis methods, applied technology, hardware equipment, clinical target, and main findings. RESULTS The initial database searches yielded 409 articles, of which 103 were removed as duplicates. Eventually, 30 eligible studies remained for analysis, the majority of which were case-control (n = 22, 73%). Regarding the applied technology and hardware, VR (n = 27, 90%), head-mounted displays (n = 19, 63%), and VR-based continuous performance tests (VR-CPT; n = 21, 70%) were most frequently used. Most studies (n = 21, 70%) used the DSM criteria for the diagnosis of childhood ADHD. They primarily evaluated the utility of these technologies in assessing ADHD symptoms (n = 10, 33%) and improving the ADHD diagnostic process (n = 7, 23%). CONCLUSION This comprehensive overview evaluated the studies on the use of VR, AR, and MR technologies for children with ADHD. These technologies seem to be promising tools for improving the diagnosis and management of ADHD in this population.
85
Sparwasser P, Haack M, Frey L, Haferkamp A, Borgmann H. [Virtual and augmented reality in urology]. Urologe A 2021; 61:133-141. PMID: 34935997; PMCID: PMC8693158; DOI: 10.1007/s00120-021-01734-y.
Abstract
Although technological advances have always optimized medical care throughout its constant evolution, they have so far remained largely tangible to the user. Driven by immense financial investment, innovative products and technical solutions have emerged that are transforming everyday medical practice and will expand it by an additional dimension in the future: virtual and augmented reality. This review article summarizes current scientific projects and the future benefits of virtual and augmented reality in the field of urology.
86
Satoh M, Nakajima T, Yamaguchi T, Watanabe E, Kawai K. Evaluation of augmented-reality based navigation for brain tumor surgery. J Clin Neurosci 2021; 94:305-314. PMID: 34863455; DOI: 10.1016/j.jocn.2021.10.033.
Abstract
To date, several researchers have introduced augmented reality navigation (ARN) into neurological surgery. While its application in brain tumor surgery seems promising, reports on its utility have been limited, warranting further evaluation. To clarify the stages and approaches in which ARN is useful, and to assess the effect of presurgical discussion with surgeons, we evaluated a hand-held ARN system we had developed, which displays three-dimensional (3D) virtual structures overlaid on a real-time image of the surgical field via a tablet PC monitor. The system was tested in 20 patients undergoing various procedures: the first 10 consecutive cases were unselected, and the following 10 cases were selected, with 3D models prepared per the surgeons' requests. The surgeons then ranked the system's usefulness during each stage of surgery. Case selection and presurgical discussions with surgeons considerably improved the usefulness, with "useful" gradings improving from 50% to 88% across all surgical stages. Specifically, usefulness improved from 50% to 90%, 67% to 100%, and 40% to 80% during the skin incision and craniotomy, dural incision, and intradural procedure stages, respectively. ARN was useful for superficial tumor resection, but less so for deep-seated tumor resection, except when using the transcortical and interhemispheric approaches. In conclusion, a tablet-type ARN system can be useful during skin incisions, craniotomy and dural incisions, superficial tumor resections, and transcortical and interhemispheric approaches for deep-seated tumors. Case selection and presurgical discussions with surgeons were essential for the efficacy of ARN.
87
Examining the benefits of extended reality in neurosurgery: a systematic review. J Clin Neurosci 2021; 94:41-53. PMID: 34863461; DOI: 10.1016/j.jocn.2021.09.037.
Abstract
While well-established in other surgical subspecialties, the benefits of extended reality (XR), consisting of virtual reality (VR), augmented reality (AR), and mixed reality (MR) technologies, remain underexplored in neurosurgery despite increasing utilization. To address this gap, we conducted a systematic review of the effects of XR in neurosurgery, with an emphasis on the perioperative period, to provide a guide for future clinical optimization. Seven primary electronic databases were screened following guidelines outlined by PRISMA and the Institute of Medicine. Reported data related to outcomes in the perioperative period and resident training were examined, and a focused analysis of studies reporting controlled clinical outcomes was completed. After removal of duplicates, 2548 studies were screened, with 116 studies reporting measurable effects of XR in neurosurgery. The majority (82%) involved cranial applications related to tumor surgery, with 34% showing improved resection rates and functional outcomes. A rise in high-quality studies was identified from 2017 to 2020 compared to all previous years (p = 0.004). Primary users of the technology were neurosurgeons (56%, n = 65), residents (28%, n = 33), and patients (5%, n = 6). A final synthesis was conducted on 10 controlled studies reporting patient outcomes. XR technologies have demonstrated benefits in preoperative planning and multimodal neuronavigation, especially for tumor surgery. However, few studies have reported patient outcomes in a controlled design, demonstrating a need for higher quality data. XR platforms offer several advantages to improve patient outcomes and, specifically, the patient experience in neurosurgery.
88
Goh GS, Lohre R, Parvizi J, Goel DP. Virtual and augmented reality for surgical training and simulation in knee arthroplasty. Arch Orthop Trauma Surg 2021; 141:2303-2312. PMID: 34264380; DOI: 10.1007/s00402-021-04037-1.
Abstract
BACKGROUND Immersive virtual reality (IVR), augmented reality and mixed reality form a spectrum of extended reality technology integration that has gained popularity in orthopaedics recently. This review article examines the role of extended reality technologies in knee arthroplasty. METHODS Existing literature on the applications of extended reality technologies in preoperative planning and intraoperative navigation were reviewed. A sample workflow of a novel IVR simulator for improving surgical training was also provided to demonstrate its utility in educating trainees on knee arthroplasty techniques. RESULTS Extended reality technologies enable the surgeon to visualise patient-specific anatomy in real-time, enhancing preoperative planning and providing intraoperative guidance. IVR technology has the potential to revolutionise modern surgical training and optimise surgical performance in a cost-efficient manner, with current evidence demonstrating favourable immediate skill acquisition and transfer. CONCLUSIONS Extended reality technologies have a myriad of potential applications in orthopaedic surgery. Further research is needed to evaluate the cost-effectiveness of its incorporation into training programmes.
89
Ito T, Kawashima Y, Yamazaki A, Tsutsumi T. Application of a virtual and mixed reality-navigation system using commercially available devices to the lateral temporal bone resection. Ann Med Surg (Lond) 2021; 72:103063. PMID: 34824840; PMCID: PMC8604738; DOI: 10.1016/j.amsu.2021.103063.
Abstract
Background Lateral temporal bone resection (LTBR) is performed for stage T1-2 external ear malignant tumors and requires spatial anatomical knowledge of this rare surgical field. Objective This paper presents a novel virtual reality (VR)-based surgical simulation and navigation system, using only a commercially available display device and online software, to assist in understanding the anatomy pre- and intraoperatively. Result and conclusion A VR model created with 3D Slicer modules and visualized on a head-mounted display enabled users to simulate and learn the surgical technique of a rare surgical case. A 3D hologram viewed through HoloLens assisted the surgeon in comprehending the spatial relationship between crucial vital structures and the pathological lesion during the operation. This platform does not require users to possess specific programming skills or knowledge, and is therefore applicable in daily clinical use. LTBR is the standard operative procedure for early-stage malignant tumors of the external ear canal; however, many surgeons lack the opportunity to learn the surgical techniques because of its rarity. We report the use of a novel VR-based surgical simulation and navigation system for studying the anatomy and the operative steps in LTBR. 3D holograms with a head-mounted display can provide a revolutionary tool for assisting surgical planning, intraoperative referencing, and navigation in otologic and skull base surgery.
90
Porpiglia F, Checcucci E, Amparore D, Peretti D, Piramide F, De Cillis S, Piana A, Niculescu G, Verri P, Manfredi M, Poggio M, Stura I, Migliaretti G, Cossu M, Fiori C. Percutaneous kidney puncture with three-dimensional mixed-reality hologram guidance: from preoperative planning to intraoperative navigation. Eur Urol 2021; 81:588-597. PMID: 34799199; DOI: 10.1016/j.eururo.2021.10.023.
Abstract
BACKGROUND Despite technical and technological innovations, percutaneous puncture still represents the most challenging step when performing percutaneous nephrolithotomy. This maneuver has the steepest learning curve and carries a risk of injuring surrounding organs and damaging the kidney. OBJECTIVE To evaluate the feasibility of three-dimensional mixed reality (3D MR) holograms in establishing the access point and guiding the needle during percutaneous kidney puncture. DESIGN, SETTING, AND PARTICIPANTS This prospective study included ten patients who underwent 3D MR endoscopic combined intrarenal surgery (ECIRS) for kidney stones from July 2019 to January 2020. A retrospective series of patients who underwent the standard procedure was selected for matched pair analysis. SURGICAL PROCEDURE For patients who underwent 3D MR ECIRS, holograms were overlapped on the real anatomy to guide the surgeon during percutaneous puncture. In the standard group, the procedures were guided only by ultrasound and fluoroscopy. MEASUREMENTS Differences in preoperative and postoperative patient characteristics between the groups were tested using a χ2 test and a Kruskal-Wallis test for categorical and continuous variables, respectively. Results are reported as the median and interquartile range for continuous variables and as the frequency and percentage for categorical variables. RESULTS AND LIMITATIONS Ten patients underwent 3D MR ECIRS. In all cases, the inferior calyx was punctured correctly, as planned using the overlapping hologram. The median puncture and radiation exposure times were 27 min and 120 s, respectively. No intraoperative or major postoperative complications occurred. Matched pair analysis with the standard ECIRS group revealed a significantly shorter radiation exposure time for the 3D MR group (p < 0.001), even though the puncture time was longer than in the standard group (p < 0.001). Finally, use of 3D MR led to a higher success rate for renal puncture at the first attempt (100% vs 50%; p = 0.032). The main limitations of the study are the small sample size and the manual overlapping of the rigid hologram models. CONCLUSIONS Our experience demonstrates that 3D MR guidance for renal puncture is feasible and safe. The procedure proved to be effective, with the inferior calyx correctly punctured in all cases, and was associated with a low intraoperative radiation exposure time because of the MR guidance. PATIENT SUMMARY Three-dimensional virtual models visualized as holograms and intraoperatively overlapped on the patient's real anatomy seem to be a valid new tool for guiding puncture of the kidney through the skin for minimally invasive treatment.
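The first-attempt success comparison reported above (10/10 vs 5/10, p = 0.032) is consistent with a two-sided Fisher exact test, which can be reproduced from the 2×2 table with standard-library arithmetic. This is a sketch; the abstract does not name the test used for this particular comparison.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    denom = comb(n, row1)

    def pr(k):  # hypergeometric probability of k row-1 successes
        return comb(col1, k) * comb(n - col1, row1 - k) / denom

    p_obs = pr(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # sum probabilities of all tables at least as extreme as the observed one
    return sum(pr(k) for k in range(lo, hi + 1) if pr(k) <= p_obs + 1e-12)

# First-attempt puncture success: 10/10 (3D MR group) vs 5/10 (standard group)
p = fisher_exact_two_sided(10, 0, 5, 5)  # ≈ 0.0325
```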
|
91
|
[Virtual reality in teaching of psychiatry and psychotherapy at medical school]. Der Nervenarzt 2021; 93:728-734. [PMID: 34735588 PMCID: PMC8567730 DOI: 10.1007/s00115-021-01227-5] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Accepted: 10/05/2021] [Indexed: 01/15/2023]
Abstract
BACKGROUND Education and teaching must adapt to current circumstances, particularly in times of the coronavirus pandemic, especially as new digital technologies have become available. Clinical interaction and exploration techniques are the most important tools that medical students must acquire in psychiatry and psychotherapy. OBJECTIVE Avatars in virtual reality (VR) can, in principle, represent all clinical pictures at different levels of severity and at any time. MATERIALS AND METHODS In the Bochum avatar exploration project (AVEX), students enter into a dialogue with "mentally ill" avatars and, under guidance and supervision, attempt to work out the diagnosis, differential diagnosis, and treatment recommendations. RESULTS AND DISCUSSION In this way, students can become familiar, via VR, with even rare or severe psychiatric disorders. This review article presents initial experiences, particularly with regard to the setup and development of the project as well as the technological challenges involved.
|
92
|
Chytas D, Nikolaou VS. Mixed reality for visualization of orthopedic surgical anatomy. World J Orthop 2021; 12:727-731. [PMID: 34754828 PMCID: PMC8554346 DOI: 10.5312/wjo.v12.i10.727] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Received: 03/18/2021] [Revised: 06/16/2021] [Accepted: 08/30/2021] [Indexed: 02/06/2023]
Abstract
In the modern era, preoperative planning is substantially facilitated by artificial reality technologies, which permit a better understanding of patient anatomy, thus increasing the safety and accuracy of surgical interventions. In the field of orthopedic surgery, this increase in safety and accuracy improves treatment quality and orthopedic patient outcomes. Artificial reality technologies, which include virtual reality (VR), augmented reality (AR), and mixed reality (MR), use digital images obtained from computed tomography or magnetic resonance imaging. VR replaces the user's physical environment with one that is computer generated. AR and MR have been defined as technologies that permit the fusing of the physical with the virtual environment, enabling the user to interact with both physical and virtual objects. MR has further been defined as a technology that, in contrast to AR, enables users to visualize the depth and perspective of the virtual models. We aimed to shed light on the role that MR can play in the visualization of orthopedic surgical anatomy. The literature suggests that MR could be a valuable tool in orthopedic surgeons' hands for visualization of the anatomy. However, we note that confusion exists in the literature concerning the characteristics of MR; a clearer description of MR is therefore needed in orthopedic research, so that the potential of this technology can be more deeply understood.
|
93
|
Saito Y, Sugimoto M, Morine Y, Imura S, Ikemoto T, Yamada S, Shimada M. Intraoperative support with three-dimensional holographic cholangiography in hepatobiliary surgery. Langenbecks Arch Surg 2021; 407:1285-1289. [PMID: 34557939 DOI: 10.1007/s00423-021-02336-0] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 05/14/2021] [Accepted: 09/20/2021] [Indexed: 11/25/2022]
Abstract
PURPOSE This study was performed to investigate the potential of intraoperative three-dimensional (3D) holographic cholangiography, which provides a computer graphics model of the biliary tract, using mixed reality techniques. METHODS Two patients with intraductal papillary neoplasm of the bile duct were enrolled in the study. Intraoperative 3D cholangiography was performed in a hybrid operating room. Three-dimensional polygon data generated from the acquired cholangiography data were loaded into a head-mounted display (HoloLens; Microsoft Corporation, Redmond, WA, USA). RESULTS Upon completion of intraoperative 3D cholangiography, a hologram was immediately and successfully created in the operating room from the acquired cholangiography data, and several surgeons wearing the HoloLens were able to share the same hologram. Compared with conventional two-dimensional cholangiography, this 3D holographic technique provided a more accurate depiction of the bile ducts, especially the origin of B1, and allowed the hologram to be moved to each operator's viewing angle through simple gesture handling, without any monitors. CONCLUSION Intraoperative 3D holographic cholangiography might be a next-generation operation-support tool in terms of immediacy, accurate anatomical depiction, and ease of handling.
|
94
|
Liu P, Lu L, Liu S, Xie M, Zhang J, Huo T, Xie Y, Wang H, Duan Y, Hu Y, Ye Z. Mixed reality assists the fight against COVID-19. ACTA ACUST UNITED AC 2021; 1:16-18. [PMID: 34447601 PMCID: PMC8242102 DOI: 10.1016/j.imed.2021.05.002] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Received: 12/04/2020] [Revised: 04/12/2021] [Accepted: 05/21/2021] [Indexed: 12/18/2022]
Abstract
Coronavirus disease 2019 (COVID-19) has had a huge impact globally. With the assistance of mixed reality (MR) technology, complicated clinical work became easier to carry out, and conditions improved considerably owing to advantages such as greater convenience, better understanding and communication, higher safety, and savings in medical resources. This study aimed to introduce one kind of MR application in the fight against COVID-19 and anticipates more feasible smart healthcare applications to strengthen our efforts toward the final victory.
|
95
|
Wahba R, Thomas MN, Bunck AC, Bruns CJ, Stippel DL. Clinical use of augmented reality, mixed reality, three-dimensional-navigation and artificial intelligence in liver surgery. Artif Intell Gastroenterol 2021; 2:94-104. [DOI: 10.35712/aig.v2.i4.94] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 04/09/2021] [Revised: 07/10/2021] [Accepted: 08/27/2021] [Indexed: 02/06/2023]
Abstract
A precise knowledge of the intraparenchymal vascular and biliary architecture and of the location of lesions in relation to this complex anatomy is indispensable for liver surgery. Virtual three-dimensional (3D) reconstruction models derived from computed tomography/magnetic resonance imaging scans of the liver can therefore be helpful for visualization. Augmented reality, mixed reality, and 3D navigation can transfer such 3D image data directly into the operating theater to support the surgeon. This review examines the literature on the clinical and intraoperative use of these image guidance techniques in liver surgery and gives the reader the opportunity to learn about them. Augmented reality and mixed reality have been shown to be feasible for use in open and minimally invasive liver surgery, and 3D navigation has facilitated the targeting of intraparenchymal lesions. However, the existing data are limited to small cohorts and to descriptions of technical details, e.g., the accordance between the virtual 3D model and the real liver anatomy. Randomized controlled trials reporting clinical data or oncological outcomes are not available, and there is as yet no intraoperative application of artificial intelligence in liver surgery. The usability of all these sophisticated image guidance tools has still not reached the degree of immersion that would be necessary for widespread use in the daily surgical routine. Although many challenges remain, augmented reality, mixed reality, 3D navigation, and artificial intelligence are emerging fields in hepatobiliary surgery.
|
96
|
Rau A, Roelz R, Urbach H, Coenen VA, Demerath T, Reinacher PC. Application of Augmented Reality in Percutaneous Procedures-Rhizotomy of the Gasserian Ganglion. Oper Neurosurg (Hagerstown) 2021; 21:160-164. [PMID: 34098574 PMCID: PMC8555421 DOI: 10.1093/ons/opab155] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Received: 11/17/2020] [Accepted: 03/14/2021] [Indexed: 01/18/2023]
Abstract
BACKGROUND Percutaneous rhizotomy of the Gasserian ganglion for trigeminal neuralgia is an effective therapeutic procedure. Yet, landmark-guided cannulation of the foramen ovale is manually challenging and difficult to learn. OBJECTIVE To overcome these limitations, we assessed the feasibility and accuracy of an augmented reality (AR)-guided puncture of the foramen ovale. METHODS A head phantom with soft tissue structures of the facial area was built. A three-dimensional (3D)-dataset of the phantom was generated using a stereotactic planning workstation. An optimal trajectory to the foramen ovale was created and then transferred to an AR headset. A total of 2 neurosurgeons and 2 neuroradiologists independently performed 8 AR-guided and 8 landmark-guided cannulations of the foramen ovale, respectively. For each AR-guided cannulation, the hologram was manually aligned with the phantom. Accuracy of the cannulation was evaluated using the Euclidean distance to the target point as well as the lateral deviation of the achieved trajectory from the planned trajectory at target point level. RESULTS With the help of AR guidance, a successful cannulation of the foramen ovale was achieved in 90.6% compared to the purely landmark-based method with 18.8%. Euclidean distance and lateral deviation were significantly lower with AR guidance than landmark guidance (P < .01). CONCLUSION AR greatly improved accuracy of simulated percutaneous rhizotomy of the Gasserian ganglion.
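The two accuracy measures used above — the Euclidean distance from the achieved needle tip to the planned target point, and the lateral deviation of the achieved trajectory from the planned trajectory at target-point level — can be computed from 3D coordinates roughly as follows. Function names and the example coordinates are illustrative assumptions, not the study's actual analysis code:

```python
import numpy as np

def euclidean_distance_mm(achieved_tip, target):
    """Straight-line distance between the achieved needle tip and the planned target."""
    return float(np.linalg.norm(np.asarray(achieved_tip, float) - np.asarray(target, float)))

def lateral_deviation_mm(entry, target, achieved_tip):
    """Perpendicular distance of the achieved tip from the planned trajectory
    line (entry -> target), i.e., the deviation at target-point level."""
    d = np.asarray(target, float) - np.asarray(entry, float)
    d /= np.linalg.norm(d)                       # unit vector along the planned path
    v = np.asarray(achieved_tip, float) - np.asarray(entry, float)
    along = np.dot(v, d) * d                     # component parallel to the planned path
    return float(np.linalg.norm(v - along))      # orthogonal (lateral) component

# Illustrative example: planned path along z, achieved tip offset by 3 mm in x
# and 4 mm in y at the 100 mm target depth.
entry, target, tip = (0, 0, 0), (0, 0, 100), (3, 4, 100)
```

In this example both metrics evaluate to 5 mm, since the achieved tip lies at the target depth but is laterally offset by a 3-4-5 triangle.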
|
97
|
Pan Z, Luo T, Zhang M, Cai N, Li Y, Miao J, Li Z, Pan Z, Shen Y, Lu J. MagicChem: a MR system based on needs theory for chemical experiments. Virtual Reality 2021; 26:279-294. [PMID: 34312581 PMCID: PMC8295458 DOI: 10.1007/s10055-021-00560-z] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Received: 02/02/2021] [Accepted: 07/08/2021] [Indexed: 06/13/2023]
Abstract
UNLABELLED Real chemical experiments may be dangerous or pollute the environment, and the preparation of drugs and reagents is time-consuming. For these reasons, few experiments can actually be performed by students, which hinders the learning of chemistry and the understanding of the principles behind observed phenomena. Recently, owing to the impact of COVID-19, many schools have adopted online teaching, which is even more detrimental to students' learning of chemistry. Fortunately, mixed reality (MR) technology offers the possibility of resolving the safety issues and breaking the space-time constraints, while the theory of human needs (Maslow's hierarchy of needs) provides a way to design a comfortable and stimulating MR system with realistic visual presentation and interaction. This paper draws on the theory of human needs to propose a new needs model for virtual experiments. Based on this needs model, we design and develop a comprehensive MR system called MagicChem, which offers a robust 6-DoF interactive and illumination-consistent experimental space with virtual-real occlusion, supporting realistic visual interaction, tangible interaction, gesture interaction with touch, voice interaction, temperature interaction, olfactory interaction, and virtual human interaction. A user study shows that MagicChem satisfies the needs model better than other MR experimental environments that only partially meet it. In addition, we explore the application of the needs model in a VR environment. SUPPLEMENTARY INFORMATION The online version contains supplementary material available at 10.1007/s10055-021-00560-z.
|
98
|
Silva JNA, Southworth MK, Andrews CM, Privitera MB, Henry AB, Silva JR. Design Considerations for Interacting and Navigating with 2 Dimensional and 3 Dimensional Medical Images in Virtual, Augmented and Mixed Reality Medical Applications. In: Virtual, Augmented and Mixed Reality: 13th International Conference, VAMR 2021, Held as Part of the 23rd HCI International Conference, HCII 2021, Virtual Event, July 24-29, 2021, Proceedings. 2021; 12770:117-133. [PMID: 35079751 PMCID: PMC8786214 DOI: 10.1007/978-3-030-77599-5_10] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Indexed: 06/14/2023]
Abstract
The extended realities, including virtual, augmented, and mixed realities (VAMR), have recently seen significant hardware improvements, resulting in an expansion of medical applications. These applications can be classified by target end user (for instance, patient-centric, physician-centric, or both) or by use case (for instance, educational, diagnostic, therapeutic, or some combination). When developing medical applications in VAMR, careful consideration of both the target end user and the use case must heavily influence design decisions, particularly the methods and tools for interaction and navigation. Medical imaging comprises both 2-dimensional and 3-dimensional modalities, which affects design, interaction, and navigation. Additionally, medical applications need to comply with regulatory requirements, which also influence interaction and design considerations. In this manuscript, the authors explore these considerations using three VAMR tools being developed for cardiac electrophysiology procedures.
|
99
|
Chidambaram S, Stifano V, Demetres M, Teyssandier M, Palumbo MC, Redaelli A, Olivi A, Apuzzo MLJ, Pannullo SC. Applications of augmented reality in the neurosurgical operating room: A systematic review of the literature. J Clin Neurosci 2021; 91:43-61. [PMID: 34373059 DOI: 10.1016/j.jocn.2021.06.032] [Citation(s) in RCA: 18] [Impact Index Per Article: 6.0] [Received: 12/14/2020] [Revised: 06/17/2021] [Accepted: 06/18/2021] [Indexed: 12/15/2022]
Abstract
Advancements in imaging techniques are key forces of progress in neurosurgery. The importance of accurate visualization of intraoperative anatomy cannot be overemphasized and is commonly delivered through traditional neuronavigation. Augmented reality (AR) technology has been tested and applied widely in various neurosurgical subspecialties in intraoperative, clinical use and shows promise for the future. This systematic review of the literature explores the ways in which AR technology has been successfully brought into the operating room (OR) and incorporated into clinical practice. A comprehensive literature search was performed in the following databases from inception to April 2020: Ovid MEDLINE, Ovid EMBASE, and The Cochrane Library. Retrieved studies were then screened for eligibility against predefined inclusion/exclusion criteria. A total of 54 articles were included in this systematic review. The studies were subgrouped into brain and spine subspecialties and analyzed for their incorporation of AR in the neurosurgical clinical setting. AR technology has the potential to greatly enhance intraoperative visualization and guidance in neurosurgery beyond traditional neuronavigation systems. However, there are several key challenges to scaling the use of this technology and bringing it into standard operative practice, including accurate and efficient brain segmentation of magnetic resonance imaging (MRI) scans, accounting for brain shift, reducing coregistration errors, and improving the AR device hardware. There is also exciting potential for future work combining AR with multimodal imaging techniques and artificial intelligence to further enhance its impact in neurosurgery.
|
100
|
Molina CA, Sciubba DM, Greenberg JK, Khan M, Witham T. Clinical Accuracy, Technical Precision, and Workflow of the First in Human Use of an Augmented-Reality Head-Mounted Display Stereotactic Navigation System for Spine Surgery. Oper Neurosurg (Hagerstown) 2021; 20:300-309. [PMID: 33377137 DOI: 10.1093/ons/opaa398] [Citation(s) in RCA: 44] [Impact Index Per Article: 14.7] [Received: 07/12/2020] [Accepted: 09/13/2020] [Indexed: 12/17/2022]
Abstract
BACKGROUND Augmented reality mediated spine surgery is a novel technology for spine navigation. Benchmark cadaveric data have demonstrated high accuracy and precision leading to recent regulatory approval. Absence of respiratory motion in cadaveric studies may positively bias precision and accuracy results and analogous investigations are prudent in live clinical scenarios. OBJECTIVE To report a technical note, accuracy, precision analysis of the first in-human deployment of this technology. METHODS A 78-yr-old female underwent an L4-S1 decompression, pedicle screw, and rod fixation for degenerative spine disease. Six pedicle screws were inserted via AR-HMD (xvision; Augmedics, Chicago, Illinois) navigation. Intraoperative computed tomography was used for navigation registration as well as implant accuracy and precision assessment. Clinical accuracy was graded per the Gertzbein-Robbins (GS) scale by an independent neuroradiologist. Technical precision was analyzed by comparing 3-dimensional (3D) (x, y, z) virtual implant vs real implant position coordinates and reported as linear (mm) and angular (°) deviation. Present data were compared to benchmark cadaveric data. RESULTS Clinical accuracy (per the GS grading scale) was 100%. Technical precision analysis yielded a mean linear deviation of 2.07 mm (95% CI: 1.62-2.52 mm) and angular deviation of 2.41° (95% CI: 1.57-3.25°). In comparison to prior cadaveric data (99.1%, 2.03 ± 0.99 mm, 1.41 ± 0.61°; GS accuracy 3D linear and angular deviation, respectively), the present results were not significantly different (P > .05). CONCLUSION The first in human deployment of the single Food and Drug Administration approved AR-HMD stereotactic spine navigation platform demonstrated clinical accuracy and technical precision of inserted hardware comparable to previously acquired cadaveric studies.
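The precision analysis above compares virtual versus real implant positions as a linear (mm) and an angular (°) deviation. A minimal sketch of how such metrics can be computed from 3D coordinates (function names and example vectors are assumptions for illustration, not the study's actual pipeline):

```python
import numpy as np

def linear_deviation_mm(virtual_xyz, real_xyz):
    """3D (x, y, z) positional offset between virtual and real implant positions."""
    return float(np.linalg.norm(np.asarray(virtual_xyz, float) - np.asarray(real_xyz, float)))

def angular_deviation_deg(planned_axis, achieved_axis):
    """Angle in degrees between the planned and achieved screw axes."""
    a = np.asarray(planned_axis, float)
    b = np.asarray(achieved_axis, float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip guards against floating-point values marginally outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))
```

For example, a screw axis tilted from (0, 0, 1) to (0, 1, 1) corresponds to a 45° angular deviation, and a positional offset of (1, 2, 2) mm corresponds to a 3 mm linear deviation.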
|