1
Necker FN, Cholok DJ, Fischer MJ, Shaheen MS, Gifford K, Januszyk M, Leuze CW, Scholz M, Daniel BL, Momeni A. HoloDIEP-Faster and More Accurate Intraoperative DIEA Perforator Mapping Using a Novel Mixed Reality Tool. J Reconstr Microsurg 2024. PMID: 39038461. DOI: 10.1055/s-0044-1788548.
Abstract
BACKGROUND Microsurgical breast reconstruction using abdominal tissue is a complex procedure, in part due to variable vascular/perforator anatomy. Preoperative computed tomography angiography (CTA) has mitigated this challenge to some degree, yet it continues to pose certain challenges. The ability to map perforators with Mixed Reality has been demonstrated in case studies, but its accuracy has not been studied intraoperatively. Here, we compare the accuracy of "HoloDIEP" in identifying perforator location (vs. Doppler ultrasound) by using holographic 3D models derived from preoperative CTA. METHODS Using a custom application on HoloLens, the deep inferior epigastric artery vascular tree was traced in 15 patients who underwent microsurgical breast reconstruction. Perforator markings were compared against the 3D model in a coordinate system centered on the umbilicus. Holographic- and Doppler-identified markings were compared against the 3D model using a perspective-corrected photo technique, and the duration of perforator mapping was measured for each technique. RESULTS Vascular points in HoloDIEP skin markings were -0.97 ± 6.2 mm (perforators: -0.62 ± 6.13 mm) away from 3D-model ground truth in radial length from the umbilicus at a true distance of 10.81 ± 6.14 mm (perforators: 11.40 ± 6.15 mm). The absolute difference in radial distance was twice as high for Doppler markings as for Holo-markings (9.71 ± 6.16 and 4.02 ± 3.20 mm, respectively). In only half of all cases (7/14) were more than 50% of the Doppler-identified points reasonably close (<30 mm) to 3D-model ground truth. HoloDIEP was twice as fast as Doppler ultrasound (76.9 s vs. 150.4 s per abdomen). CONCLUSION HoloDIEP allows for faster and more accurate intraoperative perforator mapping than Doppler ultrasound.
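The umbilicus-centered comparison this abstract describes (signed radial distance of a skin marking versus 3D-model ground truth) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' analysis code, and all point coordinates are hypothetical.

```python
import numpy as np

def radial_error(marked_xy, truth_xy, umbilicus_xy):
    """Signed radial error: marked minus ground-truth distance from the umbilicus (mm)."""
    marked = np.linalg.norm(np.asarray(marked_xy, float) - umbilicus_xy, axis=1)
    truth = np.linalg.norm(np.asarray(truth_xy, float) - umbilicus_xy, axis=1)
    return marked - truth

umbilicus = np.array([0.0, 0.0])
truth = np.array([[10.0, 5.0], [-8.0, 12.0]])   # 3D-model ground-truth points (hypothetical, mm)
marked = np.array([[11.0, 5.5], [-7.0, 11.0]])  # intraoperative skin markings (hypothetical, mm)
err = radial_error(marked, truth, umbilicus)
print(err.mean(), np.abs(err).mean())           # signed mean and mean absolute radial error
```

A negative mean, as reported in the study (-0.97 mm), would indicate markings placed slightly closer to the umbilicus than the model predicts.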
Affiliation(s)
- Fabian N Necker
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Digital Anatomy Lab, Faculty of Medicine, Institute of Functional and Clinical Anatomy, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
- Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- David J Cholok
- Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Marc J Fischer
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Mohammed S Shaheen
- Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Kyle Gifford
- Department of Radiology, 3D and Quantitative Imaging, Stanford University School of Medicine, Stanford, California
- Michael Januszyk
- Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
- Christoph W Leuze
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Michael Scholz
- Digital Anatomy Lab, Faculty of Medicine, Institute of Functional and Clinical Anatomy, Friedrich-Alexander Universität Erlangen-Nürnberg (FAU), Erlangen, Germany
- Bruce L Daniel
- Department of Radiology, Stanford IMMERS (Incubator for Medical Mixed and Extended Reality at Stanford), Stanford University School of Medicine, Palo Alto, California
- Arash Momeni
- Division of Plastic and Reconstructive Surgery, Stanford University School of Medicine, Palo Alto, California
2
Chou DW, Annadata V, Willson G, Gray M, Rosenberg J. Augmented and Virtual Reality Applications in Facial Plastic Surgery: A Scoping Review. Laryngoscope 2024; 134:2568-2577. PMID: 37947302. DOI: 10.1002/lary.31178.
Abstract
OBJECTIVES Augmented reality (AR) and virtual reality (VR) are emerging technologies with wide potential applications in health care. We performed a scoping review of the current literature on the application of AR and VR in the field of facial plastic and reconstructive surgery (FPRS). DATA SOURCES PubMed and Web of Science. REVIEW METHODS In accordance with PRISMA guidelines, PubMed and Web of Science were searched for literature regarding the utilization of AR and/or VR relevant to FPRS. RESULTS Fifty-eight articles spanning 1997-2023 met the criteria for review. Five overarching categories of AR and/or VR applications were identified across the articles: preoperative, intraoperative, training/education, feasibility, and technical. The following clinical areas were identified: burn, craniomaxillofacial (CMF) surgery, face transplant, face lift, facial analysis, facial palsy, free flaps, head and neck surgery, injectables, locoregional flaps, mandible reconstruction, mandibuloplasty, microtia, skin cancer, oculoplastic surgery, rhinology, rhinoplasty, and trauma. CONCLUSION AR and VR have broad applications in FPRS. AR for surgical navigation may have the most emerging potential in CMF surgery and free flap harvest. VR is useful as distraction analgesia for patients and as an immersive training tool for surgeons. More data on these technologies' direct impact on objective clinical outcomes are still needed. LEVEL OF EVIDENCE N/A.
Affiliation(s)
- David W Chou
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Emory University School of Medicine, Atlanta, Georgia, USA
- Vivek Annadata
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Gloria Willson
- Education and Research Services, Levy Library, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Mingyang Gray
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Joshua Rosenberg
- Division of Facial Plastic and Reconstructive Surgery, Department of Otolaryngology-Head and Neck Surgery, Icahn School of Medicine at Mount Sinai, New York, New York, USA
3
Katayama M, Mitsuno D, Ueda K. Clinical Application to Improve the "Depth Perception Problem" by Combining Augmented Reality and a 3D Printing Model. Plast Reconstr Surg Glob Open 2023; 11:e5071. PMID: 37361506. PMCID: PMC10289554. DOI: 10.1097/gox.0000000000005071.
Abstract
In our experience with intraoperative evaluation and educational applications of augmented reality technology, the illusion of depth has been a major problem. To improve this depth perception problem, we conducted two experiments combining various three-dimensional models and holograms and varying the observation angles using an augmented reality device. Methods: In experiment 1, when observing holograms projected on the surface layer of the model (bone model) or holograms projected on a layer deeper than the model (body surface model), we investigated the observer's first impression regarding which model made it easier to understand positional relationships. In experiment 2, to achieve a more quantitative evaluation, the observer was asked to measure the distance between two specific points on the surface and deep layers from two angles in each of the above combinations. Statistical analysis was performed on the measurement error for this distance. Results: In experiment 1, the three-dimensional positional relationships were easier to understand in the bone model than in the body surface model. In experiment 2, there was little difference in the measurement error under either condition, and the error was not large enough to cause a misunderstanding of the depth relationship between the surface and deep layers. Conclusions: Any combination can be used for preoperative examination and anatomical study purposes. In particular, projecting holograms on a deep model, or observing positional relationships not only from the operator's viewpoint but also from multiple other angles, is more desirable because it reduces confusion caused by the depth perception problem and improves understanding of the anatomy.
Affiliation(s)
- Misato Katayama
- Department of Plastic and Reconstructive Surgery, Osaka Medical and Pharmaceutical University, Takatsuki City, Osaka, Japan
- Daisuke Mitsuno
- Department of Plastic and Reconstructive Surgery, Osaka Medical and Pharmaceutical University, Takatsuki City, Osaka, Japan
- Koichi Ueda
- Department of Plastic and Reconstructive Surgery, Osaka Medical and Pharmaceutical University, Takatsuki City, Osaka, Japan
4
Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol 2023; 68. PMID: 36580681. DOI: 10.1088/1361-6560/acaf23.
Abstract
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of these AR systems in different surgical fields. AR visualization falls into two categories, in situ visualization and non-in situ visualization, and the rendered content varies widely. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications have been evaluated through model experiments and animal experiments, and there are relatively few clinical experiments, indicating that current AR navigation methods are still in an early stage of development. Finally, we summarize the contributions and challenges of AR in the surgical fields, as well as future development trends. Although AR-guided surgery has not yet reached clinical maturity, we believe that if the current development trend continues, it will soon reveal its clinical utility.
Affiliation(s)
- Longfei Ma
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Tianqi Huang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Jie Wang
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing, 100084, People's Republic of China
5
Chegini S, Edwards E, McGurk M, Clarkson M, Schilling C. Systematic review of techniques used to validate the registration of augmented-reality images using a head-mounted device to navigate surgery. Br J Oral Maxillofac Surg 2023; 61:19-27. PMID: 36513525. DOI: 10.1016/j.bjoms.2022.08.007.
Abstract
Augmented-reality (AR) head-mounted devices (HMDs) allow the wearer to have digital images superimposed onto their field of vision. They are being used to superimpose annotations onto the surgical field, akin to a navigation system. This review examines published validation studies on HMD-AR systems, their reported protocols, and outcomes. The aim was to establish commonalities and an acceptable registration outcome. Multiple databases were systematically searched for relevant articles published between January 2015 and January 2021. Studies that examined the registration of AR content using an HMD to guide surgery were eligible for inclusion. The country of origin, year of publication, medical specialty, HMD device, software, and method of registration were recorded. A meta-analysis of the mean registration error was conducted. A total of 4784 papers were identified, of which 23 met the inclusion criteria. They included studies using HoloLens (Microsoft) (n = 22) and nVisor ST60 (NVIS Inc) (n = 1). Sixty-six per cent of studies were in hard tissue specialties. Eleven studies reported registration errors using pattern markers (mean (SD) 2.6 (1.8) mm), four reported registration errors using surface markers (mean (SD) 3.8 (3.7) mm), and three reported registration errors using manual alignment (mean (SD) 2.2 (1.3) mm). The majority of studies in this review used in-house software with a variety of registration methods and reported errors. The mean registration error calculated in this study can be considered a minimum acceptable standard and should be taken into consideration when procedural applications are selected.
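The per-method errors reported above can be pooled into a single figure with a simple study-count-weighted mean. The weighting scheme here is an illustrative assumption, not the meta-analytic model the review actually used.

```python
# Pool the reported per-method registration errors (mm) into one overall mean,
# weighting each method by its number of contributing studies.
methods = {
    # method: (n_studies, mean_mm, sd_mm) as reported in the review
    "pattern markers": (11, 2.6, 1.8),
    "surface markers": (4, 3.8, 3.7),
    "manual alignment": (3, 2.2, 1.3),
}

total_n = sum(n for n, _, _ in methods.values())
pooled_mean = sum(n * m for n, m, _ in methods.values()) / total_n
print(f"pooled mean registration error: {pooled_mean:.2f} mm over {total_n} studies")
# → pooled mean registration error: 2.80 mm over 18 studies
```

A weighted mean around 2.8 mm is consistent with the review's suggestion that the calculated mean error be treated as a minimum acceptable standard.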
Affiliation(s)
- Soudeh Chegini
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Eddie Edwards
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Mark McGurk
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Matthew Clarkson
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
- Clare Schilling
- University College London, Charles Bell House, 43-45 Foley St, London W1W 7TY, United Kingdom
6
Nonsubjective Assessment of Shape, Volume and Symmetry during Breast Augmentation with Handheld 3D Device. J Clin Med 2022; 11:jcm11144002. PMID: 35887767. PMCID: PMC9320179. DOI: 10.3390/jcm11144002.
Abstract
Three-dimensional surface imaging (3DSI) has become a valuable tool for planning and documenting surgical procedures. Although surface scanners have allowed for a better understanding of breast shape, size, and asymmetry during patient consultation, their use has not so far extended to intraoperative assessment. We validated the reliability of intraoperative use of portable handheld 3DSI equipment as a tool to evaluate morphological changes during breast augmentation surgery. Patients who underwent bilateral subpectoral breast augmentation through an inframammary incision were included in this study. Intraoperative 3DSI was performed with the Artec Eva device, allowing for visualization of the surgical area before incision, after use of breast sizers and implant, and after wound closure. Intraoperative manual measurements of breast distances and of volume changes due to known sizer and implant volumes were compared with digital measurements calculated from 3DSI of the surgical area. The bilateral breasts of 40 patients were successfully 3D photographed before incision and after suture. A further 108 implant sizer uses were digitally documented. There was no significant difference between manual tape measurement and digital breast distance measurement. Pre- to postoperative 3D volume change showed no significant difference from the known sizer and implant volumes.
7
Lin L, Gao Y, Aung ZM, Xu H, Wang B, Yang X, Chai G, Xie L. Preliminary reports of augmented-reality assisted craniofacial bone fracture reduction. J Plast Reconstr Aesthet Surg 2022; 75:e1-e8. DOI: 10.1016/j.bjps.2022.06.105.
8
Objective evaluation of volumetric changes during breast augmentation using intraoperative three-dimensional surface imaging. J Plast Reconstr Aesthet Surg 2022; 75:3094-3100. DOI: 10.1016/j.bjps.2022.06.008.
9
Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med Image Anal 2022; 77:102361. PMID: 35168103. PMCID: PMC10466024. DOI: 10.1016/j.media.2022.102361.
Abstract
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy, and human factors of human-computer interaction. In total, 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58), and segmented preoperative models dominate visualisation (n=40). Experiments mainly involve phantoms (n=43) or system setup (n=21), with patient case studies ranking third (n=19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception, with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and the mental mapping of 2D images to 3D patient anatomy. Other human factors persist or are caused by the OST-HMD solutions themselves, including ease of use, comfort, and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab, coupled with design that incorporates human factors considerations to solve clear clinical problems, should ensure that the significant current research efforts will succeed.
Affiliation(s)
- Manuel Birlo
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- P J Eddie Edwards
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Matthew Clarkson
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
- Danail Stoyanov
- Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), Charles Bell House, 43-45 Foley Street, London W1W 7TS, UK
10
Intraoperative Navigation in Plastic Surgery with Augmented Reality: A Preclinical Validation Study. Plast Reconstr Surg 2022; 149:573e-580e. PMID: 35196700. DOI: 10.1097/prs.0000000000008875.
Abstract
BACKGROUND Augmented reality allows users to visualize and interact with digital images including three-dimensional holograms in the real world. This technology may have value intraoperatively by improving surgical decision-making and precision but relies on the ability to accurately align a hologram to a patient. This study aims to quantify the accuracy with which a hologram of soft tissue can be aligned to a patient and used to guide intervention. METHODS A mannequin's face was marked in a standardized fashion with 14 incision patterns in red and nine reference points in blue. A three-dimensional photograph was then taken, converted into a hologram, and uploaded to HoloLens (Verto Studio LLC, San Diego, Calif.), a wearable augmented reality device. The red markings were then erased, leaving only the blue points. The hologram was then viewed through the HoloLens in augmented reality and aligned onto the mannequin. The user then traced the overlaid red markings present on the hologram. Three-dimensional photographs of the newly marked mannequin were then taken and compared with the baseline three-dimensional photographs of the mannequin for accuracy of the red markings. This process was repeated for 15 trials (n = 15). RESULTS The accuracy of the augmented reality-guided intervention, when considering all trials, was 1.35 ± 0.24 mm. Markings that were positioned laterally on the face were significantly more difficult to reproduce than those centered around the facial midline. CONCLUSIONS Holographic markings can be accurately translated onto a mannequin with an average error of less than 1.4 mm. These data support the notion that augmented reality navigation may be practical and reliable for clinical integration in plastic surgery.
11
Clinical Applications of Meshed Multilayered Anatomical Models by Low-Cost Three-Dimensional Printer. Plast Reconstr Surg 2021; 148:1047e-1051e. PMID: 34847134. DOI: 10.1097/prs.0000000000008568.
Abstract
SUMMARY In recent years, even low-cost fused deposition modeling (FDM)-type three-dimensional printers can be used to create a three-dimensional model with few errors. The authors devised a method to create a three-dimensional multilayered anatomical model at a lower cost and more easily than with established methods, by using a meshlike structure as the surface layer. FDM-type three-dimensional printers were used, with opaque polylactide filament as the material. Using the three-dimensional data-editing software Blender (Blender Foundation, www.blender.org) and Instant Meshes (Jakob et al., https://igl.ethz.ch/projects/instant-meshes/) together, the body surface data were converted into a meshlike structure while retaining the overall shape. The meshed data were printed together with other (nonmeshed) data or printed separately. In each case, the multilayer model with a meshed body-surface layer could be printed without any trouble. The model made it possible to grasp the positional relationship between the body surface and the deep target, and it was clinically useful. The total work time for preparation and processing of the three-dimensional data ranged from 1 hour to several hours, depending on the case, but the work time required for conversion into a meshlike shape was about 10 minutes in all cases. The filament cost was $2 to $8. In conclusion, the authors devised a method to create a three-dimensional multilayered anatomical model that easily visualizes positional relationships within the structure by converting the surface layer into a meshlike structure. This method is easy to adopt, regardless of the available facilities and economic environment, and has broad applications.
12
Matsui C, Banda CH, Okada Y, Shiraishi M, Shimizu K, Mitsui K, Danno K, Ishiura R, Narushima M. Shaping the future of microsurgery: Combination of exoscope and smart glasses. J Plast Reconstr Aesthet Surg 2021; 75:893-939. PMID: 34852969. DOI: 10.1016/j.bjps.2021.11.009.
Affiliation(s)
- Chiaki Matsui
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Chihena H Banda
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Yoshimoto Okada
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Makoto Shiraishi
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Kotaro Shimizu
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Kohei Mitsui
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Kanako Danno
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Ryohei Ishiura
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
- Mitsunaga Narushima
- Department of Plastic and Reconstructive Surgery, Mie University, Tsu, Japan
13
Intraoperative 3-dimensional Projection of Blood Vessels on Body Surface Using an Augmented Reality System. Plast Reconstr Surg Glob Open 2020; 8:e3028. PMID: 32983783. PMCID: PMC7489712. DOI: 10.1097/gox.0000000000003028.
Abstract
Preoperative understanding of the running pattern of blood vessels is an important factor in approaching surgical fields safely. In 2 cases in which vascular abnormalities were suspected, we projected the blood vessels onto the surgical field using the augmented reality device HoloLens. A splint was made to allow the patient to be fixed in position while undergoing computed tomographic angiography. Three-dimensional (3D) data on the blood vessels, skin surfaces, bones, and the 3 points chosen for alignment were segmented and then projected onto the body surfaces as holograms using the HoloLens. Two types of projection were used for the holograms: projection type 1, in which the body contours were projected as a line, and projection type 2, in which the body surface was projected as meshed skin. Projection type 2 gave a better understanding than projection type 1 of the 3D anatomic findings and deformation characteristics, including the anatomic blood vessel variation and the positional relationships between the organs and body surfaces. To some extent, depth perception could be obtained by recognizing the bone, vessels, or tumor inside the meshed skin surface. Our new method allows the 3D visualization of blood vessels from the body surface and helps in understanding their 3D anatomic variation, and it can be applied as long as the blood vessels can be visualized.
14
Augmented Reality Technology for the Positioning of the Auricle in the Treatment of Microtia. Plast Reconstr Surg Glob Open 2020; 8:e2626. PMID: 32309078. PMCID: PMC7159966. DOI: 10.1097/gox.0000000000002626.
Abstract
Background: The positioning of the auricle is a key factor in successful ear reconstruction. However, the position of the ear is usually determined by transferring an image of the auricle on the nonaffected side to the affected side using a transparent film. Augmented reality (AR) is becoming useful in the surgical field, allowing computer-generated images to be superimposed on patients. In this report, we introduce an application of AR technology in ear reconstruction. Methods: AR technology was used to determine the position of the reconstructed ear of a 10-year-old male with right microtia. Preoperative 3-dimensional photographs of the nonaffected side were taken using the VECTRA H1. The image was then horizontally inverted and superimposed on the 3-dimensional image of the affected side with reference to the anatomical landmarks of the patient's face. These images were projected onto the patient in the operating room using Microsoft's HoloLens. The design and positioning of the auricle were done with reference to the AR image. To confirm the accuracy of the AR technique, we compared it to the original transparent-film technique. After insertion of the cartilage framework into the skin pocket, the position and shape of the reconstructed ear were confirmed using the AR technology. Results: The positioning of the reconstructed ear was successfully performed. The deviation between the 2 positions designated using the AR and the transparent film was within 2 mm. Conclusion: AR technology is a promising option in the surgical treatment of microtia.
15
Sayadi LR, Naides A, Eng M, Fijany A, Chopan M, Sayadi JJ, Shaterian A, Banyard DA, Evans GRD, Vyas R, Widgerow AD. The New Frontier: A Review of Augmented Reality and Virtual Reality in Plastic Surgery. Aesthet Surg J 2019; 39:1007-1016. PMID: 30753313. DOI: 10.1093/asj/sjz043.
Abstract
Mixed reality, a blending of the physical and digital worlds, can enhance the surgical experience, leading to greater precision, efficiency, and improved outcomes. Various studies across different disciplines have reported encouraging results using mixed reality technologies, such as augmented and virtual reality. To provide a better understanding of the applications and limitations of this technology in plastic surgery, we performed a systematic review of the literature in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. The initial query of the National Center for Biotechnology Information database yielded 2544 results, and only 46 articles met our inclusion criteria. The majority of studies were in the field of craniofacial surgery, and uses of mixed reality included preoperative planning, intraoperative guides, and education of surgical trainees. A deeper understanding of mixed reality technologies may promote its integration and also help inspire new and creative applications in healthcare.
Affiliation(s)
- Mustafa Chopan
- Resident, Division of Plastic and Reconstructive Surgery, University of Florida, Gainesville, FL
- Alan D Widgerow
- Director of the UC Irvine Center for Tissue Engineering, UC Irvine Department of Plastic Surgery, Center for Tissue Engineering, Orange, CA
16
Effective Application of Mixed Reality Device HoloLens: Simple Manual Alignment of Surgical Field and Holograms. Plast Reconstr Surg 2019; 143:647-651. PMID: 30688914. DOI: 10.1097/prs.0000000000005215.
Abstract
The technology used to add information to a real visual field is known as augmented reality technology. Augmented reality technology that allows displayed information to be manipulated interactively is called mixed reality technology. HoloLens from Microsoft, a head-mounted mixed reality device released in 2016, can stably display a precise three-dimensional model on the real visual field as a hologram. If the position and direction of the hologram could be accurately superimposed on the surgical field, surgical navigation-like use could be expected; however, HoloLens has no such built-in function. The authors devised a method that can align the surgical field and holograms precisely within a short time using a simple manual operation. The mechanism is to match three points on the hologram to the corresponding marking points on the body surface. By making it possible to arbitrarily select any of the three points as a pivot/axis for rotational movement of the hologram, alignment by manual operation becomes very easy. The alignment between the surgical field and the hologram was good and thus contributed to intraoperative objective judgment. By using the method of this study, the clinical usefulness of the mixed reality device HoloLens will be expanded.
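The three-point matching described in this abstract is performed manually in the paper; for comparison, a rigid transform fitting the same three hologram-to-skin correspondences can be computed in closed form with the Kabsch algorithm. This is an analogous sketch, not the authors' implementation, and the point coordinates are hypothetical.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst (Kabsch)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                        # cross-covariance of centred point sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Hypothetical hologram points and their marked body-surface counterparts (mm).
holo = np.array([[0, 0, 0], [50, 0, 0], [0, 30, 0]], float)
theta = np.deg2rad(20.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
body = holo @ R_true.T + np.array([12.0, -4.0, 7.0])     # markings = rotated + shifted hologram
R, t = rigid_fit(holo, body)
print(np.allclose(holo @ R.T + t, body))                  # True: hologram lands on the markings
```

Three non-collinear correspondences fully determine the rigid pose, which is why the paper's three body-surface marking points suffice for alignment.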