1. Ye J, Chen Q, Zhong T, Liu J, Gao H. Is Overlain Display a Right Choice for AR Navigation? A Qualitative Study of Head-Mounted Augmented Reality Surgical Navigation on Accuracy for Large-Scale Clinical Deployment. CNS Neurosci Ther 2025; 31:e70217. [PMID: 39817491] [PMCID: PMC11736426] [DOI: 10.1111/cns.70217]
Abstract
BACKGROUND Over the past two decades, head-mounted augmented reality surgical navigation (HMARSN) systems have been employed in a growing range of surgical specialties, driven both by advances in augmented reality-related technologies and by surgeons' desire to overcome drawbacks inherent to conventional surgical navigation systems. At present, most experimental HMARSN systems adopt an overlain display (OD), which superimposes virtual models of tissues, organs, and lesions, together with planned surgical tool trajectories, onto the corresponding physical structures in the surgical field. This gives surgeons an intuitive, direct view that supports better hand-eye coordination and avoids attention shifts and loss of sight (LOS), among other benefits during procedures. Yet system accuracy, the most crucial performance indicator of any surgical navigation system, is difficult to ascertain for OD because it is highly subjective and user-dependent. The aim of this study was therefore to review currently available experimental OD HMARSN systems qualitatively, explore how their system accuracy is affected by the overlain display, and determine whether such systems are suited to large-scale clinical deployment. METHOD We searched PubMed and ScienceDirect with the terms "head mounted augmented reality surgical navigation," which returned 445 records in total. After screening and eligibility assessment, 60 papers were analyzed. Specifically, we focused on how system accuracy was defined and measured, and on whether the reported accuracy is stable in clinical practice and competitive with corresponding commercially available systems. RESULTS AND CONCLUSIONS The primary finding is that the system accuracy of OD HMARSN systems is seriously affected by the transformation between the space of the user's eyes and that of the surgical field, because measuring this transformation is heavily individualized and user-dependent. Additionally, the transformation itself is potentially subject to change during surgical procedures, and hence unstable. OD HMARSN systems are therefore not suitable for large-scale clinical deployment.
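The review's central claim, that overlay accuracy hinges on the eye-to-field transformation, can be made concrete with a toy calculation: a small rotational miscalibration of that transform displaces the overlay in proportion to the working distance. A minimal stdlib-only Python sketch; the 500 mm working distance and the 1° error are illustrative assumptions, not values from the paper:

```python
import math

def rot_y(deg):
    """3x3 rotation matrix about the y axis, angle in degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def apply(R, t, p):
    """Apply the rigid transform (R, t) to point p."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# A target point ~500 mm in front of the eyes (assumed working distance).
target = [0.0, 0.0, 500.0]

# Perfectly calibrated eye-to-field transform: identity rotation, no offset.
ideal = apply(rot_y(0.0), [0.0, 0.0, 0.0], target)

# The same point seen through a transform carrying a 1-degree rotational
# miscalibration (hypothetical error magnitude, for illustration only).
drifted = apply(rot_y(1.0), [0.0, 0.0, 0.0], target)

# Displacement of the virtual overlay at the surgical field, in mm.
overlay_error_mm = math.dist(ideal, drifted)
```

Under these assumptions, a 1° calibration error shifts the overlay by roughly 8.7 mm at 500 mm, far beyond surgical tolerance, which is why a user-dependent, unstable transformation is so damaging.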
Affiliation(s)
- Jian Ye
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Qingwen Chen
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Tao Zhong
- Department of Neurosurgery, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Jian Liu
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
- Han Gao
- Department of Neurosurgery, Affiliated Qingyuan Hospital, Guangzhou Medical University, Qingyuan People's Hospital, Qingyuan, China
2. Gorgy A, Xu HH, Hawary HE, Nepon H, Lee J, Vorstenbosch J. Integrating AI into Breast Reconstruction Surgery: Exploring Opportunities, Applications, and Challenges. Plast Surg (Oakv) 2024:22925503241292349. [PMID: 39545210] [PMCID: PMC11559540] [DOI: 10.1177/22925503241292349]
Abstract
Background: Artificial intelligence (AI) has significantly influenced various sectors, including healthcare, by enhancing machine capabilities in assisting with human tasks. In surgical fields, where precision and timely decision-making are crucial, AI's integration could revolutionize clinical quality and health resource optimization. This study explores the current and future applications of AI technologies in reconstructive breast surgery, aiming for broader implementation. Methods: We conducted systematic reviews through PubMed, Web of Science, and Google Scholar using relevant keywords and MeSH terms. The focus was on the main AI subdisciplines: machine learning, computer vision, natural language processing, and robotics. This review includes studies discussing AI applications across preoperative, intraoperative, postoperative, and academic settings in breast plastic surgery. Results: AI is currently utilized preoperatively to predict surgical risks and outcomes, enhancing patient counseling and informed consent processes. During surgery, AI supports the identification of anatomical landmarks and dissection strategies and provides 3-dimensional visualizations. Robotic applications are promising for procedures like microsurgical anastomoses, flap harvesting, and dermal matrix anchoring. Postoperatively, AI predicts discharge times and customizes follow-up schedules, which improves resource allocation and patient management at home. Academically, AI offers personalized training feedback to surgical trainees and aids research in breast reconstruction. Despite these advancements, concerns regarding privacy, costs, and operational efficacy persist and are critically examined in this review. Conclusions: The application of AI in breast plastic and reconstructive surgery presents substantial benefits and diverse potentials. However, much remains to be explored and developed. 
This study aims to consolidate knowledge and encourage ongoing research and development within the field, thereby empowering the plastic surgery community to leverage AI technologies effectively and responsibly for advancing breast reconstruction surgery.
Affiliation(s)
- Andrew Gorgy
- Department of Plastic and Reconstructive Surgery, McGill University Health Center, Montreal, Quebec, Canada
- Hong Hao Xu
- Faculty of Medicine, Laval University, Quebec City, Quebec, Canada
- Hassan El Hawary
- Department of Plastic and Reconstructive Surgery, McGill University Health Center, Montreal, Quebec, Canada
- Hillary Nepon
- Department of Plastic and Reconstructive Surgery, McGill University Health Center, Montreal, Quebec, Canada
- James Lee
- Department of Plastic and Reconstructive Surgery, McGill University Health Center, Montreal, Quebec, Canada
- Joshua Vorstenbosch
- Department of Plastic and Reconstructive Surgery, McGill University Health Center, Montreal, Quebec, Canada
3. Gao B, Jiang J, Zhou S, Li J, Zhou Q, Li X. Toward the Next Generation Human-Machine Interaction: Headworn Wearable Devices. Anal Chem 2024; 96:10477-10487. [PMID: 38888091] [DOI: 10.1021/acs.analchem.4c01190]
Abstract
Wearable devices are lightweight, portable devices worn directly on the body or integrated into the user's clothing or accessories. They are usually connected to the Internet and combined with various software applications to monitor the user's physical condition. Recent research shows that head-worn wearable devices, particularly those incorporating microfluidic technology, enable the monitoring of bodily fluids and physiological states. Here, we summarize the main forms, functions, and applications of head-worn wearable devices through innovative research from recent years. Their main functions are sensor-based monitoring, diagnosis, and even therapeutic intervention, enabling real-time, noninvasive tracking of human physiological conditions. Furthermore, microfluidics enables real-time monitoring of body fluids and skin interstitial fluid, which is highly significant for medical diagnosis and has broad prospects for medical application. However, despite the progress made, significant challenges persist in integrating microfluidics into wearable devices at the current technological level. Herein, we focus on summarizing cutting-edge applications of microfluidic contact lenses and offer insights into the burgeoning intersection between microfluidics and head-worn wearables, providing a glimpse into their future prospects.
Affiliation(s)
- Bingbing Gao
- School of Pharmaceutical Sciences, Nanjing Tech University, Nanjing 211816, P. R. China
- Jingwen Jiang
- School of Pharmaceutical Sciences, Nanjing Tech University, Nanjing 211816, P. R. China
- Shu Zhou
- School of Pharmaceutical Sciences, Nanjing Tech University, Nanjing 211816, P. R. China
- Jun Li
- School of Pharmaceutical Sciences, Nanjing Tech University, Nanjing 211816, P. R. China
- Qian Zhou
- School of Pharmaceutical Sciences, Nanjing Tech University, Nanjing 211816, P. R. China
- Xin Li
- School of Pharmaceutical Sciences, Nanjing Tech University, Nanjing 211816, P. R. China
4. Tel A, Raccampo L, Vinayahalingam S, Troise S, Abbate V, Orabona GD, Sembronio S, Robiony M. Complex Craniofacial Cases through Augmented Reality Guidance in Surgical Oncology: A Technical Report. Diagnostics (Basel) 2024; 14:1108. [PMID: 38893634] [PMCID: PMC11171943] [DOI: 10.3390/diagnostics14111108]
Abstract
Augmented reality (AR) is a promising technology for enhancing image-guided surgery and represents an ideal bridge between precise virtual planning and computer-aided execution of surgical maneuvers in the operating room. In craniofacial surgical oncology, AR brings a digital, three-dimensional representation of the anatomy into the surgeon's sight and helps identify tumor boundaries and optimal surgical paths. Intraoperatively, real-time AR guidance provides surgeons with accurate spatial information, supporting accurate tumor resection and preservation of critical structures. In this paper, the authors review current evidence on AR applications in craniofacial surgery, focusing on real surgical applications, and compare the existing literature with their own experience during an AR- and navigation-guided craniofacial resection, in order to analyze which technological trajectories will shape the future of AR and to define new perspectives of application for this revolutionary technology.
Affiliation(s)
- Alessandro Tel
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
- Luca Raccampo
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
- Shankeeth Vinayahalingam
- Department of Oral and Maxillofacial Surgery, Radboud University Medical Center, 6525 GA Nijmegen, The Netherlands
- Stefania Troise
- Neurosciences Reproductive and Odontostomatological Sciences Department, University of Naples “Federico II”, 80131 Naples, Italy
- Vincenzo Abbate
- Neurosciences Reproductive and Odontostomatological Sciences Department, University of Naples “Federico II”, 80131 Naples, Italy
- Giovanni Dell’Aversana Orabona
- Neurosciences Reproductive and Odontostomatological Sciences Department, University of Naples “Federico II”, 80131 Naples, Italy
- Salvatore Sembronio
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
- Massimo Robiony
- Clinic of Maxillofacial Surgery, Head-Neck and NeuroScience Department, University Hospital of Udine, p.le S. Maria della Misericordia 15, 33100 Udine, Italy
5. Condino S, Cutolo F, Carbone M, Cercenelli L, Badiali G, Montemurro N, Ferrari V. Registration Sanity Check for AR-guided Surgical Interventions: Experience From Head and Face Surgery. IEEE J Transl Eng Health Med 2023; 12:258-267. [PMID: 38410181] [PMCID: PMC10896424] [DOI: 10.1109/jtehm.2023.3332088]
Abstract
Achieving and maintaining proper image registration accuracy is an open challenge of image-guided surgery. This work explores and assesses the efficacy of a registration sanity check method for augmented reality-guided navigation (AR-RSC), based on visual inspection of virtual 3D models of landmarks. We analyze the sensitivity and specificity of AR-RSC by recruiting 36 subjects to assess the registration accuracy of a set of 114 AR images generated from camera images acquired during an AR-guided orthognathic intervention. Translational or rotational errors of known magnitude, up to ±1.5 mm/±15.5°, were artificially added to the image set to simulate different registration errors. The study analyzes the performance of AR-RSC when varying (1) the virtual models selected for misalignment evaluation (the models of brackets, incisor teeth, and gingival margins in our experiment), (2) the type of registration error (translation/rotation), and (3) the user's level of experience with AR technologies. Results show that: (1) the sensitivity and specificity of AR-RSC depend on the virtual models (globally, a median true positive rate of up to 79.2% was reached with brackets, and a median true negative rate of up to 64.3% with incisor teeth); (2) some error components are more difficult to identify visually; and (3) the level of user experience does not affect the method. In conclusion, the proposed AR-RSC, tested also in the operating room, could represent an efficient method to monitor and optimize registration accuracy during the intervention, but special attention should be paid to the selection of the AR data chosen for visual inspection of registration accuracy.
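The true positive and true negative rates reported for AR-RSC reduce to a standard confusion-matrix computation over rater judgments of whether each AR image is misaligned. A minimal sketch; the ratings below are made up for illustration, not the study's data:

```python
def sensitivity_specificity(truth, calls):
    """truth[i]: True if image i actually contains a registration error;
    calls[i]: True if the rater flagged image i as misaligned.
    Returns (sensitivity, specificity)."""
    tp = sum(t and c for t, c in zip(truth, calls))          # hits
    fn = sum(t and not c for t, c in zip(truth, calls))      # missed errors
    tn = sum(not t and not c for t, c in zip(truth, calls))  # correct passes
    fp = sum(not t and c for t, c in zip(truth, calls))      # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical ratings for 8 AR images
truth = [True, True, True, False, False, True, False, False]
calls = [True, False, True, False, True, True, False, False]
sens, spec = sensitivity_specificity(truth, calls)
```

In the study this computation is repeated per virtual model (brackets, incisors, gingival margins), which is what makes the landmark choice show up directly in the reported rates.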
Affiliation(s)
- Sara Condino
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Fabrizio Cutolo
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Marina Carbone
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Laura Cercenelli
- EDIMES Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
- Nicola Montemurro
- Department of Neurosurgery, Azienda Ospedaliera Universitaria Pisana (AOUP), 56127 Pisa, Italy
- Vincenzo Ferrari
- Department of Information Engineering, University of Pisa, 56126 Pisa, Italy
6. Baecher H, Hoch CC, Knoedler S, Maheta BJ, Kauke-Navarro M, Safi AF, Alfertshofer M, Knoedler L. From bench to bedside - current clinical and translational challenges in fibula free flap reconstruction. Front Med (Lausanne) 2023; 10:1246690. [PMID: 37886365] [PMCID: PMC10598714] [DOI: 10.3389/fmed.2023.1246690]
Abstract
Fibula free flaps (FFF) represent a workhorse for various reconstructive scenarios in facial surgery. While FFF were initially established for mandible reconstruction, advances in planning and microsurgical techniques have paved the way toward a broader spectrum of indications, including maxillary defects. Essential factors for improving patient outcomes following FFF include minimal donor-site morbidity, adequate bone length, and dual blood supply. Yet persistent clinical and translational challenges hamper the effectiveness of FFF. In the preoperative phase, virtual surgical planning and artificial intelligence tools carry untapped potential, while the intraoperative role of individualized surgical templates and bioprinted prostheses remains to be summarized. Further, the integration of novel flap-monitoring technologies into postoperative patient management has been the subject of translational and clinical research efforts. Overall, there is a paucity of studies condensing the body of knowledge on emerging technologies and techniques in FFF surgery. Herein, we review current challenges and possible solutions in FFF. This line of research may serve as a pocket guide to cutting-edge developments and facilitate future targeted research in FFF.
Affiliation(s)
- Helena Baecher
- Department of Plastic, Hand and Reconstructive Surgery, University Hospital Regensburg, Regensburg, Germany
- Cosima C. Hoch
- Medical Faculty, Friedrich Schiller University Jena, Jena, Germany
- Samuel Knoedler
- Division of Plastic Surgery, Department of Surgery, Yale New Haven Hospital, Yale School of Medicine, New Haven, CT, United States
- Division of Plastic Surgery, Department of Surgery, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, United States
- Department of Plastic Surgery and Hand Surgery, Klinikum rechts der Isar, Technical University of Munich, Munich, Germany
- Bhagvat J. Maheta
- College of Medicine, California Northstate University, Elk Grove, CA, United States
- Martin Kauke-Navarro
- Division of Plastic Surgery, Department of Surgery, Yale New Haven Hospital, Yale School of Medicine, New Haven, CT, United States
- Ali-Farid Safi
- Craniologicum, Center for Cranio-Maxillo-Facial Surgery, Bern, Switzerland
- Faculty of Medicine, University of Bern, Bern, Switzerland
- Michael Alfertshofer
- Division of Hand, Plastic and Aesthetic Surgery, Ludwig-Maximilians-University Munich, Munich, Germany
- Leonard Knoedler
- Department of Plastic, Hand and Reconstructive Surgery, University Hospital Regensburg, Regensburg, Germany
- Division of Plastic Surgery, Department of Surgery, Yale New Haven Hospital, Yale School of Medicine, New Haven, CT, United States
7. Daher M, Ghanimeh J, Otayek J, Ghoul A, Bizdikian AJ, EL Abiad R. Augmented reality and shoulder replacement: a state-of-the-art review article. JSES Rev Rep Tech 2023; 3:274-278. [PMID: 37588507] [PMCID: PMC10426657] [DOI: 10.1016/j.xrrt.2023.01.008]
Abstract
Since the introduction of total shoulder arthroplasty, failure rates, often attributable to component malpositioning, have pushed the field to improve the procedure through new perioperative techniques and tools. Augmented reality, a tool newly adopted in orthopedic surgery, can help bypass this problem and reduce the failure rates seen in shoulder replacement surgery. Although this technology has revolutionized orthopedic surgery and helped improve the accuracy of shoulder prosthesis component positioning, it still has limitations, such as inaccurate superimposition of the virtual overlay, that should be addressed before it becomes standard of care.
Affiliation(s)
- Mohammad Daher
- Hotel Dieu de France, Saint Joseph University, Beirut, Lebanon
- Joe Ghanimeh
- Lebanese American University Medical Center Rizk Hospital, Beirut, Lebanon
- Joeffroy Otayek
- Lebanese American University Medical Center Rizk Hospital, Beirut, Lebanon
- Ali Ghoul
- Hotel Dieu de France, Saint Joseph University, Beirut, Lebanon
- Rami EL Abiad
- Hotel Dieu de France, Saint Joseph University, Beirut, Lebanon
8. Ruggiero F, Cercenelli L, Emiliani N, Badiali G, Bevini M, Zucchelli M, Marcelli E, Tarsitano A. Preclinical Application of Augmented Reality in Pediatric Craniofacial Surgery: An Accuracy Study. J Clin Med 2023; 12:2693. [PMID: 37048777] [PMCID: PMC10095377] [DOI: 10.3390/jcm12072693]
Abstract
Background: Augmented reality (AR) allows virtual information to be overlaid on and integrated with the real environment: the camera of the AR device reads the object and integrates the virtual data. AR has been widely applied in the medical and surgical sciences in recent years and has the potential to enhance intraoperative navigation. Materials and methods: In this study, the authors assess the accuracy of AR guidance using the commercial HoloLens 2 head-mounted display (HMD) in pediatric craniofacial surgery. Fronto-orbital remodeling (FOR) was selected as the test procedure; specifically, the frontal and nasal osteotomies were considered. Six participants (three surgeons and three engineers) were recruited to perform the osteotomies on a 3D-printed stereolithographic model under AR guidance. Using calibrated CAD/CAM cutting guides with different grooves, the authors measured the accuracy of the osteotomies at tolerance levels of ±1.5 mm, ±1 mm, and ±0.5 mm. Results: With the HoloLens 2, the majority of participants successfully traced the trajectories of the frontal and nasal osteotomies within ±1.5 mm. At ±1 mm, 80% succeeded for the nasal osteotomy and 52% for the frontal osteotomy; at ±0.5 mm, 61% succeeded for the nasal osteotomy and 33% for the frontal osteotomy. Conclusions: Although this was an in vitro study, the authors report encouraging results for the prospective use of AR on actual patients.
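Accuracy levels of this kind are tallies of how many traced cuts stayed inside a tolerance groove, i.e. a fraction-within-tolerance summary. A small sketch of that computation; the deviation values are hypothetical, not the study's measurements:

```python
def fraction_within(deviations_mm, tol_mm):
    """Fraction of measured osteotomy deviations that fall inside ±tol_mm."""
    hits = sum(abs(d) <= tol_mm for d in deviations_mm)
    return hits / len(deviations_mm)

# Hypothetical deviations of a traced cut from the planned line, in mm
devs = [0.3, -0.8, 1.2, -1.4, 0.6, 2.1, -0.2, 0.9, -1.1, 0.4]

# Tally at the three tolerance levels used in the study
results = {tol: fraction_within(devs, tol) for tol in (1.5, 1.0, 0.5)}
```

Reporting the same data at several tolerance levels, as the authors do, shows how quickly success rates fall as the clinical requirement tightens.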
Affiliation(s)
- Federica Ruggiero
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Maxillo-Facial Surgery Unit, AUSL Bologna, 40124 Bologna, Italy
- Laura Cercenelli
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Nicolas Emiliani
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Mirko Bevini
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Mino Zucchelli
- Pediatric Neurosurgery, IRCCS Istituto delle Scienze Neurologiche di Bologna, Via Altura 3, 40138 Bologna, Italy
- Emanuela Marcelli
- Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano
- Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
9. Tzelnick S, Rampinelli V, Sahovaler A, Franz L, Chan HHL, Daly MJ, Irish JC. Skull-Base Surgery-A Narrative Review on Current Approaches and Future Developments in Surgical Navigation. J Clin Med 2023; 12:2706. [PMID: 37048788] [PMCID: PMC10095207] [DOI: 10.3390/jcm12072706]
Abstract
Surgical navigation technology combines patient imaging studies with intraoperative real-time data to improve surgical precision and patient outcomes. The navigation workflow can also include preoperative planning, which can reliably simulate the intended resection and reconstruction. The advantage of this approach in skull-base surgery is that it guides access into a complex three-dimensional area and orients tumors intraoperatively with respect to critical structures, such as the orbit, the carotid artery, and the brain. This enhances the surgeon's ability to preserve normal anatomy while resecting tumors with adequate margins. The aim of this narrative review is to outline the state of the art and future directions of surgical navigation in the skull base, focusing on the advantages and pitfalls of this technique. We also present our group's experience in this field within the frame of current research trends.
Affiliation(s)
- Sharon Tzelnick
- Division of Head and Neck Surgery, Princess Margaret Cancer Center, University of Toronto, Toronto, ON M5G 2M9, Canada
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Vittorio Rampinelli
- Unit of Otorhinolaryngology—Head and Neck Surgery, Department of Medical and Surgical Specialties, Radiologic Sciences and Public Health, University of Brescia, 25121 Brescia, Italy
- Technology for Health (PhD Program), Department of Information Engineering, University of Brescia, 25121 Brescia, Italy
- Axel Sahovaler
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Head & Neck Surgery Unit, University College London Hospitals, London NW1 2PG, UK
- Leonardo Franz
- Department of Neuroscience DNS, Otolaryngology Section, University of Padova, 35122 Padua, Italy
- Harley H. L. Chan
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Michael J. Daly
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
- Jonathan C. Irish
- Division of Head and Neck Surgery, Princess Margaret Cancer Center, University of Toronto, Toronto, ON M5G 2M9, Canada
- Guided Therapeutics (GTx) Program, TECHNA Institute, University Health Network, Toronto, ON M5G 2C4, Canada
10. Shimizu T, Oba T, Ito KI. The Advantage of Using an Optical See-Through Head-Mounted Display in Ultrasonography-Guided Needle Biopsy Procedures: A Prospective Randomized Study. J Clin Med 2023; 12:512. [PMID: 36675443] [PMCID: PMC9865023] [DOI: 10.3390/jcm12020512]
Abstract
An optical see-through head-mounted display (OST-HMD) can potentially improve the safety and accuracy of ultrasonography (US)-guided fine-needle aspiration. We aimed to evaluate the usefulness of an OST-HMD in US-guided needle-puncture procedures. We conducted a prospective randomized controlled study comparing the accuracy and safety of US-guided needle puncture, and the stress on the practitioner, when using an OST-HMD versus a standard US display (SUD). Inexperienced medical students were enrolled and randomly divided into two groups. A breast phantom was used to evaluate the time required for and accuracy of US-guided needle puncture, and practitioner stress was quantified using a visual analog scale (VAS). On first performing the procedure, the time required to reach a shallow target lesion was significantly shorter in the OST-HMD group (39.8 ± 39.9 s) than in the SUD group (71.0 ± 81.0 s) (p = 0.01). Using the OST-HMD also significantly reduced unintentional punctures of non-target lesions (p = 0.01). Furthermore, the stress practitioners felt when capturing the image of the target lesion (p < 0.001), inserting and advancing the needle deeper (p < 0.001), and puncturing the target lesion (p < 0.001) was significantly lower in the OST-HMD group than in the SUD group. Use of an OST-HMD may improve the accuracy and safety of US-guided needle-puncture procedures and may reduce practitioner stress during the procedure.
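Group comparisons like the time-to-target result above are often summarized from means and SDs alone. A sketch of Welch's t statistic computed from summary statistics, using the reported means/SDs; the group size n = 20 per arm is an assumption (the abstract does not report n), so the resulting statistic is illustrative only, and with SDs this skewed the authors may well have used a nonparametric test instead:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom,
    computed from per-group summary statistics (mean, SD, n)."""
    v1, v2 = s1**2 / n1, s2**2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# Reported times (s): OST-HMD 39.8 ± 39.9 vs SUD 71.0 ± 81.0; n assumed.
t, df = welch_t(39.8, 39.9, 20, 71.0, 81.0, 20)
```

A negative t here simply reflects that the OST-HMD group's mean time is lower; the actual p = 0.01 in the abstract comes from the authors' own analysis, not from this sketch.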
11. Ceccariglia F, Cercenelli L, Badiali G, Marcelli E, Tarsitano A. Application of Augmented Reality to Maxillary Resections: A Three-Dimensional Approach to Maxillofacial Oncologic Surgery. J Pers Med 2022; 12:2047. [PMID: 36556268] [PMCID: PMC9785494] [DOI: 10.3390/jpm12122047]
Abstract
Although virtual reality, augmented reality, and mixed reality have been emerging methodologies for several years, only recent technological and scientific advances have made them suitable for revolutionizing clinical care and medical settings through advanced features and improved healthcare services. Over the past fifteen years, tools and applications using augmented reality (AR) have been designed and tested in various surgical and medical disciplines, including maxillofacial surgery. The purpose of this paper is to show how a marker-less AR guidance system using the Microsoft® HoloLens 2 can be applied in mandibular and maxillary demolition surgery to guide maxillary osteotomies. We describe three mandibular and maxillary oncologic resections performed during 2021 with AR support. In these three patients, we applied a marker-less tracking method based on recognition of the patient's facial profile. Using HoloLens 2 smart glasses, the surgeon could see the virtual surgical plan superimposed on the patient's anatomy. We show that performing osteotomies under AR guidance is feasible and viable, as demonstrated by comparison with osteotomies performed using CAD-CAM cutting guides. The technology has both advantages and disadvantages, and further research is needed to improve the stability and robustness of the marker-less tracking method applied to patient face recognition.
Affiliation(s)
- Francesco Ceccariglia
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Correspondence: ; Tel.: +39-051-2144197
- Laura Cercenelli
- eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Giovanni Badiali
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
- Emanuela Marcelli
- eDimes Lab-Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Achille Tarsitano
- Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
- Maxillofacial Surgery Unit, Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
Collapse
|
12
|
Mendicino AR, Condino S, Carbone M, Cutolo F, Cattari N, Andreani L, Parchi PD, Capanna R, Ferrari V. Augmented Reality as a Tool to Guide Patient-Specific Templates Placement in Pelvic Resections. Annu Int Conf IEEE Eng Med Biol Soc 2022; 2022:3481-3484. [PMID: 36086331 DOI: 10.1109/embc48229.2022.9871766]
Abstract
Patient-specific templates (PST) have become a useful tool for guiding osteotomy in complex surgical scenarios such as pelvic resections. The design of the surgical template results in sharper, less jagged resection margins than freehand cuts. However, their correct placement can become difficult in some anatomical regions and cannot be verified during surgery. Conventionally, pelvic resections are performed using Computer Assisted Surgery (CAS), and in recent years Augmented Reality (AR) has been proposed in the literature as an additional tool to support PST placement. This work presents an AR task to simplify and improve the accuracy of the positioning of the template by displaying virtual content. The focus of the work is the creation of the virtual guides displayed during the AR task. The system was validated on a patient-specific phantom designed to provide a realistic setup. Encouraging results have been achieved. The use of the AR simplifies the surgical task and optimizes the correct positioning of the cutting template: an average error of 2.19 mm has been obtained, lower than obtained with state-of-the-art solutions. In addition, supporting PST placement through AR guidance is less time-consuming than the standard procedure that solely relies on anatomical landmarks as reference.
13
Zoabi A, Redenski I, Oren D, Kasem A, Zigron A, Daoud S, Moskovich L, Kablan F, Srouji S. 3D Printing and Virtual Surgical Planning in Oral and Maxillofacial Surgery. J Clin Med 2022; 11:2385. [PMID: 35566511 PMCID: PMC9104292 DOI: 10.3390/jcm11092385]
Abstract
Compared to traditional manufacturing methods, additive manufacturing and 3D printing stand out in their ability to rapidly fabricate complex structures and precise geometries. The growing need for products with different designs, purposes and materials led to the development of 3D printing, serving as a driving force for the 4th industrial revolution and digitization of manufacturing. 3D printing has had a global impact on healthcare, with patient-customized implants now replacing generic implantable medical devices. This revolution has had a particularly significant impact on oral and maxillofacial surgery, where surgeons rely on precision medicine in everyday practice. Trauma, orthognathic surgery and total joint replacement therapy represent several examples of treatments improved by 3D technologies. The widespread and rapid implementation of 3D technologies in clinical settings has led to the development of point-of-care treatment facilities with in-house infrastructure, enabling surgical teams to participate in the 3D design and manufacturing of devices. 3D technologies have had a tremendous impact on clinical outcomes and on the way clinicians approach treatment planning. The current review offers our perspective on the implementation of 3D-based technologies in the field of oral and maxillofacial surgery, while indicating major clinical applications. Moreover, the current report outlines the 3D printing point-of-care concept in the field of oral and maxillofacial surgery.
Affiliation(s)
- Adeeb Zoabi, Idan Redenski, Daniel Oren, Adi Kasem, Asaf Zigron, Shadi Daoud, Liad Moskovich, Fares Kablan, Samer Srouji: Department of Oral and Maxillofacial Surgery, Galilee College of Dental Sciences, Galilee Medical Center, Nahariya 2210001, Israel; The Azrieli Faculty of Medicine, Bar-Ilan University, Safed 1311502, Israel (Correspondence: Samer Srouji)
14
Key Ergonomics Requirements and Possible Mechanical Solutions for Augmented Reality Head-Mounted Displays in Surgery. Multimodal Technol Interact 2022. [DOI: 10.3390/mti6020015]
Abstract
In the context of a European project, we identified over 150 requirements for the development of an augmented reality (AR) head-mounted display (HMD) specifically tailored to support highly challenging manual surgical procedures. The requirements were established by surgeons from different specialties and by industrial players working in the surgical field who had strong commitments to the exploitation of this technology. Some of these requirements were specific to the project, while others can be seen as key requirements for the implementation of an efficient and reliable AR headset to be used to support manual activities in the peripersonal space. The aim of this work is to describe these ergonomic requirements that impact the mechanical design of the HMDs, the possible innovative solutions to these requirements, and how these solutions have been used to implement the AR headset in surgical navigation. We also report the results of a preliminary qualitative evaluation of the AR headset by three surgeons.
15
Architecture of a Hybrid Video/Optical See-through Head-Mounted Display-Based Augmented Reality Surgical Navigation Platform. Information 2022. [DOI: 10.3390/info13020081]
Abstract
In the context of image-guided surgery, augmented reality (AR) represents a ground-breaking enticing improvement, mostly when paired with wearability in the case of open surgery. Commercially available AR head-mounted displays (HMDs), designed for general purposes, are increasingly used outside their indications to develop surgical guidance applications with the ambition to demonstrate the potential of AR in surgery. The applications proposed in the literature underline the hunger for AR-guidance in the surgical room together with the limitations that hinder commercial HMDs from being the answer to such a need. The medical domain demands specifically developed devices that address, together with ergonomics, the achievement of surgical accuracy objectives and compliance with medical device regulations. In the framework of an EU Horizon2020 project, a hybrid video and optical see-through augmented reality headset paired with a software architecture, both specifically designed to be seamlessly integrated into the surgical workflow, has been developed. In this paper, the overall architecture of the system is described. The developed AR HMD surgical navigation platform was positively tested on seven patients to aid the surgeon while performing Le Fort 1 osteotomy in cranio-maxillofacial surgery, demonstrating the value of the hybrid approach and the safety and usability of the navigation platform.
16
Cercenelli L, Babini F, Badiali G, Battaglia S, Tarsitano A, Marchetti C, Marcelli E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front Oncol 2022; 11:804748. [PMID: 35071009 PMCID: PMC8770836 DOI: 10.3389/fonc.2021.804748]
Abstract
Background: Augmented Reality (AR) represents an evolution of navigation-assisted surgery, providing surgeons with a virtual aid contextually merged with the real surgical field. We recently reported a case series of AR-assisted fibular flap harvesting for mandibular reconstruction. However, the registration accuracy between the real and the virtual content needs to be systematically evaluated before widely promoting this tool in clinical practice. In this paper, after describing the AR-based protocol implemented for both a tablet and HoloLens 2 smart glasses, we evaluated in a first test session the achievable registration accuracy with the two display solutions, and in a second test session the success rate in executing the AR-guided skin paddle incision task on a 3D-printed leg phantom. Methods: From a real computed tomography dataset, 3D virtual models of a human leg, including fibula, arteries, and skin with the planned paddle profile for harvesting, were obtained. All virtual models were imported into Unity software to develop a marker-less AR application suitable for use both via tablet and via the HoloLens 2 headset. The registration accuracy for both solutions was verified on a 3D-printed leg phantom obtained from the virtual models, by repeatedly applying the tracking function and computing pose deviations between the AR-projected virtual skin paddle profile and the real one transferred to the phantom via a CAD/CAM cutting guide. The success rate in completing the AR-guided task of skin paddle harvesting was evaluated using CAD/CAM templates positioned on the phantom model surface. Results: On average, the marker-less AR protocol showed comparable registration errors (ranging within 1-5 mm) for the tablet-based and HoloLens-based solutions. Registration accuracy seems to be quite sensitive to ambient light conditions. We found a good success rate in completing the AR-guided task within an error margin of 4 mm (97% and 100% for tablet and HoloLens, respectively). All subjects reported greater usability and ergonomics for the HoloLens 2 solution. Conclusions: Results revealed that the proposed marker-less AR-based protocol may guarantee a registration error within 1-5 mm for assisting skin paddle harvesting in the clinical setting. Optimal lighting conditions and further improvement of marker-less tracking technologies have the potential to increase the efficiency and precision of this AR-assisted reconstructive surgery.
Affiliation(s)
- Laura Cercenelli, Federico Babini, Emanuela Marcelli: eDIMES Lab - Laboratory of Bioengineering, Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Bologna, Italy
- Giovanni Badiali, Achille Tarsitano, Claudio Marchetti: Maxillofacial Surgery Unit, Head and Neck Department, IRCCS Azienda Ospedaliera Universitaria di Bologna, Department of Biomedical and Neuromotor Sciences, Alma Mater Studiorum University of Bologna, Bologna, Italy
- Salvatore Battaglia: Maxillofacial Surgery Unit, Policlinico San Marco University Hospital, University of Catania, Catania, Italy
17
Cercenelli L, De Stefano A, Billi AM, Ruggeri A, Marcelli E, Marchetti C, Manzoli L, Ratti S, Badiali G. AEducaAR, Anatomical Education in Augmented Reality: A Pilot Experience of an Innovative Educational Tool Combining AR Technology and 3D Printing. Int J Environ Res Public Health 2022; 19:1024. [PMID: 35162049 PMCID: PMC8834017 DOI: 10.3390/ijerph19031024]
Abstract
Gross anatomy knowledge is an essential element for medical students in their education, and nowadays, cadaver-based instruction represents the main instructional tool able to provide three-dimensional (3D) and topographical comprehensions. The aim of the study was to develop and test a prototype of an innovative tool for medical education in human anatomy based on the combination of augmented reality (AR) technology and a tangible 3D printed model that can be explored and manipulated by trainees, thus favoring a three-dimensional and topographical learning approach. After development of the tool, called AEducaAR (Anatomical Education with Augmented Reality), it was tested and evaluated by 62 second-year degree medical students attending the human anatomy course at the International School of Medicine and Surgery of the University of Bologna. Students were divided into two groups: AEducaAR-based learning ("AEducaAR group") was compared to standard learning using human anatomy atlas ("Control group"). Both groups performed an objective test and an anonymous questionnaire. In the objective test, the results showed no significant difference between the two learning methods; instead, in the questionnaire, students showed enthusiasm and interest for the new tool and highlighted its training potentiality in open-ended comments. Therefore, the presented AEducaAR tool, once implemented, may contribute to enhancing students' motivation for learning, increasing long-term memory retention and 3D comprehension of anatomical structures. Moreover, this new tool might help medical students to approach to innovative medical devices and technologies useful in their future careers.
Affiliation(s)
- Laura Cercenelli, Emanuela Marcelli: eDIMES Lab-Laboratory of Bioengineering, Department of Experimental Diagnostic and Specialty Medicine, University of Bologna, 40138 Bologna, Italy
- Alessia De Stefano, Anna Maria Billi, Alessandra Ruggeri, Lucia Manzoli, Stefano Ratti: Cellular Signalling Laboratory, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy (Correspondence: Stefano Ratti)
- Claudio Marchetti, Giovanni Badiali: Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40126 Bologna, Italy; Department of Maxillo-Facial Surgery, IRCCS Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
18
Sun X, Gu S, Jiang L, Wu Y. A Low-cost Mobile System with Multi-AR Guidance for Brain Surgery Assistance. Annu Int Conf IEEE Eng Med Biol Soc 2021; 2021:2222-2225. [PMID: 34891728 DOI: 10.1109/embc46164.2021.9630928]
Abstract
Surgical operations, especially brain surgery, require a comprehensive understanding of the area surrounding the surgical path. Augmented Reality (AR) technology provides an effective way to increase the surgeon's perception of the plan. However, current applications have been hindered by expensive hardware and limited guidance information. In this paper, an AR system designed specifically for brain surgery is proposed, featuring low-cost system components and multi-AR guidance. Lightweight AR glasses were used together with an ordinary mobile phone to provide mobile AR to the surgeon. A web-based application was implemented for compatibility with various mobile devices. Multi-AR information was designed for surgical guidance, including the planned operation path, dangerous areas, and three quantitative guidance metrics. The patient's specific 3D model was reconstructed from CT images, and a phantom was used to evaluate the effectiveness of the system. The experimental results indicated that multi-AR guidance outperformed both no AR guidance and virtual path guidance alone. As a result, the system could help the operator perform operation tasks more easily. Clinical Relevance: The proposed method provides a potential way to support brain surgery with multiple AR guidance cues from a low-cost AR system, which may improve the surgeon's understanding of the surgical site.
19
Mehrotra D, Markus A. Emerging simulation technologies in global craniofacial surgical training. J Oral Biol Craniofac Res 2021; 11:486-499. [PMID: 34345584 PMCID: PMC8319526 DOI: 10.1016/j.jobcr.2021.06.002]
Abstract
The last few decades have seen exponential growth in the development and adoption of novel technologies in the medical and surgical training of residents globally. Simulation is an active and innovative teaching method and can be achieved via physical or digital models. Simulation allows learners to practice repeatedly, without the risk of causing harm to an actual patient, and to enhance their surgical skills and efficiency. Simulation may also allow the clinical instructor to objectively test the trainee's ability to carry out a clinical procedure competently and independently prior to completion of the program. This review aims to explore the role of emerging simulation technologies in the global craniofacial training of students and residents in improving their surgical knowledge and skills. These technologies include 3D-printed biomodels, virtual and augmented reality, the use of Google Glass, HoloLens and haptic feedback, surgical boot camps, serious games and escape games, and how they can be implemented in low- and middle-income countries. Craniofacial surgical training methods will probably go through a sea change in the coming years, with the integration of these new technologies into the surgical curriculum, allowing learning in a safe environment with a virtual patient through repeated exercise. In the future, simulation may also be used as an assessment tool for specific procedures, without putting an actual patient at risk. Although these new technologies are being enthusiastically welcomed by young surgeons, they should only be used as an addition to the actual curriculum and not as a replacement for conventional tools, as the mentor-mentee relationship can never be replaced by any technology.
Affiliation(s)
- Divya Mehrotra: Department of Oral and Maxillofacial Surgery, KGMU, Lucknow, India
- A.F. Markus: Emeritus Consultant Maxillofacial Surgeon, Poole Hospital, University of Bournemouth, University of Duisburg-Essen, Trinity College, Dublin, Ireland
20
Tel A, Arboit L, Sembronio S, Costa F, Nocini R, Robiony M. The Transantral Endoscopic Approach: A Portal for Masses of the Inferior Orbit - Improving Surgeons' Experience Through Virtual Endoscopy and Augmented Reality. Front Surg 2021; 8:715262. [PMID: 34497829 PMCID: PMC8419325 DOI: 10.3389/fsurg.2021.715262]
Abstract
In the past years, endoscopic techniques have raised increasing interest as a means of performing minimally invasive accesses to the orbit, resulting in excellent clinical outcomes with lower morbidity and complication rates. Among endoscopic approaches, the transantral endoscopic approach creates a portal to the orbital floor, representing the most straightforward access to lesions located in the inferior orbital space. However, while endoscopic surgery provides an enhanced, magnified view of the anatomy in a bloodless field, it has several limitations compared with classic open surgery, owing to restricted operative spaces. Virtual surgical planning and anatomical computer-generated models have proved to be of great importance for planning endoscopic surgical approaches, and their role can be widened with the integration of surgical navigation, virtual endoscopy simulation, and augmented reality (AR). This study focuses on the strict conjugation between the technologies that allow the virtualization of surgery in an entirely digital environment, which can be transferred to the patient using intraoperative navigation or to a printed model using AR for pre-surgical analysis. Therefore, the interaction between different software packages and platforms offers a highly predictive preview of the surgical scenario, contributing to increased orientation, awareness, and effectiveness of maneuvers performed under endoscopic guidance, which can be checked at any time using surgical navigation. In this paper, the authors explore the transantral approach for the excision of masses of the inferior orbital compartment through modern technology. They apply this technique to masses located in the inferior orbit and share their clinical results, describing why technological innovation, and in particular computer planning, virtual endoscopy, navigation, and AR, can contribute to empowering minimally invasive orbital surgery, while also offering a valuable and indispensable tool for pre-surgical analysis and training.
Affiliation(s)
- Alessandro Tel, Salvatore Sembronio, Fabio Costa, Massimo Robiony: Department of Maxillofacial Surgery, University Hospital of Udine, Udine, Italy
- Lorenzo Arboit: Faculty of Medicine and Surgery, Sant'Anna School of Advanced Studies, Pisa, Italy
- Riccardo Nocini: Department of Otorhinolaryngology, University Hospital of Verona, Verona, Italy
21
Schlueter-Brust K, Henckel J, Katinakis F, Buken C, Opt-Eynde J, Pofahl T, Rodriguez y Baena F, Tatti F. Augmented-Reality-Assisted K-Wire Placement for Glenoid Component Positioning in Reversed Shoulder Arthroplasty: A Proof-of-Concept Study. J Pers Med 2021; 11:777. [PMID: 34442421 PMCID: PMC8400865 DOI: 10.3390/jpm11080777]
Abstract
The accuracy of the implant's post-operative position and orientation in reverse shoulder arthroplasty is known to play a significant role in both clinical and functional outcomes. Whilst technologies such as navigation and robotics have demonstrated superior radiological outcomes in many fields of surgery, the impact of augmented reality (AR) assistance in the operating room is still unknown. Malposition of the glenoid component in shoulder arthroplasty is known to result in implant failure and early revision surgery. The use of AR has many promising advantages, including allowing the detailed study of patient-specific anatomy without the need for invasive procedures such as arthroscopy to interrogate the joint's articular surface. In addition, this technology has the potential to assist surgeons intraoperatively in aiding the guidance of surgical tools. It offers the prospect of increased component placement accuracy, reduced surgical procedure time, and improved radiological and functional outcomes, without recourse to the use of large navigation or robotic instruments, with their associated high overhead costs. This feasibility study describes the surgical workflow from a standardised CT protocol, via 3D reconstruction, 3D planning, and use of a commercial AR headset, to AR-assisted k-wire placement. Post-operative outcome was measured using a high-resolution laser scanner on the patient-specific 3D printed bone. In this proof-of-concept study, the discrepancy between the planned and the achieved glenoid entry point and guide-wire orientation was approximately 3 mm with a mean angulation error of 5°.
Affiliation(s)
- Klaus Schlueter-Brust, Faidon Katinakis, Christoph Buken, Jörg Opt-Eynde: Department of Orthopaedic Surgery, St. Franziskus Hospital Köln, 50825 Köln, Germany (Correspondence: K. Schlueter-Brust; Tel.: +49-221-5591-1131)
- Johann Henckel: Institute of Orthopaedics, The Royal National Orthopaedic Hospital, Brockley Hill, Stanmore, London HA7 4LP, UK
- F. Rodriguez y Baena, Fabio Tatti: Mechatronics in Medicine Laboratory, Imperial College London, London SW7 2AZ, UK
22
Can Liquid Lenses Increase Depth of Field in Head Mounted Video See-Through Devices? J Imaging 2021; 7:138. [PMID: 34460773 PMCID: PMC8404927 DOI: 10.3390/jimaging7080138]
Abstract
Wearable Video See-Through (VST) devices for Augmented Reality (AR) and for obtaining a Magnified View are taking hold in the medical and surgical fields. However, these devices are not yet usable in daily clinical practice, due to focusing problems and a limited depth of field. This study investigates the use of liquid-lens optics to create an autofocus system for wearable VST visors. The autofocus system is based on a Time of Flight (TOF) distance sensor and an active autofocus control system. The integrated autofocus system in the wearable VST viewers showed good potential in terms of providing rapid focus at various distances and a magnified view.
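In idealized thin-lens terms, the TOF-driven autofocus described in this abstract reduces to mapping the measured working distance to the optical power the liquid lens must add. The sketch below is a hypothetical illustration under that assumption only; the paper does not describe its control law, and the function name, drive range, and clamp limits are invented placeholders:

```python
def lens_power_for_distance(distance_mm: float,
                            min_dpt: float = -3.0,
                            max_dpt: float = 10.0) -> float:
    """Return the liquid-lens power (diopters) needed to focus at distance_mm.

    Assumes the fixed optics are focused at infinity when the tunable lens
    contributes 0 dpt, so focusing at d metres needs roughly +1/d dpt.
    The result is clamped to the lens's (hypothetical) drive range.
    """
    # Guard against zero or negative TOF readings before converting to metres.
    distance_m = max(distance_mm, 1.0) / 1000.0
    power = 1.0 / distance_m
    return min(max(power, min_dpt), max_dpt)
```

For a working distance of 500 mm this yields 2 dpt; a real controller would additionally filter the TOF signal and rate-limit the lens drive to avoid visible focus hunting.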
23
Augmented Reality Based Surgical Navigation of Complex Pelvic Osteotomies - A Feasibility Study on Cadavers. Appl Sci (Basel) 2021. [DOI: 10.3390/app11031228]
Abstract
Augmented reality (AR)-based surgical navigation may offer new possibilities for the safe and accurate execution of complex osteotomies. In this study we investigated the feasibility of navigating the periacetabular osteotomy of Ganz (PAO), known as one of the most complex orthopedic interventions, on two cadaveric pelves under realistic operating room conditions. Preoperative planning was conducted on computed tomography (CT)-reconstructed 3D models using in-house developed software, which allowed the creation of cutting-plane objects for planning the osteotomies and the reorientation of the acetabular fragment. An AR application was developed comprising point-based registration, motion compensation, and guidance for both the osteotomies and the fragment reorientation. Navigation accuracy was evaluated on CT-reconstructed 3D models, resulting in an error of 10.8 mm for osteotomy starting points and 5.4° for osteotomy directions. The reorientation errors were 6.7°, 7.0° and 0.9° for the x-, y- and z-axis, respectively. The average postoperative error of the lateral center-edge (LCE) angle was 4.5°. Our study demonstrated that AR-based execution of complex osteotomies is feasible. Fragment realignment navigation needs further improvement, although it is already more accurate than the state of the art in PAO surgery.
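The accuracy figures reported above (10.8 mm starting-point error, 5.4° direction error) are, at their core, distances and angles between planned and achieved geometry. A minimal sketch of how such metrics are typically computed from 3D coordinates; the function names are illustrative, not taken from the paper:

```python
import math

def point_error_mm(planned, achieved):
    """Translational navigation error: Euclidean distance between the
    planned and the achieved osteotomy starting point (coordinates in mm)."""
    return math.dist(planned, achieved)

def direction_error_deg(planned_dir, achieved_dir):
    """Angular navigation error: angle between the planned and the achieved
    osteotomy direction vectors, in degrees."""
    dot = sum(a * b for a, b in zip(planned_dir, achieved_dir))
    norm = (math.sqrt(sum(a * a for a in planned_dir))
            * math.sqrt(sum(b * b for b in achieved_dir)))
    # Clamp the cosine to [-1, 1] to guard acos against rounding error.
    cosine = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cosine))
```

Per-axis reorientation errors like the 6.7°/7.0°/0.9° values would additionally require decomposing the planned-to-achieved rotation into axis components, which depends on the rotation convention the authors chose.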
24
Advances and Trends in Pediatric Minimally Invasive Surgery. J Clin Med 2020; 9:jcm9123999. [PMID: 33321836] [PMCID: PMC7764454] [DOI: 10.3390/jcm9123999] [Received: 11/13/2020] [Revised: 11/28/2020] [Accepted: 12/03/2020] Open
Abstract
Since many meta-analyses comparing pediatric minimally invasive to open surgery can be found in the literature, the aim of this review is to summarize the current state of minimally invasive pediatric surgery and specifically to focus on the trends and developments we expect in the upcoming years. Print and electronic databases were systematically searched for specific keywords, and cross-link searches with references found in the literature were added. Full-text articles were obtained, and eligibility criteria were applied independently. Pediatric minimally invasive surgery is a wide field, ranging from minimally invasive fetal surgery through microlaparoscopy in newborns to robotic surgery in adolescents. New techniques and devices, such as natural orifice transluminal endoscopic surgery (NOTES), single-incision and endoscopic surgery, as well as the artificial uterus as a backup for surgery in preterm fetuses, all contribute to the development of less invasive procedures for children. Despite all the promising technical developments that will change the way pediatric surgeons perform minimally invasive procedures in the upcoming years, one must bear in mind that only hard data from prospective, randomized, controlled, double-blind trials can validate whether these techniques and devices truly improve the surgical outcome of our patients.