1
Javaheri H, Ghamarnejad O, Bade R, Lukowicz P, Karolus J, Stavrou GA. Beyond the visible: preliminary evaluation of the first wearable augmented reality assistance system for pancreatic surgery. Int J Comput Assist Radiol Surg 2025;20:117-129. [PMID: 38849631] [PMCID: PMC11757645] [DOI: 10.1007/s11548-024-03131-0]
Abstract
PURPOSE The retroperitoneal nature of the pancreas, marked by minimal intraoperative organ shifts and deformations, makes augmented reality (AR)-based systems highly promising for pancreatic surgery. This study presents preliminary data from a prospective study aiming to develop the first wearable AR assistance system, ARAS, for pancreatic surgery and to evaluate its usability, accuracy, and effectiveness in enhancing the perioperative outcomes of patients. METHODS We developed ARAS as a two-phase system for a wearable AR device to aid surgeons in planning and operation. This system was used to visualize and register patient-specific 3D anatomical models during surgery. The location and precision of the registered 3D anatomy were evaluated by assessing the arterial pulse and employing Doppler and duplex ultrasonography. The usability, accuracy, and effectiveness of ARAS were assessed using a five-point Likert scale questionnaire. RESULTS Perioperative outcomes of five patients who underwent various pancreatic resections with ARAS are presented. Surgeons rated ARAS as excellent for preoperative planning. All structures were accurately identified without any noteworthy errors. Only tumor identification was rated lower after the preparation phase, especially in patients who underwent pancreaticoduodenectomy, because of the extensive mobilization of peripancreatic structures. No perioperative complications related to ARAS were observed. CONCLUSIONS ARAS shows promise in enhancing surgical precision during pancreatic procedures. Its efficacy in preoperative planning and intraoperative vascular identification positions it as a valuable tool for pancreatic surgery and a potential educational resource for future surgical residents.
Affiliation(s)
- Hamraz Javaheri
- German Research Center for Artificial Intelligence (DFKI), Kaiserslautern, Germany
- Omid Ghamarnejad
- Department of General, Visceral, and Oncological Surgery, Klinikum Saarbrücken, Winterberg 1, 66119 Saarbrücken, Germany
- Paul Lukowicz
- German Research Center for Artificial Intelligence (DFKI), Kaiserslautern, Germany
- University of Kaiserslautern-Landau, Kaiserslautern, Germany
- Jakob Karolus
- German Research Center for Artificial Intelligence (DFKI), Kaiserslautern, Germany
- University of Kaiserslautern-Landau, Kaiserslautern, Germany
- Gregor Alexander Stavrou
- Department of General, Visceral, and Oncological Surgery, Klinikum Saarbrücken, Winterberg 1, 66119 Saarbrücken, Germany
2
Javaheri H, Ghamarnejad O, Widyaningsih R, Bade R, Lukowicz P, Karolus J, Stavrou GA. Enhancing Perioperative Outcomes of Pancreatic Surgery with Wearable Augmented Reality Assistance System: A Matched-Pair Analysis. Ann Surg Open 2024;5:e516. [PMID: 39711676] [PMCID: PMC11661739] [DOI: 10.1097/as9.0000000000000516]
Abstract
Objective The present study aimed to evaluate the safety of the first wearable augmented reality assistance system (ARAS) specifically designed for pancreatic surgery and its impact on perioperative outcomes. Background Pancreatic surgery remains highly complex and is associated with a high rate of perioperative complications. ARAS, as an intraoperative assistance system, has the potential to reduce these complications. Methods This prospective, single-center study included 20 patients who underwent pancreatic surgery using ARAS. These patients were matched in a 1:3 ratio with 60 patients from our retrospective data who underwent standard pancreatic resection. Matching variables were selected based on factors associated with poor intraoperative outcomes. Results A higher proportion of patients in the ARAS group were diagnosed with borderline resectable pancreatic cancer and received neoadjuvant chemotherapy (20.0% vs 6.7%, P = 0.085). Additionally, more patients in the ARAS group required arterial resection compared with the control group (15.0% vs 0.0%, P = 0.002). Nevertheless, the ARAS group had a significantly shorter operative time (246 vs 299 minutes, P = 0.004) and required significantly fewer intraoperative blood transfusions (0.0 ± 0.0 vs 0.5 ± 1.4 units, P = 0.014). None of the patients in the ARAS group had positive resection margins (0.0% vs 20.0%, P = 0.045). Furthermore, patients in the ARAS group experienced a significantly shorter hospital stay (13.8 ± 6.6 vs 17.9 ± 8.2 days, P = 0.046). Conclusions ARAS is a safe and effective assistance system for pancreatic surgery, offering superior perioperative outcomes compared with standard procedures.
Affiliation(s)
- Hamraz Javaheri
- Department of Embedded Intelligence, German Research Center for Artificial Intelligence (DFKI), Kaiserslautern, Germany
- Omid Ghamarnejad
- Department of General, Visceral, and Oncological Surgery, Klinikum Saarbrücken, Saarbrücken, Germany
- Rizky Widyaningsih
- Department of General, Visceral, and Oncological Surgery, Klinikum Saarbrücken, Saarbrücken, Germany
- Paul Lukowicz
- Department of Embedded Intelligence, German Research Center for Artificial Intelligence (DFKI), Kaiserslautern, Germany
- Department of Computer Science, RPTU Kaiserslautern-Landau, Kaiserslautern, Germany
- Jakob Karolus
- Department of Embedded Intelligence, German Research Center for Artificial Intelligence (DFKI), Kaiserslautern, Germany
- Department of Computer Science, RPTU Kaiserslautern-Landau, Kaiserslautern, Germany
- Gregor Alexander Stavrou
- Department of General, Visceral, and Oncological Surgery, Klinikum Saarbrücken, Saarbrücken, Germany
3
Abstract
INTRODUCTION During an operation, augmented reality (AR) enables surgeons to enrich their vision of the operating field by means of digital imagery, particularly as regards tumors and anatomical structures. While in some specialties this type of technology is routinely utilized, in liver surgery its applications remain limited because of the complexity of modeling organ deformations in real time. At present, numerous teams are attempting to find a solution applicable to current practice, the objective being to overcome the difficulties of intraoperative navigation in an opaque organ. OBJECTIVE To identify, itemize and analyze series reporting AR techniques tested in liver surgery, the objectives being to establish a state of the art and to indicate perspectives for the future. METHODS In compliance with the PRISMA guidelines and using the PubMed, Embase and Cochrane databases, we identified English-language articles published between January 2020 and January 2022 corresponding to the following keywords: augmented reality, hepatic surgery, liver and hepatectomy. RESULTS Initially, 102 titles, studies and summaries were preselected. Twenty-eight meeting the inclusion criteria were included, reporting on 183 patients operated with the help of AR by laparotomy (n=31) or laparoscopy (n=152). Several techniques of acquisition and visualization were reported. Anatomical precision was the main assessment criterion in 19 articles, with values ranging from 3 mm to 14 mm, followed by time of acquisition and clinical feasibility. CONCLUSION While several AR technologies are presently being developed, their clinical applications have remained limited due to insufficient anatomical precision. That much said, numerous teams are currently working toward their optimization, and it is highly likely that in the short term the application of AR in liver surgery will become more frequent and effective. As for its clinical impact, notably in oncology, it remains to be assessed.
Affiliation(s)
- B Acidi
- Department of Surgery, AP-HP hôpital Paul-Brousse, Hepato-Biliary Center, 12, avenue Paul-Vaillant Couturier, 94804 Villejuif cedex, France; Augmented Operating Room Innovation Chair (BOPA), France; Inria « Mimesis », Strasbourg, France
- M Ghallab
- Department of Surgery, AP-HP hôpital Paul-Brousse, Hepato-Biliary Center, 12, avenue Paul-Vaillant Couturier, 94804 Villejuif cedex, France; Augmented Operating Room Innovation Chair (BOPA), France
- S Cotin
- Augmented Operating Room Innovation Chair (BOPA), France; Inria « Mimesis », Strasbourg, France
- E Vibert
- Department of Surgery, AP-HP hôpital Paul-Brousse, Hepato-Biliary Center, 12, avenue Paul-Vaillant Couturier, 94804 Villejuif cedex, France; Augmented Operating Room Innovation Chair (BOPA), France; DHU Hepatinov, 94800 Villejuif, France; Inserm, Paris-Saclay University, UMRS 1193, Pathogenesis and treatment of liver diseases; FHU Hepatinov, 94800 Villejuif, France
- N Golse
- Department of Surgery, AP-HP hôpital Paul-Brousse, Hepato-Biliary Center, 12, avenue Paul-Vaillant Couturier, 94804 Villejuif cedex, France; Augmented Operating Room Innovation Chair (BOPA), France; Inria « Mimesis », Strasbourg, France; DHU Hepatinov, 94800 Villejuif, France; Inserm, Paris-Saclay University, UMRS 1193, Pathogenesis and treatment of liver diseases; FHU Hepatinov, 94800 Villejuif, France
4
Benmahdjoub M, Thabit A, van Veelen MLC, Niessen WJ, Wolvius EB, Walsum TV. Evaluation of AR visualization approaches for catheter insertion into the ventricle cavity. IEEE Trans Vis Comput Graph 2023;PP:2434-2445. [PMID: 37027733] [DOI: 10.1109/tvcg.2023.3247042]
Abstract
Augmented reality (AR) has shown potential in computer-aided surgery. It allows for the visualization of hidden anatomical structures and assists in navigating and locating surgical instruments at the surgical site. Various modalities (devices and/or visualizations) have been used in the literature, but few studies have investigated the adequacy/superiority of one modality over another. For instance, the use of optical see-through (OST) HMDs has not always been scientifically justified. Our goal is to compare various visualization modalities for catheter insertion in external ventricular drain and ventricular shunt procedures. We investigate two AR approaches: (1) 2D approaches consisting of a smartphone and a 2D window visualized through an OST HMD (Microsoft HoloLens 2), and (2) 3D approaches consisting of a fully aligned patient model and a model that is adjacent to the patient and rotationally aligned using an OST HMD. Thirty-two participants joined this study. For each visualization approach, participants were asked to perform five insertions, after which they filled in NASA-TLX and SUS forms. Moreover, the position and orientation of the needle with respect to the planning were collected during the insertion task. The results show that participants achieved significantly better insertion performance under the 3D visualizations, and the NASA-TLX and SUS forms reflected the participants' preference for these approaches over the 2D approaches.
5
Minimally invasive and invasive liver surgery based on augmented reality training: a review of the literature. J Robot Surg 2022;17:753-763. [DOI: 10.1007/s11701-022-01499-2]
6
Wahba R, Thomas MN, Bunck AC, Bruns CJ, Stippel DL. Clinical use of augmented reality, mixed reality, three-dimensional-navigation and artificial intelligence in liver surgery. Artif Intell Gastroenterol 2021;2:94-104. [DOI: 10.35712/aig.v2.i4.94]
Abstract
A precise knowledge of the intra-parenchymal vascular and biliary architecture and of the location of lesions in relation to the complex anatomy is indispensable for liver surgery. Therefore, virtual three-dimensional (3D) reconstruction models derived from computed tomography/magnetic resonance imaging scans of the liver can be helpful for visualization. Augmented reality, mixed reality and 3D navigation can transfer such 3D image data directly into the operating theater to support the surgeon. This review examines the literature on the clinical and intraoperative use of these image guidance techniques in liver surgery and provides the reader with the opportunity to learn about them. Augmented reality and mixed reality have been shown to be feasible for use in open and minimally invasive liver surgery. 3D navigation facilitated the targeting of intraparenchymal lesions. The existing data are limited to small cohorts and to descriptions of technical details, e.g., the accordance between the virtual 3D model and the real liver anatomy. Randomized controlled trials on clinical data or oncological outcome are not available. Up to now there has been no intraoperative application of artificial intelligence in liver surgery. The usability of all these sophisticated image guidance tools has still not reached the grade of immersion that would be necessary for widespread use in the daily surgical routine. Although many challenges remain, augmented reality, mixed reality, 3D navigation and artificial intelligence are emerging fields in hepato-biliary surgery.
Affiliation(s)
- Roger Wahba
- Department of General, Visceral, Cancer and Transplantation Surgery, University of Cologne, Faculty of Medicine and University Hospital Cologne, Cologne 50937, Germany
- Michael N Thomas
- Department of General, Visceral, Cancer and Transplantation Surgery, University of Cologne, Faculty of Medicine and University Hospital Cologne, Cologne 50937, Germany
- Alexander C Bunck
- Department of Diagnostic and Interventional Radiology, University of Cologne, Faculty of Medicine and University Hospital Cologne, Cologne 50937, Germany
- Christiane J Bruns
- Department of General, Visceral, Cancer and Transplantation Surgery, University of Cologne, Faculty of Medicine and University Hospital Cologne, Cologne 50937, Germany
- Dirk L Stippel
- Department of General, Visceral, Cancer and Transplantation Surgery, University of Cologne, Faculty of Medicine and University Hospital Cologne, Cologne 50937, Germany
7
Kanehira M, Okamoto T, Abe K, Yasuda J, Onda S, Futagawa Y, Ikegami T, Suzuki N, Hattori A. Development of recognised position-guided navigation system. Int J Med Robot 2021;17:e2322. [PMID: 34405536] [DOI: 10.1002/rcs.2322]
Abstract
BACKGROUND Previously, we developed an image-guided navigation system (IG-NS) incorporating augmented reality technology. Nevertheless, the system could only aid the operator by presenting imagery and fell short of the goal of a true navigation system. Therefore, we developed a recognised position-guided navigation system (RP-NS) and herein report its functionality and usefulness in a phantom model with a view to clinical application. METHODS We built the RP-NS by adding to the IG-NS positional recognition and instruction functions that issue cautions by displaying images on the monitor together with voice guidance. We evaluated the accuracy of the positional recognition and instruction functions using a phantom model. By chronologically recording the tip position of the surgical instrument, the surgical precision of the operators was assessed. Finally, the feasibility of improving surgical precision with this system was evaluated. RESULTS The RP-NS recognised positions with an error of 2.7 mm. Following the system's instructions, surgeons could perform partial hepatectomies with a mean error of 7.5% relative to the calculated volume. Improvements in surgical precision were obtained for surgeons of different levels of experience. CONCLUSIONS The RP-NS was highly effective as a navigation system owing to its precise positional recognition and adequate instruction functions. These results indicate that the system may compensate for differences in proficiency, numerically evaluate surgical skills and analyse surgeons' tendencies.
Affiliation(s)
- Masaru Kanehira
- Department of Surgery, The Jikei University Daisan Hospital, Komae, Japan
- Tomoyoshi Okamoto
- Department of Surgery, The Jikei University Daisan Hospital, Komae, Japan
- Kyohei Abe
- Department of Surgery, The Jikei University Daisan Hospital, Komae, Japan
- Jungo Yasuda
- Department of Surgery, The Jikei University School of Medicine, Minato-ku, Japan
- Shinji Onda
- Department of Surgery, The Jikei University School of Medicine, Minato-ku, Japan
- Yasuro Futagawa
- Department of Surgery, The Jikei University Daisan Hospital, Komae, Japan
- Toru Ikegami
- Department of Surgery, The Jikei University School of Medicine, Minato-ku, Japan
- Naoki Suzuki
- Institute for High Dimensional Medical Imaging, The Jikei University School of Medicine, Minato-ku, Japan
- Asaki Hattori
- Institute for High Dimensional Medical Imaging, The Jikei University School of Medicine, Minato-ku, Japan
8
Semenkov AV, Subbot VS. [Systematic review of current trends in preoperative planning of surgery for liver tumors]. Khirurgiia (Mosk) 2021:84-97. [PMID: 34363450] [DOI: 10.17116/hirurgia202108184]
Abstract
The purpose of the study was to conduct a systematic review of current trends in preoperative planning of surgery for liver tumors. These data will be valuable for determining the advantages and disadvantages of 3D modeling, augmented reality technology and 3D printing in the preoperative planning of surgery for focal liver lesions.
Affiliation(s)
- A V Semenkov
- Sklifosovsky Institute for Emergency Care, Moscow, Russia; Sechenov First Moscow State Medical University, Moscow, Russia
- V S Subbot
- Sklifosovsky Institute for Emergency Care, Moscow, Russia; Sechenov First Moscow State Medical University, Moscow, Russia
9
Negrillo-Cárdenas J, Jiménez-Pérez JR, Feito FR. The role of virtual and augmented reality in orthopedic trauma surgery: From diagnosis to rehabilitation. Comput Methods Programs Biomed 2020;191:105407. [PMID: 32120088] [DOI: 10.1016/j.cmpb.2020.105407]
Abstract
Virtual and augmented reality have been used to assist and improve human capabilities in many fields. The most recent advances allow the use of these technologies for personal and professional purposes. In particular, they have been progressively introduced into many medical procedures since the last century. Thanks to immersive training systems and a better comprehension of the ongoing procedure, their main objectives are to increase patient safety and decrease recovery time. The current and future possibilities of virtual and augmented reality in the context of bone fracture reduction are the main focus of this review. This medical procedure requires meticulous planning and, in many cases, a complex intervention, making it a promising candidate to benefit from this kind of technology. In this paper, we exhaustively analyze the impact of virtual and augmented reality on bone fracture healing, detailing each task from diagnosis to rehabilitation. Our primary goal is to introduce novel researchers to current trends applied to orthopedic trauma surgery, proposing new lines of research. To that end, we propose and evaluate a set of qualitative metrics to highlight the most promising challenges of virtual and augmented reality technologies in this context.
10
The Miami International Evidence-based Guidelines on Minimally Invasive Pancreas Resection. Ann Surg 2020;271:1-14. [PMID: 31567509] [DOI: 10.1097/sla.0000000000003590]
Abstract
OBJECTIVE The aim of this study was to develop and externally validate the first evidence-based guidelines on minimally invasive pancreas resection (MIPR) before and during the International Evidence-based Guidelines on Minimally Invasive Pancreas Resection (IG-MIPR) meeting in Miami (March 2019). SUMMARY BACKGROUND DATA MIPR has seen rapid development in the past decade. Promising outcomes have been reported by early adopters from high-volume centers. Subsequently, multicenter series as well as randomized controlled trials were reported; however, guidelines for clinical practice were lacking. METHODS The Scottish Intercollegiate Guidelines Network (SIGN) methodology was used, incorporating these 4 items: systematic reviews using PubMed, Embase, and Cochrane databases to answer clinical questions, whenever possible in PICO style; the GRADE approach for assessment of the quality of evidence; the Delphi method for establishing consensus on the developed recommendations; and the AGREE-II instrument for the assessment of guideline quality and external validation. The current guidelines are cosponsored by the International Hepato-Pancreato-Biliary Association, the Americas Hepato-Pancreato-Biliary Association, the Asian-Pacific Hepato-Pancreato-Biliary Association, the European-African Hepato-Pancreato-Biliary Association, the European Association for Endoscopic Surgery, Pancreas Club, the Society of American Gastrointestinal and Endoscopic Surgery, the Society for Surgery of the Alimentary Tract, and the Society of Surgical Oncology. RESULTS After screening 16,069 titles, 694 studies were reviewed, and 291 were included. The final 28 recommendations covered 6 topics: laparoscopic and robotic distal pancreatectomy, central pancreatectomy, and pancreatoduodenectomy, as well as patient selection, training, learning curve, and the minimal annual center volume required to obtain optimal outcomes and patient safety. CONCLUSION The IG-MIPR, using SIGN methodology, gives guidance to surgeons, hospital administrators, patients, and medical societies on the use and outcomes of MIPR as well as the approach to be taken regarding this challenging type of surgery.
11
Luo H, Yin D, Zhang S, Xiao D, He B, Meng F, Zhang Y, Cai W, He S, Zhang W, Hu Q, Guo H, Liang S, Zhou S, Liu S, Sun L, Guo X, Fang C, Liu L, Jia F. Augmented reality navigation for liver resection with a stereoscopic laparoscope. Comput Methods Programs Biomed 2020;187:105099. [PMID: 31601442] [DOI: 10.1016/j.cmpb.2019.105099]
Abstract
OBJECTIVE Understanding the three-dimensional (3D) spatial position and orientation of vessels and tumor(s) is vital in laparoscopic liver resection procedures. Augmented reality (AR) techniques can help surgeons see the patient's internal anatomy in conjunction with laparoscopic video images. METHOD In this paper, we present an AR-assisted navigation system for liver resection based on a rigid stereoscopic laparoscope. The stereo image pairs from the laparoscope are used by an unsupervised convolutional network (CNN) framework to estimate depth and generate an intraoperative 3D liver surface. Meanwhile, 3D models of the patient's surgical field are segmented from preoperative CT images using a V-Net architecture for volumetric image data in an end-to-end predictive style. A globally optimal iterative closest point (Go-ICP) algorithm is adopted to register the pre- and intraoperative models into a unified coordinate space; then, the preoperative 3D models are superimposed on the live laparoscopic images to provide the surgeon with detailed information about the subsurface of the patient's anatomy, including tumors, their resection margins and vessels. RESULTS The proposed navigation system was tested in four laboratory ex vivo porcine liver experiments and five in vivo porcine experiments in the operating theatre to validate its accuracy. The ex vivo and in vivo reprojection errors (RPE) were 6.04 ± 1.85 mm and 8.73 ± 2.43 mm, respectively. CONCLUSION AND SIGNIFICANCE Both the qualitative and quantitative results indicate that our AR-assisted navigation system shows promise and has the potential to be highly useful in clinical practice.
Affiliation(s)
- Huoling Luo
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Dalong Yin
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China; Department of Hepatobiliary Surgery, Shengli Hospital Affiliated to University of Science and Technology of China, Hefei, China
- Shugeng Zhang
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China; Department of Hepatobiliary Surgery, Shengli Hospital Affiliated to University of Science and Technology of China, Hefei, China
- Deqiang Xiao
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Baochun He
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Fanzheng Meng
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Yanfang Zhang
- Department of Interventional Radiology, Shenzhen People's Hospital, Shenzhen, China
- Wei Cai
- Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China
- Shenghao He
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Wenyu Zhang
- Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China
- Qingmao Hu
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China
- Hongrui Guo
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Shuhang Liang
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Shuo Zhou
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Shuxun Liu
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Linmao Sun
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Xiao Guo
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China
- Chihua Fang
- Department of Hepatobiliary Surgery, Zhujiang Hospital, Southern Medical University, Guangzhou, China
- Lianxin Liu
- Department of Hepatobiliary Surgery, First Affiliated Hospital of Harbin Medical University, Harbin, China; Department of Hepatobiliary Surgery, Shengli Hospital Affiliated to University of Science and Technology of China, Hefei, China
- Fucang Jia
- Research Lab for Medical Imaging and Digital Surgery, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen, China