1
Wang Y, Wei S, Zuo R, Kam M, Opfermann JD, Sunmola I, Hsieh MH, Krieger A, Kang JU. Automatic and real-time tissue sensing for autonomous intestinal anastomosis using hybrid MLP-DC-CNN classifier-based optical coherence tomography. Biomedical Optics Express 2024; 15:2543-2560. [PMID: 38633079; PMCID: PMC11019703; DOI: 10.1364/boe.521652] [Received: 02/15/2024] [Revised: 03/18/2024] [Accepted: 03/18/2024]
Abstract
Anastomosis is a common and critical part of reconstructive procedures in gastrointestinal, urologic, and gynecologic surgery. Autonomous surgical robots such as the smart tissue autonomous robot (STAR) system have demonstrated improved efficiency and consistency in laparoscopic small bowel anastomosis over the current da Vinci surgical system. However, the STAR workflow requires auxiliary manual monitoring during the suturing procedure to avoid missed or wrong stitches. To eliminate this monitoring task for the operators, we integrated an optical coherence tomography (OCT) fiber sensor with the suture tool and developed an automatic tissue classification algorithm that detects missed or wrong stitches in real time. The classification results were updated and sent to the control loop of the STAR robot in real time. The suture tool was guided to the target by a dual-camera system; if the tissue inside the tool jaw was inconsistent with the desired suture pattern, a warning message was generated. The proposed hybrid multilayer perceptron dual-channel convolutional neural network (MLP-DC-CNN) classification platform can automatically classify eight abdominal tissue types that require different suture strategies for anastomosis. The MLP branch uses numerous handcrafted features (∼1,955), including optical properties and morphological features of one-dimensional (1D) OCT A-line signals; the DC-CNN branch fully exploits intensity-based features and depth-resolved tissue attenuation coefficients. A decision fusion technique leverages the information collected from both classifiers to further increase accuracy. Evaluated on 69,773 test A-lines, the model classified the 1D OCT signals of small bowel in real time with an accuracy of 90.06%, a precision of 88.34%, and a sensitivity of 87.29%.
The refresh rate of the displayed A-line signals was set to 300 Hz, the maximum sensing depth of the fiber was 3.6 mm, and the running time of the image processing algorithm was ∼1.56 s for 1,024 A-lines. The proposed fully automated tissue sensing model outperformed single CNN, MLP, or SVM classifiers with optimized architectures, showing the complementarity of different feature sets and network architectures in classifying intestinal OCT A-line signals. It can potentially reduce manual involvement in robotic laparoscopic surgery, a crucial step towards a fully autonomous STAR system.
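The decision-fusion step described above can be illustrated as a weighted average of the two classifiers' per-class probability vectors, followed by an argmax. This is a minimal sketch of that generic technique; the weights, class count, and function name are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def fuse_decisions(p_mlp, p_cnn, w_mlp=0.5, w_cnn=0.5):
    """Fuse two classifiers' class-probability vectors by weighted averaging.

    p_mlp, p_cnn: per-class probabilities (e.g. softmax outputs) for one A-line.
    Returns the fused class index and the fused probability vector.
    """
    fused = w_mlp * np.asarray(p_mlp, float) + w_cnn * np.asarray(p_cnn, float)
    return int(np.argmax(fused)), fused

# Illustrative 3-class example: the MLP favours class 2, the CNN favours
# class 0; with these weights the fused decision follows the MLP.
label, fused = fuse_decisions([0.1, 0.2, 0.7], [0.5, 0.1, 0.4], w_mlp=0.6, w_cnn=0.4)
```

Fusing calibrated probabilities rather than hard labels lets a confident classifier outvote an uncertain one, which is one way complementary feature sets can raise joint accuracy.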
Affiliation(s)
- Yaning Wang
- Department of Electrical and Computer Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Shuwen Wei
- Department of Electrical and Computer Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Ruizhi Zuo
- Department of Electrical and Computer Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Michael Kam
- Department of Mechanical Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Justin D. Opfermann
- Department of Mechanical Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Idris Sunmola
- Department of Mechanical Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Michael H. Hsieh
- Division of Urology, Children’s National Hospital, 111 Michigan Ave NW, Washington, D.C. 20010, USA
- Axel Krieger
- Department of Mechanical Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
- Jin U. Kang
- Department of Electrical and Computer Engineering, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218, USA
2
Ge J, Kam M, Opfermann JD, Saeidi H, Leonard S, Mady LJ, Schnermann MJ, Krieger A. Autonomous System for Tumor Resection (ASTR) - Dual-Arm Robotic Midline Partial Glossectomy. IEEE Robot Autom Lett 2024; 9:1166-1173. [PMID: 38292408; PMCID: PMC10824540; DOI: 10.1109/lra.2023.3341773]
Abstract
Head and neck cancers are the seventh most common cancers worldwide, with squamous cell carcinoma being the most prevalent histologic subtype. Surgical resection is a primary treatment modality for many patients with head and neck squamous cell carcinoma, and accurately identifying tumor boundaries and ensuring sufficient resection margins are critical for optimizing oncologic outcomes. This study presents an innovative autonomous system for tumor resection (ASTR) and conducts a feasibility study performing supervised autonomous midline partial glossectomy of a pseudotumor with millimeter accuracy. The proposed ASTR system consists of a dual-camera vision system, an electrosurgical instrument, a newly developed vacuum grasping instrument, two 6-DOF manipulators, and a novel autonomous control system. The letter introduces an ontology-based research framework for creating and implementing a complex autonomous surgical workflow, using the glossectomy as a case study. Porcine tongue tissues are used in this study, marked with color inks and near-infrared fluorescent (NIRF) markers to indicate the pseudotumor. ASTR actively monitors the NIRF markers and gathers spatial and color data from the samples, enabling planning and execution of robot trajectories in accordance with the proposed glossectomy workflow. The system successfully performs six consecutive supervised autonomous pseudotumor resections on porcine specimens. The average surface and depth resection errors measure 0.73±0.60 mm and 1.89±0.54 mm, respectively, with no positive tumor margins detected in any of the six resections. The resection accuracy is demonstrated to be on par with manual pseudotumor glossectomy performed by an experienced otolaryngologist.
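A surface resection error of the kind reported above is commonly computed as the mean nearest-neighbour distance between the achieved cut contour and the planned one. The following minimal numpy sketch shows that generic idea; the point sets, units, and function name are assumptions for illustration, not ASTR's actual evaluation code.

```python
import numpy as np

def mean_contour_error(planned_pts, achieved_pts):
    """Mean nearest-neighbour distance (same units as the input points)
    from each achieved cut point to the planned resection contour."""
    planned = np.asarray(planned_pts, float)
    achieved = np.asarray(achieved_pts, float)
    # (n_achieved, n_planned) matrix of pairwise Euclidean distances
    d = np.linalg.norm(achieved[:, None, :] - planned[None, :, :], axis=-1)
    return float(d.min(axis=1).mean())

# Toy contours in mm: one achieved point deviates by 0.1 mm.
planned = [[0, 0], [1, 0], [1, 1], [0, 1]]
achieved = [[0.1, 0], [1, 0], [1, 1], [0, 1]]
err = mean_contour_error(planned, achieved)  # 0.025 (mm)
```

Averaging per-point deviations over a whole resection, and over repeated trials, yields summary figures such as the 0.73±0.60 mm surface error reported above.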
Affiliation(s)
- Jiawei Ge
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Michael Kam
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Justin D Opfermann
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Hamed Saeidi
- Department of Computer Science, University of North Carolina Wilmington, Wilmington, NC 28403, USA
- Simon Leonard
- Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Leila J Mady
- Department of Otolaryngology - Head and Neck Surgery, Johns Hopkins School of Medicine, Johns Hopkins University, Baltimore, MD 21287, USA
- Martin J Schnermann
- Chemical Biology Laboratory, Center for Cancer Research, National Cancer Institute, National Institutes of Health, Frederick, MD 21702, USA
- Axel Krieger
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
3
Fan Y, Liu S, Gao E, Guo R, Dong G, Li Y, Gao T, Tang X, Liao H. The LMIT: Light-mediated minimally-invasive theranostics in oncology. Theranostics 2024; 14:341-362. [PMID: 38164160; PMCID: PMC10750201; DOI: 10.7150/thno.87783] [Received: 07/04/2023] [Accepted: 10/18/2023]
Abstract
Minimally-invasive diagnosis and therapy have gradually become the trend and a research hotspot of current medical applications. The integration of intraoperative diagnosis and treatment, so-called minimally-invasive theranostics (MIT), is an important development direction for real-time detection and minimally-invasive diagnosis and therapy, aiming to reduce mortality and improve patients' quality of life. Light is an important theranostic tool for the treatment of cancerous tissues. Light-mediated minimally-invasive theranostics (LMIT) is a novel evolutionary technology that integrates diagnosis and therapeutics for the less invasive treatment of diseased tissues. Intelligent theranostics would promote precision surgery based on the optical characterization of cancerous tissues. Furthermore, MIT also requires the assistance of smart medical devices or robots, and optical multimodality lays a solid foundation for intelligent MIT. In this review, we summarize the state of the art of optical MIT, or LMIT, in oncology. Multimodal optical image-guided intelligent treatment is another focus. Intraoperative imaging and real-time analysis-guided optical treatment are also systematically discussed. Finally, the potential challenges and future perspectives of intelligent optical MIT are discussed.
Affiliation(s)
- Yingwei Fan
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Shuai Liu
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Enze Gao
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Rui Guo
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Guozhao Dong
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Yangxi Li
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, China
- Tianxin Gao
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Xiaoying Tang
- School of Medical Technology, Beijing Institute of Technology, Beijing 100081, China
- Hongen Liao
- Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, China
4
Sone K, Tanimoto S, Toyohara Y, Taguchi A, Miyamoto Y, Mori M, Iriyama T, Wada-Hiraike O, Osuga Y. Evolution of a surgical system using deep learning in minimally invasive surgery (Review). Biomed Rep 2023; 19:45. [PMID: 37324165; PMCID: PMC10265572; DOI: 10.3892/br.2023.1628] [Received: 09/05/2022] [Accepted: 03/31/2023]
Abstract
Recently, artificial intelligence (AI) has been applied in various fields owing to the development of new learning methods, such as deep learning, and the marked progress in computational processing speed. AI is also being applied in the medical field for medical image recognition and omics analysis of genomes and other data. AI applications for videos of minimally invasive surgeries have also advanced, and studies on such applications are increasing. In the present review, studies that focused on the following topics were selected: i) organ and anatomy identification, ii) instrument identification, iii) procedure and surgical phase recognition, iv) surgery-time prediction, v) identification of an appropriate incision line, and vi) surgical education. The development of autonomous surgical robots is also progressing, with the Smart Tissue Autonomous Robot (STAR) and RAVEN systems being the most reported developments. STAR, in particular, recognizes the surgical site from laparoscopic images and is in the process of establishing an automated suturing system, albeit in animal experiments. The present review examines the possibility of fully autonomous surgical robots in the future.
Affiliation(s)
- Kenbun Sone
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Saki Tanimoto
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Yusuke Toyohara
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Ayumi Taguchi
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Yuichiro Miyamoto
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Mayuyo Mori
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Takayuki Iriyama
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Osamu Wada-Hiraike
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
- Yutaka Osuga
- Department of Obstetrics and Gynecology, Faculty of Medicine, The University of Tokyo, Tokyo 113-8655, Japan
5
Han J, Davids J, Ashrafian H, Darzi A, Elson DS, Sodergren M. A systematic review of robotic surgery: From supervised paradigms to fully autonomous robotic approaches. Int J Med Robot 2022; 18:e2358. [PMID: 34953033; DOI: 10.1002/rcs.2358] [Received: 09/24/2021] [Revised: 11/23/2021] [Accepted: 12/21/2021]
Abstract
BACKGROUND From traditional open surgery to laparoscopic surgery and robot-assisted surgery, advances in robotics, machine learning, and imaging are pushing the surgical approach towards better clinical outcomes. Pre-clinical and clinical evidence suggests that automation may standardise techniques, increase efficiency, and reduce clinical complications. METHODS A PRISMA-guided search was conducted across PubMed and OVID. RESULTS Of the 89 screened articles, 51 met the inclusion criteria, with 10 included in the final review. Automatic data segmentation, trajectory planning, intra-operative registration, trajectory drilling, and soft tissue robotic surgery were discussed. CONCLUSION Although automated surgical systems remain conceptual, several research groups have developed supervised autonomous robotic surgical systems with increasing consideration for the ethico-legal issues of automation. Automation paves the way for precision surgery and improved safety and opens new possibilities for deploying more robust artificial intelligence models, better imaging modalities, and robotics to improve clinical outcomes.
Affiliation(s)
- Jinpei Han
- Hamlyn Centre for Robotic Surgery and Artificial Intelligence, Imperial College London, London, UK
- Joseph Davids
- Hamlyn Centre for Robotic Surgery and Artificial Intelligence, Imperial College London, London, UK; National Hospital for Neurology and Neurosurgery, London, UK
- Hutan Ashrafian
- Hamlyn Centre for Robotic Surgery and Artificial Intelligence, Imperial College London, London, UK
- Ara Darzi
- Hamlyn Centre for Robotic Surgery and Artificial Intelligence, Imperial College London, London, UK
- Daniel S Elson
- Hamlyn Centre for Robotic Surgery and Artificial Intelligence, Imperial College London, London, UK
- Mikael Sodergren
- Hamlyn Centre for Robotic Surgery and Artificial Intelligence, Imperial College London, London, UK
6
Saeidi H, Opfermann JD, Kam M, Wei S, Leonard S, Hsieh MH, Kang JU, Krieger A. Autonomous robotic laparoscopic surgery for intestinal anastomosis. Sci Robot 2022; 7:eabj2908. [PMID: 35080901; DOI: 10.1126/scirobotics.abj2908]
Abstract
Autonomous robotic surgery has the potential to provide efficacy, safety, and consistency independent of an individual surgeon's skill and experience. Autonomous anastomosis is a challenging soft-tissue surgery task because it requires intricate imaging, tissue tracking, and surgical planning techniques, as well as precise execution via highly adaptable control strategies, often in unstructured and deformable environments. In the laparoscopic setting, such surgeries are even more challenging because of the need for high maneuverability and repeatability under motion and vision constraints. Here we describe an enhanced autonomous strategy for laparoscopic soft tissue surgery and demonstrate robotic laparoscopic small bowel anastomosis in phantom and in vivo intestinal tissues. This enhanced autonomous strategy allows the operator to select among autonomously generated surgical plans, and the robot executes a wide range of tasks independently. We then use our enhanced autonomous strategy to perform in vivo autonomous robotic laparoscopic surgery for intestinal anastomosis on porcine models over a 1-week survival period. We compared the anastomosis quality criteria, including needle placement corrections, suture spacing, suture bite size, completion time, lumen patency, and leak pressure, of the developed autonomous system, manual laparoscopic surgery, and robot-assisted surgery (RAS). Data from a phantom model indicate that our system outperforms expert surgeons' manual technique and RAS technique in terms of consistency and accuracy; this was also replicated in the in vivo model. These results demonstrate that surgical robots exhibiting high levels of autonomy have the potential to improve consistency, patient outcomes, and access to a standard surgical technique.
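One of the quality criteria compared above, suture spacing consistency, can be quantified as the coefficient of variation of the gaps between consecutive stitches, where lower values indicate more uniform spacing. This is a generic illustrative metric with toy data, not the study's actual analysis code.

```python
import numpy as np

def spacing_cv(suture_positions_mm):
    """Coefficient of variation (std/mean) of inter-suture gaps.
    Lower values mean more uniform suture spacing."""
    pos = np.sort(np.asarray(suture_positions_mm, float))
    gaps = np.diff(pos)
    return float(np.std(gaps) / np.mean(gaps))

# Toy stitch positions along the anastomosis line (mm):
robotic = [0.0, 5.0, 10.1, 14.9, 20.0]  # near-uniform ~5 mm spacing
manual = [0.0, 3.5, 10.0, 13.0, 20.0]   # more variable spacing
```

Normalizing by the mean gap makes the metric scale-free, so anastomoses of different lengths or stitch counts can be compared on consistency alone.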
Affiliation(s)
- H Saeidi
- Department of Computer Science, University of North Carolina Wilmington, Wilmington, NC 28403, USA; Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- J D Opfermann
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA; Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21211, USA
- M Kam
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA; Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21211, USA
- S Wei
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21211, USA; Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- S Leonard
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21211, USA
- M H Hsieh
- Department of Urology, Children's National Hospital, 111 Michigan Ave. N.W., Washington, DC 20010, USA
- J U Kang
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21211, USA; Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- A Krieger
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA; Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21211, USA
7
Ge J, Saeidi H, Kam M, Opfermann J, Krieger A. Supervised Autonomous Electrosurgery for Soft Tissue Resection. Proceedings of the IEEE International Symposium on Bioinformatics and Bioengineering 2021; 2021:10.1109/bibe52308.2021.9635563. [PMID: 38533465; PMCID: PMC10965307; DOI: 10.1109/bibe52308.2021.9635563]
Abstract
Surgical resection is the current clinical standard of care for treating squamous cell carcinoma. Maintaining an adequate tumor resection margin is key to a good surgical outcome, but tumor edge delineation errors are inevitable in manual surgery due to difficulty in visualization and hand-eye coordination. Surgical automation is a growing field of robotics that aims to relieve surgeons' burden and to achieve a consistent and potentially better surgical outcome. This paper reports a novel robotic supervised autonomous electrosurgery technique for soft tissue resection achieving millimeter accuracy. The tumor resection procedure is decomposed to the subtask level for more direct understanding and automation. A 4-DOF suction system is developed and integrated with a 6-DOF electrocautery robot to perform resection experiments. A novel near-infrared fluorescent marker is manually dispensed on cadaver samples to define a pseudotumor and intraoperatively tracked using a dual-camera system. The autonomous dual-robot resection cooperation workflow is proposed and evaluated in this study. The integrated system autonomously localizes the pseudotumor by tracking the near-infrared marker and performs supervised autonomous resection in cadaver porcine tongues (N=3). All three pseudotumors were successfully removed from the porcine samples. The average surface and depth resection errors are 1.19 mm and 1.83 mm, respectively. This work is an essential step towards autonomous tumor resections.
Affiliation(s)
- Jiawei Ge
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Hamed Saeidi
- Department of Computer Science, University of North Carolina Wilmington, Wilmington, NC, USA
- Michael Kam
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Justin Opfermann
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Axel Krieger
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
8
Su B, Yu S, Li X, Gong Y, Li H, Ren Z, Xia Y, Wang H, Zhang Y, Yao W, Wang J, Tang J. Autonomous Robot for Removing Superficial Traumatic Blood. IEEE Journal of Translational Engineering in Health and Medicine 2021; 9:2600109. [PMID: 33598368; PMCID: PMC7880304; DOI: 10.1109/jtehm.2021.3056618] [Received: 11/16/2020] [Revised: 01/16/2021] [Accepted: 01/29/2021]
Abstract
Objective: Removing blood from an incision and locating the bleeding spot is a key task during surgery; excessive discharge of blood endangers the patient's life. However, repetitive manual blood removal adds considerable workload and contributes to surgeon fatigue. It is therefore valuable to design a robotic system that can automatically remove blood from the incision surface. Methods: In this paper, we design a robotic system to fulfill the surgical task of blood removal. The system consists of a dual-camera pair, a 6-DoF robotic arm, an aspirator whose handle is fixed to the robotic arm, and a pump connected to the aspirator. Further, a path-planning algorithm generates the path the aspirator tip should follow to remove blood. Results: In a group of simulated bleeding experiments on ex vivo porcine tissue, the contour of the blood region is detected and the spatial coordinates of the detected contour are reconstructed; the blood removal robot (BRR) thoroughly cleans the blood running out of the incision. Conclusions: This study contributes the first result on designing an autonomous blood removal medical robot. The surgical blood removal skill, which is performed manually by surgeons nowadays, can alternatively be carried out by the proposed BRR medical robot.
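The path-planning step described in this abstract can be illustrated with a minimal serpentine (boustrophedon) sweep over a binary mask of the detected blood region, a standard coverage-path pattern. The grid representation, traversal order, and function name are assumptions for illustration, not the authors' actual algorithm.

```python
import numpy as np

def suction_path(blood_mask):
    """Serpentine (boustrophedon) waypoint list over the True cells of a
    2-D blood mask: sweep each row, reversing direction on alternate rows,
    so the aspirator tip covers the region with short moves."""
    path = []
    for r in range(blood_mask.shape[0]):
        cols = np.flatnonzero(blood_mask[r])
        if cols.size == 0:
            continue  # no blood detected in this row
        if r % 2:
            cols = cols[::-1]  # reverse every other row
        path.extend((r, int(c)) for c in cols)
    return path

# Toy 3x3 blood region covering the whole grid:
mask = np.ones((3, 3), dtype=bool)
waypoints = suction_path(mask)
```

Each (row, col) waypoint would be mapped to a 3-D tip pose via the reconstructed contour coordinates before being sent to the arm; the serpentine order minimizes back-and-forth travel across the region.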
Affiliation(s)
- Baiquan Su
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Shi Yu
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Xintong Li
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Yi Gong
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Han Li
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Zifeng Ren
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Yijing Xia
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- He Wang
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Yucheng Zhang
- Medical Robotics Laboratory, School of Automation, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Wei Yao
- Department of Gastroenterology, Peking University Third Hospital, Beijing 100191, China
- Junchen Wang
- School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing 100086, China
- Jie Tang
- Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing 100053, China