1. Wei S, Kam M, Wang Y, Opfermann JD, Saeidi H, Hsieh MH, Krieger A, Kang JU. Deep point cloud landmark localization for fringe projection profilometry. Journal of the Optical Society of America A: Optics, Image Science, and Vision 2022; 39:655-661. [PMID: 35471389] [DOI: 10.1364/josaa.450225]
Abstract
Point clouds are widely used because they carry richer information than 2D images. Fringe projection profilometry (FPP) is a camera-based point cloud acquisition technique being developed as a vision system for robotic surgery. For semi-autonomous robotic suturing, fluorescent fiducials were previously placed on the target tissue as suture landmarks; this not only increases system complexity but also raises safety concerns. To address these problems, we propose a numerical landmark localization algorithm based on a convolutional neural network (CNN) and a conditional random field (CRF). The CNN regresses landmark heatmaps from the four-channel image data generated by the FPP system. The CRF, which leverages both local and global shape constraints, then refines the landmark coordinates, rejects spurious landmarks, and recovers missing ones. The robustness of the proposed method is demonstrated through ex vivo landmark localization experiments on porcine intestine.
2. Lucas Y, Niri R, Treuillet S, Douzi H, Castaneda B. Wound Size Imaging: Ready for Smart Assessment and Monitoring. Advances in Wound Care (New Rochelle) 2021; 10:641-661. [PMID: 32320356] [PMCID: PMC8392100] [DOI: 10.1089/wound.2018.0937]
Abstract
Significance: We introduce and evaluate emerging devices and modalities for wound size imaging, as well as promising image processing tools for smart wound assessment and monitoring. Recent Advances: Some commercial devices are available for optical wound assessment, but with limited capabilities compared to multimodal imaging. With new low-cost devices and machine learning, wound assessment has become more robust and accurate. Wound size imaging provides not only area and volume but also the proportion of each tissue type on the wound bed. Near-infrared and thermal spectral bands further enhance classical visual assessment. Critical Issues: The ability to embed advanced imaging technology in portable devices such as smartphones and tablets, together with tissue analysis software, will significantly improve wound care. Because wound care and measurement are performed by nurses, the equipment needs to remain user-friendly, enable quick measurements, provide advanced monitoring, and connect to the patient data management system. Future Directions: By combining several imaging modalities and machine learning, optical wound assessment will become smart enough to enable true wound monitoring, give clinicians relevant indications for adapting treatments, and improve healing rates and speed. Sharing the wound care histories of many patients in databases and through telemedicine practice could yield a better understanding of the healing process, and thus greater efficiency, once the recorded clinical experience has been converted into knowledge through deep learning.
Affiliation(s)
- Yves Lucas
- PRISME Laboratory, Orléans University, Orléans, France
- Rania Niri
- PRISME Laboratory, Orléans University, Orléans, France
- IRF-SIC Laboratory, Ibn Zohr University, Agadir, Morocco
- Hassan Douzi
- IRF-SIC Laboratory, Ibn Zohr University, Agadir, Morocco
- Benjamin Castaneda
- Laboratorio de Imagenes Medicas, Pontificia Universidad Catholica del Peru, Lima, Peru
3. Leonard S, Opfermann J, Uebele N, Carroll L, Walter R, Bayne C, Ge J, Krieger A. Vaginal Cuff Closure With Dual-Arm Robot and Near-Infrared Fluorescent Sutures. IEEE Transactions on Medical Robotics and Bionics 2021; 3:762-772. [PMID: 36970042] [PMCID: PMC10038549] [DOI: 10.1109/tmrb.2021.3097415]
Abstract
This paper presents a dual-arm suturing robot. We extend the Smart Tissue Autonomous Robot (STAR) with a second manipulator whose purpose is to manage loose suture thread, a task previously performed by a human assistant. We also introduce novel near-infrared fluorescent (NIRF) sutures that are automatically segmented and delimit the boundaries of the suturing task. In ex vivo experiments on porcine models, our results demonstrate that the new system outperforms human surgeons in all but one metric for the task of vaginal cuff closure and is more consistent in every aspect of the task. We also present results demonstrating that the system can perform a vaginal cuff closure in an in vivo porcine experiment.
Affiliation(s)
- Simon Leonard
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Justin Opfermann
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Lydia Carroll
- Rotary Mission Systems, Lockheed Martin, Mount Laurel, NJ, USA
- Jiawei Ge
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Axel Krieger
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
4. Kam M, Saeidi H, Hsieh MH, Kang JU, Krieger A. A Confidence-Based Supervised-Autonomous Control Strategy for Robotic Vaginal Cuff Closure. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA) 2021. [PMID: 34840856] [PMCID: PMC8612028] [DOI: 10.1109/icra48506.2021.9561685]
Abstract
Autonomous robotic suturing has the potential to improve surgical outcomes by offering greater accuracy, repeatability, and consistency than manual operation. However, achieving full autonomy in complex surgical environments is not yet practical, and human supervision is required to guarantee safety. In this paper, we develop a confidence-based supervised-autonomous suturing method in which the Smart Tissue Autonomous Robot (STAR) and a surgeon perform suturing tasks collaboratively with the highest possible degree of autonomy. With the proposed method, STAR sutures autonomously when highly confident and otherwise asks the operator for assistance with suture positioning adjustments. We evaluate the accuracy of the proposed control method in robotic suturing tests on synthetic vaginal cuff tissue and compare the results to vaginal cuff closures performed by an experienced surgeon. Our results indicate that, using the proposed confidence-based method, STAR can predict the success of purely autonomous suture placement with an accuracy of 94.74%. Moreover, with human intervention on an additional 25% of sutures, STAR achieves 98.1% suture placement accuracy, compared to 85.4% for completely autonomous robotic suturing. Finally, our experimental results indicate that STAR using the proposed method achieves 1.6 times better consistency in suture spacing and 1.8 times better consistency in suture bite size than the manual results.
Affiliation(s)
- Michael Kam
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Hamed Saeidi
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Michael H Hsieh
- Department of Urology, Children's National Hospital, 111 Michigan Ave. N.W., Washington, DC 20010, USA
- J U Kang
- Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
- Axel Krieger
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD 21211, USA
5. Novel Multimodal, Multiscale Imaging System with Augmented Reality. Diagnostics (Basel) 2021; 11:441. [PMID: 33806547] [PMCID: PMC7999725] [DOI: 10.3390/diagnostics11030441]
Abstract
A novel multimodal, multiscale imaging system with augmented reality capability was developed and characterized. The system offers 3D color reflectance imaging, 3D fluorescence imaging, and augmented reality in real time. Multiscale fluorescence imaging was enabled by developing and integrating an in vivo fiber-optic microscope. Real-time ultrasound-fluorescence multimodal imaging used optically tracked fiducial markers for registration, and tomographic data were incorporated using the same markers. Furthermore, we characterized system performance and registration accuracy in a benchtop setting. Multiscale fluorescence imaging facilitated assessing the functional status of tissues, extending the minimum resolution of fluorescence imaging to ~17.5 µm. The system achieved a mean target registration error of less than 2 mm for registering fluorescence images to ultrasound images and an MRI-based 3D model, which is within the clinically acceptable range. The low latency and high frame rate of the prototype system show the promise of applying the reported techniques in clinically relevant settings in the future.
6. Saeidi H, Ge J, Kam M, Opfermann JD, Leonard S, Joshi AS, Krieger A. Supervised Autonomous Electrosurgery via Biocompatible Near-Infrared Tissue Tracking Techniques. IEEE Transactions on Medical Robotics and Bionics 2019; 1:228-236. [PMID: 33458603] [PMCID: PMC7810241] [DOI: 10.1109/tmrb.2019.2949870]
Abstract
Autonomous robotic surgery systems aim to improve patient outcomes by leveraging the repeatability and consistency of automation while reducing human-induced errors. However, intraoperative autonomous soft tissue tracking and robot control remain challenging because such tissues lack structure and are highly deformable. In this paper, we take advantage of biocompatible near-infrared (NIR) marking methods and develop a supervised autonomous 3D path planning, filtering, and control strategy for our Smart Tissue Autonomous Robot (STAR) to enable precise and consistent incisions on complex 3D soft tissues. Our experimental results on cadaver porcine tongue samples indicate that the proposed strategy reduces surface incision error and depth incision error by 40.03% and 51.5%, respectively, compared to a teleoperation strategy via the da Vinci. Furthermore, compared to an autonomous path planning method with linear interpolation between the NIR markers, the proposed strategy reduces the incision depth error by 48.58% by taking advantage of 3D tissue surface information.
Affiliation(s)
- H. Saeidi
- Mechanical Engineering Department, University of Maryland, College Park, MD 20742, USA; Fischell Institute for Biomedical Devices and the Marlene and Stewart Greenebaum Cancer Center
- J. Ge
- Mechanical Engineering Department, University of Maryland, College Park, MD 20742, USA; Fischell Institute for Biomedical Devices and the Marlene and Stewart Greenebaum Cancer Center
- M. Kam
- Mechanical Engineering Department, University of Maryland, College Park, MD 20742, USA; Fischell Institute for Biomedical Devices and the Marlene and Stewart Greenebaum Cancer Center
- J. D. Opfermann
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Childrens National Health System, 111 Michigan Ave. N.W., Washington, DC 20010
- S. Leonard
- Electrical and Computer Science Eng. Dept., Johns Hopkins University, Baltimore, MD 21211
- A. S. Joshi
- Division of Otolaryngology - Head & Neck Surgery at The George Washington University Medical Faculty Associates, 2300 M St. NW 4th Floor, Washington DC 20037
- A. Krieger
- Mechanical Engineering Department, University of Maryland, College Park, MD 20742, USA; Fischell Institute for Biomedical Devices and the Marlene and Stewart Greenebaum Cancer Center
7. Kam M, Saeidi H, Wei S, Opfermann JD, Leonard S, Hsieh MH, Kang JU, Krieger A. Semi-autonomous Robotic Anastomoses of Vaginal Cuffs Using Marker Enhanced 3D Imaging and Path Planning. Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2019; 11768:65-73. [PMID: 33521798] [PMCID: PMC7841647] [DOI: 10.1007/978-3-030-32254-0_8]
Abstract
Autonomous robotic anastomosis has the potential to improve surgical outcomes by achieving more consistent suture spacing and bite size than manual anastomosis. However, due to soft tissue's irregular shape and unpredictable deformation, performing autonomous robotic anastomosis without continuous tissue detection and three-dimensional path planning remains a challenging task. In this paper, we present a novel three-dimensional path planning algorithm for the Smart Tissue Autonomous Robot (STAR) to enable semi-autonomous robotic anastomosis on deformable tissue. The algorithm incorporates (i) continuous detection of 3D near-infrared (NIR) markers manually placed on the deformable tissue before the procedure, (ii) generation of a uniform and consistent suture placement plan using 3D path planning methods based on the locations of the NIR markers, and (iii) updating of the remaining suture plan after each completed stitch using a non-rigid registration technique to account for tissue deformation during anastomosis. We evaluate the path planning algorithm for accuracy and consistency by comparing anastomoses of synthetic vaginal cuff tissue completed by STAR and by a surgeon. Our test results indicate that STAR using the proposed method achieves 2.6 times better consistency in suture spacing and 2.4 times better consistency in suture bite size than manual anastomosis.
Affiliation(s)
- M Kam
- Department of Mechanical Engineering, University of Maryland, College Park, MD 20742, USA
- H Saeidi
- Department of Mechanical Engineering, University of Maryland, College Park, MD 20742, USA
- S Wei
- Electrical and Computer Science Engineering Department, Johns Hopkins University, Baltimore, MD 21211, USA
- J D Opfermann
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, 111 Michigan Avenue N.W., Washington, DC 20010, USA
- S Leonard
- Electrical and Computer Science Engineering Department, Johns Hopkins University, Baltimore, MD 21211, USA
- M H Hsieh
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, 111 Michigan Avenue N.W., Washington, DC 20010, USA
- J U Kang
- Electrical and Computer Science Engineering Department, Johns Hopkins University, Baltimore, MD 21211, USA
- A Krieger
- Department of Mechanical Engineering, University of Maryland, College Park, MD 20742, USA
8. Ge J, Saeidi H, Opfermann JD, Joshi AS, Krieger A. Landmark-Guided Deformable Image Registration for Supervised Autonomous Robotic Tumor Resection. Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2019; 11764:320-328. [PMID: 33511379] [DOI: 10.1007/978-3-030-32239-7_36]
Abstract
Oral squamous cell carcinoma (OSCC) is the most common cancer of the head and neck region and is associated with high morbidity and mortality. Surgical resection is usually the primary treatment for OSCC, and maintaining effective tumor resection margins is paramount to surgical outcomes. In practice, wide tumor excisions impair post-surgical organ function, while narrow resection margins are associated with tumor recurrence. Identifying and tracking these resection margins remains a challenge because they migrate and shrink under preoperative chemotherapy or radiation therapy and deform intraoperatively. This paper reports a novel near-infrared (NIR) fluorescent marking and landmark-based deformable image registration (DIR) method to precisely predict deformed margins. The accuracy of DIR-predicted resection margins on porcine cadaver tongues is compared with rigid image registration and a surgeon's manual prediction. Furthermore, our tracking and registration technique is integrated into a robotic system and tested on ex vivo porcine cadaver tongues to demonstrate the feasibility of supervised autonomous tumor bed resections.
Affiliation(s)
- Jiawei Ge
- Department of Mechanical Engineering, University of Maryland, College Park, MD, USA
- Hamed Saeidi
- Department of Mechanical Engineering, University of Maryland, College Park, MD, USA
- Justin D Opfermann
- Sheikh Zayed Institute, Children's National Health System, Washington, DC, USA
- Arjun S Joshi
- Division of Otolaryngology - Head and Neck Surgery, George Washington University, Washington, DC, USA
- Axel Krieger
- Department of Mechanical Engineering, University of Maryland, College Park, MD, USA
9. Sorriento A, Porfido MB, Mazzoleni S, Calvosa G, Tenucci M, Ciuti G, Dario P. Optical and Electromagnetic Tracking Systems for Biomedical Applications: A Critical Review on Potentialities and Limitations. IEEE Reviews in Biomedical Engineering 2019; 13:212-232. [PMID: 31484133] [DOI: 10.1109/rbme.2019.2939091]
Abstract
Optical and electromagnetic tracking systems are the two main technologies integrated into commercially available surgical navigators for computer-assisted image-guided surgery to date. Optical tracking systems (OTSs) work within the optical spectrum to track the position and orientation, i.e., the pose, of target surgical instruments. OTSs are characterized by high accuracy and robustness to environmental conditions; their main limitation is the need for a direct line of sight between the optical markers and the camera sensor, which is rigidly fixed in the operating theatre. Electromagnetic tracking systems (EMTSs) use an electromagnetic field generator to detect the pose of electromagnetic sensors. EMTSs do not require such a line of sight; however, the presence of metal or ferromagnetic sources in the operating workspace can significantly affect measurement accuracy. The aim of this review is to provide a complete and detailed overview of optical and electromagnetic tracking systems, including working principles, sources of error, and validation protocols. Moreover, commercial and research-oriented solutions, as well as clinical applications, are described for both technologies. Finally, a critical comparative analysis of the state of the art highlights the potentialities and limitations of each tracking system for medical use.
10. Fracczak L, Szaniewski M, Podsedkowski L. Share control of surgery robot master manipulator guiding tool along the standard path. International Journal of Medical Robotics and Computer Assisted Surgery 2019; 15:e1984. [PMID: 30650473] [PMCID: PMC6916569] [DOI: 10.1002/rcs.1984]
Abstract
Recently, minimally invasive surgery (MIS) robotics has entered the phase of autonomous operation. However, because of the high variability of the environment, conducting a fully autonomous surgery remains extremely difficult. This paper presents a shared control system whose objective is to suggest the optimal tool guidance path by applying force to the master manipulator (hereinafter, the master) held by the surgeon's hand. With this type of control, the surgeon retains full control over the position of the tool at all times and is supported by the system to guide the tool better and faster during surgery. The force should be perceptible to the surgeon but, at the same time, must not hinder or disturb the surgical process. Furthermore, the shared control system presented in this paper can be turned on or off at any moment during surgery.
Affiliation(s)
- Lukasz Fracczak
- Institute of Machine Tools and Production Engineering, Lodz University of Technology, Łódź, Poland
- Mateusz Szaniewski
- Institute of Machine Tools and Production Engineering, Lodz University of Technology, Łódź, Poland
- Leszek Podsedkowski
- Institute of Machine Tools and Production Engineering, Lodz University of Technology, Łódź, Poland
11. Scrofani G, Sola-Pikabea J, Llavador A, Sanchez-Ortiga E, Barreiro JC, Saavedra G, Garcia-Sucerquia J, Martínez-Corral M. FIMic: design for ultimate 3D-integral microscopy of in-vivo biological samples. Biomedical Optics Express 2018; 9:335-346. [PMID: 29359107] [PMCID: PMC5772586] [DOI: 10.1364/boe.9.000335]
Abstract
In this work, the Fourier integral microscope (FIMic), an ultimate design of 3D integral microscopy, is presented. By placing a multiplexing microlens array at the aperture stop of the microscope objective of the host microscope, FIMic exhibits an extended depth of field and enhanced lateral resolution compared with regular integral microscopy. Because FIMic directly produces a set of orthographic views of the 3D micrometer-sized sample, it is suitable for real-time imaging. Following regular integral-imaging reconstruction algorithms, a 2.75-fold extended depth of field and [Formula: see text]-times better spatial resolution compared with conventional integral microscopy are reported. Our claims are supported by theoretical analysis and by experimental images of a resolution test target, cotton fibers, and in vivo 3D imaging of biological specimens.
Affiliation(s)
- G. Scrofani
- Department of Optics, University of Valencia, E-46100 Burjassot, Spain
- J. Sola-Pikabea
- Department of Optics, University of Valencia, E-46100 Burjassot, Spain
- A. Llavador
- Department of Optics, University of Valencia, E-46100 Burjassot, Spain
- E. Sanchez-Ortiga
- Department of Optics, University of Valencia, E-46100 Burjassot, Spain
- J. C. Barreiro
- Department of Optics, University of Valencia, E-46100 Burjassot, Spain
- G. Saavedra
- Department of Optics, University of Valencia, E-46100 Burjassot, Spain
- J. Garcia-Sucerquia
- Universidad Nacional de Colombia, Sede Medellin, School of Physics, A.A. 3840 Medellín 050034, Colombia
12. Opfermann JD, Leonard S, Decker RS, Uebele NA, Bayne CE, Joshi AS, Krieger A. Semi-Autonomous Electrosurgery for Tumor Resection Using a Multi-Degree of Freedom Electrosurgical Tool and Visual Servoing. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2017; 2017:3653-3659. [PMID: 29503760] [DOI: 10.1109/iros.2017.8206210]
Abstract
This paper presents a surgical robot performing semi-autonomous electrosurgery for tumor resection and evaluates its accuracy using a visual servoing paradigm. We describe the design and integration of a novel multi-degree-of-freedom electrosurgical tool for the Smart Tissue Autonomous Robot (STAR). Standardized line tests are executed to determine ideal cut parameters in three different types of porcine tissue. STAR is then programmed with the ideal cut settings for porcine tissue and compared against expert surgeons using open and laparoscopic techniques in a line cutting task. We conclude with a proof-of-concept demonstration of STAR semi-autonomously resecting pseudo-tumors in porcine tissue using visual servoing. When tasked to excise tumors with a consistent 4 mm margin, STAR can semi-autonomously dissect tissue with an average margin of 3.67 mm and a standard deviation of 0.89 mm.
Affiliation(s)
- Justin D Opfermann
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, 111 Michigan Ave. N.W., Washington, DC 20010
- Simon Leonard
- Electrical Engineering Department, Johns Hopkins University, Baltimore, MD 21211
- Ryan S Decker
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, 111 Michigan Ave. N.W., Washington, DC 20010
- Nicholas A Uebele
- Electrical Engineering Department, Johns Hopkins University, Baltimore, MD 21211
- Christopher E Bayne
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, 111 Michigan Ave. N.W., Washington, DC 20010
- Arjun S Joshi
- Division of Otolaryngology - Head and Neck Surgery at The George Washington University, Washington, DC 20052
- Axel Krieger
- Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Health System, 111 Michigan Ave. N.W., Washington, DC 20010