1.
Park J, Seo B, Jeong Y, Park I. A Review of Recent Advancements in Sensor-Integrated Medical Tools. Adv Sci (Weinh) 2024; 11:e2307427. PMID: 38460177; PMCID: PMC11132050; DOI: 10.1002/advs.202307427.
Abstract
A medical tool is a general instrument intended for use in the prevention, diagnosis, and treatment of diseases in humans or other animals. Nowadays, sensors are widely employed in medical tools to analyze or quantify disease-related parameters for the diagnosis and monitoring of patients' diseases. Recent explosive advancements in sensor technologies have extended the integration and application of sensors in medical tools by providing more versatile in vivo sensing capabilities. These unique sensing capabilities, especially in medical tools for surgery or medical treatment, are attracting growing attention owing to the rapid growth of minimally invasive surgery. In this review, recent advancements in sensor-integrated medical tools are presented, and their necessity, uses, and examples are comprehensively introduced. Specifically, medical tools often utilized in surgery or treatment, such as medical needles, catheters, surgical robots, sutures, endoscopes, and tubes, are covered, and in-depth discussions of the working mechanism of each sensor-integrated medical tool are provided.
Affiliation(s)
- Jaeho Park
- Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon 34141, South Korea
- Bokyung Seo
- Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon 34141, South Korea
- Yongrok Jeong
- Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon 34141, South Korea
- Radioisotope Research Division, Korea Atomic Energy Research Institute (KAERI), Daejeon 34057, South Korea
- Inkyu Park
- Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon 34141, South Korea
2.
Zhang P, Kim JW, Gehlbach P, Iordachita I, Kobilarov M. Autonomous Needle Navigation in Retinal Microsurgery: Evaluation in ex vivo Porcine Eyes. IEEE Int Conf Robot Autom (ICRA) 2023; 2023:4661-4667. PMID: 38107423; PMCID: PMC10723823; DOI: 10.1109/icra48891.2023.10161151.
Abstract
Important challenges in retinal microsurgery include prolonged operating time, inadequate force feedback, and poor depth perception due to a constrained top-down view of the surgery. The introduction of robot-assisted technology could potentially deal with such challenges and improve the surgeon's performance. Motivated by such challenges, this work develops a strategy for autonomous needle navigation in retinal microsurgery aiming to achieve precise manipulation, reduced end-to-end surgery time, and enhanced safety. This is accomplished through real-time geometry estimation and chance-constrained Model Predictive Control (MPC) resulting in high positional accuracy while keeping scleral forces within a safe level. The robotic system is validated using both open-sky and intact (with lens and partial vitreous removal) ex vivo porcine eyes. The experimental results demonstrate that the generation of safe control trajectories is robust to small motions associated with head drift. The mean navigation time and scleral force for MPC navigation experiments are 7.208 s and 11.97 mN, which can be considered efficient and well within acceptable safe limits. The resulting mean errors along lateral directions of the retina are below 0.06 mm, which is below the typical hand tremor amplitude in retinal microsurgery.
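The chance-constrained MPC idea above can be illustrated with a deliberately small sketch. This is not the authors' controller: the force limit, noise level, step size, and confidence level below are invented for illustration. The point is the mechanism — the commanded step toward the goal shrinks as the estimated scleral force plus a z-sigma uncertainty margin approaches the safety limit.

```python
import numpy as np

def chance_constrained_step(pos, goal, f_est, f_sigma, f_max=0.020, step=0.1, z=1.645):
    """One step of a toy chance-constrained controller: advance toward the goal,
    but shrink the step whenever the estimated scleral force plus a z-sigma
    uncertainty margin approaches the force limit f_max (values illustrative)."""
    direction = goal - pos
    dist = np.linalg.norm(direction)
    if dist < 1e-9:
        return pos
    # Tightened constraint: require f_est + z * f_sigma <= f_max
    # (z = 1.645 corresponds to a ~95% one-sided confidence level).
    margin = f_max - (f_est + z * f_sigma)
    scale = np.clip(margin / f_max, 0.0, 1.0)  # slow down as the margin shrinks
    return pos + min(step * scale, dist) * direction / dist
```

When the tightened constraint is violated (no margin left), the step collapses to zero, which is the qualitative behavior that keeps scleral forces within a safe level.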
Affiliation(s)
- Peiyao Zhang
- Department of Mechanical Engineering and the Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21211, USA
- Ji Woong Kim
- Department of Mechanical Engineering and the Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21211, USA
- Peter Gehlbach
- Wilmer Eye Institute, Johns Hopkins University, Baltimore, MD 21211, USA
- Iulian Iordachita
- Department of Mechanical Engineering and the Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21211, USA
- Marin Kobilarov
- Department of Mechanical Engineering and the Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21211, USA
3.
Starovoyt A, Quirk BC, Putzeys T, Kerckhofs G, Nuyts J, Wouters J, McLaughlin RA, Verhaert N. An optically-guided cochlear implant sheath for real-time monitoring of electrode insertion into the human cochlea. Sci Rep 2022; 12:19234. PMID: 36357503; PMCID: PMC9649659; DOI: 10.1038/s41598-022-23653-4.
Abstract
In cochlear implant surgery, insertion of perimodiolar electrode arrays into the scala tympani can be complicated by trauma or even accidental translocation of the electrode array within the cochlea. In patients with partial hearing loss, cochlear trauma can not only negatively affect implant performance, but also reduce residual hearing function. These events have been related to suboptimal positioning of the cochlear implant electrode array with respect to critical cochlear walls of the scala tympani (modiolar wall, osseous spiral lamina and basilar membrane). Currently, the position of the electrode array in relation to these walls cannot be assessed during the insertion and the surgeon depends on tactile feedback, which is unreliable and often comes too late. This study presents an image-guided cochlear implant device with an integrated, fiber-optic imaging probe that provides real-time feedback using optical coherence tomography during insertion into the human cochlea. This novel device enables the surgeon to accurately detect and identify the cochlear walls ahead and to adjust the insertion trajectory, avoiding collision and trauma. The functionality of this prototype has been demonstrated in a series of insertion experiments, conducted by experienced cochlear implant surgeons on fresh-frozen human cadaveric cochleae.
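The kind of feedback such an OCT probe provides can be caricatured in a few lines. This is purely illustrative and not from the paper: an A-scan is treated as a reflectivity-versus-depth profile, and the distance to the cochlear wall ahead is read off as the depth of the first strong reflection. The depth scale and reflectivity threshold are made-up parameters.

```python
def distance_to_wall(a_scan, depth_per_px_um=5.0, threshold=0.5):
    """Toy A-scan processing (parameters illustrative): return the distance in
    micrometers to the first sample whose reflectivity exceeds the threshold,
    i.e. the nearest tissue boundary ahead of the probe, or None if no boundary
    is visible within the scan range."""
    for i, intensity in enumerate(a_scan):
        if intensity >= threshold:
            return i * depth_per_px_um
    return None
```

In a real system this distance estimate, updated in real time, is what lets the surgeon adjust the insertion trajectory before contact occurs.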
Affiliation(s)
- Anastasiya Starovoyt
- Department of Neurosciences, ExpORL, KU Leuven, 3000 Leuven, Belgium
- Department of Neurosciences, Leuven Brain Institute, KU Leuven, 3000 Leuven, Belgium
- Bryden C. Quirk
- Australian Research Council Centre of Excellence for Nanoscale BioPhotonics, Faculty of Health and Medical Sciences, The University of Adelaide, Adelaide, SA 5005, Australia
- Institute for Photonics and Advanced Sensing, The University of Adelaide, Adelaide, SA 5005, Australia
- Tristan Putzeys
- Department of Neurosciences, ExpORL, KU Leuven, 3000 Leuven, Belgium
- Department of Neurosciences, Leuven Brain Institute, KU Leuven, 3000 Leuven, Belgium
- Laboratory for Soft Matter and Biophysics, Department of Physics and Astronomy, KU Leuven, 3000 Leuven, Belgium
- Greet Kerckhofs
- Biomechanics Laboratory, Institute of Mechanics, Materials, and Civil Engineering, UCLouvain, 1348 Louvain-La-Neuve, Belgium
- Department of Materials Science and Engineering, KU Leuven, 3000 Leuven, Belgium
- Institute of Experimental and Clinical Research, UCLouvain, 1200 Woluwé-Saint-Lambert, Belgium
- Prometheus, Division of Skeletal Tissue Engineering, KU Leuven, 3000 Leuven, Belgium
- Johan Nuyts
- Department of Imaging and Pathology, Division of Nuclear Medicine, KU Leuven, 3000 Leuven, Belgium
- Nuclear Medicine and Molecular Imaging, Medical Imaging Research Center, 3000 Leuven, Belgium
- Jan Wouters
- Department of Neurosciences, ExpORL, KU Leuven, 3000 Leuven, Belgium
- Department of Neurosciences, Leuven Brain Institute, KU Leuven, 3000 Leuven, Belgium
- Robert A. McLaughlin
- Australian Research Council Centre of Excellence for Nanoscale BioPhotonics, Faculty of Health and Medical Sciences, The University of Adelaide, Adelaide, SA 5005, Australia
- Institute for Photonics and Advanced Sensing, The University of Adelaide, Adelaide, SA 5005, Australia
- School of Engineering, University of Western Australia, Perth, WA 6009, Australia
- Nicolas Verhaert
- Department of Neurosciences, ExpORL, KU Leuven, 3000 Leuven, Belgium
- Department of Neurosciences, Leuven Brain Institute, KU Leuven, 3000 Leuven, Belgium
- Department of Otorhinolaryngology, Head and Neck Surgery, University Hospitals of Leuven, 3000 Leuven, Belgium
4.
Li Z, Fu P, Wei BT, Wang J, Li AL, Li MJ, Bian GB. An automatic drug injection device with spatial micro-force perception guided by a microscopic image for robot-assisted ophthalmic surgery. Front Robot AI 2022; 9:913930. PMID: 35991847; PMCID: PMC9382114; DOI: 10.3389/frobt.2022.913930.
Abstract
Retinal vein injection guided by microscopic images is an innovative procedure for treating retinal vein occlusion. However, retinal tissue is complex, fine, and fragile, and the operating scale and forces involved are small. Surgeons' limited manipulation and force-sensing accuracy make it difficult to perform precise and stable drug injection on the retina within a magnified field of view. In this paper, a 3-DOF automatic drug injection mechanism was designed for microscope-image-guided, robot-assisted needle delivery and automatic drug injection. Additionally, a robot-assisted, real-time, three-dimensional micro-force-sensing method for retinal vein injection was proposed. With three FBG sensors arranged in a circular array on the hollow outer wall of a nested nickel-titanium needle tube, real-time sensing of the contact force between the intraoperative instrument and the blood vessel was realized. Experimental data from 15 groups of porcine eyeball retinal veins with diameters of 100–200 μm showed that the piercing force between surgical instrument and blood vessel is 5.95–12.97 mN, with an average of 9.98 mN. Furthermore, 20 groups of measurements on chicken embryo blood vessels with diameters of 150–500 μm showed a piercing force of 4.02–23.4 mN, with an average of 12.05 mN.
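A common way to turn three FBG wavelength shifts into a 3D tip force, as the circular sensor layout described above suggests, is a linear calibration model solved by least squares. The calibration matrix below is hypothetical — real sensitivities come from calibration experiments — but its structure (sinusoidal transverse sensitivity for cores spaced 120° apart, plus a common-mode axial term) follows directly from the circular-array geometry.

```python
import numpy as np

# Hypothetical linear calibration: wavelength shifts (nm) = C @ force (mN).
# Three FBGs at 0, 120, and 240 degrees around the needle axis; transverse
# bending produces sinusoidal sensitivity, axial strain a common-mode term.
angles = np.deg2rad([0.0, 120.0, 240.0])
C = np.column_stack([np.cos(angles), np.sin(angles), np.full(3, 0.3)])  # nm/mN, illustrative

def force_from_shifts(d_lambda):
    """Recover the 3D tip force (Fx, Fy, Fz in mN) from three FBG wavelength
    shifts by least squares against the calibration matrix C."""
    F, *_ = np.linalg.lstsq(C, np.asarray(d_lambda), rcond=None)
    return F
```

With three sensors and three force components the system is square; adding redundant cores would make the least-squares fit average out measurement noise.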
Affiliation(s)
- Zhen Li
- School of Electronic and Information Engineering, Tongji University, Shanghai, China
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Pan Fu
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Automation, Beijing Information Science and Technology University, Beijing, China
- Bing-Ting Wei
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Jie Wang
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Automation, Beijing Information Science and Technology University, Beijing, China
- An-Long Li
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Ming-Jun Li
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- School of Automation, Beijing Information Science and Technology University, Beijing, China
- Gui-Bin Bian
- State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
- Correspondence: Gui-Bin Bian
5.
Iordachita II, de Smet MD, Naus G, Mitsuishi M, Riviere CN. Robotic Assistance for Intraocular Microsurgery: Challenges and Perspectives. Proc IEEE 2022; 110:893-908. PMID: 36588782; PMCID: PMC9799958; DOI: 10.1109/jproc.2022.3169466.
Abstract
Intraocular surgery, one of the most challenging disciplines of microsurgery, requires sensory and motor skills at the limits of human physiological capabilities, combined with tremendously difficult requirements for accuracy and steadiness. Nowadays, robotics combined with advanced imaging has opened significant new directions in advancing the field of intraocular microsurgery. With safer and more efficient patient treatment as the final goal, and as in other medical applications, robotics has real potential to fundamentally change microsurgery by combining human strengths with computer- and sensor-based technology in an information-driven environment. Still in its early stages, robotic assistance for intraocular microsurgery has been accepted with caution in the operating room and successfully tested in a limited number of clinical trials. However, owing to its demonstrated capabilities, including hand tremor reduction, haptic feedback, steadiness, enhanced dexterity, and micrometer-scale accuracy, microsurgical robotics has evolved into a very promising trend in advancing retinal surgery. This paper analyzes the advances in retinal robotic microsurgery, its current drawbacks and limitations, and possible new directions to expand retinal microsurgery to techniques currently beyond human boundaries or infeasible without robotics.
Affiliation(s)
- Iulian I Iordachita
- Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA
- Marc D de Smet
- Microinvasive Ocular Surgery Center (MIOS), Lausanne, Switzerland
- Mamoru Mitsuishi
- Department of Mechanical Engineering, The University of Tokyo, Japan
6.
Alamdar A, Patel N, Urias M, Ebrahimi A, Gehlbach P, Iordachita I. Force and Velocity Based Puncture Detection in Robot Assisted Retinal Vein Cannulation: in-vivo Study. IEEE Trans Biomed Eng 2021; 69:1123-1132. PMID: 34550878; DOI: 10.1109/tbme.2021.3114638.
Abstract
OBJECTIVE: Retinal vein cannulation is a technically demanding surgical procedure, and its feasibility may rely on advanced surgical robots equipped with force-sensing microneedles. Reliable detection of the moment of venous puncture is important, either to alert the clinician or to prevent double puncture of the vessel and damage to the retinal surface beneath. This paper reports the first in-vivo retinal vein cannulation trial on rabbit eyes using sensorized metal needles and investigates puncture detection. METHODS: We utilized a total of four indices, including two previously demonstrated ones and two new indices based on the velocity and force of the needle tip and the correlation between the needle-tissue and tool-sclera interaction forces. We also studied the effect of the detection timespan on the performance of detecting actual punctures. RESULTS: The new indices, when used in conjunction with the previous algorithm, improved the detection rate from 75% to 92%, but slightly increased the number of false detections from 37 to 43. Increasing the detection window improved detection performance at the cost of added delay. CONCLUSION: The current algorithm can supplement the surgeon's visual feedback and surgical judgment. To achieve automatic puncture detection, more measurements and further analysis are required. Subsequent in-vivo studies in other animals, such as pigs with their more human-like eye anatomy, are required before clinical trials. SIGNIFICANCE: The study provides promising results, and the criteria developed may serve as guidelines for further investigation into puncture detection in in-vivo retinal vein cannulation.
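The force- and velocity-based indices can be pictured with a toy detector. The thresholds below are invented, and the paper's actual indices additionally use the needle-tissue/tool-sclera force correlation; the sketch only captures the core signature — a puncture shows up as a sharp drop in tip force coinciding with a forward velocity jump as the needle pops through the vein wall.

```python
def detect_puncture(forces, velocities, f_drop=1.0, v_jump=0.5):
    """Toy puncture detector (thresholds illustrative, not from the paper):
    flag sample i when the tip force drops sharply between consecutive samples
    while the tip velocity simultaneously jumps forward."""
    events = []
    for i in range(1, len(forces)):
        force_drop = forces[i - 1] - forces[i]
        vel_jump = velocities[i] - velocities[i - 1]
        if force_drop >= f_drop and vel_jump >= v_jump:
            events.append(i)
    return events
```

Requiring both cues to fire at once is what suppresses false detections from force noise alone, at the cost of the detection-window delay discussed in the abstract.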
7.
He C, Ebrahimi A, Yang E, Urias M, Yang Y, Gehlbach P, Iordachita I. Towards Bimanual Vein Cannulation: Preliminary Study of a Bimanual Robotic System With a Dual Force Constraint Controller. IEEE Int Conf Robot Autom (ICRA) 2020; 2020:4441-4447. PMID: 33692911; DOI: 10.1109/icra40945.2020.9196889.
Abstract
Retinal vein cannulation is a promising approach for treating retinal vein occlusion that involves injecting medicine into the occluded vessel to dissolve the clot. The approach remains largely unexploited clinically due to surgeon limitations in detecting the interaction forces between surgical tools and retinal tissue. In this paper, a dual force constraint controller for robot-assisted retinal surgery is presented that keeps the tool-to-vessel and tool-to-sclera forces below prescribed thresholds. A cannulation tool and forceps with dual force-sensing capability were developed and used to measure the force information fed into the robot controller, which was implemented on existing Steady Hand Eye Robot platforms. The robotic system facilitates retinal vein cannulation by allowing a user to grasp the target vessel with the forceps and then enter the vessel with the cannula. The system was evaluated on an eye phantom. The results showed that, while the eyeball was subjected to rotational disturbances, the proposed controller actuates the robotic manipulators to maintain the average tool-to-vessel force at 10.9 mN and 13.1 mN and the average tool-to-sclera force at 38.1 mN and 41.2 mN for the cannula and the forceps, respectively. Such small tool-to-tissue forces are acceptable for avoiding retinal tissue injury. Additionally, two clinicians participated in a preliminary user study of bimanual cannulation, demonstrating that operation time and tool-to-tissue forces decrease significantly when using the bimanual robotic system compared to freehand performance.
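The dual force constraint can be sketched as a simple velocity-scaling rule. This is a stand-in for the paper's controller, not its actual implementation; the two thresholds below are illustrative values loosely in the reported mN range. The commanded tool velocity is attenuated by the tighter of the two force margins and reaches zero at either limit.

```python
def dual_force_command(v_des, f_vessel, f_sclera, f_v_max=0.015, f_s_max=0.045):
    """Hypothetical dual-constraint velocity scaling: attenuate the desired
    velocity v_des by the smaller of the tool-to-vessel and tool-to-sclera
    force margins; motion stops entirely once either force hits its limit."""
    margin_v = max(0.0, 1.0 - f_vessel / f_v_max)
    margin_s = max(0.0, 1.0 - f_sclera / f_s_max)
    return v_des * min(margin_v, margin_s)
```

Taking the minimum of the two margins is what makes both constraints active at once: whichever force is closest to its threshold dominates the commanded motion.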
Affiliation(s)
- Changyan He
- School of Mechanical Engineering and Automation at Beihang University, Beijing, 100191 China, and also with LCSR at the Johns Hopkins University, Baltimore, MD 21218 USA
- Ali Ebrahimi
- LCSR at the Johns Hopkins University, Baltimore, MD 21218 USA
- Emily Yang
- LCSR at the Johns Hopkins University, Baltimore, MD 21218 USA
- Muller Urias
- Wilmer Eye Institute at the Johns Hopkins Hospital, Baltimore, MD 21287 USA
- Yang Yang
- School of Mechanical Engineering and Automation at Beihang University, Beijing, 100191 China
- Peter Gehlbach
- Wilmer Eye Institute at the Johns Hopkins Hospital, Baltimore, MD 21287 USA
8.
Patel N, Urias M, He C, Gehlbach PL, Iordachita I. A Comparison of Manual and Robot Assisted Retinal Vein Cannulation in Chicken Chorioallantoic Membrane. Annu Int Conf IEEE Eng Med Biol Soc (EMBC) 2020; 2020:5101-5105. PMID: 33019134; PMCID: PMC7538656; DOI: 10.1109/embc44109.2020.9176853.
Abstract
Retinal vein occlusion (RVO) is a vision-threatening condition occurring in the central or branch retinal veins. Risk factors include, but are not limited to, hypercoagulability, thrombus, or other causes of low blood flow. Current clinically proven treatment options limit the complications of vein occlusion without treating the causative occlusion. In recent years, a more direct approach called retinal vein cannulation (RVC) has been explored in both animal and human eye models. Though RVC has demonstrated potential efficacy, it remains a challenging and risky procedure that demands precise needle manipulation to perform safely. During RVC, a thin cannula (diameter 70-110 µm) is delicately inserted into a retinal vein, and its intraluminal position is maintained for up to 2 minutes while a therapeutic drug is infused. Because the tool-tissue interaction forces at the needle tip are well below human tactile perception, a robotic assistant combined with a force-sensing microneedle could alleviate the challenges of RVC. In this paper we present a comparative study of manual and robot-assisted retinal vein cannulation in chicken chorioallantoic membrane (CAM) using a force-sensing microneedle tool. The results indicate that the average puncture force and the average force during the infusion period are larger in manual mode than in robot-assisted mode. Moreover, retinal vein cannulation was more stable during infusion in robot-assisted mode.
9.
Kim JW, He C, Urias M, Gehlbach P, Hager GD, Iordachita I, Kobilarov M. Autonomously Navigating a Surgical Tool Inside the Eye by Learning from Demonstration. IEEE Int Conf Robot Autom (ICRA) 2020. PMID: 34621556; DOI: 10.1109/icra40945.2020.9196537.
Abstract
A fundamental challenge in retinal surgery is safely navigating a surgical tool to a desired goal position on the retinal surface while avoiding damage to surrounding tissues, a procedure that typically requires tens-of-microns accuracy. In practice, the surgeon relies on depth-estimation skills to localize the tool-tip with respect to the retina in order to perform the tool-navigation task, which can be prone to human error. To alleviate such uncertainty, prior work has introduced ways to assist the surgeon by estimating the tool-tip distance to the retina and providing haptic or auditory feedback. However, automating the tool-navigation task itself remains unsolved and largely unexplored. Such a capability, if reliably automated, could serve as a building block to streamline complex procedures and reduce the chance for tissue damage. Towards this end, we propose to automate the tool-navigation task by learning to mimic expert demonstrations of the task. Specifically, a deep network is trained to imitate expert trajectories toward various locations on the retina based on recorded visual servoing to a given goal specified by the user. The proposed autonomous navigation system is evaluated in simulation and in physical experiments using a silicone eye phantom. We show that the network can reliably navigate a needle surgical tool to various desired locations within 137 μm accuracy in physical experiments and 94 μm in simulation on average, and generalizes well to unseen situations such as in the presence of auxiliary surgical tools, variable eye backgrounds, and brightness conditions.
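Learning from demonstration reduces, in its simplest form, to fitting a policy that maps observed state to expert action. The linear least-squares stand-in below is only a sketch of that training/inference split — the paper trains a deep network on images, and all names and the example expert policy here are illustrative.

```python
import numpy as np

def fit_policy(states, actions):
    """Behavior cloning at its simplest (a linear stand-in for the paper's deep
    network): fit W minimizing ||states @ W - actions||^2 over recorded expert
    state-action pairs."""
    W, *_ = np.linalg.lstsq(states, actions, rcond=None)
    return W

def policy_action(W, state):
    """Inference: reuse the fitted map to imitate the expert on a new state."""
    return state @ W
```

If the demonstrations came from a proportional controller, e.g. action = 0.5 * (goal - position), the fitted policy recovers that behavior exactly from noiseless data.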
Affiliation(s)
- Ji Woong Kim
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21218 USA
- Changyan He
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21218 USA
- Muller Urias
- Wilmer Eye Institute at the Johns Hopkins Hospital, Baltimore, MD 21287 USA
- Peter Gehlbach
- Wilmer Eye Institute at the Johns Hopkins Hospital, Baltimore, MD 21287 USA
- Gregory D Hager
- Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218 USA
- Iulian Iordachita
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21218 USA
- Marin Kobilarov
- Laboratory for Computational Sensing and Robotics (LCSR), Johns Hopkins University, Baltimore, MD 21218 USA
10.
Jäckle S, Eixmann T, Schulz-Hildebrandt H, Hüttmann G, Pätz T. Fiber optical shape sensing of flexible instruments for endovascular navigation. Int J Comput Assist Radiol Surg 2019; 14:2137-2145. PMID: 31493113; PMCID: PMC6858473; DOI: 10.1007/s11548-019-02059-0.
Abstract
Purpose: Endovascular aortic repair procedures are currently conducted with 2D fluoroscopy imaging. Tracking systems based on fiber Bragg gratings are an emerging technology for the navigation of minimally invasive instruments that can reduce X-ray exposure and the amount of contrast agent used. Shape sensing of flexible structures is challenging and includes many calculation steps that are prone to different errors. To reduce these errors, we present an optimized shape sensing model. Methods: For every step of the shape sensing process, we analyzed which errors can occur, how each error affects the shape, and how it can be compensated or minimized. Experiments were done with one multicore fiber system with 38 cm sensing length, and the effects of different methods and parameters were analyzed. Furthermore, we compared 3D shape reconstructions with the segmented shape of the corresponding CT scans of the fiber to evaluate the accuracy of our optimized shape sensing model. Finally, we tested our model in a realistic endovascular scenario using a 3D-printed vessel system created from patient data. Results: Depending on the complexity of the shape, we reached an average error of 0.35-1.15 mm and a maximal error of 0.75-7.53 mm over the whole 38 cm sensing length. In the endovascular scenario, we obtained an average and maximal error of 1.13 mm and 2.11 mm, respectively. Conclusion: The accuracies of the 3D shape sensing model are promising, and we plan to combine shape sensing based on fiber Bragg gratings with the position and orientation from electromagnetic tracking to obtain the located catheter shape. The online version of this article (10.1007/s11548-019-02059-0) contains supplementary material available to authorized users.
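One of the calculation steps in shape sensing — integrating measured curvature along the arclength into a reconstructed shape — can be sketched in the plane. This is illustrative only: the paper's 3D model also needs the bending direction (torsion) per segment and the error compensations it describes, none of which appear in this planar toy.

```python
import numpy as np

def reconstruct_shape_2d(curvatures, ds=0.01):
    """Minimal planar shape integration: given one curvature sample (1/m) per
    fiber segment of arclength ds (m), propagate the tangent angle and position
    along the fiber and return the sampled curve as an (N+1, 2) array."""
    theta, pts = 0.0, [np.zeros(2)]
    for kappa in curvatures:
        theta += kappa * ds                      # bend the tangent by kappa*ds
        pts.append(pts[-1] + ds * np.array([np.cos(theta), np.sin(theta)]))
    return np.array(pts)
```

Small per-segment errors in curvature accumulate over the integration, which is why a 38 cm fiber can show millimeter-scale tip error even when each local measurement is accurate — the motivation for the paper's step-by-step error analysis.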
Affiliation(s)
- Sonja Jäckle
- Fraunhofer MEVIS, Institute for Digital Medicine, Maria-Goeppert-Straße 3, 23562 Lübeck, Germany
- Tim Eixmann
- Medical Laser Center Lübeck GmbH, Peter-Monnik-Weg 4, 23562, Lübeck, Germany
- Hinnerk Schulz-Hildebrandt
- Medical Laser Center Lübeck GmbH, Peter-Monnik-Weg 4, 23562, Lübeck, Germany
- Institute of Biomedical Optics, Universität zu Lübeck, Peter-Monnik-Weg 4, 23562, Lübeck, Germany
- German Center for Lung Research, DZL, Airways Research Center North, 22927, Großhansdorf, Germany
- Gereon Hüttmann
- Medical Laser Center Lübeck GmbH, Peter-Monnik-Weg 4, 23562, Lübeck, Germany
- Institute of Biomedical Optics, Universität zu Lübeck, Peter-Monnik-Weg 4, 23562, Lübeck, Germany
- German Center for Lung Research, DZL, Airways Research Center North, 22927, Großhansdorf, Germany
- Torben Pätz
- Fraunhofer MEVIS, Institute for Digital Medicine, Am Fallturm 1, 28359 Bremen, Germany