1.
Mekki L, Sheth NM, Vijayan RC, Rohleder M, Sisniega A, Kleinszig G, Vogt S, Kunze H, Osgood GM, Siewerdsen JH, Uneri A. Surgical navigation for guidewire placement from intraoperative fluoroscopy in orthopaedic surgery. Phys Med Biol 2023; 68:215001. PMID: 37774711. DOI: 10.1088/1361-6560/acfec4. Received: 06/11/2023; Accepted: 09/29/2023.
Abstract
Objective. Surgical guidewires are commonly used in placing fixation implants to stabilize fractures. Accurate positioning of these instruments is challenged by the difficulty of 3D reckoning from 2D fluoroscopy. This work aims to improve placement accuracy and reduce exposure times by providing 3D navigation for guidewire placement from as few as two fluoroscopic images. Approach. The approach combines machine learning-based segmentation with the geometric model of the imager to determine the 3D poses of guidewires. Instrument tips are encoded as individual keypoints, and the segmentation masks are processed to estimate the trajectory. Correspondence between detections in multiple views is established using the pre-calibrated system geometry, and the corresponding features are backprojected to obtain the 3D pose. Guidewire 3D directions were computed using both an analytical and an optimization-based method. The complete approach was evaluated in cadaveric specimens with respect to potential confounding effects from the imaging geometry and from radiographic scene clutter due to other instruments. Main results. The detection network identified guidewire tips within 2.2 mm and guidewire directions within 1.1° in 2D detector coordinates. Feature correspondence rejected false detections, particularly in images with other instruments, achieving 83% precision and 90% recall. Estimating the 3D direction via numerical optimization added robustness for guidewires aligned with the gantry rotation plane. Guidewire tips and directions were localized in 3D world coordinates with median accuracies of 1.8 mm and 2.7°, respectively. Significance. The paper reports a new method for automatic 2D detection and 3D localization of guidewires from pairs of fluoroscopic images. Localized guidewires can be virtually overlaid on the patient's pre-operative 3D scan during the intervention.
Accurate pose determination for multiple guidewires from two images offers the potential to reduce radiation dose by minimizing the need for repeated imaging, and provides quantitative feedback prior to implant placement.
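The backprojection step described in the abstract — recovering a 3D point from corresponding 2D detections in two calibrated views — can be illustrated with standard linear (DLT) triangulation. This is a generic sketch under an assumed pinhole model, not the authors' implementation; the projection matrices and keypoints below are hypothetical:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a single 3D point from two views.

    P1, P2 : (3, 4) projection matrices of the two calibrated C-arm poses.
    x1, x2 : (2,) detected 2D keypoints (e.g., a guidewire tip) in each view.
    Returns the 3D point minimizing the algebraic reprojection error.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector associated
    # with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With noise-free detections the reconstruction is exact; in practice the same linear solution serves as an initializer for a reprojection-error refinement.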
Affiliation(s)
- L Mekki
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
- N M Sheth
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
- R C Vijayan
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
- M Rohleder
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
- A Sisniega
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
- S Vogt
- Siemens Healthineers, Erlangen, Germany
- H Kunze
- Siemens Healthineers, Erlangen, Germany
- G M Osgood
- Department of Orthopaedic Surgery, Johns Hopkins Medicine, Baltimore MD, United States of America
- J H Siewerdsen
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
- Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston TX, United States of America
- A Uneri
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, United States of America
2.
Vijayan RC, Venkataraman K, Wei J, Sheth NM, Shafiq B, Siewerdsen JH, Zbijewski W, Li G, Cleary K, Uneri A. Multi-Body 3D-2D Registration for Robot-Assisted Joint Reduction: Preclinical Evaluation in the Ankle Syndesmosis. Proc SPIE Int Soc Opt Eng 2023; 12466:124661F. PMID: 37143861. PMCID: PMC10155864. DOI: 10.1117/12.2654481.
Abstract
Purpose Existing methods to improve the accuracy of tibiofibular joint reduction present workflow challenges, high radiation exposure, and limited accuracy and precision, leading to poor surgical outcomes. To address these limitations, we propose a method for robot-assisted joint reduction that uses intraoperative imaging to align the dislocated fibula to a target pose relative to the tibia. Methods The approach (1) localizes the robot via 3D-2D registration of a custom plate adapter attached to its end effector, (2) localizes the tibia and fibula using multi-body 3D-2D registration, and (3) drives the robot to reduce the dislocated fibula according to the target plan. The custom robot adapter was designed to interface directly with the fibular plate while presenting radiographic features that aid registration. Registration accuracy was evaluated on a cadaveric ankle specimen, and the feasibility of robotic guidance was assessed by manipulating a dislocated fibula in a cadaver ankle. Results Using standard AP and mortise radiographic views, registration errors were measured to be less than 1 mm and 1° for both the robot adapter and the ankle bones. Experiments in a cadaveric specimen revealed deviations of up to 4 mm from the intended path, which were reduced to <2 mm using corrective actions guided by intraoperative imaging and 3D-2D registration. Conclusions Preclinical studies suggest that significant robot flex and tibial motion occur during fibula manipulation, motivating the use of the proposed method to dynamically correct the robot trajectory. Accurate robot registration was achieved via fiducials embedded within the custom adapter design. Future work will evaluate the approach on a custom radiolucent robot currently under construction and verify the solution in additional cadaveric specimens.
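The "corrective actions guided by intraoperative imaging" amount to a closed-loop scheme: register, measure the residual from the plan, command a correction, and repeat. A minimal sketch of such a loop follows; the `measure` callback, gain, and tolerance are illustrative stand-ins for the paper's 3D-2D registration feedback, not its actual implementation:

```python
import numpy as np

def corrective_servo(target, initial, measure, gain=0.8, tol=0.5, max_iter=50):
    """Iteratively command the robot toward `target`, re-measuring the
    achieved pose after each motion (the `measure` callback stands in for
    image-based 3D-2D registration) and applying a proportional
    correction to the residual."""
    command = np.asarray(initial, dtype=float).copy()
    for _ in range(max_iter):
        achieved = measure(command)             # pose estimated from imaging
        residual = np.asarray(target) - achieved
        if np.linalg.norm(residual) < tol:
            break                               # within tolerance of the plan
        command = command + gain * residual     # corrective action
    return command
```

With a constant unmodeled deflection (e.g., robot flex), each iteration shrinks the residual by a factor of (1 - gain), so the loop converges to a command that compensates for the deflection.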
Affiliation(s)
- R. C. Vijayan
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- K. Venkataraman
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- J. Wei
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- N. M. Sheth
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- B. Shafiq
- Department of Orthopedic Surgery, Johns Hopkins Medicine, Baltimore MD
- J. H. Siewerdsen
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- Department of Imaging Physics, The University of Texas M. D. Anderson Cancer Center, Houston TX
- W. Zbijewski
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- G. Li
- Children’s National Hospital, Washington DC
- K. Cleary
- Children’s National Hospital, Washington DC
- A. Uneri
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- phone: +1-276-614-7743; website: carnegie.jhu.edu
3.
Vijayan RC, Han R, Wu P, Sheth NM, Vagdargi P, Vogt S, Kleinszig G, Osgood GM, Siewerdsen JH, Uneri A. Fluoroscopic Guidance of a Surgical Robot: Pre-clinical Evaluation in Pelvic Guidewire Placement. Proc SPIE Int Soc Opt Eng 2021; 11598:115981G. PMID: 36090307. PMCID: PMC9455933. DOI: 10.1117/12.2582188.
Abstract
PURPOSE A method and prototype for a fluoroscopically guided surgical robot are reported for assisting pelvic fracture fixation. The approach extends the compatibility of existing guidance methods to C-arms in mainstream use (without prior geometric calibration) via an online calibration of the C-arm geometry, automated through registration to patient anatomy. We report the first preclinical studies of this method in cadaver for evaluation of geometric accuracy. METHODS The robot is placed over the patient within the imaging field-of-view, and radiographs are acquired as the robot rotates an attached instrument. The radiographs are then used to perform an online geometric calibration via 3D-2D image registration, which solves for the intrinsic and extrinsic parameters of the C-arm imaging system with respect to the patient. The solved projective geometry is then used to register the robot to the patient and drive the robot to planned trajectories. This method was applied to a robotic system consisting of a drill guide instrument for guidewire placement and evaluated in experiments using a cadaver specimen. RESULTS Robotic drill guide alignment to trajectories defined in the cadaver pelvis was accurate within 2 mm and 1° (on average) using the calibration-free approach. Conformance of trajectories within bone corridors was confirmed by extrapolating the aligned drill guide trajectory into the cadaver pelvis. CONCLUSION This study demonstrates the accuracy of image-guided robotic positioning without prior calibration of the C-arm gantry, facilitating the use of surgical robots with simpler imaging devices that cannot establish or maintain an offline calibration. Future work includes testing of the system in a clinical setting with trained orthopaedic surgeons and residents.
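The online calibration solves for the intrinsic and extrinsic parameters of the C-arm from image data. Once a 3x4 projection matrix has been recovered, factoring it into those two parameter sets is a standard operation; the sketch below is a generic RQ-based decomposition illustrating that relationship, not the paper's registration code:

```python
import numpy as np
from scipy.linalg import rq

def decompose_projection(P):
    """Factor a 3x4 projection matrix P ~ K [R | t] into upper-triangular
    intrinsics K, rotation R, and translation t."""
    K, R = rq(P[:, :3])
    # RQ is unique only up to per-row signs; force a positive diagonal on K.
    S = np.diag(np.sign(np.diag(K)))
    K, R = K @ S, S @ R          # S @ S = I, so the product K R is unchanged
    t = np.linalg.solve(K, P[:, 3])
    if np.linalg.det(R) < 0:     # keep R a proper rotation (P is up to scale)
        R, t = -R, -t
    return K, R, t
```

The factorization is exact for a noise-free projection matrix; with an estimated matrix it yields the nearest consistent intrinsic/extrinsic split in the algebraic sense.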
Affiliation(s)
- R C Vijayan
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, USA
- R Han
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, USA
- P Wu
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, USA
- N M Sheth
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, USA
- P Vagdargi
- Department of Computer Science, Johns Hopkins University, Baltimore MD, USA
- S Vogt
- Siemens Healthineers, Erlangen, Germany
- G M Osgood
- Department of Orthopaedic Surgery, Johns Hopkins University, Baltimore MD, USA
- J H Siewerdsen
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, USA
- Department of Computer Science, Johns Hopkins University, Baltimore MD, USA
- A Uneri
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD, USA
4.
Han R, Uneri A, Vijayan RC, Wu P, Vagdargi P, Sheth N, Vogt S, Kleinszig G, Osgood GM, Siewerdsen JH. Fracture reduction planning and guidance in orthopaedic trauma surgery via multi-body image registration. Med Image Anal 2020; 68:101917. PMID: 33341493. DOI: 10.1016/j.media.2020.101917. Received: 04/28/2020; Revised: 11/16/2020; Accepted: 11/23/2020.
Abstract
PURPOSE Surgical reduction of pelvic fracture is a challenging procedure, and accurate restoration of natural morphology is essential to a positive functional outcome. The procedure often requires extensive preoperative planning, long fluoroscopic exposure times, and trial-and-error to achieve accurate reduction. We report a multi-body registration framework for reduction planning using preoperative CT and for intraoperative guidance using routine 2D fluoroscopy that could help address these challenges. METHODS The framework starts with semi-automatic segmentation of fractured bone fragments in preoperative CT using continuous max-flow. For reduction planning, a multi-to-one registration aligns the bone fragments to an adaptive template that adjusts to patient-specific bone shapes and poses. The framework further registers the bone fragments to intraoperative fluoroscopy to provide 2D fluoroscopic guidance and/or 3D navigation relative to the reduction plan. The framework was investigated in three studies: (1) a simulation study of 40 CT images covering three fracture categories (unilateral two-body, unilateral three-body, and bilateral two-body); (2) a proof-of-concept cadaver study mimicking the clinical scenario; and (3) a retrospective clinical study investigating feasibility in three cases of increasing severity and accuracy requirement. RESULTS Segmentation of simulated pelvic fractures achieved a Dice coefficient of 0.92±0.06. Reduction planning using the adaptive template achieved 2-3 mm and 2-3° error for the three fracture categories, significantly better than planning based on mirroring of contralateral anatomy. 3D-2D registration yielded ~2 mm and 0.5° accuracy, providing accurate guidance with respect to the preoperative reduction plan. The cadaver study and retrospective clinical study demonstrated comparable accuracy: ~0.90 Dice coefficient in segmentation, ~3 mm accuracy in reduction planning, and ~2 mm accuracy in 3D-2D registration.
CONCLUSION The registration framework demonstrated planning and guidance accuracy within clinical requirements in both simulation and clinical feasibility studies for a broad range of fracture-dislocation patterns. Using routinely acquired preoperative CT and intraoperative fluoroscopy, the framework could improve the accuracy of pelvic fracture reduction, reduce radiation dose, and integrate well with common clinical workflow without the need for additional navigation systems.
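Segmentation quality in this study is summarized by the Dice coefficient: twice the overlap of two binary masks divided by their total foreground. The following is the generic definition with a toy example, not the paper's evaluation code:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity of two binary segmentation masks:
    2 |A ∩ B| / (|A| + |B|), ranging over [0, 1]."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * np.logical_and(a, b).sum() / denom
```

For example, two 8-voxel masks sharing 4 voxels score 2·4 / (8+8) = 0.5, while identical masks score 1.0.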
Affiliation(s)
- R Han
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore MD, United States
- A Uneri
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore MD, United States
- R C Vijayan
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore MD, United States
- P Wu
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore MD, United States
- P Vagdargi
- Department of Computer Science, The Johns Hopkins University, Baltimore MD, United States
- N Sheth
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore MD, United States
- S Vogt
- Siemens Healthineers, Erlangen, Germany
- G M Osgood
- Department of Orthopaedic Surgery, The Johns Hopkins Hospital, Baltimore MD, United States
- J H Siewerdsen
- Department of Biomedical Engineering, The Johns Hopkins University, Baltimore MD, United States
5.
Vijayan RC, Han R, Wu P, Sheth NM, Ketcha MD, Vagdargi P, Vogt S, Kleinszig G, Osgood GM, Siewerdsen JH, Uneri A. Image-Guided Robotic K-Wire Placement for Orthopaedic Trauma Surgery. Proc SPIE Int Soc Opt Eng 2020; 11315:113151A. PMID: 36082206. PMCID: PMC9450105. DOI: 10.1117/12.2549713.
Abstract
PURPOSE We report the initial development of an image-based solution for robotic assistance of pelvic fracture fixation. The approach uses intraoperative radiographs, a preoperative CT, and an end effector of known design to align the robot with target trajectories in CT. The method extends previous work to solve the robot-to-patient registration from a single radiographic view (without C-arm rotation) and addresses the workflow challenges of integrating robotic assistance in orthopaedic trauma surgery in a form that could be broadly applicable to isocentric or non-isocentric C-arms. METHODS The proposed method uses 3D-2D known-component registration to localize a robot end effector with respect to the patient by: (1) exploiting the extended size and complex features of pelvic anatomy to register the patient; and (2) capturing multiple end effector poses using precise robotic manipulation. These transformations, along with an offline hand-eye calibration of the end effector, are used to calculate target robot poses that align the end effector with planned trajectories in the patient CT. Geometric accuracy of the registrations was evaluated independently for the patient and the robot in phantom studies. RESULTS The translational difference between the ground truth and patient registrations of a pelvis phantom was 1.3 mm using a single (AP) view, compared to 0.4 mm using dual (AP+Lat) views. Registration of the robot in air (i.e., no background anatomy) with five unique end effector poses achieved a mean translational difference of ~1.4 mm for K-wire placement in the pelvis, comparable to tracker-based margins of error (commonly ~2 mm). CONCLUSIONS The proposed approach is feasible based on the accuracy of the patient and robot registrations and is a preliminary step in developing an image-guided robotic guidance system that more naturally fits the workflow of fluoroscopically guided orthopaedic trauma surgery. Future work will involve end-to-end development of the proposed guidance system and assessment of the system with delivery of K-wires in cadaver studies.
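The registration chain in the Methods — patient and end effector each localized in the C-arm frame, then combined to express planned trajectories in robot coordinates — reduces to composing rigid transforms. A minimal sketch with hypothetical 4x4 homogeneous transforms (the frame names are illustrative, not the paper's notation):

```python
import numpy as np

def invert_rigid(T):
    """Invert a 4x4 rigid transform using R.T, avoiding a general inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def plan_point_in_robot(T_cam_patient, T_cam_robot, p_plan):
    """Map a planned trajectory point from patient (CT) coordinates into
    robot coordinates via the shared C-arm frame:
        T_robot_patient = inv(T_cam_robot) @ T_cam_patient
    """
    T_robot_patient = invert_rigid(T_cam_robot) @ T_cam_patient
    return (T_robot_patient @ np.append(p_plan, 1.0))[:3]
```

Because both registrations are expressed in the same imaging frame, no direct robot-to-patient measurement is needed; the chain cancels the C-arm frame out.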
Affiliation(s)
- R. C. Vijayan
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- R. Han
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- P. Wu
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- N. M. Sheth
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- M. D. Ketcha
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- P. Vagdargi
- Department of Computer Science, Johns Hopkins University, Baltimore MD
- S. Vogt
- Siemens Healthineers, Forchheim, Germany
- G. M. Osgood
- Department of Orthopaedic Surgery, Johns Hopkins Medicine, Baltimore MD
- J. H. Siewerdsen
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD
- Department of Computer Science, Johns Hopkins University, Baltimore MD
- A. Uneri
- Department of Biomedical Engineering, Johns Hopkins University, Baltimore MD