1
Bozuyuk U, Wrede P, Yildiz E, Sitti M. Roadmap for Clinical Translation of Mobile Microrobotics. Adv Mater 2024; 36:e2311462. PMID: 38380776. DOI: 10.1002/adma.202311462.
Abstract
Medical microrobotics is an emerging field with the potential to revolutionize clinical diagnostics and therapeutics for a variety of diseases. However, mobile microrobotics must overcome important obstacles before clinical translation is possible. This article focuses on these challenges and provides a roadmap for bringing medical microrobots into clinical use. From the concept of a "magic bullet" to the physicochemical interactions of microrobots with complex biological environments, there are several translational steps to consider. Clinical translation of mobile microrobots is only possible through close collaboration between clinical experts and microrobotics researchers to address the technical challenges in microfabrication, safety, and imaging. Their clinical potential can be realized by designing microrobots that address the main current challenges, such as actuation limitations, material stability, and imaging constraints. The strengths and weaknesses of current progress in the microrobotics field are discussed, and a roadmap for clinical applications in the near future is outlined.
Affiliation(s)
- Ugur Bozuyuk
- Physical Intelligence Department, Max Planck Institute for Intelligent Systems, 70569, Stuttgart, Germany
- Paul Wrede
- Physical Intelligence Department, Max Planck Institute for Intelligent Systems, 70569, Stuttgart, Germany
- Institute for Biomedical Engineering, ETH Zurich, Zurich, 8093, Switzerland
- Erdost Yildiz
- Physical Intelligence Department, Max Planck Institute for Intelligent Systems, 70569, Stuttgart, Germany
- Metin Sitti
- Physical Intelligence Department, Max Planck Institute for Intelligent Systems, 70569, Stuttgart, Germany
- School of Medicine and College of Engineering, Koc University, Istanbul, 34450, Turkey
2
Cruz J, Gonçalves SB, Neves MC, Silva HP, Silva MT. Intraoperative Angle Measurement of Anatomical Structures: A Systematic Review. Sensors (Basel) 2024; 24:1613. PMID: 38475148. PMCID: PMC10934548. DOI: 10.3390/s24051613.
Abstract
Ensuring precise angle measurement during surgical correction of orientation-related deformities is crucial for optimal postoperative outcomes, yet no ideal commercial solution exists. Current measurement sensors and instruments have limitations that make their use context-specific, demanding a methodical evaluation of the field. A systematic review was carried out in March 2023. Studies reporting technologies and validation methods for intraoperative angular measurement of anatomical structures were analyzed. A total of 32 studies were included: 17 focused on image-based technologies (6 fluoroscopy, 4 camera-based tracking, and 7 CT-based), while 15 explored non-image-based technologies (6 manual instruments and 9 inertial sensor-based instruments). Image-based technologies offer better accuracy and 3D capabilities but pose challenges such as additional equipment, increased radiation exposure, time, and cost. Non-image-based technologies are cost-effective but may be influenced by the surgeon's perception and require careful calibration. The choice of technology should therefore take into account the expected measurement error, the type of surgery, and the radiation dose limit. This comprehensive review serves as a valuable guide for surgeons seeking precise intraoperative angle measurements. It not only examines the performance and application of existing technologies but also aids the future development of innovative solutions.
Affiliation(s)
- João Cruz
- IDMEC, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal; (J.C.); (S.B.G.)
- Sérgio B. Gonçalves
- IDMEC, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal; (J.C.); (S.B.G.)
- Hugo Plácido Silva
- IT—Instituto de Telecomunicações, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal;
- Miguel Tavares Silva
- IDMEC, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal; (J.C.); (S.B.G.)
3
Warner SJ, Haase DR, Chip Routt ML, Eastman JG, Achor TS. Use of 3D Fluoroscopy to Assist in the Reduction and Fixation of Pelvic and Acetabular Fractures: A Safety and Quality Case Series. J Orthop Trauma 2023; 37:S1-S6. PMID: 37828694. DOI: 10.1097/BOT.0000000000002686.
Abstract
Multidimensional fluoroscopy has been increasingly used in orthopaedic trauma to improve the intraoperative assessment of articular reductions and implant placement. Owing to the complex osteology of the pelvis, cross-sectional imaging is imperative for accurate evaluation of pelvic ring and acetabular injuries both preoperatively and intraoperatively. The continued development of fluoroscopic technology over the past decade has made intraoperative multidimensional fluoroscopy easier to use in pelvic and acetabular surgery, giving orthopaedic trauma surgeons a valuable tool to better evaluate reduction and fixation at different stages of operative treatment. Specifically, intraoperative 3D fluoroscopy during treatment of acetabulum and pelvis injuries assists with guiding intraoperative decisions, assessing reductions, ensuring implant safety, and confirming appropriate fixation. We outline the useful aspects of this technology in pelvic and acetabular surgery and report its utility with a consecutive case series at a single institution. The added benefits of this technology have improved the ability to effectively manage patients with pelvis and acetabulum injuries.
Affiliation(s)
- Stephen J Warner
- Department of Orthopaedic Surgery, University of Texas Health Science Center at Houston, McGovern Medical School and Memorial Hermann Medical Center, Houston, TX
4
Affiliation(s)
- Guido A Wanner
- Spine Clinic & Traumatology, Private Hospital Bethanien, Swiss Medical Network, Zurich, Switzerland
- Sandro M Heining
- Department of Traumatology, University Hospital Zurich, Switzerland
- Vladislav Raykov
- Department of Orthopedics & Traumatology, Landeskrankenhaus Bludenz/Feldkirch, Austria
5
Ikumi A, Yoshii Y, Iwahashi Y, Sashida S, Shrestha P, Xie C, Kitahara I, Ishii T. Comparison of 3D Bone Position Estimation Using QR Code and Metal Bead Markers. Diagnostics (Basel) 2023; 13:1141. PMID: 36980448. PMCID: PMC10047530. DOI: 10.3390/diagnostics13061141.
Abstract
To improve the accuracy of a 3D bone position estimation system that displays 3D images in response to changes in the position of fluoroscopic images, modified markers using quick response (QR) codes were developed. The aims of this study were to assess the accuracy of the estimated bone position on 3D images with reference to QR code markers on fluoroscopic images and to compare this accuracy with that of metal bead markers. Bone positions were estimated from reference points on a fluoroscopic image and compared with those on a 3D image. The positional relationships of the QR code and metal bead markers on the fluoroscopic image were compared with those on the 3D image to establish whether a 3D image could be drawn by tracking positional changes in radius models. Differences were investigated by comparing the distance between markers on the fluoroscopic image with that on the 3D image projected on the monitor. The error ratio, defined as the difference between the fluoroscopic and 3D-image measurements divided by the fluoroscopic measurement, was compared between the QR code and metal bead markers. Error ratios for the QR code markers were 5.0 ± 2.0%, 6.4 ± 7.6%, and 1.0 ± 0.8% in the anterior–posterior view, ulnar-side lateral view, and posterior–anterior view, respectively. Error ratios for the metal bead markers were 1.3 ± 1.7%, 13.8 ± 14.5%, and 4.7 ± 5.7% in the same views. The error ratio for the metal bead markers was smaller in the initial position (p < 0.01), whereas the error ratios for the QR code markers were smaller in the lateral and posterior–anterior positions (p < 0.05). QR code marker tracking succeeded even with discontinuous images, and the accuracy of 3D bone position estimation was increased by using the QR code marker system. QR code marker tracking facilitates real-time comparison of dynamic changes between preoperative 3D and intraoperative fluoroscopic images.
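The error-ratio metric in this abstract is straightforward to compute; a minimal sketch in Python, using hypothetical inter-marker distances rather than values from the study:

```python
def error_ratio(fluoro_mm: float, model_mm: float) -> float:
    """Error ratio as defined in the abstract: the difference between the
    fluoroscopic and 3D-image measurements divided by the fluoroscopic one."""
    return abs(fluoro_mm - model_mm) / fluoro_mm

# Hypothetical inter-marker distances (mm) measured on each image.
fluoro = 52.0   # distance between two markers on the fluoroscopic image
model = 54.6    # same distance projected from the 3D bone model

print(f"error ratio: {error_ratio(fluoro, model):.1%}")  # error ratio: 5.0%
```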
Affiliation(s)
- Akira Ikumi
- Department of Orthopaedic Surgery, Tsukuba University Hospital, Tsukuba 305-8576, Japan
- Yuichi Yoshii
- Department of Orthopaedic Surgery, Tokyo Medical University Ibaraki Medical Center, Ami 300-0395, Japan
- Correspondence: ; Tel.: +81-29-887-1161
- Pragyan Shrestha
- Center for Computational Sciences, Tsukuba University, Tsukuba 305-8577, Japan
- Chun Xie
- Center for Computational Sciences, Tsukuba University, Tsukuba 305-8577, Japan
- Itaru Kitahara
- Center for Computational Sciences, Tsukuba University, Tsukuba 305-8577, Japan
- Tomoo Ishii
- Department of Orthopaedic Surgery, Tokyo Medical University Ibaraki Medical Center, Ami 300-0395, Japan
6
3D Reconstruction of Wrist Bones from C-Arm Fluoroscopy Using Planar Markers. Diagnostics (Basel) 2023; 13:330. PMID: 36673139. PMCID: PMC9858297. DOI: 10.3390/diagnostics13020330.
Abstract
In orthopedic surgeries such as osteotomy and osteosynthesis, an intraoperative 3D reconstruction of the bone would enable surgeons to quickly assess the fracture reduction procedure against the preoperative plan. Scanners equipped with such functionality are often more expensive than a conventional C-arm fluoroscopy device, which is commonly available in many orthopedic facilities. Based on the widespread availability of such equipment, this paper proposes a method to reconstruct the 3D structure of bone with a conventional C-arm fluoroscopy device. We focus on wrist bones as the reconstruction target, as this facilitates a flexible imaging scheme. Planar markers attached to the target object are tracked in the fluoroscopic image for C-arm pose estimation, and the initial calibration of the device is performed with a checkerboard pattern. Because reconstruction algorithms are generally sensitive to geometric calibration errors, a simulation study was conducted to assess the practicality of the method, demonstrating the effect of checkerboard thickness and spherical marker size on reconstruction quality.
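The planar-marker pose estimation this abstract describes is commonly posed as recovering a homography between the marker plane and the image, then factoring out the camera (here, C-arm) pose using the calibrated intrinsics. A minimal NumPy sketch under a pinhole model, with hypothetical marker geometry, intrinsics, and a synthetic pose (the paper's own pipeline may differ):

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct linear transform: homography mapping src (Nx2) onto dst (Nx2)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)          # null-space vector, up to scale
    return H / H[2, 2]

def pose_from_homography(H, K):
    """Recover marker pose (R, t) from a planar homography and intrinsics K."""
    M = np.linalg.inv(K) @ H          # proportional to [r1 r2 t]
    lam = 1.0 / np.linalg.norm(M[:, 0])
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)       # re-orthonormalize the rotation
    return U @ Vt, t

# Hypothetical 40 mm square marker (z = 0) and intrinsics from calibration.
corners = np.array([[0.0, 0.0], [40.0, 0.0], [40.0, 40.0], [0.0, 40.0]])
K = np.array([[1200.0, 0.0, 512.0], [0.0, 1200.0, 512.0], [0.0, 0.0, 1.0]])

# Synthesize detections by projecting the marker from a known pose...
R_true, t_true = np.eye(3), np.array([10.0, -5.0, 500.0])
pts3d = np.column_stack([corners, np.zeros(4)])
proj = (K @ (R_true @ pts3d.T + t_true[:, None])).T
pixels = proj[:, :2] / proj[:, 2:]

# ...then recover the pose from the 2D-3D correspondences alone.
H = homography_dlt(corners, pixels)
R_est, t_est = pose_from_homography(H, K)
print(np.allclose(t_est, t_true, atol=1e-3))  # True
```

In practice one would use a library routine (e.g. a PnP solver) with point normalization and distortion handling, but the factorization above is the core geometric idea.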
7
Killeen BD, Winter J, Gu W, Martin-Gomez A, Taylor RH, Osgood G, Unberath M. Mixed Reality Interfaces for Achieving Desired Views with Robotic X-ray Systems. Comput Methods Biomech Biomed Eng Imaging Vis 2022; 11:1130-1135. PMID: 37555199. PMCID: PMC10406465. DOI: 10.1080/21681163.2022.2154272.
Abstract
Robotic X-ray C-arm imaging systems can precisely achieve any position and orientation relative to the patient. Informing the system exactly which pose corresponds to a desired view, however, is challenging. Currently, these systems are operated by the surgeon using joysticks, but this interaction paradigm is not necessarily effective because users may be unable to efficiently actuate more than a single axis of the system simultaneously. Moreover, novel robotic imaging systems, such as the Brainlab Loop-X, allow for independent source and detector movements, adding even more complexity. To address this challenge, we consider complementary interfaces through which the surgeon can command robotic X-ray systems effectively. Specifically, we consider three interaction paradigms: (1) the use of a pointer to specify the principal ray of the desired view relative to the anatomy; (2) the same pointer, combined with a mixed reality environment to synchronously render digitally reconstructed radiographs from the tool's pose; and (3) the same mixed reality environment but with a virtual X-ray source instead of the pointer. Initial human-in-the-loop evaluation with an attending trauma surgeon indicates that mixed reality interfaces for robotic X-ray system control are promising and may substantially reduce the number of X-ray images acquired solely during "fluoro hunting" for the desired view or standard plane.
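The second and third paradigms in this abstract depend on rendering digitally reconstructed radiographs (DRRs) from a tracked pose. In its simplest form, a DRR is a line integral of attenuation through a CT volume followed by Beer–Lambert attenuation; a toy parallel-beam sketch (the volume and attenuation values are hypothetical, and a real renderer casts diverging rays from the tracked source pose):

```python
import numpy as np

# Toy CT volume: a dense block standing in for bone inside soft tissue.
volume = np.full((64, 64, 64), 0.02)   # attenuation per voxel (hypothetical)
volume[24:40, 24:40, 24:40] = 0.5      # 'bone' region

def drr_parallel(vol, axis=0):
    """Digitally reconstructed radiograph under a parallel-beam model:
    integrate attenuation along the viewing axis, then apply Beer-Lambert."""
    line_integral = vol.sum(axis=axis)
    return np.exp(-line_integral)      # transmitted intensity in (0, 1]

image = drr_parallel(volume, axis=0)
print(image.shape)                     # (64, 64)
print(image.min() < image.max())       # the 'bone' casts a darker shadow: True
```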
Affiliation(s)
- Benjamin D Killeen
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Jonas Winter
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Wenhao Gu
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Alejandro Martin-Gomez
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Russell H Taylor
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
- Greg Osgood
- Department of Orthopaedic Surgery, Johns Hopkins Hospital, Baltimore, MD, USA
- Mathias Unberath
- Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA