1
Berhouet J, Samargandi R. Emerging Innovations in Preoperative Planning and Motion Analysis in Orthopedic Surgery. Diagnostics (Basel) 2024; 14:1321. PMID: 39001212; PMCID: PMC11240316; DOI: 10.3390/diagnostics14131321.
Abstract
In recent years, preoperative planning has undergone significant advancements, with a dual focus: improving the accuracy of implant placement and enhancing the prediction of functional outcomes. These breakthroughs have been made possible through the development of advanced processing methods for 3D preoperative images. These methods not only offer novel visualization techniques but can also be seamlessly integrated into computer-aided design models. Additionally, the refinement of motion capture systems has played a pivotal role in this progress. These "markerless" systems are more straightforward to implement and facilitate easier data analysis. Simultaneously, the emergence of machine learning algorithms, utilizing artificial intelligence, has enabled the amalgamation of anatomical and functional data, leading to highly personalized preoperative plans for patients. The shift in preoperative planning from 2D towards 3D, from static to dynamic, is closely linked to technological advances, which will be described in this instructional review. Finally, the concept of 4D planning, encompassing periarticular soft tissues, will be introduced as a forward-looking development in the field of orthopedic surgery.
Affiliation(s)
- Julien Berhouet
- Service de Chirurgie Orthopédique et Traumatologique, Centre Hospitalier Régional Universitaire (CHRU) de Tours, 1C Avenue de la République, 37170 Chambray-les-Tours, France
- Equipe Reconnaissance de Forme et Analyse de l'Image, Laboratoire d'Informatique Fondamentale et Appliquée de Tours EA6300, Ecole d'Ingénieurs Polytechnique Universitaire de Tours, Université de Tours, 64 Avenue Portalis, 37200 Tours, France
- Ramy Samargandi
- Service de Chirurgie Orthopédique et Traumatologique, Centre Hospitalier Régional Universitaire (CHRU) de Tours, 1C Avenue de la République, 37170 Chambray-les-Tours, France
- Department of Orthopedic Surgery, Faculty of Medicine, University of Jeddah, Jeddah 23218, Saudi Arabia
2
Benda V, Kubicek J, Madeja R, Oczka D, Cerny M, Dostalova K. Design of Proposed Software System for Prediction of Iliosacral Screw Placement for Iliosacral Joint Injuries Based on X-ray and CT Images. J Clin Med 2023; 12:2138. PMID: 36983141; PMCID: PMC10054889; DOI: 10.3390/jcm12062138.
Abstract
One of the crucial tasks in planning surgery of the iliosacral joint is placing an iliosacral screw to fix broken parts of the pelvis. The proper screw trajectory is usually tracked in the preoperative phase by acquiring X-ray images under different angles, which guide the surgeons during surgery. This approach is complicated by the fact that 2D X-ray images do not convey spatial perspective. Therefore, in this pilot study, we propose a complex software tool aimed at building a simulation model of reconstructed CT (DDR) images with a virtual iliosacral screw to guide the surgical process. This pilot study presents testing on two clinical cases to reveal the initial performance and usability of the software in clinical conditions. The model is subsequently used for multiregional registration against reference intraoperative X-ray images to select the slice from the 3D dataset that best fits the reference X-ray. The proposed software solution uses input CT slices of the pelvic area to create a segmentation model of the individual bone components. A model of an iliosacral screw is then inserted into this model. In the next step, we propose the software CT2DDR, which generates DDR projections with the iliosacral screw. In the last step, we propose a multimodal registration procedure that registers a selected number of slices with the reference X-ray and, based on the Structural Similarity Index (SSIM) and an index of correlation, finds the best match of the DDR with the X-ray images. In this pilot study, we also provide a comparative analysis of the computational costs of the multimodal registration for various numbers of DDR slices to show the overall software performance. The proposed model has versatile uses in modeling and surgical planning for fractures of the iliosacral joints in the pelvic area.
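The slice-selection step this abstract describes, scoring each DDR projection against the reference X-ray and keeping the best match, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `ssim` here is the standard single-window SSIM formula, and `best_matching_slice` is a hypothetical helper name.

```python
import numpy as np

def ssim(x: np.ndarray, y: np.ndarray, data_range: float = 255.0) -> float:
    """Global Structural Similarity Index between two equally sized images."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()  # covariance of the two images
    return float(((2 * mx * my + c1) * (2 * cov + c2))
                 / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

def best_matching_slice(drr_stack: np.ndarray, xray: np.ndarray) -> int:
    """Return the index of the projection slice most similar to the reference X-ray."""
    scores = [ssim(s.astype(float), xray.astype(float)) for s in drr_stack]
    return int(np.argmax(scores))
```

In practice, a windowed SSIM such as `skimage.metrics.structural_similarity`, combined with a correlation index as the abstract describes, would score each candidate slice.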
Affiliation(s)
- Vojtech Benda
- Department of Cybernetics and Biomedical Engineering, VŠB-Technical University of Ostrava, 17. listopadu 2172/15, Poruba, 708 00 Ostrava, Czech Republic
- Jan Kubicek
- Department of Cybernetics and Biomedical Engineering, VŠB-Technical University of Ostrava, 17. listopadu 2172/15, Poruba, 708 00 Ostrava, Czech Republic
- Roman Madeja
- Trauma Center, University Hospital Ostrava, 17. listopadu 1790, Poruba, 708 52 Ostrava, Czech Republic
- David Oczka
- Department of Cybernetics and Biomedical Engineering, VŠB-Technical University of Ostrava, 17. listopadu 2172/15, Poruba, 708 00 Ostrava, Czech Republic
- Martin Cerny
- Department of Cybernetics and Biomedical Engineering, VŠB-Technical University of Ostrava, 17. listopadu 2172/15, Poruba, 708 00 Ostrava, Czech Republic
- Kamila Dostalova
- Department of Cybernetics and Biomedical Engineering, VŠB-Technical University of Ostrava, 17. listopadu 2172/15, Poruba, 708 00 Ostrava, Czech Republic
3
Kordon F, Maier A, Swartman B, Privalov M, El Barbari JS, Kunze H. Multi-Stage Platform for (Semi-)Automatic Planning in Reconstructive Orthopedic Surgery. J Imaging 2022; 8:108. PMID: 35448235; PMCID: PMC9027971; DOI: 10.3390/jimaging8040108.
Abstract
Intricate lesions of the musculoskeletal system require reconstructive orthopedic surgery to restore the correct biomechanics. Careful pre-operative planning of the surgical steps on 2D image data is an essential tool to increase the precision and safety of these operations. However, the plan's effectiveness in the intra-operative workflow is challenged by unpredictable patient and device positioning and complex registration protocols. Here, we develop and analyze a multi-stage algorithm that combines deep learning-based anatomical feature detection and geometric post-processing to enable accurate pre- and intra-operative surgery planning on 2D X-ray images. The algorithm allows granular control over each element of the planning geometry, enabling real-time adjustments directly in the operating room (OR). In the evaluation of the method on three ligament reconstruction tasks on the knee joint, we found high spatial precision in drilling point localization (ε < 2.9 mm) and low angulation errors for k-wire instrumentation (ε < 0.75°) on 38 diagnostic radiographs. Comparable precision was demonstrated in 15 complex intra-operative trauma cases with strong implant overlap and multi-anatomy exposure. Furthermore, we found that the diverse feature detection tasks can be efficiently solved with a multi-task network topology, improving precision over the single-task case. Our platform will help overcome the limitations of current clinical practice and foster surgical plan generation and adjustment directly in the OR, ultimately motivating the development of novel 2D planning guidelines.
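The two accuracy metrics this abstract quotes, drilling-point localization error and k-wire angulation error, can be illustrated with a short sketch. The function names and the millimeter-per-pixel scaling are assumptions for illustration, not the paper's code:

```python
import numpy as np

def localization_error(pred: np.ndarray, ref: np.ndarray, mm_per_px: float) -> float:
    """Euclidean distance (mm) between predicted and reference drilling points
    given in pixel coordinates."""
    return float(np.linalg.norm(pred - ref) * mm_per_px)

def angulation_error(axis_a: np.ndarray, axis_b: np.ndarray) -> float:
    """Unsigned angle (degrees) between two k-wire axes, orientation-invariant."""
    ua = axis_a / np.linalg.norm(axis_a)
    ub = axis_b / np.linalg.norm(axis_b)
    cos = abs(float(np.dot(ua, ub)))  # abs(): an axis has no preferred direction
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
```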
Affiliation(s)
- Florian Kordon
- Pattern Recognition Lab, Friedrich-Alexander University Erlangen-Nuremberg, 91058 Erlangen, Germany
- Erlangen Graduate School in Advanced Optical Technologies (SAOT), Friedrich-Alexander University Erlangen-Nuremberg, 91052 Erlangen, Germany
- Advanced Therapies, Siemens Healthcare GmbH, 91031 Forchheim, Germany
- Andreas Maier
- Pattern Recognition Lab, Friedrich-Alexander University Erlangen-Nuremberg, 91058 Erlangen, Germany
- Erlangen Graduate School in Advanced Optical Technologies (SAOT), Friedrich-Alexander University Erlangen-Nuremberg, 91052 Erlangen, Germany
- Benedict Swartman
- Department for Trauma and Orthopaedic Surgery, BG Trauma Center Ludwigshafen, 67071 Ludwigshafen, Germany
- Maxim Privalov
- Department for Trauma and Orthopaedic Surgery, BG Trauma Center Ludwigshafen, 67071 Ludwigshafen, Germany
- Jan Siad El Barbari
- Department for Trauma and Orthopaedic Surgery, BG Trauma Center Ludwigshafen, 67071 Ludwigshafen, Germany
- Holger Kunze
- Pattern Recognition Lab, Friedrich-Alexander University Erlangen-Nuremberg, 91058 Erlangen, Germany
- Advanced Therapies, Siemens Healthcare GmbH, 91031 Forchheim, Germany
4
Smart sensor implant technology in total knee arthroplasty. J Clin Orthop Trauma 2021; 22:101605. PMID: 34631412; PMCID: PMC8479248; DOI: 10.1016/j.jcot.2021.101605.
Abstract
Innovations in computer technology and implant design have paved the way for the development of smart instruments and intelligent implants in trauma and orthopaedics to improve patient-related functional outcomes. Sensor technology uses embedded devices that detect physical, chemical and biological signals and provide a way for these signals to be measured and recorded. Sensor technology applications have been introduced in various fields of medicine for the diagnosis, treatment and monitoring of diseases. Intelligent 'smart' implants are devices that provide diagnostic capabilities along with therapeutic benefits. In trauma and orthopaedics, applications of sensors are increasing because of advances in microchip technologies for implant devices and research designs. They offer real-time monitoring from the signals transmitted by the embedded sensors and thus provide early management solutions. Smart orthopaedic implants have applications in total knee arthroplasty, hip arthroplasty, spine surgery, fracture healing, early detection of infection, and detection of implant loosening. Here we explore the role of smart sensor implant technology in total knee arthroplasty. Smart sensor assistance can be used intraoperatively to provide an objective assessment of ligament and soft-tissue balancing while maintaining sagittal and coronal alignment to achieve the desired kinematic targets following total knee arthroplasty. It can also provide post-implantation data to monitor implant performance under natural conditions and the patient's clinical recovery during rehabilitation. The use of smart sensor implant technology in total knee arthroplasty appears to provide superior patient satisfaction rates and improved functional outcomes.
5
Popescu D, Marinescu R, Laptoiu D, Deac GC, Cotet CE. DICOM 3D viewers, virtual reality or 3D printing - a pilot usability study for assessing the preference of orthopedic surgeons. Proc Inst Mech Eng H 2021; 235:1014-1024. PMID: 34176364; DOI: 10.1177/09544119211020148.
Abstract
As standard practice in orthopedic surgery, the information gathered by analyzing 2D Computed Tomography (CT) images is used for patient diagnosis and surgery planning. Lately, these virtual slices serve as the input for generating 3D virtual models in DICOM viewers, facilitating spatial orientation and diagnosis. Virtual Reality (VR) and 3D printing (3DP) technologies have also been reported for use in anatomy visualization, medical training, and diagnosis. However, it has not yet been investigated whether surgeons consider that the advantages offered by 3DP and VR outweigh their development efforts. Moreover, no comparative evaluation for understanding surgeons' preferences in using these investigation tools has been performed so far. Therefore, in this paper, a pilot usability test was conducted to collect surgeons' opinions. 3D models of the knee, hip and foot were displayed in a DICOM 3D viewer, in two VR environments, and as 3D-printed replicas. The adequacy of these tools for diagnosis was comparatively assessed in three case scenarios; the time to complete the diagnosis tasks was recorded and questionnaires were filled in. The time needed to prepare the models for VR and 3DP, the resources required, and the associated costs were presented in order to give surgeons the whole context. Results showed a preference for using a desktop DICOM viewer with 3D capabilities along with the information provided by a Unity-based VR solution for visualizing the virtual model from angles that are challenging to analyze on a computer screen. 3D-printed replicas were considered more useful for physically simulating the surgery than for diagnosis. For the VR and 3DP models, the lack of information on bone quality was considered an important drawback. The following order of using the tools was preferred: DICOM viewer, followed by Unity VR and 3DP.
Affiliation(s)
- Diana Popescu
- Department of Robotics and Production Systems, University Politehnica of Bucharest, Bucharest, Romania
- Rodica Marinescu
- University of Medicine and Pharmacy Carol Davila Bucharest, Bucharest, Romania
- Dan Laptoiu
- Department of Orthopedics, Colentina Clinical Hospital, Bucharest, Romania
- Gicu Calin Deac
- Department of Robotics and Production Systems, University Politehnica of Bucharest, Bucharest, Romania
- Costel Emil Cotet
- Department of Robotics and Production Systems, University Politehnica of Bucharest, Bucharest, Romania
6
Li Q, Du Z, Yu H. Grinding trajectory generator in robot-assisted laminectomy surgery. Int J Comput Assist Radiol Surg 2021; 16:485-494. PMID: 33507483; DOI: 10.1007/s11548-021-02316-1.
Abstract
PURPOSE: Grinding trajectory planning for robot-assisted laminectomy is a complicated and cumbersome task. The purpose of this research is to automatically obtain the surgical target area from CT images and, on this basis, formulate a reasonable robotic grinding trajectory.
METHODS: We propose a deep neural network for laminae positioning, a trajectory generation strategy, and a grinding speed adjustment strategy. These algorithms obtain surgical information from CT images and automatically complete grinding trajectory planning.
RESULTS: The proposed laminae positioning network reaches a recognition accuracy of 95.7%, and the positioning error is only 1.12 mm in the desired direction. Simulated surgical planning on a public dataset achieved the expected results. In a set of comparative robotic grinding experiments, runs using the speed adjustment algorithm obtained a smoother grinding force.
CONCLUSION: Our work automatically and precisely extracts laminar centers from CT images to formulate a reasonable surgical trajectory plan. It simplifies the surgical planning process and reduces the time surgeons need to perform such a cumbersome operation manually.
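The abstract does not detail the speed adjustment strategy, so as a hedged illustration only: a simple force-capped feed-rate law of the kind that yields a smoother grinding force might look like the sketch below. The function name, the proportional form, and the floor of 10% nominal speed are all assumptions, not the paper's algorithm.

```python
def adjust_feed_rate(v_nominal: float, force: float, f_target: float, k: float = 0.5) -> float:
    """Reduce the grinding feed rate proportionally when the measured force
    exceeds the target, keeping the force profile near f_target and smoother.
    Speed is never reduced below 10% of nominal so the tool keeps advancing."""
    if force <= f_target:
        return v_nominal  # force within budget: grind at full nominal speed
    scale = max(0.1, 1.0 - k * (force - f_target) / f_target)
    return v_nominal * scale
```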
Affiliation(s)
- Qian Li
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China
- Zhijiang Du
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China
- Hongjian Yu
- State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, China
7
Zhao JX, Su XY, Zhao Z, Xiao RX, Zhang LC, Tang PF. Radiographic assessment of the cup orientation after total hip arthroplasty: a literature review. Ann Transl Med 2020; 8:130. PMID: 32175423; DOI: 10.21037/atm.2019.12.150.
Abstract
Optimal acetabular cup orientation is of substantial importance to good long-term function and low complication rates after total hip arthroplasty (THA). The radiographic anteversion (RA) and inclination (RI) angles of the cup are typically studied due to the practicability, simplicity, and ease of interpretation of their measurements. A great number of methods have been developed to date, most of which are performed on pelvic or hip anteroposterior radiographs. However, there are primarily two influencing factors for these methods: X-ray offset and pelvic rotation. In addition, there are three types of pelvic rotation, about the transverse, longitudinal, and anteroposterior axes of the body; their effects on the RA and RI angles of the cup are interactively correlated with the position and true orientation of the cup. To date, various fitted or analytical models have been established to disclose the correlations between the X-ray offset and pelvic rotation and the RA and RI angles of the cup, but most of these models do not incorporate all the potential influencing parameters. Advanced methods for correcting X-ray offset and pelvic rotation are mainly performed on a single pelvic AP radiograph, on two synchronized radiographs, or with a two-dimensional/three-dimensional (2D-3D) registration system. Some measurement systems, originally developed for evaluating implant migration or wear, could also be used to correct the X-ray offset and pelvic rotation simultaneously, but some drawbacks still exist with these systems. Above all, the 2D-3D registration technique might be an alternative and powerful tool for accurately measuring cup orientation. In addition to the current methods used for postoperative assessment, navigation systems and augmented reality are also used for preoperative planning and intraoperative guidance of cup placement. With the continuing development of artificial intelligence and machine learning, these techniques could be incorporated into robot-assisted orthopaedic surgery in the future.
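As a concrete example of one such radiographic measurement, the classic ellipse-based relation RA = asin(S/L) recovers the cup's radiographic anteversion from the short (S) and long (L) axes of the cup rim's projected ellipse on an AP radiograph. The sketch below is only an illustration of that single relation, not one of the review's correction models, and the function name is an assumption:

```python
import math

def radiographic_anteversion(short_axis: float, long_axis: float) -> float:
    """Cup radiographic anteversion (degrees) from the projected rim ellipse
    on an AP radiograph, via RA = asin(S / L). A face-on cup projects a
    circle (S == L, RA = 90 deg); an edge-on cup projects a line (S = 0)."""
    if not 0.0 <= short_axis <= long_axis:
        raise ValueError("expected 0 <= short_axis <= long_axis")
    return math.degrees(math.asin(short_axis / long_axis))
```

Note that this relation alone is subject to exactly the confounders the review discusses: X-ray offset and pelvic rotation both distort the projected ellipse, which is why the correction models and 2D-3D registration methods exist.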
Affiliation(s)
- Jing-Xin Zhao
- Department of Orthopaedics, the First Medical Centre, Chinese PLA General Hospital, Beijing 100853, China
- National Clinical Research Center for Orthopedics, Sports Medicine & Rehabilitation, Beijing 100853, China
- Xiu-Yun Su
- Department of Orthopaedics, the First Medical Centre, Chinese PLA General Hospital, Beijing 100853, China
- Intelligent and Digital Surgery Innovation Center, Southern University of Science and Technology Hospital, Shenzhen, Guangdong 518055, China
- Zhe Zhao
- Department of Orthopaedics, Beijing Tsinghua Changgung Hospital, School of Clinical Medicine, Tsinghua University, Beijing 102218, China
- Ruo-Xiu Xiao
- School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China
- Li-Cheng Zhang
- Department of Orthopaedics, the First Medical Centre, Chinese PLA General Hospital, Beijing 100853, China
- National Clinical Research Center for Orthopedics, Sports Medicine & Rehabilitation, Beijing 100853, China
- Pei-Fu Tang
- Department of Orthopaedics, the First Medical Centre, Chinese PLA General Hospital, Beijing 100853, China
- National Clinical Research Center for Orthopedics, Sports Medicine & Rehabilitation, Beijing 100853, China