1.
Sultan LR, Haertter A, Al-Hasani M, Demiris G, Cary TW, Tung-Chen Y, Sehgal CM. Can Artificial Intelligence Aid Diagnosis by Teleguided Point-of-Care Ultrasound? A Pilot Study for Evaluating a Novel Computer Algorithm for COVID-19 Diagnosis Using Lung Ultrasound. AI 2023;4:875-887. [PMID: 37929255] [PMCID: PMC10623579] [DOI: 10.3390/ai4040044]
Abstract
With the 2019 coronavirus disease (COVID-19) pandemic, demand is growing for remote monitoring technologies that reduce patient and provider exposure. One field with growing potential is teleguided ultrasound, in which telemedicine and point-of-care ultrasound (POCUS) merge into a new scope of practice. Teleguided POCUS can minimize staff exposure while preserving patient safety and oversight during bedside procedures. In this paper, we propose the use of teleguided POCUS supported by AI technologies for the remote monitoring of COVID-19 patients by inexperienced personnel, including self-monitoring by the patients themselves. Our hypothesis is that AI technologies can facilitate the remote monitoring of COVID-19 patients through POCUS devices, even when these are operated by individuals without formal medical training. In pursuit of this goal, we performed a pilot analysis evaluating the performance of users with different clinical backgrounds using a computer-based system for COVID-19 detection on lung ultrasound. The purpose of the analysis was to highlight the potential of the proposed AI technology for improving diagnostic performance, especially for less experienced users.
Affiliation(s)
- Laith R. Sultan
- Department of Radiology, Children’s Hospital of Philadelphia, Philadelphia, PA 19104, USA
- Allison Haertter
- Radiation Oncology Department, University of Pennsylvania, Philadelphia, PA 19104, USA
- Maryam Al-Hasani
- Ultrasound Research Lab, Department of Radiology, University of Pennsylvania, Philadelphia, PA 19103, USA
- George Demiris
- Informatics Division of the Department of Biostatistics, Epidemiology and Informatics, University of Pennsylvania, Philadelphia, PA 19104, USA
- Theodore W. Cary
- Ultrasound Research Lab, Department of Radiology, University of Pennsylvania, Philadelphia, PA 19103, USA
- Yale Tung-Chen
- Emergency Medicine Department, La Madrida Hospital, 28006 Madrid, Spain
- Chandra M. Sehgal
- Ultrasound Research Lab, Department of Radiology, University of Pennsylvania, Philadelphia, PA 19103, USA
2.
Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023;23:6202. [PMID: 37448050] [DOI: 10.3390/s23136202]
Abstract
Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), most devices remain focused on improving end-effector dexterity and precision, as well as on improving access to minimally invasive surgeries. This paper provides a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions with increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while proposing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.
Affiliation(s)
- Jenna Seetohul
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- Mahmood Shafiee
- Mechanical Engineering Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
- School of Mechanical Engineering Sciences, University of Surrey, Guildford GU2 7XH, UK
- Konstantinos Sirlantzis
- School of Engineering, Technology and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
- Intelligent Interactions Group, School of Engineering, University of Kent, Canterbury CT2 7NT, UK
3.
Brockmeyer P, Wiechens B, Schliephake H. The Role of Augmented Reality in the Advancement of Minimally Invasive Surgery Procedures: A Scoping Review. Bioengineering (Basel) 2023;10:501. [PMID: 37106688] [PMCID: PMC10136262] [DOI: 10.3390/bioengineering10040501]
Abstract
The purpose of this review was to analyze the evidence on the role of augmented reality (AR) in the improvement of minimally invasive surgical (MIS) procedures. A scoping literature search of the PubMed and ScienceDirect databases was performed to identify articles published in the last five years that addressed the direct impact of AR technology on MIS procedures, or that addressed an area of education or clinical care with potential use for MIS development. A total of 359 studies were screened; 31 articles were reviewed in depth and categorized into three main groups: navigation, education and training, and user-environment interfaces. A comparison of studies within the different application groups showed that AR technology can be useful in various disciplines to advance the development of MIS. Although AR-guided navigation systems do not yet offer a precision advantage, benefits include improved ergonomics and visualization, as well as reduced surgical time and blood loss. Benefits can also be seen in improved education and training conditions and in improved user-environment interfaces that can indirectly influence MIS procedures. However, technical challenges remain that need to be addressed to demonstrate added value to patient care, and these should be evaluated in clinical trials with sufficient patient numbers, or in systematic reviews or meta-analyses.
Affiliation(s)
- Phillipp Brockmeyer
- Department of Oral and Maxillofacial Surgery, University Medical Center Goettingen, D-37075 Goettingen, Germany
- Bernhard Wiechens
- Department of Orthodontics, University Medical Center Goettingen, D-37075 Goettingen, Germany
- Henning Schliephake
- Department of Oral and Maxillofacial Surgery, University Medical Center Goettingen, D-37075 Goettingen, Germany
4.
Avrumova F, Lebl DR. Augmented reality for minimally invasive spinal surgery. Front Surg 2023;9:1086988. [PMID: 36776471] [PMCID: PMC9914175] [DOI: 10.3389/fsurg.2022.1086988]
Abstract
Background: Augmented reality (AR) is an emerging technology that can overlay computer graphics onto the real world and enhance visual feedback from information systems. Within the past several decades, innovations related to AR have been integrated into our daily lives; however, its application in medicine, specifically in minimally invasive spine surgery (MISS), may be most important to understand. AR navigation also provides auditory and haptic feedback, which can further enhance surgeons' capabilities and improve safety.
Purpose: The purpose of this article is to address previous and current applications of AR, AR in MISS, limitations of today's technology, and future areas of innovation.
Methods: A literature review of AR technology applications across previous and current generations was conducted.
Results: AR systems have been implemented for spinal surgery in recent years, and AR may be an alternative to current approaches such as traditional navigation, robotically assisted navigation, fluoroscopic guidance, and free-hand techniques. Because AR can project patient anatomy directly onto the surgical field, it can eliminate concerns about surgeon attention shifting from the surgical field to remote navigation screens, line-of-sight interruption, and cumulative radiation exposure as the demand for MISS increases.
Conclusion: AR is a novel technology that can improve spinal surgery, and addressing its current limitations will likely have a great impact on future technology.
5.
Sun X, Zou Y, Wang S, Su H, Guan B. A parallel network utilizing local features and global representations for segmentation of surgical instruments. Int J Comput Assist Radiol Surg 2022;17:1903-1913. [PMID: 35680692] [DOI: 10.1007/s11548-022-02687-z]
Abstract
PURPOSE: Automatic image segmentation of surgical instruments is a fundamental task in robot-assisted minimally invasive surgery, which greatly improves the context awareness of surgeons during the operation. A novel method based on Mask R-CNN is proposed in this paper to realize accurate instance segmentation of surgical instruments.
METHODS: A novel feature extraction backbone is built, which extracts both local features through a convolutional neural network branch and global representations through a Swin-Transformer branch. Moreover, skip fusions are applied in the backbone to fuse both feature types and improve the generalization ability of the network.
RESULTS: The proposed method is evaluated on the dataset of the MICCAI 2017 EndoVis Challenge with three segmentation tasks and shows state-of-the-art performance, with an mIoU of 0.5873 in type segmentation and 0.7408 in part segmentation. Furthermore, ablation studies show that the proposed backbone contributes at least a 17% improvement in mIoU.
CONCLUSION: The promising results demonstrate that our method can effectively extract global representations as well as local features in the segmentation of surgical instruments and improve segmentation accuracy. With the proposed backbone, the network can segment the contours of surgical instruments' end tips more precisely. This method can provide more accurate data for localization and pose estimation of surgical instruments, and make a further contribution to the automation of robot-assisted minimally invasive surgery.
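The mIoU figures quoted above are means of per-class intersection-over-union scores. A minimal sketch of the metric, using toy label maps and plain NumPy (an illustration only, not the authors' implementation):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes.

    pred, target: integer label maps of the same shape.
    Classes absent from both prediction and ground truth are skipped.
    """
    ious = []
    for c in range(num_classes):
        p = pred == c
        t = target == c
        union = np.logical_or(p, t).sum()
        if union == 0:  # class appears nowhere: excluded from the mean
            continue
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy 1-D "masks": 0 = background, 1 = instrument shaft, 2 = tip
target = np.array([0, 0, 1, 1, 2, 2])
pred   = np.array([0, 1, 1, 1, 2, 0])
print(round(mean_iou(pred, target, num_classes=3), 4))  # → 0.5
```

Per-class IoUs here are 1/3, 2/3, and 1/2, averaging to 0.5; real evaluations apply the same computation per frame over 2-D masks.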
Affiliation(s)
- Xinan Sun
- Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, 135 Yaguan Road, Tianjin, 300350, China
- School of Mechanical Engineering, Tianjin University, 135 Yaguan Road, Jinnan District, Tianjin, 300350, China
- Yuelin Zou
- Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, 135 Yaguan Road, Tianjin, 300350, China
- School of Mechanical Engineering, Tianjin University, 135 Yaguan Road, Jinnan District, Tianjin, 300350, China
- Shuxin Wang
- Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, 135 Yaguan Road, Tianjin, 300350, China
- School of Mechanical Engineering, Tianjin University, 135 Yaguan Road, Jinnan District, Tianjin, 300350, China
- He Su
- Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, 135 Yaguan Road, Tianjin, 300350, China
- School of Mechanical Engineering, Tianjin University, 135 Yaguan Road, Jinnan District, Tianjin, 300350, China
- Bo Guan
- Key Laboratory of Mechanism Theory and Equipment Design of Ministry of Education, Tianjin University, 135 Yaguan Road, Tianjin, 300350, China
- School of Mechanical Engineering, Tianjin University, 135 Yaguan Road, Jinnan District, Tianjin, 300350, China
6.
Youn JK, Lee D, Ko D, Yeom I, Joo HJ, Kim HC, Kong HJ, Kim HY. Augmented Reality-Based Visual Cue for Guiding Central Catheter Insertion in Pediatric Oncologic Patients. World J Surg 2022;46:942-948. [PMID: 35006323] [DOI: 10.1007/s00268-021-06425-5]
Abstract
BACKGROUND: Pediatric hemato-oncologic patients require central catheters for chemotherapy, and the junction of the superior vena cava and right atrium is considered the ideal location for catheter tips. Skin landmarks and fluoroscopic guidance have been used to identify the cavoatrial junction; however, neither has been recognized as the gold standard. Therefore, we aimed to develop a safe and accurate technique using augmented reality technology to locate the cavoatrial junction in pediatric hemato-oncologic patients.
METHODS: Fifteen oncology patients who underwent chest computed tomography were enrolled for Hickman catheter or chemoport insertion. With the aid of augmented reality technology, three-dimensional models of the internal jugular veins, external jugular veins, subclavian veins, superior vena cava, and right atrium were constructed. On inserting the central vein catheters, the cavoatrial junction identified using the three-dimensional models was marked on the body surface, the tip was positioned at the corresponding location, and the actual insertion location was confirmed using a portable x-ray machine. The proposed method was evaluated by comparing the distance from the cavoatrial junction to the augmented reality location with that to the conventional location on x-ray.
RESULTS: The mean distance between the cavoatrial junction and the augmented reality location on x-ray was 1.2 cm, significantly shorter than that between the cavoatrial junction and the conventional location (1.9 cm; P = 0.027).
CONCLUSIONS: Central catheter insertion using augmented reality technology is safer and more accurate than conventional methods and can be performed at no additional cost in oncology patients.
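The endpoint above is a within-patient comparison of tip-to-junction distances under two localization methods. A minimal sketch of the paired t statistic behind such a comparison, using entirely hypothetical per-patient distances (the abstract reports neither individual data nor the exact test used):

```python
import math

def paired_t(a, b):
    """t statistic for paired samples a, b (same patients, two methods)."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

# Hypothetical per-patient distances (cm) from catheter tip to cavoatrial junction
conventional = [1.8, 2.1, 1.7, 2.3, 1.9, 2.0, 1.6, 2.2]
ar_guided    = [1.1, 1.4, 1.0, 1.5, 1.2, 1.3, 0.9, 1.6]
t = paired_t(conventional, ar_guided)
print(round(t, 2))
```

The p-value then follows from the t distribution with n-1 degrees of freedom; pairing matters here because both measurements come from the same patient, so the test operates on per-patient differences rather than two independent group means.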
Affiliation(s)
- Joong Kee Youn
- Department of Pediatric Surgery, Seoul National University Hospital, Seoul, Republic of Korea
- Department of Pediatric Surgery, Seoul National University College of Medicine, 101 Daehak-ro, Jongro-gu, Seoul, 03080, Republic of Korea
- Dongheon Lee
- Department of Biomedical Engineering, Chungnam National University College of Medicine and Hospital, Daejeon, Republic of Korea
- Dayoung Ko
- Department of Pediatric Surgery, Seoul National University Hospital, Seoul, Republic of Korea
- Inhwa Yeom
- Transdisciplinary Department of Medicine and Advanced Technology, Seoul National University Hospital, 101 Daehak-ro, Jongro-gu, Seoul, 03080, Republic of Korea
- Hyun-Jin Joo
- Transdisciplinary Department of Medicine and Advanced Technology, Seoul National University Hospital, 101 Daehak-ro, Jongro-gu, Seoul, 03080, Republic of Korea
- Hee Chan Kim
- Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, Republic of Korea
- Institute of Medical and Biological Engineering, Medical Research Center, Seoul National University College of Medicine, Seoul, Republic of Korea
- Hyoun-Joong Kong
- Transdisciplinary Department of Medicine and Advanced Technology, Seoul National University Hospital, 101 Daehak-ro, Jongro-gu, Seoul, 03080, Republic of Korea
- Hyun-Young Kim
- Department of Pediatric Surgery, Seoul National University College of Medicine, 101 Daehak-ro, Jongro-gu, Seoul, 03080, Republic of Korea