1
Li S, Ye X, Tian H, Ding Z, Cui C, Shi S, Yang Y, Li G, Chen J, Lin Z, Ni Z, Xu J, Dong F. An artificial intelligence model based on transrectal ultrasound images of biopsy needle tract tissues to differentiate prostate cancer. Postgrad Med J 2024;100:228-236. [PMID: 38142286] [DOI: 10.1093/postmj/qgad127]
Abstract
PURPOSE We aimed to develop an artificial intelligence (AI) model based on transrectal ultrasonography (TRUS) images of biopsy needle tract (BNT) tissues for predicting prostate cancer (PCa), and to compare its diagnostic performance with that of senior radiologists and a clinical model. METHODS A total of 1696 two-dimensional prostate TRUS images were collected from 142 patients between July 2021 and May 2022. The ResNet50 network was used to train classification models with three input methods: the original image (Whole model), the BNT alone (Needle model), and the combined image [Feature Pyramid Network (FPN) model]. The training, validation, and test sets were randomly assigned, and randomized 5-fold cross-validation was performed across the training and validation sets. The diagnostic performance of the AI models and image combinations was assessed on an independent test set. The optimal AI model and image combination were then compared with senior radiologists and the clinical model. RESULTS In the test set, the area under the curve (AUC), specificity, and sensitivity of the FPN model were 0.934, 0.966, and 0.829, respectively; the FPN model's diagnostic performance was better than that of the Whole and Needle models, with statistically significant differences (P < 0.05), and better than that of senior radiologists (AUC: 0.667). The FPN model detected more PCa than senior radiologists (82.9% vs. 55.8%), with a 61.3% decrease in the false-positive rate and a 23.2% increase in overall accuracy (0.887 vs. 0.655). CONCLUSION The proposed FPN model offers a new method for prostate tissue classification, improves diagnostic performance, and may be a helpful tool to guide prostate biopsy.
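The abstract describes randomized 5-fold cross-validation over 1696 images drawn from 142 patients, but does not state whether folds were split at the image or the patient level. A minimal stdlib-only sketch of a patient-level split, a common safeguard so that images from one patient never appear in both a training and a validation fold, might look like this (function and variable names are illustrative, not from the paper):

```python
import random
from collections import defaultdict

def patient_level_folds(image_ids, patient_of, k=5, seed=0):
    """Split images into k cross-validation folds at the patient level,
    so every image from a given patient lands in the same fold
    (prevents leakage between training and validation data)."""
    by_patient = defaultdict(list)
    for img in image_ids:
        by_patient[patient_of[img]].append(img)
    patients = sorted(by_patient)          # deterministic base order
    random.Random(seed).shuffle(patients)  # randomized assignment
    folds = [[] for _ in range(k)]
    for i, p in enumerate(patients):
        folds[i % k].extend(by_patient[p])
    return folds
```

Each fold then serves once as the validation set while the remaining four train the ResNet50 classifier; the held-out test set stays untouched throughout.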
Affiliation(s)
- Shiyu Li
- Department of Ultrasound, The Second Clinical Medical College of Jinan University, China
- Xiuqin Ye
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Hongtian Tian
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Zhimin Ding
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Chen Cui
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Siyuan Shi
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Yang Yang
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Guoqiu Li
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Jing Chen
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Ziwei Lin
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Zhipeng Ni
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Jinfeng Xu
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
- Fajin Dong
- Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen People's Hospital, Shenzhen, Guangdong 518020, China
2
Zhou Z, Qian X, Hu J, Chen G, Zhang C, Zhu J, Dai Y. An artificial intelligence-assisted diagnosis modeling software (AIMS) platform based on medical images and machine learning: a development and validation study. Quant Imaging Med Surg 2023;13:7504-7522. [PMID: 37969634] [PMCID: PMC10644131] [DOI: 10.21037/qims-23-20]
Abstract
Background Supervised machine learning methods [both radiomics and convolutional neural network (CNN)-based deep learning] are usually employed to develop artificial intelligence models with medical images for computer-assisted diagnosis and prognosis of diseases. A classical machine learning-based modeling workflow involves a series of interconnected components and various algorithms, which makes it challenging, tedious, and labor-intensive for radiologists and researchers to build customized models for specific clinical applications if they lack expertise in machine learning methods. Methods We developed a user-friendly artificial intelligence-assisted diagnosis modeling software (AIMS) platform, which supplies standardized machine learning-based modeling workflows for computer-assisted diagnosis and prognosis systems with medical images. In contrast to other existing software platforms, AIMS contains both radiomics and CNN-based deep learning workflows, making it an all-in-one software platform for machine learning-based medical image analysis. The modular design of AIMS allows users to build machine learning models easily, test models comprehensively, and fairly compare the performance of different models in a specific application. The graphical user interface (GUI) enables users to process large numbers of medical images without programming or scripting. Furthermore, AIMS also provides a flexible image processing toolkit (e.g., semiautomatic segmentation, registration, morphological operations) to rapidly create lesion labels for multiphase analysis, multiregion analysis of an individual tumor (e.g., tumor mass and peritumor), and multimodality analysis. Results The functionality and efficiency of AIMS were demonstrated in 3 independent experiments in radiation oncology, where multiphase, multiregion, and multimodality analyses were performed, respectively. For clear cell renal cell carcinoma (ccRCC) Fuhrman grading with multiphase analysis (sample size = 187), the area under the curve (AUC) of AIMS was 0.776; for ccRCC Fuhrman grading with multiregion analysis (sample size = 177), the AUC was 0.848; for prostate cancer Gleason grading with multimodality analysis (sample size = 206), the AUC was 0.980. Conclusions AIMS provides a user-friendly infrastructure for radiologists and researchers, lowering the barrier to building customized machine learning-based computer-assisted diagnosis models for medical image analysis.
Affiliation(s)
- Zhiyong Zhou
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- Xusheng Qian
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, China
- Jisu Hu
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, China
- Guangqiang Chen
- Department of Radiology, The Second Affiliated Hospital of Soochow University, Suzhou, China
- Caiyuan Zhang
- Department of Radiology, The Second Affiliated Hospital of Soochow University, Suzhou, China
- Jianbing Zhu
- Suzhou Science & Technology Town Hospital, Suzhou Hospital, Affiliated Hospital of Medical School, Nanjing University, Suzhou, China
- Yakang Dai
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, China
- Suzhou Guoke Kangcheng Medical Technology Co., Ltd., Suzhou, China