1. Kwon MR, Youn I, Lee MY, Lee HA. Diagnostic Performance of Artificial Intelligence-Based Computer-Aided Detection Software for Automated Breast Ultrasound. Acad Radiol 2024; 31:480-491. [PMID: 37813703] [DOI: 10.1016/j.acra.2023.09.013]
Abstract
RATIONALE AND OBJECTIVES This study aimed to evaluate the diagnostic performance of radiologists using artificial intelligence (AI)-based computer-aided detection (CAD) software to detect suspicious lesions on automated breast ultrasound (ABUS). MATERIALS AND METHODS A total of 262 ABUS-detected breast lesions with histopathological verification (January 2020 to December 2022) were included. Two radiologists reviewed the images and assigned a Breast Imaging Reporting and Data System (BI-RADS) category. ABUS images were classified as positive or negative using AI-CAD. The BI-RADS category was readjusted in four ways: the radiologists modified the BI-RADS category using the AI results (AI-aided 1), upgraded or downgraded based on AI results (AI-aided 2), only upgraded for positive results (AI-aided 3), or only downgraded for negative results (AI-aided 4). The AI-aided diagnostic performances were compared to those of the radiologists alone, and the characteristics of AI-CAD-positive and AI-CAD-negative cancers were compared. RESULTS For 262 lesions (145 malignant and 117 benign) in 231 women (mean age, 52.2 years), the area under the receiver operating characteristic curve (AUC) of the radiologists was 0.870 (95% confidence interval [CI], 0.832-0.908). The AUC improved significantly to 0.919 (95% CI, 0.890-0.947; P = 0.001) with AI-aided 1, whereas it improved without significance to 0.884 (95% CI, 0.844-0.923), 0.890 (95% CI, 0.852-0.929), and 0.890 (95% CI, 0.853-0.928) with AI-aided 2, 3, and 4, respectively. AI-CAD-negative cancers were smaller, less frequently exhibited the retraction phenomenon, and had a lower BI-RADS category. Among nonmass lesions, AI-CAD-negative cancers showed no posterior shadowing. CONCLUSION AI-CAD implementation significantly improved the radiologists' diagnostic performance and may serve as a valuable diagnostic tool.
Affiliation(s)
- Mi-Ri Kwon
- Department of Radiology, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, 29 Saemunan-ro, Jongno-gu, Seoul, 03181, Republic of Korea (M.K., I.Y., H.-A.L.)
- Inyoung Youn
- Department of Radiology, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, 29 Saemunan-ro, Jongno-gu, Seoul, 03181, Republic of Korea (M.K., I.Y., H.-A.L.).
- Mi Yeon Lee
- Division of Biostatistics, Department of R&D Management, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, Seoul, Republic of Korea (M.Y.L.)
- Hyun-Ah Lee
- Department of Radiology, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, 29 Saemunan-ro, Jongno-gu, Seoul, 03181, Republic of Korea (M.K., I.Y., H.-A.L.)
2. Hejduk P, Marcon M, Unkelbach J, Ciritsis A, Rossi C, Borkowski K, Boss A. Fully automatic classification of automated breast ultrasound (ABUS) imaging according to BI-RADS using a deep convolutional neural network. Eur Radiol 2022; 32:4868-4878. [PMID: 35147776] [PMCID: PMC9213284] [DOI: 10.1007/s00330-022-08558-0]
Abstract
PURPOSE The aim of this study was to develop and test a post-processing technique for the detection and classification of lesions according to the BI-RADS atlas in automated breast ultrasound (ABUS) based on deep convolutional neural networks (dCNNs). METHODS AND MATERIALS In this retrospective study, 645 ABUS datasets from 113 patients were included; 55 patients had lesions classified as having high malignancy probability. Lesions were categorized as BI-RADS 2 (no suspicion of malignancy), BI-RADS 3 (probability of malignancy < 3%), and BI-RADS 4/5 (probability of malignancy > 3%). A deep convolutional neural network was trained after data augmentation with images of lesions and normal breast tissue, and a sliding-window approach for lesion detection was implemented. The algorithm was applied to a test dataset containing 128 images, and performance was compared with the readings of two experienced radiologists. RESULTS Calculations performed on single images showed an accuracy of 79.7% and an AUC of 0.91 [95% CI: 0.85-0.96] for categorization according to BI-RADS. Moderate agreement between the dCNN and ground truth was achieved (κ: 0.57 [95% CI: 0.50-0.64]), which is comparable with human readers. Analysis of the whole dataset improved categorization accuracy to 90.9% with an AUC of 0.91 [95% CI: 0.77-1.00], while achieving almost perfect agreement with ground truth (κ: 0.82 [95% CI: 0.69-0.95]), performing on par with human readers. Furthermore, the object localization technique allowed slice-wise detection of lesion position. CONCLUSIONS Our results show that a dCNN can be trained to detect and distinguish lesions in ABUS according to the BI-RADS classification with accuracy similar to that of experienced radiologists. KEY POINTS • A deep convolutional neural network (dCNN) was trained for classification of ABUS lesions according to the BI-RADS atlas. • A sliding-window approach allows accurate automatic detection and classification of lesions in ABUS examinations.
Affiliation(s)
- Patryk Hejduk
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland.
- Magda Marcon
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Jan Unkelbach
- Department of Radiation Oncology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Alexander Ciritsis
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Cristina Rossi
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Karol Borkowski
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
- Andreas Boss
- Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, Rämistr. 100, 8091, Zurich, Switzerland
3. Zhu Y, Zhan W, Jia X, Liu J, Zhou J. Clinical Application of Computer-Aided Diagnosis for Breast Ultrasonography: Factors That Lead to Discordant Results in Radial and Antiradial Planes. Cancer Manag Res 2022; 14:751-760. [PMID: 35237075] [PMCID: PMC8882474] [DOI: 10.2147/cmar.s348463]
Affiliation(s)
- Ying Zhu
- Department of Ultrasound, Shanghai Ruijin Hospital Affiliated to Medical School of Shanghai Jiaotong University, Shanghai, People’s Republic of China
- Weiwei Zhan
- Department of Ultrasound, Shanghai Ruijin Hospital Affiliated to Medical School of Shanghai Jiaotong University, Shanghai, People’s Republic of China
- Xiaohong Jia
- Department of Ultrasound, Shanghai Ruijin Hospital Affiliated to Medical School of Shanghai Jiaotong University, Shanghai, People’s Republic of China
- Juan Liu
- Department of Ultrasound, Shanghai Ruijin Hospital Affiliated to Medical School of Shanghai Jiaotong University, Shanghai, People’s Republic of China
- Jianqiao Zhou
- Department of Ultrasound, Shanghai Ruijin Hospital Affiliated to Medical School of Shanghai Jiaotong University, Shanghai, People’s Republic of China
- Correspondence: Jianqiao Zhou, Department of Ultrasound, Shanghai Ruijin Hospital Affiliated to Medical School of Shanghai Jiaotong University, 197 Ruijin Er Road, Shanghai, 200025, People’s Republic of China
4. Ying Z, Xiaohong J, Yijie D, Juan L, Yilai C, Congcong Y, Weiwei Z, Jianqiao Z. Using S-Detect to Improve Breast Ultrasound: The Different Combined Strategies Based on Radiologist Experience. Advanced Ultrasound in Diagnosis and Therapy 2022. [DOI: 10.37015/audt.2022.220007]
5. Xing J, Li Z, Wang B, Qi Y, Yu B, Zanjani FG, Zheng A, Duits R, Tan T. Lesion Segmentation in Ultrasound Using Semi-Pixel-Wise Cycle Generative Adversarial Nets. IEEE/ACM Transactions on Computational Biology and Bioinformatics 2021; 18:2555-2565. [PMID: 32149651] [DOI: 10.1109/tcbb.2020.2978470]
Abstract
Breast cancer is the most common invasive cancer and has the highest cancer incidence in females. Handheld ultrasound is one of the most efficient ways to identify and diagnose breast cancer. The area and shape of a lesion are very helpful for clinicians making diagnostic decisions. In this study we propose a new deep-learning scheme, the semi-pixel-wise cycle generative adversarial net (SPCGAN), for segmenting lesions in 2D ultrasound. The method takes advantage of a fully convolutional neural network (FCN) and a generative adversarial net to segment a lesion using prior knowledge. We compared the proposed method to an FCN and the level set segmentation method on a test dataset consisting of 32 malignant lesions and 109 benign lesions. Our proposed method achieved a Dice similarity coefficient (DSC) of 0.92, while the FCN and the level set method achieved 0.90 and 0.79, respectively. In particular, for malignant lesions, our method significantly increased the DSC of the FCN from 0.90 to 0.93 (p < 0.001). The results show that SPCGAN can obtain robust segmentation results, and the SPCGAN framework is particularly effective compared to the FCN when sufficient training samples are not available. Our proposed method may be used to relieve radiologists' annotation burden.
6. Zhang P, Ma Z, Zhang Y, Chen X, Wang G. Improved Inception V3 method and its effect on radiologists' performance of tumor classification with automated breast ultrasound system. Gland Surg 2021; 10:2232-2245. [PMID: 34422594] [PMCID: PMC8340346] [DOI: 10.21037/gs-21-328]
Abstract
BACKGROUND The automated breast ultrasound system (ABUS) is recognized as a valuable detection tool in addition to mammography. The purpose of this study was to propose a novel computer-aided diagnosis (CAD) system that extracts textural features from ABUS images and to investigate the efficiency of this CAD for breast cancer detection. METHODS This retrospective study involved 149 breast nodules [maximum diameter: mean 18.89 mm, standard deviation (SD) 10.238, range 5-59 mm] in 135 patients. We assigned 3 novice readers (<3 years of experience) and 3 experienced readers (≥10 years of experience) to review the imaging data and stratify the 149 breast nodules as either malignant or benign. The Improved Inception V3 (II3) method was developed and used as an assistant tool to help the 6 readers re-interpret the images. RESULTS Our method (II3) achieved an accuracy of 88.6%. On the first reading, the 3 novice readers had an average accuracy of 71.37%±4.067%, while that of the 3 experienced readers was 83.03%±3.371%. With the help of II3 on the second reading, the average accuracy of the novice readers increased to 84.13%±1.662% and that of the experienced readers increased to 89.50%±0.346%. The areas under the curve (AUCs) were similar to those of linear algorithms. The mean AUC of the novice readers improved from 0.7751 (without II3) to 0.8232 (with II3), and the mean AUC of the experienced readers improved from 0.8939 (without II3) to 0.9211 (with II3). The mean AUC for all readers improved in the second-reading mode (from 0.8345 to 0.8722, P=0.0081). CONCLUSIONS With the help of II3, the diagnostic accuracy of both groups improved, and II3 was more helpful for novice readers than for experienced readers. Our results show that II3 is valuable in differentiating benign from malignant breast nodules and that it also improves the skill of novice radiologists. II3 cannot completely replace the influence of experience in the diagnostic process and will retain an auxiliary role in the clinic at present.
Affiliation(s)
- Panpan Zhang
- Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
- Zhaosheng Ma
- Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
- Yingtao Zhang
- Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
- Xiaodan Chen
- Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
- Gang Wang
- Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
7.
8. Liu Y, Ren L, Cao X, Tong Y. Breast tumors recognition based on edge feature extraction using support vector machine. Biomed Signal Process Control 2020. [DOI: 10.1016/j.bspc.2019.101825]
9. van Zelst JCM, Tan T, Mann RM, Karssemeijer N. Validation of radiologists' findings by computer-aided detection (CAD) software in breast cancer detection with automated 3D breast ultrasound: a concept study in implementation of artificial intelligence software. Acta Radiol 2020; 61:312-320. [PMID: 31324132] [PMCID: PMC7059207] [DOI: 10.1177/0284185119858051]
Abstract
Background Computer-aided detection (CAD) software for automated breast ultrasound has been shown to have potential to improve the accuracy of radiologists. Alternative ways of implementing CAD, such as independent validation or preselecting suspicious cases, might also improve radiologists' accuracy. Purpose To investigate the effect of using CAD software to improve the performance of radiologists by validating findings reported by radiologists during screening with automated breast ultrasound. Material and Methods Unilateral automated breast ultrasound exams were performed in 120 women with dense breasts: 60 randomly selected normal exams, 30 exams with benign lesions, and 30 malignant cases (20 mammography-negative). Eight radiologists were instructed to detect breast cancer and rate lesions using BI-RADS and level-of-suspiciousness scores. CAD software was used to check the validity of the radiologists' findings. Findings scored negative by CAD were not included in the readers' performance analysis; however, the nature of these findings was further analyzed. The area under the curve (AUC) and the partial AUC for the interval of 80%-100% specificity before and after CAD validation were compared. Sensitivity was computed for all readers at a simulated specificity of 90%. Results The partial AUC improved significantly from 0.126 (95% confidence interval [CI] = 0.098-0.153) to 0.142 (95% CI = 0.115-0.169) (P = 0.037) after CAD rejected mostly benign lesions and normal tissue scored BI-RADS 3 or 4. The full AUCs (0.823 vs. 0.833, respectively) were not significantly different (P = 0.743). Four cancers detected by readers were completely missed by CAD, and four other cancers were detected by both readers and CAD but falsely rejected due to technical limitations of our implementation of CAD validation. In this study, CAD validation discarded 42.6% of findings scored BI-RADS ≥3 by the radiologists, of which 85.5% were non-malignant. Conclusion Validation of radiologists' findings using CAD software for automated breast ultrasound has the potential to improve the performance of radiologists. CAD validation might be an efficient tool for double-reading strategies by limiting the number of discordant cases that need to be double-read.
Affiliation(s)
- Jan CM van Zelst
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
- Tao Tan
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
- Ritse M Mann
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
- Nico Karssemeijer
- Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
10. Murtaza G, Shuib L, Abdul Wahab AW, Mujtaba G, Mujtaba G, Nweke HF, Al-garadi MA, Zulfiqar F, Raza G, Azmi NA. Deep learning-based breast cancer classification through medical imaging modalities: state of the art and research challenges. Artif Intell Rev 2019. [DOI: 10.1007/s10462-019-09716-5]
11. Drukker K, Giger ML, Joe BN, Kerlikowske K, Greenwood H, Drukteinis JS, Niell B, Fan B, Malkov S, Avila J, Kazemi L, Shepherd J. Combined Benefit of Quantitative Three-Compartment Breast Image Analysis and Mammography Radiomics in the Classification of Breast Masses in a Clinical Data Set. Radiology 2018; 290:621-628. [PMID: 30526359] [DOI: 10.1148/radiol.2018180608]
Abstract
Purpose To investigate the combination of mammography radiomics and quantitative three-compartment breast (3CB) image analysis of dual-energy mammography to limit unnecessary benign breast biopsies. Materials and Methods For this prospective study, dual-energy craniocaudal and mediolateral oblique mammograms were obtained immediately before biopsy in 109 women (mean age, 51 years; range, 31-85 years) with Breast Imaging Reporting and Data System category 4 or 5 breast masses (35 invasive cancers, 74 benign) from 2013 through 2017. The three quantitative compartments of water, lipid, and protein thickness at each pixel were calculated from the attenuation at high and low energy by using a within-image phantom. Masses were automatically segmented and features were extracted from the low-energy mammograms and the quantitative compartment images. Tenfold cross-validations using a linear discriminant classifier with predefined feature signatures helped differentiate between malignant and benign masses by means of (a) water-lipid-protein composition images alone, (b) mammography radiomics alone, and (c) a combined image analysis of both. Positive predictive value of biopsy performed (PPV3) at maximum sensitivity was the primary performance metric, and results were compared with those for conventional diagnostic digital mammography. Results The PPV3 for conventional diagnostic digital mammography in our data set was 32.1% (35 of 109; 95% confidence interval [CI]: 23.9%, 41.3%), with a sensitivity of 100%. In comparison, combined mammography radiomics plus quantitative 3CB image analysis had PPV3 of 49% (34 of 70; 95% CI: 36.5%, 58.9%; P < .001), with a sensitivity of 97% (34 of 35; 95% CI: 90.3%, 100%; P < .001) and 35.8% (39 of 109) fewer total biopsies (P < .001). Conclusion Quantitative three-compartment breast image analysis of breast masses combined with mammography radiomics has the potential to reduce unnecessary breast biopsies. 
Affiliation(s)
- Karen Drukker
- From the Department of Radiology, University of Chicago, 5481 S Maryland Ave, MC2026, Chicago, IL 60637 (K.D., M.L.G.); Department of Radiology and Biomedical Imaging (B.N.J., H.G., B.F., S.M., J.A., L.K., J.S.) and Department of Medicine and Epidemiology (K.K.), University of California, San Francisco, San Francisco, Calif; and Department of Diagnostic Radiology, H. Lee Moffitt Cancer Center and Research Institute, Tampa, Fla (J.S.D., B.N.)
- Maryellen L Giger
- Bonnie N Joe
- Karla Kerlikowske
- Heather Greenwood
- Jennifer S Drukteinis
- Bethany Niell
- Bo Fan
- Serghei Malkov
- Jesus Avila
- Leila Kazemi
- John Shepherd
12.
Abstract
OBJECTIVE The purpose of this article is to discuss potential applications of artificial intelligence (AI) in breast imaging and the limitations that may slow or prevent its adoption. CONCLUSION AI algorithms for workflow improvement and outcome analysis are advancing. Using imaging data of high quality and quantity, AI can support breast imagers in diagnosis and patient management, but AI cannot yet be relied on or held responsible for physicians' decisions that may affect survival. Education in AI is urgently needed for physicians.
13. Tan T, Li Z, Liu H, Zanjani FG, Ouyang Q, Tang Y, Hu Z, Li Q. Optimize Transfer Learning for Lung Diseases in Bronchoscopy Using a New Concept: Sequential Fine-Tuning. IEEE J Transl Eng Health Med 2018; 6:1800808. [PMID: 30324036] [PMCID: PMC6175035] [DOI: 10.1109/jtehm.2018.2865787]
Abstract
Bronchoscopy, as a follow-up procedure to radiological imaging, plays a key role in diagnosis and treatment planning for lung disease patients. When performing bronchoscopy, doctors have to decide immediately whether to perform a biopsy. Because biopsies may cause uncontrollable and life-threatening bleeding of the lung tissue, doctors need to be selective with biopsies. In this paper, to help doctors be more selective with biopsies and to provide a second opinion on diagnosis, we propose a computer-aided diagnosis (CAD) system for lung diseases, including cancers and tuberculosis (TB). Based on transfer learning (TL), we propose a novel TL method on top of DenseNet: sequential fine-tuning (SFT). Compared with traditional fine-tuning (FT) methods, our method achieves the best performance. In a dataset of 81 normal cases, 76 TB cases, and 277 lung cancer cases, SFT provided an overall accuracy of 82%, while other traditional TL methods achieved accuracies from 70% to 74%. The detection accuracies of SFT for cancers, TB, and normal cases were 87%, 54%, and 91%, respectively. This indicates that the CAD system has the potential to improve the accuracy of lung disease diagnosis in bronchoscopy and may help doctors be more selective with biopsies.
Affiliation(s)
- Tao Tan
- Department of Biomedical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; ScreenPoint Medical, 6512 AB Nijmegen, The Netherlands
- Zhang Li
- College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China
- Haixia Liu
- School of Computer Science, University of Nottingham Malaysia Campus, 43500 Semenyih, Malaysia
- Farhad G Zanjani
- Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
- Quchang Ouyang
- Hunan Cancer Hospital, The Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410000, China
- Yuling Tang
- First Hospital of Changsha City, Changsha 410000, China
- Zheyu Hu
- Hunan Cancer Hospital, The Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410000, China
- Qiang Li
- Department of Respiratory Medicine, Shanghai East Hospital, Tongji University School of Medicine, Shanghai 200120, China
14. Guo R, Lu G, Qin B, Fei B. Ultrasound Imaging Technologies for Breast Cancer Detection and Management: A Review. Ultrasound Med Biol 2018; 44:37-70. [PMID: 29107353] [PMCID: PMC6169997] [DOI: 10.1016/j.ultrasmedbio.2017.09.012]
Abstract
Ultrasound imaging is a commonly used modality for breast cancer detection and diagnosis. In this review, we summarize ultrasound imaging technologies and their clinical applications for the management of breast cancer patients. The technologies include ultrasound elastography, contrast-enhanced ultrasound, 3-D ultrasound, automated breast ultrasound, and computer-aided detection for breast ultrasound. We summarize the study results reported in the literature and discuss future directions. We also provide a review of ultrasound-guided breast biopsy and the fusion of ultrasound with other imaging modalities, especially magnetic resonance imaging (MRI). For comparison, we also discuss the diagnostic performance of mammography, MRI, positron emission tomography, and computed tomography for breast cancer diagnosis at the end of this review. New ultrasound imaging techniques, ultrasound-guided biopsy, and the fusion of ultrasound with other modalities provide important tools for the management of breast cancer patients.
Affiliation(s)
- Rongrong Guo
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia, USA; Department of Ultrasound, Shanxi Provincial Cancer Hospital, Taiyuan, Shanxi, China
- Guolan Lu
- The Wallace H. Coulter Department of Biomedical Engineering, Emory University and Georgia Institute of Technology, Atlanta, Georgia, USA
- Binjie Qin
- School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China
- Baowei Fei
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia, USA; The Wallace H. Coulter Department of Biomedical Engineering, Emory University and Georgia Institute of Technology, Atlanta, Georgia, USA; Department of Mathematics and Computer Science, Emory College of Emory University, Atlanta, Georgia, USA; Winship Cancer Institute of Emory University, Atlanta, Georgia, USA.
15
Skerl K, Cochran S, Evans A. First step to facilitate long-term and multi-centre studies of shear wave elastography in solid breast lesions using a computer-assisted algorithm. Int J Comput Assist Radiol Surg 2017; 12:1533-1542. [PMID: 28478519 PMCID: PMC5569155 DOI: 10.1007/s11548-017-1596-3] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/14/2017] [Accepted: 04/24/2017] [Indexed: 12/24/2022]
Abstract
Purpose Shear wave elastography (SWE) visualises the elasticity of tissue. As malignant tissue is generally stiffer than benign tissue, SWE is helpful in diagnosing solid breast lesions. Until now, quantitative measurement of elasticity parameters has been possible only while the images were still stored on the ultrasound imaging device. This work aims to overcome this issue and introduces an algorithm allowing fast offline evaluation of SWE images. Methods The algorithm was applied to a commercial phantom comprising three lesions of various elasticities and to 207 in vivo solid breast lesions. All images were saved in DICOM, JPG and QDE (quantitative data export; for research only) format and evaluated according to our clinical routine using a computer-aided diagnosis algorithm. The results were compared to manual evaluation (by an experienced radiologist and a trained engineer) regarding their numerical discrepancies and their diagnostic performance using ROC and ICC analysis. Results ICCs of the elasticity parameters in all formats were nearly perfect (0.861–0.990). AUC for all formats was nearly identical for E_max and E_mean (0.863–0.888). The diagnostic performance of SD using DICOM or JPG estimations was lower than the manual or QDE estimation (AUC 0.673 vs. 0.844). Conclusions The algorithm introduced in this study is suitable for estimating the elasticity parameters offline, away from the ultrasound system, so that images taken at different times and sites can be included. This facilitates long-term and multi-centre studies.
Affiliation(s)
- Katrin Skerl
- Medical Research Institute, Ninewells Hospital and Medical School, Mailbox 4, Dundee, DD1 9SY, Scotland, UK; Image Science for Interventional Techniques, University of Auvergne, 28, Place Henri Dunant, BP 38, 63001, Clermont-Ferrand Cedex, France.
- Sandy Cochran
- School of Engineering, University of Glasgow, Glasgow, G12 8QQ, Scotland, UK
- Andrew Evans
- Medical Research Institute, Ninewells Hospital and Medical School, Mailbox 4, Dundee, DD1 9SY, Scotland, UK
16
Xi X, Shi H, Han L, Wang T, Ding HY, Zhang G, Tang Y, Yin Y. Breast tumor segmentation with prior knowledge learning. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2016.09.067] [Citation(s) in RCA: 35] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/02/2023]
17
Diagnosis of Breast Tumors with Sonographic Texture Analysis Using Run-length Matrix. Int J Cancer Manag 2017. [DOI: 10.5812/ijcm.6120] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
18
Zhang Q, Xiao Y, Dai W, Suo J, Wang C, Shi J, Zheng H. Deep learning based classification of breast tumors with shear-wave elastography. Ultrasonics 2016; 72:150-7. [PMID: 27529139 DOI: 10.1016/j.ultras.2016.08.004] [Citation(s) in RCA: 117] [Impact Index Per Article: 14.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/22/2015] [Revised: 06/30/2016] [Accepted: 08/05/2016] [Indexed: 05/03/2023]
Abstract
This study aims to build a deep learning (DL) architecture for automated extraction of learned-from-data image features from the shear-wave elastography (SWE), and to evaluate the DL architecture in differentiation between benign and malignant breast tumors. We construct a two-layer DL architecture for SWE feature extraction, comprised of the point-wise gated Boltzmann machine (PGBM) and the restricted Boltzmann machine (RBM). The PGBM contains task-relevant and task-irrelevant hidden units, and the task-relevant units are connected to the RBM. Experimental evaluation was performed with five-fold cross validation on a set of 227 SWE images, 135 of benign tumors and 92 of malignant tumors, from 121 patients. The features learned with our DL architecture were compared with the statistical features quantifying image intensity and texture. Results showed that the DL features achieved better classification performance with an accuracy of 93.4%, a sensitivity of 88.6%, a specificity of 97.1%, and an area under the receiver operating characteristic curve of 0.947. The DL-based method integrates feature learning with feature selection on SWE. It may be potentially used in clinical computer-aided diagnosis of breast cancer.
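The second layer of the architecture described above is a restricted Boltzmann machine (RBM). As a rough illustration of that building block only (not the authors' implementation; the point-wise gating of the PGBM is omitted, and all sizes and hyperparameters here are hypothetical), a single contrastive-divergence (CD-1) update in NumPy might look like:

```python
import numpy as np

# Toy CD-1 update for an RBM; illustrative sketch only.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 6, 4, 0.1
W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # visible-hidden weights
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)     # biases

v0 = rng.integers(0, 2, size=(8, n_visible)).astype(float)  # toy binary batch

h0 = sigmoid(v0 @ W + b_h)    # positive phase: hidden given data
v1 = sigmoid(h0 @ W.T + b_v)  # one Gibbs step: reconstruct visibles
h1 = sigmoid(v1 @ W + b_h)    # hidden given reconstruction

# Contrastive-divergence gradient: data statistics minus model statistics.
W += lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
b_v += lr * (v0 - v1).mean(axis=0)
b_h += lr * (h0 - h1).mean(axis=0)
```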
Affiliation(s)
- Qi Zhang
- School of Communication and Information Engineering, Shanghai University, Shanghai, China.
- Yang Xiao
- Paul C. Lauterbur Research Center for Biomedical Imaging, Institute of Biomedical and Health Engineering, Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Wei Dai
- School of Communication and Information Engineering, Shanghai University, Shanghai, China
- Jingfeng Suo
- School of Communication and Information Engineering, Shanghai University, Shanghai, China
- Congzhi Wang
- Paul C. Lauterbur Research Center for Biomedical Imaging, Institute of Biomedical and Health Engineering, Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
- Jun Shi
- School of Communication and Information Engineering, Shanghai University, Shanghai, China
- Hairong Zheng
- Paul C. Lauterbur Research Center for Biomedical Imaging, Institute of Biomedical and Health Engineering, Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China.
19
Summers SM, Chin EJ, Long BJ, Grisell RD, Knight JG, Grathwohl KW, Ritter JL, Morgan JD, Salinas J, Blackbourne LH. Computerized Diagnostic Assistant for the Automatic Detection of Pneumothorax on Ultrasound: A Pilot Study. West J Emerg Med 2016; 17:209-15. [PMID: 26973754 PMCID: PMC4786248 DOI: 10.5811/westjem.2016.1.28087] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/08/2015] [Revised: 01/08/2016] [Accepted: 01/15/2016] [Indexed: 11/11/2022] Open
Abstract
INTRODUCTION Bedside thoracic ultrasound (US) can rapidly diagnose pneumothorax (PTX) with improved accuracy over the physical examination and without the need for chest radiography (CXR); however, US is highly operator dependent. A computerized diagnostic assistant was developed by the United States Army Institute of Surgical Research to detect PTX on standard thoracic US images. This computer algorithm is designed to automatically detect sonographic signs of PTX by systematically analyzing B-mode US video clips for pleural sliding and M-mode still images for the seashore sign. This was a pilot study to estimate the diagnostic accuracy of the PTX detection computer algorithm when compared to an expert panel of US-trained physicians. METHODS This was a retrospective study using archived thoracic US examinations obtained in adult patients presenting to the emergency department (ED) between 5/23/2011 and 8/6/2014. Emergency medicine residents, fellows, attending physicians, physician assistants, and medical students performed the US examinations and stored the images in the picture archive and communications system (PACS). The PACS was queried for all ED bedside US examinations with reported positive PTX during the study period, along with a random sample of negatives. The computer algorithm then interpreted the images, and we compared the results to an independent, blinded expert panel of three physicians, each with experience reviewing over 10,000 US examinations. RESULTS Query of the PACS revealed 146 bedside thoracic US examinations for analysis. Thirteen examinations were indeterminate and were excluded. There were 79 true negatives, 33 true positives, 9 false negatives, and 12 false positives. The test characteristics of the algorithm when compared to the expert panel were sensitivity 79% (95% CI [63-89]) and specificity 87% (95% CI [77-93]). For the 20 images scored as highest quality by the expert panel, the algorithm demonstrated 100% sensitivity (95% CI [56-100]) and 92% specificity (95% CI [62-100]). CONCLUSION This novel computer algorithm has potential to aid clinicians in identifying the sonographic signs of PTX in the absence of expert physician sonographers. Further refinement and training of the algorithm are still needed, along with prospective validation, before it can be used in clinical practice.
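The reported test characteristics follow directly from the confusion-matrix counts stated in the abstract (33 true positives, 79 true negatives, 9 false negatives, 12 false positives); a quick check:

```python
# Recompute the reported sensitivity and specificity from the stated counts.
tp, tn, fn, fp = 33, 79, 9, 12  # outcomes vs. the expert panel

sensitivity = tp / (tp + fn)  # fraction of expert-positive exams flagged
specificity = tn / (tn + fp)  # fraction of expert-negative exams cleared

print(f"sensitivity = {sensitivity:.0%}")  # sensitivity = 79%
print(f"specificity = {specificity:.0%}")  # specificity = 87%
```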
Affiliation(s)
- Shane M Summers
- Brooke Army Medical Center, Department of Emergency Medicine, San Antonio, Texas
- Eric J Chin
- Brooke Army Medical Center, Department of Emergency Medicine, San Antonio, Texas
- Brit J Long
- Brooke Army Medical Center, Department of Emergency Medicine, San Antonio, Texas
- Ronald D Grisell
- United States Army Institute of Surgical Research, San Antonio, Texas
- John G Knight
- Brooke Army Medical Center, Department of Emergency Medicine, San Antonio, Texas
- Kurt W Grathwohl
- Brooke Army Medical Center, Department of Pulmonary/Critical Care, San Antonio, Texas
- John L Ritter
- Brooke Army Medical Center, Department of Radiology, San Antonio, Texas
- Jeffrey D Morgan
- Brooke Army Medical Center, Department of Emergency Medicine, San Antonio, Texas
- Jose Salinas
- United States Army Institute of Surgical Research, San Antonio, Texas
20
Wang L, Böhler T, Zöhrer F, Georgii J, Rauh C, Fasching PA, Brehm B, Schulz-Wendtland R, Beckmann MW, Uder M, Hahn HK. A hybrid method towards automated nipple detection in 3D breast ultrasound images. Annu Int Conf IEEE Eng Med Biol Soc 2014; 2014:2869-72. [PMID: 25570590 DOI: 10.1109/embc.2014.6944222] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
In clinical work-up of breast cancer, the nipple position is an important marker for locating lesions. Moreover, it serves as an effective landmark for registering 3D automated breast ultrasound (ABUS) images to other imaging modalities, e.g., X-ray mammography, tomosynthesis or magnetic resonance imaging (MRI). However, speckle noise caused by interference waves and variable imaging directions make it challenging to automatically identify the nipple position. In this work, a hybrid, fully automatic method to detect nipple positions in ABUS images is presented. The method extends the multi-scale Laplacian-based method that we proposed previously by integrating a specially designed Hessian-based method to locate the shadow area beneath the nipple and areola. Subsequently, the likelihood maps of nipple positions generated by both methods are combined into a joint likelihood map, from which the final nipple position is extracted. To validate its efficiency and robustness, the extended hybrid method was tested on 926 ABUS images, resulting in a distance error of 7.08±10.96 mm (mean±standard deviation).
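A minimal sketch of the fusion step described in this abstract: two per-pixel likelihood maps (Laplacian-based and Hessian-based) are combined into a joint map, and the nipple position is taken at the maximum. The pointwise product used here is an assumption, not the paper's stated combination rule.

```python
import numpy as np

# Combine two likelihood maps and return the most likely pixel position.
def fuse_and_locate(lap_map, hess_map):
    joint = lap_map * hess_map  # joint likelihood of the nipple position
    idx = np.unravel_index(np.argmax(joint), joint.shape)
    return tuple(int(i) for i in idx)

# Tiny hypothetical maps for illustration.
lap = np.array([[0.1, 0.2],
                [0.9, 0.3]])
hess = np.array([[0.2, 0.1],
                 [0.8, 0.4]])
print(fuse_and_locate(lap, hess))  # (1, 0)
```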
21
Shin HJ, Kim HH, Cha JH. Current status of automated breast ultrasonography. Ultrasonography 2015; 34:165-72. [PMID: 25971900 PMCID: PMC4484287 DOI: 10.14366/usg.15002] [Citation(s) in RCA: 64] [Impact Index Per Article: 7.1] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2015] [Revised: 03/20/2015] [Accepted: 03/23/2015] [Indexed: 11/03/2022] Open
Abstract
Breast ultrasonography (US) is currently considered the first-line examination in the detection and characterization of breast lesions. However, conventional handheld US (HHUS) has several limitations such as operator dependence and the requirement of a considerable amount of radiologist time for whole-breast US. Automated breast US (ABUS), recently approved by the United States Food and Drug Administration for screening purposes, has several advantages over HHUS, such as higher reproducibility, less operator dependence, and less required physician time for image acquisition. In addition, ABUS provides both a coronal view and a relatively large field of view. Recent studies have reported that ABUS is promising in US screening for women with dense breasts and can potentially replace handheld second-look US in a preoperative setting.
Affiliation(s)
- Hee Jung Shin
- Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea
- Hak Hee Kim
- Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea
- Joo Hee Cha
- Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea
22
Liu H, Tan T, van Zelst J, Mann R, Karssemeijer N, Platel B. Incorporating texture features in a computer-aided breast lesion diagnosis system for automated three-dimensional breast ultrasound. J Med Imaging (Bellingham) 2014; 1:024501. [PMID: 26158036 DOI: 10.1117/1.jmi.1.2.024501] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/04/2014] [Revised: 06/22/2014] [Accepted: 06/26/2014] [Indexed: 11/14/2022] Open
Abstract
We investigated the benefits of incorporating texture features into an existing computer-aided diagnosis (CAD) system for classifying benign and malignant lesions in automated three-dimensional breast ultrasound images. The existing system takes into account 11 different features, describing different lesion properties; however, it does not include texture features. In this work, we expand the system by including texture features based on local binary patterns, gray level co-occurrence matrices, and Gabor filters computed from each lesion to be diagnosed. To deal with the resulting large number of features, we proposed a combination of feature-oriented classifiers combining each group of texture features into a single likelihood, resulting in three additional features used for the final classification. The classification was performed using support vector machine classifiers, and the evaluation was done with 10-fold cross validation on a dataset containing 424 lesions (239 benign and 185 malignant lesions). We compared the classification performance of the CAD system with and without texture features. The area under the receiver operating characteristic curve increased from 0.90 to 0.91 after adding texture features ([Formula: see text]).
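As an illustration of one texture-feature family named above, the following NumPy sketch computes a gray-level co-occurrence matrix (GLCM) for horizontally adjacent pixels and its Haralick contrast feature; the toy image and single (0, 1) offset are hypothetical, not the paper's actual configuration.

```python
import numpy as np

def glcm(image, levels):
    """Normalised co-occurrence counts of gray-level pairs (i, j)."""
    m = np.zeros((levels, levels))
    left, right = image[:, :-1].ravel(), image[:, 1:].ravel()
    np.add.at(m, (left, right), 1.0)  # count each horizontal pair once
    return m / m.sum()               # joint probability of pairs

def glcm_contrast(p):
    """Haralick contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

img = np.array([[0, 0, 1],
                [1, 2, 2],
                [0, 1, 1]])
print(glcm_contrast(glcm(img, levels=3)))  # 0.5
```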
Affiliation(s)
- Haixia Liu
- Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, 6525 GA Nijmegen, The Netherlands; University of Nottingham Malaysia Campus, School of Computer Science, Room BB79, Jalan Broga, 43500 Semenyih, Selangor Darul Ehsan, Malaysia
- Tao Tan
- Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, 6525 GA Nijmegen, The Netherlands
- Jan van Zelst
- Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, 6525 GA Nijmegen, The Netherlands
- Ritse Mann
- Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, 6525 GA Nijmegen, The Netherlands
- Nico Karssemeijer
- Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, 6525 GA Nijmegen, The Netherlands
- Bram Platel
- Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, 6525 GA Nijmegen, The Netherlands