1
Garg P, Mohanty A, Ramisetty S, Kulkarni P, Horne D, Pisick E, Salgia R, Singhal SS. Artificial intelligence and allied subsets in early detection and preclusion of gynecological cancers. Biochim Biophys Acta Rev Cancer 2023; 1878:189026. [PMID: 37980945] [DOI: 10.1016/j.bbcan.2023.189026]
Abstract
Gynecological cancers, including breast, cervical, ovarian, uterine, and vaginal cancers, pose a major threat to global health, and early identification is crucial to patient outcomes and survival rates. The application of machine learning (ML) and artificial intelligence (AI) approaches to the study of gynecological cancer has shown potential to revolutionize cancer detection and diagnosis. The current review outlines the significant advancements, obstacles, and prospects brought about by AI and ML technologies in the timely identification and accurate diagnosis of different types of gynecological cancers. AI-powered technologies can use genomic data to discover genetic alterations and biomarkers linked to particular forms of gynecologic cancer, assisting in the development of targeted treatments. Furthermore, AI and ML technologies have been shown to greatly increase the accuracy and efficiency of cancer diagnosis in gynecologic tumors, reduce diagnostic delays, and potentially eliminate the need for unnecessary invasive procedures. In conclusion, the review focuses on the integration of AI- and ML-based tools and techniques into the early detection and exclusion of various cancer types, and suggests that collaborative coordination among research clinicians, data scientists, and regulatory authorities is needed to realize the full potential of AI and ML in gynecologic cancer care.
Affiliation(s)
- Pankaj Garg: Department of Chemistry, GLA University, Mathura, Uttar Pradesh 281406, India
- Atish Mohanty: Departments of Medical Oncology & Therapeutics Research, Molecular Medicine, Beckman Research Institute of City of Hope, Comprehensive Cancer Center and National Medical Center, Duarte, CA 91010, USA
- Sravani Ramisetty: Departments of Medical Oncology & Therapeutics Research, Molecular Medicine, Beckman Research Institute of City of Hope, Comprehensive Cancer Center and National Medical Center, Duarte, CA 91010, USA
- Prakash Kulkarni: Departments of Medical Oncology & Therapeutics Research, Molecular Medicine, Beckman Research Institute of City of Hope, Comprehensive Cancer Center and National Medical Center, Duarte, CA 91010, USA
- David Horne: Molecular Medicine, Beckman Research Institute of City of Hope, Comprehensive Cancer Center and National Medical Center, Duarte, CA 91010, USA
- Evan Pisick: Department of Medical Oncology, City of Hope, Chicago, IL 60099, USA
- Ravi Salgia: Departments of Medical Oncology & Therapeutics Research, Molecular Medicine, Beckman Research Institute of City of Hope, Comprehensive Cancer Center and National Medical Center, Duarte, CA 91010, USA
- Sharad S Singhal: Departments of Medical Oncology & Therapeutics Research, Molecular Medicine, Beckman Research Institute of City of Hope, Comprehensive Cancer Center and National Medical Center, Duarte, CA 91010, USA
2
Baughan N, Douglas L, Giger ML. Past, Present, and Future of Machine Learning and Artificial Intelligence for Breast Cancer Screening. J Breast Imaging 2022; 4:451-459. [PMID: 38416954] [DOI: 10.1093/jbi/wbac052]
Abstract
Breast cancer screening has evolved substantially over the past few decades because of advancements in new image acquisition systems and novel artificial intelligence (AI) algorithms. This review provides a brief overview of the history, current state, and future of AI in breast cancer screening and diagnosis, along with the challenges involved in developing AI systems. Although AI has been developed for interpretation tasks associated with breast cancer screening for decades, its potential to counter the subjective nature, and improve the efficiency, of human image interpretation continues to expand. Rapid advances in computational power and deep learning have greatly accelerated AI research, with promising performance in detection and classification tasks across imaging modalities. Most AI systems, based on human-engineered or deep learning methods, serve as concurrent or secondary readers, that is, as aids to radiologists for a specific, well-defined task. In the future, AI may be able to perform multiple integrated tasks, making decisions at or surpassing the level of human ability. Artificial intelligence may also serve as a partial primary reader to streamline ancillary tasks, triaging cases or ruling out obviously normal cases. However, before AI is used as an independent, autonomous reader, various challenges need to be addressed, including explainability and interpretability, in addition to repeatability and generalizability, to ensure that AI will provide a significant clinical benefit to breast cancer screening across all populations.
Affiliation(s)
- Natalie Baughan: University of Chicago, Department of Radiology, Committee on Medical Physics, Chicago, IL, USA
- Lindsay Douglas: University of Chicago, Department of Radiology, Committee on Medical Physics, Chicago, IL, USA
- Maryellen L Giger: University of Chicago, Department of Radiology, Committee on Medical Physics, Chicago, IL, USA
3
Wang L. Terahertz Imaging for Breast Cancer Detection. Sensors (Basel) 2021; 21:6465. [PMID: 34640784] [PMCID: PMC8512288] [DOI: 10.3390/s21196465]
Abstract
Terahertz (THz) imaging has the potential to accurately detect breast tumors during breast-conserving surgery. Over the past decade, many research groups have extensively studied THz imaging and spectroscopy techniques for identifying breast tumors. This manuscript presents recent developments in THz imaging techniques for breast cancer detection. The dielectric properties of breast tissues in the THz range, THz imaging and spectroscopy systems, THz radiation sources, and THz breast imaging studies are discussed. In addition, numerous chemometric methods applied to improve THz image resolution and data-collection processing are summarized. Finally, challenges and future research directions for THz breast imaging are presented.
Affiliation(s)
- Lulu Wang: Biomedical Device Innovation Center, Shenzhen Technology University, Shenzhen 518118, China; Institute of Biomedical Technologies, Auckland University of Technology, Auckland 1010, New Zealand
4
Zhang P, Ma Z, Zhang Y, Chen X, Wang G. Improved Inception V3 method and its effect on radiologists' performance of tumor classification with automated breast ultrasound system. Gland Surg 2021; 10:2232-2245. [PMID: 34422594] [PMCID: PMC8340346] [DOI: 10.21037/gs-21-328]
Abstract
BACKGROUND The automated breast ultrasound system (ABUS) is recognized as a valuable detection tool in addition to mammography. The purpose of this study was to propose a novel computer-aided diagnosis (CAD) system that extracts textural features from ABUS images and to investigate the efficiency of using this CAD for breast cancer detection. METHODS This retrospective study involved 149 breast nodules [maximum diameter: mean 18.89 mm, standard deviation (SD) 10.238, range 5-59 mm] in 135 patients. We assigned 3 novice readers (<3 years of experience) and 3 experienced readers (≥10 years of experience) to review the imaging data and stratify the 149 breast nodules as either malignant or benign. The Improved Inception V3 (II3) method was developed and used as an assistant tool to help the 6 readers re-interpret the images. RESULTS Our method (II3) achieved a final accuracy of 88.6%. On the first reading, the 3 novice readers had an average accuracy of 71.37%±4.067%, while that of the 3 experienced readers was 83.03%±3.371%. With the help of II3 on the second reading, the average accuracy of the novice readers increased to 84.13%±1.662% and that of the experienced readers to 89.50%±0.346%. The areas under the curve (AUCs) were similar compared with linear algorithms. The mean AUC of the novice readers improved from 0.7751 (without II3) to 0.8232 (with II3), and the mean AUC of the experienced readers from 0.8939 (without II3) to 0.9211 (with II3). The mean AUC for all readers improved in the second-reading mode (from 0.8345 to 0.8722, P=0.0081). CONCLUSIONS With the help of II3, the diagnostic accuracy of both reader groups improved, and II3 was more helpful for novice readers than for experienced readers. Our results show that II3 is valuable in differentiating benign from malignant breast nodules and that it also improves the skills of novice radiologists. II3 cannot completely replace the influence of experience in the diagnostic process and, for now, retains an auxiliary role in the clinic.
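The reader AUCs reported above are standard ROC areas. As an editorial illustration, a stdlib-Python sketch of computing one reader's AUC from likelihood-of-malignancy scores via the Mann-Whitney statistic; the scores and labels below are hypothetical, not the study's data:

```python
def roc_auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen malignant case scores higher than a randomly chosen
    benign one, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical likelihood-of-malignancy scores (0-100) from one reader.
scores = [90, 80, 70, 60, 55, 40, 30, 20]
labels = [1, 1, 1, 0, 1, 0, 0, 0]  # 1 = malignant, 0 = benign
print(roc_auc(scores, labels))  # 0.9375
```

Comparing the same reader's AUC with and without CAD assistance, as the study does, then reduces to running this once per reading mode.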
Affiliation(s)
- Panpan Zhang: Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
- Zhaosheng Ma: Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
- Yingtao Zhang: Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
- Xiaodan Chen: Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China
- Gang Wang: Department of Ultrasound, Taizhou Hospital of Zhejiang Province, Zhejiang University, Linhai, China
5
Drukker K, Yan P, Sibley A, Wang G. Biomedical imaging and analysis through deep learning. Artif Intell Med 2021. [DOI: 10.1016/b978-0-12-821259-2.00004-1]
6
7
van Zelst JCM, Tan T, Mann RM, Karssemeijer N. Validation of radiologists' findings by computer-aided detection (CAD) software in breast cancer detection with automated 3D breast ultrasound: a concept study in implementation of artificial intelligence software. Acta Radiol 2020; 61:312-320. [PMID: 31324132] [PMCID: PMC7059207] [DOI: 10.1177/0284185119858051]
Abstract
Background Computer-aided detection software for automated breast ultrasound has been shown to have potential in improving the accuracy of radiologists. Alternative ways of implementing computer-aided detection, such as independent validation or preselection of suspicious cases, might also improve radiologists' accuracy. Purpose To investigate the effect of using computer-aided detection software to improve the performance of radiologists by validating findings reported by radiologists during screening with automated breast ultrasound. Material and Methods Unilateral automated breast ultrasound exams were performed in 120 women with dense breasts: 60 randomly selected normal exams, 30 exams with benign lesions, and 30 malignant cases (20 mammography-negative). Eight radiologists were instructed to detect breast cancer and rate lesions using BI-RADS and level-of-suspiciousness scores. Computer-aided detection software was used to check the validity of the radiologists' findings. Findings found negative by computer-aided detection were not included in the readers' performance analysis; however, the nature of these findings was further analyzed. The area under the curve (AUC) and the partial AUC for the 80%-100% specificity interval before and after computer-aided detection validation were compared. Sensitivity was computed for all readers at a simulated specificity of 90%. Results The partial AUC improved significantly from 0.126 (95% confidence interval [CI] = 0.098-0.153) to 0.142 (95% CI = 0.115-0.169) (P = 0.037) after computer-aided detection rejected mostly benign lesions and normal tissue scored BI-RADS 3 or 4. The full AUCs (0.823 vs. 0.833, respectively) were not significantly different (P = 0.743). Four cancers detected by readers were completely missed by computer-aided detection, and four other cancers were detected by both readers and computer-aided detection but falsely rejected owing to technical limitations of our implementation of computer-aided detection validation. In this study, validation by computer-aided detection discarded 42.6% of findings scored BI-RADS ≥3 by the radiologists, of which 85.5% were non-malignant. Conclusion Validation of radiologists' findings using computer-aided detection software for automated breast ultrasound has the potential to improve the performance of radiologists. Computer-aided detection validation might also be an efficient tool for double-reading strategies by limiting the number of discordant cases that need to be double-read.
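The partial-AUC endpoint used above is the area under the ROC curve restricted to 80%-100% specificity (FPR 0 to 0.2), so its maximum attainable value is 0.2, which is why results like 0.126 and 0.142 look small. A minimal stdlib-Python sketch of that computation, on illustrative scores rather than the study's data:

```python
def partial_auc(scores, labels, max_fpr=0.2):
    """Trapezoidal area under the empirical ROC curve restricted to
    FPR in [0, max_fpr], i.e. the 80%-100% specificity interval when
    max_fpr = 0.2. Unnormalized, so the maximum attainable value is
    max_fpr itself, matching the scale of results like 0.126 -> 0.142."""
    pos = sum(labels)
    neg = len(labels) - pos
    # Sweep the decision threshold from high to low, tracing the ROC.
    pts = [(0.0, 0.0)]
    tp = fp = 0
    for score, y in sorted(zip(scores, labels), reverse=True):
        if y == 1:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    area, (x0, y0) = 0.0, pts[0]
    for x1, y1 in pts[1:]:
        if x1 >= max_fpr and x1 > x0:  # clip the segment crossing max_fpr
            y1 = y0 + (y1 - y0) * (max_fpr - x0) / (x1 - x0)
            x1 = max_fpr
        area += (x1 - x0) * (y0 + y1) / 2  # trapezoid rule
        x0, y0 = x1, y1
        if x0 >= max_fpr:
            break
    return area

# Hypothetical reader scores and truth labels (1 = malignant).
scores = [95, 85, 75, 65, 55, 45, 35, 25, 15, 5]
labels = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]
print(round(partial_auc(scores, labels), 3))  # 0.08
```

This sketch assumes untied scores; a production implementation would also handle tied scores and report the normalized McClish variant if desired.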
Affiliation(s)
- Jan CM van Zelst: Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
- Tao Tan: Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
- Ritse M Mann: Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
- Nico Karssemeijer: Department of Radiology and Nuclear Medicine, Radboud University Medical Centre, the Netherlands
8
9
Bi WL, Hosny A, Schabath MB, Giger ML, Birkbak NJ, Mehrtash A, Allison T, Arnaout O, Abbosh C, Dunn IF, Mak RH, Tamimi RM, Tempany CM, Swanton C, Hoffmann U, Schwartz LH, Gillies RJ, Huang RY, Aerts HJWL. Artificial intelligence in cancer imaging: Clinical challenges and applications. CA Cancer J Clin 2019; 69:127-157. [PMID: 30720861] [PMCID: PMC6403009] [DOI: 10.3322/caac.21552]
Abstract
Judgement, as one of the core tenets of medicine, relies upon the integration of multilayered data with nuanced decision making. Cancer offers a unique context for medical decisions given not only its variegated forms with evolution of disease but also the need to take into account the individual condition of patients, their ability to receive treatment, and their responses to treatment. Challenges remain in the accurate detection, characterization, and monitoring of cancers despite improved technologies. Radiographic assessment of disease most commonly relies upon visual evaluations, the interpretations of which may be augmented by advanced computational analyses. In particular, artificial intelligence (AI) promises to make great strides in the qualitative interpretation of cancer imaging by expert clinicians, including volumetric delineation of tumors over time, extrapolation of the tumor genotype and biological course from its radiographic phenotype, prediction of clinical outcome, and assessment of the impact of disease and treatment on adjacent organs. AI may automate processes in the initial interpretation of images and shift the clinical workflow of radiographic detection, management decisions on whether or not to administer an intervention, and subsequent observation to a yet to be envisioned paradigm. Here, the authors review the current state of AI as applied to medical imaging of cancer and describe advances in 4 tumor types (lung, brain, breast, and prostate) to illustrate how common clinical problems are being addressed. Although most studies evaluating AI applications in oncology to date have not been vigorously validated for reproducibility and generalizability, the results do highlight increasingly concerted efforts in pushing AI technology to clinical use and to impact future directions in cancer care.
Affiliation(s)
- Wenya Linda Bi: Assistant Professor of Neurosurgery, Department of Neurosurgery, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Ahmed Hosny: Research Scientist, Department of Radiation Oncology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Matthew B. Schabath: Associate Member, Department of Cancer Epidemiology, H. Lee Moffitt Cancer Center and Research Institute, Tampa, FL
- Maryellen L. Giger: Professor of Radiology, Department of Radiology, University of Chicago, Chicago, IL
- Nicolai J. Birkbak: Research Associate, The Francis Crick Institute, London, United Kingdom; University College London Cancer Institute, London, United Kingdom
- Alireza Mehrtash: Research Assistant, Department of Radiology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA; Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Tavis Allison: Research Assistant, Department of Radiology, Columbia University College of Physicians and Surgeons, New York, NY; Department of Radiology, New York Presbyterian Hospital, New York, NY
- Omar Arnaout: Assistant Professor of Neurosurgery, Department of Neurosurgery, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Christopher Abbosh: Research Fellow, The Francis Crick Institute, London, United Kingdom; University College London Cancer Institute, London, United Kingdom
- Ian F. Dunn: Associate Professor of Neurosurgery, Department of Neurosurgery, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Raymond H. Mak: Associate Professor, Department of Radiation Oncology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Rulla M. Tamimi: Associate Professor, Department of Medicine, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Clare M. Tempany: Professor of Radiology, Department of Radiology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Charles Swanton: Professor, The Francis Crick Institute, London, United Kingdom; University College London Cancer Institute, London, United Kingdom
- Udo Hoffmann: Professor of Radiology, Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA
- Lawrence H. Schwartz: Professor of Radiology, Department of Radiology, Columbia University College of Physicians and Surgeons, New York, NY; Chair, Department of Radiology, New York Presbyterian Hospital, New York, NY
- Robert J. Gillies: Professor of Radiology, Department of Cancer Physiology, H. Lee Moffitt Cancer Center and Research Institute, Tampa, FL
- Raymond Y. Huang: Assistant Professor, Department of Radiology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA
- Hugo J. W. L. Aerts: Associate Professor, Departments of Radiation Oncology and Radiology, Brigham and Women's Hospital, Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA; Professor in AI in Medicine, Radiology and Nuclear Medicine, GROW, Maastricht University Medical Centre (MUMC+), Maastricht, The Netherlands
10
Lei B, Huang S, Li R, Bian C, Li H, Chou YH, Cheng JZ. Segmentation of breast anatomy for automated whole breast ultrasound images with boundary regularized convolutional encoder-decoder network. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.09.043]
11
Rella R, Belli P, Giuliani M, Bufi E, Carlino G, Rinaldi P, Manfredi R. Automated Breast Ultrasonography (ABUS) in the Screening and Diagnostic Setting: Indications and Practical Use. Acad Radiol 2018; 25:1457-1470. [PMID: 29555568] [DOI: 10.1016/j.acra.2018.02.014]
Abstract
Automated breast ultrasonography (ABUS) is a new imaging technology for automatic breast scanning with ultrasound. It was first developed to overcome the operator dependency and the lack of standardization and reproducibility of handheld ultrasound. ABUS provides a three-dimensional representation of breast tissue and allows image reformatting in three planes; the generated coronal plane has been suggested to improve diagnostic accuracy. The technique was first used in the screening setting to improve breast cancer detection, especially in mammographically dense breasts. In recent years, numerous studies have also evaluated its use in the diagnostic setting, showing its suitability for breast cancer staging, evaluation of tumor response to neoadjuvant chemotherapy, and second-look ultrasound after magnetic resonance imaging. The purpose of this article is to provide a comprehensive review of the current literature on the clinical performance of ABUS, summarize the available evidence, and identify gaps in knowledge for future research.
12
Tan T, Li Z, Liu H, Zanjani FG, Ouyang Q, Tang Y, Hu Z, Li Q. Optimize Transfer Learning for Lung Diseases in Bronchoscopy Using a New Concept: Sequential Fine-Tuning. IEEE J Transl Eng Health Med 2018; 6:1800808. [PMID: 30324036] [PMCID: PMC6175035] [DOI: 10.1109/jtehm.2018.2865787]
Abstract
Bronchoscopy inspection, as a follow-up procedure to radiological imaging, plays a key role in diagnosis and treatment design for lung disease patients. When performing bronchoscopy, doctors must decide immediately whether to perform a biopsy. Because biopsies may cause uncontrollable and life-threatening bleeding of the lung tissue, doctors need to be selective with biopsies. In this paper, to help doctors be more selective with biopsies and to provide a second opinion on diagnosis, we propose a computer-aided diagnosis (CAD) system for lung diseases, including cancers and tuberculosis (TB). Based on transfer learning (TL), we propose a novel TL method on top of DenseNet: sequential fine-tuning (SFT). Compared with traditional fine-tuning (FT) methods, our method achieves the best performance. In a dataset of 81 normal cases, 76 TB cases, and 277 lung cancer cases, SFT provided an overall accuracy of 82%, while other traditional TL methods achieved accuracies from 70% to 74%. The detection accuracies of SFT for cancer, TB, and normal cases were 87%, 54%, and 91%, respectively. This indicates that the CAD system has the potential to improve the accuracy of lung disease diagnosis in bronchoscopy and may help doctors be more selective with biopsies.
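The summary above names sequential fine-tuning but does not pin down its exact schedule. The sketch below encodes one plausible reading of the idea (progressively unfreezing blocks from the classifier head backward, one stage at a time) using hypothetical block names; it is an illustration of the scheduling logic only, not the paper's implementation:

```python
# Hypothetical layer groups of a DenseNet-style backbone, listed input-first.
blocks = ["conv_stem", "dense_block1", "dense_block2", "dense_block3", "classifier"]

def sequential_finetune_schedule(blocks):
    """One plausible reading of sequential fine-tuning (SFT): stage 1
    trains only the classifier head on the new task; each later stage
    unfreezes one additional, deeper block until the whole network is
    trainable. Returns the list of trainable blocks for each stage."""
    return [blocks[-k:] for k in range(1, len(blocks) + 1)]

for i, trainable in enumerate(sequential_finetune_schedule(blocks), 1):
    print(f"stage {i}: train {trainable}")
```

In a deep learning framework, each stage would set `requires_grad` (or the equivalent) on exactly the listed blocks before running a few epochs of training.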
Affiliation(s)
- Tao Tan: Department of Biomedical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands; ScreenPoint Medical, 6512 AB Nijmegen, The Netherlands
- Zhang Li: College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China
- Haixia Liu: School of Computer Science, University of Nottingham Malaysia Campus, 43500 Semenyih, Malaysia
- Farhad G Zanjani: Department of Electrical Engineering, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands
- Quchang Ouyang: Hunan Cancer Hospital, The Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410000, China
- Yuling Tang: First Hospital of Changsha City, Changsha 410000, China
- Zheyu Hu: Hunan Cancer Hospital, The Affiliated Cancer Hospital of Xiangya School of Medicine, Central South University, Changsha 410000, China
- Qiang Li: Department of Respiratory Medicine, Shanghai East Hospital, Tongji University School of Medicine, Shanghai 200120, China
13
Zhang X, Lin X, Tan Y, Zhu Y, Wang H, Feng R, Tang G, Zhou X, Li A, Qiao Y. A multicenter hospital-based diagnosis study of automated breast ultrasound system in detecting breast cancer among Chinese women. Chin J Cancer Res 2018; 30:231-239. [PMID: 29861608] [DOI: 10.21147/j.issn.1000-9604.2018.02.06]
Abstract
Objective The automated breast ultrasound system (ABUS) is a potential method for breast cancer detection; however, its diagnostic performance remains unclear. We conducted a hospital-based multicenter diagnostic study to evaluate the clinical performance of the ABUS for breast cancer detection by comparing it to handheld ultrasound (HHUS) and mammography (MG). Methods Eligible participants underwent HHUS and ABUS testing; women aged 40-69 years additionally underwent MG. Images were interpreted using the Breast Imaging Reporting and Data System (BI-RADS). Women in BI-RADS categories 1-2 were considered negative. Women classified as BI-RADS 3 underwent magnetic resonance imaging to distinguish true- and false-negative results. Core aspiration or surgical biopsy was performed in women classified as BI-RADS 4-5, followed by a pathological diagnosis. Kappa values and agreement rates were calculated between the ABUS, HHUS, and MG. Results A total of 1,973 women were included in the final analysis. Of these, 1,353 (68.6%) and 620 (31.4%) were classified as BI-RADS categories 1-3 and 4-5, respectively. In the older age group, the agreement rate and kappa value between the ABUS and HHUS were 94.0% and 0.860 (P<0.001), respectively; they were 89.2% and 0.735 (P<0.001) between the ABUS and MG. Regarding consistency between imaging and pathology results, 78.6% of women classified as BI-RADS 4-5 by the ABUS were diagnosed with precancerous lesions or cancer, a rate 7.2% higher than that based on HHUS. For BI-RADS 1-2, the false-negative rates of the ABUS and HHUS were almost identical and much lower than those of MG. Conclusions We observed good diagnostic reliability for the ABUS. Considering its performance for breast cancer detection in women with high-density breasts and its lower operator dependence, the ABUS is a promising option for breast cancer detection in China.
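The agreement statistics above are Cohen's kappa values: observed agreement corrected for chance agreement implied by each method's marginal rates. A minimal stdlib-Python sketch of the computation, using hypothetical paired ratings rather than the study's data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal category frequencies."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical paired calls (0 = BI-RADS 1-3, 1 = BI-RADS 4-5).
abus = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0]
hhus = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
print(round(cohens_kappa(abus, hhus), 3))  # 0.783
```

The same function applies unchanged to the ABUS-versus-MG comparison; only the paired rating lists differ.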
Affiliation(s)
- Xi Zhang: Department of Epidemiology, National Cancer Center/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100021, China
- Xi Lin: Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou 510060, China
- Yanjuan Tan: Department of Ultrasound, the First People's Hospital of Hangzhou, Affiliated Hangzhou Hospital of Nanjing Medical University, Hangzhou 310006, China
- Ying Zhu: Department of Breast Imaging, Tianjin Medical University Cancer Institute and Hospital, National Clinical Research Center for Cancer, Tianjin 300060, China
- Hui Wang: Department of Ultrasound, Xin Hua Hospital, Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai 200092, China
- Ruimei Feng: Department of Cancer Prevention Research, Sun Yat-Sen University Cancer Center, Guangzhou 510060, China
- Guoxue Tang: Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou 510060, China
- Xiang Zhou: Department of Interventional Radiology, National Cancer Center/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100021, China
- Anhua Li: Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou 510060, China
- Youlin Qiao: Department of Epidemiology, National Cancer Center/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing 100021, China
14
Kozegar E, Soryani M, Behnam H, Salamati M, Tan T. Breast cancer detection in automated 3D breast ultrasound using iso-contours and cascaded RUSBoosts. Ultrasonics 2017; 79:68-80. [PMID: 28448836] [DOI: 10.1016/j.ultras.2017.04.008]
Abstract
Automated 3D breast ultrasound (ABUS) is an increasingly popular modality used as an adjunct to mammography for detecting cancers in women with dense breasts. In this paper, a multi-stage computer-aided detection system is proposed to detect cancers in ABUS images. In the first step, an efficient despeckling method called OBNLM is applied to the images to reduce speckle noise. Afterwards, a new algorithm based on isocontours is applied to detect initial candidates, as the boundary of masses is hypoechoic. To reduce falsely generated isocontours, features such as hypoechogenicity, roundness, area, and contour strength are used. The resulting candidates are then processed by a cascade classifier whose base classifiers are Random Under-Sampling Boosting (RUSBoost) classifiers, which were introduced to deal with imbalanced datasets. Each base classifier is trained on a group of features such as Gabor, LBP, and GLCM features. Performance of the proposed system was evaluated using 104 volumes from 74 patients, including 112 malignant lesions. According to Free-response Receiver Operating Characteristic (FROC) analysis, the proposed system achieved region-based and case-based sensitivities of 68% and 76%, respectively, at one false positive per image.
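A minimal sketch of the random under-sampling step that gives RUSBoost its name, assuming (as is typical in CAD candidate classification) that true lesions are the rare class; the candidate data and function name are illustrative, not from the paper, and the full method additionally reweights and boosts over many such balanced rounds:

```python
import random

def undersample_majority(candidates, seed=0):
    """The 'RUS' step of RUSBoost: keep every minority-class (true
    lesion) candidate and a random, same-sized subset of the majority
    (false-candidate) class, giving one balanced training set per
    boosting round. Assumes label 1 marks the rarer lesion class."""
    rng = random.Random(seed)
    pos = [c for c in candidates if c[1] == 1]
    neg = [c for c in candidates if c[1] == 0]
    return pos + rng.sample(neg, len(pos))

# Hypothetical candidate list of (candidate_id, label) pairs.
cands = [(i, 1) for i in range(5)] + [(i, 0) for i in range(5, 50)]
balanced = undersample_majority(cands)
print(len(balanced))  # 10: five lesions plus five sampled negatives
```

Cascading such classifiers, as the paper does, amounts to passing only the candidates a stage accepts on to the next, stricter stage.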
Affiliation(s)
- Ehsan Kozegar: School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran
- Mohsen Soryani: School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran
- Hamid Behnam: School of Electrical Engineering, Iran University of Science and Technology, Tehran, Iran
- Masoumeh Salamati: Department of Reproductive Imaging, Reproductive Biomedicine Research Center, Royan Institute for Reproductive Biomedicine, ACECR, Tehran, Iran
- Tao Tan: Department of Radiology and Nuclear Medicine, Radboud University Medical Center, Nijmegen 6525 GA, The Netherlands
15
Improved cancer detection in automated breast ultrasound by radiologists using Computer Aided Detection. Eur J Radiol 2017; 89:54-59. [PMID: 28267549] [DOI: 10.1016/j.ejrad.2017.01.021]
Abstract
OBJECTIVE To investigate the effect of dedicated Computer Aided Detection (CAD) software for automated breast ultrasound (ABUS) on the performance of radiologists screening for breast cancer. METHODS 90 ABUS views of 90 patients were randomly selected from a multi-institutional archive of cases collected between 2010 and 2013. This dataset included normal cases (n=40) with >1year of follow up, benign (n=30) lesions that were either biopsied or remained stable, and malignant lesions (n=20). Six readers evaluated all cases with and without CAD in two sessions. CAD-software included conventional CAD-marks and an intelligent minimum intensity projection of the breast tissue. Readers reported using a likelihood-of-malignancy scale from 0 to 100. Alternative free-response ROC analysis was used to measure the performance. RESULTS Without CAD, the average area-under-the-curve (AUC) of the readers was 0.77 and significantly improved with CAD to 0.84 (p=0.001). Sensitivity of all readers improved (range 5.2-10.6%) by using CAD but specificity decreased in four out of six readers (range 1.4-5.7%). No significant difference was observed in the AUC between experienced radiologists and residents both with and without CAD. CONCLUSIONS Dedicated CAD-software for ABUS has the potential to improve the cancer detection rates of radiologists screening for breast cancer.
16
Ng KH, Lau S. Vision 20/20: Mammographic breast density and its clinical applications. Med Phys 2015; 42:7059-77. [PMID: 26632060] [DOI: 10.1118/1.4935141]
Affiliation(s)
- Kwan-Hoong Ng: Department of Biomedical Imaging and University of Malaya Research Imaging Centre, Faculty of Medicine, University of Malaya, 50603 Kuala Lumpur, Malaysia
- Susie Lau: Department of Biomedical Imaging and University of Malaya Research Imaging Centre, Faculty of Medicine, University of Malaya, 50603 Kuala Lumpur, Malaysia
17
Tan T, Mordang JJ, van Zelst J, Grivegnée A, Gubern-Mérida A, Melendez J, Mann RM, Zhang W, Platel B, Karssemeijer N. Computer-aided detection of breast cancers using Haar-like features in automated 3D breast ultrasound. Med Phys 2015; 42:1498-504. [DOI: 10.1118/1.4914162]
18
Arleo EK, Saleh M, Ionescu D, Drotman M, Min RJ, Hentel K. Recall rate of screening ultrasound with automated breast volumetric scanning (ABVS) in women with dense breasts: a first quarter experience. Clin Imaging 2014; 38:439-444. [DOI: 10.1016/j.clinimag.2014.03.012]