1
Hossain S, Azam S, Montaha S, Karim A, Chowa SS, Mondol C, Zahid Hasan M, Jonkman M. Automated breast tumor ultrasound image segmentation with hybrid UNet and classification using fine-tuned CNN model. Heliyon 2023; 9:e21369. PMID: 37885728; PMCID: PMC10598544; DOI: 10.1016/j.heliyon.2023.e21369.
Abstract
Introduction Breast cancer stands as the second most deadly form of cancer among women worldwide. Early diagnosis and treatment can significantly mitigate mortality rates. Purpose The study aims to classify breast ultrasound images into benign and malignant tumors. This approach involves segmenting the breast's region of interest (ROI) with an optimized UNet architecture and classifying the ROIs with an optimized shallow CNN model, both tuned through ablation studies. Method Several image processing techniques are applied to improve image quality by removing text, artifacts, and speckle noise, and statistical analysis is performed to verify that the enhanced image quality is satisfactory. With the processed dataset, segmentation of the breast tumor ROI is carried out, optimizing the UNet model through an ablation study in which the architectural configuration and hyperparameters are varied. After obtaining the tumor ROIs from the fine-tuned UNet model (RKO-UNet), an optimized CNN model is employed to classify the tumors into benign and malignant classes. To enhance the CNN model's performance, an ablation study is conducted, coupled with the integration of an attention unit. The model's performance is further assessed by classifying breast cancer in mammogram images. Result The proposed classification model (RKONet-13) achieves an accuracy of 98.41%. Its performance is further compared with five transfer learning models on both pre-segmented and post-segmented datasets. K-fold cross-validation is performed to assess the stability of the proposed RKONet-13 model. Furthermore, the proposed model is compared with previous literature and outperforms existing methods, demonstrating its effectiveness in breast cancer diagnosis. Lastly, the model demonstrates its robustness for breast cancer classification, delivering an exceptional performance of 96.21% on a mammogram dataset.
Conclusion The efficacy of this study relies on image pre-processing, segmentation with a hybrid attention UNet, and classification with a fine-tuned robust CNN model. This comprehensive approach aims to determine an effective technique for detecting breast cancer in ultrasound images.
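The headline figures in this entry (accuracy, plus the sensitivity and specificity reported by several papers below) all derive from the same confusion-matrix counts on a benign(0)/malignant(1) split. A minimal illustrative sketch, not the authors' code:

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity for a benign(0)/malignant(1) split."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # malignant cases correctly flagged
        "specificity": tn / (tn + fp),   # benign cases correctly cleared
    }
```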
Affiliation(s)
- Shahed Hossain
- Health Informatics Research Laboratory (HIRL), Department of Computer Science and Engineering, Daffodil International University, Dhaka, 1341, Bangladesh
- Sami Azam
- Faculty of Science and Technology, Charles Darwin University, Casuarina, 0909, NT, Australia
- Sidratul Montaha
- Department of Computer Science, University of Calgary, Calgary, AB, T2N 1N4, Canada
- Asif Karim
- Faculty of Science and Technology, Charles Darwin University, Casuarina, 0909, NT, Australia
- Sadia Sultana Chowa
- Health Informatics Research Laboratory (HIRL), Department of Computer Science and Engineering, Daffodil International University, Dhaka, 1341, Bangladesh
- Chaity Mondol
- Health Informatics Research Laboratory (HIRL), Department of Computer Science and Engineering, Daffodil International University, Dhaka, 1341, Bangladesh
- Md Zahid Hasan
- Health Informatics Research Laboratory (HIRL), Department of Computer Science and Engineering, Daffodil International University, Dhaka, 1341, Bangladesh
- Mirjam Jonkman
- Faculty of Science and Technology, Charles Darwin University, Casuarina, 0909, NT, Australia
2
Qi W, Wu HC, Chan SC. MDF-Net: A Multi-Scale Dynamic Fusion Network for Breast Tumor Segmentation of Ultrasound Images. IEEE Trans Image Process 2023; 32:4842-4855. PMID: 37639409; DOI: 10.1109/tip.2023.3304518.
Abstract
Breast tumor segmentation of ultrasound images provides valuable information about tumors for early detection and diagnosis. Accurate segmentation is challenging due to low image contrast between areas of interest, speckle noise, and large inter-subject variations in tumor shape and size. This paper proposes a novel Multi-scale Dynamic Fusion Network (MDF-Net) for breast ultrasound tumor segmentation. It employs a two-stage end-to-end architecture with a trunk sub-network for multi-scale feature selection and a structurally optimized refinement sub-network for mitigating impairments such as noise and inter-subject variation via better feature exploration and fusion. The trunk network is extended from UNet++ with a simplified skip pathway structure to connect the features between adjacent scales. Moreover, deep supervision at all scales, instead of only at the finest scale as in UNet++, is proposed to extract more discriminative features and mitigate errors from speckle noise via a hybrid loss function. Unlike previous works, the first stage is linked to a loss function of the second stage so that both the preliminary segmentation and refinement sub-networks can be refined together during training. The refinement sub-network utilizes a structurally optimized MDF mechanism to integrate preliminary segmentation information (capturing general tumor shape and size) at coarse scales and explores inter-subject variation information at finer scales. Experimental results on two public datasets show that the proposed method achieves better Dice and other scores than state-of-the-art methods. Qualitative analysis also indicates that the proposed network is more robust to tumor size/shape, speckle noise, and heavy posterior shadows along tumor boundaries. An optional post-processing step is also proposed to help users mitigate segmentation artifacts. The efficiency of the proposed network is further illustrated on the Electron Microscopy neural structures segmentation dataset, where it outperforms a state-of-the-art algorithm based on UNet-2022 with simpler settings. This indicates the advantages of MDF-Net in other challenging image segmentation tasks with small to medium data sizes.
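The deeply supervised hybrid loss described above can be sketched roughly as follows. This is a generic BCE-plus-Dice formulation summed over decoder scales with assumed equal weights, not the paper's exact loss:

```python
import numpy as np

def dice_coeff(pred, target, eps=1e-7):
    """Soft Dice overlap between a predicted probability map and a binary mask."""
    inter = np.sum(pred * target)
    return (2 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def hybrid_loss(pred, target, eps=1e-7):
    """Binary cross-entropy plus (1 - Dice), a common hybrid segmentation loss."""
    p = np.clip(pred, eps, 1 - eps)
    bce = -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))
    return bce + (1 - dice_coeff(pred, target, eps))

def deeply_supervised_loss(preds_per_scale, targets_per_scale, weights=None):
    """Sum the hybrid loss over every decoder scale instead of only the finest one."""
    if weights is None:
        weights = [1.0] * len(preds_per_scale)
    return sum(w * hybrid_loss(p, t)
               for w, p, t in zip(weights, preds_per_scale, targets_per_scale))
```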
3
Malekmohammadi A, Barekatrezaei S, Kozegar E, Soryani M. Mass detection in automated 3-D breast ultrasound using a patch Bi-ConvLSTM network. Ultrasonics 2023; 129:106891. PMID: 36493507; DOI: 10.1016/j.ultras.2022.106891.
Abstract
Breast cancer mortality can be significantly reduced by early detection of its symptoms. The 3-D Automated Breast Ultrasound (ABUS) has been widely used for breast screening due to its high sensitivity and reproducibility. The large number of ABUS slices, and high variation in size and shape of the masses, make the manual evaluation a challenging and time-consuming process. To assist the radiologists, we propose a convolutional BiLSTM network to classify the slices based on the presence of a mass. Because of its patch-based architecture, this model produces the approximate location of masses as a heat map. The prepared dataset consists of 60 volumes belonging to 43 patients. The precision, recall, accuracy, F1-score, and AUC of the proposed model for slice classification were 84%, 84%, 93%, 84%, and 97%, respectively. Based on the FROC analysis, the proposed detector obtained a sensitivity of 82% with two false positives per volume.
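The abstract notes that the patch-based architecture yields the approximate location of masses as a heat map. A minimal sketch of that idea, assuming simple nearest-neighbour expansion of per-patch scores over each patch's area (the paper's actual assembly may differ):

```python
import numpy as np

def patch_heatmap(patch_scores, patch_size):
    """Expand a grid of per-patch mass probabilities into a coarse image-sized
    heat map by repeating each score over its patch area."""
    return np.kron(np.asarray(patch_scores), np.ones((patch_size, patch_size)))
```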
Affiliation(s)
- Amin Malekmohammadi
- School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran 16846, Iran.
- Sepideh Barekatrezaei
- School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran 16846, Iran.
- Ehsan Kozegar
- Faculty of Technology and Engineering-East of Guilan, University of Guilan, Vajargah, Rudsar, Guilan 4199613776, Iran.
- Mohsen Soryani
- School of Computer Engineering, Iran University of Science and Technology (IUST), Tehran 16846, Iran.
4
Xu X, Lu L, Zhu L, Tan Y, Yu L, Bao L. Predicting the molecular subtypes of breast cancer using nomograms based on three-dimensional ultrasonography characteristics. Front Oncol 2022; 12:838787. PMID: 36059623; PMCID: PMC9437331; DOI: 10.3389/fonc.2022.838787.
Abstract
Background Molecular subtyping of breast cancer is commonly performed for individualized cancer management because it may determine prognosis and treatment. Preoperatively identifying the molecular subtype of a breast cancer can therefore be significant in clinical practice. This retrospective study aimed to investigate characteristic three-dimensional ultrasonographic imaging parameters of breast cancer that are associated with the molecular subtypes and to establish nomograms to predict the molecular subtypes of breast cancers. Methods A total of 309 patients diagnosed with breast cancer between January 2017 and December 2019 were enrolled. Sonographic features were compared between the different molecular subtypes. A multinomial logistic regression model was developed, and nomograms were constructed based on this model. The performance of the nomograms was evaluated in terms of discrimination and calibration. Results Variables such as maximum diameter, irregular shape, non-parallel growth, heterogeneous internal echo, enhanced posterior echo, lymph node metastasis, retraction phenomenon, calcification, and elasticity score were entered into the multinomial model. Three nomograms were constructed to visualize the final model, from which the probabilities of the different molecular subtypes could be calculated. Based on the receiver operating characteristic (ROC) curves of the model, the macro- and micro-average areas under the curve (AUC) were 0.744 and 0.787, respectively. The AUC was 0.759, 0.683, 0.747, and 0.785 for the luminal A (LA), luminal B (LB), human epidermal growth factor receptor 2-positive (HER2), and triple-negative (TN) subtypes, respectively. The nomograms for the LA, HER2, and TN subtypes showed good calibration. Conclusions Sonographic features such as calcification and posterior acoustic features were significantly associated with the molecular subtype of breast cancer. The presence of the retraction phenomenon was the most important predictor of the LA subtype. Nomograms to predict the molecular subtype were established, and the calibration curves and ROC curves showed that the models performed well.
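A nomogram visualizes a multinomial logistic regression: feature points are totalled into a per-subtype linear score, and a softmax turns those scores into subtype probabilities. A hedged sketch with purely hypothetical coefficients, not those fitted in the study:

```python
import numpy as np

def subtype_probabilities(linear_scores):
    """Softmax over per-subtype linear predictor scores (the quantity a
    nomogram totals up from feature points)."""
    z = np.asarray(linear_scores, dtype=float)
    z = z - z.max()                 # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical coefficients for illustration only: [intercept, max diameter (cm),
# retraction phenomenon (0/1), calcification (0/1)] per subtype.
coeffs = {
    "LA":   np.array([0.2, -0.1,  1.5, -0.8]),
    "LB":   np.array([0.0,  0.1,  0.2,  0.1]),
    "HER2": np.array([-0.3, 0.2, -0.5,  0.9]),
    "TN":   np.array([-0.1, 0.3, -0.6, -0.2]),
}
x = np.array([1.0, 2.1, 1.0, 0.0])  # intercept term followed by features
scores = [coeffs[k] @ x for k in ("LA", "LB", "HER2", "TN")]
probs = subtype_probabilities(scores)
```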
5
BUSIS: A Benchmark for Breast Ultrasound Image Segmentation. Healthcare (Basel) 2022; 10(4):729. PMID: 35455906; PMCID: PMC9025635; DOI: 10.3390/healthcare10040729.
Abstract
Breast ultrasound (BUS) image segmentation is challenging and critical for BUS computer-aided diagnosis (CAD) systems. Many BUS segmentation approaches have been studied in the last two decades, but the performance of most approaches has been assessed on relatively small private datasets with different quantitative metrics, which results in discrepancies in performance comparison. Therefore, there is a pressing need to build a benchmark that compares existing methods objectively on a public dataset, to determine the performance of the best breast tumor segmentation algorithm available today, and to investigate which segmentation strategies are valuable in clinical practice and theoretical study. In this work, a benchmark for B-mode breast ultrasound image segmentation is presented. In the benchmark, (1) we collected 562 breast ultrasound images and proposed standardized procedures to obtain accurate annotations with four radiologists; (2) we extensively compared the performance of 16 state-of-the-art segmentation methods and demonstrated that most deep learning-based approaches achieved high Dice similarity coefficient values (DSC ≥ 0.90) and outperformed conventional approaches; (3) we proposed a losses-based approach to evaluate the sensitivity of semi-automatic segmentation to user interactions; and (4) successful segmentation strategies and possible future improvements were discussed in detail.
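The DSC threshold quoted above (DSC ≥ 0.90) and the Jaccard index used by several entries in this list are both computed from mask overlaps. For reference, a plain numpy version:

```python
import numpy as np

def dsc(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    return 2 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    """Jaccard index (intersection over union) between two binary masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union
```

The two are related by DSC = 2J / (1 + J), which is why papers often report either one.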
6
Iqbal A, Sharif M. MDA-Net: Multiscale dual attention-based network for breast lesion segmentation using ultrasound images. J King Saud Univ Comput Inf Sci 2021. DOI: 10.1016/j.jksuci.2021.10.002.
7
Wu Y, Zhang R, Zhu L, Wang W, Wang S, Xie H, Cheng G, Wang FL, He X, Zhang H. BGM-Net: Boundary-Guided Multiscale Network for Breast Lesion Segmentation in Ultrasound. Front Mol Biosci 2021; 8:698334. PMID: 34350211; PMCID: PMC8326799; DOI: 10.3389/fmolb.2021.698334.
Abstract
Automatic and accurate segmentation of breast lesion regions from ultrasonography is an essential step for ultrasound-guided diagnosis and treatment. However, developing a desirable segmentation method is very difficult due to strong imaging artifacts, e.g., speckle noise, low contrast, and intensity inhomogeneity, in breast ultrasound images. To solve this problem, this paper proposes a novel boundary-guided multiscale network (BGM-Net) to boost the performance of breast lesion segmentation from ultrasound images based on the feature pyramid network (FPN). First, we develop a boundary-guided feature enhancement (BGFE) module to enhance the feature map for each FPN layer by learning a boundary map of breast lesion regions. The BGFE module improves the boundary detection capability of the FPN framework so that weak boundaries in ambiguous regions can be correctly identified. Second, we design a multiscale scheme to leverage information from different image scales in order to tackle ultrasound artifacts. Specifically, we downsample each testing image into a coarse counterpart, and both the testing image and its coarse counterpart are input into BGM-Net to predict a fine and a coarse segmentation map, respectively. The segmentation result is then produced by fusing the fine and coarse segmentation maps, so that breast lesion regions are accurately segmented from ultrasound images and false detections are effectively removed owing to boundary feature enhancement and multiscale image information. We validate the performance of the proposed approach on two challenging breast ultrasound datasets, and experimental results demonstrate that our approach outperforms state-of-the-art methods.
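The fine/coarse fusion step can be sketched as follows. The abstract does not specify the fusion operator, so simple weighted averaging after nearest-neighbour upsampling of the coarse map is an assumption:

```python
import numpy as np

def fuse_multiscale(fine_map, coarse_map, alpha=0.5):
    """Average a full-resolution probability map with an upsampled coarse one.
    The coarse map is nearest-neighbour upsampled to the fine resolution."""
    factor = fine_map.shape[0] // coarse_map.shape[0]
    up = np.kron(coarse_map, np.ones((factor, factor)))  # repeat each coarse cell
    return alpha * fine_map + (1 - alpha) * up
```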
Affiliation(s)
- Yunzhu Wu
- Department of Ultrasound, Shenzhen People’s Hospital, The Second Clinical College of Jinan University, Shenzhen, China
- Ruoxin Zhang
- Department of Gastroenterology, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Lei Zhu
- Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge, United Kingdom
- Weiming Wang
- School of Science and Technology, The Open University of Hong Kong, Hong Kong, China
- Shengwen Wang
- Department of Neurosurgery, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, China
- Guangdong Provincial Key Laboratory of Malignant Tumor Epigenetics and Gene Regulation, Sun Yat-Sen Memorial Hospital, Sun Yat-Sen University, Guangzhou, China
- Haoran Xie
- Department of Computing and Decision Sciences, Lingnan University, Hong Kong, China
- Gary Cheng
- Department of Mathematics and Information Technology, The Education University of Hong Kong, Hong Kong, China
- Fu Lee Wang
- School of Science and Technology, The Open University of Hong Kong, Hong Kong, China
- Xingxiang He
- Department of Gastroenterology, The First Affiliated Hospital of Guangdong Pharmaceutical University, Guangzhou, China
- Hai Zhang
- Department of Ultrasound, Shenzhen People’s Hospital, The Second Clinical College of Jinan University, Shenzhen, China
- The First Affiliated Hospital of Southern University of Science and Technology, Shenzhen, China
8
Shen X, Ma H, Liu R, Li H, He J, Wu X. Lesion segmentation in breast ultrasound images using the optimized marked watershed method. Biomed Eng Online 2021; 20:57. PMID: 34098970; PMCID: PMC8186073; DOI: 10.1186/s12938-021-00891-7.
Abstract
BACKGROUND Breast cancer is one of the most serious diseases threatening women's health. Early screening based on ultrasound can help to detect and treat tumours at an early stage. However, due to the lack of radiologists with professional skills, ultrasound-based breast cancer screening has not been widely used in rural areas. Computer-aided diagnosis (CAD) technology can effectively alleviate this problem. Since breast ultrasound (BUS) images have low resolution and speckle noise, lesion segmentation, an important step in CAD systems, is challenging. RESULTS Two datasets were used for evaluation. Dataset A comprises 500 BUS images from local hospitals, while dataset B comprises 205 open-source BUS images. The experimental results show that the proposed method outperformed related classic segmentation methods and the state-of-the-art deep learning model RDAU-NET. Its accuracy (Acc), Dice similarity coefficient (DSC) and Jaccard index (JI) reached 96.25%, 78.4% and 65.34% on dataset A, and its Acc, DSC and sensitivity reached 97.96%, 86.25% and 88.79% on dataset B, respectively. CONCLUSIONS We proposed an adaptive morphological snake based on marked watershed (AMSMW) algorithm for BUS image segmentation. It was proven to be robust, efficient and effective. In addition, it was found to be more sensitive to malignant lesions than to benign lesions. METHODS The proposed method consists of two steps. In the first step, contrast limited adaptive histogram equalization (CLAHE) and a side window filter (SWF) are used to preprocess BUS images. Lesion contours can be effectively highlighted, and the influence of noise can be eliminated to a great extent. In the second step, we propose an adaptive morphological snake (AMS), which adjusts its working parameters adaptively according to the size of the lesion. Its segmentation results are combined with those of the morphological method. Then, we determine the marked area and obtain candidate contours with a marked watershed (MW). Finally, the best lesion contour is chosen by the maximum average radial derivative (ARD).
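One possible reading of the final contour-selection step, the maximum average radial derivative (ARD), implemented from the description above: score each candidate contour by the mean intensity change along the outward radial direction, and keep the contour with the largest magnitude. The authors' exact ARD definition may differ:

```python
import numpy as np

def average_radial_derivative(image, contour):
    """Mean intensity derivative along the outward radial direction from the
    contour centroid; the candidate contour with the largest magnitude is
    taken to sit on the lesion boundary."""
    pts = np.asarray(contour, dtype=float)
    center = pts.mean(axis=0)
    total = 0.0
    for p in pts:
        d = p - center
        n = np.linalg.norm(d)
        if n == 0:
            continue
        u = d / n                                  # outward radial unit vector
        outer = np.round(p + u).astype(int)        # one pixel outside the contour
        inner = np.round(p - u).astype(int)        # one pixel inside the contour
        total += image[outer[0], outer[1]] - image[inner[0], inner[1]]
    return total / len(pts)
```

On a bright lesion against a dark background, a contour on the true boundary crosses a sharp intensity drop and scores a large |ARD|, while a contour strictly inside the lesion scores near zero.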
Affiliation(s)
- Xiaoyan Shen
- College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China
- He Ma
- College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China
- Key Laboratory of Intelligent Computing in Medical Image, Ministry of Education, Shenyang, China
- Ruibo Liu
- College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China
- Hong Li
- College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China
- Jiachuan He
- Department of Radiology, Liaoning Cancer Hospital, Shenyang, China
- Xinran Wu
- College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China
9
Xue C, Zhu L, Fu H, Hu X, Li X, Zhang H, Heng PA. Global guidance network for breast lesion segmentation in ultrasound images. Med Image Anal 2021; 70:101989. PMID: 33640719; DOI: 10.1016/j.media.2021.101989.
Abstract
Automatic breast lesion segmentation in ultrasound helps to diagnose breast cancer, one of the most dreadful diseases affecting women globally. Segmenting breast regions accurately from ultrasound images is a challenging task due to the inherent speckle artifacts, blurry breast lesion boundaries, and inhomogeneous intensity distributions inside the breast lesion regions. Recently, convolutional neural networks (CNNs) have demonstrated remarkable results in medical image segmentation tasks. However, the convolutional operations in a CNN often focus on local regions and have limited capability to capture long-range dependencies of the input ultrasound image, resulting in degraded breast lesion segmentation accuracy. In this paper, we develop a deep convolutional neural network equipped with a global guidance block (GGB) and breast lesion boundary detection (BD) modules to boost breast ultrasound lesion segmentation. The GGB utilizes the multi-layer integrated feature map as guidance information to learn long-range non-local dependencies from both the spatial and channel domains. The BD modules learn an additional breast lesion boundary map to enhance the boundary quality of the segmentation result. Experimental results on a public dataset and a collected dataset show that our network outperforms other medical image segmentation methods and recent semantic segmentation methods on breast ultrasound lesion segmentation. Moreover, we also show the application of our network to ultrasound prostate segmentation, in which our method identifies prostate regions better than state-of-the-art networks.
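The long-range dependencies that the GGB captures are the province of non-local (self-attention) operations, in which every spatial position attends to every other. A generic numpy sketch of spatial self-attention, not the paper's specific module:

```python
import numpy as np

def nonlocal_attention(feat):
    """Spatial self-attention over a (H, W, C) feature map: every position is
    re-expressed as a similarity-weighted mixture of all positions, capturing
    the long-range dependencies that local convolutions miss."""
    h, w, c = feat.shape
    x = feat.reshape(h * w, c)              # flatten spatial positions
    sim = x @ x.T / np.sqrt(c)              # pairwise similarity, (HW, HW)
    sim = sim - sim.max(axis=1, keepdims=True)  # stabilize the softmax
    attn = np.exp(sim)
    attn /= attn.sum(axis=1, keepdims=True)     # rows sum to 1
    return (attn @ x).reshape(h, w, c)      # globally re-weighted features
```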
Affiliation(s)
- Cheng Xue
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong, China
- Lei Zhu
- Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Hong Kong, China.
- Huazhu Fu
- Inception Institute of Artificial Intelligence, Abu Dhabi, UAE
- Xiaowei Hu
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong, China
- Xiaomeng Li
- Department of Electronic and Computer Engineering, The Hong Kong University of Science and Technology, Hong Kong, China
- Hai Zhang
- Shenzhen People's Hospital, The Second Clinical College of Jinan University, The First Affiliated Hospital of Southern University of Science and Technology, Guangdong Province, China
- Pheng-Ann Heng
- Department of Computer Science and Engineering, The Chinese University of Hong Kong; Shenzhen Key Laboratory of Virtual Reality and Human Interaction Technology, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, China
10
Lei Y, He X, Yao J, Wang T, Wang L, Li W, Curran WJ, Liu T, Xu D, Yang X. Breast tumor segmentation in 3D automatic breast ultrasound using Mask scoring R-CNN. Med Phys 2020; 48:204-214. PMID: 33128230; DOI: 10.1002/mp.14569.
Abstract
PURPOSE Automatic breast ultrasound (ABUS) imaging has become an essential tool in breast cancer diagnosis since it provides complementary information to other imaging modalities. Lesion segmentation on ABUS is a prerequisite step of breast cancer computer-aided diagnosis (CAD). This work aims to develop a deep learning-based method for automatic breast tumor segmentation in three-dimensional (3D) ABUS. METHODS For breast tumor segmentation in ABUS, we developed a Mask scoring region-based convolutional neural network (R-CNN) that consists of five subnetworks: a backbone, a regional proposal network, a region convolutional neural network head, a mask head, and a mask score head. A network block building a direct correlation between mask quality and region class was integrated into the Mask scoring R-CNN based framework for the segmentation of new ABUS images with ambiguous regions of interest (ROIs). For segmentation accuracy evaluation, we retrospectively investigated 70 patients with breast tumors confirmed by needle biopsy and manually delineated on ABUS, of which 40 were used for fivefold cross-validation and 30 for a hold-out test. The comparison between the automatic breast tumor segmentations and the manual contours was quantified by (I) six metrics: Dice similarity coefficient (DSC), Jaccard index, 95% Hausdorff distance (HD95), mean surface distance (MSD), residual mean square distance (RMSD), and center of mass distance (CMD); and (II) Pearson correlation analysis and Bland-Altman analysis. RESULTS The mean (median) DSC was 85% ± 10.4% (89.4%) and 82.1% ± 14.5% (85.6%) for cross-validation and the hold-out test, respectively. The corresponding HD95, MSD, RMSD, and CMD of the two tests were 1.646 ± 1.191 and 1.665 ± 1.129 mm, 0.489 ± 0.406 and 0.475 ± 0.371 mm, 0.755 ± 0.755 and 0.751 ± 0.508 mm, and 0.672 ± 0.612 and 0.665 ± 0.729 mm. The mean volumetric difference (mean and ±1.96 standard deviations) was 0.47 cc ([-0.77, 1.71]) for cross-validation and 0.23 cc ([-0.23, 0.69]) for the hold-out test. CONCLUSION We developed a novel Mask scoring R-CNN approach for the automated segmentation of breast tumors in ABUS images and demonstrated its accuracy. Our learning-based method can potentially assist the clinical CAD of breast cancer using 3D ABUS imaging.
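The volumetric differences quoted above are Bland-Altman style limits of agreement: the mean of the paired differences plus or minus 1.96 standard deviations. For reference:

```python
import numpy as np

def bland_altman_limits(auto_volumes, manual_volumes):
    """Mean volumetric difference and 95% limits of agreement
    (mean ± 1.96 standard deviations of the paired differences)."""
    diff = np.asarray(auto_volumes, float) - np.asarray(manual_volumes, float)
    mean, sd = diff.mean(), diff.std(ddof=1)   # sample standard deviation
    return mean, (mean - 1.96 * sd, mean + 1.96 * sd)
```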
Affiliation(s)
- Yang Lei
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Xiuxiu He
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Jincao Yao
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital; Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Tonghe Wang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Lijing Wang
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital; Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Wei Li
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital; Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Walter J Curran
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Tian Liu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
- Dong Xu
- Cancer Hospital of the University of Chinese Academy of Sciences, Zhejiang Cancer Hospital; Institute of Cancer and Basic Medicine (IBMC), Chinese Academy of Sciences, Hangzhou, 310022, China
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA, 30322, USA
11
12
Wang F, Liu X, Yuan N, Qian B, Ruan L, Yin C, Jin C. Study on automatic detection and classification of breast nodule using deep convolutional neural network system. J Thorac Dis 2020; 12:4690-4701. PMID: 33145042; PMCID: PMC7578508; DOI: 10.21037/jtd-19-3013.
Abstract
Background Conventional manual ultrasound scanning and diagnosis of the breast are considered operator-dependent, relatively slow, and error-prone. In this study, we used an Automated Breast Ultrasound (ABUS) machine for scanning and deep convolutional neural network (CNN) technology, a kind of deep learning (DL) algorithm, for the detection and classification of breast nodules, aiming to achieve automatic and accurate diagnosis. Methods Two hundred and ninety-three lesions from 194 patients with definite pathological diagnoses (117 benign and 176 malignant) were recruited as the case group. Another 70 patients without breast diseases were enrolled as the control group. All breast scans were carried out by an ABUS machine and then randomly divided into training, verification, and test sets, with a proportion of 7:1:2. On the training set, we constructed a detection model with a three-dimensional U-shaped convolutional neural network (3D U-Net) architecture to segment the nodules from the background breast images. Residual blocks, attention connections, and hard mining were used to optimize the model, while random cropping, flipping, and rotation were applied for data augmentation. In the test phase, the current model was compared with those in previously reported studies. On the verification set, the effectiveness of the detection model was evaluated. In the classification phase, multiple convolutional layers and fully connected layers were applied to build a classification model to identify whether a nodule was malignant. Results Our detection model yielded a sensitivity of 91% with 1.92 false positives per automatically scanned image. The classification model achieved a sensitivity of 87.0%, a specificity of 88.0%, and an accuracy of 87.5%. Conclusions A deep CNN combined with ABUS may be a promising tool for the easy detection and accurate diagnosis of breast nodules.
Affiliation(s)
- Feiqian Wang
- Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Xiaotong Liu
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
- Na Yuan
- Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Buyue Qian
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
- Litao Ruan
- Department of Ultrasound, The First Affiliated Hospital of Xi'an Jiaotong University, China
- Changchang Yin
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
- Ciping Jin
- National Engineering Lab for Big Data Analytics, Xi'an Jiaotong University, Xi'an, China; School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an, China
Collapse
|
13
|
Nicosia L, Ferrari F, Bozzini AC, Latronico A, Trentin C, Meneghetti L, Pesapane F, Pizzamiglio M, Balesetreri N, Cassano E. Automatic breast ultrasound: state of the art and future perspectives. Ecancermedicalscience 2020; 14:1062. [PMID: 32728378 PMCID: PMC7373644 DOI: 10.3332/ecancer.2020.1062] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/26/2020] [Indexed: 11/08/2022] Open
Abstract
The three-dimensional automated breast ultrasound system (3D ABUS) is a new device representing a major innovation in breast ultrasound, with several application scenarios of great interest. ABUS aims to address some of the main shortcomings of traditional ultrasound, such as lack of standardization, poor reproducibility across operators of differing skill, a small field of view, and heavy demands on physician time. ABUS has proven to be an excellent non-ionising alternative to other supplemental screening options for women with dense breast tissue, and it also appears very promising in daily clinical practice. The purpose of this paper is to summarize current applications of ABUS, focusing on clinical applications and future perspectives, as ABUS is particularly promising for studies involving artificial intelligence, radiomics, and the evaluation of breast molecular subtypes.
Affiliation(s)
- Luca Nicosia
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Federica Ferrari
- Postgraduation School in Radiodiagnostics, Università degli Studi di Milano, 20122 Milan, Italy
- Anna Carla Bozzini
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Antuono Latronico
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Chiara Trentin
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Lorenza Meneghetti
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Filippo Pesapane
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Maria Pizzamiglio
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy
- Nicola Balesetreri
- Department of Radiology, European Institute of Oncology, 20141 Milan, Italy
- Enrico Cassano
- Department of Breast Radiology, European Institute of Oncology, 20141 Milan, Italy

14
Kore SS, Kadam AB. A novel incomplete sparse least square optimized regression model for abdominal mass detection in ultrasound images. EVOLUTIONARY INTELLIGENCE 2020. [DOI: 10.1007/s12065-020-00431-7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
15
Lee CY, Chang TF, Chou YH, Yang KC. Fully automated lesion segmentation and visualization in automated whole breast ultrasound (ABUS) images. Quant Imaging Med Surg 2020; 10:568-584. [PMID: 32269918 DOI: 10.21037/qims.2020.01.12] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/16/2022]
Abstract
Background The number of breast cancer patients has increased each year, and the demand for breast cancer detection has become quite large. There are many common breast cancer diagnostic tools. The latest automated whole breast ultrasound (ABUS) technology can obtain a complete breast tissue structure, which improves breast cancer detection technology. However, due to the large amount of ABUS image data, manual interpretation is time-consuming and labor-intensive. If there are lesions in multiple images, there may be some omissions. In addition, if further volume information or the three-dimensional shape of the lesion is needed for therapy, it is necessary to manually segment each lesion, which is inefficient for diagnosis. Therefore, automatic lesion segmentation for ABUS is an important issue for guiding therapy. Methods Due to the amount of speckle noise in an ultrasonic image and the low contrast of the lesion boundary, it is quite difficult to automatically segment the lesion. To address the above challenges, this study proposes an automated lesion segmentation algorithm. The architecture of the proposed algorithm can be divided into four parts: (I) volume of interest selection, (II) preprocessing, (III) segmentation, and (IV) visualization. A volume of interest (VOI) is automatically selected first via a three-dimensional level-set, and then the method uses anisotropic diffusion to address the speckled noise and intensity inhomogeneity correction to eliminate shadowing artifacts before the adaptive distance regularization level set method (DRLSE) conducts segmentation. Finally, the two-dimensional segmented images are reconstructed for visualization in the three-dimensional space. Results The ground truth is delineated by two radiologists with more than 10 years of experience in breast sonography. In this study, three performance assessments are carried out to evaluate the effectiveness of the proposed algorithm. 
The first assessment is a similarity measurement. The second compares the results of the proposed algorithm with those of the Chan-Vese level set method. The third is a volume estimation on phantom cases. In the 2D validation of the first assessment, the area Dice similarity coefficients of real case set A, real case set B, and the phantoms are 0.84±0.02, 0.86±0.03, and 0.92±0.02, respectively. The overlap fraction (OF) and overlap value (OV) are 0.84±0.06 and 0.78±0.04 for case set A, 0.91±0.04 and 0.82±0.05 for case set B, and 0.95±0.02 and 0.92±0.03 for the phantoms. In the 3D validation, the volume Dice similarity coefficients of case set A, case set B, and the phantoms are 0.85±0.02, 0.89±0.04, and 0.94±0.02, respectively; the corresponding OF and OV are 0.82±0.06 and 0.79±0.04 for case set A, 0.92±0.04 and 0.85±0.07 for case set B, and 0.95±0.01 and 0.93±0.04 for the phantoms. The proposed algorithm is therefore highly reliable in most cases. In the second assessment, the Dice coefficients of the proposed algorithm for case set A, case set B, and the phantoms are 0.84±0.02, 0.86±0.03, and 0.92±0.02, respectively, versus 0.65±0.23, 0.69±0.14, and 0.76±0.14 for the Chan-Vese level set; the difference in segmentation performance between the methods is highly significant (P<0.01), showing that the proposed algorithm is more accurate. In the third assessment, the Spearman's correlation coefficient between the segmented volumes and the corresponding ground-truth volumes is ρ=0.929 (P=0.01).
Conclusions In summary, the proposed method can batch-process ABUS images, segment lesions, calculate their volumes, and visualize the lesions to facilitate observation by radiologists and physicians.
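The overlap metrics quoted in this abstract can be reproduced with a few lines of code. The sketch below works on flat binary masks and assumes the common definitions (OF as intersection over the reference area, OV as intersection over union); these assumptions may differ in detail from the paper's formulas:

```python
def dice_coefficient(seg, ref):
    """Dice similarity: 2|A∩B| / (|A| + |B|) for flat 0/1 masks."""
    inter = sum(1 for s, r in zip(seg, ref) if s and r)
    return 2.0 * inter / (sum(seg) + sum(ref))

def overlap_fraction(seg, ref):
    """Assumed OF: intersection over the reference (ground-truth) area."""
    inter = sum(1 for s, r in zip(seg, ref) if s and r)
    return inter / sum(ref)

def overlap_value(seg, ref):
    """Assumed OV: intersection over union (the Jaccard index)."""
    inter = sum(1 for s, r in zip(seg, ref) if s and r)
    union = sum(1 for s, r in zip(seg, ref) if s or r)
    return inter / union
```

In practice the 2D/3D masks would be flattened (e.g. `mask.ravel()` in NumPy) before being passed in.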
Affiliation(s)
- Chia-Yen Lee
- Department of Electrical Engineering, National United University, Taipei, Taiwan
- Tzu-Fang Chang
- Department of Electrical Engineering, National United University, Taipei, Taiwan
- Yi-Hong Chou
- Department of Medical Imaging and Radiological Technology, Yuanpei University of Medical Technology, Hsinchu, Taiwan; Department of Radiology, Taipei Veterans General Hospital, Taipei, Taiwan; School of Medicine, National Yang Ming University, Taipei, Taiwan
- Kuen-Cheh Yang
- Department of Family Medicine, National Taiwan University Hospital, Bei-Hu Branch, Taipei, Taiwan

16

17
Murtaza G, Shuib L, Abdul Wahab AW, Mujtaba G, Mujtaba G, Nweke HF, Al-garadi MA, Zulfiqar F, Raza G, Azmi NA. Deep learning-based breast cancer classification through medical imaging modalities: state of the art and research challenges. Artif Intell Rev 2019. [DOI: 10.1007/s10462-019-09716-5] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/13/2022]
18
Xu Y, Wang Y, Yuan J, Cheng Q, Wang X, Carson PL. Medical breast ultrasound image segmentation by machine learning. ULTRASONICS 2019; 91:1-9. [PMID: 30029074 DOI: 10.1016/j.ultras.2018.07.006] [Citation(s) in RCA: 70] [Impact Index Per Article: 14.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/27/2018] [Revised: 07/12/2018] [Accepted: 07/12/2018] [Indexed: 05/02/2023]
Abstract
Breast cancer is the most commonly diagnosed cancer, accounting alone for 30% of all new cancer diagnoses in women and posing a threat to women's health. Segmentation of breast ultrasound images into functional tissues can aid tumor localization, breast density measurement, and assessment of treatment response, which is important to the clinical diagnosis of breast cancer. However, manual segmentation of ultrasound images is skill- and experience-dependent and leads to subjective diagnoses; in addition, it is time-consuming for radiologists to review hundreds of clinical images. Automatic segmentation of breast ultrasound images into functional tissues has therefore received attention in recent years, amidst the more numerous studies of detection and segmentation of masses. In this paper, we propose to use convolutional neural networks (CNNs) to segment three-dimensional (3D) breast ultrasound images into four major tissues: skin, fibroglandular tissue, mass, and fatty tissue. Quantitative evaluation metrics for the segmentation results, including Accuracy, Precision, Recall, and F1 measure, all reached over 80%, indicating that the proposed method can distinguish functional tissues in breast ultrasound images. Another metric, the Jaccard similarity index (JSI), reaches 85.1%, outperforming our previous study using the watershed algorithm with a 74.54% JSI value. Thus, our proposed method might have the potential to provide the segmentations necessary to assist the clinical diagnosis of breast cancer and to improve imaging in other medical ultrasound modes.
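Pixel-wise evaluation metrics of the kind reported in this abstract (per-tissue precision, recall, and F1) can be computed from flattened label maps. A minimal sketch, with label names of our own choosing rather than the paper's:

```python
def per_class_scores(pred, truth, labels):
    """Pixel-wise precision, recall, and F1 for each tissue label.

    `pred` and `truth` are flat, equal-length sequences of per-pixel labels.
    """
    scores = {}
    for c in labels:
        tp = sum(1 for p, t in zip(pred, truth) if p == c and t == c)
        fp = sum(1 for p, t in zip(pred, truth) if p == c and t != c)
        fn = sum(1 for p, t in zip(pred, truth) if p != c and t == c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        scores[c] = {"precision": precision, "recall": recall, "f1": f1}
    return scores
```

For four tissue classes, `labels` would simply list the four class ids used in the segmentation maps.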
Affiliation(s)
- Yuan Xu
- Department of Electronic Science and Engineering, Nanjing University, Nanjing 210093, China
- Yuxin Wang
- Department of Electronic Science and Engineering, Nanjing University, Nanjing 210093, China
- Jie Yuan
- Department of Electronic Science and Engineering, Nanjing University, Nanjing 210093, China
- Qian Cheng
- Department of Physics, Tongji University, Shanghai 200000, China
- Xueding Wang
- Department of Physics, Tongji University, Shanghai 200000, China; Department of Radiology, University of Michigan, Ann Arbor, MI 48109, USA
- Paul L Carson
- Department of Radiology, University of Michigan, Ann Arbor, MI 48109, USA

19
Lal M, Kaur L, Gupta S. Automatic segmentation of tumors in B-Mode breast ultrasound images using information gain based neutrosophic clustering. JOURNAL OF X-RAY SCIENCE AND TECHNOLOGY 2018; 26:209-225. [PMID: 29154313 DOI: 10.3233/xst-17313] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
BACKGROUND Because breast ultrasound images have low contrast and contain inherent noise and shadowing effects arising from the imaging process, segmentation of breast tumors in ultrasound images is a challenging task. A robust breast ultrasound image segmentation technique is therefore essential. OBJECTIVE To develop an automatic lesion segmentation technique for breast ultrasound images. METHODS First, the technique automatically detects the suspicious tumor region of interest and discards the unwanted complex background regions. Next, based on the concept of information gain, the technique applies an existing neutrosophic clustering method to the detected region to segment the desired tumor area. The proposed technique computes information gain values from the local neighbourhood of each pixel, which are then used to update the membership values and the cluster centers in the neutrosophic clustering process. Integrating the concepts of entropy and neutrosophic logic features into the technique enabled it to generate better segmentation results. RESULTS Results of the proposed method were compared both qualitatively and quantitatively with fuzzy c-means, neutrosophic c-means, and neutrosophic ℓ-means clustering methods. The proposed method outperformed the other three methods and yielded the best mean (TP: 94.72, FP: 5.85, SI: 93.75, HD: 8.2, AMED: 2.4) and standard deviation (TP: 3.2, FP: 3.7, SI: 3.8, HD: 2.6, AMED: 1.3) values for the different quality metrics on the current set of breast ultrasound images. CONCLUSION The study demonstrated that the proposed technique is robust to the shadowing effect and produces a more accurate segmentation of the tumor region, very similar to that segmented visually by a radiologist.
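The entropy and information-gain quantities that drive the clustering update above can be illustrated with a minimal sketch; this shows only the generic textbook definitions, not the paper's exact neighbourhood weighting:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of a pixel neighbourhood's intensity values."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(parent, parts):
    """Entropy of the parent region minus the size-weighted entropy of its parts."""
    n = len(parent)
    return shannon_entropy(parent) - sum(
        len(p) / n * shannon_entropy(p) for p in parts)
```

A homogeneous neighbourhood has zero entropy, so a split that separates two intensity populations cleanly yields the maximum information gain.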
Affiliation(s)
- Madan Lal
- Department of Computer Engineering, Punjabi University, Patiala, India
- Lakhwinder Kaur
- Department of Computer Engineering, Punjabi University, Patiala, India
- Savita Gupta
- Department of Computer Science and Engineering, University Institute of Engineering and Technology, Panjab University, Chandigarh, India

20
Wang Y, Nasief HG, Kohn S, Milkowski A, Clary T, Barnes S, Barbone PE, Hall TJ. Three-dimensional Ultrasound Elasticity Imaging on an Automated Breast Volume Scanning System. ULTRASONIC IMAGING 2017; 39:369-392. [PMID: 28585511 PMCID: PMC5643218 DOI: 10.1177/0161734617712238] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/03/2023]
Abstract
Ultrasound elasticity imaging has demonstrated utility in breast imaging, but it is typically performed with handheld transducers and two-dimensional imaging. Two-dimensional (2D) elastography images tissue stiffness in only a single plane and hence suffers from errors due to out-of-plane motion, whereas three-dimensional (3D) data acquisition and motion tracking can recover the out-of-plane motion that is lost in 2D elastography systems. A commercially available automated breast volume scanning system that acquires 3D ultrasound data with precisely controlled elevational movement of a 1D array transducer was employed in this study. A hybrid guided 3D motion-tracking algorithm was developed that first estimates the displacements in one plane using a modified quality-guided search method and then performs an elevational guided search for displacement estimation in adjacent planes. To assess the performance of the method, 3D radiofrequency echo data were acquired with this system from a phantom and from an in vivo human breast. In both experiments, the axial displacement fields were smooth and high cross-correlation coefficients were obtained over most of the tracking region. The motion-tracking performance of the new method was compared with a correlation-based exhaustive-search method. For all motion-tracking volume pairs, the average motion-compensated cross-correlation values obtained by the guided-search method were equivalent to those of the exhaustive search, while the computation time was lower by about a factor of 10. The proposed 3D ultrasound elasticity imaging method is therefore a more efficient approach to producing high-quality 3D ultrasound strain images.
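The correlation-based displacement search at the heart of such motion tracking can be sketched in one dimension. This is the exhaustive baseline the paper compares against; the quality-guided variant adds seeding and neighbour guidance on top of it, and all names here are illustrative:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) ** 0.5 *
           sum((y - mb) ** 2 for y in b) ** 0.5)
    return num / den  # assumes neither window is constant

def best_shift(window, signal, max_shift):
    """Exhaustive 1-D search: the shift of `window` along `signal` maximizing NCC."""
    return max(range(max_shift + 1),
               key=lambda s: ncc(window, signal[s:s + len(window)]))
```

In 3D tracking the same idea runs over small echo blocks in three dimensions, which is why pruning the search (the guided approach) pays off so heavily.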
Affiliation(s)
- Yuqi Wang
- Department of Medical Physics, University of Wisconsin, Madison, WI 53705, USA
- Haidy G Nasief
- Department of Medical Physics, University of Wisconsin, Madison, WI 53705, USA
- Sarah Kohn
- Department of Medical Physics, University of Wisconsin, Madison, WI 53705, USA
- Andy Milkowski
- Siemens Healthcare USA, Ultrasound Division, Issaquah, WA 98029, USA
- Tom Clary
- The Inception Group, LLC, Sammamish, WA 98075, USA
- Stephen Barnes
- Siemens Healthcare USA, Ultrasound Division, Issaquah, WA 98029, USA
- Paul E Barbone
- Department of Mechanical Engineering, Boston University, Boston, MA 02215, USA
- Timothy J Hall
- Department of Medical Physics, University of Wisconsin, Madison, WI 53705, USA

21
Bharti P, Mittal D, Ananthasivan R. Computer-aided Characterization and Diagnosis of Diffuse Liver Diseases Based on Ultrasound Imaging: A Review. ULTRASONIC IMAGING 2017; 39:33-61. [PMID: 27097589 DOI: 10.1177/0161734616639875] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
Diffuse liver diseases, such as hepatitis, fatty liver, and cirrhosis, are becoming a leading cause of fatality and disability all over the world. Early detection and diagnosis of these diseases is extremely important to save lives and improve effectiveness of treatment. Ultrasound imaging, a noninvasive diagnostic technique, is the most commonly used modality for examining liver abnormalities. However, the accuracy of ultrasound-based diagnosis depends highly on expertise of radiologists. Computer-aided diagnosis systems based on ultrasound imaging assist in fast diagnosis, provide a reliable "second opinion" for experts, and act as an effective tool to measure response of treatment on patients undergoing clinical trials. In this review, we first describe appearance of liver abnormalities in ultrasound images and state the practical issues encountered in characterization of diffuse liver diseases that can be addressed by software algorithms. We then discuss computer-aided diagnosis in general with features and classifiers relevant to diffuse liver diseases. In later sections of this paper, we review the published studies and describe the key findings of those studies. A concise tabular summary comparing image database, features extraction, feature selection, and classification algorithms presented in the published studies is also exhibited. Finally, we conclude with a summary of key findings and directions for further improvements in the areas of accuracy and objectiveness of computer-aided diagnosis.
Affiliation(s)
- Puja Bharti
- Department of Electrical and Instrumentation Engineering, Thapar University, Patiala, India
- Deepti Mittal
- Department of Electrical and Instrumentation Engineering, Thapar University, Patiala, India

22
Prabusankarlal KM, Thirumoorthy P, Manavalan R. Segmentation of Breast Lesions in Ultrasound Images through Multiresolution Analysis Using Undecimated Discrete Wavelet Transform. ULTRASONIC IMAGING 2016; 38:384-402. [PMID: 26586725 DOI: 10.1177/0161734615615838] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
Early detection and diagnosis of breast cancer reduces patient mortality by increasing the available treatment options. A novel method for the segmentation of breast ultrasound images is proposed in this work. The proposed method uses the undecimated discrete wavelet transform to perform a multiresolution analysis of the input ultrasound image. As the resolution level increases, the effect of noise is reduced, but image detail is also diluted. The appropriate resolution level, which retains the essential details of the tumor, is selected automatically through mean structural similarity. The feature vector for each pixel is constructed by sampling intra-resolution and inter-resolution data of the image. The dimensionality of the feature vectors is reduced using principal component analysis, and the reduced set is segmented into two disjoint clusters using a spatially regularized fuzzy c-means algorithm. The proposed algorithm is evaluated with four validation metrics on a breast ultrasound database of 150 images, including 90 benign and 60 malignant cases. The algorithm produced significantly better segmentation results (Dice coef = 0.8595, boundary displacement error = 9.796, dvi = 1.744, and global consistency error = 0.1835) than three other state-of-the-art methods.
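The final clustering step of this pipeline can be illustrated with a plain fuzzy c-means sketch in one dimension. Note that this omits the spatial regularization term the paper adds, and the deterministic initialization and iteration count are our own choices:

```python
def fuzzy_c_means_1d(xs, c=2, m=2.0, n_iter=50):
    """Plain 1-D fuzzy c-means: alternate membership and center updates."""
    lo, hi = min(xs), max(xs)
    centers = [lo + (i + 0.5) * (hi - lo) / c for i in range(c)]  # spread init
    U = []
    for _ in range(n_iter):
        # membership u_ik = d_ik^(-2/(m-1)) / sum_j d_jk^(-2/(m-1))
        U = []
        for x in xs:
            inv = [(abs(x - v) + 1e-12) ** (-2 / (m - 1)) for v in centers]
            s = sum(inv)
            U.append([iv / s for iv in inv])
        # centers as membership^m-weighted means
        centers = [
            sum(U[k][i] ** m * xs[k] for k in range(len(xs))) /
            sum(U[k][i] ** m for k in range(len(xs)))
            for i in range(c)
        ]
    return centers, U
```

In the paper's setting, the scalars `xs` would be replaced by the PCA-reduced feature vectors, and the membership update would carry the extra spatial term.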
Affiliation(s)
- K M Prabusankarlal
- Research and Development Centre, Bharathiar University, Coimbatore, India; Department of Electronics & Communication, K.S.R. College of Arts & Science, Tiruchengode, India
- P Thirumoorthy
- Department of Electronics & Communication, Government Arts College, Dharmapuri, India
- R Manavalan
- Department of Computer Applications, K.S.R. College of Arts & Science, Tiruchengode, India

23
Gu P, Lee WM, Roubidoux MA, Yuan J, Wang X, Carson PL. Automated 3D ultrasound image segmentation to aid breast cancer image interpretation. ULTRASONICS 2016; 65:51-8. [PMID: 26547117 PMCID: PMC4702489 DOI: 10.1016/j.ultras.2015.10.023] [Citation(s) in RCA: 20] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/23/2015] [Revised: 10/20/2015] [Accepted: 10/23/2015] [Indexed: 05/18/2023]
Abstract
Segmentation of an ultrasound image into functional tissues is of great importance to clinical diagnosis of breast cancer. However, many studies are found to segment only the mass of interest and not all major tissues. Differences and inconsistencies in ultrasound interpretation call for an automated segmentation method to make results operator-independent. Furthermore, manual segmentation of entire three-dimensional (3D) ultrasound volumes is time-consuming, resource-intensive, and clinically impractical. Here, we propose an automated algorithm to segment 3D ultrasound volumes into three major tissue types: cyst/mass, fatty tissue, and fibro-glandular tissue. To test its efficacy and consistency, the proposed automated method was employed on a database of 21 cases of whole breast ultrasound. Experimental results show that our proposed method not only distinguishes fat and non-fat tissues correctly, but performs well in classifying cyst/mass. Comparison of density assessment between the automated method and manual segmentation demonstrates good consistency with an accuracy of 85.7%. Quantitative comparison of corresponding tissue volumes, which uses overlap ratio, gives an average similarity of 74.54%, consistent with values seen in MRI brain segmentations. Thus, our proposed method exhibits great potential as an automated approach to segment 3D whole breast ultrasound volumes into functionally distinct tissues that may help to correct ultrasound speed of sound aberrations and assist in density based prognosis of breast cancer.
Affiliation(s)
- Peng Gu
- Department of Electronic Science and Engineering, Nanjing University, 210093, China
- Won-Mean Lee
- Department of Radiology, University of Michigan, 48109, USA
- Jie Yuan
- Department of Electronic Science and Engineering, Nanjing University, 210093, China
- Xueding Wang
- Department of Radiology, University of Michigan, 48109, USA
- Paul L Carson
- Department of Radiology, University of Michigan, 48109, USA

24
Ng KH, Lau S. Vision 20/20: Mammographic breast density and its clinical applications. Med Phys 2015; 42:7059-77. [PMID: 26632060 DOI: 10.1118/1.4935141] [Citation(s) in RCA: 36] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/14/2022] Open
Affiliation(s)
- Kwan-Hoong Ng
- Department of Biomedical Imaging and University of Malaya Research Imaging Centre, Faculty of Medicine, University of Malaya, 50603 Kuala Lumpur, Malaysia
- Susie Lau
- Department of Biomedical Imaging and University of Malaya Research Imaging Centre, Faculty of Medicine, University of Malaya, 50603 Kuala Lumpur, Malaysia

25
Jiang WW, Li C, Li AH, Zheng YP. A novel breast ultrasound system for providing coronal images: system development and feasibility study. ULTRASONICS 2015; 56:427-434. [PMID: 25287975 DOI: 10.1016/j.ultras.2014.09.009] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/09/2014] [Revised: 09/10/2014] [Accepted: 09/16/2014] [Indexed: 06/03/2023]
Abstract
Breast ultrasound images along the coronal plane contain important diagnostic information, yet conventional clinical 2D ultrasound cannot provide such images. To solve this problem, we developed a novel ultrasound system aimed at providing breast coronal images. In this system, a spatial sensor fixed on an ultrasound probe captures the spatial data of each image, and a narrow-band rendering method forms coronal images from the B-mode images and their corresponding spatial data. Software was developed for data acquisition, processing, rendering, and visualization. In phantom experiments, 20 inclusions of different sizes (5-20 mm) were measured with the new system. The results obtained by the new method correlated well with those measured by a micrometer (y=1.0147x, R(2)=0.9927). The phantom tests also showed that the system has excellent intra- and inter-operator repeatability (ICC>0.995). Three subjects with breast lesions were scanned in vivo using the new system and a commercially available three-dimensional (3D) probe; the average scanning times were 64 s and 74 s, respectively, showing that the new method requires a shorter scanning time. The tumor sizes measured on the coronal plane provided by the new method were smaller by 5.6-11.9% compared with the results of the 3D probe. The phantom tests and preliminary subject tests indicate the feasibility of this system for clinical applications by providing additional information for clinical breast ultrasound diagnosis.
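The agreement analysis quoted in this abstract (y = 1.0147x, R(2) = 0.9927) is an ordinary least-squares fit constrained through the origin. A generic sketch follows; note that R² conventions for through-origin fits vary, and we assume the centered total sum of squares here:

```python
def fit_through_origin(xs, ys):
    """Least-squares slope a for y = a*x, plus a coefficient of determination."""
    a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    mean_y = sum(ys) / len(ys)
    ss_res = sum((y - a * x) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return a, 1.0 - ss_res / ss_tot
```

A slope near 1 with R² near 1, as reported above, indicates the system's measurements track the micrometer readings almost exactly.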
Affiliation(s)
- Wei-wei Jiang
- Interdisciplinary Division of Biomedical Engineering, The Hong Kong Polytechnic University, Kowloon, Hong Kong, China
- Cheng Li
- Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou, China; Department of Ultrasound, Hospital of Traditional Chinese Medicine of Zhongshan, Zhongshan, China
- An-hua Li
- Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou, China
- Yong-Ping Zheng
- Interdisciplinary Division of Biomedical Engineering, The Hong Kong Polytechnic University, Kowloon, Hong Kong, China

26
Zhou Z, Wu W, Wu S, Tsui PH, Lin CC, Zhang L, Wang T. Semi-automatic breast ultrasound image segmentation based on mean shift and graph cuts. ULTRASONIC IMAGING 2014; 36:256-276. [PMID: 24759696 DOI: 10.1177/0161734614524735] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
Computerized tumor segmentation on breast ultrasound (BUS) images remains a challenging task. In this paper, we proposed a new method for semi-automatic tumor segmentation on BUS images using Gaussian filtering, histogram equalization, mean shift, and graph cuts. The only interaction required was to select two diagonal points to determine a region of interest (ROI) on an input image. The ROI image was shrunken by a factor of 2 using bicubic interpolation to reduce computation time. The shrunken image was smoothed by a Gaussian filter and then contrast-enhanced by histogram equalization. Next, the enhanced image was filtered by pyramid mean shift to improve homogeneity. The object and background seeds for graph cuts were automatically generated on the filtered image. Using these seeds, the filtered image was then segmented by graph cuts into a binary image containing the object and background. Finally, the binary image was expanded by a factor of 2 using bicubic interpolation, and the expanded image was processed by morphological opening and closing to refine the tumor contour. The method was implemented with OpenCV 2.4.3 and Visual Studio 2010 and tested for 38 BUS images with benign tumors and 31 BUS images with malignant tumors from different ultrasound scanners. Experimental results showed that our method had a true positive rate (TP) of 91.7%, a false positive (FP) rate of 11.9%, and a similarity (SI) rate of 85.6%. The mean run time on Intel Core 2.66 GHz CPU and 4 GB RAM was 0.49 ± 0.36 s. The experimental results indicate that the proposed method may be useful in BUS image segmentation.
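The contrast-enhancement step of the pipeline above (histogram equalization) can be sketched for a flat list of 8-bit intensities. This is the textbook CDF-remapping version, not the OpenCV call the authors would have used:

```python
def equalize_histogram(pixels, levels=256):
    """Classic histogram equalization: map intensities through the normalized CDF."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, run = [], 0
    for h in hist:
        run += h
        cdf.append(run)
    cdf_min = next(c for c in cdf if c > 0)  # first non-empty bin
    n = len(pixels)
    lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) if n > cdf_min else 0
           for c in cdf]
    return [lut[p] for p in pixels]
```

Equalization stretches the occupied intensity range to the full 0-255 scale, which is why it helps the subsequent mean shift and graph-cut steps separate tumor from background.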
Affiliation(s)
- Zhuhuang Zhou
- College of Life Science and Bioengineering, Beijing University of Technology, Beijing, China
- Weiwei Wu
- College of Electronic Information and Control Engineering, Beijing University of Technology, Beijing, China
- Shuicai Wu
- College of Life Science and Bioengineering, Beijing University of Technology, Beijing, China
- Po-Hsiang Tsui
- Department of Medical Imaging and Radiological Sciences, College of Medicine, Chang Gung University, Taoyuan, Taiwan
- Chung-Chih Lin
- Department of Computer Science and Information Engineering, Chang Gung University, Taoyuan, Taiwan
- Ling Zhang
- Department of Biomedical Engineering, Shenzhen University, Shenzhen, Guangdong, China
- Tianfu Wang
- Department of Biomedical Engineering, Shenzhen University, Shenzhen, Guangdong, China

27
Arleo EK, Saleh M, Ionescu D, Drotman M, Min RJ, Hentel K. Recall rate of screening ultrasound with automated breast volumetric scanning (ABVS) in women with dense breasts: a first quarter experience. Clin Imaging 2014; 38:439-444. [DOI: 10.1016/j.clinimag.2014.03.012] [Citation(s) in RCA: 35] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/06/2014] [Revised: 03/03/2014] [Accepted: 03/24/2014] [Indexed: 10/25/2022]