1
Wang H, Wu H, Wang Z, Yue P, Ni D, Heng PA, Wang Y. A Narrative Review of Image Processing Techniques Related to Prostate Ultrasound. Ultrasound in Medicine & Biology 2025; 51:189-209. [PMID: 39551652] [DOI: 10.1016/j.ultrasmedbio.2024.10.005]
Abstract
Prostate cancer (PCa) poses a significant threat to men's health, with early diagnosis being crucial for improving prognosis and reducing mortality rates. Transrectal ultrasound (TRUS) plays a vital role in the diagnosis and image-guided intervention of PCa. To facilitate physicians with more accurate and efficient computer-assisted diagnosis and interventions, many image processing algorithms for TRUS have been proposed and have achieved state-of-the-art performance in several tasks, including prostate gland segmentation, prostate image registration, PCa classification and detection, and interventional needle detection. The rapid development of these algorithms over the past two decades necessitates a comprehensive summary. Consequently, this survey provides a narrative review of the field, outlining the evolution of image processing methods in the context of TRUS image analysis and highlighting their relevant contributions. Furthermore, this survey discusses current challenges and suggests future research directions to further advance the field.
Affiliation(s)
- Haiqiao Wang
- Medical UltraSound Image Computing (MUSIC) Lab, Smart Medical Imaging, Learning and Engineering (SMILE) Lab, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, China; Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
- Hong Wu
- Medical UltraSound Image Computing (MUSIC) Lab, Smart Medical Imaging, Learning and Engineering (SMILE) Lab, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, China
- Zhuoyuan Wang
- Medical UltraSound Image Computing (MUSIC) Lab, Smart Medical Imaging, Learning and Engineering (SMILE) Lab, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, China
- Peiyan Yue
- Medical UltraSound Image Computing (MUSIC) Lab, Smart Medical Imaging, Learning and Engineering (SMILE) Lab, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, China
- Dong Ni
- Medical UltraSound Image Computing (MUSIC) Lab, Smart Medical Imaging, Learning and Engineering (SMILE) Lab, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, China
- Pheng-Ann Heng
- Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China
- Yi Wang
- Medical UltraSound Image Computing (MUSIC) Lab, Smart Medical Imaging, Learning and Engineering (SMILE) Lab, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, China
2
Bennett R, Barrett T, Gnanapragasam VJ, Tse Z. Surface-Based Ultrasound Scans for the Screening of Prostate Cancer. IEEE Open Journal of Engineering in Medicine and Biology 2024; 6:212-218. [PMID: 39698116] [PMCID: PMC11655114] [DOI: 10.1109/ojemb.2024.3503494]
Abstract
Surface-based ultrasound (SUS) systems have improved substantially over the years in image quality, ease of use, and size. Their ability to image organs non-invasively makes them a prime technology for the diagnosis and monitoring of various diseases and conditions. One example is the screening and risk stratification of prostate cancer (PCa) using prostate-specific antigen density (PSAD). Current literature predominantly focuses on prostate volume (PV) estimation techniques that use magnetic resonance imaging (MRI) or transrectal ultrasound (TRUS), while SUS techniques are largely overlooked. If a reliable SUS-based PCa screening method can be introduced, patients may be able to forgo unnecessary MRI or TRUS scans. Such a screening procedure could be introduced into standard primary care settings with point-of-care ultrasound systems available at a fraction of the cost of their larger hospital counterparts. This review analyses whether the literature suggests it is possible to use SUS-derived PV in the calculation of PSAD.
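For orientation, the PSAD calculation that this screening idea relies on is simple arithmetic; a minimal sketch, assuming the prostate volume comes from three SUS-measured diameters via the common prolate-ellipsoid approximation (all numbers below are hypothetical, not data from the paper):

```python
def ellipsoid_volume_cc(length_cm, width_cm, height_cm):
    """Prolate-ellipsoid approximation of prostate volume: V = (pi / 6) * L * W * H."""
    return 3.141592653589793 / 6.0 * length_cm * width_cm * height_cm

def psa_density(psa_ng_per_ml, prostate_volume_cc):
    """PSA density (PSAD) = serum PSA divided by prostate volume, in ng/mL per cc."""
    return psa_ng_per_ml / prostate_volume_cc

# Hypothetical SUS-derived diameters (cm) and serum PSA value (ng/mL).
pv = ellipsoid_volume_cc(4.1, 4.8, 3.5)        # roughly 36 cc
print(f"PV = {pv:.1f} cc, PSAD = {psa_density(5.2, pv):.3f} ng/mL/cc")
```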
Affiliation(s)
- Rory Bennett
- School of Engineering and Materials Science, Queen Mary University of London, London E1 4NS, U.K.
- Tristan Barrett
- Department of Radiology, Addenbrooke's Hospital, University of Cambridge School of Clinical Medicine, Cambridge CB2 0QQ, U.K.
- Vincent J. Gnanapragasam
- Department of Radiology, Addenbrooke's Hospital, University of Cambridge School of Clinical Medicine, Cambridge CB2 0QQ, U.K.
- Zion Tse
- School of Engineering and Materials Science, Queen Mary University of London, London E1 4NS, U.K.
3
Jiang J, Guo Y, Bi Z, Huang Z, Yu G, Wang J. Segmentation of prostate ultrasound images: the state of the art and the future directions of segmentation algorithms. Artif Intell Rev 2022. [DOI: 10.1007/s10462-022-10179-4]
4
Estimation of the Prostate Volume from Abdominal Ultrasound Images by Image-Patch Voting. Applied Sciences (Basel) 2022. [DOI: 10.3390/app12031390]
Abstract
Estimation of the prostate volume with ultrasound offers many advantages such as portability, low cost, harmlessness, and suitability for real-time operation. Abdominal ultrasound (AUS) is a practical procedure that deserves more attention in automated prostate-volume-estimation studies. As experts usually consider automatic end-to-end volume-estimation procedures to be non-transparent and uninterpretable systems, we proposed an expert-in-the-loop automatic system that follows the classical prostate-volume-estimation procedure. Our system directly estimates the diameter parameters of the standard ellipsoid formula to produce the prostate volume. To obtain the diameters, our system detects four diameter endpoints from the transverse and two diameter endpoints from the sagittal AUS images, as defined by the classical procedure. These endpoints are estimated using a new image-patch voting method to address characteristic problems of AUS images. We formed a novel prostate AUS data set from 305 patients with both transverse and sagittal planes. The data set includes MRI images for 75 of these patients. At least one expert manually marked all the data. Extensive experiments performed on this data set showed that the proposed system's estimates fall within the range of the experts' volume estimations, and that the system can be used in clinical practice.
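A toy sketch of the final volume computation the abstract describes, assuming the six detected endpoints arrive as (x, y) coordinates in centimetres in their respective planes; the image-patch voting detector itself is not shown, and all coordinates below are hypothetical:

```python
import math

def _diameter(p, q):
    """Euclidean distance between two endpoint annotations, in cm."""
    return math.dist(p, q)

def volume_from_endpoints(transverse_width, transverse_height, sagittal_length):
    """Ellipsoid volume from the three diameter endpoint pairs.

    Each argument is a pair of (x, y) points in cm: width and height endpoints
    come from the transverse plane, the length endpoints from the sagittal plane.
    """
    width = _diameter(*transverse_width)
    height = _diameter(*transverse_height)
    length = _diameter(*sagittal_length)
    return math.pi / 6.0 * width * height * length

# Hypothetical endpoint coordinates such as a patch-voting detector might return.
v = volume_from_endpoints(((0.0, 0.0), (4.8, 0.1)),
                          ((2.3, -1.7), (2.4, 1.8)),
                          ((0.2, 0.0), (4.3, 0.4)))
print(f"Estimated prostate volume: {v:.1f} cc")
```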
5
Karimi D, Zeng Q, Mathur P, Avinash A, Mahdavi S, Spadinger I, Abolmaesumi P, Salcudean SE. Accurate and robust deep learning-based segmentation of the prostate clinical target volume in ultrasound images. Med Image Anal 2019; 57:186-196. [PMID: 31325722] [DOI: 10.1016/j.media.2019.07.005]
Abstract
The goal of this work was to develop a method for accurate and robust automatic segmentation of the prostate clinical target volume in transrectal ultrasound (TRUS) images for brachytherapy. These images can be difficult to segment because of weak or insufficient landmarks or strong artifacts. We devise a method, based on convolutional neural networks (CNNs), that produces accurate segmentations on easy and difficult images alike. We propose two strategies to achieve improved segmentation accuracy on difficult images. First, for CNN training we adopt an adaptive sampling strategy, whereby the training process is encouraged to pay more attention to images that are difficult to segment. Second, we train a CNN ensemble and use the disagreement among this ensemble to identify uncertain segmentations and to estimate a segmentation uncertainty map. We improve uncertain segmentations by utilizing the prior shape information in the form of a statistical shape model. Our method achieves Hausdorff distance of 2.7 ± 2.3 mm and Dice score of 93.9 ± 3.5%. Comparisons with several competing methods show that our method achieves significantly better results and reduces the likelihood of committing large segmentation errors. Furthermore, our experiments show that our approach to estimating segmentation uncertainty is better than or on par with recent methods for estimation of prediction uncertainty in deep learning models. Our study demonstrates that estimation of model uncertainty and use of prior shape information can significantly improve the performance of CNN-based medical image segmentation methods, especially on difficult images.
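A minimal sketch of the ensemble-disagreement idea, assuming disagreement is summarised as the per-pixel standard deviation of the members' foreground probabilities; the paper's statistical-shape-model correction step is not shown:

```python
import numpy as np

def ensemble_outputs(prob_maps):
    """Combine foreground-probability maps from a CNN ensemble.

    prob_maps: (n_models, H, W) array. Returns the mean prediction and the
    per-pixel standard deviation across models, one simple disagreement measure
    that can be thresholded to flag uncertain segmentations.
    """
    prob_maps = np.asarray(prob_maps, dtype=np.float64)
    return prob_maps.mean(axis=0), prob_maps.std(axis=0)

def dice_score(pred_mask, gt_mask, eps=1e-7):
    """Dice overlap between two binary masks."""
    pred_mask, gt_mask = np.asarray(pred_mask, bool), np.asarray(gt_mask, bool)
    inter = np.logical_and(pred_mask, gt_mask).sum()
    return 2.0 * inter / (pred_mask.sum() + gt_mask.sum() + eps)
```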
Affiliation(s)
- Davood Karimi
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Qi Zeng
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Prateek Mathur
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Apeksha Avinash
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Purang Abolmaesumi
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Septimiu E Salcudean
- Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
6
Lei Y, Tian S, He X, Wang T, Wang B, Patel P, Jani AB, Mao H, Curran WJ, Liu T, Yang X. Ultrasound prostate segmentation based on multidirectional deeply supervised V-Net. Med Phys 2019; 46:3194-3206. [PMID: 31074513] [PMCID: PMC6625925] [DOI: 10.1002/mp.13577]
Abstract
PURPOSE Transrectal ultrasound (TRUS) is a versatile and real-time imaging modality that is commonly used in image-guided prostate cancer interventions (e.g., biopsy and brachytherapy). Accurate segmentation of the prostate is key to biopsy needle placement, brachytherapy treatment planning, and motion management. Manual segmentation during these interventions is time-consuming and subject to inter- and intraobserver variation. To address these drawbacks, we aimed to develop a deep learning-based method which integrates deep supervision into a three-dimensional (3D) patch-based V-Net for prostate segmentation. METHODS AND MATERIALS We developed a multidirectional deep-learning-based method to automatically segment the prostate for ultrasound-guided radiation therapy. A 3D supervision mechanism is integrated into the V-Net stages to deal with the optimization difficulties of training a deep network with limited training data. We combine a binary cross-entropy (BCE) loss and a batch-based Dice loss into a stage-wise hybrid loss function for deep supervision training. During the segmentation stage, patches are extracted from the newly acquired ultrasound image as the input of the well-trained network, and the network adaptively labels the prostate tissue. The final segmented prostate volume is reconstructed using patch fusion and further refined through a contour refinement process. RESULTS Forty-four patients' TRUS images were used to test our segmentation method. Our segmentation results were compared with the manually segmented contours (ground truth). The mean prostate volume Dice similarity coefficient (DSC), Hausdorff distance (HD), mean surface distance (MSD), and residual mean surface distance (RMSD) were 0.92 ± 0.03, 3.94 ± 1.55 mm, 0.60 ± 0.23 mm, and 0.90 ± 0.38 mm, respectively. CONCLUSION We developed a novel deeply supervised deep learning-based approach with reliable contour refinement to automatically segment the prostate in TRUS, demonstrated its clinical feasibility, and validated its accuracy against manual segmentation. The proposed technique could be a useful tool for diagnostic and therapeutic applications in prostate cancer.
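A minimal PyTorch sketch of the kind of stage-wise hybrid BCE + Dice objective the abstract describes; the equal stage weights, the binary sigmoid formulation, and the assumption that each stage output is already upsampled to the target resolution are illustrative choices, not details taken from the paper:

```python
import torch
import torch.nn.functional as F

def soft_dice_loss(logits, target, eps=1e-6):
    """Batch-based soft Dice loss for binary segmentation."""
    probs = torch.sigmoid(logits)
    dims = tuple(range(1, target.dim()))            # sum over all but the batch axis
    inter = (probs * target).sum(dim=dims)
    denom = probs.sum(dim=dims) + target.sum(dim=dims)
    return (1.0 - (2.0 * inter + eps) / (denom + eps)).mean()

def deeply_supervised_loss(stage_logits, target, stage_weights=None):
    """Stage-wise hybrid BCE + Dice loss.

    stage_logits: list of (B, 1, D, H, W) tensors, one per supervised stage,
    each assumed to be already upsampled to the resolution of `target`.
    """
    if stage_weights is None:
        stage_weights = [1.0] * len(stage_logits)   # equal weights are an assumption
    total = torch.zeros((), device=target.device)
    for w, logits in zip(stage_weights, stage_logits):
        bce = F.binary_cross_entropy_with_logits(logits, target)
        total = total + w * (bce + soft_dice_loss(logits, target))
    return total
```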
Affiliation(s)
- Yang Lei
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Sibo Tian
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Xiuxiu He
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Tonghe Wang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Bo Wang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Pretesh Patel
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Ashesh B. Jani
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Hui Mao
- Department of Radiology and Imaging Sciences and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Walter J. Curran
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Tian Liu
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
- Xiaofeng Yang
- Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
7
A deep learning approach for real time prostate segmentation in freehand ultrasound guided biopsy. Med Image Anal 2018; 48:107-116. [PMID: 29886268] [DOI: 10.1016/j.media.2018.05.010]
Abstract
Targeted prostate biopsy, incorporating multi-parametric magnetic resonance imaging (mp-MRI) and its registration with ultrasound, is currently the state-of-the-art in prostate cancer diagnosis. The registration process in most targeted biopsy systems today relies heavily on accurate segmentation of ultrasound images. Automatic or semi-automatic segmentation is typically performed offline prior to the start of the biopsy procedure. In this paper, we present a deep neural network based real-time prostate segmentation technique during the biopsy procedure, hence paving the way for dynamic registration of mp-MRI and ultrasound data. In addition to using convolutional networks for extracting spatial features, the proposed approach employs recurrent networks to exploit the temporal information among a series of ultrasound images. One of the key contributions in the architecture is to use residual convolution in the recurrent networks to improve optimization. We also exploit recurrent connections within and across different layers of the deep networks to maximize the utilization of the temporal information. Furthermore, we perform dense and sparse sampling of the input ultrasound sequence to make the network robust to ultrasound artifacts. Our architecture is trained on 2,238 labeled transrectal ultrasound images, with an additional 637 and 1,017 unseen images used for validation and testing, respectively. We obtain a mean Dice similarity coefficient of 93%, a mean surface distance error of 1.10 mm and a mean Hausdorff distance error of 3.0 mm. A comparison of the reported results with those of a state-of-the-art technique indicates statistically significant improvement achieved by the proposed approach.
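For reference, one common way to compute the surface-based metrics reported above (mean surface distance and Hausdorff distance) from binary masks; the symmetric definitions and pixel-spacing handling below are standard conventions and may differ from the exact definitions used in the paper:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def _surface(mask):
    """Boundary pixels of a binary mask."""
    mask = np.asarray(mask, dtype=bool)
    return mask & ~binary_erosion(mask)

def _one_way_distances(src, dst, spacing):
    """Distances from every surface pixel of `src` to the surface of `dst`."""
    dist_to_dst = distance_transform_edt(~_surface(dst), sampling=spacing)
    return dist_to_dst[_surface(src)]

def mean_surface_distance(pred, target, spacing=(1.0, 1.0)):
    d_ab = _one_way_distances(pred, target, spacing)
    d_ba = _one_way_distances(target, pred, spacing)
    return (d_ab.mean() + d_ba.mean()) / 2.0

def hausdorff_distance(pred, target, spacing=(1.0, 1.0)):
    d_ab = _one_way_distances(pred, target, spacing)
    d_ba = _one_way_distances(target, pred, spacing)
    return max(d_ab.max(), d_ba.max())
```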
8
Derraz F, Forzy G, Delebarre A, Taleb-Ahmed A, Oussalah M, Peyrodie L, Verclytte S. Prostate contours delineation using interactive directional active contours model and parametric shape prior model. International Journal for Numerical Methods in Biomedical Engineering 2015; 31. [PMID: 26009857] [DOI: 10.1002/cnm.2726]
Abstract
Prostate contour delineation on magnetic resonance (MR) images is a challenging and important task in medical imaging, with applications in guiding biopsy, surgery and therapy. While a fully automated method is highly desired for this application, it can be a very difficult task due to the structure and surrounding tissues of the prostate gland. Traditional active contour-based delineation algorithms are typically quite successful for piecewise-constant images. Nevertheless, when MR images have diffuse edges or multiple similar objects (e.g. bladder close to prostate) within close proximity, such approaches have proven to be unsuccessful. In order to mitigate these problems, we proposed a new bi-stage contour delineation framework based on directional active contours (DAC) incorporating prior knowledge of the prostate shape. We first explicitly addressed the prostate contour delineation problem with a fast global DAC that incorporates both statistical and parametric shape prior models. In doing so, we were able to exploit the global aspects of the contour delineation problem by incorporating user feedback in the delineation process, where it is shown that only a small amount of user input can sometimes resolve ambiguous scenarios raised by the DAC. In addition, once the prostate contours have been delineated, a cost functional is designed to incorporate both user feedback and the parametric shape prior model. Using data from publicly available prostate MR datasets, which include several challenging clinical datasets, we highlighted the effectiveness and the capability of the proposed algorithm. The algorithm was also compared with several state-of-the-art methods.
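The paper's interactive directional active contour with a parametric shape prior is considerably more involved; the sketch below only shows the generic, prior-free geodesic active contour building block it relates to, using scikit-image and a hypothetical user seed point:

```python
import numpy as np
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def active_contour_from_seed(image, seed_rc, half_size=40, iterations=200):
    """Shrink a box-shaped level set from a user seed onto nearby edges.

    `image` is a 2-D float array and `seed_rc` a (row, col) point placed inside
    the gland; the contour stops where the inverse-gradient map is low (edges).
    """
    edge_map = inverse_gaussian_gradient(image)
    init = np.zeros(image.shape, dtype=np.int8)      # initial level set around the seed
    r, c = seed_rc
    init[max(r - half_size, 0):r + half_size, max(c - half_size, 0):c + half_size] = 1
    return morphological_geodesic_active_contour(edge_map, iterations,
                                                 init_level_set=init,
                                                 smoothing=2, balloon=-1)
```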
Affiliation(s)
- Foued Derraz
- Telecommunications Laboratory, Technology Faculty, Abou Bekr Belkaïd University, Tlemcen 13000, Algeria
- Université Nord de France, F-59000 Lille, France
- Unité de Traitement de Signaux Biomédicaux, Faculté de médecine et maïeutique, Lille, France
- LAMIH UMR CNRS 8201, Le Mont Houy, Université de Valenciennes et Cambresis, 59313 Valenciennes, France
- Gérard Forzy
- Unité de Traitement de Signaux Biomédicaux, Faculté de médecine et maïeutique, Lille, France
- Groupement des Hôpitaux de l'Institut Catholique de Lille, France
- Arnaud Delebarre
- Groupement des Hôpitaux de l'Institut Catholique de Lille, France
- Abdelmalik Taleb-Ahmed
- Université Nord de France, F-59000 Lille, France
- LAMIH UMR CNRS 8201, Le Mont Houy, Université de Valenciennes et Cambresis, 59313 Valenciennes, France
- Mourad Oussalah
- School of Electronics, Electrical and Computer Engineering, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK
- Laurent Peyrodie
- Université Nord de France, F-59000 Lille, France
- Hautes Etudes d'Ingénieur, 13 rue de Toul, 59000 Lille, France
9
Fontanarosa D, van der Meer S, Bamber J, Harris E, O'Shea T, Verhaegen F. Review of ultrasound image guidance in external beam radiotherapy: I. Treatment planning and inter-fraction motion management. Phys Med Biol 2015; 60:R77-114. [PMID: 25592664] [DOI: 10.1088/0031-9155/60/3/r77]
Abstract
In modern radiotherapy, verification of the treatment to ensure the target receives the prescribed dose and normal tissues are optimally spared has become essential. Several forms of image guidance are available for this purpose. The most commonly used forms of image guidance are based on kilovolt or megavolt x-ray imaging. Image guidance can also be performed with non-harmful ultrasound (US) waves. This increasingly used technique has the potential to offer both anatomical and functional information. This review presents an overview of the historical and current use of two-dimensional and three-dimensional US imaging for treatment verification in radiotherapy. The US technology and the implementation in the radiotherapy workflow are described. The use of US guidance in the treatment planning process is discussed. The role of US technology in inter-fraction motion monitoring and management is explained, and clinical studies of applications in areas such as the pelvis, abdomen and breast are reviewed. A companion review paper (O'Shea et al 2015 Phys. Med. Biol. submitted) will extensively discuss the use of US imaging for intra-fraction motion quantification and novel applications of US technology to RT.
Affiliation(s)
- Davide Fontanarosa
- Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center (MUMC), Maastricht 6201 BN, the Netherlands; Oncology Solutions Department, Philips Research, High Tech Campus 34, Eindhoven 5656 AE, the Netherlands
10
Chilali O, Ouzzane A, Diaf M, Betrouni N. A survey of prostate modeling for image analysis. Comput Biol Med 2014; 53:190-202. [PMID: 25156801] [DOI: 10.1016/j.compbiomed.2014.07.019]
Affiliation(s)
- O Chilali
- Inserm U703, 152, rue du Docteur Yersin, Lille University Hospital, 59120 Loos, France; Automatic Department, Mouloud Mammeri University, Tizi-Ouzou, Algeria
- A Ouzzane
- Inserm U703, 152, rue du Docteur Yersin, Lille University Hospital, 59120 Loos, France; Urology Department, Claude Huriez Hospital, Lille University Hospital, France
- M Diaf
- Automatic Department, Mouloud Mammeri University, Tizi-Ouzou, Algeria
- N Betrouni
- Inserm U703, 152, rue du Docteur Yersin, Lille University Hospital, 59120 Loos, France
11
TRUS image segmentation with non-parametric kernel density estimation shape prior. Biomed Signal Process Control 2013. [DOI: 10.1016/j.bspc.2013.07.002]
12
A supervised learning framework of statistical shape and probability priors for automatic prostate segmentation in ultrasound images. Med Image Anal 2013; 17:587-600. [DOI: 10.1016/j.media.2013.04.001]
13
Ghose S, Oliver A, Martí R, Lladó X, Vilanova JC, Freixenet J, Mitra J, Sidibé D, Meriaudeau F. A survey of prostate segmentation methodologies in ultrasound, magnetic resonance and computed tomography images. Computer Methods and Programs in Biomedicine 2012; 108:262-287. [PMID: 22739209] [DOI: 10.1016/j.cmpb.2012.04.006]
Abstract
Prostate segmentation is a challenging task, and the challenges significantly differ from one imaging modality to another. Low contrast, speckle, micro-calcifications and imaging artifacts like shadow pose serious challenges to accurate prostate segmentation in transrectal ultrasound (TRUS) images. In magnetic resonance (MR) images, however, superior soft-tissue contrast highlights the large variability in shape, size and texture information inside the prostate. In contrast, the poor soft-tissue contrast between the prostate and surrounding tissues in computed tomography (CT) images poses a challenge to accurate prostate segmentation. This article reviews the methods developed for prostate gland segmentation in TRUS, MR and CT images, the three primary imaging modalities that aid prostate cancer diagnosis and treatment. The objective of this work is to study the key similarities and differences among the different methods, highlighting their strengths and weaknesses in order to assist in the choice of an appropriate segmentation methodology. We define a new taxonomy for prostate segmentation strategies that allows us first to group the algorithms and then to point out the main advantages and drawbacks of each strategy. We provide a comprehensive description of the existing methods in all TRUS, MR and CT modalities, highlighting their key points and features. Finally, a discussion on choosing the most appropriate segmentation strategy for a given imaging modality is provided. A quantitative comparison of the results as reported in the literature is also presented.
Affiliation(s)
- Soumya Ghose
- Computer Vision and Robotics Group, University of Girona, Campus Montilivi, Edifici P-IV, 17071 Girona, Spain
14
Akbari H, Fei B. 3D ultrasound image segmentation using wavelet support vector machines. Med Phys 2012; 39:2972-84. [PMID: 22755682] [PMCID: PMC3360689] [DOI: 10.1118/1.4709607]
Abstract
PURPOSE Transrectal ultrasound (TRUS) imaging is clinically used in prostate biopsy and therapy. Segmentation of the prostate on TRUS images has many applications. In this study, a three-dimensional (3D) segmentation method for TRUS images of the prostate is presented for 3D ultrasound-guided biopsy. METHODS This segmentation method utilizes a statistical shape, texture information, and intensity profiles. A set of wavelet support vector machines (W-SVMs) is applied to the images at various subregions of the prostate. The W-SVMs are trained to adaptively capture the features of the ultrasound images in order to differentiate the prostate and nonprostate tissue. This method consists of a set of wavelet transforms for extraction of prostate texture features and a kernel-based support vector machine to classify the textures. The voxels around the surface of the prostate are labeled in sagittal, coronal, and transverse planes. The weight functions are defined for each labeled voxel on each plane and on the model at each region. In the 3D segmentation procedure, the intensity profiles around the boundary between the tentatively labeled prostate and nonprostate tissue are compared to the prostate model. Consequently, the surfaces are modified based on the model intensity profiles. The segmented prostate is updated and compared to the shape model. These two steps are repeated until they converge. Manual segmentation of the prostate serves as the gold standard and a variety of methods are used to evaluate the performance of the segmentation method. RESULTS The results from 40 TRUS image volumes of 20 patients show that the Dice overlap ratio is 90.3% ± 2.3% and that the sensitivity is 87.7% ± 4.9%. CONCLUSIONS The proposed method provides a useful tool in our 3D ultrasound image-guided prostate biopsy and can also be applied to other applications in the prostate.
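A simplified stand-in for the wavelet-plus-SVM idea described above: sub-band energy features feed a standard RBF-kernel SVM, rather than the paper's integrated wavelet support vector machine; the patch size, wavelet choice and labelling scheme are assumptions:

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def wavelet_energy_features(patch, wavelet="db2", level=2):
    """Mean absolute value of each 2-D wavelet sub-band as a simple texture descriptor."""
    coeffs = pywt.wavedec2(np.asarray(patch, dtype=float), wavelet=wavelet, level=level)
    feats = [np.mean(np.abs(coeffs[0]))]                 # approximation band
    for detail_bands in coeffs[1:]:                      # (horizontal, vertical, diagonal)
        feats.extend(np.mean(np.abs(b)) for b in detail_bands)
    return np.asarray(feats)

def train_prostate_texture_svm(patches, labels):
    """Fit an RBF-kernel SVM that separates prostate from non-prostate patches."""
    X = np.stack([wavelet_energy_features(p) for p in patches])
    return SVC(kernel="rbf", gamma="scale").fit(X, labels)
```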
Affiliation(s)
- Hamed Akbari
- Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, GA 30329, USA
15
Yang X, Fei B. 3D Prostate Segmentation of Ultrasound Images Combining Longitudinal Image Registration and Machine Learning. Proceedings of SPIE--The International Society for Optical Engineering 2012; 8316:83162O. [PMID: 24027622] [DOI: 10.1117/12.912188]
Abstract
We developed a three-dimensional (3D) segmentation method for transrectal ultrasound (TRUS) images that is based on longitudinal image registration and machine learning. Using longitudinal images of each individual patient, we register previously acquired images to the new images of the same subject. Three orthogonal Gabor filter banks are used to extract texture features from each registered image. Patient-specific Gabor features from the registered images are used to train kernel support vector machines (KSVMs) and then to segment the newly acquired prostate image. The segmentation method was tested on TRUS data from five patients. The average surface distance between our automatic segmentation and manual segmentation is 1.18 ± 0.31 mm, indicating that our automatic segmentation method based on longitudinal image registration is feasible for segmenting the prostate in TRUS images.
Affiliation(s)
- Xiaofeng Yang
- Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA
16
Fei B, Schuster DM, Master V, Akbari H, Fenster A, Nieh P. A Molecular Image-directed, 3D Ultrasound-guided Biopsy System for the Prostate. Proceedings of SPIE--The International Society for Optical Engineering 2012; 2012. [PMID: 22708023] [DOI: 10.1117/12.912182]
Abstract
Systematic transrectal ultrasound (TRUS)-guided biopsy is the standard method for a definitive diagnosis of prostate cancer. However, this biopsy approach uses two-dimensional (2D) ultrasound images to guide biopsy and can miss up to 30% of prostate cancers. We are developing a molecular image-directed, three-dimensional (3D) ultrasound image-guided biopsy system for improved detection of prostate cancer. The system consists of a 3D mechanical localization system and a software workstation for image segmentation, registration, and biopsy planning. In order to plan biopsy in a 3D prostate, we developed an automatic segmentation method based on the wavelet transform. In order to incorporate PET/CT images into ultrasound-guided biopsy, we developed image registration methods to fuse TRUS and PET/CT images. The segmentation method was tested in ten patients with a Dice overlap ratio of 92.4% ± 1.1%. The registration method has been tested in phantoms. The biopsy system was tested in prostate phantoms, and 3D ultrasound images were acquired from two human patients. We are integrating the system for PET/CT-directed, 3D ultrasound-guided, targeted biopsy in human patients.
Affiliation(s)
- Baowei Fei
- Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA 30329
17
Ghose S, Mitra J, Oliver A, Martí R, Lladó X, Freixenet J, Vilanova JC, Comet J, Sidibé D, Meriaudeau F. Spectral clustering of shape and probability prior models for automatic prostate segmentation. Annual International Conference of the IEEE Engineering in Medicine and Biology Society 2012; 2012:2335-2338. [PMID: 23366392] [DOI: 10.1109/embc.2012.6346431]
Abstract
Imaging artifacts in Transrectal Ultrasound (TRUS) images and inter-patient variations in prostate shape and size challenge computer-aided automatic or semi-automatic segmentation of the prostate. In this paper, we propose to use multiple mean parametric models derived from principal component analysis (PCA) of shape and posterior probability information to segment the prostate. In contrast to traditional statistical models of shape and intensity priors, we use posterior probability of the prostate region determined from random forest classification to build, initialize and propagate our model. Multiple mean models derived from spectral clustering of combined shape and appearance parameters ensure improvement in segmentation accuracies. The proposed method achieves mean Dice similarity coefficient (DSC) value of 0.96±0.01, with a mean segmentation time of 0.67±0.02 seconds when validated with 46 images from 23 datasets in a leave-one-patient-out validation framework.
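A rough sketch of the "multiple mean models" idea, assuming aligned landmark vectors are available: spectral clustering groups the training shapes and a separate PCA model is fitted per cluster. The paper clusters combined shape and posterior-probability parameters obtained from random forest classification, which is not reproduced here:

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.decomposition import PCA

def build_multiple_mean_models(shapes, n_clusters=3, n_modes=5):
    """Cluster aligned training shapes and fit one PCA model per cluster.

    shapes: (n_shapes, 2 * n_landmarks) array of aligned contour landmarks.
    Returns a list of (mean_shape, pca_model) pairs, one per cluster.
    """
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="nearest_neighbors",
                                n_neighbors=5,
                                random_state=0).fit_predict(shapes)
    models = []
    for k in range(n_clusters):
        members = shapes[labels == k]
        n_comp = max(1, min(n_modes, len(members) - 1))
        models.append((members.mean(axis=0), PCA(n_components=n_comp).fit(members)))
    return models
```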
Affiliation(s)
- S Ghose
- Le2i CNRS-UMR 6306, Université de Bourgogne, Le Creusot, France
18
19
Ghose S, Oliver A, Martí R, Lladó X, Freixenet J, Mitra J, Vilanova JC, Comet-Batlle J, Meriaudeau F. Statistical shape and texture model of quadrature phase information for prostate segmentation. Int J Comput Assist Radiol Surg 2011; 7:43-55. [DOI: 10.1007/s11548-011-0616-y]
20
Garnier C, Bellanger JJ, Wu K, Shu H, Costet N, Mathieu R, De Crevoisier R, Coatrieux JL. Prostate segmentation in HIFU therapy. IEEE Transactions on Medical Imaging 2011; 30:792-803. [PMID: 21118767] [PMCID: PMC3095593] [DOI: 10.1109/tmi.2010.2095465]
Abstract
Prostate segmentation in 3-D transrectal ultrasound images is an important step in the definition of the intra-operative planning of high intensity focused ultrasound (HIFU) therapy. This paper presents two main semi-automatic approaches, based on discrete dynamic contours and optimal surface detection. They operate in 3-D and require minimal user interaction. They are considered both alone and sequentially combined, with and without post-regularization, and applied on anisotropic and isotropic volumes. Their performance, using different metrics, has been evaluated on a set of 28 3-D images by comparison with two expert delineations. For the most efficient algorithm, the symmetric average surface distance was found to be 0.77 mm.
Affiliation(s)
- Carole Garnier
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, France
- Jean-Jacques Bellanger
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, France
- Ke Wu
- CRIBS, Centre de Recherche en Information Biomédicale sino-français, INSERM Laboratoire International Associé, Université de Rennes I / Southeast University, Rennes, France
- LIST, Laboratory of Image Science and Technology, Southeast University, Si Pai Lou 2, Nanjing 210096, China
- Huazhong Shu
- CRIBS, Centre de Recherche en Information Biomédicale sino-français, INSERM Laboratoire International Associé, Université de Rennes I / Southeast University, Rennes, France
- LIST, Laboratory of Image Science and Technology, Southeast University, Si Pai Lou 2, Nanjing 210096, China
- Nathalie Costet
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, France
- Romain Mathieu
- Service d'urologie, CHU Rennes, Hôpital Pontchaillou, Université de Rennes I, 2 rue Henri Le Guilloux, 35033 Rennes Cedex 9, France
- Renaud De Crevoisier
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, France
- Département de radiothérapie, CRLCC Eugène Marquis, 35000 Rennes, France
- Jean-Louis Coatrieux
- LTSI, Laboratoire Traitement du Signal et de l'Image, INSERM U642, Université de Rennes I, Campus de Beaulieu, 263 Avenue du Général Leclerc, CS 74205, 35042 Rennes Cedex, France
- CRIBS, Centre de Recherche en Information Biomédicale sino-français, INSERM Laboratoire International Associé, Université de Rennes I / Southeast University, Rennes, France
- * Correspondence should be addressed to: Jean-Louis Coatrieux
21
Multiple Mean Models of Statistical Shape and Probability Priors for Automatic Prostate Segmentation. 2011. [DOI: 10.1007/978-3-642-23944-1_4]
22
Gao Y, Sandhu R, Fichtinger G, Tannenbaum AR. A coupled global registration and segmentation framework with application to magnetic resonance prostate imagery. IEEE Transactions on Medical Imaging 2010; 29:1781-94. [PMID: 20529727] [PMCID: PMC2988404] [DOI: 10.1109/tmi.2010.2052065]
Abstract
Extracting the prostate from magnetic resonance (MR) imagery is a challenging and important task for medical image analysis and surgical planning. We present in this work a unified shape-based framework to extract the prostate from MR prostate imagery. In many cases, shape-based segmentation is a two-part problem. First, one must properly align a set of training shapes such that any variation in shape is not due to pose. Then segmentation can be performed under the constraint of the learnt shape. However, the general registration task of prostate shapes becomes increasingly difficult due to the large variations in pose and shape in the training sets, and is not readily handled through existing techniques. Thus, the contributions of this paper are twofold. We first explicitly address the registration problem by representing the shapes of a training set as point clouds. In doing so, we are able to exploit the more global aspects of registration via a certain particle filtering based scheme. In addition, once the shapes have been registered, a cost functional is designed to incorporate both the local image statistics as well as the learnt shape prior. We provide experimental results, which include several challenging clinical data sets, to highlight the algorithm's capability of robustly handling supine/prone prostate registration and the overall segmentation task.
Affiliation(s)
- Yi Gao
- Schools of Electrical and Computer Engineering and Biomedical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Romeil Sandhu
- Schools of Electrical and Computer Engineering and Biomedical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Gabor Fichtinger
- School of Computing, Queens University, Kingston, ON K7L 3N6, Canada
- Allen Robert Tannenbaum
- Schools of Electrical and Computer Engineering and Biomedical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA, and the Department of Electrical Engineering, Technion-IIT, Haifa 32000, Israel
23
24
Betrouni N, Lopes R, Makni N, Dewalle AS, Vermandel M, Rousseau J. Volume quantification by fuzzy logic modelling in freehand ultrasound imaging. Ultrasonics 2009; 49:646-652. [PMID: 19409591] [DOI: 10.1016/j.ultras.2009.03.008]
Abstract
INTRODUCTION Many algorithms exist for 3D reconstruction of data from freehand 2D ultrasound slices. These methods are based on interpolation techniques to fill the voxels from the pixels. For quantification purposes, segmentation is involved to delineate the structure of interest. However, speckle and partial volume effect errors can affect quantification. OBJECTIVE This study aimed to assess the effect of combining a fuzzy model with 3D reconstruction algorithms for freehand ultrasound images on these errors. METHODS We introduced a fuzzification step to correct the initial segmentation, by weighting the pixels with a distribution function that takes into account the local gray levels, the orientation of the local gradient, and the local contrast-to-noise ratio. We then used two of the most widespread reconstruction algorithms (pixel nearest neighbour (PNN) and voxel nearest neighbour (VNN)) to interpolate and create the volume of the structure. Finally, defuzzification was used to estimate the optimal volume. VALIDATION B-scans were acquired using 5 MHz and 8 MHz ultrasound probes on ultrasound tissue-mimicking phantoms. Quantitative evaluation of the reconstructed structures was done by comparing the method output to the real volumes. Comparison was also done with the classical PNN and VNN algorithms. RESULTS With the fuzzy model, quantification errors were less than 4.3%, whereas with the classical algorithms errors were larger (10.3% using PNN, 17.2% using VNN). Furthermore, for very small structures (0.5 cm³), errors reached 24.3% using the classical VNN algorithm, while they were about 9.6% with the fuzzy VNN model. CONCLUSION These experiments show that the fuzzy model allows volumes to be determined with better accuracy and reproducibility, especially for small structures (<3 cm³).
Affiliation(s)
- N Betrouni
- INSERM U703, Pavillon Vancostanobel, University Hospital of Lille (CHRU), Lille 59037, France
25
Fazel Zarandi MH, Norouzzadeh S, Teimourian S, Moeen M. Counting eosinophils in bronchoalveolar lavage fluid images with fuzzy methodology. Appl Soft Comput 2009. [DOI: 10.1016/j.asoc.2008.09.011]
26
Brock KK, Nichol AM, Ménard C, Moseley JL, Warde PR, Catton CN, Jaffray DA. Accuracy and sensitivity of finite element model-based deformable registration of the prostate. Med Phys 2008; 35:4019-25. [PMID: 18841853] [DOI: 10.1118/1.2965263]
Affiliation(s)
- Kristy K Brock
- Radiation Medicine Program, Princess Margaret Hospital, University Health Network, and the University of Toronto, Toronto, Ontario M5G 2M9, Canada
27
Sahba F, Tizhoosh HR, Salama MMA. Application of reinforcement learning for segmentation of transrectal ultrasound images. BMC Med Imaging 2008; 8:8. [PMID: 18430220] [PMCID: PMC2397386] [DOI: 10.1186/1471-2342-8-8]
Abstract
Background Among different medical image modalities, ultrasound imaging has very widespread clinical use. However, due to factors such as poor image contrast, noise, and missing or diffuse boundaries, ultrasound images are inherently difficult to segment. An important application is estimation of the location and volume of the prostate in transrectal ultrasound (TRUS) images. For this purpose, manual segmentation is a tedious and time-consuming procedure. Methods We introduce a new method for the segmentation of the prostate in transrectal ultrasound images using a reinforcement learning scheme. This algorithm is used to find appropriate local values for sub-images and to extract the prostate. It contains an offline stage, in which the reinforcement learning agent learns from a set of images and their manually segmented versions. The reinforcement agent is provided with reward/punishment, determined objectively, to explore/exploit the solution space. After this stage, the agent has acquired knowledge stored in the Q-matrix. The agent can then use this knowledge on new input images to extract a coarse version of the prostate. Results We have carried out experiments to segment TRUS images. The results demonstrate the potential of this approach in the field of medical image segmentation. Conclusion By using the proposed method, we can find the appropriate local values and segment the prostate. This approach can be used for segmentation tasks containing one object of interest. To improve this prototype, more investigations are needed.
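The offline stage revolves around a Q-matrix updated by reward/punishment; a tiny tabular Q-learning sketch, in which the state/action encoding (sub-image index, candidate local threshold) and all numbers are illustrative assumptions rather than details from the paper:

```python
import numpy as np

def epsilon_greedy(Q, state, epsilon, rng):
    """Explore a random action or exploit the best-known one for this state."""
    if rng.random() < epsilon:
        return int(rng.integers(Q.shape[1]))
    return int(np.argmax(Q[state]))

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update of the Q-matrix."""
    td_target = reward + gamma * np.max(Q[next_state])
    Q[state, action] += alpha * (td_target - Q[state, action])

# Hypothetical setup: each state is a sub-image index and each action a candidate
# local threshold; the reward would come from agreement with the manual segmentations.
rng = np.random.default_rng(0)
Q = np.zeros((16, 8))                       # 16 sub-images x 8 candidate thresholds
a = epsilon_greedy(Q, state=3, epsilon=0.2, rng=rng)
q_update(Q, state=3, action=a, reward=1.0, next_state=4)
```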
Affiliation(s)
- Farhang Sahba
- Medical Instrument Analysis and Machine Intelligence Group, University of Waterloo, Waterloo, Canada
28
Hussein R, McKenzie FD. Identifying ambiguous prostate gland contours from histology using capsule shape information and least squares curve fitting. Int J Comput Assist Radiol Surg 2007. [DOI: 10.1007/s11548-007-0134-0]
29
30
Betrouni N, Puech P, Dewalle AS, Lopes R, Dubois P, Vermandel M. 3D automatic segmentation and reconstruction of prostate on MR images. Annual International Conference of the IEEE Engineering in Medicine and Biology Society 2007; 2007:5259-62. [DOI: 10.1109/iembs.2007.4353528]
31
Pasquier D, Lacornerie T, Vermandel M, Rousseau J, Lartigau E, Betrouni N. Automatic Segmentation of Pelvic Structures From Magnetic Resonance Images for Prostate Cancer Radiotherapy. Int J Radiat Oncol Biol Phys 2007; 68:592-600. [PMID: 17498571] [DOI: 10.1016/j.ijrobp.2007.02.005]
Abstract
PURPOSE Target-volume and organ-at-risk delineation is a time-consuming task in radiotherapy planning. The development of automated segmentation tools remains problematic, because of pelvic organ shape variability. We evaluate a three-dimensional (3D), deformable-model approach and a seeded region-growing algorithm for automatic delineation of the prostate and organs-at-risk on magnetic resonance images. METHODS AND MATERIALS Manual and automatic delineation were compared in 24 patients using a sagittal T2-weighted (T2-w) turbo spin echo (TSE) sequence and an axial T1-weighted (T1-w) 3D fast-field echo (FFE) or TSE sequence. For automatic prostate delineation, an organ model-based method was used. Prostates without seminal vesicles were delineated as the clinical target volume (CTV). For automatic bladder and rectum delineation, a seeded region-growing method was used. Manual contouring was considered the reference method. The following parameters were measured: volume ratio (Vr) (automatic/manual), volume overlap (Vo) (ratio of the volume of intersection to the volume of union; optimal value = 1), and correctly delineated volume (Vc) (percent ratio of the volume of intersection to the manually defined volume; optimal value = 100). RESULTS For the CTV, the Vr, Vo, and Vc were 1.13 (±0.1 SD), 0.78 (±0.05 SD), and 94.75 (±3.3 SD), respectively. For the rectum, the Vr, Vo, and Vc were 0.97 (±0.1 SD), 0.78 (±0.06 SD), and 86.52 (±5 SD), respectively. For the bladder, the Vr, Vo, and Vc were 0.95 (±0.03 SD), 0.88 (±0.03 SD), and 91.29 (±3.1 SD), respectively. CONCLUSIONS Our results show that the organ-model method is robust, and results in reproducible prostate segmentation with minor interactive corrections. For automatic bladder and rectum delineation, magnetic resonance imaging soft-tissue contrast enables the use of region-growing methods.
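The three evaluation measures are defined directly in the abstract; a small sketch computing them from binary masks (voxel counts stand in for volumes, which assumes both masks share the same voxel grid):

```python
import numpy as np

def delineation_metrics(auto_mask, manual_mask):
    """Vr, Vo and Vc as defined above, computed from two binary masks."""
    auto_mask = np.asarray(auto_mask, dtype=bool)
    manual_mask = np.asarray(manual_mask, dtype=bool)
    inter = np.logical_and(auto_mask, manual_mask).sum()
    union = np.logical_or(auto_mask, manual_mask).sum()
    vr = auto_mask.sum() / manual_mask.sum()      # volume ratio (automatic / manual)
    vo = inter / union                            # volume overlap, optimal value 1
    vc = 100.0 * inter / manual_mask.sum()        # correctly delineated volume, optimal 100
    return vr, vo, vc
```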
Affiliation(s)
- David Pasquier
- Département Universitaire de Radiothérapie, Centre Oscar Lambret, Université Lille II, Lille, France
32
Abstract
Background Identifying the location and the volume of the prostate is important for ultrasound-guided prostate brachytherapy. Prostate volume is also important for prostate cancer diagnosis. Manual outlining of the prostate border can determine the prostate volume accurately; however, it is time consuming and tedious. Therefore, a number of investigations have been devoted to designing algorithms that are suitable for segmenting the prostate boundary in ultrasound images. The most popular method is the deformable model (snakes), which involves designing an energy function and then optimizing this function. The snakes algorithm usually requires either an initial contour or some points on the prostate boundary to be estimated close enough to the true boundary, which is considered a drawback of this otherwise powerful method. Methods The proposed spectral clustering segmentation algorithm is built on a totally different foundation that does not involve any function design or optimization. It also does not need any contour or any points on the boundary to be estimated. The proposed algorithm depends mainly on graph theory techniques. Results Spectral clustering is used in this paper for both prostate gland segmentation from the background and internal gland segmentation. The segmented images were compared to the expert radiologist's segmentations. The proposed algorithm obtained excellent gland segmentation results, with a 93% average overlap area. It was also able to internally segment the gland, where the segmentation showed consistency with the cancerous regions identified by the expert radiologist. Conclusion The proposed spectral clustering segmentation algorithm quickly produces estimates that give a rough prostate volume and location as well as an internal gland segmentation without any user interaction.
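A minimal sketch of graph-based spectral clustering applied to an image, following the standard scikit-learn recipe (pixel-adjacency graph with exponential intensity affinities); the paper's own graph construction and its internal-gland clustering step may differ:

```python
import numpy as np
from sklearn.cluster import spectral_clustering
from sklearn.feature_extraction.image import img_to_graph

def spectral_segment(image, n_regions=2, beta=10.0, eps=1e-6):
    """Partition an image by spectral clustering of its pixel-affinity graph.

    Pixels are graph nodes; edge weights decay with the intensity difference
    between neighbouring pixels, so the partition tends to follow strong boundaries.
    """
    graph = img_to_graph(image)
    graph.data = np.exp(-beta * graph.data / graph.data.std()) + eps
    labels = spectral_clustering(graph, n_clusters=n_regions,
                                 eigen_solver="arpack", random_state=0)
    return labels.reshape(image.shape)
```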
33
Medina R, Bravo A, Windyga P, Toro J, Yan P, Onik G. A 2-D active appearance model for prostate segmentation in ultrasound images. Conference Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society 2007; 2005:3363-6. [PMID: 17280943] [DOI: 10.1109/iembs.2005.1617198]
Abstract
In this research we use an active appearance model (AAM) as the core of a robust segmentation algorithm that combines contour and texture information to learn shape variability through a training procedure on trans-rectal ultrasound (TRUS) images of the prostate. Training was carried out using a dataset of 95 images, which were preprocessed using gray-level mathematical morphology operators. Preliminary results are promising. The segmentation can provide shapes that overlap a ground-truth shape traced by an expert by up to 96%, with an average point-to-curve distance of up to 1.3 pixels.
Affiliation(s)
- R Medina
- Universidad de Los Andes, Grupo de Ingeniería Biomédica (GIBULA), Mérida 5101, Venezuela
34
Tsantis S, Dimitropoulos N, Cavouras D, Nikiforidis G. A hybrid multi-scale model for thyroid nodule boundary detection on ultrasound images. Computer Methods and Programs in Biomedicine 2006; 84:86-98. [PMID: 17055608] [DOI: 10.1016/j.cmpb.2006.09.006]
Abstract
A hybrid model for thyroid nodule boundary detection on ultrasound images is introduced. The segmentation model combines the advantages of the "à trous" wavelet transform for detecting sharp gray-level variations with the efficiency of the Hough transform for discriminating the region of interest within an environment with excessive structural noise. The proposed method comprises three major steps. First, a wavelet edge detection procedure is applied for speckle reduction and edge map estimation, based on a local maxima representation. Subsequently, a multiscale structure model is utilised to acquire a contour representation by chaining local maxima with similar attributes into significant structures. Finally, the Hough transform is employed with a priori knowledge of the nodule's shape to distinguish the nodule's contour from adjacent structures. The comparative study between our automatic method and manual delineations demonstrated that the boundaries extracted by the hybrid model are closely correlated with those of the physicians. The proposed hybrid method can be of value for shape-based classification of thyroid nodules and as an educational tool for inexperienced radiologists.
Affiliation(s)
- S Tsantis
- Department of Medical Physics, School of Medicine, University of Patras, Rio Patras 26500, Greece
35
Hodge AC, Fenster A, Downey DB, Ladak HM. Prostate boundary segmentation from ultrasound images using 2D active shape models: optimisation and extension to 3D. Computer Methods and Programs in Biomedicine 2006; 84:99-113. [PMID: 16930764] [DOI: 10.1016/j.cmpb.2006.07.001]
Abstract
Boundary outlining, or segmentation, of the prostate is an important task in diagnosis and treatment planning for prostate cancer. This paper describes an algorithm based on two-dimensional (2D) active shape models (ASM) for semi-automatic segmentation of the prostate boundary from ultrasound images. Optimisation of the 2D ASM for prostatic ultrasound was done first by examining ASM construction and image search parameters. Extension of the algorithm to three-dimensional (3D) segmentation was then done using rotational-based slicing. Evaluation of the 3D segmentation algorithm used distance- and volume-based error metrics to compare algorithm-generated boundary outlines to gold standard (manually generated) boundary outlines. Minimum description length landmark placement for ASM construction, and specific values for constraints and image search, were found to be optimal. Evaluation of the algorithm versus gold standard boundaries found an average mean absolute distance of 1.09 ± 0.49 mm, an average percent absolute volume difference of 3.28 ± 3.16%, and a 5x speed increase versus manual segmentation.
Affiliation(s)
- Adam C Hodge
- Department of Medical Biophysics, The University of Western Ontario, London, Ontario, Canada
36
Betrouni N, Pasquier D, Dewalle AS, Jounwaz R, Dubois P, Lopes R, Lartigau E. Ultrasound image registration for patient setup in conformal radiotherapy of prostate cancer. Conference Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society 2006; 2006:3795-3798. [PMID: 17945800] [DOI: 10.1109/iembs.2006.260455]
Abstract
The goal of 3D conformal radiotherapy (CRT) is to conform the high-dose region to the target volume while sparing surrounding normal tissue. Knowledge about the mobility of organs relative to the bony anatomy and to the reference position is of great importance for daily patient positioning. In this work we present a method to monitor patient setup during CRT of prostate cancer. The method is based on ultrasound tracking and matching with the planning modality.
Affiliation(s)
- N Betrouni
- INSERM U703, EA 1049, Lille 2 Univ., France
37
Sahba F, Tizhoosh HR, Salama MM. A coarse-to-fine approach to prostate boundary segmentation in ultrasound images. Biomed Eng Online 2005; 4:58. [PMID: 16219098] [PMCID: PMC1266388] [DOI: 10.1186/1475-925x-4-58]
Abstract
Background In this paper a novel method for prostate segmentation in transrectal ultrasound images is presented. Methods A segmentation procedure consisting of four main stages is proposed. In the first stage, a locally adaptive contrast enhancement method is used to generate a well-contrasted image. In the second stage, this enhanced image is thresholded to extract an area containing the prostate (or large portions of it). Morphological operators are then applied to obtain a point inside this area. Afterwards, a Kalman estimator is employed to distinguish the boundary from irrelevant parts (usually caused by shadow) and generate a coarsely segmented version of the prostate. In the third stage, dilation and erosion operators are applied to extract outer and inner boundaries from the coarsely estimated version. Consequently, fuzzy membership functions describing regional and gray-level information are employed to selectively enhance the contrast within the prostate region. In the last stage, the prostate boundary is extracted using strong edges obtained from the selectively enhanced image and information from the vicinity of the coarse estimation. Results A total average similarity of 98.76% (±0.68) with gold standards was achieved. Conclusion The proposed method represents a robust and accurate approach to prostate segmentation.
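A much-simplified sketch of the first two stages (locally adaptive contrast enhancement, thresholding, and morphological clean-up); the Kalman-based boundary estimation and fuzzy contrast enhancement of the later stages are omitted, and the assumption that the gland appears darker than its surroundings is illustrative:

```python
import numpy as np
from scipy import ndimage
from skimage import exposure, filters, morphology

def coarse_prostate_mask(image):
    """Rough coarse-stage estimate of the prostate region in a 2-D image."""
    img = exposure.rescale_intensity(np.asarray(image, dtype=float), out_range=(0, 1))
    enhanced = exposure.equalize_adapthist(img)                 # locally adaptive contrast
    mask = enhanced < filters.threshold_otsu(enhanced)          # assumes the gland is darker
    mask = morphology.binary_opening(mask, morphology.disk(5))  # remove small speckle blobs
    labels, _ = ndimage.label(mask)
    # Keep the connected component containing the image centre as the coarse prostate.
    centre_label = labels[labels.shape[0] // 2, labels.shape[1] // 2]
    return labels == centre_label if centre_label else mask
```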
Affiliation(s)
- Farhang Sahba
- Medical Instrument Analysis and Machine Intelligence Group, University of Waterloo, Waterloo, Canada
- Department of Systems Design Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario N2L 3G1, Canada
- Hamid R Tizhoosh
- Medical Instrument Analysis and Machine Intelligence Group, University of Waterloo, Waterloo, Canada
- Department of Systems Design Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario N2L 3G1, Canada
- Magdy M Salama
- Medical Instrument Analysis and Machine Intelligence Group, University of Waterloo, Waterloo, Canada
- Department of Electrical and Computer Engineering, University of Waterloo, 200 University Avenue West, Waterloo, Ontario N2L 3G1, Canada