1. Lee J, Salari K, Nandalur S, Shen C, Al-Katib S, Zhao L, Krauss D, Thompson A, Seymour Z, Nandalur K. Prognostic value of central gland volume on MRI for biochemical recurrence after prostate radiotherapy. Abdom Radiol (NY) 2024. PMID: 39592480. DOI: 10.1007/s00261-024-04717-7.
Abstract
PURPOSE: This study evaluates pretreatment prostate magnetic resonance imaging (MRI) metrics and clinical characteristics in predicting biochemical recurrence (BCR) after prostate radiotherapy (RT).
METHODS: In this retrospective single-institution study, we identified men in our prostate cancer database who underwent MRI within 6 months prior to completing definitive RT from May 2011 to February 2023. Central gland volume and peripheral zone volume were measured by a radiologist using manual segmentation, along with the Prostate Imaging-Reporting and Data System (PI-RADS) score. The primary objective was to determine the association of central gland volume with biochemical recurrence per the Phoenix criteria. Multivariable and inverse probability weighted (IPW) Cox proportional hazards regression models were constructed.
RESULTS: A total of 373 men were included, with a median follow-up of 28 months. Thirteen (3.5%) were low risk, 97 (26%) favorable intermediate risk, 201 (53.9%) unfavorable intermediate risk, and 62 (16.6%) high risk. Fifty-four (14.5%) patients received conventionally fractionated RT, 105 (28.2%) moderately hypofractionated RT, 121 (32.4%) high-dose-rate brachytherapy, and 93 (24.9%) stereotactic body RT. The 3- and 5-year rates of BCR were 7.8% and 18.3%, respectively. Higher central gland volume (per 5 cc) was associated with a decreased risk of BCR (hazard ratio [HR]: 0.69, 95% confidence interval [CI]: 0.50-0.94, p = 0.02) on the multivariable Cox model and the IPW model (HR: 0.75, 95% CI: 0.65-0.87, p < 0.001). No significant association was seen with peripheral zone volume, PI-RADS score, or RT modality.
CONCLUSION: Increased central gland volume on pretreatment prostate MRI is independently associated with a lower risk of biochemical recurrence after definitive radiation for prostate cancer. Central gland volume may improve patient selection and oncologic risk stratification prior to offering RT.
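The per-5-cc hazard ratio reported above corresponds to a standard Cox proportional hazards fit with the central gland volume rescaled by 5. The sketch below illustrates that calculation with the open-source lifelines package on a hypothetical cohort table; the column names, file name, and covariate set are placeholders, not the study's actual variables or software.

```python
# Minimal illustrative sketch only: hypothetical columns, not the study's data or full covariate set.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical file, one row per patient

# Rescale volume so the hazard ratio is expressed per 5 cc, as in the abstract.
df["central_gland_per5cc"] = df["central_gland_cc"] / 5.0

cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "bcr_event", "central_gland_per5cc", "psa"]],
    duration_col="followup_months",  # time to biochemical recurrence or censoring
    event_col="bcr_event",           # 1 = BCR per Phoenix criteria, 0 = censored
    # weights_col="ipw" could supply inverse-probability weights for an IPW-style fit
)
cph.print_summary()  # the exp(coef) column is the hazard ratio per 5 cc
```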
Affiliation(s)
- Joseph Lee
- Corewell Health William Beaumont University Hospital, Royal Oak, USA.
- Kamran Salari
- Corewell Health William Beaumont University Hospital, Royal Oak, USA
- Chen Shen
- Corewell Health William Beaumont University Hospital, Royal Oak, USA
- Sayf Al-Katib
- Corewell Health William Beaumont University Hospital, Royal Oak, USA
- Lili Zhao
- Corewell Health William Beaumont University Hospital, Royal Oak, USA
- Daniel Krauss
- Corewell Health William Beaumont University Hospital, Royal Oak, USA
- Kiran Nandalur
- Corewell Health William Beaumont University Hospital, Royal Oak, USA
2. Kuanar S, Cai J, Nakai H, Nagayama H, Takahashi H, LeGout J, Kawashima A, Froemming A, Mynderse L, Dora C, Humphreys M, Klug J, Korfiatis P, Erickson B, Takahashi N. Transition-zone PSA-density calculated from MRI deep learning prostate zonal segmentation model for prediction of clinically significant prostate cancer. Abdom Radiol (NY) 2024; 49:3722-3734. PMID: 38896250. DOI: 10.1007/s00261-024-04301-z.
Abstract
PURPOSE: To develop a deep learning (DL) zonal segmentation model of the prostate from T2-weighted MR images and to evaluate transition-zone PSA density (TZ-PSAD) for prediction of the presence of clinically significant prostate cancer (csPCa; Gleason score of 7 or higher) compared with conventional PSA density (PSAD).
METHODS: 1020 patients with a prostate MRI were randomly selected to develop a DL zonal segmentation model. The test dataset included 20 cases in which two radiologists manually segmented both the peripheral zone (PZ) and the transition zone (TZ). The pairwise Dice index was calculated for each zone. For the prediction of csPCa using PSAD and TZ-PSAD, we used 3461 consecutive MRI exams performed in patients without a history of prostate cancer, with pathological confirmation and available PSA values, that were not used in the development of the segmentation model, as an internal test set, and 1460 MRI exams from the PI-CAI challenge as an external test set. PSAD and TZ-PSAD were calculated from the segmentation model output. The area under the receiver operating characteristic curve (AUC) was compared between PSAD and TZ-PSAD using univariate and multivariate analysis (adjusted for age) with the DeLong test.
RESULTS: Dice scores of the model against the two radiologists were 0.87/0.87 for TZ and 0.74/0.72 for PZ, while those between the two radiologists were 0.88 for TZ and 0.75 for PZ. For the prediction of csPCa, the AUCs of TZ-PSAD were significantly higher than those of PSAD in both the internal test set (univariate analysis, 0.75 vs. 0.73, p < 0.001; multivariate analysis, 0.80 vs. 0.78, p < 0.001) and the external test set (univariate analysis, 0.76 vs. 0.74, p < 0.001; multivariate analysis, 0.77 vs. 0.75, p < 0.001).
CONCLUSION: DL model-derived zonal segmentation facilitates the practical measurement of TZ-PSAD and shows it to be a slightly better predictor of csPCa than the conventional PSAD. Use of TZ-PSAD may increase the sensitivity of detecting csPCa by 2-5% at a commonly used specificity level.
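As a concrete illustration of how TZ-PSAD can be derived from a zonal segmentation mask, the sketch below counts voxels per zone and divides the PSA value by the resulting volumes. The label convention, voxel spacing, file names, and PSA value are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def zone_volume_cc(mask: np.ndarray, label: int, spacing_mm=(0.5, 0.5, 3.0)) -> float:
    """Volume of one labeled zone in cc (voxel count x voxel volume)."""
    voxel_mm3 = float(np.prod(spacing_mm))
    return float((mask == label).sum()) * voxel_mm3 / 1000.0  # mm^3 -> cc

# Hypothetical label convention: 1 = transition zone, 2 = peripheral zone.
seg = np.load("zonal_segmentation.npy")  # assumed output of a zonal segmentation model
psa_ng_ml = 6.2                          # example PSA value

tz_cc = zone_volume_cc(seg, label=1)
whole_cc = zone_volume_cc(seg, label=1) + zone_volume_cc(seg, label=2)

psad = psa_ng_ml / whole_cc   # conventional PSA density
tz_psad = psa_ng_ml / tz_cc   # transition-zone PSA density evaluated in the study
print(f"PSAD = {psad:.3f}, TZ-PSAD = {tz_psad:.3f} ng/mL/cc")
```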
Affiliation(s)
- Shiba Kuanar
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA
- Jason Cai
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA
- Department of Radiology, Massachusetts General Hospital, Boston, MA, USA
- Hirotsugu Nakai
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA
- Hiroki Nagayama
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA
- Department of Radiology, Nagasaki University, Nagasaki, Japan
- Jordan LeGout
- Department of Radiology, Mayo Clinic, Jacksonville, FL, USA
- Adam Froemming
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA
- Chandler Dora
- Department of Urology, Mayo Clinic, Jacksonville, FL, USA
- Jason Klug
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA
- Naoki Takahashi
- Department of Radiology, Mayo Clinic, Rochester, MN, 55905, USA.
3. Gunashekar DD, Bielak L, Oerther B, Benndorf M, Nedelcu A, Hickey S, Zamboglou C, Grosu AL, Bock M. Comparison of data fusion strategies for automated prostate lesion detection using mpMRI correlated with whole mount histology. Radiat Oncol 2024; 19:96. PMID: 39080735. PMCID: PMC11287985. DOI: 10.1186/s13014-024-02471-0.
Abstract
BACKGROUND: In this work, we compare input-level, feature-level, and decision-level data fusion techniques for automatic detection of clinically significant prostate cancer lesions (csPCa).
METHODS: Multiple deep learning CNN architectures were developed using the U-Net as the baseline. The CNNs use as input either both multiparametric MRI images (T2W, ADC, and high b-value) and quantitative clinical data (prostate-specific antigen (PSA), PSA density (PSAD), prostate gland volume, and gross tumor volume (GTV)), or mpMRI images only (n = 118). In addition, co-registered ground truth data from whole-mount histopathology images (n = 22) were used as a test set for evaluation.
RESULTS: For early/intermediate/late-level fusion, the CNNs achieved a precision of 0.41/0.51/0.61, a recall of 0.18/0.22/0.25, an average precision of 0.13/0.19/0.27, and F scores of 0.55/0.67/0.76. The Dice Sorensen coefficient (DSC) was used to evaluate the influence of combining mpMRI with parametric clinical data for the detection of csPCa. We compared the DSC against the ground truth for the predictions of the CNNs trained with mpMRI plus parametric clinical data and of the CNNs trained with mpMRI images only, obtaining DSCs of 0.30/0.34/0.36 and 0.26/0.33/0.34, respectively. Additionally, we evaluated the influence of each mpMRI input channel for the task of csPCa detection and obtained a DSC of 0.14/0.25/0.28.
CONCLUSION: The results show that the decision-level fusion network performs better for the task of prostate lesion detection. Combining mpMRI data with quantitative clinical data does not show significant differences between these networks (p = 0.26/0.62/0.85). The results also show that CNNs trained with all mpMRI data outperform CNNs with fewer input channels, which is consistent with current clinical protocols, where the same input is used for PI-RADS lesion scoring.
TRIAL REGISTRATION: The trial was registered retrospectively at the German Register for Clinical Studies (DRKS) under proposal number Nr. 476/14 & 476/19.
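As a rough illustration of decision-level (late) fusion of the kind compared above, the snippet below averages the probability maps of two independently trained models, one mpMRI-only and one mpMRI-plus-clinical, and scores the fused prediction against a reference mask. The file names, averaging rule, and 0.5 threshold are placeholders, not the authors' implementation.

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> float:
    """Dice Sorensen coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum() + eps)

# Voxelwise csPCa probabilities from two hypothetical U-Net branches.
prob_img = np.load("probs_mpmri_only.npy")       # mpMRI-only model
prob_clin = np.load("probs_mpmri_clinical.npy")  # mpMRI + clinical-parameter model
ground_truth = np.load("histology_gt.npy")       # co-registered whole-mount reference

# Late fusion: combine the two decisions, then threshold.
fused = 0.5 * (prob_img + prob_clin)
prediction = fused > 0.5

print("Dice vs. histology ground truth:", round(dice(prediction, ground_truth), 3))
```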
Affiliation(s)
- Deepa Darshini Gunashekar
- Division of Medical Physics, Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany.
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany.
- Lars Bielak
- Division of Medical Physics, Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
- Benedict Oerther
- Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Matthias Benndorf
- Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Andrea Nedelcu
- Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Samantha Hickey
- Division of Medical Physics, Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- Constantinos Zamboglou
- Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- German Oncology Center, European University Cyprus, Limassol, Cyprus
- Anca-Ligia Grosu
- Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
- Michael Bock
- Division of Medical Physics, Department of Diagnostic and Interventional Radiology, University Medical Center Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
4. Fassia MK, Balasubramanian A, Woo S, Vargas HA, Hricak H, Konukoglu E, Becker AS. Deep Learning Prostate MRI Segmentation Accuracy and Robustness: A Systematic Review. Radiol Artif Intell 2024; 6:e230138. PMID: 38568094. PMCID: PMC11294957. DOI: 10.1148/ryai.230138.
Abstract
Purpose: To investigate the accuracy and robustness of prostate segmentation using deep learning across various training data sizes, MRI vendors, prostate zones, and testing methods relative to fellowship-trained diagnostic radiologists.
Materials and Methods: In this systematic review, Embase, PubMed, Scopus, and Web of Science databases were queried for English-language articles using keywords and related terms for prostate MRI segmentation and deep learning algorithms dated to July 31, 2022. A total of 691 articles from the search query were collected and subsequently filtered to 48 on the basis of predefined inclusion and exclusion criteria. Multiple characteristics were extracted from selected studies, such as deep learning algorithm performance, MRI vendor, and training dataset features. The primary outcome was comparison of mean Dice similarity coefficient (DSC) for prostate segmentation for deep learning algorithms versus diagnostic radiologists.
Results: Forty-eight studies were included. Most published deep learning algorithms for whole prostate gland segmentation (39 of 42 [93%]) had a DSC at or above expert level (DSC ≥ 0.86). The mean DSC was 0.79 ± 0.06 (SD) for peripheral zone, 0.87 ± 0.05 for transition zone, and 0.90 ± 0.04 for whole prostate gland segmentation. For selected studies that used one major MRI vendor, the mean DSCs were as follows: General Electric (three of 48 studies), 0.92 ± 0.03; Philips (four of 48 studies), 0.92 ± 0.02; and Siemens (six of 48 studies), 0.91 ± 0.03.
Conclusion: Deep learning algorithms for prostate MRI segmentation demonstrated accuracy similar to that of expert radiologists despite varying parameters; therefore, future research should shift toward evaluating segmentation robustness and patient outcomes across diverse clinical settings.
Keywords: MRI, Genital/Reproductive, Prostate Segmentation, Deep Learning. Systematic review registration link: osf.io/nxaev. © RSNA, 2024.
Affiliation(s)
- Mohammad-Kasim Fassia
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
- Adithya Balasubramanian
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
- Sungmin Woo
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
- Hebert Alberto Vargas
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
- Hedvig Hricak
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
- Ender Konukoglu
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
- Anton S. Becker
- From the Departments of Radiology (M.K.F.) and Urology (A.B.), New York-Presbyterian Weill Cornell Medical Center, 525 E 68th St, New York, NY 10065-4870; Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY (S.W., H.A.V., H.H., A.S.B.); and Department of Biomedical Imaging, ETH-Zurich, Zurich Switzerland (E.K.)
5. Talyshinskii A, Hameed BMZ, Ravinder PP, Naik N, Randhawa P, Shah M, Rai BP, Tokas T, Somani BK. Catalyzing Precision Medicine: Artificial Intelligence Advancements in Prostate Cancer Diagnosis and Management. Cancers (Basel) 2024; 16:1809. PMID: 38791888. PMCID: PMC11119252. DOI: 10.3390/cancers16101809.
Abstract
BACKGROUND: The aim was to analyze the current state of deep learning (DL)-based prostate cancer (PCa) diagnosis with a focus on magnetic resonance (MR) prostate reconstruction; PCa detection/stratification/reconstruction; positron emission tomography/computed tomography (PET/CT); androgen deprivation therapy (ADT); prostate biopsy; and the associated challenges and their clinical implications.
METHODS: A search of the PubMed database was conducted based on inclusion and exclusion criteria for the use of DL methods within the abovementioned areas.
RESULTS: A total of 784 articles were found, of which 64 were included. Reconstruction of the prostate, detection and stratification of prostate cancer, reconstruction of prostate cancer, and diagnosis on PET/CT, ADT, and biopsy were analyzed in 21, 22, 6, 7, 2, and 6 studies, respectively. Among studies describing DL use for MR-based purposes, datasets with magnetic field strengths of 3 T, 1.5 T, and 3/1.5 T were used in 18/19/5, 0/1/0, and 3/2/1 studies, respectively. Six of the seven studies analyzing DL for PET/CT diagnosis used data from a single institution. Among the radiotracers, [68Ga]Ga-PSMA-11, [18F]DCFPyL, and [18F]PSMA-1007 were used in 5, 1, and 1 study, respectively. Only two studies that analyzed DL in the context of ADT met the inclusion criteria; both were performed with a single-institution dataset with only manual labeling of training data. Three studies analyzing DL for prostate biopsy were performed with single- and multi-institutional datasets; TeUS, TRUS, and MRI were used as input modalities in two, three, and one study, respectively.
CONCLUSION: DL models in prostate cancer diagnosis show promise but are not yet ready for clinical use due to variability in methods, labels, and evaluation criteria. Conducting additional research while acknowledging all the limitations outlined is crucial for reinforcing the utility and effectiveness of DL-based models in clinical settings.
Affiliation(s)
- Ali Talyshinskii
- Department of Urology and Andrology, Astana Medical University, Astana 010000, Kazakhstan;
- Prajwal P. Ravinder
- Department of Urology, Kasturba Medical College, Mangaluru, Manipal Academy of Higher Education, Manipal 576104, India;
- Nithesh Naik
- Department of Mechanical and Industrial Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal 576104, India;
- Princy Randhawa
- Department of Mechatronics, Manipal University Jaipur, Jaipur 303007, India;
- Milap Shah
- Department of Urology, Aarogyam Hospital, Ahmedabad 380014, India;
- Bhavan Prasad Rai
- Department of Urology, Freeman Hospital, Newcastle upon Tyne NE7 7DN, UK;
- Theodoros Tokas
- Department of Urology, Medical School, University General Hospital of Heraklion, University of Crete, 14122 Heraklion, Greece;
- Bhaskar K. Somani
- Department of Mechanical and Industrial Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal 576104, India;
- Department of Urology, University Hospital Southampton NHS Trust, Southampton SO16 6YD, UK
6. Molière S, Hamzaoui D, Granger B, Montagne S, Allera A, Ezziane M, Luzurier A, Quint R, Kalai M, Ayache N, Delingette H, Renard-Penna R. Reference standard for the evaluation of automatic segmentation algorithms: Quantification of inter observer variability of manual delineation of prostate contour on MRI. Diagn Interv Imaging 2024; 105:65-73. PMID: 37822196. DOI: 10.1016/j.diii.2023.08.001.
Abstract
PURPOSE: The purpose of this study was to investigate inter-reader variability in manual prostate contour segmentation on magnetic resonance imaging (MRI) examinations and to determine the optimal number of readers required to establish a reliable reference standard.
MATERIALS AND METHODS: Seven radiologists with various levels of experience independently performed manual segmentation of the prostate contour (whole gland [WG] and transition zone [TZ]) on 40 prostate MRI examinations obtained in 40 patients. Inter-reader variability in prostate contour delineations was estimated using standard metrics (Dice similarity coefficient [DSC], Hausdorff distance, and volume-based metrics). The impact of the number of readers (from two to seven) on segmentation variability was assessed using pairwise metrics (consistency) and metrics with respect to a reference segmentation (conformity), obtained either with majority voting or with the simultaneous truth and performance level estimation (STAPLE) algorithm.
RESULTS: The average segmentation DSC for two readers in pairwise comparison was 0.919 for WG and 0.876 for TZ. Variability decreased with the number of readers: the interquartile ranges of the DSC were 0.076 (WG) / 0.021 (TZ) for configurations with two readers, 0.005 (WG) / 0.012 (TZ) for configurations with three readers, and 0.002 (WG) / 0.0037 (TZ) for configurations with six readers. The interquartile range decreased slightly faster between two and three readers than between three and six readers. When using consensus methods, variability often reached its minimum with three readers (with STAPLE, DSC = 0.96 [range: 0.945-0.971] for WG and DSC = 0.94 [range: 0.912-0.957] for TZ), and the interquartile range was minimal for configurations with three readers.
CONCLUSION: The number of readers affects inter-reader variability, in terms of both inter-reader consistency and conformity to a reference. Variability is minimal, or at least reaches a tipping point in its evolution, with three readers, for both pairwise metrics and metrics computed with respect to a reference. Accordingly, three readers may represent an optimal number for determining reference segmentations for artificial intelligence applications.
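A minimal sketch of the pairwise-consistency and conformity computations discussed above is given below; it uses simple majority voting as the consensus rule (STAPLE would need a dedicated implementation) and assumes binary whole-gland masks of identical shape stored in hypothetical files.

```python
from itertools import combinations
import numpy as np

def dice(a, b, eps=1e-8):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum() + eps)

# Binary whole-gland masks, one per radiologist (hypothetical data, 7 readers).
readers = [np.load(f"reader_{i}_wg.npy") for i in range(7)]

# Consistency: DSC for every pair of readers.
pairwise = [dice(a, b) for a, b in combinations(readers, 2)]
print("mean pairwise DSC:", np.mean(pairwise))

# Conformity: DSC of each reader against a majority-vote consensus reference.
stack = np.stack(readers).astype(np.uint8)
consensus = stack.sum(axis=0) >= (len(readers) // 2 + 1)
conformity = [dice(r, consensus) for r in readers]
print("mean DSC vs. majority-vote reference:", np.mean(conformity))
```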
Affiliation(s)
- Sébastien Molière
- Department of Radiology, Hôpitaux Universitaire de Strasbourg, Hôpital de Hautepierre, 67200, Strasbourg, France; Breast and Thyroid Imaging Unit, Institut de Cancérologie Strasbourg Europe, 67200, Strasbourg, France; IGBMC, Institut de Génétique et de Biologie Moléculaire et Cellulaire, 67400, Illkirch, France.
- Dimitri Hamzaoui
- Inria, Epione Team, Sophia Antipolis, Université Côte d'Azur, 06902, Nice, France
- Benjamin Granger
- Sorbonne Université, INSERM, Institut Pierre Louis d'Epidémiologie et de Santé Publique, IPLESP, AP-HP, Hôpital Pitié Salpêtrière, Département de Santé Publique, 75013, Paris, France
- Sarah Montagne
- Department of Radiology, Hôpital Tenon, Assistance Publique-Hôpitaux de Paris, 75020, Paris, France; Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France; GRC N° 5, Oncotype-Uro, Sorbonne Université, 75020, Paris, France
- Alexandre Allera
- Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France
- Malek Ezziane
- Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France
- Anna Luzurier
- Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France
- Raphaelle Quint
- Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France
- Mehdi Kalai
- Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France
- Nicholas Ayache
- Department of Radiology, Hôpitaux Universitaire de Strasbourg, Hôpital de Hautepierre, 67200, Strasbourg, France
- Hervé Delingette
- Department of Radiology, Hôpitaux Universitaire de Strasbourg, Hôpital de Hautepierre, 67200, Strasbourg, France
- Raphaële Renard-Penna
- Department of Radiology, Hôpital Tenon, Assistance Publique-Hôpitaux de Paris, 75020, Paris, France; Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique-Hôpitaux de Paris, 75013, Paris, France; GRC N° 5, Oncotype-Uro, Sorbonne Université, 75020, Paris, France
7. Yang B, Liu Y, Zhu J, Lu N, Dai J, Men K. Pretreatment information-aided automatic segmentation for online magnetic resonance imaging-guided prostate radiotherapy. Med Phys 2024; 51:922-932. PMID: 37449545. DOI: 10.1002/mp.16608.
Abstract
BACKGROUND: It is necessary to contour regions of interest (ROIs) for online magnetic resonance imaging (MRI)-guided adaptive radiotherapy (MRIgART). These updated contours are used for online replanning to obtain maximum dosimetric benefit. Contouring can be accomplished using deformable image registration (DIR) and deep learning (DL)-based autosegmentation methods. However, these methods may require considerable manual editing and thus prolong treatment time.
PURPOSE: The present study aimed to improve autosegmentation performance by integrating patients' pretreatment information into a DL-based segmentation algorithm, and thereby to improve the efficiency of the current MRIgART process.
METHODS: Forty patients with prostate cancer were enrolled retrospectively. The online adaptive MR images, patient-specific planning computed tomography (CT), and contours on CT were used for segmentation. Deformable registration of the planning CT and MR images was performed first to obtain a deformed CT and corresponding contours. A novel DL network that can integrate such patient-specific information (deformed CT and corresponding contours) into the segmentation task on MR images was designed. We performed four-fold cross-validation for the DL models. The proposed method was compared with DIR and DL methods for segmentation in prostate cancer. The ROIs included the clinical target volume (CTV), bladder, rectum, left femur head, and right femur head. Dosimetric parameters of automatically generated ROIs were evaluated using a clinical treatment planning system.
RESULTS: The proposed method enhanced the segmentation accuracy of conventional procedures. Its mean Dice similarity coefficient over the five ROIs (93.5%) was higher than that of both DIR (87.5%) and DL (87.2%). The numbers of patients (n = 40) that required major editing using DIR, DL, and our method were 12, 18, and 7 (CTV); 17, 4, and 1 (bladder); 8, 11, and 5 (rectum); 2, 4, and 1 (left femur head); and 3, 7, and 1 (right femur head), respectively. The Spearman rank correlation coefficient of dosimetry parameters between the proposed method and the ground truth was 0.972 ± 0.040, higher than that of DIR (0.897 ± 0.098) and DL (0.871 ± 0.134).
CONCLUSION: This study proposed a novel method that integrates patient-specific pretreatment information into a DL-based segmentation algorithm. It outperformed the baseline methods, thereby improving efficiency and segmentation accuracy in adaptive radiotherapy.
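One simple way to hand such pretreatment priors to a segmentation network, assuming the deformed planning CT and its propagated contours have been resampled to the online MR grid, is to stack them as extra input channels. The sketch below shows that generic pattern only; it is not the authors' published architecture.

```python
import torch
import torch.nn as nn

class PriorAwareSegNet(nn.Module):
    """Toy segmentation head that accepts MR plus pretreatment prior channels."""

    def __init__(self, n_rois: int = 5):
        super().__init__()
        # 3 input channels: online MR, deformed planning CT, propagated contour mask.
        self.backbone = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, n_rois, kernel_size=1),
        )

    def forward(self, mr, deformed_ct, prior_mask):
        x = torch.cat([mr, deformed_ct, prior_mask], dim=1)  # channel-wise fusion of the priors
        return self.backbone(x)

# Example shapes: batch of 1, single-channel 3D volumes of 64^3 voxels.
net = PriorAwareSegNet()
mr = torch.randn(1, 1, 64, 64, 64)
ct = torch.randn(1, 1, 64, 64, 64)
prior = torch.zeros(1, 1, 64, 64, 64)
logits = net(mr, ct, prior)  # -> (1, 5, 64, 64, 64), one channel per ROI
```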
Affiliation(s)
- Bining Yang
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Yuxiang Liu
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Ji Zhu
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Ningning Lu
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Jianrong Dai
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
- Kuo Men
- National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Beijing, China
8. Kaneko M, Magoulianitis V, Ramacciotti LS, Raman A, Paralkar D, Chen A, Chu TN, Yang Y, Xue J, Yang J, Liu J, Jadvar DS, Gill K, Cacciamani GE, Nikias CL, Duddalwar V, Jay Kuo CC, Gill IS, Abreu AL. The Novel Green Learning Artificial Intelligence for Prostate Cancer Imaging: A Balanced Alternative to Deep Learning and Radiomics. Urol Clin North Am 2024; 51:1-13. PMID: 37945095. DOI: 10.1016/j.ucl.2023.08.001.
Abstract
The application of artificial intelligence (AI) to prostate magnetic resonance imaging (MRI) has shown promising results. Several AI systems have been developed to automatically analyze prostate MRI for segmentation, cancer detection, and region-of-interest characterization, thereby assisting clinicians in their decision-making process. Deep learning, the current trend in imaging AI, has limitations, including a lack of transparency (the "black box" problem), large data-processing requirements, and excessive energy consumption. In this narrative review, the authors provide an overview of the recent advances in AI for prostate cancer diagnosis and introduce their next-generation AI model, Green Learning, as a promising solution.
Affiliation(s)
- Masatomo Kaneko
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer; Department of Urology, Graduate School of Medical Science, Kyoto Prefectural University of Medicine, Kyoto, Japan
- Vasileios Magoulianitis
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Lorenzo Storino Ramacciotti
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer
- Alex Raman
- Western University of Health Sciences. Pomona, CA, USA
- Divyangi Paralkar
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer
- Andrew Chen
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer
- Timothy N Chu
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer
- Yijing Yang
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Jintang Xue
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Jiaxin Yang
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Jinyuan Liu
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Donya S Jadvar
- Dornsife School of Letters and Science, University of Southern California, Los Angeles, CA, USA
- Karanvir Gill
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer
- Giovanni E Cacciamani
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer; Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Chrysostomos L Nikias
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Vinay Duddalwar
- Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- C-C Jay Kuo
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Inderbir S Gill
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Andre Luis Abreu
- USC Institute of Urology and Catherine & Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; USC Institute of Urology, Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer; Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA.
9. Thimansson E, Baubeta E, Engman J, Bjartell A, Zackrisson S. Deep learning performance on MRI prostate gland segmentation: evaluation of two commercially available algorithms compared with an expert radiologist. J Med Imaging (Bellingham) 2024; 11:015002. PMID: 38404754. PMCID: PMC10882278. DOI: 10.1117/1.jmi.11.1.015002.
Abstract
Purpose: Accurate whole-gland prostate segmentation is crucial for successful ultrasound-MRI fusion biopsy, focal cancer treatment, and radiation therapy techniques. Commercially available artificial intelligence (AI) models using deep learning algorithms (DLAs) for prostate gland segmentation are rapidly increasing in number, yet their performance in a true clinical context is scarcely examined or published. We used a heterogeneous clinical MRI dataset in this study, aiming to contribute to the validation of AI models.
Approach: We included 123 patients in this retrospective multicenter (7 hospitals), multiscanner (8 scanners, 2 vendors, 1.5T and 3T) study comparing prostate contour assessment by two commercially available, Food and Drug Administration (FDA)-cleared and CE-marked algorithms (DLA1 and DLA2), using an expert radiologist's manual contours as the reference standard (RSexp) on this clinically heterogeneous MRI dataset. No in-house training of the DLAs was performed before testing. Several methods for comparing segmentation overlap were used, the Dice similarity coefficient (DSC) being the most important.
Results: The DSC (mean ± standard deviation) for DLA1 versus the radiologist reference standard (RSexp) was 0.90 ± 0.05, and for DLA2 versus RSexp it was 0.89 ± 0.04. A paired t-test comparing the DSC for DLA1 and DLA2 showed no statistically significant difference (p = 0.8).
Conclusions: Two commercially available DL algorithms (FDA-cleared and CE-marked) can perform accurate whole-gland prostate segmentation on a par with expert radiologist manual planimetry on a real-world clinical dataset. Implementing AI models in the clinical routine may free up time that can be better invested in complex work tasks, adding more patient value.
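The statistical comparison reported above, a paired t-test on per-patient Dice scores of the two algorithms, can be reproduced along the lines of the following sketch; the arrays and file names are placeholders for per-patient DSC values against the expert reference, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-patient Dice scores against the expert reference (one value per patient).
dsc_dla1 = np.load("dsc_dla1.npy")
dsc_dla2 = np.load("dsc_dla2.npy")

print("DLA1: %.2f +/- %.2f" % (dsc_dla1.mean(), dsc_dla1.std()))
print("DLA2: %.2f +/- %.2f" % (dsc_dla2.mean(), dsc_dla2.std()))

# Paired test: the same patients are segmented by both algorithms.
t_stat, p_value = stats.ttest_rel(dsc_dla1, dsc_dla2)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.2f}")
```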
Affiliation(s)
- Erik Thimansson
- Lund University, Department of Translational Medicine, Diagnostic Radiology, Malmö, Sweden
- Helsingborg Hospital, Department of Radiology, Helsingborg, Sweden
- Erik Baubeta
- Lund University, Department of Translational Medicine, Diagnostic Radiology, Malmö, Sweden
- Skåne University Hospital, Department of Imaging and Functional Medicine, Malmö, Sweden
- Jonatan Engman
- Lund University, Department of Translational Medicine, Diagnostic Radiology, Malmö, Sweden
- Skåne University Hospital, Department of Imaging and Functional Medicine, Malmö, Sweden
- Anders Bjartell
- Lund University, Department of Translational Medicine, Urology, Malmö, Sweden
- Skåne University Hospital, Department of Urology, Malmö, Sweden
- Sophia Zackrisson
- Lund University, Department of Translational Medicine, Diagnostic Radiology, Malmö, Sweden
- Skåne University Hospital, Department of Imaging and Functional Medicine, Malmö, Sweden
10. Jeganathan T, Salgues E, Schick U, Tissot V, Fournier G, Valéri A, Nguyen TA, Bourbonne V. Inter-Rater Variability of Prostate Lesion Segmentation on Multiparametric Prostate MRI. Biomedicines 2023; 11:3309. PMID: 38137530. PMCID: PMC10741937. DOI: 10.3390/biomedicines11123309.
Abstract
INTRODUCTION: External radiotherapy is a major treatment for localized prostate cancer (PCa). Dose escalation to the whole prostate gland increases biochemical relapse-free survival but also acute and late toxicities. Dose escalation to the dominant index lesion (DIL) only is of growing interest, but it requires robust delineation of the DIL. In this context, we aimed to evaluate the inter-observer variability of DIL delineation.
MATERIALS AND METHODS: Two junior radiologists and a senior radiation oncologist delineated DILs on 64 mpMRIs of patients with histologically confirmed PCa. For each mpMRI and each reader, eight individual DIL segmentations were produced. These delineations were performed blinded from one another and resulted from the individual analysis of the T2, apparent diffusion coefficient (ADC), b2000, and dynamic contrast-enhanced (DCE) sequences, as well as the analysis of combined sequences (T2+ADC, T2+ADC+b2000, T2+ADC+DCE, and T2+ADC+b2000+DCE). Delineation variability was assessed using the Dice coefficient, Jaccard index, Hausdorff distance, and mean distance to agreement.
RESULTS: The T2, ADC, T2+ADC, b2000, T2+ADC+b2000, T2+ADC+DCE, and T2+ADC+b2000+DCE sequences obtained Dice coefficients of 0.51, 0.50, 0.54, 0.52, 0.54, 0.55, and 0.53, respectively, all significantly higher than the perfusion (DCE) sequence alone (0.35, p < 0.001). The analysis of the other similarity metrics led to similar results. Tumor volume and PI-RADS classification were positively correlated with the Dice scores.
CONCLUSION: Our study showed that the contours of prostatic lesions were more reproducible on certain sequences but confirmed the great variability of these contours, with a maximum Dice coefficient of 0.55 (joint analysis of the T2, ADC, and perfusion sequences).
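For readers unfamiliar with the overlap and distance metrics listed above, the sketch below computes the Jaccard index and a symmetric Hausdorff distance for two binary lesion masks; it works in voxel units, and the input files are placeholders rather than the study data.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def jaccard(a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> float:
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / (np.logical_or(a, b).sum() + eps)

def hausdorff(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetric Hausdorff distance between the foreground voxel coordinates of two masks."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

# Hypothetical delineations of the same dominant index lesion by two readers.
reader1 = np.load("dil_reader1.npy")
reader2 = np.load("dil_reader2.npy")
print("Jaccard:", round(jaccard(reader1, reader2), 3))
print("Hausdorff (voxels):", round(hausdorff(reader1, reader2), 2))
```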
Affiliation(s)
- Thibaut Jeganathan
- Radiology Department, University Hospital, 29200 Brest, France; (T.J.); (E.S.); (V.T.)
- Emile Salgues
- Radiology Department, University Hospital, 29200 Brest, France; (T.J.); (E.S.); (V.T.)
- Ulrike Schick
- Radiation Oncology Department, University Hospital, 29200 Brest, France;
- INSERM, LaTIM UMR 1101, University of Western Brittany, 29238 Brest, France; (G.F.); (A.V.); (T.-A.N.)
- Valentin Tissot
- Radiology Department, University Hospital, 29200 Brest, France; (T.J.); (E.S.); (V.T.)
- Georges Fournier
- INSERM, LaTIM UMR 1101, University of Western Brittany, 29238 Brest, France; (G.F.); (A.V.); (T.-A.N.)
- Urology Department, University Hospital, 29200 Brest, France
- Antoine Valéri
- INSERM, LaTIM UMR 1101, University of Western Brittany, 29238 Brest, France; (G.F.); (A.V.); (T.-A.N.)
- Urology Department, University Hospital, 29200 Brest, France
- Truong-An Nguyen
- INSERM, LaTIM UMR 1101, University of Western Brittany, 29238 Brest, France; (G.F.); (A.V.); (T.-A.N.)
- Urology Department, University Hospital, 29200 Brest, France
- Vincent Bourbonne
- Radiation Oncology Department, University Hospital, 29200 Brest, France;
- INSERM, LaTIM UMR 1101, University of Western Brittany, 29238 Brest, France; (G.F.); (A.V.); (T.-A.N.)
11. Mervak BM, Fried JG, Wasnik AP. A Review of the Clinical Applications of Artificial Intelligence in Abdominal Imaging. Diagnostics (Basel) 2023; 13:2889. PMID: 37761253. PMCID: PMC10529018. DOI: 10.3390/diagnostics13182889.
Abstract
Artificial intelligence (AI) has been a topic of substantial interest for radiologists in recent years. Although many of the first clinical applications were in the neuro, cardiothoracic, and breast imaging subspecialties, the number of investigated and real-world applications in body imaging has been increasing, with more than 30 FDA-approved algorithms now available for applications in the abdomen and pelvis. In this manuscript, we explore some of the fundamentals of artificial intelligence and machine learning, review major functions that AI algorithms may perform, introduce current and potential future applications of AI in abdominal imaging, provide a basic understanding of the pathways by which AI algorithms can receive FDA approval, and explore some of the challenges of implementing AI in clinical practice.
Affiliation(s)
- Ashish P. Wasnik
- Department of Radiology, University of Michigan—Michigan Medicine, 1500 E. Medical Center Dr., Ann Arbor, MI 48109, USA; (B.M.M.); (J.G.F.)
12. Yan Y, Liu R, Chen H, Zhang L, Zhang Q. CCT-Unet: A U-Shaped Network Based on Convolution Coupled Transformer for Segmentation of Peripheral and Transition Zones in Prostate MRI. IEEE J Biomed Health Inform 2023; 27:4341-4351. PMID: 37368800. DOI: 10.1109/jbhi.2023.3289913.
Abstract
The accurate segmentation of the prostate region in magnetic resonance imaging (MRI) can provide a reliable basis for artificially intelligent diagnosis of prostate cancer. Transformer-based models have been increasingly used in image analysis due to their ability to acquire long-term global contextual features. Although the Transformer can provide feature representations of the overall appearance and contour representations at long distance, it does not perform well on small-scale prostate MRI datasets due to its insensitivity to local variation, such as the heterogeneity of the grayscale intensities in the peripheral zone and transition zone across patients; meanwhile, convolutional neural networks (CNNs) retain these local features well. Therefore, a robust prostate segmentation model that can aggregate the characteristics of CNNs and Transformers is desired. In this work, a U-shaped network based on the convolution coupled Transformer is proposed for segmentation of the peripheral and transition zones in prostate MRI, named the convolution coupled Transformer U-Net (CCT-Unet). A convolutional embedding block is first designed for encoding the high-resolution input to retain the edge detail of the image. The convolution coupled Transformer block is then proposed to enhance local feature extraction and to capture long-term correlations that encompass anatomical information. A feature conversion module is also proposed to alleviate the semantic gap across the skip connections. Extensive experiments comparing CCT-Unet with several state-of-the-art methods on both the ProstateX open dataset and the self-built Huashan dataset have consistently shown the accuracy and robustness of CCT-Unet for prostate segmentation in MRI.
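A generic way to couple convolution with self-attention inside a single block, loosely in the spirit of the convolution coupled Transformer described above, is sketched below; this is a schematic illustration with arbitrary channel and head counts, not the published CCT-Unet code.

```python
import torch
import torch.nn as nn

class ConvAttentionBlock(nn.Module):
    """Schematic block: a local convolutional branch plus a global self-attention branch."""

    def __init__(self, channels: int = 64, heads: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(),
        )
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x):                      # x: (B, C, H, W)
        local = self.conv(x)                   # local texture and edge features
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, H*W, C) token sequence
        attn_out, _ = self.attn(tokens, tokens, tokens)
        global_feat = self.norm(attn_out).transpose(1, 2).reshape(b, c, h, w)
        return local + global_feat             # couple the two branches

block = ConvAttentionBlock()
features = torch.randn(1, 64, 32, 32)          # toy feature map from an encoder stage
out = block(features)                          # same shape: (1, 64, 32, 32)
```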
13. Kim H, Kang SW, Kim JH, Nagar H, Sabuncu M, Margolis DJA, Kim CK. The role of AI in prostate MRI quality and interpretation: Opportunities and challenges. Eur J Radiol 2023; 165:110887. PMID: 37245342. DOI: 10.1016/j.ejrad.2023.110887.
Abstract
Prostate MRI plays an important role in imaging the prostate gland and surrounding tissues, particularly in the diagnosis and management of prostate cancer. With the widespread adoption of multiparametric MRI in recent years, concerns surrounding variability in imaging quality have garnered increased attention. Several factors contribute to this inconsistency, such as acquisition parameters, scanner differences, and interobserver variability. While efforts have been made to standardize image acquisition and interpretation via the development of systems such as PI-RADS and PI-QUAL, these scoring systems still depend on the subjective experience and acumen of human readers. Artificial intelligence (AI) has been increasingly used in many applications, including medical imaging, due to its ability to automate tasks and lower human error rates. These advantages have the potential to standardize the tasks of image interpretation and quality control of prostate MRI. Despite this potential, thorough validation is required before the implementation of AI in clinical practice. In this article, we explore the opportunities and challenges of AI, with a focus on the interpretation and quality of prostate MRI.
Affiliation(s)
- Heejong Kim
- Department of Radiology, Weill Cornell Medical College, 525 E 68th St Box 141, New York, NY 10021, United States
- Shin Won Kang
- Research Institute for Future Medicine, Samsung Medical Center, Republic of Korea
- Jae-Hun Kim
- Department of Radiology, Samsung Medical Center, Sungkyunkwan University School of Medicine, Republic of Korea
- Himanshu Nagar
- Department of Radiation Oncology, Weill Cornell Medical College, 525 E 68th St, New York, NY 10021, United States
- Mert Sabuncu
- Department of Radiology, Weill Cornell Medical College, 525 E 68th St Box 141, New York, NY 10021, United States
- Daniel J A Margolis
- Department of Radiology, Weill Cornell Medical College, 525 E 68th St Box 141, New York, NY 10021, United States.
- Chan Kyo Kim
- Department of Radiology and Center for Imaging Science, Samsung Medical Center, Sungkyunkwan University School of Medicine, Republic of Korea
14. He M, Cao Y, Chi C, Yang X, Ramin R, Wang S, Yang G, Mukhtorov O, Zhang L, Kazantsev A, Enikeev M, Hu K. Research progress on deep learning in magnetic resonance imaging-based diagnosis and treatment of prostate cancer: a review on the current status and perspectives. Front Oncol 2023; 13:1189370. PMID: 37546423. PMCID: PMC10400334. DOI: 10.3389/fonc.2023.1189370.
Abstract
Multiparametric magnetic resonance imaging (mpMRI) has emerged as a first-line screening and diagnostic tool for prostate cancer, aiding in treatment selection and noninvasive radiotherapy guidance. However, the manual interpretation of MRI data is challenging and time-consuming, which may impact sensitivity and specificity. With recent technological advances, artificial intelligence (AI) in the form of computer-aided diagnosis (CAD) based on MRI data has been applied to prostate cancer diagnosis and treatment. Among AI techniques, deep learning involving convolutional neural networks contributes to the detection, segmentation, scoring, grading, and prognostic evaluation of prostate cancer. CAD systems offer automated operation, rapid processing, and high accuracy, incorporating multiple sequences of multiparametric MRI data of the prostate gland into the deep learning model. They have therefore become a research direction of great interest, especially in smart healthcare. This review highlights the current progress of deep learning technology in MRI-based diagnosis and treatment of prostate cancer. The key elements of deep learning-based MRI image processing in CAD systems and radiotherapy of prostate cancer are briefly described, making them understandable not only for radiologists but also for general physicians without specialized imaging interpretation training. Deep learning technology enables lesion identification, detection, and segmentation, grading and scoring of prostate cancer, and prediction of postoperative recurrence and prognostic outcomes. The diagnostic accuracy of deep learning can be improved by optimizing models and algorithms, expanding medical database resources, and combining multi-omics data with comprehensive analysis of various morphological data. Deep learning has the potential to become the key diagnostic method in prostate cancer diagnosis and treatment in the future.
Affiliation(s)
- Mingze He
- Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Yu Cao
- I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Changliang Chi
- Department of Urology, The First Hospital of Jilin University (Lequn Branch), Changchun, Jilin, China
- Xinyi Yang
- I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Rzayev Ramin
- Department of Radiology, The Second University Clinic, I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Shuowen Wang
- I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Guodong Yang
- I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Otabek Mukhtorov
- Regional State Budgetary Health Care Institution, Kostroma Regional Clinical Hospital named after Korolev E.I. Avenue Mira, Kostroma, Russia
- Liqun Zhang
- School of Biomedical Engineering, Faculty of Medicine, Dalian University of Technology, Dalian, Liaoning, China
- Anton Kazantsev
- Regional State Budgetary Health Care Institution, Kostroma Regional Clinical Hospital named after Korolev E.I. Avenue Mira, Kostroma, Russia
- Mikhail Enikeev
- Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University (Sechenov University), Moscow, Russia
- Kebang Hu
- Department of Urology, The First Hospital of Jilin University (Lequn Branch), Changchun, Jilin, China
15. Canellas R, Kohli MD, Westphalen AC. The Evidence for Using Artificial Intelligence to Enhance Prostate Cancer MR Imaging. Curr Oncol Rep 2023; 25:243-250. PMID: 36749494. DOI: 10.1007/s11912-023-01371-y.
Abstract
PURPOSE OF REVIEW: The purpose of this review is to summarize the current status of artificial intelligence applied to prostate cancer MR imaging.
RECENT FINDINGS: Artificial intelligence has been applied to prostate cancer MR imaging to improve its diagnostic accuracy and the reproducibility of interpretation. Multiple models have been tested for gland segmentation and volume calculation, automated lesion detection, localization, and characterization, as well as prediction of tumor aggressiveness and tumor recurrence. Studies show, for example, that very robust automated gland segmentation and volume calculations can be achieved and that lesions can be detected and accurately characterized. Although results are promising, we should view these with caution. Most studies included a small sample of patients from a single institution, and most models did not undergo proper external validation. More research is needed, with larger and well-designed studies, for the development of reliable artificial intelligence tools.
Affiliation(s)
- Rodrigo Canellas
- Department of Radiology, University of Washington, 1959 NE Pacific St., 2nd Floor, Seattle, WA, 98195, USA
- Marc D Kohli
- Clinical Informatics, Department of Radiology and Biomedical Imaging, University of California, San Francisco, CA, 94143, USA.,Imaging Informatics, UCSF Health, 500 Parnassus Ave, 3rd Floor, San Francisco, CA, 94143, USA
- Antonio C Westphalen
- Department of Radiology, University of Washington, 1959 NE Pacific St., 2nd Floor, Seattle, WA, 98195, USA. .,Department of Urology, University of Washington, 1959 NE Pacific St., 2nd Floor, Seattle, WA, 98195, USA. .,Department Radiation Oncology, University of Washington, 1959 NE Pacific St., 2nd Floor, Seattle, WA, 98195, USA.
16. Automated prostate multi-regional segmentation in magnetic resonance using fully convolutional neural networks. Eur Radiol 2023. PMID: 36690774. DOI: 10.1007/s00330-023-09410-9.
Abstract
OBJECTIVE: Automatic MR imaging segmentation of the prostate provides relevant clinical benefits for prostate cancer evaluation, such as calculation of automated PSA density and other critical imaging biomarkers. Further, automated T2-weighted image segmentation of the central-transition zone (CZ-TZ), peripheral zone (PZ), and seminal vesicle (SV) can help to evaluate clinically significant cancer following the PI-RADS v2.1 guidelines. Therefore, the main objective of this work was to develop a robust and reproducible CNN-based automatic prostate multi-regional segmentation model using an intercontinental cohort of prostate MRI.
METHODS: A heterogeneous database of 243 T2-weighted prostate studies from 7 countries and 10 machines of 3 different vendors, with the CZ-TZ, PZ, and SV regions manually delineated by two experienced radiologists (ground truth), was used to train (n = 123) and test (n = 120) a U-Net-based model with deep supervision using a cyclical learning rate. The performance of the model was evaluated by means of the Dice similarity coefficient (DSC), among other metrics. Segmentation results with a DSC above 0.7 were considered accurate.
RESULTS: The proposed method obtained a DSC of 0.88 ± 0.01, 0.85 ± 0.02, 0.72 ± 0.02, and 0.72 ± 0.02 for the prostate gland, CZ-TZ, PZ, and SV, respectively, in the 120 studies of the test set when comparing the predicted segmentations with the ground truth. No statistically significant differences were found between manufacturers or continents.
CONCLUSION: Prostate multi-regional T2-weighted MR image segmentation can be accurately automated by U-Net-like CNNs and generalizes to a highly variable clinical environment with different equipment, acquisition configurations, and populations.
KEY POINTS:
• Deep learning techniques allow accurate segmentation of the prostate into three different regions on T2-weighted MR images.
• A multicentric database demonstrated the generalization of the CNN model across institutions on different continents.
• CNN models can be used to aid in the diagnosis and follow-up of patients with prostate cancer.
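The cyclical learning rate mentioned above can be set up in PyTorch as sketched below; the optimizer, learning-rate bounds, and step sizes are illustrative assumptions, not the values used in the paper.

```python
import torch
from torch import nn, optim

model = nn.Conv3d(1, 4, kernel_size=3, padding=1)  # stand-in for a U-Net with deep supervision
optimizer = optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)

# Triangular cyclical schedule: lr oscillates between base_lr and max_lr every 2 * step_size_up batches.
scheduler = optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-5, max_lr=1e-2, step_size_up=200, mode="triangular"
)

for step in range(1000):              # toy training loop
    x = torch.randn(2, 1, 32, 32, 32)
    loss = model(x).pow(2).mean()     # placeholder loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()                  # advance the cyclical schedule once per batch
```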
|
17
|
Hung ALY, Zheng H, Miao Q, Raman SS, Terzopoulos D, Sung K. CAT-Net: A Cross-Slice Attention Transformer Model for Prostate Zonal Segmentation in MRI. IEEE TRANSACTIONS ON MEDICAL IMAGING 2023; 42:291-303. [PMID: 36194719 PMCID: PMC10071136 DOI: 10.1109/tmi.2022.3211764] [Citation(s) in RCA: 17] [Impact Index Per Article: 8.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/10/2023]
Abstract
Prostate cancer is the second leading cause of cancer death among men in the United States. Diagnosis on prostate MRI often relies on accurate prostate zonal segmentation. However, state-of-the-art automatic segmentation methods often fail to produce well-contained volumetric segmentations of the prostate zones, since certain slices of prostate MRI, such as the base and apex slices, are harder to segment than others. This difficulty can be overcome by leveraging important multi-scale, image-based information from adjacent slices, but current methods do not fully learn and exploit such cross-slice information. In this paper, we propose a novel cross-slice attention mechanism, used within a Transformer module, to systematically learn cross-slice information at multiple scales. The module can be utilized in any existing deep-learning-based segmentation framework with skip connections. Experiments show that our cross-slice attention captures cross-slice information significant for prostate zonal segmentation and improves the performance of current state-of-the-art methods. Cross-slice attention improves segmentation accuracy in the peripheral zones, such that segmentation results are consistent across all prostate slices (apex, mid-gland, and base). The code for the proposed model is available at https://bit.ly/CAT-Net.
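The sketch below illustrates the general idea of attending across the slice axis of per-slice feature maps in PyTorch. It is a minimal stand-in, not the authors' CAT-Net implementation: the pooling-based slice tokens, the sigmoid gating, and all shapes and hyperparameters are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class CrossSliceAttention(nn.Module):
    """Minimal sketch of attention across adjacent MRI slices (illustrative only).

    Each slice is summarized by global average pooling, the pooled descriptors attend
    to one another along the slice axis, and the result re-weights the per-slice features.
    """
    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, slices, channels, H, W) feature maps from a 2D encoder applied slice-wise
        b, s, c, h, w = x.shape
        tokens = x.mean(dim=(-2, -1))              # (b, s, c): one descriptor per slice
        mixed, _ = self.attn(tokens, tokens, tokens)
        gate = torch.sigmoid(self.norm(mixed))     # (b, s, c): cross-slice gating weights
        return x * gate.view(b, s, c, 1, 1)        # re-weight each slice's channels

feats = torch.randn(2, 20, 64, 32, 32)             # 20 slices, 64 feature channels
out = CrossSliceAttention(64)(feats)
print(out.shape)                                   # torch.Size([2, 20, 64, 32, 32])
```

A module of this kind could be inserted along the skip connections of an encoder-decoder segmentation network, which is where the abstract says the proposed attention can be utilized.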
|
18
|
Wu C, Montagne S, Hamzaoui D, Ayache N, Delingette H, Renard-Penna R. Automatic segmentation of prostate zonal anatomy on MRI: a systematic review of the literature. Insights Imaging 2022; 13:202. [PMID: 36543901 PMCID: PMC9772373 DOI: 10.1186/s13244-022-01340-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/11/2022] [Accepted: 11/27/2022] [Indexed: 12/24/2022] Open
Abstract
OBJECTIVES Accurate zonal segmentation of prostate boundaries on MRI is a critical prerequisite for automated prostate cancer detection based on PI-RADS. Many articles have been published describing deep learning methods that offer great promise for fast and accurate segmentation of prostate zonal anatomy. The objective of this review was to provide a detailed analysis and comparison of the applicability and efficiency of published methods for automatic segmentation of prostate zonal anatomy by systematically reviewing the current literature. METHODS A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was conducted, covering the literature up to June 30, 2021, using the PubMed, ScienceDirect, Web of Science and EMBase databases. Risk of bias and applicability were assessed using the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) criteria adjusted with the Checklist for Artificial Intelligence in Medical Imaging (CLAIM). RESULTS A total of 458 articles were identified, and 33 were included and reviewed. Only 2 articles had a low risk of bias across all four QUADAS-2 domains. In the remaining articles, insufficient detail about database constitution and segmentation protocol (inclusion criteria, MRI acquisition, ground truth) introduced sources of bias. Eighteen different types of terminology for prostate zone segmentation were found, whereas only 4 anatomic zones are described on MRI. Only 2 authors used a blinded reading, and 4 assessed inter-observer variability. CONCLUSIONS Our review identified numerous methodological flaws and underlined biases that precluded a quantitative analysis for this review. This implies low robustness and low applicability in clinical practice of the evaluated methods. At present, there is no consensus on quality criteria for database constitution and zonal segmentation methodology.
Affiliation(s)
- Carine Wu
- Sorbonne Université, Paris, France.
- Academic Department of Radiology, Hôpital Tenon, Assistance Publique des Hôpitaux de Paris, 4 Rue de La Chine, 75020, Paris, France.
| | - Sarah Montagne
- Sorbonne Université, Paris, France
- Academic Department of Radiology, Hôpital Tenon, Assistance Publique des Hôpitaux de Paris, 4 Rue de La Chine, 75020, Paris, France
- Academic Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique des Hôpitaux de Paris, Paris, France
- GRC N° 5, Oncotype-Uro, Sorbonne Université, Paris, France
| | - Dimitri Hamzaoui
- Inria, Epione Team, Sophia Antipolis, Université Côte d'Azur, Nice, France
| | - Nicholas Ayache
- Inria, Epione Team, Sophia Antipolis, Université Côte d'Azur, Nice, France
| | - Hervé Delingette
- Inria, Epione Team, Sophia Antipolis, Université Côte d'Azur, Nice, France
| | - Raphaële Renard-Penna
- Sorbonne Université, Paris, France
- Academic Department of Radiology, Hôpital Tenon, Assistance Publique des Hôpitaux de Paris, 4 Rue de La Chine, 75020, Paris, France
- Academic Department of Radiology, Hôpital Pitié-Salpétrière, Assistance Publique des Hôpitaux de Paris, Paris, France
- GRC N° 5, Oncotype-Uro, Sorbonne Université, Paris, France
| |
|
19
|
Savjani RR, Lauria M, Bose S, Deng J, Yuan Y, Andrearczyk V. Automated Tumor Segmentation in Radiotherapy. Semin Radiat Oncol 2022; 32:319-329. [DOI: 10.1016/j.semradonc.2022.06.002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
|
20
|
Adams LC, Makowski MR, Engel G, Rattunde M, Busch F, Asbach P, Niehues SM, Vinayahalingam S, van Ginneken B, Litjens G, Bressem KK. Prostate158 - An expert-annotated 3T MRI dataset and algorithm for prostate cancer detection. Comput Biol Med 2022; 148:105817. [PMID: 35841780 DOI: 10.1016/j.compbiomed.2022.105817] [Citation(s) in RCA: 16] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/03/2022] [Revised: 06/12/2022] [Accepted: 07/03/2022] [Indexed: 11/03/2022]
Abstract
BACKGROUND The development of deep learning (DL) models for prostate segmentation on magnetic resonance imaging (MRI) depends on expert-annotated data and reliable baselines, which are often not publicly available. This limits both reproducibility and comparability. METHODS Prostate158 consists of 158 expert-annotated biparametric 3T prostate MRIs comprising T2w sequences and diffusion-weighted sequences with apparent diffusion coefficient maps. Two U-ResNets, trained for segmentation of anatomy (central gland, peripheral zone) and of lesions suspicious for prostate cancer (PCa) with a PI-RADS score of ≥4, served as baseline algorithms. Segmentation performance was evaluated using the Dice similarity coefficient (DSC), the Hausdorff distance (HD), and the average surface distance (ASD). The Wilcoxon test with Bonferroni correction was used to evaluate differences in performance. The generalizability of the baseline model was assessed using the open datasets Medical Segmentation Decathlon and PROSTATEx. RESULTS Compared with Reader 1, the models achieved a DSC/HD/ASD of 0.88/18.3/2.2 for the central gland, 0.75/22.8/1.9 for the peripheral zone, and 0.45/36.7/17.4 for PCa. Compared with Reader 2, the DSC/HD/ASD were 0.88/17.5/2.6 for the central gland, 0.73/33.2/1.9 for the peripheral zone, and 0.4/39.5/19.1 for PCa. Interrater agreement measured in DSC/HD/ASD was 0.87/11.1/1.0 for the central gland, 0.75/15.8/0.74 for the peripheral zone, and 0.6/18.8/5.5 for PCa. Segmentation performance on the Medical Segmentation Decathlon and PROSTATEx was 0.82/22.5/3.4 and 0.86/18.6/2.5 for the central gland, and 0.64/29.2/4.7 and 0.71/26.3/2.2 for the peripheral zone, respectively. CONCLUSIONS We provide an openly accessible, expert-annotated 3T dataset of prostate MRI and a reproducible benchmark to foster the development of prostate segmentation algorithms.
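For readers reproducing benchmarks of this kind, the sketch below shows one common way to obtain the Hausdorff distance and average surface distance from binary masks using SciPy distance transforms (the Dice coefficient follows the formula shown in the earlier sketch). The voxel spacing and toy masks are illustrative assumptions; the dataset's official evaluation code may differ in details such as percentile-based Hausdorff variants.

```python
import numpy as np
from scipy import ndimage

def surface_points(mask: np.ndarray) -> np.ndarray:
    """Boolean surface of a binary mask (voxels removed by one erosion step)."""
    return mask & ~ndimage.binary_erosion(mask)

def surface_distances(a: np.ndarray, b: np.ndarray, spacing) -> np.ndarray:
    """Distances from every surface voxel of `a` to the nearest surface voxel of `b`."""
    dist_to_b = ndimage.distance_transform_edt(~surface_points(b), sampling=spacing)
    return dist_to_b[surface_points(a)]

def hd_and_asd(pred: np.ndarray, truth: np.ndarray, spacing=(3.0, 0.5, 0.5)):
    d_pt = surface_distances(pred, truth, spacing)
    d_tp = surface_distances(truth, pred, spacing)
    hausdorff = max(d_pt.max(), d_tp.max())          # symmetric Hausdorff distance
    asd = np.concatenate([d_pt, d_tp]).mean()        # average symmetric surface distance
    return hausdorff, asd

# Toy binary masks standing in for a predicted and a reference segmentation
pred = np.zeros((16, 64, 64), bool); pred[4:12, 20:44, 20:44] = True
truth = np.zeros((16, 64, 64), bool); truth[5:13, 22:46, 22:46] = True
print(hd_and_asd(pred, truth))
```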
Affiliation(s)
- Lisa C Adams
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany; Berlin Institute of Health at Charité - Universitätsmedizin Berlin, Charitéplatz 1, 10117, Berlin, Germany.
| | - Marcus R Makowski
- Technical University of Munich, Department of Diagnostic and Interventional Radiology, Faculty of Medicine, Ismaninger Str. 22, 81675, Munich, Germany
| | - Günther Engel
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany; Institute for Diagnostic and Interventional Radiology, Georg-August University, Göttingen, Germany
| | - Maximilian Rattunde
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany
| | - Felix Busch
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany
| | - Patrick Asbach
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany
| | - Stefan M Niehues
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany
| | - Shankeeth Vinayahalingam
- Department of Oral and Maxillofacial Surgery, Radboud University Medical Center, Nijmegen, GA, the Netherlands
| | | | - Geert Litjens
- Radboud University Medical Center, Nijmegen, GA, the Netherlands
| | - Keno K Bressem
- Charité - Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin and Humboldt Universität zu Berlin, Institute for Radiology, Luisenstraße 7, 10117, Hindenburgdamm 30, 12203, Berlin, Germany; Berlin Institute of Health at Charité - Universitätsmedizin Berlin, Charitéplatz 1, 10117, Berlin, Germany
| |
|
21
|
Mata C, Walker P, Oliver A, Martí J, Lalande A. Usefulness of Collaborative Work in the Evaluation of Prostate Cancer from MRI. Clin Pract 2022; 12:350-362. [PMID: 35645317 PMCID: PMC9149964 DOI: 10.3390/clinpract12030040] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/16/2022] [Revised: 05/06/2022] [Accepted: 05/09/2022] [Indexed: 11/16/2022] Open
Abstract
The aim of this study was to show the usefulness of collaborative work in the evaluation of prostate cancer from T2-weighted MRI using a dedicated software tool. The variability of annotations of the prostate gland (central and peripheral zones as well as tumour) by two independent experts was first evaluated and then compared with a consensus between these two experts. Using a prostate MRI database, the experts drew regions of interest (ROIs) corresponding to healthy prostate (peripheral and central zones) and cancer. One of the experts then drew the ROIs with knowledge of the other expert's ROIs. The surface area of each ROI was used to measure the Hausdorff distance, and the Dice coefficient was measured from the respective contours. Both metrics were evaluated across the different experiments, taking the annotations of the second expert as the reference. The results showed that the significant differences between the two experts disappeared with collaborative work. To conclude, this study shows that collaborative work with a dedicated tool allows consensus between experts in the evaluation of prostate cancer from T2-weighted MRI.
Affiliation(s)
- Christian Mata
- Pediatric Computational Imaging Research Group, Hospital Sant Joan de Déu, 08950 Esplugues de Llobregat, Spain
- Research Centre for Biomedical Engineering (CREB), Barcelona East School of Engineering, Universitat Politècnica de Catalunya, 08019 Barcelona, Spain
- Correspondence:
| | - Paul Walker
- ImViA Laboratory, Université de Bourgogne Franche-Comté, 64 Rue de Sully, 21000 Dijon, France; (P.W.); (A.L.)
| | - Arnau Oliver
- Institute of Computer Vision and Robotics, University of Girona, Campus Montilivi, Ed. P-IV, 17003 Girona, Spain; (A.O.); (J.M.)
| | - Joan Martí
- Institute of Computer Vision and Robotics, University of Girona, Campus Montilivi, Ed. P-IV, 17003 Girona, Spain; (A.O.); (J.M.)
| | - Alain Lalande
- ImViA Laboratory, Université de Bourgogne Franche-Comté, 64 Rue de Sully, 21000 Dijon, France; (P.W.); (A.L.)
| |
|
22
|
Gunashekar DD, Bielak L, Hägele L, Oerther B, Benndorf M, Grosu AL, Brox T, Zamboglou C, Bock M. Explainable AI for CNN-based prostate tumor segmentation in multi-parametric MRI correlated to whole mount histopathology. Radiat Oncol 2022; 17:65. [PMID: 35366918 PMCID: PMC8976981 DOI: 10.1186/s13014-022-02035-0] [Citation(s) in RCA: 20] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/03/2022] [Accepted: 03/15/2022] [Indexed: 12/15/2022] Open
Abstract
Automatic prostate tumor segmentation is often unable to identify the lesion even when multi-parametric MRI data are used as input, and the segmentation output is difficult to verify due to the lack of clinically established ground truth images. In this work we use an explainable deep learning model to interpret the predictions of a convolutional neural network (CNN) for prostate tumor segmentation. The CNN uses a U-Net architecture and was trained on multi-parametric MRI data from 122 patients to automatically segment the prostate gland and prostate tumor lesions. In addition, co-registered ground truth data from whole-mount histopathology images were available for 15 patients, which served as the test set during CNN testing. To interpret the segmentation results of the CNN, heat maps were generated using the Gradient-weighted Class Activation Mapping (Grad-CAM) method. With the radiologist-drawn ground truth, the CNN achieved a mean Dice-Sørensen coefficient of 0.62 for the prostate gland and 0.31 for the tumor lesions; with the whole-mount histology ground truth, the coefficient for tumor lesions was 0.32. Dice-Sørensen coefficients between CNN predictions and manual segmentations from MRI and histology data were not significantly different. Within the prostate, the Grad-CAM heat maps could differentiate between tumor and healthy prostate tissue, which indicates that the image information in the tumor was essential for the CNN segmentation.
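The following PyTorch sketch illustrates the Grad-CAM mechanism for a segmentation CNN: gradients of the tumor output channel are average-pooled into channel weights and combined with the hooked layer's activations to form a heat map. The tiny stand-in network, the choice of output channel 1 as the tumor class, and the hooked layer are assumptions for illustration; this is not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradCAM:
    """Minimal Grad-CAM sketch for a segmentation CNN (illustrative only)."""
    def __init__(self, model: nn.Module, target_layer: nn.Module):
        self.model, self.acts, self.grads = model, None, None
        target_layer.register_forward_hook(lambda m, i, o: setattr(self, "acts", o))
        target_layer.register_full_backward_hook(lambda m, gi, go: setattr(self, "grads", go[0]))

    def __call__(self, x: torch.Tensor) -> torch.Tensor:
        self.model.zero_grad()
        out = self.model(x)                      # (B, classes, H, W) segmentation logits
        out[:, 1].sum().backward()               # explain the assumed "tumor" channel
        weights = self.grads.mean(dim=(-2, -1), keepdim=True)   # pooled gradients per channel
        cam = F.relu((weights * self.acts).sum(dim=1, keepdim=True))
        return F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)

# Tiny stand-in network with two output channels (background, tumor)
net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.Conv2d(8, 2, 1))
cam = GradCAM(net, net[0])(torch.randn(1, 3, 128, 128))
print(cam.shape)  # torch.Size([1, 1, 128, 128]): heat map over the input
```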
Affiliation(s)
- Deepa Darshini Gunashekar
- Department of Radiology, Medical Physics, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany.
| | - Lars Bielak
- Department of Radiology, Medical Physics, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
| | - Leonard Hägele
- Department of Radiology, Medical Physics, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Benedict Oerther
- Department of Radiology, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Matthias Benndorf
- Department of Radiology, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Anca-L Grosu
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
- Department of Radiology, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Thomas Brox
- Department of Computer Science, University of Freiburg, Freiburg, Germany
| | - Constantinos Zamboglou
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
- Department of Radiology, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
| | - Michael Bock
- Department of Radiology, Medical Physics, Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Freiburg, Germany
- German Cancer Consortium (DKTK), Partner Site Freiburg, Freiburg, Germany
| |
|
23
|
Hamzaoui D, Montagne S, Renard-Penna R, Ayache N, Delingette H. Automatic zonal segmentation of the prostate from 2D and 3D T2-weighted MRI and evaluation for clinical use. J Med Imaging (Bellingham) 2022; 9:024001. [PMID: 35300345 PMCID: PMC8920492 DOI: 10.1117/1.jmi.9.2.024001] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2021] [Accepted: 02/23/2022] [Indexed: 11/14/2022] Open
Abstract
Purpose: An accurate zonal segmentation of the prostate is required for prostate cancer (PCa) management with MRI. Approach: The aim of this work is to present UFNet, a deep learning-based method for automatic zonal segmentation of the prostate from T2-weighted (T2w) MRI. It takes into account the image anisotropy, includes both spatial and channelwise attention mechanisms, and uses loss functions to enforce the prostate partition. The method was applied to a private multicentric three-dimensional T2w MRI dataset and to the public two-dimensional T2w MRI dataset ProstateX. To assess model performance, the structures segmented by the algorithm on the private dataset were compared with those obtained by seven radiologists of various experience levels. Results: On the private dataset, we obtained a Dice score (DSC) of 93.90 ± 2.85 for the whole gland (WG), 91.00 ± 4.34 for the transition zone (TZ), and 79.08 ± 7.08 for the peripheral zone (PZ). Results were significantly better than those of the other networks compared (p < 0.05). On ProstateX, we obtained a DSC of 90.90 ± 2.94 for WG, 86.84 ± 4.33 for TZ, and 78.40 ± 7.31 for PZ. These results are similar to state-of-the-art results and, on the private dataset, are consistent with those obtained by radiologists. Zonal locations and sectorial positions of lesions annotated by radiologists were also preserved. Conclusions: Deep learning-based methods can provide an accurate zonal segmentation of the prostate, leading to a consistent zonal location and sectorial position of lesions, and can therefore be used as a helping tool for PCa diagnosis.
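As a hedged illustration of the "loss functions to enforce prostate partition" idea mentioned above, the sketch below defines an auxiliary penalty that pushes the predicted transition-zone and peripheral-zone probabilities to tile the whole gland and vanish outside it. The channel ordering and the L1 form of the penalty are assumptions, not the UFNet definition.

```python
import torch
import torch.nn.functional as F

def partition_penalty(logits: torch.Tensor, wg_mask: torch.Tensor) -> torch.Tensor:
    """Illustrative auxiliary loss encouraging TZ + PZ to partition the whole gland.

    Assumed channel ordering: 0 = background, 1 = TZ, 2 = PZ.
    `wg_mask` is a binary whole-gland mask of shape (B, H, W).
    """
    probs = F.softmax(logits, dim=1)
    inside = probs[:, 1] + probs[:, 2]      # probability of belonging to either zone
    # Inside the gland the two zones should cover everything; outside they should vanish.
    return F.l1_loss(inside, wg_mask.float())

logits = torch.randn(2, 3, 96, 96, requires_grad=True)
wg = torch.rand(2, 96, 96) > 0.5            # toy whole-gland mask
loss = partition_penalty(logits, wg)
loss.backward()
print(float(loss))
```

In practice such a term would be added, with a small weight, to the main segmentation loss (e.g. cross-entropy or Dice loss).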
Affiliation(s)
- Dimitri Hamzaoui
- Université Côte d'Azur, Inria, Epione Project-Team, Sophia Antipolis, Valbonne, France
| | - Sarah Montagne
- Sorbonne Université, Radiology Department, CHU La Pitié Salpétrière/Tenon, Paris, France
| | - Raphaële Renard-Penna
- Sorbonne Université, Radiology Department, CHU La Pitié Salpétrière/Tenon, Paris, France
| | - Nicholas Ayache
- Université Côte d'Azur, Inria, Epione Project-Team, Sophia Antipolis, Valbonne, France
| | - Hervé Delingette
- Université Côte d'Azur, Inria, Epione Project-Team, Sophia Antipolis, Valbonne, France
| |
|
24
|
Turkbey B, Haider MA. Deep learning-based artificial intelligence applications in prostate MRI: brief summary. Br J Radiol 2022; 95:20210563. [PMID: 34860562 PMCID: PMC8978238 DOI: 10.1259/bjr.20210563] [Citation(s) in RCA: 24] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022] Open
Abstract
Prostate cancer (PCa) is the most common cancer type in males in the Western world. MRI has an established role in the diagnosis of PCa through guiding biopsies. Due to the complex, multistep nature of the MRI-guided PCa diagnosis pathway, diagnostic performance varies widely. Developing artificial intelligence (AI) models using machine learning, particularly deep learning, has an expanding role in radiology. Specifically, for prostate MRI, several AI approaches have been described in the literature for prostate segmentation, lesion detection, and classification, with the aim of improving diagnostic performance and interobserver agreement. In this review article, we summarize radiology applications of AI in prostate MRI.
Affiliation(s)
- Baris Turkbey
- Molecular Imaging Branch, NCI, NIH, Bethesda, MD, USA
| | | |
|
25
|
Li H, Lee CH, Chia D, Lin Z, Huang W, Tan CH. Machine Learning in Prostate MRI for Prostate Cancer: Current Status and Future Opportunities. Diagnostics (Basel) 2022; 12:diagnostics12020289. [PMID: 35204380 PMCID: PMC8870978 DOI: 10.3390/diagnostics12020289] [Citation(s) in RCA: 25] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/02/2021] [Revised: 12/31/2021] [Accepted: 01/14/2022] [Indexed: 02/04/2023] Open
Abstract
Advances in our understanding of the role of magnetic resonance imaging (MRI) in the detection of prostate cancer have enabled its integration into clinical routines over the past two decades. The Prostate Imaging Reporting and Data System (PI-RADS) is an established imaging-based scoring system that scores the probability of clinically significant prostate cancer on MRI to guide management. Image fusion technology allows one to combine the superior soft-tissue contrast resolution of MRI with real-time anatomical depiction using ultrasound or computed tomography, enabling accurate mapping of prostate cancer for targeted biopsy and treatment. Machine learning provides vast opportunities for automated organ and lesion depiction that could increase the reproducibility of PI-RADS categorisation and improve co-registration across imaging modalities, enhancing diagnostic and treatment methods that can then be individualised based on clinical risk of malignancy. In this article, we provide a comprehensive and contemporary review of advancements and share insights into new opportunities in this field.
Affiliation(s)
- Huanye Li
- School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 639798, Singapore; (H.L.); (Z.L.)
| | - Chau Hung Lee
- Department of Diagnostic Radiology, Tan Tock Seng Hospital, Singapore 308433, Singapore;
| | - David Chia
- Department of Radiation Oncology, National University Cancer Institute (NUH), Singapore 119074, Singapore;
| | - Zhiping Lin
- School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 639798, Singapore; (H.L.); (Z.L.)
| | - Weimin Huang
- Institute for Infocomm Research, A*Star, Singapore 138632, Singapore;
| | - Cher Heng Tan
- Department of Diagnostic Radiology, Tan Tock Seng Hospital, Singapore 308433, Singapore;
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore 639798, Singapore
- Correspondence:
| |
|
26
|
Rouvière O, Moldovan PC, Vlachomitrou A, Gouttard S, Riche B, Groth A, Rabotnikov M, Ruffion A, Colombel M, Crouzet S, Weese J, Rabilloud M. Combined model-based and deep learning-based automated 3D zonal segmentation of the prostate on T2-weighted MR images: clinical evaluation. Eur Radiol 2022; 32:3248-3259. [PMID: 35001157 DOI: 10.1007/s00330-021-08408-5] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2021] [Revised: 09/28/2021] [Accepted: 10/09/2021] [Indexed: 11/04/2022]
Abstract
OBJECTIVE To train and test, for prostate zonal segmentation, an existing algorithm already trained for whole-gland segmentation. METHODS The algorithm, which combines model-based and deep learning-based approaches, was trained for zonal segmentation using the NCI-ISBI-2013 dataset and 70 T2-weighted datasets acquired at an academic centre. Test datasets were randomly selected from examinations performed at this centre on one of two scanners (General Electric, 1.5 T; Philips, 3 T) not used for training. Automated segmentations were corrected by two independent radiologists. When segmentation was initiated outside the prostate, images were cropped and segmentation was repeated. Factors influencing the algorithm's mean Dice similarity coefficient (DSC) and its precision were assessed using beta regression. RESULTS Eighty-two test datasets were selected; one was excluded. In 13/81 datasets, segmentation started outside the prostate, but zonal segmentation was possible after image cropping. Depending on the radiologist chosen as reference, the algorithm's median DSCs were 96.4/97.4%, 91.8/93.0% and 79.9/89.6% for whole-gland, central gland and anterior fibromuscular stroma (AFMS) segmentations, respectively. DSCs comparing the radiologists' delineations were 95.8%, 93.6% and 81.7%, respectively. For all segmentation tasks, the scanner used for imaging significantly influenced the mean DSC and its precision, and the mean DSC was significantly lower in cases where the initial segmentation fell outside the prostate. For central gland segmentation, the mean DSC was also significantly lower in larger prostates. The radiologist chosen as reference had no significant impact, except for AFMS segmentation. CONCLUSIONS The algorithm's performance fell within the range of inter-reader variability but remained significantly affected by the scanner used for imaging. KEY POINTS • The median Dice similarity coefficients obtained by the algorithm fell within human inter-reader variability for the three segmentation tasks (whole gland, central gland, anterior fibromuscular stroma). • The scanner used for imaging significantly impacted the performance of the automated segmentation for the three segmentation tasks. • The performance of the automated segmentation of the anterior fibromuscular stroma was highly variable across patients and also showed high variability between the two radiologists.
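To show how a bounded score such as the DSC can be related to factors like the scanner or an initially mis-localized segmentation, the sketch below fits a beta regression with statsmodels. The BetaModel class (assumed available in statsmodels ≥ 0.13), the covariates, and the synthetic DSC values are illustrative and do not reproduce the study's data or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.othermod.betareg import BetaModel  # statsmodels >= 0.13 assumed

rng = np.random.default_rng(0)
n = 80
df = pd.DataFrame({
    "scanner_3T": rng.integers(0, 2, n),      # 1 = 3 T scanner, 0 = 1.5 T scanner
    "crop_needed": rng.integers(0, 2, n),     # 1 = segmentation initially outside the prostate
    "volume_cc": rng.normal(50, 15, n),       # prostate volume in cc
})
# Synthetic DSC values in (0, 1); real values would come from the segmentation experiment.
dsc = np.clip(rng.normal(0.93 - 0.02 * df["crop_needed"], 0.03), 0.01, 0.99)

X = sm.add_constant(df[["scanner_3T", "crop_needed", "volume_cc"]])
fit = BetaModel(dsc, X).fit()
print(fit.summary())   # coefficients on the mean DSC, in the spirit of the beta regression analysis
```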
Affiliation(s)
- Olivier Rouvière
- Department of Urinary and Vascular Imaging, Hôpital Edouard Herriot, Hospices Civils de Lyon, Pavillon B, 5 place d'Arsonval, F-69437, Lyon, France. .,Université de Lyon, F-69003, Lyon, France. .,Faculté de Médecine Lyon Est, Université Lyon 1, F-69003, Lyon, France. .,INSERM, LabTau, U1032, Lyon, France.
| | - Paul Cezar Moldovan
- Department of Urinary and Vascular Imaging, Hôpital Edouard Herriot, Hospices Civils de Lyon, Pavillon B, 5 place d'Arsonval, F-69437, Lyon, France
| | - Anna Vlachomitrou
- Philips France, 33 rue de Verdun, CS 60 055, 92156, Suresnes Cedex, France
| | - Sylvain Gouttard
- Department of Urinary and Vascular Imaging, Hôpital Edouard Herriot, Hospices Civils de Lyon, Pavillon B, 5 place d'Arsonval, F-69437, Lyon, France
| | - Benjamin Riche
- Service de Biostatistique Et Bioinformatique, Pôle Santé Publique, Hospices Civils de Lyon, F-69003, Lyon, France.,Laboratoire de Biométrie Et Biologie Évolutive, Équipe Biostatistique-Santé, UMR 5558, CNRS, F-69100, Villeurbanne, France
| | - Alexandra Groth
- Philips Research, Röntgenstrasse 24-26, 22335, Hamburg, Germany
| | | | - Alain Ruffion
- Department of Urology, Centre Hospitalier Lyon Sud, Hospices Civils de Lyon, F-69310, Pierre-Bénite, France
| | - Marc Colombel
- Université de Lyon, F-69003, Lyon, France.,Faculté de Médecine Lyon Est, Université Lyon 1, F-69003, Lyon, France.,Department of Urology, Hôpital Edouard Herriot, Hospices Civils de Lyon, F-69437, Lyon, France
| | - Sébastien Crouzet
- Department of Urology, Hôpital Edouard Herriot, Hospices Civils de Lyon, F-69437, Lyon, France
| | - Juergen Weese
- Philips Research, Röntgenstrasse 24-26, 22335, Hamburg, Germany
| | - Muriel Rabilloud
- Université de Lyon, F-69003, Lyon, France.,Faculté de Médecine Lyon Est, Université Lyon 1, F-69003, Lyon, France.,Service de Biostatistique Et Bioinformatique, Pôle Santé Publique, Hospices Civils de Lyon, F-69003, Lyon, France.,Laboratoire de Biométrie Et Biologie Évolutive, Équipe Biostatistique-Santé, UMR 5558, CNRS, F-69100, Villeurbanne, France
| |
|
27
|
Skin Lesion Classification Based on Surface Fractal Dimensions and Statistical Color Cluster Features Using an Ensemble of Machine Learning Techniques. Cancers (Basel) 2021; 13:cancers13215256. [PMID: 34771421 PMCID: PMC8582408 DOI: 10.3390/cancers13215256] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/03/2021] [Revised: 10/18/2021] [Accepted: 10/18/2021] [Indexed: 01/23/2023] Open
Abstract
Simple Summary: This study aimed to investigate the efficacy of novel skin-surface fractal dimension features as an auxiliary diagnostic method for melanoma recognition. We therefore examined the skin lesion classification accuracy of the kNN-CV algorithm and of the proposed radial basis function neural network model. We found increased classification accuracy when fractal analysis was added to the classical color distribution analysis. Our results indicate that a reliable classifier creates more opportunities to detect cancerous skin lesions in a timely manner. Abstract: (1) Background: An approach for skin cancer recognition and classification, based on a novel combination of features and two classifiers, is proposed as an auxiliary diagnostic method. (2) Methods: Predictions are made by a k-nearest neighbor algorithm with 5-fold cross-validation (kNN-CV) and by a neural network model, to assist dermatologists in the diagnosis of cancerous skin lesions. As its main contribution, this work proposes a descriptor that combines the skin-surface fractal dimension with relevant color-area features for skin lesion classification. The surface fractal dimension is computed using a 2D generalization of Higuchi's method. A clustering method selects the relevant color distribution in skin lesion images by determining the average percentage of color areas within the nevus and melanoma lesion areas. In the classification stage, the Higuchi fractal dimensions (HFDs) and the color features are classified separately using the kNN-CV algorithm. In addition, these features serve as prototypes for a radial basis function neural network (RBFNN) classifier. The efficiency of our algorithms was verified on images from the 7-Point, Med-Node, and PH2 databases. (3) Results: Experimental results show that the accuracy of the proposed RBFNN model in skin cancer classification is 95.42% for 7-Point, 94.71% for Med-Node, and 94.88% for PH2, all significantly better than that of the kNN algorithm. (4) Conclusions: 2D Higuchi surface fractal features have not previously been used for skin lesion classification. We combined fractal features with correlated color features to create an RBFNN classifier that provides high classification accuracy.
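The scikit-learn sketch below mirrors the kNN with 5-fold cross-validation step on a hypothetical feature matrix that combines one surface fractal dimension with color-area percentages. The synthetic feature distributions, class sizes, and the added feature scaling are assumptions for illustration only.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical descriptors: one Higuchi surface fractal dimension + five color-area percentages
X_nevi = np.column_stack([rng.normal(2.2, 0.1, 200), rng.dirichlet(np.ones(5), 200) * 100])
X_mel  = np.column_stack([rng.normal(2.5, 0.1, 200), rng.dirichlet(np.ones(5) * 0.5, 200) * 100])
X = np.vstack([X_nevi, X_mel])
y = np.r_[np.zeros(200), np.ones(200)]          # 0 = nevus, 1 = melanoma

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, X, y, cv=5)        # 5-fold cross-validation, as in the paper
print(scores.mean())
```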
|
28
|
Nandalur KR, Colvin R, Walker D, Nandalur SR, Seifman B, Gangwish D, Hafron J. Benign prostate hyperplasia as a potential protective factor against prostate cancer: Insights from a magnetic resonance imaging study of compositional characteristics. Prostate 2021; 81:1097-1104. [PMID: 34375453 DOI: 10.1002/pros.24207] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/29/2021] [Revised: 07/27/2021] [Accepted: 07/30/2021] [Indexed: 11/07/2022]
Abstract
PURPOSE The structural relationship between benign prostate hyperplasia (BPH) and prostate cancer (PCa) is controversial. The purpose of our study was to examine the association between quantitative prostate compositional metrics on magnetic resonance imaging (MRI) and PCa. METHODS We identified 405 patients who underwent prostate MRI and biopsy and/or prostatectomy from January 2019 to January 2021 at our institution. Segmentation-based volumetric methods were used to assess central gland (CG) and peripheral zone (PZ) volume. PZ mean thickness and mean apparent diffusion coefficient (ADC), a marker of underlying histologic components, were measured. Multivariable logistic regression was performed with outcomes of ≥Grade Group (GG) 2 PCa and of multifocal disease. RESULTS On multivariable analysis, higher CG volumes were associated with lower odds of ≥GG2 disease (n = 227) (OR: 0.97, 95% CI 0.96-0.98, p < 0.0001), accounting for PZ volume (p = 0.18) and thickness (p = 0.70). For every one cc increase in CG volume, there was an approximately 3% decrease in the odds of ≥GG2 disease. Similar findings were noted for multifocal disease (n = 180) (OR: 0.97, 95% CI 0.96-0.98, p < 0.0001). Notably, the ADC of the normal PZ was neither significantly associated with CG volume (p = 0.21) nor a predictor of disease (p = 0.49). CONCLUSIONS Increasing central gland volume, driven by BPH, is associated with lower odds of significant PCa, including multifocal disease, whereas PZ anatomic and histologic surrogate changes were noncontributory. These findings support the impediment of global tumor growth by BPH predicted by a theoretical mechanobiological model. This potential stabilizing factor should be further studied for risk stratification and when considering BPH therapy.
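The reported multivariable logistic regression can be outlined with statsmodels, where the exponentiated coefficient for central gland volume gives the odds ratio per cc. The synthetic cohort below is an assumption built to show a protective direction of effect; it does not reproduce the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "cg_volume_cc": rng.normal(45, 20, n).clip(10, 150),
    "pz_volume_cc": rng.normal(22, 8, n).clip(5, 60),
    "pz_thickness_mm": rng.normal(12, 3, n),
})
# Synthetic outcome built so that larger central glands lower the odds of >=GG2 disease
logit_p = 0.8 - 0.03 * df["cg_volume_cc"] + 0.01 * df["pz_volume_cc"]
df["gg2_or_higher"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["cg_volume_cc", "pz_volume_cc", "pz_thickness_mm"]])
fit = sm.Logit(df["gg2_or_higher"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)
print(odds_ratios["cg_volume_cc"])   # multiplicative change in odds per 1 cc of central gland volume
```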
Affiliation(s)
- Kiran R Nandalur
- Department of Radiology and Molecular Imaging, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| | - Robert Colvin
- Department of Radiology and Molecular Imaging, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| | - David Walker
- Department of Radiology and Molecular Imaging, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| | - Sirisha R Nandalur
- Department of Radiation Oncology, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| | - Brian Seifman
- Department of Urology, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| | - David Gangwish
- Department of Urology, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| | - Jason Hafron
- Department of Urology, Oakland University William Beaumont Hospital School of Medicine, Royal Oak, Michigan, USA
| |
|
29
|
Muglia VF, Westphalen AC. Editorial on "Convolutional Neural Networks for Automated Classification of Prostate Multiparametric Magnetic Resonance Imaging Based on Image Quality". J Magn Reson Imaging 2021; 55:491-492. [PMID: 34477274 DOI: 10.1002/jmri.27913] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/23/2021] [Accepted: 08/24/2021] [Indexed: 11/09/2022] Open
Affiliation(s)
- Valdair F Muglia
- Department of Medical Imaging, Clinical Oncology and Hematology-Ribeirao Preto School of Medicine, University of Sao Paulo, Sao Paulo, Brazil
| | - Antonio Carlos Westphalen
- Departments of Radiology, Urology, and Radiation Oncology, University of Washington, Seattle, Washington, USA
| |
|