1. Du Q, Wang L, Chen H. A mixed Mamba U-net for prostate segmentation in MR images. Sci Rep 2024;14:19976. PMID: 39198553; PMCID: PMC11358272; DOI: 10.1038/s41598-024-71045-7.
Abstract
The diagnosis of early prostate cancer depends on accurate segmentation of prostate regions in magnetic resonance imaging (MRI). This segmentation task is challenging, however, owing to the characteristics of prostate MR images and the limitations of existing methods. To address these issues, we propose MM-UNet, a U-shaped encoder-decoder network based on Mamba and CNNs for prostate segmentation in MR images. Specifically, we first propose an adaptive feature fusion module guided by channel attention to fuse adjacent hierarchical features effectively and suppress interference from background noise. Second, we propose a Mamba-based global context-aware module, which offers strong long-range modeling capability at linear complexity, to capture global context in images. Finally, we propose a multi-scale anisotropic convolution module built on parallel multi-scale anisotropic convolution blocks and 3D convolution decomposition. Experimental results on two public prostate MR image segmentation datasets demonstrate that the proposed method outperforms competing models and achieves state-of-the-art segmentation performance. In future work, we intend to enhance the model's robustness and extend it to additional medical image segmentation tasks.
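Segmentation performance in studies such as this one is conventionally scored with the Dice similarity coefficient. The following minimal sketch (our illustration, not the paper's code) computes it on flattened binary masks; the toy masks are placeholders:

```python
def dice_coefficient(pred, target):
    """Dice similarity coefficient between two binary masks (flat lists of 0/1)."""
    intersection = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    # Convention: two empty masks are a perfect match.
    return 2.0 * intersection / total if total else 1.0

# Toy 1-D masks standing in for flattened segmentation maps.
pred   = [0, 1, 1, 1, 0, 0]
target = [0, 1, 1, 0, 0, 0]
print(dice_coefficient(pred, target))  # 2*2 / (3+2) = 0.8
```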
Affiliation(s)
- Qiu Du
- Department of Urology, Hunan Provincial People's Hospital, The First Affiliated Hospital of Hunan Normal University, Changsha, 410005, People's Republic of China
- Luowu Wang
- Department of Urology, Hunan Provincial People's Hospital, The First Affiliated Hospital of Hunan Normal University, Changsha, 410005, People's Republic of China
- Hao Chen
- Department of Urology, Hunan Provincial People's Hospital, The First Affiliated Hospital of Hunan Normal University, Changsha, 410005, People's Republic of China.
2. Ramacciotti LS, Hershenhouse JS, Mokhtar D, Paralkar D, Kaneko M, Eppler M, Gill K, Mogoulianitis V, Duddalwar V, Abreu AL, Gill I, Cacciamani GE. Comprehensive Assessment of MRI-based Artificial Intelligence Frameworks Performance in the Detection, Segmentation, and Classification of Prostate Lesions Using Open-Source Databases. Urol Clin North Am 2024;51:131-161. PMID: 37945098; DOI: 10.1016/j.ucl.2023.08.003.
Abstract
Numerous MRI-based artificial intelligence (AI) frameworks have been designed for prostate cancer lesion detection, segmentation, and classification in response to the intrareader and interreader variability inherent to traditional interpretation. Open-source data sets have been released with the intention of providing freely available MRIs for testing diverse AI frameworks on automated or semiautomated tasks. Here, an in-depth assessment of the performance of MRI-based AI frameworks for detecting, segmenting, and classifying prostate lesions using open-source databases was performed. Among 17 data sets, 12 were specific to prostate cancer detection/classification, with 52 studies meeting the inclusion criteria.
Affiliation(s)
- Lorenzo Storino Ramacciotti
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Jacob S Hershenhouse
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Daniel Mokhtar
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Divyangi Paralkar
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Masatomo Kaneko
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Department of Urology, Graduate School of Medical Science, Kyoto Prefectural University of Medicine, Kyoto, Japan
- Michael Eppler
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Karanvir Gill
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Vasileios Mogoulianitis
- Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, USA
- Vinay Duddalwar
- Department of Radiology, University of Southern California, Los Angeles, CA, USA
- Andre L Abreu
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Department of Radiology, University of Southern California, Los Angeles, CA, USA
- Inderbir Gill
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Giovanni E Cacciamani
- USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA; Center for Image-Guided and Focal Therapy for Prostate Cancer, Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA; Department of Radiology, University of Southern California, Los Angeles, CA, USA.
3. Chen X, Liu X, Wu Y, Wang Z, Wang SH. Research related to the diagnosis of prostate cancer based on machine learning medical images: A review. Int J Med Inform 2024;181:105279. PMID: 37977054; DOI: 10.1016/j.ijmedinf.2023.105279.
Abstract
BACKGROUND Prostate cancer is currently the second most prevalent cancer among men. Accurate diagnosis of prostate cancer can enable effective treatment and greatly reduce mortality. The main medical imaging tools for screening prostate cancer are MRI, CT, and ultrasound. Over the past 20 years, these imaging methods have advanced considerably alongside machine learning; in particular, the rise of deep learning has broadened the application of artificial intelligence to image-assisted diagnosis of prostate cancer. METHOD This review collected papers on the processing of prostate and prostate cancer MR, CT, and ultrasound images through search engines such as Web of Science, PubMed, and Google Scholar, covering image pre-processing methods, segmentation of the prostate gland on medical images, registration of the prostate gland across different imaging modalities, and detection of prostate cancer lesions. CONCLUSION The collated papers show that research on machine learning and deep learning for diagnosing and staging prostate cancer is still in its infancy. Most existing studies address diagnosis and lesion classification, with limited accuracy; the best results remain below 0.95. Studies on staging are fewer. Research is concentrated on MR images, with far less work on CT and ultrasound images. DISCUSSION Machine learning and deep learning combined with medical imaging hold broad promise for the diagnosis and staging of prostate cancer, but research in this area still has considerable room for development.
Affiliation(s)
- Xinyi Chen
- School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China.
- Xiang Liu
- School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China.
- Yuke Wu
- School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China.
- Zhenglei Wang
- Department of Medical Imaging, Shanghai Electric Power Hospital, Shanghai 201620, China.
- Shuo Hong Wang
- Department of Molecular and Cellular Biology and Center for Brain Science, Harvard University, Cambridge, MA 02138, USA.
4. Matsuoka Y, Ueno Y, Uehara S, Tanaka H, Kobayashi M, Tanaka H, Yoshida S, Yokoyama M, Kumazawa I, Fujii Y. Deep-learning prostate cancer detection and segmentation on biparametric versus multiparametric magnetic resonance imaging: Added value of dynamic contrast-enhanced imaging. Int J Urol 2023;30:1103-1111. PMID: 37605627; DOI: 10.1111/iju.15280.
Abstract
OBJECTIVES To develop diagnostic algorithms for multisequence prostate magnetic resonance imaging for cancer detection and segmentation using deep learning, and to explore the added value of dynamic contrast-enhanced imaging in multiparametric imaging compared with biparametric imaging. METHODS We collected 3227 multiparametric imaging sets from 332 patients, including 218 cancer patients (291 biopsy-proven foci) and 114 noncancer patients. Diagnostic algorithms for T2-weighted, T2-weighted plus dynamic contrast-enhanced, biparametric, and multiparametric imaging were built using 2578 sets, and their performance for clinically significant cancer was evaluated using 649 sets. RESULTS Biparametric and multiparametric imaging showed the following region-based performance: sensitivity of 71.9% versus 74.8% (p = 0.394) and positive predictive value of 61.3% versus 74.8% (p = 0.013), respectively. In side-specific analyses of cancer images, specificity was 72.6% versus 89.5% (p < 0.001) and negative predictive value was 78.9% versus 83.5% (p = 0.364), respectively. False-negative cancers on multiparametric imaging were smaller (p = 0.002) and more often grade group ≤2 (p = 0.028) than true-positive foci. In the peripheral zone, false-positive regions on biparametric imaging turned out to be true negative on multiparametric imaging more frequently than in the transition zone (78.3% vs. 47.2%, p = 0.018). In contrast, T2-weighted plus dynamic contrast-enhanced imaging had lower specificity than T2-weighted imaging alone (41.1% vs. 51.6%, p = 0.042). CONCLUSIONS With deep learning, multiparametric imaging provides superior specificity and positive predictive value to biparametric imaging, especially in the peripheral zone. Dynamic contrast-enhanced imaging helps reduce overdiagnosis in multiparametric imaging.
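The sensitivity, specificity, and predictive values reported in this abstract are standard confusion-matrix statistics. As a reminder of the definitions, here is a minimal helper (ours, with hypothetical counts, not data from the study):

```python
def confusion_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for illustration only.
m = confusion_metrics(tp=74, fp=25, tn=90, fn=25)
print({k: round(v, 3) for k, v in m.items()})
```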
Affiliation(s)
- Yoh Matsuoka
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
- Department of Urology, Saitama Cancer Center, Saitama, Japan
- Yoshihiko Ueno
- Department of Information and Communications Engineering, Tokyo Institute of Technology, Yokohama, Kanagawa, Japan
- Sho Uehara
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
- Hiroshi Tanaka
- Department of Radiology, Ochanomizu Surugadai Clinic, Tokyo, Japan
- Masaki Kobayashi
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
- Hajime Tanaka
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
- Soichiro Yoshida
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
- Minato Yokoyama
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
- Itsuo Kumazawa
- Laboratory for Future Interdisciplinary Research of Science and Technology, Institute of Innovative Research, Tokyo Institute of Technology, Yokohama, Kanagawa, Japan
- Yasuhisa Fujii
- Department of Urology, Tokyo Medical and Dental University, Tokyo, Japan
5. Yang E, Shankar K, Kumar S, Seo C, Moon I. Equilibrium Optimization Algorithm with Deep Learning Enabled Prostate Cancer Detection on MRI Images. Biomedicines 2023;11:3200. PMID: 38137421; PMCID: PMC10740673; DOI: 10.3390/biomedicines11123200.
Abstract
Prostate cancer (PrC) arises in the prostate gland of the male reproductive system. Survival improves considerably with earlier diagnosis, so timely intervention should be administered. In this study, a new automatic approach combining several deep learning (DL) techniques was introduced to detect PrC from MRI and ultrasound (US) images; the presented method also explains why a given decision was made for the input MRI or US image. Custom-developed layers were added to a pretrained model and applied to the dataset. The study presents an Equilibrium Optimization Algorithm with Deep Learning-based Prostate Cancer Detection and Classification (EOADL-PCDC) technique for MRIs. The main goal of the EOADL-PCDC method is the detection and classification of PrC. To achieve this, the technique first applies image preprocessing to improve image quality. It then uses the capsule network (CapsNet) model for feature extraction, with the EOA applied for hyperparameter tuning to increase the efficiency of CapsNet, and a stacked bidirectional long short-term memory (SBiLSTM) model for prostate cancer classification. A comprehensive set of simulations of the EOADL-PCDC algorithm was run on a benchmark MRI dataset, and the experimental outcomes revealed superior performance of the EOADL-PCDC approach over existing methods across different metrics.
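The equilibrium optimization algorithm (EOA) used here for hyperparameter tuning is a population-based metaheuristic. The sketch below is our own heavily simplified illustration of the general idea (an elitist "equilibrium pool" of best candidates plus a decaying exploration term), not the authors' implementation and not the published EOA update equations; the quadratic objective and the bounds are placeholders standing in for a real validation-loss function over CapsNet hyperparameters:

```python
import random

def eo_style_search(objective, bounds, pop=12, iters=40, pool_size=4, seed=0):
    """Simplified equilibrium-optimizer-style search: each non-elite candidate
    is pulled toward a randomly chosen member of an 'equilibrium pool' of the
    best solutions found so far, with exploration that decays over iterations.
    Minimizes `objective` over box constraints `bounds` = [(lo, hi), ...]."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for t in range(iters):
        X.sort(key=objective)
        pool = X[:pool_size]            # elitist equilibrium pool, kept as-is
        decay = 1.0 - t / iters         # exploration shrinks over time
        for i in range(pool_size, pop):
            eq = rng.choice(pool)       # pull target drawn from the pool
            X[i] = [
                min(max(eq[d] + decay * rng.uniform(-1, 1) * (X[i][d] - eq[d])
                        + decay * 0.1 * rng.uniform(lo - hi, hi - lo),
                        lo), hi)
                for d, (lo, hi) in enumerate(bounds)
            ]
    return min(X, key=objective)

# Toy stand-in for hyperparameter tuning: minimize a quadratic "validation
# loss" over (learning-rate exponent, dropout); optimum at (-3, 0.2).
best = eo_style_search(lambda x: (x[0] + 3) ** 2 + (x[1] - 0.2) ** 2,
                       bounds=[(-5, -1), (0.0, 0.5)])
print(best)  # a point with low placeholder loss
```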
Affiliation(s)
- Eunmok Yang
- Department of Financial Information Security, Kookmin University, Seoul 02707, Republic of Korea;
- K. Shankar
- Department of Computer Science and Engineering, Saveetha School of Engineering, Saveetha Institute of Medical and Technical Sciences, Chennai 602105, India;
- Big Data and Machine Learning Lab, South Ural State University, Chelyabinsk 454080, Russia
- Sachin Kumar
- College of IBS, National University of Science and Technology, MISiS, Moscow 119049, Russia;
- Changho Seo
- Department of Convergence Science, Kongju National University, Gongju-si 32588, Republic of Korea
- Inkyu Moon
- Department of Robotics & Mechatronics Engineering, Daegu Gyeongbuk Institute of Science & Technology (DGIST), Daegu 42988, Republic of Korea