1
Suzuki S, Monno Y, Arai R, Miyaoka M, Toya Y, Esaki M, Wada T, Hatta W, Takasu A, Nagao S, Ishibashi F, Minato Y, Konda K, Dohmen T, Miki K, Okutomi M. Diagnostic performance of deep-learning-based virtual chromoendoscopy in gastric neoplasms. Gastric Cancer 2024; 27:539-547. [PMID: 38240891] [DOI: 10.1007/s10120-024-01469-7]
Abstract
BACKGROUND The cycle-consistent generative adversarial network (CycleGAN) is a deep neural network model that performs image-to-image translation. We generated virtual indigo carmine (IC) chromoendoscopy images of gastric neoplasms using CycleGAN and compared their diagnostic performance with that of white-light endoscopy (WLE). METHODS WLE and IC images of 176 patients with gastric neoplasms who underwent endoscopic resection were obtained. We used 1,633 images (911 WLE and 722 IC) from 146 cases as the training dataset to develop virtual IC images using CycleGAN. The remaining 30 WLE images were translated into 30 virtual IC images using the trained CycleGAN and used for validation. Lesion borders were evaluated by 118 endoscopists from 22 institutions using the 60 paired virtual IC and WLE images. The lesion-area concordance rate and the rate of successful whole-lesion diagnosis were compared. RESULTS The lesion-area concordance rate based on the pathological diagnosis was lower in virtual IC than in WLE (44.1% vs. 48.5%, p < 0.01). The rate of successful whole-lesion diagnosis was higher in virtual IC than in WLE images, but the difference was not statistically significant (28.2% vs. 26.4%, p = 0.11). However, subgroup analyses revealed significantly higher diagnostic rates in virtual IC than in WLE for depressed morphology (41.9% vs. 36.9%, p = 0.02), differentiated histology (27.6% vs. 24.8%, p = 0.02), smaller lesion size (42.3% vs. 38.3%, p = 0.01), and assessment by expert endoscopists (27.3% vs. 23.6%, p = 0.03). CONCLUSIONS The diagnostic ability of virtual IC was higher for some lesions but not consistently superior to that of WLE. Adjustments are required to improve the imaging system's performance.
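For readers unfamiliar with the CycleGAN objective underlying this work, the central idea is a cycle-consistency constraint: translating a WLE image into virtual IC and back should reconstruct the original. A minimal NumPy sketch of that loss term follows; the function names and the toy identity generators are illustrative, not taken from the paper.

```python
import numpy as np

def cycle_consistency_loss(x, y, G, F, lam=10.0):
    """L1 cycle-consistency loss used by CycleGAN-style translators.
    G maps domain X -> Y (e.g., WLE -> IC); F maps Y -> X.
    F(G(x)) should reconstruct x, and G(F(y)) should reconstruct y."""
    loss_x = np.mean(np.abs(F(G(x)) - x))  # forward cycle: X -> Y -> X
    loss_y = np.mean(np.abs(G(F(y)) - y))  # backward cycle: Y -> X -> Y
    return lam * (loss_x + loss_y)

# Toy check: identity "generators" give a perfect cycle, so the loss is 0.
x = np.random.rand(4, 3, 8, 8)  # batch of WLE-like images (NCHW)
y = np.random.rand(4, 3, 8, 8)  # batch of IC-like images
identity = lambda t: t
print(cycle_consistency_loss(x, y, identity, identity))  # 0.0
```

In the full CycleGAN objective this term is combined with adversarial losses for both generators; the weight `lam` (10 in the original CycleGAN paper) balances reconstruction against realism.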
Affiliation(s)
- Sho Suzuki: Department of Gastroenterology, International University of Health and Welfare Ichikawa Hospital, 6-1-14, Konodai, Ichikawa-Shi, Chiba, 272-0827, Japan
- Yusuke Monno: Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo, Japan
- Ryo Arai: Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo, Japan
- Masaki Miyaoka: Department of Endoscopy, Fukuoka University Chikushi Hospital, Chikushino, Japan
- Yosuke Toya: Division of Gastroenterology and Hepatology, Department of Internal Medicine, School of Medicine, Iwate Medical University, Yahaba, Japan
- Mitsuru Esaki: Department of Medicine and Bioregulatory Science, Graduate School of Medical Sciences, Kyushu University, Fukuoka, Japan; Department of Gastroenterology, Harasanshin Hospital, Fukuoka, Japan
- Takuya Wada: Department of Gastroenterology, Kitasato University School of Medicine, Sagamihara, Japan
- Waku Hatta: Division of Gastroenterology, Tohoku University Graduate School of Medicine, Sendai, Japan
- Ayaka Takasu: Division of Gastroenterology and Hepatology, Department of Medicine, Nihon University School of Medicine, Tokyo, Japan
- Shigeaki Nagao: Medical Examination Center, Showa General Hospital, Tokyo, Japan
- Fumiaki Ishibashi: Department of Gastroenterology, International University of Health and Welfare Ichikawa Hospital, 6-1-14, Konodai, Ichikawa-Shi, Chiba, 272-0827, Japan; Endoscopy Center, Koganei Tsurukame Clinic, Tokyo, Japan
- Yohei Minato: Department of Gastrointestinal Endoscopy, NTT Medical Center Tokyo, Tokyo, Japan
- Kenichi Konda: Division of Gastroenterology, Department of Medicine, Showa University School of Medicine, Tokyo, Japan
- Takahiro Dohmen: Department of Gastroenterology, Yuri Kumiai General Hospital, Yurihonjo, Japan
- Kenji Miki: Department of Internal Medicine, Tsujinaka Hospital Kashiwanoha, Kashiwa, Japan
- Masatoshi Okutomi: Department of Systems and Control Engineering, School of Engineering, Tokyo Institute of Technology, Tokyo, Japan
2
Ma Y, Zhang Y, Wang Z, Li J, Miao Y, Yang F, Pan W. DSFF-GAN: A novel stain transfer network for generating immunohistochemical image of endometrial cancer. Comput Biol Med 2024; 170:108046. [PMID: 38325211] [DOI: 10.1016/j.compbiomed.2024.108046]
Abstract
Immunohistochemistry (IHC) is a commonly used histological examination technique. Compared with Hematoxylin and Eosin (H&E) staining, it enables examination of protein expression and localization in tissues, which is valuable for cancer treatment and prognosis assessment, such as the detection and diagnosis of endometrial cancer. However, IHC involves multiple staining steps and is time-consuming and expensive. One potential solution is to use deep learning networks to generate corresponding virtual IHC images from H&E images, but the fidelity of the IHC images generated by existing methods needs further improvement. In this work, we propose a novel dual-scale feature fusion (DSFF) generative adversarial network named DSFF-GAN, which comprises a cycle structure, a color-similarity loss, and a DSFF block to constrain the model's training process and enhance its stain-transfer capability. In addition, our method incorporates labeling information of positive cell regions as prior knowledge into the network to further improve the evaluation metrics. We train and test our model on endometrial cancer and publicly available breast cancer IHC datasets and compare it with state-of-the-art methods. Compared with previous methods, our model demonstrates significant improvements in most evaluation metrics on both datasets. These results show that our method further improves the quality of image generation and has potential value for future clinical application of virtual IHC images.
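The color-similarity idea can be made concrete in a generic form: penalize the distance between channel-wise color statistics of the generated and reference stains. The sketch below is such a generic term, not the paper's actual DSFF-GAN loss (whose exact formulation is not given in this abstract); all names are illustrative.

```python
import numpy as np

def color_similarity_loss(generated, reference):
    """Distance between per-channel mean colours of two H x W x 3 images.
    A generic colour-consistency penalty; the exact DSFF-GAN loss is
    not specified in the abstract, so this is only an illustration."""
    gen_mean = generated.mean(axis=(0, 1))  # mean R, G, B of the generated IHC
    ref_mean = reference.mean(axis=(0, 1))  # mean R, G, B of the real IHC
    return float(np.abs(gen_mean - ref_mean).mean())

# An image compared with itself has zero colour distance.
img = np.random.rand(32, 32, 3)
print(color_similarity_loss(img, img))  # 0.0
```

In a GAN training loop a term like this would be added, with a weighting coefficient, to the adversarial and cycle losses so the generator keeps the stain's global colour character.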
Affiliation(s)
- Yihao Ma: School of Biology & Engineering (School of Modern Industry for Health and Medicine), Guizhou Medical University, Guiyang, Guizhou Province, China
- Yiqiong Zhang: Guizhou Prenatal Diagnostic Center, Affiliated Hospital of Guizhou Medical University, Guiyang, Guizhou Province, China; School of Clinical Laboratory Science, Guizhou Medical University, Guiyang, Guizhou Province, China
- Zhengrong Wang: Guizhou Prenatal Diagnostic Center, Affiliated Hospital of Guizhou Medical University, Guiyang, Guizhou Province, China
- Juan Li: Department of Pathology, Affiliated Hospital of Guizhou Medical University, Guiyang, Guizhou Province, China
- Yuehong Miao: School of Biology & Engineering (School of Modern Industry for Health and Medicine), Guizhou Medical University, Guiyang, Guizhou Province, China
- Fan Yang: School of Biology & Engineering (School of Modern Industry for Health and Medicine), Guizhou Medical University, Guiyang, Guizhou Province, China
- Wei Pan: Guizhou Prenatal Diagnostic Center, Affiliated Hospital of Guizhou Medical University, Guiyang, Guizhou Province, China; School of Clinical Laboratory Science, Guizhou Medical University, Guiyang, Guizhou Province, China
3
Sato J, Matsumoto T, Nakao R, Tanaka H, Nagahara H, Niioka H, Takamatsu T. Deep UV-excited fluorescence microscopy installed with CycleGAN-assisted image translation enhances precise detection of lymph node metastasis towards rapid intraoperative diagnosis. Sci Rep 2023; 13:21363. [PMID: 38049475] [PMCID: PMC10696085] [DOI: 10.1038/s41598-023-48319-7]
Abstract
Rapid and precise intraoperative diagnostic systems are required to improve surgical outcomes and patient prognosis. Because of the poor quality and time-intensive nature of the prevalent frozen-section procedure, various intraoperative diagnostic imaging systems have been explored. Microscopy with ultraviolet surface excitation (MUSE) is an inexpensive, maintenance-free, and rapid imaging technique that yields images resembling thin-sectioned samples without sectioning. However, pathologists find it nearly impossible to assign diagnostic labels to MUSE images of unfixed specimens; thus, AI for intraoperative diagnosis cannot be trained in a supervised-learning manner. In this study, we propose a deep-learning pipeline for lymph node metastasis detection, in which a CycleGAN translates MUSE images of unfixed lymph nodes into formalin-fixed paraffin-embedded (FFPE)-style images, and diagnostic prediction is performed by a deep convolutional neural network trained on FFPE sample images. Our pipeline yielded an average accuracy of 84.6% with each of three deep convolutional neural networks, an 18.3% increase over the classification-only model without CycleGAN. Modality translation to FFPE-style images using CycleGAN can be applied to various intraoperative diagnostic imaging systems and eliminates the difficulty pathologists face in labeling new-modality images at clinical sites. We anticipate that our pipeline will be a starting point for accurate, rapid intraoperative diagnostic systems for new imaging modalities, leading to improved healthcare quality.
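The pipeline described above is structurally a simple composition: a translation model followed by a classifier trained on the target modality. A skeletal sketch with placeholder stand-ins follows; the real components would be a trained CycleGAN generator and an FFPE-trained CNN, so every function body here is purely illustrative.

```python
import numpy as np

def translate_muse_to_ffpe(image):
    """Placeholder for the trained CycleGAN generator (MUSE -> FFPE-like).
    A real implementation would run a neural network forward pass here."""
    return image

def classify_ffpe(image):
    """Placeholder for a CNN trained on FFPE images, returning a
    metastasis label. Here: a trivial intensity threshold."""
    return float(image.mean() > 0.5)

def predict(muse_image):
    # Two-stage pipeline: modality translation, then FFPE-domain classification.
    return classify_ffpe(translate_muse_to_ffpe(muse_image))

print(predict(np.ones((64, 64))))   # 1.0
print(predict(np.zeros((64, 64))))  # 0.0
```

The design point the paper makes is that only the translation stage is modality-specific: the classifier never needs labels for the new modality, because it operates entirely in the FFPE domain.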
Affiliation(s)
- Junya Sato: Graduate School of Information Science and Technology, Osaka University, 1-5, Yamadaoka, Suita, Osaka, 565-0871, Japan; Department of Radiology, Osaka University Graduate School of Medicine, 2-2, Yamadaoka, Suita, Osaka, 565-0871, Japan; Department of Artificial Intelligence Diagnostic Radiology, Osaka University Graduate School of Medicine, 2-2, Yamadaoka, Suita, Osaka, 565-0871, Japan
- Tatsuya Matsumoto: Department of Pathology and Cell Regulation, Kyoto Prefectural University of Medicine, 465 Kajiicho, Kawaramachi-Hirokoji, Kamigyo-ku, Kyoto, 602-8566, Japan
- Ryuta Nakao: Department of Pathology and Cell Regulation, Kyoto Prefectural University of Medicine, 465 Kajiicho, Kawaramachi-Hirokoji, Kamigyo-ku, Kyoto, 602-8566, Japan
- Hideo Tanaka: Department of Pathology and Cell Regulation, Kyoto Prefectural University of Medicine, 465 Kajiicho, Kawaramachi-Hirokoji, Kamigyo-ku, Kyoto, 602-8566, Japan
- Hajime Nagahara: Graduate School of Information Science and Technology, Osaka University, 1-5, Yamadaoka, Suita, Osaka, 565-0871, Japan; Institute for Datability Science, Osaka University, 2-8 Yamadaoka, Suita, 565-0871, Japan
- Hirohiko Niioka: Graduate School of Information Science and Technology, Osaka University, 1-5, Yamadaoka, Suita, Osaka, 565-0871, Japan; Institute for Datability Science, Osaka University, 2-8 Yamadaoka, Suita, 565-0871, Japan
- Tetsuro Takamatsu: Department of Pathology and Cell Regulation, Kyoto Prefectural University of Medicine, 465 Kajiicho, Kawaramachi-Hirokoji, Kamigyo-ku, Kyoto, 602-8566, Japan; Department of Medical Photonics, Kyoto Prefectural University of Medicine, 465 Kajiicho, Kawaramachi-Hirokoji, Kamigyo-ku, Kyoto, 602-8566, Japan
4
Roy M, Wang F, Teodoro G, Bhattarai S, Bhargava M, Rekha TS, Aneja R, Kong J. Deep learning based registration of serial whole-slide histopathology images in different stains. J Pathol Inform 2023; 14:100311. [PMID: 37214150] [PMCID: PMC10193019] [DOI: 10.1016/j.jpi.2023.100311]
Abstract
For routine pathology diagnosis and imaging-based biomedical research, whole-slide image (WSI) analyses have largely been limited to a 2D tissue image space. For a more definitive tissue representation that supports fine-resolution spatial and integrative analyses, it is critical to extend such tissue-based investigations to a 3D tissue space with spatially aligned serial tissue WSIs in different stains, such as Hematoxylin and Eosin (H&E) and Immunohistochemistry (IHC) biomarkers. However, such WSI registration is technically challenging because of the overwhelming image scale, complex histological structure changes, and significant differences in tissue appearance across stains. The goal of this study is to register serial sections from multi-stain histopathology whole-slide image blocks. We propose CGNReg, a novel translation-based deep-learning registration network that spatially aligns serial WSIs stained with H&E and IHC biomarkers without prior deformation information for model training. First, synthetic IHC images are produced from H&E slides through a robust image-synthesis algorithm. Next, the synthetic and real IHC images are registered through a fully convolutional network with multi-scale deformable vector fields and joint loss optimization. We perform registration at full image resolution, retaining tissue details in the results. Evaluated on a dataset of 76 breast cancer patients, each with 1 H&E and 2 IHC serial WSIs, CGNReg shows promising performance compared with multiple state-of-the-art systems in our evaluation. Our results suggest that CGNReg can produce promising registration results with serial WSIs in different stains, enabling integrative 3D tissue-based biomedical investigations.
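Registration networks of this kind output a dense displacement field that is then used to resample the moving image. Below is a pure-NumPy sketch of that resampling step with bilinear interpolation; it illustrates only the generic warping operation, not the paper's network, losses, or multi-scale scheme.

```python
import numpy as np

def warp(image, field):
    """Warp a 2-D image with a dense displacement field of shape (2, H, W)
    using bilinear interpolation: the resampling step applied after a
    registration network predicts a deformable vector field."""
    h, w = image.shape
    gy, gx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    yy = np.clip(gy + field[0], 0, h - 1)  # displaced row coordinates
    xx = np.clip(gx + field[1], 0, w - 1)  # displaced column coordinates
    y0 = np.floor(yy).astype(int)
    x0 = np.floor(xx).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy, wx = yy - y0, xx - x0              # bilinear interpolation weights
    top = image[y0, x0] * (1 - wx) + image[y0, x1] * wx
    bot = image[y1, x0] * (1 - wx) + image[y1, x1] * wx
    return top * (1 - wy) + bot * wy

# A zero displacement field leaves the image unchanged.
img = np.random.rand(16, 16)
zero_field = np.zeros((2, 16, 16))
assert np.allclose(warp(img, zero_field), img)
```

Because the warp is differentiable in the field values, the same operation (implemented in a deep-learning framework) lets a registration network be trained end-to-end against an image-similarity loss.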
Affiliation(s)
- Mousumi Roy: Department of Computer Science, Stony Brook University, NY 11794, USA
- Fusheng Wang: Department of Computer Science, Stony Brook University, NY 11794, USA; Department of Biomedical Informatics, Stony Brook University, NY 11794, USA
- George Teodoro: Department of Computer Science, Federal University of Minas Gerais, Belo Horizonte 31270-901, Brazil
- Shristi Bhattarai: Department of Clinical and Diagnostic Sciences, School of Health Profession, University of Alabama at Birmingham, Birmingham, AL 35233, USA
- Mahak Bhargava: Department of Clinical and Diagnostic Sciences, School of Health Profession, University of Alabama at Birmingham, Birmingham, AL 35233, USA
- T. Subbanna Rekha: Department of Pathology, JSS Medical College, JSS Academy of Higher Education and Research, Mysuru, Karnataka 570009, India
- Ritu Aneja: Department of Clinical and Diagnostic Sciences, School of Health Profession, University of Alabama at Birmingham, Birmingham, AL 35233, USA
- Jun Kong: Department of Mathematics and Statistics, Georgia State University, Atlanta, GA 30303, USA; Department of Computer Science and Winship Cancer Institute, Emory University, Atlanta, GA 30322, USA
5
Vasiljević J, Nisar Z, Feuerhake F, Wemmert C, Lampert T. CycleGAN for virtual stain transfer: Is seeing really believing? Artif Intell Med 2022; 133:102420. [PMID: 36328671] [DOI: 10.1016/j.artmed.2022.102420]
Abstract
Digital pathology is an area prone to high variation due to multiple factors that can strongly affect diagnostic quality and the visual appearance of whole-slide images (WSIs). State-of-the-art methods tend to address such variation through style-transfer-inspired approaches, usually by directly applying successful approaches from the literature, potentially with some task-related modifications. The majority of the resulting images are visually convincing; however, this paper shows that this is no guarantee that such images can be used directly for either medical diagnosis or reducing domain shift. This article shows that a slight modification in a stain-transfer architecture, such as the choice of normalisation layer, while producing a variety of visually appealing results, surprisingly has a great effect on the ability of a stain-transfer model to reduce domain shift. Through extensive qualitative and quantitative evaluation, we confirm that translations resulting from different stain-transfer architectures are distinct from each other and from the real samples. Therefore, conclusions based on visual inspection or pretrained-model evaluation may be misleading.
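The normalisation-layer choice the paper examines can be made concrete: instance normalisation computes statistics per sample and channel, while batch normalisation shares statistics across the whole batch, which changes what image-level appearance information the generator can carry through. A NumPy sketch of both, assuming NCHW tensors (illustrative only, not the paper's code):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Normalise each (sample, channel) feature map independently,
    as is typical in style-transfer and stain-transfer generators."""
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def batch_norm(x, eps=1e-5):
    """Normalise each channel over the whole batch instead; statistics
    are shared across samples, so differences between individual
    images in the batch are preserved after normalisation."""
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.rand(4, 3, 8, 8)  # NCHW batch of feature maps
# After instance norm, every per-sample, per-channel map has ~zero mean.
print(np.abs(instance_norm(x).mean(axis=(2, 3))).max() < 1e-6)  # True
```

The two layers are drop-in replacements shape-wise, which is exactly why a visually plausible generator can be built with either; the paper's point is that the downstream domain-shift behaviour nonetheless differs substantially.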
Affiliation(s)
- Jelica Vasiljević: ICube, University of Strasbourg, CNRS (UMR 7357), France; University of Belgrade, Belgrade, Serbia; Faculty of Science, University of Kragujevac, Kragujevac, Serbia
- Zeeshan Nisar: ICube, University of Strasbourg, CNRS (UMR 7357), France
- Friedrich Feuerhake: Institute of Pathology, Hannover Medical School, Germany; University Clinic, Freiburg, Germany
- Cédric Wemmert: ICube, University of Strasbourg, CNRS (UMR 7357), France
- Thomas Lampert: ICube, University of Strasbourg, CNRS (UMR 7357), France