1
Qi JH, Huang SL, Jin SZ. Novel milestones for early esophageal carcinoma: From bench to bed. World J Gastrointest Oncol 2024;16:1104-1118. [PMID: 38660637] [PMCID: PMC11037034] [DOI: 10.4251/wjgo.v16.i4.1104]
Abstract
Esophageal cancer (EC) is the seventh most common cancer worldwide, and esophageal squamous cell carcinoma (ESCC) accounts for the majority of EC cases. To effectively diagnose and treat ESCC and improve patient prognosis, timely diagnosis in the initial phase of the illness is necessary. This article offers a detailed summary of the latest advancements and emerging technologies for the timely identification of EC. Molecular biology and epigenetics approaches combine knowledge of molecular mechanisms with fluorescence quantitative polymerase chain reaction (qPCR), high-throughput sequencing (next-generation sequencing), and digital PCR to study endogenous or exogenous biomolecular changes in the human body and provide a decision-making basis for the diagnosis, treatment, and prognosis of disease. The investigation of the microbiome is a swiftly progressing area of human cancer research, and microorganisms with complex functions are potential components of the tumor microenvironment; the intratumoral microbiota has also been linked to tumor progression. Endoscopy remains a crucial technique for the early identification of ESCC and has continuously improved with ongoing advances in technology. With the advancement of artificial intelligence (AI), the use of AI in the detection of gastrointestinal tumors has become increasingly prevalent. AI can effectively resolve discrepancies among observers, improve the detection rate, assist in predicting the depth of invasion and differentiation status, guide the delineation of pericancerous margins, and aid in a more accurate diagnosis of ESCC.
Affiliation(s)
- Ji-Han Qi
- Department of Gastroenterology and Hepatology, The Second Affiliated Hospital of Harbin Medical University, Harbin 150086, Heilongjiang Province, China
- Shi-Ling Huang
- Department of Gastroenterology and Hepatology, The Second Affiliated Hospital of Harbin Medical University, Harbin 150086, Heilongjiang Province, China
- Shi-Zhu Jin
- Department of Gastroenterology and Hepatology, The Second Affiliated Hospital of Harbin Medical University, Harbin 150086, Heilongjiang Province, China
2
Dijkhuis TH, Bijlstra OD, Warmerdam MI, Faber RA, Linders DGJ, Galema HA, Broersen A, Dijkstra J, Kuppen PJK, Vahrmeijer AL, Mieog JSD. Semi-automatic standardized analysis method to objectively evaluate near-infrared fluorescent dyes in image-guided surgery. J Biomed Opt 2024;29:026001. [PMID: 38312853] [PMCID: PMC10833575] [DOI: 10.1117/1.jbo.29.2.026001]
Abstract
Significance: Near-infrared fluorescence imaging still lacks a standardized, objective method to evaluate fluorescent dye efficacy in oncological surgical applications. This causes difficulties in translating fluorescent dyes from preclinical to clinical studies and in reproducing results between studies, which in turn hampers further clinical translation of novel fluorescent dyes. Aim: We aim to develop and evaluate a semi-automatic standardized method to objectively assess fluorescent signals in resected tissue. Approach: A standardized imaging procedure was designed, and quantitative analysis methods were developed to evaluate non-targeted and tumor-targeted fluorescent dyes. The developed analysis methods included manual selection of a region of interest (ROI) on white-light images, automated fluorescence-signal ROI selection, and automatic quantitative image analysis. The proposed analysis method was then compared with a conventional method in which fluorescence-signal ROIs were manually selected on fluorescence images. Dice similarity coefficients and intraclass correlation coefficients were calculated to determine the inter- and intraobserver variability of the ROI selections and of the resulting signal- and tumor-to-background ratios. Results: For non-targeted dyes, the proposed analysis method showed statistically significantly improved variability when applied to indocyanine green specimens. For specimens with the targeted dye SGM-101, the variability of the background ROI selection was statistically significantly improved by the proposed method. Conclusion: Semi-automatic methods for standardized quantitative analysis of fluorescence images were successfully developed and showed promising results for further improving the reproducibility and standardization of clinical studies evaluating fluorescent dyes.
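The Dice similarity coefficient used in this study to quantify agreement between ROI selections can be illustrated with a minimal sketch; the two observer masks below are invented toy data, not values from the paper:

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary ROI masks.

    Each mask is a set of (row, col) pixel coordinates marked as ROI.
    Dice = 2*|A n B| / (|A| + |B|), from 0 (no overlap) to 1 (identical).
    """
    if not mask_a and not mask_b:
        return 1.0  # two empty selections agree trivially
    intersection = len(mask_a & mask_b)
    return 2.0 * intersection / (len(mask_a) + len(mask_b))

# Two observers outline slightly different ROIs on the same specimen image.
observer_1 = {(r, c) for r in range(0, 10) for c in range(0, 10)}   # 100 px
observer_2 = {(r, c) for r in range(0, 10) for c in range(5, 15)}   # 100 px, half-overlapping
print(dice_coefficient(observer_1, observer_2))  # → 0.5
```

A Dice value near 1 across observers indicates the ROI selection step is reproducible, which is exactly what the semi-automatic method aims to improve.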
Affiliation(s)
- Tom H. Dijkhuis
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
- Okker D. Bijlstra
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
- Amsterdam University Medical Center, Cancer Center Amsterdam, Department of Surgery, Amsterdam, The Netherlands
- Mats I. Warmerdam
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
- Centre of Human Drug Research, Leiden, The Netherlands
- Robin A. Faber
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
- Daan G. J. Linders
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
- Hidde A. Galema
- Erasmus MC Cancer Institute, Department of Surgical Oncology and Gastrointestinal Surgery, Rotterdam, The Netherlands
- Alexander Broersen
- Leiden University Medical Center, Department of Radiology, Leiden, The Netherlands
- Jouke Dijkstra
- Leiden University Medical Center, Department of Radiology, Leiden, The Netherlands
- Peter J. K. Kuppen
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
- Jan Sven David Mieog
- Leiden University Medical Center, Department of Surgery, Leiden, The Netherlands
3
Pan Y, He L, Chen W, Yang Y. The current state of artificial intelligence in endoscopic diagnosis of early esophageal squamous cell carcinoma. Front Oncol 2023;13:1198941. [PMID: 37293591] [PMCID: PMC10247226] [DOI: 10.3389/fonc.2023.1198941]
Abstract
Esophageal squamous cell carcinoma (ESCC) is a common malignant tumor of the digestive tract. The most effective way to reduce the disease burden in areas with a high incidence of esophageal cancer is to prevent the disease from developing into invasive cancer through screening. Endoscopic screening is key to the early diagnosis and treatment of ESCC. However, because the professional level of endoscopists is uneven, many cases are still missed owing to failure to recognize lesions. In recent years, alongside remarkable progress in medical imaging and video evaluation technology based on deep machine learning, the development of artificial intelligence (AI) is expected to provide new auxiliary methods for the endoscopic diagnosis and treatment of early ESCC. The convolutional neural network (CNN) in a deep learning model extracts the key features of the input image data using successive convolution layers and then classifies images through fully connected layers. The CNN is widely used in medical image classification and greatly improves the accuracy of endoscopic image classification. This review focuses on AI-assisted diagnosis of early ESCC and prediction of early ESCC invasion depth under multiple imaging modalities. The excellent image-recognition ability of AI is well suited to the detection and diagnosis of ESCC; it can reduce missed diagnoses and help endoscopists better complete endoscopic examinations. However, selection bias in the training datasets of AI systems affects their general utility.
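The convolution step described above, in which a small kernel slides over the image to extract local features before the fully connected classification layers, can be sketched in pure Python; the 3x3 vertical-edge kernel and the toy image patch are illustrative only:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1) of a grayscale
    image (list of lists) with a small kernel — the basic operation a
    CNN's feature-extraction layers apply before classification."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A vertical-edge kernel responds strongly at the boundary between the
# dark left half and the bright right half of this toy image patch,
# and stays near zero over the flat regions.
image = [[0, 0, 0, 1, 1, 1]] * 3
vertical_edge = [[-1, 0, 1],
                 [-1, 0, 1],
                 [-1, 0, 1]]
print(conv2d(image, vertical_edge))  # → [[0, 3, 3, 0]]
```

In a real CNN the kernels are learned from labeled endoscopic images rather than hand-designed, and many such feature maps are stacked and passed through nonlinearities before the fully connected layers produce class scores.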
Affiliation(s)
- Yuwei Pan
- Department of Gastroenterology, Chongqing University Cancer Hospital, Chongqing, China
- Lanying He
- Department of Gastroenterology, Chongqing University Cancer Hospital, Chongqing, China
- Weiqing Chen
- Department of Gastroenterology, Chongqing University Cancer Hospital, Chongqing, China
- Chongqing Key Laboratory of Translational Research for Cancer Metastasis and Individualized Treatment, Chongqing University Cancer Hospital, Chongqing, China
- Yongtao Yang
- Department of Gastroenterology, Chongqing University Cancer Hospital, Chongqing, China
- Chongqing Key Laboratory of Translational Research for Cancer Metastasis and Individualized Treatment, Chongqing University Cancer Hospital, Chongqing, China
4
Feng Y, Liang Y, Li P, Long Q, Song J, Li M, Wang X, Cheng CE, Zhao K, Ma J, Zhao L. Artificial intelligence assisted detection of superficial esophageal squamous cell carcinoma in white-light endoscopic images by using a generalized system. Discov Oncol 2023;14:73. [PMID: 37208546] [DOI: 10.1007/s12672-023-00694-3]
Abstract
BACKGROUND: The use of artificial intelligence (AI) assisted white-light imaging (WLI) detection systems for superficial esophageal squamous cell carcinoma (SESCC) is limited by training with images from one specific endoscopy platform. METHODS: In this study, we developed an AI system with a convolutional neural network (CNN) model using WLI images from Olympus and Fujifilm endoscopy platforms. The training dataset consisted of 5892 WLI images from 1283 patients, and the validation dataset included 4529 images from 1224 patients. We assessed the diagnostic performance of the AI system and compared it with that of endoscopists. We analyzed the system's ability to identify cancerous imaging characteristics and investigated its efficacy as a diagnostic assistant. RESULTS: In the internal validation set, the AI system's per-image analysis had a sensitivity, specificity, accuracy, positive predictive value (PPV), and negative predictive value (NPV) of 96.64%, 95.35%, 91.75%, 90.91%, and 98.33%, respectively. In patient-based analysis, these values were 90.17%, 94.34%, 88.38%, 89.50%, and 94.72%, respectively. The diagnostic results in the external validation set were also favorable. The CNN model's diagnostic performance in recognizing cancerous imaging characteristics was comparable to that of expert endoscopists and significantly higher than that of mid-level and junior endoscopists. The model was competent in localizing SESCC lesions. Manual diagnostic performance improved significantly with the assistance of the AI system, especially in terms of accuracy (75.12% vs. 84.95%, p = 0.008), specificity (63.29% vs. 76.59%, p = 0.017), and PPV (64.95% vs. 75.23%, p = 0.006). CONCLUSIONS: The results of this study demonstrate that the developed AI system is highly effective in automatically recognizing SESCC, displaying impressive diagnostic performance and strong generalizability. Furthermore, when used as an assistant in the diagnostic process, the system improved manual diagnostic performance.
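The per-image metrics reported in this abstract are all derived from a 2x2 confusion matrix. A minimal sketch of how they relate, using made-up counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening metrics from confusion-matrix counts:
    true/false positives (tp/fp) and true/false negatives (tn/fn)."""
    return {
        "sensitivity": tp / (tp + fn),          # fraction of lesions caught
        "specificity": tn / (tn + fp),          # fraction of normals cleared
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
        "ppv":         tp / (tp + fp),          # positive predictive value
        "npv":         tn / (tn + fn),          # negative predictive value
    }

# Hypothetical validation run: 90 of 100 lesion images flagged, 10 missed;
# 180 of 200 normal images correctly passed, 20 false alarms.
m = diagnostic_metrics(tp=90, fp=20, tn=180, fn=10)
print({k: round(v, 3) for k, v in m.items()})
# → {'sensitivity': 0.9, 'specificity': 0.9, 'accuracy': 0.9, 'ppv': 0.818, 'npv': 0.947}
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the lesion prevalence in the validation set, which is why per-image and per-patient values in the abstract differ.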
Affiliation(s)
- Yadong Feng
- Department of Gastroenterology, Zhongda Hospital Southeast University, 87 Dingjiaqiao Street, Nanjing, 210009, China
- Department of Gastroenterology, the Affiliated Changshu Hospital of Nantong University, Changshu No. 2 People's Hospital, 18 Taishan Road, Suzhou, 215500, China
- Yan Liang
- Department of Gastroenterology, Zhongda Hospital Southeast University, 87 Dingjiaqiao Street, Nanjing, 210009, China
- Peng Li
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, 88 Keling Road, Suzhou, 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, 96 Jinzhai Road, Hefei, 230026, China
- Qigang Long
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, 88 Keling Road, Suzhou, 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, 96 Jinzhai Road, Hefei, 230026, China
- Jie Song
- Department of Gastroenterology, Zhongda Hospital Southeast University, 87 Dingjiaqiao Street, Nanjing, 210009, China
- Mengjie Li
- Department of Gastroenterology, Zhongda Hospital Southeast University, 87 Dingjiaqiao Street, Nanjing, 210009, China
- Xiaofen Wang
- Department of Gastroenterology, Zhongda Hospital Southeast University, 87 Dingjiaqiao Street, Nanjing, 210009, China
- Cui-E Cheng
- Department of Gastroenterology, the Affiliated Changshu Hospital of Nantong University, Changshu No. 2 People's Hospital, 18 Taishan Road, Suzhou, 215500, China
- Kai Zhao
- Department of Gastroenterology, Changzhou Jintan First People's Hospital Affiliated to Jiangsu University, 500 Jintan Avenue, Jintan, 210036, China
- Jifeng Ma
- Department of Gastroenterology, General Global Maanshan 17th Metallurgy Hospital, 828 West Hunan Road, Maanshan, 243011, China
- Lingxiao Zhao
- Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, 88 Keling Road, Suzhou, 215163, China
- School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, 96 Jinzhai Road, Hefei, 230026, China
5
Yuan XL, Zeng XH, Liu W, Mou Y, Zhang WH, Zhou ZD, Chen X, Hu YX, Hu B. Artificial intelligence for detecting and delineating the extent of superficial esophageal squamous cell carcinoma and precancerous lesions under narrow-band imaging (with video). Gastrointest Endosc 2023;97:664-672.e4. [PMID: 36509114] [DOI: 10.1016/j.gie.2022.12.003]
Abstract
BACKGROUND AND AIMS: Although narrow-band imaging (NBI) is a useful modality for detecting and delineating esophageal squamous cell carcinoma (ESCC), there is a risk of incorrectly determining the margins of some lesions even with NBI. This study aimed to develop an artificial intelligence (AI) system for detecting superficial ESCC and precancerous lesions and delineating the extent of lesions under NBI. METHODS: Nonmagnified NBI images from 4 hospitals were collected and annotated. Internal and external image test datasets were used to evaluate the detection and delineation performance of the system. The delineation performance of the system was compared with that of endoscopists. Furthermore, the system was directly integrated into the endoscopy equipment, and its real-time diagnostic capability was prospectively estimated. RESULTS: The system was trained and tested using 10,047 still images and 140 videos from 1112 patients and 1183 lesions. In the image testing, the accuracy of the system in detecting lesions in internal and external tests was 92.4% and 89.9%, respectively. The accuracy of the system in delineating extents in internal and external tests was 88.9% and 87.0%, respectively. The delineation performance of the system was superior to that of junior endoscopists and similar to that of senior endoscopists. In the prospective clinical evaluation, the system exhibited satisfactory performance, with an accuracy of 91.4% in detecting lesions and an accuracy of 85.9% in delineating extents. CONCLUSIONS: The proposed AI system could accurately detect superficial ESCC and precancerous lesions and delineate the extent of lesions under NBI.
Affiliation(s)
- Xiang-Lei Yuan
- Department of Gastroenterology, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Xian-Hui Zeng
- Department of Gastroenterology, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Wei Liu
- Department of Gastroenterology, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Yi Mou
- Department of Gastroenterology, West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Wan-Hong Zhang
- Department of Gastroenterology, Cangxi People's Hospital, Guangyuan, Sichuan, China
- Zheng-Duan Zhou
- Department of Gastroenterology, Zigong Fourth People's Hospital, Zigong, Sichuan, China
- Xin Chen
- The First People's Hospital of Shuangliu District, Chengdu, Sichuan, China
- Yan-Xing Hu
- Xiamen Innovision Medical Technology Co, Ltd, Xiamen, China
- Bing Hu
- Department of Gastroenterology, West China Hospital, Sichuan University, Chengdu, Sichuan, China
6
Li M, Chen C, Cao Y, Zhou P, Deng X, Liu P, Wang Y, Lv X, Chen C. CIABNet: Category imbalance attention block network for the classification of multi-differentiated types of esophageal cancer. Med Phys 2023;50:1507-1527. [PMID: 36272103] [DOI: 10.1002/mp.16067]
Abstract
BACKGROUND: Esophageal cancer has become one of the major cancers that seriously threaten human life and health, and its incidence and mortality remain among the highest of all malignant tumors. Histopathological image analysis is the gold standard for diagnosing the different differentiation types of esophageal cancer. PURPOSE: The grading accuracy and interpretability of auxiliary diagnostic models for esophageal cancer are seriously affected by small interclass differences, imbalanced data distribution, and poor model interpretability. We therefore developed the category imbalance attention block network (CIABNet) model to address these problems. METHODS: First, quantitative metrics and model visualization results are integrated to transfer knowledge from source-domain images and better identify the regions of interest (ROI) in the target domain of esophageal cancer. Second, to capture subtle interclass differences, we propose the concatenate fusion attention block, which simultaneously focuses on contextual local feature relationships and on changes in channel attention weights among different regions. Third, we propose a category imbalance attention module, which treats each esophageal cancer differentiation class fairly by aggregating information of different intensities at multiple scales and exploring more representative regional features for each class, effectively mitigating the negative impact of category imbalance. Finally, we use feature-map visualization to interpret whether the ROIs are the same or similar between the model and pathologists, thereby improving the interpretability of the model. RESULTS: The experimental results show that the CIABNet model outperforms other state-of-the-art models in classifying the differentiation types of esophageal cancer, with an average classification accuracy of 92.24%, an average precision of 93.52%, an average recall of 90.31%, an average F1 score of 91.73%, and an average AUC of 97.43%. In addition, the CIABNet model identifies ROIs in histopathological images of esophageal cancer that are essentially similar or identical to those of pathologists. CONCLUSIONS: Our experimental results show that the proposed computer-aided diagnostic algorithm has great potential for histopathological images of multi-differentiated types of esophageal cancer.
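The class-averaged precision, recall, and F1 reported above are typically macro-averages: each metric is computed one-vs-rest per differentiation class and then averaged with equal class weight, so rare classes count as much as common ones. A sketch under that assumption (the paper's exact averaging scheme and the toy labels below are not taken from the study):

```python
def macro_metrics(y_true, y_pred):
    """Macro-averaged precision, recall, and F1 over all classes.

    Each class is scored one-vs-rest, then the per-class scores are
    averaged with equal weight, so an imbalanced class distribution
    cannot hide poor performance on a rare class."""
    classes = sorted(set(y_true) | set(y_pred))
    precisions, recalls, f1s = [], [], []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    n = len(classes)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n

# Imbalanced toy labels: 6 "well", 3 "moderate", 1 "poor" differentiated.
# The single "poor" case is misclassified, which drags the macro scores
# down far more than it would drag down plain accuracy.
y_true = ["well"] * 6 + ["moderate"] * 3 + ["poor"]
y_pred = ["well"] * 5 + ["moderate"] + ["moderate"] * 3 + ["well"]
print(macro_metrics(y_true, y_pred))
```

This sensitivity to rare classes is precisely why a category-imbalance-aware model is evaluated with averaged per-class metrics rather than overall accuracy alone.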
Affiliation(s)
- Min Li
- College of Information Science and Engineering, Xinjiang University, Urumqi, China
- Key Laboratory of Signal Detection and Processing, Xinjiang University, Urumqi, China
- Chen Chen
- College of Information Science and Engineering, Xinjiang University, Urumqi, China
- Xinjiang Cloud Computing Application Laboratory, Karamay, China
- Yanzhen Cao
- Department of Pathology, The Affiliated Tumor Hospital of Xinjiang Medical University, Urumqi, China
- Panyun Zhou
- College of Software, Xinjiang University, Urumqi, China
- Xin Deng
- College of Software, Xinjiang University, Urumqi, China
- Pei Liu
- College of Information Science and Engineering, Xinjiang University, Urumqi, China
- Yunling Wang
- The First Affiliated Hospital of Xinjiang Medical University, Urumqi, China
- Xiaoyi Lv
- College of Information Science and Engineering, Xinjiang University, Urumqi, China
- Key Laboratory of Signal Detection and Processing, Xinjiang University, Urumqi, China
- Xinjiang Cloud Computing Application Laboratory, Karamay, China
- College of Software, Xinjiang University, Urumqi, China
- Key Laboratory of Software Engineering Technology, Xinjiang University, Urumqi, China
- Cheng Chen
- College of Software, Xinjiang University, Urumqi, China
7
Galati JS, Duve RJ, O'Mara M, Gross SA. Artificial intelligence in gastroenterology: A narrative review. Artif Intell Gastroenterol 2022;3:117-141. [DOI: 10.35712/aig.v3.i5.117]
Abstract
Artificial intelligence (AI) is a complex concept, broadly defined in medicine as the development of computer systems to perform tasks that require human intelligence. It has the capacity to revolutionize medicine by increasing efficiency, expediting data and image analysis and identifying patterns, trends and associations in large datasets. Within gastroenterology, recent research efforts have focused on using AI in esophagogastroduodenoscopy, wireless capsule endoscopy (WCE) and colonoscopy to assist in diagnosis, disease monitoring, lesion detection and therapeutic intervention. The main objective of this narrative review is to provide a comprehensive overview of the research being performed within gastroenterology on AI in esophagogastroduodenoscopy, WCE and colonoscopy.
Affiliation(s)
- Jonathan S Galati
- Department of Medicine, NYU Langone Health, New York, NY 10016, United States
- Robert J Duve
- Department of Internal Medicine, Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, Buffalo, NY 14203, United States
- Matthew O'Mara
- Division of Gastroenterology, NYU Langone Health, New York, NY 10016, United States
- Seth A Gross
- Division of Gastroenterology, NYU Langone Health, New York, NY 10016, United States
8
Islam MM, Poly TN, Walther BA, Yeh CY, Seyed-Abdul S, Li YC(J), Lin MC. Deep Learning for the Diagnosis of Esophageal Cancer in Endoscopic Images: A Systematic Review and Meta-Analysis. Cancers (Basel) 2022;14:5996. [PMID: 36497480] [PMCID: PMC9736434] [DOI: 10.3390/cancers14235996]
Abstract
Esophageal cancer, one of the most common cancers and one with a poor prognosis, is the sixth leading cause of cancer-related mortality worldwide. Early and accurate diagnosis of esophageal cancer thus plays a vital role in choosing the appropriate treatment plan for patients and increasing their survival rate. However, an accurate diagnosis of esophageal cancer requires substantial expertise and experience. Deep learning (DL) models for the diagnosis of esophageal cancer have recently shown promising performance. We therefore conducted an updated meta-analysis to determine the diagnostic accuracy of DL models for esophageal cancer. PubMed, EMBASE, Scopus, and Web of Science were searched between 1 January 2012 and 1 August 2022 to identify studies evaluating the diagnostic performance of DL models for esophageal cancer using endoscopic images. The study was performed in accordance with PRISMA guidelines. Two reviewers independently assessed potential studies for inclusion and extracted data from the retrieved studies. Methodological quality was assessed using the QUADAS-2 guidelines. The pooled accuracy, sensitivity, specificity, positive and negative predictive values, and area under the receiver operating characteristic curve (AUROC) were calculated using a random-effects model. A total of 28 studies involving 703,006 images were included. The pooled accuracy, sensitivity, specificity, and positive and negative predictive values of DL for the diagnosis of esophageal cancer were 92.90%, 93.80%, 91.73%, 93.62%, and 91.97%, respectively. The pooled AUROC was 0.96. Furthermore, there was no publication bias among the studies. Our findings show that DL models have great potential to diagnose esophageal cancer accurately and quickly. However, most studies developed their models using endoscopic data from Asian populations; we therefore recommend further validation in studies of other populations.
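The random-effects pooling mentioned above can be sketched with the DerSimonian-Laird moment estimator, a standard choice for this kind of meta-analysis: between-study heterogeneity tau^2 is estimated from Cochran's Q, then studies are combined with inverse-variance weights 1/(v_i + tau^2). The three study inputs below are invented for illustration, and the paper's exact model may differ:

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool per-study estimates under a random-effects model.

    Estimates the between-study variance tau^2 with the
    DerSimonian-Laird moment estimator, then returns the
    inverse-variance weighted pooled estimate and its standard error.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # Cochran's Q measures how much the studies disagree beyond chance.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    k = len(estimates)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # clipped at 0: no negative variance
    # Re-weight with the heterogeneity added to each study's variance.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Three hypothetical studies reporting sensitivity with its variance.
sens = [0.95, 0.91, 0.93]
var = [0.0004, 0.0009, 0.0006]
pooled, se = dersimonian_laird(sens, var)
print(round(pooled, 4), round(se, 4))
```

When the studies agree to within sampling error (Q below its expectation), tau^2 collapses to zero and the result reduces to a fixed-effect inverse-variance average; in practice, proportions are usually transformed (e.g. logit) before pooling.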
Affiliation(s)
- Md. Mohaimenul Islam
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei 110, Taiwan
- International Center for Health Information Technology (ICHIT), Taipei Medical University, Taipei 110, Taiwan
- Research Center of Big Data and Meta-Analysis, Wan Fang Hospital, Taipei Medical University, Taipei 116, Taiwan
- Tahmina Nasrin Poly
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei 110, Taiwan
- International Center for Health Information Technology (ICHIT), Taipei Medical University, Taipei 110, Taiwan
- Research Center of Big Data and Meta-Analysis, Wan Fang Hospital, Taipei Medical University, Taipei 116, Taiwan
- Bruno Andreas Walther
- Deep Sea Ecology and Technology, Alfred-Wegener-Institut Helmholtz-Zentrum für Polar- und Meeresforschung, Am Handelshafen 12, D-27570 Bremerhaven, Germany
- Chih-Yang Yeh
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei 110, Taiwan
- Shabbir Seyed-Abdul
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei 110, Taiwan
- Yu-Chuan (Jack) Li
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei 110, Taiwan
- International Center for Health Information Technology (ICHIT), Taipei Medical University, Taipei 110, Taiwan
- Research Center of Big Data and Meta-Analysis, Wan Fang Hospital, Taipei Medical University, Taipei 116, Taiwan
- Department of Dermatology, Wan Fang Hospital, Taipei 116, Taiwan
- TMU Research Center of Cancer Translational Medicine, Taipei Medical University, Taipei 110, Taiwan
- Ming-Chin Lin
- Graduate Institute of Biomedical Informatics, College of Medical Science and Technology, Taipei Medical University, Taipei 110, Taiwan
- Department of Neurosurgery, Shuang Ho Hospital, Taipei Medical University, New Taipei City 23561, Taiwan
- Taipei Neuroscience Institute, Taipei Medical University, Taipei 11031, Taiwan
9
Sugimoto M, Koyama Y, Itoi T, Kawai T. Using texture and colour enhancement imaging to evaluate gastrointestinal diseases in clinical practice: a review. Ann Med 2022;54:3315-3332. [PMID: 36420822] [PMCID: PMC9704096] [DOI: 10.1080/07853890.2022.2147992]
Abstract
White-light imaging (WLI) is the most common endoscopic technique used for screening of gastrointestinal diseases. However, despite the advent of a new processor that offers sufficiently clear illumination and other advances in endoscopic instrumentation, WLI alone is inadequate for detecting all gastrointestinal diseases that present with abnormalities in mucosal discoloration or morphological changes to the mucosal surface. The recent development of image-enhanced endoscopy (IEE) has dramatically improved the detection of gastrointestinal diseases. Texture and colour enhancement imaging (TXI) is a new type of IEE that enhances brightness, surface irregularities such as elevations or depressions, and subtle colour changes. TXI offers two modes, mode 1 and mode 2, which can selectively enhance brightness in dark areas of an endoscopic image and subtle tissue differences such as slight morphological or colour changes, while preventing over-enhancement. Several clinical studies have investigated the efficacy of TXI for detecting and visualizing gastrointestinal diseases, including oesophageal squamous cell carcinoma (ESCC), Barrett's epithelium, gastric cancer, gastric mucosal atrophy, and intestinal metaplasia. Although TXI is often more useful than WLI for detecting and visualizing gastrointestinal diseases, it remains unclear whether TXI outperforms other IEEs with similar functions, such as narrow-band imaging (NBI), and whether the performance of TXI modes 1 and 2 is comparable. Large-scale prospective studies are therefore needed to compare the efficacy of TXI with that of WLI and other IEEs in patients undergoing screening endoscopy. Here, we review the characteristics and efficacy of TXI for the detection and visualization of gastrointestinal diseases.
Key messages: TXI mode 1 can improve the visibility of gastrointestinal diseases and qualitative diagnosis, especially for diseases associated with colour changes. The enhancement of texture and brightness with TXI mode 2 enables the detection of diseases and is ideal for first-pass screening of the gastrointestinal tract.
Affiliation(s)
- Mitsushige Sugimoto
- Department of Gastroenterological Endoscopy, Tokyo Medical University Hospital, Tokyo, Japan
- Yohei Koyama
- Department of Gastroenterology, Tokyo Medical University Hospital, Tokyo, Japan
- Takao Itoi
- Department of Gastroenterology, Tokyo Medical University Hospital, Tokyo, Japan
- Takashi Kawai
- Department of Gastroenterological Endoscopy, Tokyo Medical University Hospital, Tokyo, Japan