1. Clarke JA, Benning J, Isaacs J, Angell-Clarke S. A balance of clinical assessment and use of diagnostic imaging: A CT colonography comparative case report. Radiol Case Rep 2024;19:2751-2755. [PMID: 38680738; PMCID: PMC11047173; DOI: 10.1016/j.radcr.2024.03.066]
Abstract
Computed tomography colonography (CTC) is a non-invasive procedure that has replaced the barium enema. CTC uses helical images of a cleansed and gas-distended colon for the diagnosis of colonic neoplasms. This case study compares 2 patients, one with positive pathology (patient A) and a comparator (patient B) with similar pathology, to discuss possible treatment pathways. Patient A's CTC showed 2 polyps, of 6 mm and 10 mm, which the colorectal surgeons judged to need only follow-up. The comparator (patient B) displayed a similar polyp measuring 9 mm. In this case (patient B), there was mutual agreement with the surgeons for polypectomy, but without haematology involvement, which was atypical of the usual pathway. The surgeons did not see the 9 mm polyp at polypectomy, which could be due to observer error or radiology reporting error. Given that conventional colonoscopy is more sensitive in detecting polyps, a repeat of both tests could have confirmed the presence of the polyp; instead, the surgeons gave patient B a virtual appointment and requested a repeat CTC in 12 months. In colorectal medicine there can be variations in the treatment of patients with polyps. While a repeat of both tests could confirm the presence of the polyp in patient B, the surgeons' decisions regarding the patient's treatment reflected a balance of confidence in clinical assessment and use of diagnostic imaging, which can reduce unnecessary requests for and use of diagnostic tests.
Affiliation(s)
- Justin A. Clarke
- Ashford and St. Peter's Hospitals Radiology Department, Guilford Road, Chertsey, Surrey, UK
- Jeevon Benning
- Ashford and St. Peter's Hospitals Radiology Department, Guilford Road, Chertsey, Surrey, UK
- John Isaacs
- Ashford and St. Peter's Hospitals Research and Development Department, Guilford Road, Chertsey, Surrey, UK

2. Davila-Piñón P, Nogueira-Rodríguez A, Díez-Martín AI, Codesido L, Herrero J, Puga M, Rivas L, Sánchez E, Fdez-Riverola F, Glez-Peña D, Reboiro-Jato M, López-Fernández H, Cubiella J. Optical diagnosis in still images of colorectal polyps: comparison between expert endoscopists and PolyDeep, a Computer-Aided Diagnosis system. Front Oncol 2024;14:1393815. [PMID: 38846970; PMCID: PMC11153726; DOI: 10.3389/fonc.2024.1393815]
Abstract
Background: PolyDeep is a computer-aided detection and classification (CADe/x) system trained to detect and classify polyps. During colonoscopy, CADe/x systems help endoscopists predict the histology of colonic lesions. Objective: To compare the diagnostic performance of PolyDeep and expert endoscopists for the optical diagnosis of colorectal polyps on still images. Methods: PolyDeep Image Classification (PIC) is an in vitro diagnostic test study. The PIC database contains NBI images of 491 colorectal polyps with histological diagnosis. We evaluated the diagnostic performance of PolyDeep and four expert endoscopists for neoplasia (adenoma, sessile serrated lesion, traditional serrated adenoma) and adenoma characterization and compared them with the McNemar test. Receiver operating characteristic curves were constructed to assess overall discriminatory ability, comparing the area under the curve of endoscopists and PolyDeep with the chi-square homogeneity areas test. Results: The diagnostic performance of the endoscopists and PolyDeep in the characterization of neoplasia is similar in terms of sensitivity (PolyDeep: 89.05%; E1: 91.23%, p=0.5; E2: 96.11%, p<0.001; E3: 86.65%, p=0.3; E4: 91.26%, p=0.3) and specificity (PolyDeep: 35.53%; E1: 33.80%, p=0.8; E2: 34.72%, p=1; E3: 39.24%, p=0.8; E4: 46.84%, p=0.2). The overall discriminative ability also showed no statistically significant differences (PolyDeep: 0.623; E1: 0.625, p=0.8; E2: 0.654, p=0.2; E3: 0.629, p=0.9; E4: 0.690, p=0.09). In the optical diagnosis of adenomatous polyps, we found that PolyDeep had a significantly higher sensitivity and a significantly lower specificity. The overall discriminative ability for adenomatous lesions of expert endoscopists is significantly higher than that of PolyDeep (PolyDeep: 0.582; E1: 0.685, p<0.001; E2: 0.677, p<0.0001; E3: 0.658, p<0.01; E4: 0.694, p<0.0001).
Conclusion: PolyDeep and endoscopists have similar diagnostic performance in the optical diagnosis of neoplastic lesions. However, endoscopists have better overall discriminatory ability than PolyDeep in the optical diagnosis of adenomatous polyps.
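The abstract names the McNemar test for these paired sensitivity/specificity comparisons. As a minimal sketch, an exact two-sided binomial variant of McNemar's test can be computed from the discordant pairs alone; the counts below are invented for illustration, not the study's data, and the exact-binomial form is an assumption (the paper does not specify which variant was used).

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Exact two-sided McNemar test on the discordant pairs.

    b: cases where reader 1 was correct and reader 2 wrong;
    c: the reverse. Concordant pairs do not affect the test.
    """
    n = b + c
    k = min(b, c)
    # Two-sided p-value: double the binomial tail P(X <= k) under p = 0.5.
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Hypothetical counts: among discordant images, the CADx system was right
# on 10 where the endoscopist was wrong, and wrong on 1 (assumed numbers).
p_value = mcnemar_exact(1, 10)
print(f"p = {p_value:.4f}")
```

A small p-value here would indicate the two readers' error patterns genuinely differ; with balanced discordant counts the test correctly returns p = 1.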
Affiliation(s)
- Pedro Davila-Piñón
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Fundación Pública Galega de Investigación Biomédica Galicia Sur, Complexo Hospitalario Universitario de Ourense, Sergas, Ourense, Spain
- Alba Nogueira-Rodríguez
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
- Astrid Irene Díez-Martín
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Fundación Pública Galega de Investigación Biomédica Galicia Sur, Complexo Hospitalario Universitario de Ourense, Sergas, Ourense, Spain
- Laura Codesido
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Fundación Pública Galega de Investigación Biomédica Galicia Sur, Complexo Hospitalario Universitario de Ourense, Sergas, Ourense, Spain
- Jesús Herrero
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
- Manuel Puga
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
- Laura Rivas
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
- Eloy Sánchez
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain
- Florentino Fdez-Riverola
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
- Daniel Glez-Peña
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
- Miguel Reboiro-Jato
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
- Hugo López-Fernández
- Department of Computer Science, Escuela Superior de Ingenieria Informática (ESEI), CINBIO, University of Vigo, Ourense, Spain
- Next Generation Computer Systems Group (SING) Research Group, Galicia Sur Health Research Institute (IIS Galicia Sur), Ourense, Spain
- Joaquín Cubiella
- Research Group in Gastrointestinal Oncology Ourense, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Ourense, Spain
- Department of Gastroenterology, Hospital Universitario de Ourense, Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBEREHD), Ourense, Spain

3. Wu R, Qin K, Fang Y, Xu Y, Zhang H, Li W, Luo X, Han Z, Liu S, Li Q. Application of the convolution neural network in determining the depth of invasion of gastrointestinal cancer: a systematic review and meta-analysis. J Gastrointest Surg 2024;28:538-547. [PMID: 38583908; DOI: 10.1016/j.gassur.2023.12.029]
Abstract
BACKGROUND: With the development of endoscopic technology, endoscopic submucosal dissection (ESD) has been widely used in the treatment of gastrointestinal tumors. It is necessary to evaluate the depth of tumor invasion before the application of ESD. The convolutional neural network (CNN) is a type of artificial intelligence that has the potential to assist in classifying the depth of invasion in endoscopic images. This meta-analysis aimed to evaluate the performance of CNNs in determining the depth of invasion of gastrointestinal tumors. METHODS: A search of PubMed, Web of Science, and SinoMed was performed to collect the original publications on the use of CNNs in determining the depth of invasion of gastrointestinal neoplasms. Pooled sensitivity and specificity were calculated using an exact binomial rendition of the bivariate mixed-effects regression model. I2 was used for the evaluation of heterogeneity. RESULTS: A total of 17 articles were included; the pooled sensitivity was 84% (95% CI, 0.81-0.88), specificity was 91% (95% CI, 0.85-0.94), and the area under the curve (AUC) was 0.93 (95% CI, 0.90-0.95). The performance of CNNs was significantly better than that of endoscopists (AUC: 0.93 vs 0.83; P = .0005). CONCLUSION: Our review revealed that the CNN is one of the most effective endoscopy-based methods for evaluating the depth of invasion of early gastrointestinal tumors, and it has the potential to serve as a valuable tool for helping clinical endoscopists decide whether a lesion is amenable to endoscopic treatment.
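The I2 statistic used above to quantify heterogeneity follows a simple closed form from Cochran's Q and its degrees of freedom. A minimal sketch; the Q value below is an assumed example, not the meta-analysis's actual statistic (only the study count of 17 comes from the abstract):

```python
def i_squared(q: float, n_studies: int) -> float:
    """Higgins' I^2 (%): the share of total variability attributable to
    between-study heterogeneity rather than chance. df = n_studies - 1."""
    df = n_studies - 1
    if q <= df:
        return 0.0  # no heterogeneity beyond what chance would produce
    return 100.0 * (q - df) / q

# Assumed example: Q = 40 across 17 studies (df = 16) -> I^2 = 60.0%.
print(f"I^2 = {i_squared(40.0, 17):.1f}%")
```

Values near 0% suggest the pooled estimate summarizes consistent studies; values above roughly 50% flag substantial heterogeneity.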
Affiliation(s)
- Ruo Wu
- Nanfang Hospital (The First School of Clinical Medicine), Southern Medical University, Guangzhou, Guangdong, China
- Kaiwen Qin
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China
- Yuxin Fang
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China
- Yuyuan Xu
- Department of Hepatology Unit and Infectious Diseases, State Key Laboratory of Organ Failure Research, Guangdong Provincial Key Laboratory of Viral Hepatitis Research, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China
- Haonan Zhang
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China
- Wenhua Li
- Nanfang Hospital (The First School of Clinical Medicine), Southern Medical University, Guangzhou, Guangdong, China
- Xiaobei Luo
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China
- Zelong Han
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China
- Side Liu
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China; Pazhou Lab, Guangzhou, Guangdong, China
- Qingyuan Li
- Department of Gastroenterology, Guangdong Provincial Key Laboratory of Gastroenterology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong, China.

4. Murray J, Heng D, Lygate A, Porto L, Abade A, Manica S, Franco A. Applying artificial intelligence to determination of legal age of majority from radiographic data. Morphologie 2024;108:100723. [PMID: 37897941; DOI: 10.1016/j.morpho.2023.100723]
Abstract
Forensic odontologists use biological patterns to estimate chronological age for the judicial system. The age of majority is a legally significant threshold with a limited set of reliable oral landmarks. Currently, experts rely on the questionable development of third molars to assess whether litigants can be prosecuted as legal adults. Identification of novel patterns may illuminate features more dependably indicative of chronological age which have, until now, remained unseen. Unfortunately, biased perceptions and limited cognitive capacity compromise the ability of researchers to notice new patterns. The present study demonstrates how artificial intelligence can break through identification barriers and generate new estimation modalities. A convolutional neural network was trained with 4003 panoramic radiographs to sort subjects into 'under-18' and 'over-18' age categories. The resultant architecture identified legal adults with a high predictive accuracy, balanced across precision, specificity, and recall. Moving forward, AI-based methods could improve courtroom efficiency, serve as automated assessment methods, and contribute to our understanding of biological ageing.
Affiliation(s)
- J Murray
- Department of Forensic Odontology, University of Dundee, Nethergate, Dundee DD1 4HN, UK.
- D Heng
- Department of Forensic Odontology, University of Dundee, Nethergate, Dundee DD1 4HN, UK
- A Lygate
- Department of Forensic Odontology, University of Dundee, Nethergate, Dundee DD1 4HN, UK
- L Porto
- Department of Mechanical Engineering, University of Brasilia, Federal District 70910-900, Brazil
- A Abade
- Departmento de Computacao, Instituto Federal de Educacao, Ciencie e Tecnologia de Mato Grosso, Cuiaba, Mato Grosso, Brazil
- S Manica
- Department of Forensic Odontology, University of Dundee, Nethergate, Dundee DD1 4HN, UK
- A Franco
- Department of Forensic Odontology, University of Dundee, Nethergate, Dundee DD1 4HN, UK; Division of Forensic Dentistry, Faculdade São Leopoldo Mandic, Campinas, Brazil

5. Kim DK, Kim BS, Kim YJ, Kim S, Yoon D, Lee DK, Jeong J, Jo YH. Development and validation of an artificial intelligence algorithm for detecting vocal cords in video laryngoscopy. Medicine (Baltimore) 2023;102:e36761. [PMID: 38134083; PMCID: PMC10735139; DOI: 10.1097/md.0000000000036761]
Abstract
Airway procedures in life-threatening situations are vital for saving lives. Video laryngoscopy (VL) is commonly performed during endotracheal intubation (ETI) in the emergency department. Artificial intelligence (AI) is widely used in the medical field, particularly to detect anatomical structures. This study aimed to develop an AI algorithm that detects the vocal cords in VL images acquired during emergent situations. This retrospective study used VL images acquired in the emergency department to facilitate ETI. The vocal cord in each image was labeled with a ground-truth bounding box. The dataset was divided into training and validation datasets. The algorithm was developed from the training dataset using the YOLOv4 model, and its performance was evaluated using a test set, which was further divided by the specific environments during ETI for clinical subgroup analysis. In total, 20,161 images from 84 patients were used in this study; 10,287, 5,766, and 4,108 images were used for the model training, validation, and test sets, respectively. The developed algorithm achieved an F1 score of 0.906, sensitivity of 0.963, and specificity of 0.842 in the validation set; performance on the test set was an F1 score of 0.808, sensitivity of 0.823, and specificity of 0.804. We developed and validated a high-performing AI algorithm to detect the vocal cords in VL. The algorithm can be used to locate the vocal cords and help ensure safe ETI.
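The F1, sensitivity, and specificity figures above combine confusion-matrix counts in a standard way. A minimal sketch; the per-image counts below are invented for illustration and are not the study's data:

```python
def detection_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity (recall), specificity, precision, and F1 from a
    per-image confusion matrix of a detector."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    prec = tp / (tp + fp)
    f1 = 2 * prec * sens / (prec + sens)  # harmonic mean of precision and recall
    return {"sensitivity": sens, "specificity": spec, "precision": prec, "f1": f1}

# Assumed counts for a small hypothetical test set (not the paper's data):
m = detection_metrics(tp=80, fp=10, fn=20, tn=90)
print({k: round(v, 3) for k, v in m.items()})
```

Note that F1 ignores true negatives entirely, which is why a detector can score well on F1 while specificity tells a different story.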
Affiliation(s)
- Dae Kon Kim
- Department of Emergency Medicine, Seoul National University Bundang Hospital, Seongnam, Republic of Korea
- Seoul National University, College of Medicine, Seoul, Republic of Korea
- Byeong Soo Kim
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, Republic of Korea
- Yu Jin Kim
- Department of Emergency Medicine, Seoul National University Bundang Hospital, Seongnam, Republic of Korea
- Seoul National University, College of Medicine, Seoul, Republic of Korea
- Sungwan Kim
- Department of Biomedical Engineering, Seoul National University College of Medicine, Seoul, Republic of Korea
- Institute of Bioengineering, Seoul National University, Seoul, Republic of Korea
- Dan Yoon
- Interdisciplinary Program in Bioengineering, Graduate School, Seoul National University, Seoul, Republic of Korea
- Dong Keon Lee
- Department of Emergency Medicine, Seoul National University Bundang Hospital, Seongnam, Republic of Korea
- Seoul National University, College of Medicine, Seoul, Republic of Korea
- Joo Jeong
- Department of Emergency Medicine, Seoul National University Bundang Hospital, Seongnam, Republic of Korea
- Seoul National University, College of Medicine, Seoul, Republic of Korea
- You Hwan Jo
- Department of Emergency Medicine, Seoul National University Bundang Hospital, Seongnam, Republic of Korea
- Seoul National University, College of Medicine, Seoul, Republic of Korea

6. Young E, Edwards L, Singh R. The Role of Artificial Intelligence in Colorectal Cancer Screening: Lesion Detection and Lesion Characterization. Cancers (Basel) 2023;15:5126. [PMID: 37958301; PMCID: PMC10647850; DOI: 10.3390/cancers15215126]
Abstract
Colorectal cancer remains a leading cause of cancer-related morbidity and mortality worldwide, despite the widespread uptake of population surveillance strategies. This is in part due to the persistent development of 'interval colorectal cancers', where patients develop colorectal cancer despite appropriate surveillance intervals, implying pre-malignant polyps were not resected at a prior colonoscopy. Multiple techniques have been developed to improve the sensitivity and accuracy of lesion detection and characterisation in an effort to improve the efficacy of colorectal cancer screening, thereby reducing the incidence of interval colorectal cancers. This article presents a comprehensive review of the transformative role of artificial intelligence (AI), which has recently emerged as one such solution for improving the quality of screening and surveillance colonoscopy. Firstly, AI-driven algorithms demonstrate remarkable potential in addressing the challenge of overlooked polyps, particularly polyp subtypes infamous for escaping human detection because of their inconspicuous appearance. Secondly, AI empowers gastroenterologists without exhaustive training in advanced mucosal imaging to characterise polyps with accuracy similar to that of expert interventionalists, reducing the dependence on pathologic evaluation and guiding appropriate resection techniques or referrals for more complex resections. AI in colonoscopy holds the potential to advance the detection and characterisation of polyps, addressing current limitations and improving patient outcomes. The integration of AI technologies into routine colonoscopy represents a promising step towards more effective colorectal cancer screening and prevention.
Affiliation(s)
- Edward Young
- Faculty of Health and Medical Sciences, University of Adelaide, Lyell McEwin Hospital, Haydown Rd, Elizabeth Vale, SA 5112, Australia
- Louisa Edwards
- Faculty of Health and Medical Sciences, University of Adelaide, Queen Elizabeth Hospital, Port Rd, Woodville South, SA 5011, Australia
- Rajvinder Singh
- Faculty of Health and Medical Sciences, University of Adelaide, Lyell McEwin Hospital, Haydown Rd, Elizabeth Vale, SA 5112, Australia

7. Keshtkar K, Reza Safarpour A, Heshmat R, Sotoudehmanesh R, Keshtkar A. A Systematic Review and Meta-analysis of Convolutional Neural Network in the Diagnosis of Colorectal Polyps and Cancer. Turk J Gastroenterol 2023;34:985-997. [PMID: 37681266; PMCID: PMC10645297; DOI: 10.5152/tjg.2023.22491]
Abstract
Convolutional neural networks are a class of deep neural networks used for different clinical purposes, including improving the detection rate of colorectal lesions. This systematic review and meta-analysis aimed to assess the performance of convolutional neural network-based models in the detection or classification of colorectal polyps and colorectal cancer. A systematic search was performed in MEDLINE, SCOPUS, Web of Science, and other related databases. The performance measures of the convolutional neural network models in the detection of colorectal polyps and colorectal cancer were calculated under the 2 scenarios of best and worst accuracy. Stata and R software were used for conducting the meta-analysis. From 3368 searched records, 24 primary studies were included. The sensitivity and specificity of convolutional neural network models in predicting colorectal polyps in the worst and best scenarios ranged from 84.7% to 91.6% and from 86.0% to 93.8%, respectively. These values in predicting colorectal cancer varied between 93.2% and 94.1% and between 94.6% and 97.7%. The positive and negative likelihood ratios in these scenarios varied between 6.2 and 14.5 and between 0.09 and 0.17, respectively, in predicting colorectal polyps, and between 17.1 and 41.2 and between 0.06 and 0.07 in predicting colorectal cancer. The diagnostic odds ratio and accuracy of convolutional neural network models in predicting colorectal polyps in the worst and best scenarios ranged between 36 and 162 and between 80.5% and 88.6%, respectively. These values in predicting colorectal cancer varied between 239.63 and 677.47 and between 88.2% and 96.4%. The area under the receiver operating characteristic curve varied between 0.92 and 0.97 in the worst and best scenarios for colorectal polyps, and between 0.98 and 0.99 for colorectal cancer.
Convolutional neural network-based models showed acceptable accuracy in detecting colorectal polyps and colorectal cancer.
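The likelihood ratios and diagnostic odds ratio reported above follow directly from sensitivity and specificity. A minimal sketch; the inputs below are round illustrative values, not the review's pooled estimates:

```python
def likelihood_ratios(sens: float, spec: float) -> tuple:
    """Positive/negative likelihood ratios and diagnostic odds ratio
    derived from a test's sensitivity and specificity."""
    lr_pos = sens / (1 - spec)   # how much a positive result raises the odds of disease
    lr_neg = (1 - sens) / spec   # how much a negative result lowers them
    dor = lr_pos / lr_neg        # single summary of overall discriminative power
    return lr_pos, lr_neg, dor

# Illustrative inputs (assumed): sensitivity 90%, specificity 85%.
lr_pos, lr_neg, dor = likelihood_ratios(0.90, 0.85)
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.3f}, DOR = {dor:.1f}")
```

As a rule of thumb, LR+ above 10 or LR- below 0.1 shifts post-test probability substantially, which is why the review reports these alongside raw sensitivity and specificity.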
Affiliation(s)
- Kamyab Keshtkar
- University of Tehran School of Electrical and Computer Engineering, Tehran, Iran
- Ali Reza Safarpour
- Gastroenterohepatology Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
- Ramin Heshmat
- Chronic Diseases Research Center, Endocrinology and Metabolism Population Sciences Institute, Tehran University of Medical Sciences, Tehran, Iran
- Rasoul Sotoudehmanesh
- Department of Gastroenterology, Digestive Disease Research Center, Digestive Disease Research Institute, Tehran University of Medical Sciences, Tehran, Iran
- Abbas Keshtkar
- Department of Health Sciences Education Development, Tehran University of Medical Sciences School of Public Health, Tehran, Iran

8. Sánchez-Peralta LF, Glover B, Saratxaga CL, Ortega-Morán JF, Nazarian S, Picón A, Pagador JB, Sánchez-Margallo FM. Clinical Validation Benchmark Dataset and Expert Performance Baseline for Colorectal Polyp Localization Methods. J Imaging 2023;9:167. [PMID: 37754931; PMCID: PMC10532435; DOI: 10.3390/jimaging9090167]
Abstract
Colorectal cancer is one of the leading causes of death worldwide, but fortunately early detection greatly increases survival rates, with the adenoma detection rate serving as a surrogate marker for colonoscopy quality. Artificial intelligence and deep learning methods have been applied with great success to improve polyp detection and localization and, therefore, the adenoma detection rate. In this regard, a comparison with clinical experts is required to prove the added value of such systems. Nevertheless, there is no standardized comparison in a laboratory setting before clinical validation. The ClinExpPICCOLO dataset comprises 65 unedited endoscopic images representative of the clinical setting. They include white light imaging and narrow band imaging; one third of the images contain a lesion but, unlike other public datasets, the lesion does not appear well-centered in the image. Together with the dataset, an expert clinical performance baseline has been established from the performance of 146 gastroenterologists, who were asked to locate the lesions in the selected images. Results show statistically significant differences between experience groups. Expert gastroenterologists' accuracy was 77.74%, while sensitivity and specificity were 86.47% and 74.33%, respectively. These values can be taken as minimum targets for a DL method before a clinical trial in the hospital setting.
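Accuracy, sensitivity, and specificity are linked through lesion prevalence: accuracy is the prevalence-weighted mix of sensitivity (on lesion images) and specificity (on lesion-free images). A minimal sketch; the one-third prevalence matches the dataset description above, but the sensitivity/specificity values are invented:

```python
def accuracy_from_rates(sens: float, spec: float, prevalence: float) -> float:
    """Overall accuracy as the prevalence-weighted combination of
    sensitivity (positive images) and specificity (negative images)."""
    return prevalence * sens + (1 - prevalence) * spec

# One third of the images contain a lesion; performance values are assumed.
acc = accuracy_from_rates(sens=0.90, spec=0.70, prevalence=1 / 3)
print(f"accuracy = {acc:.1%}")
```

This decomposition explains why, with lesions in only a third of the images, overall accuracy sits closer to specificity than to sensitivity.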
Affiliation(s)
- Luisa F. Sánchez-Peralta
- Jesús Usón Minimally Invasive Surgery Centre, E-10071 Cáceres, Spain
- AI4polypNET Thematic Network, E-08193 Barcelona, Spain
- Ben Glover
- Imperial College London, London SW7 2BU, UK
- Cristina L. Saratxaga
- TECNALIA, Basque Research and Technology Alliance (BRTA), E-48160 Derio, Spain
- Juan Francisco Ortega-Morán
- Jesús Usón Minimally Invasive Surgery Centre, E-10071 Cáceres, Spain
- AI4polypNET Thematic Network, E-08193 Barcelona, Spain
- Artzai Picón
- TECNALIA, Basque Research and Technology Alliance (BRTA), E-48160 Derio, Spain
- Department of Automatic Control and Systems Engineering, University of the Basque Country, E-48013 Bilbao, Spain
- J. Blas Pagador
- Jesús Usón Minimally Invasive Surgery Centre, E-10071 Cáceres, Spain
- AI4polypNET Thematic Network, E-08193 Barcelona, Spain
- Francisco M. Sánchez-Margallo
- Jesús Usón Minimally Invasive Surgery Centre, E-10071 Cáceres, Spain
- AI4polypNET Thematic Network, E-08193 Barcelona, Spain
- RICORS-TERAV Network, ISCIII, E-28029 Madrid, Spain
- Centro de Investigación Biomédica en Red de Enfermedades Cardiovasculares (CIBERCV), Instituto de Salud Carlos III, E-28029 Madrid, Spain

9. Galati JS, Lin K, Gross SA. Recent advances in devices and technologies that might prove revolutionary for colonoscopy procedures. Expert Rev Med Devices 2023;20:1087-1103. [PMID: 37934873; DOI: 10.1080/17434440.2023.2280773]
Abstract
INTRODUCTION: Colorectal cancer (CRC) is the third most common malignancy and the second leading cause of cancer-related mortality in the world. Adenoma detection rate (ADR), a quality indicator for colonoscopy, has gained prominence as it is inversely related to CRC incidence and mortality. As such, recent efforts have focused on developing novel colonoscopy devices and technologies to improve ADR. AREAS COVERED: The main objective of this paper is to provide an overview of advancements in the fields of colonoscopy mechanical attachments, artificial intelligence-assisted colonoscopy, and colonoscopy optical enhancements with respect to ADR. We accomplished this by performing a comprehensive search of multiple electronic databases from inception to September 2023. This review is intended to be an introduction to colonoscopy devices and technologies. EXPERT OPINION: Numerous mechanical attachments and optical enhancements have been developed that have the potential to improve ADR, and AI has gone from being an inaccessible concept to a feasible means of improving ADR. While these advances are exciting and portend a change in what will be considered standard colonoscopy, they continue to require refinement. Future studies should focus on combining modalities to further improve ADR and on exploring the use of these technologies in other facets of colonoscopy.
Affiliation(s)
- Jonathan S Galati
- Department of Internal Medicine, NYU Langone Health, New York, NY, USA
- Kevin Lin
- Department of Internal Medicine, NYU Langone Health, New York, NY, USA
- Seth A Gross
- Division of Gastroenterology, NYU Langone Health, New York, NY, USA

10. Cherubini A, Dinh NN. A Review of the Technology, Training, and Assessment Methods for the First Real-Time AI-Enhanced Medical Device for Endoscopy. Bioengineering (Basel) 2023;10:404. [PMID: 37106592; PMCID: PMC10136070; DOI: 10.3390/bioengineering10040404]
Abstract
Artificial intelligence (AI) has the potential to assist in endoscopy and improve decision making, particularly in situations where humans may make inconsistent judgments. Performance assessment of medical devices operating in this context is a complex combination of bench tests, randomized controlled trials, and studies of the interaction between physicians and AI. We review the published scientific evidence on GI Genius, the first AI-powered medical device for colonoscopy to enter the market and the device most widely tested by the scientific community. We provide an overview of its technical architecture, AI training and testing strategies, and regulatory path. In addition, we discuss the strengths and limitations of the current platform and its potential impact on clinical practice. The details of the algorithm architecture and the data used to train the AI device have been disclosed to the scientific community in the pursuit of transparent AI. Overall, the first AI-enabled medical device for real-time video analysis represents a significant advancement in the use of AI for endoscopy and has the potential to improve the accuracy and efficiency of colonoscopy procedures.
Affiliation(s)
- Andrea Cherubini: Cosmo Intelligent Medical Devices, D02KV60 Dublin, Ireland; Milan Center for Neuroscience, University of Milano-Bicocca, 20126 Milano, Italy
- Nhan Ngo Dinh: Cosmo Intelligent Medical Devices, D02KV60 Dublin, Ireland

11
Zimmermann-Fraedrich K, Rösch T. Artificial intelligence and the push for small adenomas: all we need? Endoscopy 2023; 55:320-323. [PMID: 36882088] [DOI: 10.1055/a-2038-7078]
Affiliation(s)
- Thomas Rösch: Department of Interdisciplinary Endoscopy, University Hospital Hamburg-Eppendorf, Hamburg, Germany

12
Taber P, Armin JS, Orozco G, Del Fiol G, Erdrich J, Kawamoto K, Israni ST. Artificial Intelligence and Cancer Control: Toward Prioritizing Justice, Equity, Diversity, and Inclusion (JEDI) in Emerging Decision Support Technologies. Curr Oncol Rep 2023; 25:387-424. [PMID: 36811808] [DOI: 10.1007/s11912-023-01376-7]
Abstract
PURPOSE FOR REVIEW: This perspective piece has two goals: first, to describe how artificial intelligence-based applications for cancer control may affect health inequities or disparities; and second, to report on a review of systematic reviews and meta-analyses of artificial intelligence-based tools for cancer control, to ascertain the extent to which discussions of justice, equity, diversity, inclusion, or health disparities appear in syntheses of the field's best evidence.
RECENT FINDINGS: We found that, while a significant proportion of existing syntheses of research on AI-based tools in cancer control use formal bias assessment tools, the fairness or equitability of models is not yet systematically analyzable across studies. Issues related to real-world use of AI-based tools for cancer control, such as workflow considerations, measures of usability and acceptance, and tool architecture, are more visible in the literature but are still addressed in only a minority of reviews. Artificial intelligence is poised to bring significant benefits to a wide range of applications in cancer control, but more thorough and standardized evaluations and reporting of model fairness are required to build the evidence base for AI-based tool design in cancer care and to ensure that these emerging technologies promote equitable healthcare.
Affiliation(s)
- Peter Taber: Department of Biomedical Informatics, University of Utah School of Medicine, 421 Wakara Way, Salt Lake City, UT 84108, USA
- Julie S Armin: Department of Family and Community Medicine, University of Arizona College of Medicine, Tucson, AZ, USA
- Guilherme Del Fiol: Department of Biomedical Informatics, University of Utah School of Medicine, 421 Wakara Way, Salt Lake City, UT 84108, USA
- Jennifer Erdrich: Division of Surgical Oncology, University of Arizona College of Medicine, Tucson, AZ, USA
- Kensaku Kawamoto: Department of Biomedical Informatics, University of Utah School of Medicine, 421 Wakara Way, Salt Lake City, UT 84108, USA

13
Artificial intelligence-assisted optical diagnosis for the resect-and-discard strategy in clinical practice: the Artificial intelligence BLI Characterization (ABC) study. Endoscopy 2023; 55:14-22. [PMID: 35562098] [DOI: 10.1055/a-1852-0330]
Abstract
BACKGROUND: Optical diagnosis of colonic polyps is poorly reproducible outside of high-volume referral centers. The present study aimed to assess whether real-time artificial intelligence (AI)-assisted optical diagnosis is accurate enough to implement the leave-in-situ strategy for diminutive (≤5 mm) rectosigmoid polyps (DRSPs).
METHODS: Consecutive colonoscopy outpatients with ≥1 DRSP were included. DRSPs were categorized as adenomas or nonadenomas by the endoscopists, who had differing expertise in optical diagnosis, with the assistance of a real-time AI system (CAD-EYE). The primary end point was ≥90% negative predictive value (NPV) for adenomatous histology in high-confidence AI-assisted optical diagnosis of DRSPs (Preservation and Incorporation of Valuable endoscopic Innovations [PIVI-1] threshold), with histopathology as the reference standard. The agreement between optical- and histology-based post-polypectomy surveillance intervals (≥90%; PIVI-2 threshold) was also calculated according to European Society of Gastrointestinal Endoscopy (ESGE) and United States Multi-Society Task Force (USMSTF) guidelines.
RESULTS: Overall, 596 DRSPs were retrieved for histology in 389 patients; an AI-assisted high-confidence optical diagnosis was made in 92.3%. The NPV of AI-assisted optical diagnosis for DRSPs (PIVI-1) was 91.0% (95%CI 87.1%-93.9%). The PIVI-2 threshold was met in 97.4% (95%CI 95.7%-98.9%) and 92.6% (95%CI 90.0%-95.2%) of patients according to ESGE and USMSTF, respectively. AI-assisted optical diagnosis accuracy was significantly lower for nonexperts (82.3%, 95%CI 76.4%-87.3%) than for experts (91.9%, 95%CI 88.5%-94.5%); however, nonexperts quickly approached the performance levels of experts over time.
CONCLUSION: AI-assisted optical diagnosis meets the required PIVI thresholds. This does not, however, offset the need for endoscopists' high confidence and expertise. The AI system seems to be useful, especially for nonexperts.
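The PIVI-1 criterion quoted in this abstract is a threshold on the negative predictive value, a simple function of the diagnostic 2x2 table. A minimal sketch (our illustration with hypothetical counts, not the study's analysis code):

```python
# Illustrative sketch (not the study's code): the negative predictive
# value (NPV) used for the PIVI-1 threshold, computed from a hypothetical
# 2x2 table of optical diagnosis vs. histology.

def npv(true_negatives: int, false_negatives: int) -> float:
    """NPV = TN / (TN + FN): of polyps called non-adenomatous,
    the fraction histology confirms as non-adenomatous."""
    return true_negatives / (true_negatives + false_negatives)

# Hypothetical counts chosen only to illustrate the >= 90% PIVI-1 cutoff.
value = npv(true_negatives=273, false_negatives=27)
print(f"NPV = {value:.1%}")  # prints NPV = 91.0%
```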
14
Galati JS, Duve RJ, O'Mara M, Gross SA. Artificial intelligence in gastroenterology: A narrative review. Artif Intell Gastroenterol 2022; 3:117-141. [DOI: 10.35712/aig.v3.i5.117]
Abstract
Artificial intelligence (AI) is a complex concept, broadly defined in medicine as the development of computer systems to perform tasks that require human intelligence. It has the capacity to revolutionize medicine by increasing efficiency, expediting data and image analysis and identifying patterns, trends and associations in large datasets. Within gastroenterology, recent research efforts have focused on using AI in esophagogastroduodenoscopy, wireless capsule endoscopy (WCE) and colonoscopy to assist in diagnosis, disease monitoring, lesion detection and therapeutic intervention. The main objective of this narrative review is to provide a comprehensive overview of the research being performed within gastroenterology on AI in esophagogastroduodenoscopy, WCE and colonoscopy.
Affiliation(s)
- Jonathan S Galati: Department of Medicine, NYU Langone Health, New York, NY 10016, United States
- Robert J Duve: Department of Internal Medicine, Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, Buffalo, NY 14203, United States
- Matthew O'Mara: Division of Gastroenterology, NYU Langone Health, New York, NY 10016, United States
- Seth A Gross: Division of Gastroenterology, NYU Langone Health, New York, NY 10016, United States

15
Young EJ, Rajandran A, Philpott HL, Sathananthan D, Hoile SF, Singh R. Mucosal imaging in colon polyps: New advances and what the future may hold. World J Gastroenterol 2022; 28:6632-6661. [PMID: 36620337] [PMCID: PMC9813932] [DOI: 10.3748/wjg.v28.i47.6632]
Abstract
An expanding range of advanced mucosal imaging technologies have been developed with the goal of improving the detection and characterisation of lesions in the gastrointestinal tract. Many technologies have targeted colorectal neoplasia given the potential for intervention prior to the development of invasive cancer in the setting of widespread surveillance programs. Improvement in adenoma detection reduces miss rates and prevents interval cancer development. Advanced imaging technologies aim to enhance detection without significantly increasing procedural time. Accurate polyp characterisation guides resection techniques for larger polyps, as well as providing the platform for the “resect and discard” and “do not resect” strategies for small and diminutive polyps. This review aims to collate and summarise the evidence regarding these technologies to guide colonoscopic practice in both interventional and non-interventional endoscopists.
Affiliation(s)
- Edward John Young: Department of Gastroenterology, Lyell McEwin Hospital, Northern Adelaide Local Health Network, Elizabeth Vale 5031, South Australia, Australia; Faculty of Health and Medical Sciences, University of Adelaide, Adelaide 5000, South Australia, Australia
- Arvind Rajandran: Department of Gastroenterology, Lyell McEwin Hospital, Northern Adelaide Local Health Network, Elizabeth Vale 5031, South Australia, Australia
- Hamish Lachlan Philpott: Department of Gastroenterology, Lyell McEwin Hospital, Northern Adelaide Local Health Network, Elizabeth Vale 5031, South Australia, Australia; Faculty of Health and Medical Sciences, University of Adelaide, Adelaide 5000, South Australia, Australia
- Dharshan Sathananthan: Department of Gastroenterology, Lyell McEwin Hospital, Northern Adelaide Local Health Network, Elizabeth Vale 5031, South Australia, Australia; Faculty of Health and Medical Sciences, University of Adelaide, Adelaide 5000, South Australia, Australia
- Sophie Fenella Hoile: Department of Gastroenterology, Lyell McEwin Hospital, Northern Adelaide Local Health Network, Elizabeth Vale 5031, South Australia, Australia; Faculty of Health and Medical Sciences, University of Adelaide, Adelaide 5000, South Australia, Australia
- Rajvinder Singh: Department of Gastroenterology, Lyell McEwin Hospital, Northern Adelaide Local Health Network, Elizabeth Vale 5031, South Australia, Australia; Faculty of Health and Medical Sciences, University of Adelaide, Adelaide 5000, South Australia, Australia

16
Zhu PS, Zhang YR, Ren JY, Li QL, Chen M, Sang T, Li WX, Li J, Cui XW. Ultrasound-based deep learning using the VGGNet model for the differentiation of benign and malignant thyroid nodules: A meta-analysis. Front Oncol 2022; 12:944859. [PMID: 36249056] [PMCID: PMC9554631] [DOI: 10.3389/fonc.2022.944859]
Abstract
Objective: The aim of this study was to evaluate the accuracy of deep learning using the convolutional neural network VGGNet model in distinguishing benign from malignant thyroid nodules based on ultrasound images.
Methods: Relevant studies were selected from the PubMed, Embase, Cochrane Library, China National Knowledge Infrastructure (CNKI), and Wanfang databases that used the VGGNet convolutional neural network model to classify benign and malignant thyroid nodules on ultrasound images. Cytology and pathology were used as the gold standards. Reporting eligibility and risk of bias were assessed using the QUADAS-2 tool, and the diagnostic accuracy of the VGGNet model was analyzed with pooled sensitivity, pooled specificity, the diagnostic odds ratio, and the area under the curve.
Results: A total of 11 studies were included in this meta-analysis. The overall estimates of sensitivity and specificity were 0.87 [95% CI (0.83, 0.91)] and 0.85 [95% CI (0.79, 0.90)], respectively. The diagnostic odds ratio was 38.79 [95% CI (22.49, 66.91)]. The area under the curve was 0.93 [95% CI (0.90, 0.95)]. No obvious publication bias was found.
Conclusion: Deep learning with the VGGNet convolutional neural network model showed good diagnostic performance in distinguishing benign from malignant thyroid nodules on ultrasound images.
Systematic Review Registration: https://www.crd.york.ac.uk/prospero, identifier CRD42022336701.
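For readers unfamiliar with the diagnostic odds ratio reported here, it can be derived from sensitivity and specificity. A minimal sketch (our illustration; the paper's pooled estimates come from a formal meta-analytic model, so this simple formula only approximates the reported DOR):

```python
def diagnostic_odds_ratio(sensitivity: float, specificity: float) -> float:
    """DOR = (sens / (1 - sens)) * (spec / (1 - spec)),
    i.e., the odds of a positive test in diseased vs. non-diseased cases."""
    return (sensitivity / (1.0 - sensitivity)) * (specificity / (1.0 - specificity))

# Point estimates from the abstract; the result (about 37.9) is close to,
# but not identical to, the pooled DOR of 38.79 reported by the authors.
print(f"DOR = {diagnostic_odds_ratio(0.87, 0.85):.1f}")  # prints DOR = 37.9
```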
Affiliation(s)
- Pei-Shan Zhu: Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China
- Yu-Rui Zhang: Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China
- Jia-Yu Ren: Department of Medical Ultrasound, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Qiao-Li Li: Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China
- Ming Chen: Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China
- Tian Sang: Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China
- Wen-Xiao Li: Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China
- Jun Li (co-corresponding author): Department of Ultrasound, the First Affiliated Hospital of Medical College, Shihezi University, Shihezi, China; NHC Key Laboratory of Prevention and Treatment of Central Asia High Incidence Diseases, First Affiliated Hospital, School of Medicine, Shihezi University, Shihezi, China
- Xin-Wu Cui (co-corresponding author): Department of Medical Ultrasound, Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China

17
Reverberi C, Rigon T, Solari A, Hassan C, Cherubini P, Cherubini A. Experimental evidence of effective human-AI collaboration in medical decision-making. Sci Rep 2022; 12:14952. [PMID: 36056152] [PMCID: PMC9440124] [DOI: 10.1038/s41598-022-18751-2]
Abstract
Artificial Intelligence (AI) systems are a valuable support for decision-making, with many applications in the medical domain. The interaction between physicians and AI enjoys renewed interest following the increased capabilities of deep learning devices. However, we still have limited evidence-based knowledge of the context, design, and psychological mechanisms that craft an optimal human-AI collaboration. In this multicentric study, 21 endoscopists reviewed 504 videos of lesions prospectively acquired from real colonoscopies. They were asked to provide an optical diagnosis with and without the assistance of an AI support system. Endoscopists were influenced by AI, but not erratically: they followed the AI advice more when it was correct than when it was incorrect. Endoscopists achieved this outcome through a weighted integration of their own opinion and the AI's, considering case-by-case estimations of the two reliabilities. This Bayesian-like rational behavior allowed the human-AI hybrid team to outperform both agents taken alone. We discuss the features of the human-AI interaction that determined this favorable outcome.
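The "weighted integration" of human and AI opinions described in this abstract can be sketched as a reliability-weighted combination of log-odds; the probabilities and weights below are our own hypothetical values, not the authors' fitted model:

```python
import math

def combine_opinions(p_human: float, p_ai: float,
                     w_human: float, w_ai: float) -> float:
    """Combine two probability judgments by weighting each agent's
    log-odds by its estimated reliability, then mapping the sum back
    through the logistic function."""
    logit = lambda p: math.log(p / (1.0 - p))
    weighted = w_human * logit(p_human) + w_ai * logit(p_ai)
    return 1.0 / (1.0 + math.exp(-weighted))

# A confident, historically reliable AI pulls an uncertain human upward.
p = combine_opinions(p_human=0.55, p_ai=0.90, w_human=0.4, w_ai=0.6)
print(f"combined P(adenoma) = {p:.2f}")  # about 0.80
```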
Affiliation(s)
- Carlo Reverberi: Department of Psychology, University of Milano-Bicocca, 20126 Milan, Italy; Milan Center for Neuroscience, University of Milano-Bicocca, 20126 Milan, Italy
- Tommaso Rigon: Department of Economics, Management and Statistics, University of Milano-Bicocca, 20126 Milan, Italy
- Aldo Solari: Milan Center for Neuroscience, University of Milano-Bicocca, 20126 Milan, Italy; Department of Economics, Management and Statistics, University of Milano-Bicocca, 20126 Milan, Italy
- Cesare Hassan: Department of Biomedical Sciences, Humanitas University, 20072 Pieve Emanuele, Italy; Endoscopy Unit, Humanitas Clinical and Research Center IRCCS, Rozzano, Italy
- Paolo Cherubini: Department of Psychology, University of Milano-Bicocca, 20126 Milan, Italy; Milan Center for Neuroscience, University of Milano-Bicocca, 20126 Milan, Italy; Department of Neural and Behavioral Sciences, University of Pavia, Pavia, Italy
- Andrea Cherubini: Milan Center for Neuroscience, University of Milano-Bicocca, 20126 Milan, Italy; Artificial Intelligence Group, Cosmo AI/Linkverse, Lainate, 20045 Milan, Italy

18
UPolySeg: A U-Net-Based Polyp Segmentation Network Using Colonoscopy Images. Gastroenterology Insights 2022. [DOI: 10.3390/gastroent13030027]
Abstract
Colonoscopy is the gold standard procedure for examining the lower gastrointestinal tract, and colorectal polyps are one condition detected through colonoscopy. Even though technical advancements have improved the early detection of colorectal polyps, a high percentage are still missed due to various factors. Polyp segmentation can play a significant role in detecting polyps at an early stage and can thus help reduce the severity of the disease. In this work, the authors implemented several image pre-processing techniques, such as coherence transport and contrast limited adaptive histogram equalization (CLAHE), to handle different challenges in colonoscopy images. The processed image was then segmented into polyp and normal pixels using a U-Net-based deep learning segmentation model named UPolySeg. The main framework of UPolySeg has an encoder-decoder structure with feature concatenation between same-level encoder and decoder layers, along with the use of dilated convolutions. The model was experimentally verified on the publicly available Kvasir-SEG dataset, achieving a global accuracy of 96.77%, a Dice coefficient of 96.86%, an IoU of 87.91%, a recall of 95.57%, and a precision of 92.29%. The new UPolySeg framework improved polyp segmentation performance by 1.93% compared with prior work.
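The Dice coefficient and IoU quoted for UPolySeg are standard overlap metrics for binary segmentation masks; a minimal NumPy sketch (our illustration on a toy mask, not the paper's code):

```python
# Illustrative sketch (not from the UPolySeg paper): Dice coefficient
# and intersection-over-union (IoU) for binary segmentation masks.
import numpy as np

def dice_and_iou(pred: np.ndarray, target: np.ndarray) -> tuple[float, float]:
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    dice = 2.0 * intersection / (pred.sum() + target.sum())
    iou = intersection / np.logical_or(pred, target).sum()
    return float(dice), float(iou)

# Toy 2x3 masks: 2 overlapping pixels, 3 predicted, 3 ground-truth.
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
dice, iou = dice_and_iou(pred, target)
print(f"Dice = {dice:.3f}, IoU = {iou:.3f}")  # Dice = 0.667, IoU = 0.500
```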
19
Yamamoto S, Kinugasa H, Hamada K, Tomiya M, Tanimoto T, Ohto A, Toda A, Takei D, Matsubara M, Suzuki S, Inoue K, Tanaka T, Hiraoka S, Okada H, Kawahara Y. The diagnostic ability to classify neoplasias occurring in inflammatory bowel disease by artificial intelligence and endoscopists: A pilot study. J Gastroenterol Hepatol 2022; 37:1610-1616. [PMID: 35644932] [DOI: 10.1111/jgh.15904]
Abstract
BACKGROUND AND AIM: Although endoscopic resection with careful surveillance, rather than total proctocolectomy, is now permitted for visible low-grade dysplasia, it is unclear how accurately endoscopists can differentiate these lesions, as classifying neoplasias occurring in inflammatory bowel disease (IBDN) is exceedingly challenging against a background of chronic inflammation. We evaluated a pilot artificial intelligence (AI) system for classifying IBDN and compared it with endoscopists' performance.
METHODS: This study used a deep convolutional neural network, EfficientNet-B3. Among patients treated for IBDN at two hospitals between 2003 and 2021, we selected 862 non-magnified endoscopic images from 99 IBDN lesions and used 6,375,352 images generated by data augmentation to develop the AI. We evaluated the diagnostic ability of the AI on a two-class task: "adenocarcinoma/high-grade dysplasia" versus "low-grade dysplasia/sporadic adenoma/normal mucosa". We compared diagnostic accuracy between the AI and endoscopists (three non-experts and four experts) on 186 test images.
RESULTS: On the test images, the experts/non-experts/AI achieved a sensitivity of 60.5% (95% confidence interval [CI]: 54.5-66.3)/70.5% (95% CI: 63.8-76.6)/72.5% (95% CI: 60.4-82.5), a specificity of 88.0% (95% CI: 84.7-90.8)/78.8% (95% CI: 74.3-83.1)/82.9% (95% CI: 74.8-89.2), and an accuracy of 77.8% (95% CI: 74.7-80.8)/75.8% (95% CI: 72-79.3)/79.0% (95% CI: 72.5-84.6), respectively.
CONCLUSIONS: The AI's diagnostic accuracy on the two-class task was higher than that of the experts. Our AI system is valuable enough to contribute to the next generation of clinical practice.
Affiliation(s)
- Shumpei Yamamoto: Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan; Department of Internal Medicine, Japanese Red Cross Himeji Hospital, Himeji, Japan
- Hideaki Kinugasa: Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan
- Kenta Hamada: Department of Practical Gastrointestinal Endoscopy, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan
- Masahiro Tomiya: Business Strategy Division, Ryobi Systems Co., Ltd., Okayama, Japan
- Akimitsu Ohto: Business Strategy Division, Ryobi Systems Co., Ltd., Okayama, Japan
- Akira Toda: Business Strategy Division, Ryobi Systems Co., Ltd., Okayama, Japan
- Daisuke Takei: Department of Gastroenterology, Sumitomo Besshi Hospital, Niihama, Japan
- Minoru Matsubara: Department of Gastroenterology, Sumitomo Besshi Hospital, Niihama, Japan
- Seiyu Suzuki: Department of Gastroenterology, Sumitomo Besshi Hospital, Niihama, Japan
- Kosuke Inoue: Department of Pathology, Sumitomo Besshi Hospital, Niihama, Japan
- Takehiro Tanaka: Department of Pathology, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan
- Sakiko Hiraoka: Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan
- Hiroyuki Okada: Department of Gastroenterology and Hepatology, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan; Department of Internal Medicine, Japanese Red Cross Himeji Hospital, Himeji, Japan
- Yoshiro Kawahara: Department of Practical Gastrointestinal Endoscopy, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama, Japan

20
Xie X, Xiao YF, Zhao XY, Li JJ, Yang QQ, Peng X, Nie XB, Zhou JY, Zhao YB, Yang H, Liu X, Liu E, Chen YY, Zhou YY, Fan CQ, Bai JY, Lin H, Koulaouzidis A, Yang SM. Development and Validation of an Artificial Intelligence Model for Small Bowel Capsule Endoscopy Video Review. JAMA Netw Open 2022; 5:e2221992. [PMID: 35834249] [PMCID: PMC9284338] [DOI: 10.1001/jamanetworkopen.2022.21992]
Abstract
IMPORTANCE: Reading small bowel capsule endoscopy (SBCE) videos is a tedious task for clinicians, and new methods are needed to ease this burden.
OBJECTIVES: To develop and evaluate the performance of a convolutional neural network algorithm for SBCE video review in real-life clinical care.
DESIGN, SETTING, AND PARTICIPANTS: In this multicenter, retrospective diagnostic study, a deep learning neural network (SmartScan) was trained and validated for SBCE video review. A total of 2927 SBCE examinations from 29 medical centers were used to train SmartScan to detect 17 types of capsule endoscopy structured terminology (CEST) findings from January 1, 2019, to June 30, 2020. SmartScan was later validated by comparing conventional reading (CR) with SmartScan-assisted reading (SSAR) in 2898 SBCE examinations collected from 22 medical centers. Data analysis was performed from January 25 to December 31, 2021.
EXPOSURE: An artificial intelligence-based tool for interpreting clinical images of SBCE.
MAIN OUTCOMES AND MEASURES: The detection rate and efficiency of CEST findings detected by SSAR and CR were compared.
RESULTS: A total of 5825 SBCE examinations were retrospectively collected; 2898 examinations (1765 male participants [60.9%]; mean [SD] age, 49.8 [15.5] years) were included in the validation phase. Of 6084 CEST-classified small bowel findings, SSAR detected 5834 (95.9%; 95% CI, 95.4%-96.4%), significantly more than CR, which detected 4630 (76.1%; 95% CI, 75.0%-77.2%). SSAR also achieved a higher per-patient detection rate for CEST findings (79.3% [2298 of 2898]) than CR (70.7% [2048 of 2898]; 95% CI, 69.0%-72.3%). With SSAR, the mean (SD) number of images requiring review per SBCE video was reduced to 779.2 (337.2), compared with 27,910.8 (12,882.9) with CR, a mean (SD) reduction of 96.1% (4.3%). The mean (SD) reading time with SSAR was shortened to 5.4 (1.5) minutes, compared with 51.4 (11.6) minutes with CR, a mean (SD) reduction of 89.3% (3.1%).
CONCLUSIONS AND RELEVANCE: This study suggests that a convolutional neural network-based algorithm is associated with an increased detection rate of SBCE findings and reduced SBCE video reading time.
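The reduction rates quoted for SSAR are straightforward ratios; recomputing them from the abstract's per-video means (our arithmetic, which differs slightly from the per-examination averages the authors report):

```python
def reduction_rate(conventional: float, assisted: float) -> float:
    """Fractional reduction when moving from conventional to assisted reading."""
    return 1.0 - assisted / conventional

# Mean images per video and mean reading minutes, from the abstract.
images = reduction_rate(27910.8, 779.2)
minutes = reduction_rate(51.4, 5.4)
# The paper averages per-examination reductions (96.1% and 89.3%), so
# these ratio-of-means figures are close but not identical.
print(f"images: -{images:.1%}, reading time: -{minutes:.1%}")
```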
Affiliation(s)
- Xia Xie, Yu-Feng Xiao, Xiao-Yan Zhao, Jian-Jun Li, Qiang-Qiang Yang, Xue Peng, Xu-Biao Nie, Jian-Yun Zhou, Yong-Bing Zhao, Huan Yang, Xi Liu, En Liu, Yu-Yang Chen, Yuan-Yuan Zhou, Chao-Qiang Fan, Jian-Ying Bai, Shi-Ming Yang: Department of Gastroenterology, The Second Affiliated Hospital, the Third Military Medical University, Chongqing, China
- Hui Lin: Department of Gastroenterology, The Second Affiliated Hospital, the Third Military Medical University, Chongqing, China; Department of Epidemiology, the Third Military Medical University, Chongqing, China

21
Vulpoi RA, Luca M, Ciobanu A, Olteanu A, Barboi OB, Drug VL. Artificial Intelligence in Digestive Endoscopy—Where Are We and Where Are We Going? Diagnostics (Basel) 2022; 12:927. [PMID: 35453975] [PMCID: PMC9029251] [DOI: 10.3390/diagnostics12040927]
Abstract
Artificial intelligence, a computer-based concept that tries to mimic human thinking, is slowly becoming part of the endoscopy lab. It has developed considerably since the first attempts at automated medical diagnosis and is today adopted in almost all medical fields, digestive endoscopy included. The detection rate of preneoplastic lesions (i.e., polyps) during colonoscopy may be increased with artificial intelligence assistance. It has also proven useful in detecting signs of ulcerative colitis activity. In upper digestive endoscopy, deep learning models may prove useful in the diagnosis and management of upper digestive tract diseases such as gastroesophageal reflux disease, Barrett’s esophagus, and gastric cancer. As with all new medical devices, implementation in daily medical practice faces challenges: regulatory and economic hurdles, organizational culture, and the language barrier between humans and machines, among others. Even so, many devices have been approved for use by their respective regulators. Ongoing studies are striving to develop deep learning models that can take on a growing share of human cognitive work. In conclusion, artificial intelligence may become an indispensable tool in digestive endoscopy.
Affiliation(s)
- Radu-Alexandru Vulpoi
- Institute of Gastroenterology and Hepatology, Saint Spiridon Hospital, “Grigore T. Popa” University of Medicine and Pharmacy, 700111 Iași, Romania
- Mihaela Luca
- Institute of Computer Science, Romanian Academy—Iași Branch, 700481 Iași, Romania
- Adrian Ciobanu
- Institute of Computer Science, Romanian Academy—Iași Branch, 700481 Iași, Romania
- Andrei Olteanu
- Institute of Gastroenterology and Hepatology, Saint Spiridon Hospital, “Grigore T. Popa” University of Medicine and Pharmacy, 700111 Iași, Romania
- Oana-Bogdana Barboi
- Institute of Gastroenterology and Hepatology, Saint Spiridon Hospital, “Grigore T. Popa” University of Medicine and Pharmacy, 700111 Iași, Romania
- Correspondence: Tel.: +40-74-345-5012
- Vasile Liviu Drug
- Institute of Gastroenterology and Hepatology, Saint Spiridon Hospital, “Grigore T. Popa” University of Medicine and Pharmacy, 700111 Iași, Romania
22. Spadaccini M, De Marco A, Franchellucci G, Sharma P, Hassan C, Repici A. Discovering the first US FDA-approved computer-aided polyp detection system. Future Oncol 2022; 18:1405-1412. [PMID: 35081745 DOI: 10.2217/fon-2021-1135] [Citation(s) in RCA: 6] [Impact Index Per Article: 3.0]
Abstract
Colorectal cancer is the third most common cancer worldwide. Because its precancerous precursors progress slowly, an efficient endoscopic surveillance strategy may be expected to prevent it; yet around one-fourth of colorectal malignancies are still missed during colonoscopy. Several endoscopic technologies have been introduced, but none has brought radical change. Interest in developing artificial intelligence applications for the medical field has grown over the past decade. Artificial intelligence can highlight a specific region of interest that needs closer examination for the identification of polyps. The aim of this review is to report the first clinical experiences with the first US FDA-approved, real-time, deep-learning, computer-aided detection system (GI Genius™, Medtronic).
Affiliation(s)
- Marco Spadaccini
- Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Humanitas Clinical & Research Center-IRCCS, Endoscopy Unit, Rozzano, Italy
- Alessandro De Marco
- Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Humanitas Clinical & Research Center-IRCCS, Endoscopy Unit, Rozzano, Italy
- Gianluca Franchellucci
- Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Humanitas Clinical & Research Center-IRCCS, Endoscopy Unit, Rozzano, Italy
- Prateek Sharma
- Kansas City VA Medical Center, Gastroenterology & Hepatology, Kansas City, MO 66045, USA
- Cesare Hassan
- Nuovo Regina Margherita Hospital, Digestive Endoscopy Unit, Rome, Italy
- Alessandro Repici
- Department of Biomedical Sciences, Humanitas University, Pieve Emanuele, Italy
- Humanitas Clinical & Research Center-IRCCS, Endoscopy Unit, Rozzano, Italy
23. Effect of artificial intelligence-aided colonoscopy for adenoma and polyp detection: a meta-analysis of randomized clinical trials. Int J Colorectal Dis 2022; 37:495-506. [PMID: 34762157 DOI: 10.1007/s00384-021-04062-x] [Citation(s) in RCA: 25] [Impact Index Per Article: 12.5]
Abstract
BACKGROUND This meta-analysis aimed to determine whether artificial intelligence (AI) improves colonoscopy outcome metrics, i.e., the adenoma detection rate (ADR) and polyp detection rate (PDR). METHODS Two authors independently searched Web of Science, PubMed, Science Direct, and the Cochrane Library for all research published before July 2021 that compared AI-aided colonoscopy with routine colonoscopy (RC) for the detection of adenomas and polyps. RESULTS This meta-analysis included 10 RCTs with 6629 individuals in the AI-aided (n = 3300) and routine (n = 3329) groups. Both ADR (RR, 1.43; P < 0.001) and PDR (RR, 1.44; P < 0.001) were significantly greater with AI-aided endoscopy than with RC. Adenomas detected per colonoscopy (APC) (WMD, 0.25; P = 0.009), polyps detected per colonoscopy (PPC) (WMD, 0.52; P < 0.001), and sessile serrated lesions detected per colonoscopy (SSLPC) (RR, 1.53; P < 0.001) were also significantly higher in the AI-aided group. Subgroup analyses by size, location, and shape of adenomas and polyps showed an advantage for AI assistance in all subgroups except cecal lesions and pedunculated adenomas or polyps. Withdrawal time was longer in the AI-aided group when biopsies were included, while withdrawal time excluding biopsy time showed no significant difference. CONCLUSIONS AI-aided polyp detection systems significantly increase the lesion detection rate, and detection by AI is hardly affected by factors such as lesion size, location, and shape.
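The pooled effect sizes in this abstract are risk ratios (RR): the detection rate in the AI-aided arm divided by the rate in the routine arm. A minimal Python sketch shows how such an RR and its approximate 95% confidence interval are computed; the event counts used here are assumed for illustration only, since the abstract reports the group sizes (n = 3300 and n = 3329) but not the raw detection counts behind RR = 1.43.

```python
import math

def relative_risk(events_ai, n_ai, events_rc, n_rc):
    """Risk ratio: detection rate in the AI-aided arm over the routine arm."""
    return (events_ai / n_ai) / (events_rc / n_rc)

def rr_ci95(events_ai, n_ai, events_rc, n_rc):
    """RR with an approximate 95% CI via the standard log-scale standard error."""
    rr = relative_risk(events_ai, n_ai, events_rc, n_rc)
    se_log = math.sqrt(1 / events_ai - 1 / n_ai + 1 / events_rc - 1 / n_rc)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# ASSUMED detection counts (not reported in the abstract), chosen so the
# resulting RR lands near the pooled ADR estimate of 1.43.
rr, lo, hi = rr_ci95(1200, 3300, 850, 3329)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A lower confidence bound above 1 is what makes an RR like 1.43 statistically significant at the 5% level; the reported P < 0.001 reflects pooling across all 10 trials rather than this single hypothetical table.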
24. Liang F, Wang S, Zhang K, Liu TJ, Li JN. Development of artificial intelligence technology in diagnosis, treatment, and prognosis of colorectal cancer. World J Gastrointest Oncol 2022; 14:124-152. [PMID: 35116107 PMCID: PMC8790413 DOI: 10.4251/wjgo.v14.i1.124] [Citation(s) in RCA: 5] [Impact Index Per Article: 2.5]
Abstract
Artificial intelligence (AI) technology has advanced by leaps and bounds since its invention. It can be subdivided into many technologies, such as machine learning and deep learning, whose application scopes and prospects differ widely. AI technologies now play a pivotal role across the highly complex and wide-ranging medical field, including medical image recognition, biotechnology, auxiliary diagnosis, drug research and development, and nutrition. Colorectal cancer (CRC) is a common gastrointestinal cancer with high mortality, posing a serious threat to human health. Many CRCs arise from the malignant transformation of colorectal polyps, so early diagnosis and treatment are crucial to CRC prognosis. Diagnostic methods for CRC are divided into imaging diagnosis, endoscopy, and pathological diagnosis; treatment methods are divided into endoscopic treatment, surgical treatment, and drug treatment. AI technology is still in the "weak" (narrow) era and lacks genuine communication capabilities; therefore, current AI is used mainly for image recognition and auxiliary analysis rather than for in-depth communication with patients. This article reviews the application of AI in the diagnosis, treatment, and prognosis of CRC and discusses the prospects for its broader application.
Affiliation(s)
- Feng Liang
- Department of General Surgery, The Second Hospital of Jilin University, Changchun 130041, Jilin Province, China
- Shu Wang
- Department of Radiotherapy, Jilin University Second Hospital, Changchun 130041, Jilin Province, China
- Kai Zhang
- Department of General Surgery, The Second Hospital of Jilin University, Changchun 130041, Jilin Province, China
- Tong-Jun Liu
- Department of General Surgery, The Second Hospital of Jilin University, Changchun 130041, Jilin Province, China
- Jian-Nan Li
- Department of General Surgery, The Second Hospital of Jilin University, Changchun 130041, Jilin Province, China
25. Cortegoso Valdivia P, Elosua A, Houdeville C, Pennazio M, Fernández-Urién I, Dray X, Toth E, Eliakim R, Koulaouzidis A. Clinical feasibility of panintestinal (or panenteric) capsule endoscopy: a systematic review. Eur J Gastroenterol Hepatol 2021; 33:949-955. [PMID: 34034282 DOI: 10.1097/meg.0000000000002200] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3]
Abstract
In recent years, panintestinal capsule endoscopy (PCE) with double-headed capsules has been used to perform complete, single-sitting exploration of both the small bowel and the colon in a range of clinical conditions. Double-headed capsules designed for colonic examination (CCE) were the first to be used in this setting, followed by newer generations of capsules (i.e., PillCam Crohn, PCC) specifically engineered for this purpose. The aim of this study was to evaluate the feasibility of PCE in the form of a systematic review. We performed a comprehensive literature search to identify papers in which capsule endoscopy (CE) was specifically used for panintestinal exploration of the gastrointestinal tract, and analyzed data on the capsules used, bowel preparation regimens, rates of cleanliness and completeness, and transit times. The primary outcome was the feasibility of whole-gut exploration with CE. Sixteen (n = 16) studies comprising 915 CE procedures with CCE1 (n = 134), CCE2 (n = 357), and PCC (n = 424) were included; 13/16 studies were performed in the setting of Crohn's disease. Cleanliness and completeness rates were acceptable in all studies, ranging from 63.9% to 100% and from 68.6% to 100%, respectively. In conclusion, PCE is a feasible technique, although further structured studies are needed to explore its full potential.
Affiliation(s)
- Pablo Cortegoso Valdivia
- Gastroenterology and Endoscopy Unit, University Hospital of Parma, University of Parma, Parma, Italy
- Alfonso Elosua
- Gastroenterology Unit, Hospital Garcia Orcoyen, Estella, Spain
- Charles Houdeville
- Sorbonne Université, Centre d'Endoscopie Digestive, Hôpital Saint-Antoine, APHP, Paris, France
- Marco Pennazio
- University Division of Gastroenterology, AOU Città della Salute e della Scienza, University of Turin, Turin, Italy
- Xavier Dray
- Sorbonne Université, Centre d'Endoscopie Digestive, Hôpital Saint-Antoine, APHP, Paris, France
- Ervin Toth
- Department of Gastroenterology, Skåne University Hospital, Lund University, Malmö, Sweden
- Rami Eliakim
- Department of Gastroenterology, Chaim Sheba Medical Center, Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel
- Anastasios Koulaouzidis
- Department of Social Medicine & Public Health, Pomeranian Medical University, Szczecin, Poland