1
Singh VK, Makhlouf Y, Sarker MMK, Craig S, Baena J, Greene C, Mason L, James JA, Salto-Tellez M, O'Reilly P, Maxwell P. KRASFormer: a fully vision transformer-based framework for predicting KRAS gene mutations in histopathological images of colorectal cancer. Biomed Phys Eng Express 2024; 10:055012. [PMID: 38925106 DOI: 10.1088/2057-1976/ad5bed] [Received: 01/29/2024] [Accepted: 06/26/2024] [Indexed: 06/28/2024]
Abstract
Detecting Kirsten Rat Sarcoma Virus (KRAS) gene mutations is significant for colorectal cancer (CRC) patients. The KRAS gene encodes a protein involved in the epidermal growth factor receptor (EGFR) signaling pathway, and mutations in this gene can negatively impact the use of monoclonal antibodies in anti-EGFR therapy and affect treatment decisions. Currently, commonly used methods like next-generation sequencing (NGS) identify KRAS mutations but are expensive, time-consuming, and may not be suitable for every cancer patient sample. To address these challenges, we have developed KRASFormer, a novel framework that predicts KRAS gene mutations from Haematoxylin and Eosin (H&E) stained whole slide images (WSIs), which are widely available for most CRC patients. KRASFormer consists of two stages: the first stage filters out non-tumour regions and selects only tumour cells using a quality screening mechanism, and the second stage predicts the KRAS gene status, either 'wildtype' or 'mutant', using a Vision Transformer-based XCiT method. The XCiT employs cross-covariance attention to capture clinically meaningful long-range representations of textural patterns in tumour tissue and KRAS mutant cells. We evaluated the performance of the first stage using an independent CRC-5000 dataset, and the second stage included both The Cancer Genome Atlas colon and rectal cancer (TCGA-CRC-DX) and in-house cohorts. The results of our experiments showed that the XCiT outperformed existing state-of-the-art methods, achieving AUCs for ROC curves of 0.691 and 0.653 on the TCGA-CRC-DX and in-house datasets, respectively. Our findings emphasize three key consequences: the potential of using H&E-stained tissue slide images to predict KRAS gene mutations as a cost-effective and time-efficient means of guiding treatment choice for CRC patients; the increase in performance metrics of a Transformer-based model; and the value of collaboration between pathologists and data scientists in deriving a morphologically meaningful model.
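The cross-covariance attention at the heart of XCiT swaps the usual N x N token-attention matrix for a d x d channel cross-covariance matrix, which makes the cost linear in the number of image patches rather than quadratic. Below is a minimal, single-head NumPy sketch of that operation; the fixed temperature, the simplified normalization, and the example shapes are illustrative assumptions (a full XCiT block also includes local patch interaction and feed-forward layers):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_covariance_attention(Q, K, V, tau=1.0):
    """Single-head cross-covariance attention (XCA) sketch.

    Q, K, V: (N, d) arrays of N patch tokens with d channels.
    Attention is computed over the (d, d) channel cross-covariance
    matrix instead of the (N, N) token matrix.
    """
    # l2-normalise each feature column over the token axis
    Qh = Q / (np.linalg.norm(Q, axis=0, keepdims=True) + 1e-8)
    Kh = K / (np.linalg.norm(K, axis=0, keepdims=True) + 1e-8)
    A = softmax((Kh.T @ Qh) / tau, axis=-1)  # (d, d) channel attention
    return V @ A                             # (N, d), linear in N

rng = np.random.default_rng(0)
N, d = 196, 64  # e.g. 196 patch tokens, 64 channels (illustrative)
Q, K, V = (rng.standard_normal((N, d)) for _ in range(3))
out = cross_covariance_attention(Q, K, V)
print(out.shape)  # (196, 64)
```

Because the attention map is d x d, the cost of this operation grows linearly with the number of tokens, which is what makes the approach tractable for the large tile counts of whole slide images.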
Affiliation(s)
- Vivek Kumar Singh
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Centre for Biomarkers and Biotherapeutics, Barts Cancer Institute, Queen Mary University of London, London, EC1M 6BQ, United Kingdom
- Yasmine Makhlouf
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Stephanie Craig
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Juvenal Baena
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Christine Greene
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Lee Mason
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Jacqueline A James
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Regional Molecular Diagnostic Service, Belfast Health and Social Care Trust, Belfast, BT9 7AE, United Kingdom
- Manuel Salto-Tellez
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
- Regional Molecular Diagnostic Service, Belfast Health and Social Care Trust, Belfast, BT9 7AE, United Kingdom
- Sonrai Analytics, Belfast, BT9 7AE, United Kingdom
- Cellular Pathology, Belfast Health and Social Care Trust, Belfast City Hospital, Lisburn Road, Belfast BT9 7AB, United Kingdom
- Perry Maxwell
- Precision Medicine Centre of Excellence, Health Sciences Building, The Patrick G Johnston Centre for Cancer Research, Queen's University Belfast, Belfast, BT9 7AE, United Kingdom
2
Liu X, Hu W, Diao S, Abera DE, Racoceanu D, Qin W. Multi-scale feature fusion for prediction of IDH1 mutations in glioma histopathological images. Comput Methods Programs Biomed 2024; 248:108116. [PMID: 38518408 DOI: 10.1016/j.cmpb.2024.108116] [Received: 11/08/2023] [Revised: 01/30/2024] [Accepted: 03/02/2024] [Indexed: 03/24/2024]
Abstract
BACKGROUND AND OBJECTIVE: Mutations in isocitrate dehydrogenase 1 (IDH1) play a crucial role in the prognosis, diagnosis, and treatment of gliomas. However, current methods for determining mutation status, such as immunohistochemistry and gene sequencing, are difficult to implement widely in routine clinical diagnosis. Recent studies have shown that deep learning methods based on pathological images of glioma can predict the mutation status of the IDH1 gene. Our research focuses on utilizing multi-scale information in pathological images to improve the accuracy of predicting IDH1 gene mutations, thereby providing an accurate and cost-effective prediction method for routine clinical diagnosis. METHODS: We propose a multi-scale fusion gene identification network (MultiGeneNet). The network first uses two feature extractors to obtain feature maps at different image scales, and then employs a bilinear pooling layer based on the Hadamard product to fuse the multi-scale features. By fully exploiting the complementarity among features at different scales, we obtain a more comprehensive and rich multi-scale feature representation. RESULTS: Based on a Hematoxylin and Eosin stained pathological section dataset of 296 patients, our method achieved an accuracy of 83.575% and an AUC of 0.886, significantly outperforming single-scale methods. CONCLUSIONS: Our method can be deployed in medical aid systems at very low cost, serving as a diagnostic or prognostic tool for glioma patients in medically underserved areas.
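The Hadamard-product bilinear pooling described in the METHODS section is a standard low-rank approximation of full bilinear pooling: each scale's feature vector is linearly projected and the two projections are multiplied elementwise. A minimal NumPy sketch follows; the layer dimensions, tanh nonlinearity, and weight initialization are assumptions for illustration, since the abstract does not specify them:

```python
import numpy as np

def hadamard_bilinear_fusion(f1, f2, W1, W2):
    """Fuse two feature vectors (e.g. from two magnifications) by
    low-rank bilinear pooling: project each vector, then take the
    elementwise (Hadamard) product of the projections."""
    return np.tanh(W1 @ f1) * np.tanh(W2 @ f2)

rng = np.random.default_rng(1)
d1, d2, k = 512, 512, 256             # backbone dims and fused dim (assumed)
f_low = rng.standard_normal(d1)       # features from low-magnification patch
f_high = rng.standard_normal(d2)      # features from high-magnification patch
W1 = 0.01 * rng.standard_normal((k, d1))
W2 = 0.01 * rng.standard_normal((k, d2))
fused = hadamard_bilinear_fusion(f_low, f_high, W1, W2)
print(fused.shape)  # (256,)
```

Compared with a full bilinear (outer-product) layer, whose output is d1 x d2, this keeps the fused representation at a fixed size k while still letting every pair of cross-scale features interact through the learned projections.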
Affiliation(s)
- Xiang Liu
- Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen 518055, China
- Wanming Hu
- Department of Pathology, Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China, Guangdong Provincial Clinical Research Center for Cancer, Guangzhou 510060, China
- Songhui Diao
- Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen 518055, China
- Deboch Eyob Abera
- Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China; Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Shenzhen 518055, China
- Daniel Racoceanu
- Sorbonne University, Inria, CNRS, Inserm, AP-HP, Inria, Paris Brain Institute - ICM, F-75013 Paris, France
- Wenjian Qin
- Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
3
Al-Thelaya K, Gilal NU, Alzubaidi M, Majeed F, Agus M, Schneider J, Househ M. Applications of discriminative and deep learning feature extraction methods for whole slide image analysis: A survey. J Pathol Inform 2023; 14:100335. [PMID: 37928897 PMCID: PMC10622844 DOI: 10.1016/j.jpi.2023.100335] [Received: 05/29/2023] [Revised: 07/17/2023] [Accepted: 07/19/2023] [Indexed: 11/07/2023]
Abstract
Digital pathology technologies, including whole slide imaging (WSI), have significantly improved modern clinical practice by facilitating the storing, viewing, processing, and sharing of digital scans of tissue glass slides. Researchers have proposed various artificial intelligence (AI) solutions for digital pathology, such as automated image analysis, to extract diagnostic information from WSIs and improve pathology productivity, accuracy, and reproducibility. Feature extraction methods play a crucial role in transforming raw image data into meaningful representations for analysis, facilitating the characterization of tissue structures, cellular properties, and pathological patterns. These features serve diverse digital pathology tasks, such as cancer diagnosis and prognosis. Deep learning-based feature extraction methods have emerged as a promising approach to accurately represent WSI contents and have demonstrated superior performance in histology-related tasks. In this survey, we provide a comprehensive overview of feature extraction methods, including both manual and deep learning-based techniques, for the analysis of WSIs. We review relevant literature, analyze the discriminative and geometric features of WSIs (i.e., features suited to supporting the diagnostic process and extracted by "engineered" methods as opposed to AI), and explore predictive modeling techniques using AI and deep learning. This survey examines the advances, challenges, and opportunities in this rapidly evolving field, emphasizing the potential for accurate diagnosis, prognosis, and decision-making in digital pathology.
Affiliation(s)
- Khaled Al-Thelaya
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
- Nauman Ullah Gilal
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
- Mahmood Alzubaidi
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
- Fahad Majeed
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
- Marco Agus
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
- Jens Schneider
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
- Mowafa Househ
- Department of Information and Computing Technology, College of Science and Engineering, Hamad Bin Khalifa University, Doha, Qatar
4
Jhang JY, Tsai YC, Hsu TC, Huang CR, Cheng HC, Sheu BS. Gastric Section Correlation Network for Gastric Precancerous Lesion Diagnosis. IEEE Open J Eng Med Biol 2023; 5:434-442. [PMID: 38899022 PMCID: PMC11186652 DOI: 10.1109/ojemb.2023.3277219] [Received: 01/03/2023] [Revised: 03/24/2023] [Accepted: 05/10/2023] [Indexed: 06/21/2024]
Abstract
Goal: Diagnosing the corpus-predominant gastritis index (CGI), an early precancerous lesion in the stomach, has been shown to be effective in identifying patients at high risk of gastric cancer for preventive healthcare. However, CGI diagnosis requires invasive biopsies and time-consuming pathological analysis. Methods: We propose a novel gastric section correlation network (GSCNet) for CGI diagnosis from endoscopic images of the three dominant gastric sections: the antrum, body, and cardia. The proposed network consists of two dominant modules: the scaling feature fusion module and the section correlation module. The former extracts scaling fusion features that can effectively represent the mucosa under variant viewing angles and scale changes for each gastric section. The latter applies medical prior knowledge through three section correlation losses to model the correlations of different gastric sections for CGI diagnosis. Results: The proposed method outperforms competing deep learning methods, achieving high testing accuracy, sensitivity, and specificity of 0.957, 0.938, and 0.962, respectively. Conclusions: The proposed method is the first to identify high gastric cancer risk patients with CGI from endoscopic images without invasive biopsies or time-consuming pathological analysis.
Affiliation(s)
- Jyun-Yao Jhang
- Department of Computer Science and Engineering, National Chung Hsing University, Taichung 402, Taiwan
- Yu-Ching Tsai
- Department of Internal Medicine, Tainan Hospital, Ministry of Health and Welfare, Tainan 701, Taiwan
- Department of Internal Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan 701, Taiwan
- Tzu-Chun Hsu
- Department of Computer Science and Engineering, National Chung Hsing University, Taichung 402, Taiwan
- Chun-Rong Huang
- Cross College Elite Program, and Academy of Innovative Semiconductor and Sustainable Manufacturing, National Cheng Kung University, Tainan 701, Taiwan
- Department of Computer Science and Engineering, National Chung Hsing University, Taichung 402, Taiwan
- Hsiu-Chi Cheng
- Department of Internal Medicine, Institute of Clinical Medicine and Molecular Medicine, National Cheng Kung University, Tainan 701, Taiwan
- Department of Internal Medicine, Tainan Hospital, Ministry of Health and Welfare, Tainan 701, Taiwan
- Bor-Shyang Sheu
- Institute of Clinical Medicine and Department of Internal Medicine, National Cheng Kung University Hospital, National Cheng Kung University, Tainan 701, Taiwan