51
Pati P, Jaume G, Foncubierta-Rodríguez A, Feroce F, Anniciello AM, Scognamiglio G, Brancati N, Fiche M, Dubruc E, Riccio D, Di Bonito M, De Pietro G, Botti G, Thiran JP, Frucci M, Goksel O, Gabrani M. Hierarchical graph representations in digital pathology. Med Image Anal 2021; 75:102264. [PMID: 34781160] [DOI: 10.1016/j.media.2021.102264]
Abstract
Cancer diagnosis, prognosis, and therapy response predictions from tissue specimens depend strongly on the phenotype and topological distribution of their constituent histological entities. Adequate tissue representations that encode these entities are therefore imperative for computer-aided cancer patient care. To this end, several approaches have depicted the tissue with cell-graphs that capture the cell microenvironment, allowing graph theory and machine learning to map the tissue representation to tissue functionality and quantify their relationship. Although cellular information is crucial, it alone is insufficient to comprehensively characterize complex tissue structure. We herein treat the tissue as a hierarchical composition of multiple types of histological entities from fine to coarse level, capturing multivariate tissue information at multiple levels. We propose a novel multi-level hierarchical entity-graph representation of tissue specimens that encodes histological entities as well as their intra- and inter-entity-level interactions. A hierarchical graph neural network is then proposed to operate on this entity-graph and map tissue structure to tissue functionality. Specifically, for input histology images we utilize well-defined cells and tissue regions to build HierArchical Cell-to-Tissue (HACT) graph representations, and devise HACT-Net, a message-passing graph neural network, to classify the HACT representations. As part of this work, we introduce the BReAst Carcinoma Subtyping (BRACS) dataset, a large cohort of Haematoxylin & Eosin-stained breast tumour regions of interest, to evaluate and benchmark the proposed methodology against pathologists and state-of-the-art computer-aided diagnostic approaches. Through comparative assessment and ablation studies, the proposed method yields superior classification results compared with alternative methods as well as individual pathologists. The code, data, and models can be accessed at https://github.com/histocartography/hact-net.
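The core operation behind message-passing graph neural networks such as the one this entry describes can be sketched in a few lines. The toy cell graph, feature vectors, and update rule below are illustrative placeholders, not the authors' HACT-Net implementation:

```python
# Minimal sketch of one message-passing round on a toy cell graph.
# Node features could be, e.g., nuclear morphology descriptors; the
# aggregation below (mean of neighbours, then an update combining own
# and aggregated features) is a generic GNN step.

def message_passing_step(features, adjacency, update):
    """One round: each node aggregates its neighbours' features
    (element-wise mean) and applies an update function."""
    new_features = {}
    for node, feat in features.items():
        neighbours = adjacency.get(node, [])
        if neighbours:
            agg = [sum(features[n][d] for n in neighbours) / len(neighbours)
                   for d in range(len(feat))]
        else:
            agg = [0.0] * len(feat)
        new_features[node] = update(feat, agg)
    return new_features

# Toy 3-cell graph: cell 0 -- cell 1 -- cell 2
adj = {0: [1], 1: [0, 2], 2: [1]}
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}

# Simple update: average own and aggregated features.
updated = message_passing_step(
    feats, adj, lambda own, agg: [(o + a) / 2 for o, a in zip(own, agg)])
```

In HACT the same idea is applied at two levels (cell graph and tissue-region graph) with learned update functions; stacking several such rounds lets information propagate across the tissue.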
Affiliation(s)
- Pushpak Pati
- IBM Zurich Research Lab, Zurich, Switzerland; Computer-Assisted Applications in Medicine, ETH Zurich, Zurich, Switzerland
- Guillaume Jaume
- IBM Zurich Research Lab, Zurich, Switzerland; Signal Processing Laboratory 5, EPFL, Lausanne, Switzerland
- Florinda Feroce
- National Cancer Institute - IRCCS-Fondazione Pascale, Naples, Italy
- Nadia Brancati
- Institute for High Performance Computing and Networking - CNR, Naples, Italy
- Maryse Fiche
- Aurigen - Centre de Pathologie, Lausanne, Switzerland
- Daniel Riccio
- Institute for High Performance Computing and Networking - CNR, Naples, Italy
- Giuseppe De Pietro
- Institute for High Performance Computing and Networking - CNR, Naples, Italy
- Gerardo Botti
- National Cancer Institute - IRCCS-Fondazione Pascale, Naples, Italy
- Maria Frucci
- Institute for High Performance Computing and Networking - CNR, Naples, Italy
- Orcun Goksel
- Computer-Assisted Applications in Medicine, ETH Zurich, Zurich, Switzerland; Department of Information Technology, Uppsala University, Sweden
52
Shaban M, Raza SEA, Hassan M, Jamshed A, Mushtaq S, Loya A, Batis N, Brooks J, Nankivell P, Sharma N, Robinson M, Mehanna H, Khurram SA, Rajpoot N. A digital score of tumour-associated stroma infiltrating lymphocytes predicts survival in head and neck squamous cell carcinoma. J Pathol 2021; 256:174-185. [PMID: 34698394] [DOI: 10.1002/path.5819]
Abstract
The infiltration of T-lymphocytes in the stroma and tumour indicates an effective immune response against the tumour, resulting in better survival. In this study, our aim was to explore the prognostic significance of tumour-associated stroma infiltrating lymphocytes (TASILs) in head and neck squamous cell carcinoma (HNSCC) through an AI-based automated method. A deep learning-based automated method was employed to segment tumour, tumour-associated stroma, and lymphocytes in digitally scanned whole slide images of HNSCC tissue slides. The spatial patterns of lymphocytes and tumour-associated stroma were digitally quantified to compute the tumour-associated stroma infiltrating lymphocytes score (TASIL-score). Finally, the prognostic significance of the TASIL-score for disease-specific and disease-free survival was investigated using Cox proportional hazards analysis. Three different cohorts of haematoxylin and eosin (H&E)-stained tissue slides of HNSCC cases (n = 537 in total) were studied, including publicly available TCGA head and neck cancer cases. The TASIL-score carries prognostic significance (p = 0.002) for disease-specific survival of HNSCC patients. The TASIL-score also shows better separation between low- and high-risk patients than manual tumour-infiltrating lymphocyte (TIL) scoring by pathologists for both disease-specific and disease-free survival. A positive correlation of the TASIL-score with molecular estimates of CD8+ T cells was also found, in line with existing findings. To the best of our knowledge, this is the first study to automate the quantification of TASILs from routine H&E slides of head and neck cancer. Our TASIL-score-based findings are aligned with clinical knowledge, with the added advantages of objectivity, reproducibility, and strong prognostic value. Although we validated our method on three different cohorts (n = 537 cases in total), a comprehensive evaluation on large multicentric cohorts is required before the proposed digital score can be adopted in clinical practice. © 2021 The Authors. The Journal of Pathology published by John Wiley & Sons Ltd on behalf of The Pathological Society of Great Britain and Ireland.
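The underlying quantity, lymphocyte infiltration of tumour-associated stroma, can be illustrated with a toy computation. The published TASIL-score is derived from spatial patterns of segmented regions and its exact formula is not reproduced here; the mask labels, detections, and simple fraction below are hypothetical placeholders:

```python
# Hypothetical sketch: the fraction of detected lymphocytes that fall
# inside tumour-associated stroma, given a segmentation mask and a
# list of lymphocyte detections. Not the paper's actual TASIL-score.

def stroma_infiltration_score(tissue_mask, lymphocytes):
    """tissue_mask: 2D list of labels ('tumour', 'stroma', 'other');
    lymphocytes: list of (row, col) detections. Returns the fraction
    of lymphocytes located in stroma."""
    if not lymphocytes:
        return 0.0
    in_stroma = sum(1 for r, c in lymphocytes
                    if tissue_mask[r][c] == "stroma")
    return in_stroma / len(lymphocytes)

mask = [
    ["tumour", "tumour", "stroma"],
    ["tumour", "stroma", "stroma"],
    ["other",  "stroma", "other"],
]
cells = [(0, 2), (1, 1), (1, 2), (0, 0)]  # three in stroma, one in tumour
score = stroma_infiltration_score(mask, cells)
```

In the study, such a score per case is then entered into a Cox proportional hazards model to test its association with survival.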
Affiliation(s)
- Muhammad Shaban
- Department of Computer Science, University of Warwick, Coventry, UK
- Mariam Hassan
- Department of Pathology, Shaukat Khanum Memorial Cancer Hospital Research Centre, Lahore, Pakistan
- Arif Jamshed
- Department of Pathology, Shaukat Khanum Memorial Cancer Hospital Research Centre, Lahore, Pakistan
- Sajid Mushtaq
- Department of Pathology, Shaukat Khanum Memorial Cancer Hospital Research Centre, Lahore, Pakistan
- Asif Loya
- Department of Pathology, Shaukat Khanum Memorial Cancer Hospital Research Centre, Lahore, Pakistan
- Nikolaos Batis
- Institute of Head and Neck Studies and Education, University of Birmingham, Birmingham, UK
- Jill Brooks
- Institute of Head and Neck Studies and Education, University of Birmingham, Birmingham, UK
- Paul Nankivell
- Institute of Head and Neck Studies and Education, University of Birmingham, Birmingham, UK
- Neil Sharma
- Institute of Head and Neck Studies and Education, University of Birmingham, Birmingham, UK
- Max Robinson
- School of Dental Sciences, Faculty of Medical Sciences, Newcastle University, Newcastle upon Tyne, UK
- Hisham Mehanna
- Institute of Head and Neck Studies and Education, University of Birmingham, Birmingham, UK
- Syed Ali Khurram
- School of Clinical Dentistry, University of Sheffield, Sheffield, UK
- Nasir Rajpoot
- Department of Computer Science, University of Warwick, Coventry, UK; The Alan Turing Institute, London, UK; Department of Pathology, University Hospitals Coventry & Warwickshire NHS Trust, Coventry, UK
53
Vuong TTL, Kim K, Song B, Kwak JT. Joint categorical and ordinal learning for cancer grading in pathology images. Med Image Anal 2021; 73:102206. [PMID: 34399153] [DOI: 10.1016/j.media.2021.102206]
Abstract
Cancer grading in pathology image analysis is one of the most critical tasks, since it is related to patient outcomes and treatment planning. Traditionally, it has been considered a categorical problem, ignoring the natural ordering among cancer grades: the higher the grade, the more aggressive the cancer and the worse the outcome. Herein, we propose a joint categorical and ordinal learning framework for cancer grading in pathology images. The approach simultaneously performs categorical and ordinal classification and aims to leverage the distinctive features of the two tasks. Moreover, we propose a new loss function for the ordinal classification task that offers improved contrast between correctly classified and misclassified examples. The proposed method is evaluated on multiple collections of colorectal and prostate pathology images that underwent different acquisition and processing procedures. Both quantitative and qualitative assessments of the experimental results confirm the effectiveness and robustness of the proposed method in comparison to other competing methods. The results suggest that the proposed approach could permit improved histopathologic analysis of cancer grades in pathology images.
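The general shape of such a joint objective can be sketched as a weighted sum of a categorical cross-entropy and an ordinal term that decomposes the grade into cumulative binary questions ("is the grade greater than k?"). The paper's actual ordinal loss differs (it adds a contrast between correct and misclassified examples); the weighting, decomposition, and values below are illustrative assumptions:

```python
import math

# Sketch of a joint categorical + ordinal objective for K grades.
# Categorical term: standard cross-entropy on the softmax output.
# Ordinal term: binary cross-entropy on K-1 cumulative targets.

def joint_loss(probs, ordinal_probs, label, alpha=0.5):
    """probs: softmax over K grades; ordinal_probs: K-1 sigmoid
    outputs answering 'grade > k?'; label: true grade index."""
    ce = -math.log(probs[label])
    targets = [1.0 if label > k else 0.0 for k in range(len(ordinal_probs))]
    bce = -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
               for t, p in zip(targets, ordinal_probs)) / len(ordinal_probs)
    return alpha * ce + (1 - alpha) * bce

# Three grades, true grade 1: cumulative targets are [1, 0].
loss = joint_loss([0.1, 0.7, 0.2], [0.9, 0.2], label=1)
```

The ordinal decomposition is what injects the grade ordering: a prediction two grades away violates more cumulative targets than a prediction one grade away, so it is penalized more.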
Affiliation(s)
- Trinh Thi Le Vuong
- School of Electrical Engineering, Korea University, Seoul 02841, Republic of Korea
- Kyungeun Kim
- Department of Pathology, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, Seoul 03181, Republic of Korea
- Boram Song
- Department of Pathology, Kangbuk Samsung Hospital, Sungkyunkwan University School of Medicine, Seoul 03181, Republic of Korea
- Jin Tae Kwak
- School of Electrical Engineering, Korea University, Seoul 02841, Republic of Korea
54
Zhang XM, Liang L, Liu L, Tang MJ. Graph Neural Networks and Their Current Applications in Bioinformatics. Front Genet 2021; 12:690049. [PMID: 34394185] [PMCID: PMC8360394] [DOI: 10.3389/fgene.2021.690049]
Abstract
Graph neural networks (GNNs), a branch of deep learning for non-Euclidean spaces, perform particularly well on tasks that process graph-structured data. With the rapid accumulation of biological network data, GNNs have also become an important tool in bioinformatics. In this research, a systematic survey of GNNs and their advances in bioinformatics is presented from multiple perspectives. We first introduce commonly used GNN models and their basic principles. Then, three representative tasks are described based on the three levels of structural information that GNNs can learn: node classification, link prediction, and graph generation. Meanwhile, according to the specific applications to various omics data, we categorize and discuss the related studies in three areas: disease prediction, drug discovery, and biomedical imaging. Based on this analysis, we provide an outlook on the shortcomings of current studies and point out their development prospects. Although GNNs have achieved excellent results in many biological tasks, they still face challenges in terms of low-quality data, methodology, and interpretability, and have a long road ahead. We believe that GNNs are potentially an excellent method for solving various biological problems in bioinformatics research.
Affiliation(s)
- Xiao-Meng Zhang
- School of Information, Yunnan Normal University, Kunming, China
- Li Liang
- School of Information, Yunnan Normal University, Kunming, China
- Lin Liu
- School of Information, Yunnan Normal University, Kunming, China; Key Laboratory of Educational Informatization for Nationalities Ministry of Education, Yunnan Normal University, Kunming, China
- Ming-Jing Tang
- Key Laboratory of Educational Informatization for Nationalities Ministry of Education, Yunnan Normal University, Kunming, China; School of Life Sciences, Yunnan Normal University, Kunming, China
55
Vuong TTL, Song B, Kim K, Cho YM, Kwak JT. Multi-scale binary pattern encoding network for cancer classification in pathology images. IEEE J Biomed Health Inform 2021; 26:1152-1163. [PMID: 34310334] [DOI: 10.1109/jbhi.2021.3099817]
Abstract
Multi-scale approaches have been widely studied in pathology image analysis. They offer the ability to characterize tissue in an image at various scales, at which the tissue may appear differently. Many such methods have focused on extracting multi-scale hand-crafted features and applying them to various tasks in pathology image analysis. Several deep learning methods even explicitly adopt multi-scale approaches. However, most of these methods simply merge the multi-scale features together or adopt a coarse-to-fine/fine-to-coarse strategy, which uses the features one at a time in a sequential manner. By utilizing the multi-scale features in a cooperative and discriminative fashion, learning capability could be further improved. Herein, we propose a multi-scale approach that can identify and leverage the patterns across multiple scales within a deep neural network and provide superior cancer classification capability. The patterns of the features across multiple scales are encoded as a binary pattern code and further converted to a decimal number, which can easily be embedded in current deep neural network frameworks. To evaluate the proposed method, multiple sets of pathology images are employed. Under various experimental settings, the proposed method is systematically assessed and shows improved classification performance in comparison to other competing methods.
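The binary-to-decimal encoding the abstract mentions can be illustrated with a toy function: threshold one feature's activation at each scale to a bit, then read the bits as a binary number. The thresholding rule and bit order below are illustrative assumptions, not the paper's exact (differentiable) encoding layer:

```python
# Sketch of the scale-pattern idea: one bit per scale, read as a
# binary number, giving a decimal code that describes at which
# scales a feature is active.

def scale_pattern_code(activations, threshold=0.0):
    """activations: one feature's responses at S scales (coarse to
    fine). Returns a decimal code in [0, 2**S - 1]."""
    code = 0
    for a in activations:
        code = (code << 1) | (1 if a > threshold else 0)
    return code

# Three scales: active at the coarse and fine scales only -> 0b101 = 5
code = scale_pattern_code([0.8, -0.2, 1.5])
```

Two patches whose feature is active at different scale combinations receive different codes, which is what lets the network treat cross-scale patterns discriminatively rather than averaging them away.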
56
Oliveira SP, Neto PC, Fraga J, Montezuma D, Monteiro A, Monteiro J, Ribeiro L, Gonçalves S, Pinto IM, Cardoso JS. CAD systems for colorectal cancer from WSI are still not ready for clinical acceptance. Sci Rep 2021; 11:14358. [PMID: 34257363] [PMCID: PMC8277780] [DOI: 10.1038/s41598-021-93746-z]
Abstract
Most oncological cases can be detected by imaging techniques, but diagnosis is based on pathological assessment of tissue samples. In recent years, the pathology field has evolved to a digital era where tissue samples are digitised and evaluated on screen. As a result, digital pathology opened up many research opportunities, allowing the development of more advanced image processing techniques, as well as artificial intelligence (AI) methodologies. Nevertheless, despite colorectal cancer (CRC) being the second deadliest cancer type worldwide, with increasing incidence rates, the application of AI for CRC diagnosis, particularly on whole-slide images (WSI), is still a young field. In this review, we analyse some relevant works published on this particular task and highlight the limitations that hinder the application of these works in clinical practice. We also empirically investigate the feasibility of using weakly annotated datasets to support the development of computer-aided diagnosis systems for CRC from WSI. Our study underscores the need for large datasets in this field and the use of an appropriate learning methodology to gain the most benefit from partially annotated datasets. The CRC WSI dataset used in this study, containing 1,133 colorectal biopsy and polypectomy samples, is available upon reasonable request.
Affiliation(s)
- Sara P Oliveira
- INESCTEC, 4200-465, Porto, Portugal; Faculty of Engineering (FEUP), University of Porto, 4200-465, Porto, Portugal
- Pedro C Neto
- INESCTEC, 4200-465, Porto, Portugal; Faculty of Engineering (FEUP), University of Porto, 4200-465, Porto, Portugal
- João Fraga
- IMP Diagnostics, 4150-146, Porto, Portugal
- Diana Montezuma
- IMP Diagnostics, 4150-146, Porto, Portugal; ICBAS, University of Porto, 4050-313, Porto, Portugal; Cancer Biology and Epigenetics Group, IPO-Porto, 4200-072, Porto, Portugal
- Jaime S Cardoso
- INESCTEC, 4200-465, Porto, Portugal; Faculty of Engineering (FEUP), University of Porto, 4200-465, Porto, Portugal
57
Senousy Z, Abdelsamea MM, Mohamed MM, Gaber MM. 3E-Net: Entropy-Based Elastic Ensemble of Deep Convolutional Neural Networks for Grading of Invasive Breast Carcinoma Histopathological Microscopic Images. Entropy (Basel) 2021; 23:620. [PMID: 34065765] [PMCID: PMC8156865] [DOI: 10.3390/e23050620]
Abstract
Automated grading systems using deep convolutional neural networks (DCNNs) have proven their capability and potential to distinguish between different breast cancer grades using digitized histopathological images. In digital breast pathology, it is vital to measure how confident a DCNN is in grading, using a machine-confidence metric, especially in the presence of major computer vision challenges such as the high visual variability of the images. Such a quantitative metric can be employed not only to improve the robustness of automated systems, but also to assist medical professionals in identifying complex cases. In this paper, we propose an Entropy-based Elastic Ensemble of DCNN models (3E-Net) for grading invasive breast carcinoma microscopy images, which provides an initial stage of explainability through an uncertainty-aware mechanism based on entropy. Our proposed model is designed to (1) exclude images for which the ensemble is highly uncertain and (2) dynamically grade the non-excluded images using the more certain models in the ensemble architecture. We evaluated two variations of 3E-Net on an invasive breast carcinoma dataset and achieved grading accuracies of 96.15% and 99.50%.
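The entropy gate at the heart of such a scheme can be sketched as follows: average the members' softmax outputs, compute the predictive entropy, and abstain when it exceeds a threshold. The threshold value and the two-member "ensemble" below are placeholders, and 3E-Net's actual elastic member selection is more involved:

```python
import math

# Sketch of an entropy-based abstention gate over an ensemble.

def entropy(p):
    """Shannon entropy (nats) of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

def ensemble_predict(member_probs, max_entropy=0.5):
    """member_probs: list of per-model softmax vectors for one image.
    Returns the predicted class index, or None to exclude the image."""
    k = len(member_probs[0])
    mean = [sum(p[i] for p in member_probs) / len(member_probs)
            for i in range(k)]
    if entropy(mean) > max_entropy:
        return None  # too uncertain: defer to a pathologist
    return max(range(k), key=mean.__getitem__)

confident = ensemble_predict([[0.9, 0.05, 0.05], [0.85, 0.1, 0.05]])
uncertain = ensemble_predict([[0.4, 0.3, 0.3], [0.3, 0.4, 0.3]])
```

Low entropy (a peaked averaged distribution) yields an automatic grade; high entropy flags the image as a complex case for human review.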
Affiliation(s)
- Zakaria Senousy
- School of Computing and Digital Technology, Birmingham City University, Birmingham B4 7AP, UK
- Mohammed M. Abdelsamea
- School of Computing and Digital Technology, Birmingham City University, Birmingham B4 7AP, UK; Faculty of Computers and Information, Assiut University, Assiut 71515, Egypt
- Mona Mostafa Mohamed
- Department of Zoology, Faculty of Science, Cairo University, Giza 12613, Egypt; Faculty of Basic Sciences, Galala University, Suez 435611, Egypt
- Mohamed Medhat Gaber
- School of Computing and Digital Technology, Birmingham City University, Birmingham B4 7AP, UK; Faculty of Computer Science and Engineering, Galala University, Suez 435611, Egypt
58
Tizhoosh HR, Diamandis P, Campbell CJV, Safarpoor A, Kalra S, Maleki D, Riasatian A, Babaie M. Searching Images for Consensus: Can AI Remove Observer Variability in Pathology? Am J Pathol 2021; 191:1702-1708. [PMID: 33636179] [DOI: 10.1016/j.ajpath.2021.01.015]
Abstract
One of the major obstacles to reaching diagnostic consensus is observer variability. With the recent success of artificial intelligence, particularly deep networks, the question emerges as to whether this fundamental challenge of diagnostic imaging can now be resolved. This article briefly reviews the problem and how both supervised and unsupervised AI technologies could eventually help to overcome it.
Affiliation(s)
- Phedias Diamandis
- Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Canada
- Clinton J V Campbell
- Department of Pathology and Molecular Medicine, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
- Amir Safarpoor
- Kimia Laboratory, University of Waterloo, Waterloo, Canada
- Shivam Kalra
- Kimia Laboratory, University of Waterloo, Waterloo, Canada
- Danial Maleki
- Kimia Laboratory, University of Waterloo, Waterloo, Canada
- Morteza Babaie
- Kimia Laboratory, University of Waterloo, Waterloo, Canada
59
An end-to-end breast tumour classification model using context-based patch modelling - A BiLSTM approach for image classification. Comput Med Imaging Graph 2020; 87:101838. [PMID: 33340945] [DOI: 10.1016/j.compmedimag.2020.101838]
Abstract
Researchers working on computational analysis of whole slide images (WSIs) in histopathology have primarily resorted to patch-based modelling, because the large resolution of each WSI makes it infeasible to feed WSIs directly into machine learning models under computational constraints. However, due to patch-based analysis, most current methods fail to exploit the underlying spatial relationship among the patches. In our work, we integrate this relationship along with feature-based correlation among the patches extracted from a particular tumorous region. The tumour regions extracted from WSIs have arbitrary dimensions, ranging from 195 to 20,570 pixels in width and from 226 to 17,290 pixels in height. For the classification task, we use BiLSTMs to model both forward and backward contextual relationships. Using an RNN-based model also removes the sequence-size limitation, allowing variable-size images to be modelled within a deep learning framework. We further incorporate the effect of spatial continuity by exploring different scanning techniques for sampling patches. To establish the efficiency of our approach, we trained and tested our model on two datasets, microscopy images and WSI tumour regions, both published by the ICIAR BACH Challenge 2018. We compared our results with the top five teams who participated in the BACH challenge and achieved the top accuracy of 90% on the microscopy image dataset. For the WSI tumour region dataset, we compared our classification results with state-of-the-art deep learning networks such as ResNet, DenseNet, and InceptionV3 using a maximum-voting technique, achieving the highest accuracy of 84%. We found that BiLSTMs with CNN features perform much better at modelling patches into an end-to-end image classification network. Additionally, the variable dimensions of WSI tumour regions were used for classification without the need for resizing, suggesting that our method is independent of tumour image size and can process large images without losing resolution details.
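The patch-scanning choice the abstract alludes to determines the order in which a grid of patches is fed to the sequence model. A common option (used here only as an illustration of the idea, not as the paper's specific technique) is a "snake" scan that reverses every other row so consecutive patches in the sequence remain spatial neighbours:

```python
# Two patch-scanning orders for feeding a rows x cols grid of patches
# to a sequence model such as a BiLSTM.

def raster_scan(rows, cols):
    """Plain left-to-right, top-to-bottom order."""
    return [(r, c) for r in range(rows) for c in range(cols)]

def snake_scan(rows, cols):
    """Reverse every other row to preserve spatial continuity
    between consecutive patches in the sequence."""
    order = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        order.extend((r, c) for c in cs)
    return order

# On a 2x3 grid the snake scan keeps sequence neighbours spatially adjacent:
seq = snake_scan(2, 3)
```

In a raster scan the jump from the end of one row to the start of the next breaks spatial continuity; the snake scan avoids that discontinuity, which matters when the RNN is expected to model local context.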
60
Graham S, Epstein D, Rajpoot N. Dense Steerable Filter CNNs for Exploiting Rotational Symmetry in Histology Images. IEEE Trans Med Imaging 2020; 39:4124-4136. [PMID: 32746153] [DOI: 10.1109/tmi.2020.3013246]
Abstract
Histology images are inherently symmetric under rotation, with each orientation equally likely to appear. However, this rotational symmetry is not widely utilised as prior knowledge in modern Convolutional Neural Networks (CNNs), resulting in data-hungry models that learn independent features at each orientation. Making CNNs rotation-equivariant removes the need to learn this set of transformations from the data and instead frees up model capacity, allowing more discriminative features to be learned. This reduction in the number of required parameters also reduces the risk of overfitting. In this paper, we propose Dense Steerable Filter CNNs (DSF-CNNs) that use group convolutions with multiple rotated copies of each filter in a densely connected framework. Each filter is defined as a linear combination of steerable basis filters, enabling exact rotation and decreasing the number of trainable parameters compared to standard filters. We also provide the first in-depth comparison of different rotation-equivariant CNNs for histology image analysis and demonstrate the advantage of encoding rotational symmetry into modern architectures. We show that DSF-CNNs achieve state-of-the-art performance, with significantly fewer parameters, when applied to three different tasks in the area of computational pathology: breast tumour classification, colon gland segmentation, and multi-tissue nuclear segmentation.
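The weight-sharing idea behind rotation-equivariant group convolutions can be sketched by generating the four 90-degree rotations of each base filter, so the network need not learn each orientation from data. This illustrates plain group convolution weight sharing only; the paper's DSF-CNNs go further, using steerable basis filters for exact rotation at arbitrary angles:

```python
# Sketch: derive four rotated copies of one base filter, sharing
# a single set of learned weights across orientations.

def rot90(kernel):
    """Rotate a square kernel 90 degrees counter-clockwise."""
    n = len(kernel)
    return [[kernel[c][n - 1 - r] for c in range(n)] for r in range(n)]

def rotated_copies(kernel):
    """Return the four 90-degree rotations of a base filter."""
    copies = [kernel]
    for _ in range(3):
        copies.append(rot90(copies[-1]))
    return copies

base = [[1, 2], [3, 4]]
group = rotated_copies(base)  # 4 orientations, one set of weights
```

Convolving an image with all four copies (and pooling over orientations) makes the response stable under 90-degree rotations of the input, while training only the parameters of the single base filter.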