1
Wang P, Meng F, Donaldson P, Horan S, Panchy NL, Vischulis E, Winship E, Conner JK, Krysan PJ, Shiu S, Lehti-Shiu MD. High-throughput measurement of plant fitness traits with an object detection method using Faster R-CNN. New Phytologist 2022; 234:1521-1533. PMID: 35218008; PMCID: PMC9310946; DOI: 10.1111/nph.18056.
Abstract
Revealing the contributions of genes to plant phenotype is frequently challenging because loss-of-function effects may be subtle or masked by varying degrees of genetic redundancy. Such effects can potentially be detected by measuring plant fitness, which reflects the cumulative effects of genetic changes over the lifetime of a plant. However, fitness is challenging to measure accurately, particularly in species with high fecundity and relatively small propagule sizes such as Arabidopsis thaliana. An image segmentation-based method using the software ImageJ and an object detection-based method using the Faster Region-based Convolutional Neural Network (R-CNN) algorithm were used for measuring two Arabidopsis fitness traits: seed and fruit counts. The segmentation-based method was error-prone (correlation between true and predicted seed counts, r2 = 0.849) because seeds touching each other were undercounted. By contrast, the object detection-based algorithm yielded near-perfect seed counts (r2 = 0.9996) and highly accurate fruit counts (r2 = 0.980). Comparing seed counts for wild-type and 12 mutant lines revealed fitness effects for three genes; fruit counts revealed the same effects for two genes. Our study provides analysis pipelines and models to facilitate the investigation of Arabidopsis fitness traits and demonstrates the importance of examining fitness traits when studying gene functions.
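For readers reproducing this kind of comparison, the r2 values above are squared Pearson correlations between manual ("true") and model-predicted counts. A minimal sketch of that calculation; the counts below are invented for illustration, not the study's data:

```python
# Illustrative only: squared Pearson correlation (r^2) between manual
# and predicted counts, the accuracy measure quoted in the abstract.
# The example counts are made up, not taken from the paper.

def pearson_r2(true, pred):
    """Squared Pearson correlation between two equal-length count lists."""
    n = len(true)
    mt = sum(true) / n
    mp = sum(pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(true, pred))
    var_t = sum((t - mt) ** 2 for t in true)
    var_p = sum((p - mp) ** 2 for p in pred)
    return cov * cov / (var_t * var_p)

true_counts = [120, 95, 143, 60, 210]   # invented manual seed counts
pred_counts = [118, 97, 140, 62, 205]   # invented model predictions
r2 = pearson_r2(true_counts, pred_counts)
```

A detector that systematically over- or undercounts by a fixed factor would still score r2 = 1, so r2 measures consistency of ranking and scale, not absolute agreement.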
Affiliation(s)
- Peipei Wang
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
- DOE Great Lakes Bioenergy Research Center, Michigan State University, East Lansing, MI 48824, USA
- Fanrui Meng
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
- DOE Great Lakes Bioenergy Research Center, Michigan State University, East Lansing, MI 48824, USA
- Paityn Donaldson
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
- Sarah Horan
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
- Nicholas L. Panchy
- National Institute for Mathematical and Biological Synthesis, University of Tennessee, 1122 Volunteer Blvd, Suite 106, Knoxville, TN 37996-3410, USA
- Elyse Vischulis
- Genetics and Genome Sciences Graduate Program, Michigan State University, East Lansing, MI 48824, USA
- Eamon Winship
- Department of Biochemistry and Molecular Biology, Michigan State University, East Lansing, MI 48824, USA
- Jeffrey K. Conner
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
- W.K. Kellogg Biological Station, Michigan State University, 3700 E. Gull Lake Drive, Hickory Corners, MI 49060, USA
- Ecology, Evolution, and Behavior Graduate Program, Michigan State University, East Lansing, MI 48824, USA
- Patrick J. Krysan
- Department of Horticulture, University of Wisconsin-Madison, Madison, WI 53705, USA
- Shin-Han Shiu
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
- DOE Great Lakes Bioenergy Research Center, Michigan State University, East Lansing, MI 48824, USA
- Genetics and Genome Sciences Graduate Program, Michigan State University, East Lansing, MI 48824, USA
- Ecology, Evolution, and Behavior Graduate Program, Michigan State University, East Lansing, MI 48824, USA
- Department of Computational Mathematics, Science, and Engineering, Michigan State University, East Lansing, MI 48824, USA
2
Astono IP, Welsh JS, Rowe CW, Jobling P. Objective quantification of nerves in immunohistochemistry specimens of thyroid cancer utilising deep learning. PLoS Comput Biol 2022; 18:e1009912. PMID: 35226665; PMCID: PMC8912900; DOI: 10.1371/journal.pcbi.1009912.
Abstract
Accurate quantification of nerves in cancer specimens is important for understanding cancer behaviour. Typically, nerves are manually detected and counted in digitised images of thin tissue sections from excised tumours stained by immunohistochemistry. However, the images are large and the nerves vary substantially in morphology, which makes accurate and objective quantification difficult with existing manual and automated counting techniques. Manual counting is precise but time-consuming, susceptible to inconsistency, and has a high rate of false negatives. Existing automated techniques using digitised tissue sections and colour filters are sensitive but have a high rate of false positives. In this paper we develop a new automated nerve detection approach based on a deep learning model with an augmented classification structure. The approach involves pre-processing to extract image patches for the deep learning model, followed by pixel-level nerve detection using the proposed model. Outcomes assessed were (a) the sensitivity of the model in detecting manually identified nerves (expert annotations) and (b) the precision of additional model-detected nerves. The proposed approach achieves a sensitivity of 89% and a precision of 75%. The code and pre-trained model are publicly available at https://github.com/IA92/Automated_Nerves_Quantification.
The study of nerves as a prognostic marker for cancer is becoming increasingly important. However, accurate quantification of nerves in cancer specimens is difficult owing to limitations of existing manual and automated methods: manual quantification is time-consuming and subject to bias, whilst automated quantification generally has a high rate of false detections that makes it unreliable. In this paper, we propose an automated nerve quantification approach based on a novel deep learning model structure for objective nerve quantification in immunohistochemistry specimens of thyroid cancer. We evaluate the proposed approach against existing manual and automated quantification methods and show that it outperforms them: it has high precision and detects a significant number of nerves missed by experts during manual counting.
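The sensitivity and precision figures quoted above follow from the usual true-positive, false-negative, and false-positive counts. A minimal sketch with invented nerve IDs and counts (the paper itself evaluates detections at pixel level, not by ID):

```python
# Illustrative sketch: sensitivity and precision of a detector measured
# against expert annotations, matching objects by a shared ID.
# All IDs and counts here are invented for illustration.

def evaluate(annotated, detected):
    """Return (sensitivity, precision) given two sets of object IDs."""
    tp = len(annotated & detected)   # nerves found by both expert and model
    fn = len(annotated - detected)   # expert-annotated nerves the model missed
    fp = len(detected - annotated)   # model detections with no annotation
    sensitivity = tp / (tp + fn)
    precision = tp / (tp + fp)
    return sensitivity, precision

expert = set(range(100))                              # 100 annotated nerves
model = set(range(10, 100)) | set(range(1000, 1030))  # 90 hits, 30 extras
```

Note the asymmetry the abstract highlights: extra model detections lower precision but may be genuine nerves the experts missed, which is why the authors assess the precision of additional detections separately.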
Affiliation(s)
- Indriani P. Astono
- School of Engineering, The University of Newcastle, Newcastle, Australia
- James S. Welsh
- School of Engineering, The University of Newcastle, Newcastle, Australia
- Christopher W. Rowe
- School of Medicine and Public Health, The University of Newcastle, Newcastle, Australia
- Phillip Jobling
- School of Biomedical Sciences and Pharmacy, The University of Newcastle, Newcastle, Australia
3
Serre NBC, Fendrych M. ACORBA: Automated workflow to measure Arabidopsis thaliana root tip angle dynamics. Quantitative Plant Biology 2022; 3:e9. PMID: 37077987; PMCID: PMC10095971; DOI: 10.1017/qpb.2022.4.
Abstract
The ability of plants to sense gravity and orient their root growth accordingly is studied in many laboratories, and manual analysis of image data is known to be subject to human bias. Several semi-automated tools are available for analysing images from flatbed scanners, but there is no solution for automatically measuring root bending angle over time in vertical-stage microscopy images. To address these problems, we developed ACORBA, automated software that measures root bending angle over time from vertical-stage microscope and flatbed scanner images; it also has a semi-automated mode for camera or stereomicroscope images. ACORBA represents a flexible approach, based on both traditional image processing and deep machine-learning segmentation, to measuring root angle progression over time. Because the software is automated, it limits human interaction and is reproducible. ACORBA will support the plant biology community by reducing labour and increasing the reproducibility of image analysis of root gravitropism.
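The quantity ACORBA reports, the bending angle of a root tip relative to the gravity vector, can be sketched from two tracked tip positions. This is an illustration of the geometry only, not ACORBA's implementation:

```python
import math

# Illustrative geometry (not ACORBA's code): bending angle of a root
# tip relative to vertical gravity, from two tracked tip positions in
# image coordinates, where y increases downward.

def tip_angle_deg(p0, p1):
    """Angle between the tip's growth direction and vertical, in degrees.
    0 means growing straight down; 90 means purely horizontal drift."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]          # downward is positive in image coordinates
    return math.degrees(math.atan2(dx, dy))
```

Tracking this angle across frames yields the bending-over-time curves that gravitropism assays compare between genotypes or treatments.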
Affiliation(s)
- Nelson B C Serre
- Department of Experimental Plant Biology, Faculty of Sciences, Charles University, Prague, Czech Republic
- Matyáš Fendrych
- Department of Experimental Plant Biology, Faculty of Sciences, Charles University, Prague, Czech Republic
4
Xiao Q, Bai X, Zhang C, He Y. Advanced high-throughput plant phenotyping techniques for genome-wide association studies: A review. J Adv Res 2022; 35:215-230. PMID: 35003802; PMCID: PMC8721248; DOI: 10.1016/j.jare.2021.05.002.
Abstract
Linking phenotypes and genotypes to identify genetic architectures that regulate important traits is crucial for plant breeding and the development of plant genomics. In recent years, genome-wide association studies (GWASs) have been applied extensively to interpret relationships between genes and traits. Successful GWAS application requires comprehensive genomic and phenotypic data from large populations. Although multiple high-throughput DNA sequencing approaches are available for the generation of genomics data, the capacity to generate high-quality phenotypic data is lagging far behind. Traditional methods for plant phenotyping mostly rely on manual measurements, which are laborious, inaccurate, and time-consuming, greatly impairing the acquisition of phenotypic data from large populations. In contrast, high-throughput phenotyping has unique advantages, facilitating rapid, non-destructive, and high-throughput detection, and, in turn, addressing the shortcomings of traditional methods.
Aim of Review: This review summarizes the current status of the integration of high-throughput phenotyping and GWAS in plants, in addition to discussing the inherent challenges and future prospects.
Key Scientific Concepts of Review: High-throughput phenotyping, which facilitates non-contact and dynamic measurements, has the potential to offer high-quality trait data for GWAS and, in turn, to enhance the unraveling of the genetic structures of complex plant traits. In conclusion, integration of high-throughput phenotyping with GWAS could facilitate the decoding of information in plant genomes.
Affiliation(s)
- Qinlin Xiao
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
- Xiulin Bai
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
- Chu Zhang
- School of Information Engineering, Huzhou University, Huzhou 313000, China
- Yong He
- College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
- Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
5
Li J, Peng J, Jiang X, Rea AC, Peng J, Hu J. DeepLearnMOR: a deep-learning framework for fluorescence image-based classification of organelle morphology. Plant Physiology 2021; 186:1786-1799. PMID: 34618108; PMCID: PMC8331148; DOI: 10.1093/plphys/kiab223.
Abstract
The proper biogenesis, morphogenesis, and dynamics of subcellular organelles are essential to their metabolic functions. Conventional techniques for identifying, classifying, and quantifying abnormalities in organelle morphology are largely manual and time-consuming, and require specific expertise. Deep learning has the potential to revolutionize image-based screens by greatly improving their scope, speed, and efficiency. Here, we used transfer learning and a convolutional neural network (CNN) to analyze over 47,000 confocal microscopy images from Arabidopsis wild-type and mutant plants with abnormal division of one of three essential energy organelles: chloroplasts, mitochondria, or peroxisomes. We have built a deep-learning framework, DeepLearnMOR (Deep Learning of the Morphology of Organelles), which can rapidly classify image categories and identify abnormalities in organelle morphology with over 97% accuracy. Feature visualization analysis identified important features used by the CNN to predict morphological abnormalities, and visual clues helped to better understand the decision-making process, thereby validating the reliability and interpretability of the neural network. This framework establishes a foundation for future larger-scale research with broader scopes and greater data set diversity and heterogeneity.
Affiliation(s)
- Jiying Li
- Microsoft Corporation, Redmond, Washington 98052
- Jinghao Peng
- School of Computer Science, Northwestern Polytechnical University, Xi’an 710072, China
- Xiaotong Jiang
- Department of Energy Plant Research Laboratory, Michigan State University, East Lansing, Michigan 48824
- Anne C Rea
- Department of Energy Plant Research Laboratory, Michigan State University, East Lansing, Michigan 48824
- Jiajie Peng
- School of Computer Science, Northwestern Polytechnical University, Xi’an 710072, China
- Jianping Hu
- Department of Energy Plant Research Laboratory, Michigan State University, East Lansing, Michigan 48824
6
Warman C, Fowler JE. Deep learning-based high-throughput phenotyping can drive future discoveries in plant reproductive biology. Plant Reproduction 2021; 34:81-89. PMID: 33725183; PMCID: PMC8128740; DOI: 10.1007/s00497-021-00407-2.
Abstract
Advances in deep learning are providing a powerful set of image analysis tools that are readily accessible for high-throughput phenotyping applications in plant reproductive biology. High-throughput phenotyping systems are becoming critical for answering biological questions on a large scale. These systems have historically relied on traditional computer vision techniques. However, neural networks and specifically deep learning are rapidly becoming more powerful and easier to implement. Here, we examine how deep learning can drive phenotyping systems and be used to answer fundamental questions in reproductive biology. We describe previous applications of deep learning in the plant sciences, provide general recommendations for applying these methods to the study of plant reproduction, and present a case study in maize ear phenotyping. Finally, we highlight several examples where deep learning has enabled research that was previously out of reach and discuss the future outlook of these methods.
Affiliation(s)
- Cedar Warman
- Department of Botany and Plant Pathology, Oregon State University, Corvallis, OR, USA
- School of Plant Sciences, University of Arizona, Tucson, AZ, USA
- John E Fowler
- Department of Botany and Plant Pathology, Oregon State University, Corvallis, OR, USA
7
Ghahremani M, Williams K, Corke FMK, Tiddeman B, Liu Y, Doonan JH. Deep segmentation of point clouds of wheat. Frontiers in Plant Science 2021; 12:608732. PMID: 33841454; PMCID: PMC8025700; DOI: 10.3389/fpls.2021.608732.
Abstract
The 3D analysis of plants has become increasingly effective in modeling the relative structure of organs and other traits of interest. In this paper, we introduce a novel pattern-based deep neural network, Pattern-Net, for segmentation of point clouds of wheat. This study is the first to segment the point clouds of wheat into defined organs and to analyse their traits directly in 3D space. Point clouds have no regular grid and thus their segmentation is challenging. Pattern-Net creates a dynamic link among neighbors to seek stable patterns from a 3D point set across several levels of abstraction using the K-nearest neighbor algorithm. To this end, different layers are connected to each other to create complex patterns from the simple ones, strengthen dynamic link propagation, alleviate the vanishing-gradient problem, encourage link reuse and substantially reduce the number of parameters. The proposed deep network is capable of analysing and decomposing unstructured complex point clouds into semantically meaningful parts. Experiments on a wheat dataset verify the effectiveness of our approach for segmentation of wheat in 3D space.
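The dynamic links Pattern-Net builds rest on the K-nearest-neighbour primitive mentioned above. A brute-force sketch on a toy 3D point set (a real point-cloud pipeline would use a spatial index such as a k-d tree rather than this O(n²) scan):

```python
# Illustrative brute-force k-nearest-neighbour search on 3D points,
# the primitive from which Pattern-Net constructs links among
# neighbours. The toy cloud below is invented for illustration.

def knn(points, k):
    """For each point index, return the indices of its k nearest neighbours."""
    def d2(a, b):
        # squared Euclidean distance; avoids an unnecessary sqrt
        return sum((x - y) ** 2 for x, y in zip(a, b))
    out = []
    for i, p in enumerate(points):
        order = sorted((j for j in range(len(points)) if j != i),
                       key=lambda j: d2(p, points[j]))
        out.append(order[:k])
    return out

cloud = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (5, 5, 5)]  # toy point cloud
```

Because point clouds have no regular grid, such neighbour graphs stand in for the fixed pixel neighbourhoods that 2D convolutions rely on.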
Affiliation(s)
- Morteza Ghahremani
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
- Department of Computer Science, Aberystwyth University, Aberystwyth, United Kingdom
- Kevin Williams
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
- Fiona M. K. Corke
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
- Bernard Tiddeman
- Department of Computer Science, Aberystwyth University, Aberystwyth, United Kingdom
- Yonghuai Liu
- Department of Computer Science, Edge Hill University, Ormskirk, United Kingdom
- John H. Doonan
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
8
Hamidinekoo A, Garzón-Martínez GA, Ghahremani M, Corke FMK, Zwiggelaar R, Doonan JH, Lu C. DeepPod: a convolutional neural network based quantification of fruit number in Arabidopsis. GigaScience 2020; 9:giaa012. PMID: 32129846; PMCID: PMC7055469; DOI: 10.1093/gigascience/giaa012.
Abstract
Background: High-throughput phenotyping based on non-destructive imaging has great potential in plant biology and breeding programs. However, efficient feature extraction and quantification from image data remains a bottleneck that needs to be addressed. Advances in sensor technology have led to the increasing use of imaging to monitor and measure a range of plants, including the model Arabidopsis thaliana. These extensive datasets contain diverse trait information, but feature extraction is often still implemented using approaches requiring substantial manual input.
Results: The computational detection and segmentation of individual fruits from images is a challenging task, for which we have developed DeepPod, a patch-based two-phase deep learning framework. The associated manual annotation task is simple and cost-effective, without the need for detailed segmentation or bounding boxes. Convolutional neural networks (CNNs) are used to classify different parts of the plant inflorescence, including the tip, base, and body of the siliques and the inflorescence stem. In a post-processing step, different parts of the same silique are joined together for silique detection and localisation, whilst taking into account possible overlap among siliques. The proposed framework is further validated on a separate test dataset of 2,408 images. Comparison of the CNN-based predictions with manual counts (R2 = 0.90) demonstrated the capability of the method for estimating silique number.
Conclusions: The DeepPod framework provides a rapid and accurate estimate of fruit number in a model system widely used by biologists to investigate many fundamental processes underlying growth and reproduction.
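The patch-based classification described above starts by tiling each image into fixed-size patches that the CNN then labels. A minimal sketch of that extraction step, with an invented patch size and toy image (DeepPod's actual patch geometry is described in the paper):

```python
# Minimal sketch of the patch-extraction step in a patch-based
# pipeline like DeepPod: tile a 2D image into non-overlapping
# fixed-size patches. Image shape and patch size are invented.

def extract_patches(img, size):
    """Split a 2D list-of-lists image into size x size patches.
    Rows/columns that do not fill a whole patch are dropped."""
    h, w = len(img), len(img[0])
    patches = []
    for top in range(0, h - size + 1, size):
        for left in range(0, w - size + 1, size):
            patches.append([row[left:left + size]
                            for row in img[top:top + size]])
    return patches

image = [[r * 8 + c for c in range(8)] for r in range(8)]  # 8x8 toy image
```

Each patch would then be classified (tip, base, silique body, stem), and the per-patch labels stitched back together in post-processing to localise whole siliques.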
Affiliation(s)
- Azam Hamidinekoo
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY23 3DB, UK
- Gina A Garzón-Martínez
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY23 3EB, UK
- Morteza Ghahremani
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY23 3DB, UK
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY23 3EB, UK
- Fiona M K Corke
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY23 3EB, UK
- Reyer Zwiggelaar
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY23 3DB, UK
- John H Doonan
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY23 3EB, UK
- Chuan Lu
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY23 3DB, UK