1
Krishnan NM, Kumar S, Panda B. Fruit-In-Sight: A deep learning-based framework for secondary metabolite class prediction using fruit and leaf images. PLoS One 2024; 19:e0308708. PMID: 39116159; PMCID: PMC11309380; DOI: 10.1371/journal.pone.0308708.
Abstract
Fruits produce a wide variety of secondary metabolites of great economic value. Analytical measurement of the metabolites is tedious, time-consuming, and expensive. Additionally, metabolite concentrations vary greatly from tree to tree, making it difficult to choose trees for fruit collection. The current study tested whether deep learning-based models can be developed using fruit and leaf images alone to predict a metabolite's concentration class (high or low). We collected fruits and leaves (n = 1045) from neem trees grown in the wild across 0.6 million sq km, imaged them, and measured the concentration of five metabolites (azadirachtin, deacetyl-salannin, salannin, nimbin, and nimbolide) using high-performance liquid chromatography. We used the data to train deep learning models for metabolite class prediction. The best model out of the seven tested (YOLOv5, GoogLeNet, InceptionNet, EfficientNet_B0, Resnext_50, Resnet18, and SqueezeNet) provided a validation F1 score of 0.93 and a test F1 score of 0.88. The sensitivity and specificity of the fruit model alone in the test set were 83.52 ± 6.19 and 82.35 ± 5.96 for the low class, and 79.40 ± 8.50 and 85.64 ± 6.21 for the high class, respectively. Using a multi-analyte framework, the sensitivity was further boosted to 92.67 ± 5.25 for the low class and 88.11 ± 9.17 for the high class, and the specificity to 100% for both classes. We incorporated the multi-analyte model into an Android mobile app, Fruit-In-Sight, that uses fruit and leaf images to decide whether to 'pick' or 'not pick' the fruits from a specific tree based on the metabolite concentration class. Our study provides evidence that images of fruits and leaves alone can predict the concentration class of a secondary metabolite without expensive laboratory equipment and cumbersome analytical procedures, thus simplifying the process of choosing the right tree for fruit collection.
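For readers unfamiliar with the per-class metrics this abstract reports, a minimal sketch of how sensitivity, specificity, and F1 follow from a binary confusion matrix; the counts below are invented for illustration and are not the study's data.

```python
# Illustrative only (not the authors' code): per-class sensitivity,
# specificity, and F1 derived from binary confusion-matrix counts.

def binary_metrics(tp, fp, fn, tn):
    """Return (sensitivity, specificity, f1) treating one class as positive."""
    sensitivity = tp / (tp + fn)          # recall of the positive class
    specificity = tn / (tn + fp)          # recall of the negative class
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return sensitivity, specificity, f1

# Hypothetical counts, with 'high' taken as the positive class.
sens, spec, f1 = binary_metrics(tp=80, fp=15, fn=20, tn=85)
print(round(sens, 2), round(spec, 2), round(f1, 2))  # → 0.8 0.85 0.82
```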
Affiliation(s)
- Saroj Kumar
- School of Biotechnology, Jawaharlal Nehru University, New Delhi, India
- Binay Panda
- School of Biotechnology, Jawaharlal Nehru University, New Delhi, India
- Special Centre for Systems Medicine, Jawaharlal Nehru University, New Delhi, India
2
Xu J, Li Y, Zhang M, Zhang S. Sustainable agriculture in the digital era: Past, present, and future trends by bibliometric analysis. Heliyon 2024; 10:e34612. PMID: 39113949; PMCID: PMC11305306; DOI: 10.1016/j.heliyon.2024.e34612.
Abstract
The digital era is reshaping agricultural practices, opening new avenues for sustainable growth, and proving indispensable in global challenges like food security and environmental conservation. However, a comprehensive understanding of this evolving landscape remains paramount. This research evaluates 344 papers from the Web of Science database to delve into the historical and current patterns of sustainable agriculture in the digital era through bibliometric analysis, and to project future research domains. Specifically, citation analysis identified influential papers, journals, institutions, and countries, while co-authorship analysis verified the interactions between authors, affiliations, and countries. Co-citation analysis found four hotspot clusters: prosperity and challenges in agricultural sustainability, digital information and agricultural development, innovations for sustainable agriculture, and geospatial analysis in environmental studies. The keyword co-occurrence analysis revealed four main clusters for future studies: smart agriculture and biodiversity conservation, digitalization and sustainable agriculture, technologies and agricultural challenge management, and digital intelligence and farmer adoption. The study pioneers the use of bibliometric analysis to explore sustainable agriculture in the digital era. It presents invaluable insights into the evolving landscape of this field, summarizing its hotspots and suggesting future trajectories.
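The keyword co-occurrence analysis mentioned above reduces, at its core, to counting how often keyword pairs appear together across paper records. A bare-bones sketch of that counting step; the paper records below are invented, not the study's corpus.

```python
# Minimal keyword co-occurrence counting, as used in bibliometric
# clustering. Each "paper" is just its keyword set here (invented data).
from collections import Counter
from itertools import combinations

papers = [
    {"smart agriculture", "biodiversity"},
    {"smart agriculture", "digitalization"},
    {"digitalization", "smart agriculture", "biodiversity"},
]

cooccur = Counter()
for kws in papers:
    # Sort so each unordered pair is counted under one canonical key.
    for a, b in combinations(sorted(kws), 2):
        cooccur[(a, b)] += 1

print(cooccur[("biodiversity", "smart agriculture")])  # → 2
```

Clustering tools such as VOSviewer operate on exactly this kind of pair-count matrix.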
Affiliation(s)
- Jiahui Xu
- International Education College, Hebei Finance University, Baoding, 071051, Hebei, China
- Yanzi Li
- International Education College, Hebei Finance University, Baoding, 071051, Hebei, China
- Meiping Zhang
- Agriculture College, Heilongjiang Bayi Agricultural University, Daqing, 163319, Heilongjiang, China
- Shuhan Zhang
- PBC School of Finance, Tsinghua University, Beijing, 100083, China
3
Wang X, Xu Y, Wei X. Phenotypic characteristics of the mycelium of Pleurotus geesteranus using image recognition technology. Front Bioeng Biotechnol 2024; 12:1338276. PMID: 38952667; PMCID: PMC11215179; DOI: 10.3389/fbioe.2024.1338276.
Abstract
Phenotypic analysis has significant potential for aiding breeding efforts. However, there is a notable lack of studies utilizing phenotypic analysis in the field of edible fungi. Pleurotus geesteranus is a lucrative edible fungus with significant market demand and substantial industrial output, and early-stage phenotypic analysis of P. geesteranus is imperative during its breeding process. This study utilizes image recognition technology to investigate the phenotypic features of the mycelium of P. geesteranus, aiming to establish the relationships between these phenotypic characteristics and mycelial quality. Four groups of mycelia, namely non-degraded mycelium, degraded mycelium, and the 5th and 14th subcultures, are used as image sources. Two categories of phenotypic metrics, outline and texture, are quantitatively calculated and analyzed. For the outline features of the mycelium, five indices, namely mycelial perimeter, radius, area, growth rate, and change speed, are proposed to characterize mycelial growth. For the texture features of the mycelium, five indices, namely mycelial coverage, roundness, groove depth, density, and density change, are studied to analyze the phenotypic characteristics of the mycelium. Moreover, we compared the cellulase and laccase activities of the mycelium and found that the cellulase level was consistent with the phenotypic indices of the mycelium, further verifying the accuracy of digital image processing technology in analyzing the phenotypic characteristics of the mycelium. The results indicate that there are significant differences in these 10 phenotypic characteristic indices (P < 0.001), elucidating a close relationship between phenotypic characteristics and mycelial quality. This conclusion facilitates rapid and accurate strain selection in the early breeding stage of P. geesteranus.
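Several of the outline indices above (perimeter, area, roundness) are derivable from a binary mycelium mask. A common definition of roundness is the isoperimetric quotient 4·pi·A/P², which is 1.0 for a perfect circle; whether this paper uses exactly that formula is an assumption here, and the values below are illustrative.

```python
# Illustrative sketch: roundness (circularity) from area and perimeter,
# using the standard isoperimetric quotient. Not the authors' code.
import math

def roundness(area, perimeter):
    return 4 * math.pi * area / perimeter ** 2

# A circle of radius 10: area = pi*r^2, perimeter = 2*pi*r.
r_circle = roundness(math.pi * 10**2, 2 * math.pi * 10)
print(round(r_circle, 3))  # → 1.0
```

Less compact (more ragged) shapes score below 1.0, which is what makes the index useful for texture comparison.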
Affiliation(s)
- Xingyi Wang
- College of Mechanical and Electronic Engineering, Fujian Agriculture and Forestry University, Fuzhou, China
- Ya Xu
- College of Computer and Information Sciences, Fujian Agriculture and Forestry University, Fuzhou, China
- Xuan Wei
- College of Mechanical and Electronic Engineering, Fujian Agriculture and Forestry University, Fuzhou, China
4
Harandi N, Vandenberghe B, Vankerschaver J, Depuydt S, Van Messem A. How to make sense of 3D representations for plant phenotyping: a compendium of processing and analysis techniques. Plant Methods 2023; 19:60. PMID: 37353846; DOI: 10.1186/s13007-023-01031-z.
Abstract
Computer vision technology is moving more and more towards a three-dimensional approach, and plant phenotyping is following this trend. However, despite its potential, the complexity of the analysis of 3D representations has been the main bottleneck hindering the wider deployment of 3D plant phenotyping. In this review we provide an overview of typical steps for the processing and analysis of 3D representations of plants, to offer potential users of 3D phenotyping a first gateway into its application, and to stimulate its further development. We focus on plant phenotyping applications where the goal is to measure characteristics of single plants or crop canopies on a small scale in research settings, as opposed to large scale crop monitoring in the field.
Affiliation(s)
- Negin Harandi
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Joris Vankerschaver
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Stephen Depuydt
- Erasmus Applied University of Sciences and Arts, Campus Kaai, Nijverheidskaai 170, Anderlecht, Belgium
- Arnout Van Messem
- Department of Mathematics, Université de Liège, Allée de la Découverte 12, Liège, Belgium
5
Xie X, Ge Y, Walia H, Yang J, Yu H. Leaf-counting in monocot plants using deep regression models. Sensors (Basel) 2023; 23:1890. PMID: 36850487; PMCID: PMC9962473; DOI: 10.3390/s23041890.
Abstract
Leaf numbers are vital in estimating the yield of crops. Traditional manual leaf-counting is tedious, costly, and labor-intensive. Recent convolutional neural network-based approaches achieve promising results for rosette plants. However, there is a lack of effective solutions to tackle leaf counting for monocot plants, such as sorghum and maize. The existing approaches often require substantial training datasets and annotations, thus incurring significant overheads for labeling. Moreover, these approaches can easily fail when leaf structures are occluded in images. To address these issues, we present a new deep neural network-based method that does not require any effort to label leaf structures explicitly and achieves superior performance even with severe leaf occlusions in images. Our method extracts leaf skeletons to gain more topological information and applies augmentation to enhance structural variety in the original images. Then, we feed the combination of original images, derived skeletons, and augmentations into a regression model, transferred from Inception-Resnet-V2, for leaf-counting. Through an input-modification method and a Grad-CAM method, we find that leaf tips are important to our regression model. The superiority of the proposed method is validated via comparison with the existing approaches conducted on a similar dataset. The results show that our method not only improves the accuracy of leaf-counting under overlaps and occlusions but also lowers the training cost, requiring fewer annotations than the previous state-of-the-art approaches. The robustness of the proposed method against noise is also verified by removing environmental noise during image preprocessing and reducing the noise introduced by skeletonization, with satisfactory outcomes.
Affiliation(s)
- Xinyan Xie
- School of Computing, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Yufeng Ge
- Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Harkamal Walia
- Department of Agronomy and Horticulture, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Jinliang Yang
- Department of Agronomy and Horticulture, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Hongfeng Yu
- School of Computing, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
6
Yuan J, Kaur D, Zhou Z, Nagle M, Kiddle NG, Doshi NA, Behnoudfar A, Peremyslova E, Ma C, Strauss SH, Li F. Robust high-throughput phenotyping with deep segmentation enabled by a web-based annotator. Plant Phenomics 2022; 2022:9893639. PMID: 36059601; PMCID: PMC9394117; DOI: 10.34133/2022/9893639.
Abstract
The abilities of plant biologists and breeders to characterize the genetic basis of physiological traits are limited by their abilities to obtain quantitative data representing precise details of trait variation, and particularly to collect this data at a high-throughput scale with low cost. Although deep learning methods have demonstrated unprecedented potential to automate plant phenotyping, these methods commonly rely on large training sets that can be time-consuming to generate. Intelligent algorithms have therefore been proposed to enhance the productivity of these annotations and reduce human effort. We propose a high-throughput phenotyping system which features a Graphical User Interface (GUI) and a novel interactive segmentation algorithm: Semantic-Guided Interactive Object Segmentation (SGIOS). By providing a user-friendly interface and intelligent assistance with annotation, this system offers potential to streamline and accelerate the generation of training sets, reducing the effort required by the user. Our evaluation shows that our proposed SGIOS model requires fewer user inputs compared to the state-of-the-art models for interactive segmentation. As a case study of the use of the GUI applied for genetic discovery in plants, we present an example of results from a preliminary genome-wide association study (GWAS) of in planta regeneration in Populus trichocarpa (poplar). We further demonstrate that the inclusion of a semantic prior map with SGIOS can accelerate the training process for future GWAS, using a sample of a dataset extracted from a poplar GWAS of in vitro regeneration. The capabilities of our phenotyping system surpass those of unassisted humans to rapidly and precisely phenotype our traits of interest. The scalability of this system enables large-scale phenomic screens that would otherwise be time-prohibitive, thereby providing increased power for GWAS, mutant screens, and other studies relying on large sample sizes to characterize the genetic basis of trait variation. Our user-friendly system can be used by researchers lacking a computational background, thus helping to democratize the use of deep segmentation as a tool for plant phenotyping.
Affiliation(s)
- Zheng Zhou
- Oregon State University, Corvallis, OR, USA
- Fuxin Li
- Oregon State University, Corvallis, OR, USA
7
Bhagat S, Kokare M, Haswani V, Hambarde P, Kamble R. Eff-UNet++: A novel architecture for plant leaf segmentation and counting. Ecol Inform 2022. DOI: 10.1016/j.ecoinf.2022.101583.
8
Xiang L, Nolan TM, Bao Y, Elmore M, Tuel T, Gai J, Shah D, Wang P, Huser NM, Hurd AM, McLaughlin SA, Howell SH, Walley JW, Yin Y, Tang L. Robotic Assay for Drought (RoAD): an automated phenotyping system for brassinosteroid and drought responses. Plant J 2021; 107:1837-1853. PMID: 34216161; DOI: 10.1111/tpj.15401.
Abstract
Brassinosteroids (BRs) are a group of plant steroid hormones involved in regulating growth, development, and stress responses. Many components of the BR pathway have previously been identified and characterized. However, BR phenotyping experiments are typically performed in a low-throughput manner, such as on Petri plates. Additionally, the BR pathway affects drought responses, but drought experiments are time-consuming and difficult to control. To mitigate these issues and increase throughput, we developed the Robotic Assay for Drought (RoAD) system to perform BR and drought response experiments in soil-grown Arabidopsis plants. RoAD is equipped with a robotic arm, a rover, a bench scale, a precisely controlled watering system, an RGB camera, and a laser profilometer. It performs daily weighing, watering, and imaging tasks and is capable of administering BR response assays by watering plants with Propiconazole (PCZ), a BR biosynthesis inhibitor. We developed image processing algorithms for both plant segmentation and phenotypic trait extraction to accurately measure traits including plant area, plant volume, leaf length, and leaf width. We then applied machine learning algorithms that utilize the extracted phenotypic parameters to identify image-derived traits that can distinguish control, drought-treated, and PCZ-treated plants. We carried out PCZ and drought experiments on a set of BR mutants and Arabidopsis accessions with altered BR responses. Finally, we extended the RoAD assays to perform BR response assays using PCZ in Zea mays (maize) plants. This study establishes an automated and non-invasive robotic imaging system as a tool to accurately measure morphological and growth-related traits of Arabidopsis and maize plants in 3D, providing insights into the BR-mediated control of plant growth and stress responses.
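The abstract does not specify which machine learning algorithms distinguish the treatment groups from the extracted traits, so as a hedged illustration only: one of the simplest classifiers applicable to such trait vectors is nearest-centroid. The trait values and group structure below are invented, not the study's data.

```python
# Illustrative nearest-centroid classification of treatment groups from
# extracted trait vectors (e.g., [plant area, leaf length]). Invented data;
# not the RoAD study's actual models.
import math

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

def nearest_centroid(x, centroids):
    # centroids: {label: centroid vector}; pick the closest label.
    return min(centroids, key=lambda lbl: math.dist(x, centroids[lbl]))

train = {
    "control": [[40.0, 12.0], [42.0, 13.0]],
    "drought": [[25.0, 8.0], [23.0, 7.5]],
}
cents = {lbl: centroid(rows) for lbl, rows in train.items()}
print(nearest_centroid([24.0, 8.2], cents))  # → drought
```

In practice, traits on different scales would first be standardized so no single trait dominates the distance.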
Affiliation(s)
- Lirong Xiang
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Trevor M Nolan
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Yin Bao
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Mitch Elmore
- Department of Plant Pathology and Microbiology, Iowa State University, Ames, IA, 50011, USA
- Taylor Tuel
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Jingyao Gai
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Dylan Shah
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Ping Wang
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Nicole M Huser
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Ashley M Hurd
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Sean A McLaughlin
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Stephen H Howell
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Justin W Walley
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Department of Plant Pathology and Microbiology, Iowa State University, Ames, IA, 50011, USA
- Yanhai Yin
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Lie Tang
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
9
Kolhar S, Jagtap J. Convolutional neural network based encoder-decoder architectures for semantic segmentation of plants. Ecol Inform 2021. DOI: 10.1016/j.ecoinf.2021.101373.
10
Yao L, van de Zedde R, Kowalchuk G. Recent developments and potential of robotics in plant eco-phenotyping. Emerg Top Life Sci 2021; 5:289-300. PMID: 34013965; PMCID: PMC8166337; DOI: 10.1042/etls20200275.
Abstract
Automated acquisition of plant eco-phenotypic information can serve as a decision-making basis for precision agricultural management and can also provide plant breeders and plant physiologists with detailed insights into plant growth status, pest management, and water and fertilizer management. Because both the microscopic components and the macroscopic morphology of plants are affected by the ecological environment, research on plant eco-phenotyping is more meaningful than the study of single-plant phenotyping. To achieve high-throughput acquisition of phenotyping information, the combination of high-precision sensors and intelligent robotic platforms has become an emerging research focus. Robotic platforms and automated systems are the important carriers of phenotyping monitoring sensors that enable large-scale screening. Through diverse designs and flexible systems, efficient operation can be achieved across a range of experimental and field platforms. The combination of robot technology and plant phenotyping monitoring tools provides the data to inform novel artificial intelligence (AI) approaches that will provide stepping stones for new research breakthroughs. Therefore, this article introduces robotics and eco-phenotyping and examines research significant to this novel domain of plant eco-phenotyping. Given the monitoring scenarios of phenotyping information at different scales, the intelligent robot technologies, automation platforms, and advanced sensor equipment used are summarized in detail. We further discuss the challenges posed to current research as well as future developmental trends in the application of robot technology and plant eco-phenotyping. These include the use of collected data for AI applications, high-bandwidth data transfer, and large, well-structured (meta)data storage approaches in plant sciences and agriculture.
Affiliation(s)
- Lili Yao
- Wageningen University & Research, Wageningen, Netherlands
11
Jangra S, Chaudhary V, Yadav RC, Yadav NR. High-throughput phenotyping: A platform to accelerate crop improvement. Phenomics (Cham) 2021; 1:31-53. PMID: 36939738; PMCID: PMC9590473; DOI: 10.1007/s43657-020-00007-6.
Abstract
Development of high-throughput phenotyping technologies has progressed considerably in the last 10 years. These technologies provide precise measurements of desired traits among thousands of field-grown plants under diversified environments; this is a critical step towards selecting better performing lines with respect to yield, disease resistance, and stress tolerance to accelerate crop improvement programs. High-throughput phenotyping techniques and platforms help unravel the genetic basis of complex traits associated with plant growth and development, as well as other targeted traits. This review focuses on advancements in the technologies involved in high-throughput, field-based, aerial, and unmanned platforms. Development of user-friendly data management tools and software to better understand phenotyping will increase the use of field-based high-throughput techniques, which have the potential to revolutionize breeding strategies and meet the future needs of stakeholders.
Affiliation(s)
- Sumit Jangra
- Department of Molecular Biology, Biotechnology, and Bioinformatics, CCS Haryana Agricultural University, Hisar, 125004 India
- Vrantika Chaudhary
- Department of Molecular Biology, Biotechnology, and Bioinformatics, CCS Haryana Agricultural University, Hisar, 125004, India
- Ram C. Yadav
- Department of Molecular Biology, Biotechnology, and Bioinformatics, CCS Haryana Agricultural University, Hisar, 125004, India
- Neelam R. Yadav
- Department of Molecular Biology, Biotechnology, and Bioinformatics, CCS Haryana Agricultural University, Hisar, 125004, India
12
Hamidinekoo A, Garzón-Martínez GA, Ghahremani M, Corke FMK, Zwiggelaar R, Doonan JH, Lu C. DeepPod: a convolutional neural network based quantification of fruit number in Arabidopsis. Gigascience 2021; 9:5780255. PMID: 32129846; PMCID: PMC7055469; DOI: 10.1093/gigascience/giaa012.
Abstract
Background: High-throughput phenotyping based on non-destructive imaging has great potential in plant biology and breeding programs. However, efficient feature extraction and quantification from image data remains a bottleneck that needs to be addressed. Advances in sensor technology have led to the increasing use of imaging to monitor and measure a range of plants including the model Arabidopsis thaliana. These extensive datasets contain diverse trait information, but feature extraction is often still implemented using approaches requiring substantial manual input.
Results: The computational detection and segmentation of individual fruits from images is a challenging task, for which we have developed DeepPod, a patch-based two-phase deep learning framework. The associated manual annotation task is simple and cost-effective without the need for detailed segmentation or bounding boxes. Convolutional neural networks (CNNs) are used for classifying different parts of the plant inflorescence, including the tip, base, and body of the siliques and the stem inflorescence. In a post-processing step, different parts of the same silique are joined together for silique detection and localization, whilst taking into account possible overlap among the siliques. The proposed framework is further validated on a separate test dataset of 2,408 images. Comparison of the CNN-based predictions with manual counting (R2 = 0.90) demonstrated the capability of the method for estimating silique number.
Conclusions: The DeepPod framework provides a rapid and accurate estimate of fruit number in a model system widely used by biologists to investigate many fundamental processes underlying growth and reproduction.
Affiliation(s)
- Azam Hamidinekoo
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY233DB, UK
- Gina A Garzón-Martínez
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY233EB, UK
- Morteza Ghahremani
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY233DB, UK
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY233EB, UK
- Fiona M K Corke
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY233EB, UK
- Reyer Zwiggelaar
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY233DB, UK
- John H Doonan
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, Ceredigion SY233EB, UK
- Chuan Lu
- Department of Computer Science, Aberystwyth University, Aberystwyth, Ceredigion SY233DB, UK
13
Bhugra S, Garg K, Chaudhury S, Lall B. A hierarchical framework for leaf instance segmentation: Application to plant phenotyping. 2020 25th International Conference on Pattern Recognition (ICPR) 2021. DOI: 10.1109/icpr48806.2021.9411981.
14
Growth of pineapple plantlets during acclimatisation can be monitored through automated image analysis of the canopy. EuroBiotech Journal 2020. DOI: 10.2478/ebtj-2020-0026.
Abstract
Pineapple is an economically important tropical fruit crop, but the lack of adequate planting material limits its productivity. A range of micropropagation protocols has been developed over the years to address this shortfall. Still, the final stage of micropropagation, i.e. acclimatisation, remains a challenge as pineapple plantlets grow very slowly. Several studies have been conducted focusing on this phase and attempting to improve plantlet growth and establishment, which requires tools for the non-destructive evaluation of growth during acclimatisation. This report describes the use of semi-automated and automated image analysis to quantify canopy growth of pineapple plantlets, during five months of acclimatisation. The canopy area progressively increased during acclimatisation, particularly after 90 days. Regression analyses were performed to determine the relationships between the automated image analysis and morphological indicators of growth. The mathematical relationships between estimations of the canopy area and the fresh and dry weights of intact plantlets, middle-aged leaves (D leaves) and roots showed determination coefficients (R2) between 0.84 and 0.92. We propose an appropriate tool for the simple, objective and non-destructive evaluation of pineapple plantlets growth, which can be generally applied for plant phenotyping, to reduce costs and develop streamlined pipelines for the assessment of plant growth.
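The regression analyses above relate estimated canopy area to plantlet weight and report determination coefficients (R2). A minimal sketch of ordinary least squares with R2, on invented data points rather than the study's measurements:

```python
# Illustrative OLS fit of fresh weight on canopy area, with R^2.
# The area/weight numbers are hypothetical.
def ols_r2(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

area = [10.0, 15.0, 22.0, 30.0, 41.0]    # canopy area, cm^2 (invented)
weight = [1.1, 1.6, 2.4, 3.1, 4.3]       # fresh weight, g (invented)
slope, intercept, r2 = ols_r2(area, weight)
print(r2 > 0.99)  # → True  (near-linear toy data)
```

An R2 in the 0.84 to 0.92 range, as reported, indicates that canopy area explains most but not all of the variation in weight.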
15
Lozano-Claros D, Meng X, Custovic E, Deng G, Berkowitz O, Whelan J, Lewsey MG. Developmental normalization of phenomics data generated by high throughput plant phenotyping systems. Plant Methods 2020; 16:111. PMID: 32817754; PMCID: PMC7424680; DOI: 10.1186/s13007-020-00653-x.
Abstract
BACKGROUND Sowing time is commonly used as the temporal reference for Arabidopsis thaliana (Arabidopsis) experiments in high throughput plant phenotyping (HTPP) systems. This relies on the assumption that germination and seedling establishment are uniform across the population. However, individual seeds have different developmental trajectories even under uniform environmental conditions, which increases the variance in quantitative phenotyping approaches. We developed the Digital Adjustment of Plant Development (DAPD) normalization method, which automatically normalizes time-series HTPP measurements against a reference early developmental stage. The timeline of each measurement series is shifted to a reference time. The normalization is determined by cross-correlation at multiple time points of the time-series measurements, which may include rosette area, leaf size, and leaf number. RESULTS The DAPD method improved the accuracy of phenotyping measurements by decreasing the statistical dispersion of quantitative traits across a time-series. We applied DAPD to evaluate the relative growth rate in Arabidopsis plants and demonstrated that it improves uniformity in measurements, permitting a more informative comparison between individuals. Application of DAPD decreased the variance of phenotyping measurements by up to 2.5 times compared to sowing-time normalization. The DAPD method also identified more outliers than any other central tendency technique applied to the non-normalized dataset. CONCLUSIONS DAPD is an effective method to control for temporal differences in development within plant phenotyping datasets. In principle, it can be applied to HTPP data from any species/trait combination for which a relevant developmental scale can be defined.
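The time-shift idea behind DAPD can be sketched in miniature. This toy version aligns two growth curves by minimizing the mean squared difference over integer lags — a least-squares stand-in for the paper's cross-correlation — on invented data, not the paper's rosette measurements.

```python
import numpy as np

def best_lag(series, reference, max_lag=5):
    """Integer delay of `series` relative to `reference`, chosen by
    minimizing the mean squared difference over the overlapping region."""
    best, best_err = 0, np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = series[lag:], reference[:len(reference) - lag]
        else:
            a, b = series[:len(series) + lag], reference[-lag:]
        err = np.mean((a - b) ** 2)
        if err < best_err:
            best, best_err = lag, err
    return best

# Hypothetical rosette-area curves: the second plant starts 2 steps late.
t = np.arange(12)
ref = np.exp(0.3 * t)                           # reference growth curve
late = np.concatenate([np.zeros(2), ref[:-2]])  # same curve, delayed by 2
lag = best_lag(late, ref)
```

Shifting the late plant's timeline back by the detected lag puts both plants on a common developmental scale before traits are compared.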
Affiliation(s)
- Diego Lozano-Claros
- Department of Animal, Plant and Soil Science, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Department of Engineering, School of Engineering and Mathematical Sciences, La Trobe University, Melbourne, VIC 3086 Australia
- Xiangxiang Meng
- Department of Animal, Plant and Soil Science, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Australian Research Council Centre of Excellence in Plant Energy Biology, La Trobe University, Bundoora, VIC 3086 Australia
- Currently: Key Laboratory of Biofuels, Shandong Provincial Key Laboratory of Energy Genetics, Qingdao Institute of Bioenergy and Bioprocess Technology, Chinese Academy of Sciences, Qingdao, 266101 China
- Eddie Custovic
- Department of Engineering, School of Engineering and Mathematical Sciences, La Trobe University, Melbourne, VIC 3086 Australia
- Guang Deng
- Department of Engineering, School of Engineering and Mathematical Sciences, La Trobe University, Melbourne, VIC 3086 Australia
- Oliver Berkowitz
- Department of Animal, Plant and Soil Science, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Australian Research Council Research Hub for Medicinal Agriculture, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Australian Research Council Centre of Excellence in Plant Energy Biology, La Trobe University, Bundoora, VIC 3086 Australia
- James Whelan
- Department of Animal, Plant and Soil Science, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Australian Research Council Research Hub for Medicinal Agriculture, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Australian Research Council Centre of Excellence in Plant Energy Biology, La Trobe University, Bundoora, VIC 3086 Australia
- Mathew G. Lewsey
- Department of Animal, Plant and Soil Science, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
- Australian Research Council Research Hub for Medicinal Agriculture, AgriBio Building, La Trobe University, Bundoora, VIC 3086 Australia
16
Tausen M, Clausen M, Moeskjær S, Shihavuddin ASM, Dahl AB, Janss L, Andersen SU. Greenotyper: Image-Based Plant Phenotyping Using Distributed Computing and Deep Learning. FRONTIERS IN PLANT SCIENCE 2020; 11:1181. [PMID: 32849731] [PMCID: PMC7427585] [DOI: 10.3389/fpls.2020.01181]
Abstract
Image-based phenotype data with high temporal resolution offers advantages over end-point measurements in plant quantitative genetics experiments, because growth dynamics can be assessed and analysed for genotype-phenotype association. Recently, network-based camera systems have been deployed as customizable, low-cost phenotyping solutions. Here, we implemented a large, automated image-capture system based on distributed computing using 180 networked Raspberry Pi units that could simultaneously monitor 1,800 white clover (Trifolium repens) plants. The camera system proved stable with an average uptime of 96% across all 180 cameras. For analysis of the captured images, we developed the Greenotyper image analysis pipeline. It detected the location of the plants with a bounding box accuracy of 97.98%, and the U-net-based plant segmentation had an intersection over union accuracy of 0.84 and a pixel accuracy of 0.95. We used Greenotyper to analyze a total of 355,027 images, which required 24-36 h. Automated phenotyping using a large number of static cameras and plants thus proved a cost-effective alternative to systems relying on conveyor belts or mobile cameras.
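The segmentation accuracy figures quoted above — intersection over union and pixel accuracy — can be computed for binary masks as follows; the masks here are toy examples, not Greenotyper outputs.

```python
import numpy as np

def iou_and_pixel_acc(pred, truth):
    """Intersection-over-union and pixel accuracy for two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    iou = inter / union if union else 1.0
    acc = (pred == truth).mean()
    return iou, acc

# Toy 4x4 masks: predicted plant pixels vs. ground truth.
truth = np.array([[0, 0, 0, 0],
                  [0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 0, 0, 0]])
pred = np.array([[0, 0, 0, 0],
                 [0, 1, 1, 1],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0]])
iou, acc = iou_and_pixel_acc(pred, truth)  # IoU = 4/5, accuracy = 15/16
```

Note that pixel accuracy is inflated by the large background, which is why segmentation papers report IoU alongside it.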
Affiliation(s)
- Marni Tausen
- Bioinformatics Research Centre, Aarhus University, Aarhus, Denmark
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- Marc Clausen
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- Sara Moeskjær
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
- ASM Shihavuddin
- Image Analysis & Computer Graphics, DTU Compute, Lyngby, Denmark
- EEE Department, Green University of Bangladesh (GUB), Dhaka, Bangladesh
- Luc Janss
- Department of Molecular Biology and Genetics, Aarhus University, Aarhus, Denmark
17
Image segmentation based on ultimate levelings: From attribute filters to machine learning strategies. Pattern Recognit Lett 2020. [DOI: 10.1016/j.patrec.2020.03.013]
18
Dobrescu A, Giuffrida MV, Tsaftaris SA. Doing More With Less: A Multitask Deep Learning Approach in Plant Phenotyping. FRONTIERS IN PLANT SCIENCE 2020; 11:141. [PMID: 32256503] [PMCID: PMC7093010] [DOI: 10.3389/fpls.2020.00141]
Abstract
Image-based plant phenotyping has been steadily growing and this has steeply increased the need for more efficient image analysis techniques capable of evaluating multiple plant traits. Deep learning has shown its potential in a multitude of visual tasks in plant phenotyping, such as segmentation and counting. Here, we show how different phenotyping traits can be extracted simultaneously from plant images, using multitask learning (MTL). MTL leverages information contained in the training images of related tasks to improve overall generalization and learns models with fewer labels. We present a multitask deep learning framework for plant phenotyping, able to infer three traits simultaneously: (i) leaf count, (ii) projected leaf area (PLA), and (iii) genotype classification. We adopted a modified pretrained ResNet50 as a feature extractor, trained end-to-end to predict multiple traits. We also leverage MTL to show that through learning from more easily obtainable annotations (such as PLA and genotype) we can predict a better leaf count (harder to obtain annotation). We evaluate our findings on several publicly available datasets of top-view images of Arabidopsis thaliana. Experimental results show that the proposed MTL method improves the leaf count mean squared error (MSE) by more than 40%, compared to a single task network on the same dataset. We also show that our MTL framework can be trained with up to 75% fewer leaf count annotations without significantly impacting performance, whereas a single task model shows a steady decline when fewer annotations are available. Code available at https://github.com/andobrescu/Multi_task_plant_phenotyping.
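The shared-trunk, multiple-head structure that MTL relies on can be sketched in miniature. The following toy example — plain NumPy, not the paper's modified ResNet50 — trains a shared linear representation with one linear head per task against a summed loss, which is the core mechanism the abstract describes; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 "image features" per sample; two related targets standing in
# for leaf count and projected leaf area (PLA).
X = rng.normal(size=(64, 4))
w_true = rng.normal(size=4)
targets = [X @ w_true + 0.1 * rng.normal(size=64),          # "leaf count"
           2.0 * (X @ w_true) + 0.1 * rng.normal(size=64)]  # "PLA"

W_shared = 0.1 * rng.normal(size=(4, 3))             # shared feature extractor
heads = [0.1 * rng.normal(size=3) for _ in targets]  # one linear head per task

def joint_loss():
    h = X @ W_shared
    return sum(np.mean((h @ w - y) ** 2) for w, y in zip(heads, targets))

loss_before = joint_loss()
lr, n = 0.01, len(X)
for _ in range(300):
    h = X @ W_shared                                  # shared features
    grad_shared = np.zeros_like(W_shared)
    new_heads = []
    for w, y in zip(heads, targets):
        err = h @ w - y                               # per-task residual
        new_heads.append(w - lr * 2 * h.T @ err / n)  # head gradient step
        grad_shared += 2 * np.outer(X.T @ err, w) / n
    heads = new_heads
    W_shared -= lr * grad_shared  # the shared trunk sees every task's gradient
loss_after = joint_loss()
```

Because the shared trunk accumulates gradients from all tasks, supervision on the cheaper labels shapes the representation used by the harder task — the effect the paper exploits for leaf counting.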
19
Costa JM, Marques da Silva J, Pinheiro C, Barón M, Mylona P, Centritto M, Haworth M, Loreto F, Uzilday B, Turkan I, Oliveira MM. Opportunities and Limitations of Crop Phenotyping in Southern European Countries. FRONTIERS IN PLANT SCIENCE 2019; 10:1125. [PMID: 31608085] [PMCID: PMC6774291] [DOI: 10.3389/fpls.2019.01125]
Abstract
The Mediterranean climate is characterized by hot dry summers and frequent droughts. Mediterranean crops are frequently subjected to high evapotranspiration demands, soil water deficits, high temperatures, and photo-oxidative stress. These conditions will become more severe due to global warming which poses major challenges to the sustainability of the agricultural sector in Mediterranean countries. Selection of crop varieties adapted to future climatic conditions and more tolerant to extreme climatic events is urgently required. Plant phenotyping is a crucial approach to address these challenges. High-throughput plant phenotyping (HTPP) helps to monitor the performance of improved genotypes and is one of the most effective strategies to improve the sustainability of agricultural production. In spite of the remarkable progress in basic knowledge and technology of plant phenotyping, there are still several practical, financial, and political constraints to implement HTPP approaches in field and controlled conditions across the Mediterranean. The European panorama of phenotyping is heterogeneous and integration of phenotyping data across different scales and translation of "phytotron research" to the field, and from model species to crops, remain major challenges. Moreover, solutions specifically tailored to Mediterranean agriculture (e.g., crops and environmental stresses) are in high demand, as the region is vulnerable to climate change and to desertification processes. The specific phenotyping requirements of Mediterranean crops have not yet been fully identified. The high cost of HTPP infrastructures is a major limiting factor, though the limited availability of skilled personnel may also impair its implementation in Mediterranean countries. We propose that the lack of suitable phenotyping infrastructures is hindering the development of new Mediterranean agricultural varieties and will negatively affect future competitiveness of the agricultural sector. 
We provide an overview of the heterogeneous panorama of phenotyping within Mediterranean countries, describing the state of the art of agricultural production, breeding initiatives, and phenotyping capabilities in five countries: Italy, Greece, Portugal, Spain, and Turkey. We characterize some of the main impediments for development of plant phenotyping in those countries and identify strategies to overcome barriers and maximize the benefits of phenotyping and modeling approaches to Mediterranean agriculture and related sustainability.
Affiliation(s)
- Jorge Marques da Silva
- Biosystems and Integrative Sciences Institute (BioISI), Faculty of Sciences, Universidade de Lisboa, Lisbon, Portugal
- Carla Pinheiro
- FCT NOVA, Universidade Nova de Lisboa, Monte da Caparica, Portugal
- ITQB NOVA, Universidade Nova de Lisboa, Oeiras, Portugal
- Matilde Barón
- Estación Experimental del Zaidín, Consejo Superior de Investigaciones Científicas (CSIC), Granada, Spain
- Photini Mylona
- HAO-DEMETER, Institute of Plant Breeding and Genetic Resources, Thermi, Greece
- Mauro Centritto
- Institute for Sustainable Plant Protection, Italian National Research Council (IPSP-CNR), Sesto Fiorentino, Italy
- Francesco Loreto
- Department of Biology, Agriculture and Food Sciences, CNR, Rome, Italy
- Baris Uzilday
- Department of Biology, Faculty of Science, Ege University, İzmir, Turkey
- Ismail Turkan
- Department of Biology, Faculty of Science, Ege University, İzmir, Turkey
20
Paulus S. Measuring crops in 3D: using geometry for plant phenotyping. PLANT METHODS 2019; 15:103. [PMID: 31497064] [PMCID: PMC6719375] [DOI: 10.1186/s13007-019-0490-0]
Abstract
The use of 3D sensing for plant phenotyping has grown in recent years. This review provides an overview of 3D traits for the demands of plant phenotyping, considering different measuring techniques, derived traits and use-cases of biological applications. A comparison between a high-resolution 3D measuring device and an established measuring tool, the leaf meter, is shown to categorize the achievable measurement accuracy. Furthermore, different measuring techniques such as laser triangulation, structure from motion, time-of-flight, terrestrial laser scanning or structured light approaches enable the assessment of plant traits such as leaf width and length, plant size, volume and development at the plant and organ level. The introduced traits are presented with respect to the measured plant types, the measuring technique used and the link to their biological use case. These include trait and growth analysis for measurements over time, as well as more complex investigations of water budget, drought responses and QTL (quantitative trait loci) analysis. The processing pipelines used are generalized into a 3D point cloud processing workflow showing the individual steps needed to derive plant parameters at the plant level, at the organ level using machine learning, or over time using time-series measurements. Finally, the next step in plant sensing, the fusion of different sensor types, namely 3D and spectral measurements, is introduced with an example on sugar beet. This multi-dimensional plant model is the key to modelling the influence of geometry on radiometric measurements and to correcting for it. This publication depicts the state of the art for 3D measurement of plant traits as used in plant phenotyping, regarding how the data are acquired, how the data are processed and what kinds of traits are measured at the single-plant, miniplot, experimental-field and open-field scales.
Future research will focus on highly resolved point clouds on the experimental and field scale as well as on the automated trait extraction of organ traits to track organ development at these scales.
Affiliation(s)
- Stefan Paulus
- Institute of Sugar Beet Research, Holtenser Landstr. 77, 37079 Göttingen, Germany
21
Yu JG, Li Y, Gao C, Gao H, Xia GS, Yu ZL, Li Y. Exemplar-Based Recursive Instance Segmentation With Application to Plant Image Analysis. IEEE TRANSACTIONS ON IMAGE PROCESSING 2019; 29:389-404. [PMID: 31329554] [DOI: 10.1109/tip.2019.2923571]
Abstract
Instance segmentation is a challenging computer vision problem which lies at the intersection of object detection and semantic segmentation. Motivated by plant image analysis in the context of plant phenotyping, a recently emerging application field of computer vision, this paper presents the Exemplar-Based Recursive Instance Segmentation (ERIS) framework. A three-layer probabilistic model is first introduced to jointly represent hypotheses, voting elements, instance labels and their connections. A recursive optimization algorithm is then developed to infer the maximum a posteriori (MAP) solution, which handles one instance at a time by alternating among the three steps of detection, segmentation and update. The proposed ERIS framework departs from previous works mainly in two respects. First, it is exemplar-based and model-free, achieving instance-level segmentation of a specific object class given only a handful of (typically fewer than 10) annotated exemplars. This merit enables its use in cases where no massive manually-labeled data are available for training strong classification models, as required by most existing methods. Second, instead of attempting to infer the solution in a single shot, which suffers from extremely high computational complexity, our recursive optimization strategy allows for reasonably efficient MAP inference in the full hypothesis space. The ERIS framework is instantiated for the specific application of plant leaf segmentation in this work. Experiments are conducted on public benchmarks to demonstrate the superiority of our method in both effectiveness and efficiency in comparison with the state of the art.
22
Li B, Xu X, Han J, Zhang L, Bian C, Jin L, Liu J. The estimation of crop emergence in potatoes by UAV RGB imagery. PLANT METHODS 2019; 15:15. [PMID: 30792752] [PMCID: PMC6371461] [DOI: 10.1186/s13007-019-0399-7]
Abstract
BACKGROUND Crop emergence and canopy cover are important physiological traits for potato (Solanum tuberosum L.) cultivar evaluation and nutrient management. They play important roles in variety screening, field management and yield prediction. Traditional manual assessment of these traits is not only laborious but often subjective. RESULTS In this study, semi-automated image analysis software was developed to estimate crop emergence from high-resolution RGB ortho-images captured from an unmanned aerial vehicle (UAV). Potato plant objects were separated from bare soil using the Excess Green Index and Otsu thresholding methods. Six morphological features were calculated from the images as variables for a Random Forest classifier estimating the number of potato plants at the emergence stage. The outputs were then used to estimate crop emergence in three field experiments that were designed to investigate the effects of cultivars, levels of potassium (K) fertiliser input, and new compound fertilisers on potato growth. The results indicated that RGB UAV image analysis can accurately estimate potato crop emergence rate in comparison to manual assessment, with a correlation coefficient (r²) of 0.96, and provides an efficient tool to evaluate emergence uniformity. CONCLUSIONS The proposed UAV image analysis method is a promising high throughput phenotyping tool for assessing potato crop development at the emergence stage. It can also facilitate future studies on optimizing fertiliser management and improving emergence consistency.
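The plant/soil separation step — Excess Green Index followed by Otsu thresholding — can be sketched as below. The channel normalization and the toy image are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def excess_green(rgb):
    """Excess Green Index ExG = 2g - r - b on channel-normalized RGB."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0                  # avoid division by zero
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

def otsu_threshold(values, bins=256):
    """Threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values.ravel(), bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                        # class-0 probability up to each bin
    mu = np.cumsum(p * centers)              # cumulative mean
    with np.errstate(invalid="ignore", divide="ignore"):
        between = (mu[-1] * w0 - mu) ** 2 / (w0 * (1 - w0))
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

# Toy image: a green "plant" block on brown "soil".
img = np.zeros((8, 8, 3), dtype=np.uint8)
img[...] = (120, 80, 40)        # soil
img[2:6, 2:6] = (60, 160, 50)   # plant
exg = excess_green(img)
mask = exg > otsu_threshold(exg)
```

The resulting binary mask is the kind of input from which per-plant morphological features can then be computed.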
Affiliation(s)
- Bo Li
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
- NIAB EMR, New Road, East Malling, Kent, ME19 4BD UK
- Xiangming Xu
- NIAB EMR, New Road, East Malling, Kent, ME19 4BD UK
- Jiwan Han
- Institute of Biological, Environmental and Rural Sciences (IBERS), Aberystwyth University, Penglais, Aberystwyth, Ceredigion, SY23 3FL UK
- Li Zhang
- NIAB EMR, New Road, East Malling, Kent, ME19 4BD UK
- Chunsong Bian
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
- Liping Jin
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
- Jiangang Liu
- Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences (CAAS)/Key Laboratory of Biology and Genetic Improvement of Tuber and Root Crops, Ministry of Agriculture, Beijing, China
23
Bolger AM, Poorter H, Dumschott K, Bolger ME, Arend D, Osorio S, Gundlach H, Mayer KFX, Lange M, Scholz U, Usadel B. Computational aspects underlying genome to phenome analysis in plants. THE PLANT JOURNAL 2019; 97:182-198. [PMID: 30500991] [PMCID: PMC6849790] [DOI: 10.1111/tpj.14179]
Abstract
Recent advances in genomics technologies have greatly accelerated progress in both fundamental plant science and applied breeding research. Concurrently, high-throughput plant phenotyping is becoming widely adopted in the plant community, promising to alleviate the phenotypic bottleneck. While these technological breakthroughs are significantly accelerating quantitative trait locus (QTL) and causal gene identification, challenges remain to enable even more sophisticated analyses. In particular, care needs to be taken to standardize, describe and conduct experiments robustly while relying on plant physiology expertise. In this article, we review the state of the art regarding genome assembly and the future potential of pangenomics in plant research. We also describe the necessity of standardizing and describing phenotypic studies using the Minimum Information About a Plant Phenotyping Experiment (MIAPPE) standard to enable the reuse and integration of phenotypic data. In addition, we show how deep phenotypic data might yield novel trait-trait correlations and review how to link phenotypic data to genomic data. Finally, we provide perspectives on the promising future of machine learning and its potential in linking phenotypes to genomic features.
Affiliation(s)
- Anthony M. Bolger
- Institute for Biology I, BioSC, RWTH Aachen University, Worringer Weg 3, 52074 Aachen, Germany
- Hendrik Poorter
- Forschungszentrum Jülich (FZJ), Institute of Bio- and Geosciences (IBG-2), Plant Sciences, Wilhelm-Johnen-Straße, 52428 Jülich, Germany
- Department of Biological Sciences, Macquarie University, North Ryde, NSW 2109, Australia
- Kathryn Dumschott
- Institute for Biology I, BioSC, RWTH Aachen University, Worringer Weg 3, 52074 Aachen, Germany
- Marie E. Bolger
- Forschungszentrum Jülich (FZJ), Institute of Bio- and Geosciences (IBG-2), Plant Sciences, Wilhelm-Johnen-Straße, 52428 Jülich, Germany
- Daniel Arend
- Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) Gatersleben, Corrensstraße 3, 06466 Seeland, Germany
- Sonia Osorio
- Department of Molecular Biology and Biochemistry, Instituto de Hortofruticultura Subtropical y Mediterránea "La Mayora", Universidad de Málaga-Consejo Superior de Investigaciones Científicas, Campus de Teatinos, 29071 Málaga, Spain
- Heidrun Gundlach
- Plant Genome and Systems Biology (PGSB), Helmholtz Zentrum München (HMGU), Ingolstädter Landstraße 1, 85764 Neuherberg, Germany
- Klaus F. X. Mayer
- Plant Genome and Systems Biology (PGSB), Helmholtz Zentrum München (HMGU), Ingolstädter Landstraße 1, 85764 Neuherberg, Germany
- Matthias Lange
- Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) Gatersleben, Corrensstraße 3, 06466 Seeland, Germany
- Uwe Scholz
- Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) Gatersleben, Corrensstraße 3, 06466 Seeland, Germany
- Björn Usadel
- Institute for Biology I, BioSC, RWTH Aachen University, Worringer Weg 3, 52074 Aachen, Germany
- Forschungszentrum Jülich (FZJ), Institute of Bio- and Geosciences (IBG-2), Plant Sciences, Wilhelm-Johnen-Straße, 52428 Jülich, Germany
24
Itakura K, Hosoi F. Automatic Leaf Segmentation for Estimating Leaf Area and Leaf Inclination Angle in 3D Plant Images. SENSORS (BASEL, SWITZERLAND) 2018; 18:E3576. [PMID: 30360406] [PMCID: PMC6210333] [DOI: 10.3390/s18103576]
Abstract
Automatic and efficient plant monitoring enables accurate plant management. Constructing three-dimensional (3D) models of plants and acquiring their spatial information is an effective way to obtain plant structural parameters. Here, 3D images of leaves, constructed from multiple scenes taken from different positions, were segmented automatically for the automatic retrieval of leaf areas and inclination angles. First, for the initial segmentation, leaf images were viewed from the top, and leaves in the top-view images were segmented using the distance transform and the watershed algorithm. Next, the leaf images after the initial segmentation were reduced by 90%, and seed regions for each leaf were produced. The seed regions were re-projected onto the 3D images, and each leaf was segmented by expanding its seed region with the 3D information. After leaf segmentation, the area of each leaf and its inclination angle were estimated accurately via a voxel-based calculation. This method for automatic plant structure analysis allows accurate and efficient plant breeding and growth management.
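The inclination-angle estimate at the end of this pipeline can be illustrated with a plane fit: the angle between a leaf's best-fit plane normal and the vertical axis. This SVD-based sketch on synthetic points is only an illustration, not the paper's voxel-based calculation.

```python
import numpy as np

def inclination_deg(points):
    """Leaf inclination angle: angle between the best-fit plane's normal
    (via SVD of the centered points) and the vertical z-axis, in degrees."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]                  # direction of least variance = plane normal
    cos_t = abs(normal[2]) / np.linalg.norm(normal)
    return np.degrees(np.arccos(cos_t))

# Toy "leaf": a grid of points lying on a plane tilted 30° from horizontal.
u, v = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
tilt = np.radians(30)
pts = np.column_stack([
    u.ravel() * np.cos(tilt),   # x
    v.ravel(),                  # y
    u.ravel() * np.sin(tilt),   # z rises with x, giving the 30° tilt
])
angle = inclination_deg(pts)
```

Taking the absolute value of the normal's z-component makes the result independent of the normal's arbitrary sign.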
Affiliation(s)
- Kenta Itakura
- Graduate School, University of Tokyo, Tokyo 113-8657, Japan.
- Fumiki Hosoi
- Graduate School, University of Tokyo, Tokyo 113-8657, Japan.
25
Wang Y, Xu L. Unsupervised segmentation of greenhouse plant images based on modified Latent Dirichlet Allocation. PeerJ 2018; 6:e5036. [PMID: 29967727] [PMCID: PMC6026534] [DOI: 10.7717/peerj.5036]
Abstract
Agricultural greenhouse plant images with complicated scenes are difficult to label manually with precision. The appearance of leaf disease spots and mosses further increases the difficulty of plant segmentation. Considering these problems, this paper proposes a statistical image segmentation algorithm, MSBS-LDA (Mean-shift Bandwidths Searching Latent Dirichlet Allocation), which can perform unsupervised segmentation of greenhouse plants. The main idea of the algorithm is to take advantage of the language model LDA (Latent Dirichlet Allocation) to deal with image segmentation based on the design of spatial documents. The maximum points of the probability density function in image space are mapped as documents, and Mean-shift is utilized to fulfill the word-document assignment. The proportion of the first major word in the word frequency statistics determines the coordinate space bandwidth, and the spatial LDA segmentation procedure iteratively searches for the optimal color space bandwidth in light of the LUV distances between classes. In view of the fruits in the plant segmentation result and the ever-changing illumination conditions in greenhouses, an improved leaf segmentation method based on watershed is proposed to further segment the leaves. Experimental results show that the proposed methods can segment greenhouse plants and leaves in an unsupervised way, obtaining high segmentation accuracy together with effective extraction of the fruit part.
Affiliation(s)
- Yi Wang
- College of Electronics and Information Engineering, Tongji University, Shanghai, China
- Lihong Xu
- College of Electronics and Information Engineering, Tongji University, Shanghai, China
26
Das Choudhury S, Bashyam S, Qiu Y, Samal A, Awada T. Holistic and component plant phenotyping using temporal image sequence. PLANT METHODS 2018; 14:35. [PMID: 29760766] [PMCID: PMC5944015] [DOI: 10.1186/s13007-018-0303-x]
Abstract
BACKGROUND Image-based plant phenotyping facilitates the noninvasive extraction of traits by analyzing a large number of plants in a relatively short period of time. It has the potential to compute advanced phenotypes by considering the whole plant as a single object (holistic phenotypes) or as individual components, i.e., leaves and the stem (component phenotypes), to investigate the biophysical characteristics of the plants. The emergence timing, the total number of leaves present at any point in time and the growth of individual leaves during the vegetative stage of the maize life cycle are significant phenotypic expressions that contribute to assessing plant vigor. However, an automated image-based solution to this novel problem is yet to be explored. RESULTS A set of new holistic and component phenotypes is introduced in this paper. To compute the component phenotypes, it is essential to detect the individual leaves and the stem. Thus, the paper introduces a novel method to reliably detect the leaves and the stem of maize plants by analyzing 2-dimensional visible light image sequences captured from the side, using a graph-based approach. The total number of leaves is counted and the length of each leaf is measured for all images in the sequence to monitor leaf growth. To evaluate the performance of the proposed algorithm, we introduce the University of Nebraska-Lincoln Component Plant Phenotyping Dataset (UNL-CPPD) and provide ground truth to facilitate new algorithm development and uniform comparison. The temporal variation of the component phenotypes regulated by genotype and environment (i.e., greenhouse) is experimentally demonstrated for the maize plants on UNL-CPPD. Statistical models are applied to analyze the greenhouse environment impact and demonstrate the genetic regulation of the temporal variation of the holistic phenotypes on the public dataset called Panicoid Phenomap-1.
CONCLUSION The central contribution of the paper is a novel computer vision based algorithm for automated detection of individual leaves and the stem to compute new component phenotypes along with a public release of a benchmark dataset, i.e., UNL-CPPD. Detailed experimental analyses are performed to demonstrate the temporal variation of the holistic and component phenotypes in maize regulated by environment and genetic variation with a discussion on their significance in the context of plant science.
Affiliation(s)
- Sruti Das Choudhury
- School of Natural Resources, University of Nebraska-Lincoln, Lincoln, NE USA
- Department of Computer Science and Engineering, University of Nebraska-Lincoln, Lincoln, NE USA
- Srinidhi Bashyam
- Department of Computer Science and Engineering, University of Nebraska-Lincoln, Lincoln, NE USA
- Yumou Qiu
- Department of Statistics, University of Nebraska-Lincoln, Lincoln, NE USA
- Ashok Samal
- Department of Computer Science and Engineering, University of Nebraska-Lincoln, Lincoln, NE USA
- Tala Awada
- School of Natural Resources, University of Nebraska-Lincoln, Lincoln, NE USA
Collapse
|
27
|
Abstract
The complicated scenes in agricultural greenhouse plant images make precise manual labeling very difficult, which in turn makes it hard to obtain an accurate training set for a conditional random field (CRF). To address this problem, this paper proposes ULCRF (Unsupervised Learning Conditional Random Field), an unsupervised CRF image segmentation algorithm that can perform fast unsupervised segmentation of greenhouse plant images and further segment the plant organs in the image, i.e., fruits, leaves, and stems. The main idea of the algorithm is to calculate the unary potential, namely the initial label of the Dense CRF, with the unsupervised learning model LDA (Latent Dirichlet Allocation). In view of the ever-changing image features at different stages of fruit growth, a multi-resolution ULCRF is proposed to improve segmentation accuracy in the middle and late stages of fruit growth. An image is down-sampled twice to obtain three layers of images at different resolutions, and the features of the layers are interrelated. Experimental results show that the proposed method segments greenhouse plant images automatically in an unsupervised manner and achieves high segmentation accuracy together with high extraction precision for the fruit part.
28
Ubbens J, Cieslak M, Prusinkiewicz P, Stavness I. The use of plant models in deep learning: an application to leaf counting in rosette plants. PLANT METHODS 2018; 14:6. [PMID: 29375647 PMCID: PMC5773030 DOI: 10.1186/s13007-018-0273-z] [Citation(s) in RCA: 92] [Impact Index Per Article: 15.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/04/2017] [Accepted: 01/09/2018] [Indexed: 05/21/2023]
Abstract
Deep learning presents many opportunities for image-based plant phenotyping. Here we consider the capability of deep convolutional neural networks to perform the leaf counting task. Deep learning techniques typically require large and diverse datasets to learn generalizable models without providing a priori an engineered algorithm for performing the task. This requirement is challenging, however, for applications in the plant phenotyping field, where available datasets are often small and the costs associated with generating new data are high. In this work we propose a new method for augmenting plant phenotyping datasets using rendered images of synthetic plants. We demonstrate that the use of high-quality 3D synthetic plants to augment a dataset can improve performance on the leaf counting task. We also show that the ability of the model to generate an arbitrary distribution of phenotypes mitigates the problem of dataset shift when training and testing on different datasets. Finally, we show that real and synthetic plants are significantly interchangeable when training a neural network on the leaf counting task.
Affiliation(s)
- Jordan Ubbens
- University of Saskatchewan, 105 Administration Place, Saskatoon, S7N 5C5, Canada
- Mikolaj Cieslak
- University of Calgary, 2500 University Dr NW, Calgary, T2N 1N4, Canada
- Ian Stavness
- University of Saskatchewan, 105 Administration Place, Saskatoon, S7N 5C5, Canada

29
Gehan MA, Fahlgren N, Abbasi A, Berry JC, Callen ST, Chavez L, Doust AN, Feldman MJ, Gilbert KB, Hodge JG, Hoyer JS, Lin A, Liu S, Lizárraga C, Lorence A, Miller M, Platon E, Tessman M, Sax T. PlantCV v2: Image analysis software for high-throughput plant phenotyping. PeerJ 2017; 5:e4088. [PMID: 29209576 PMCID: PMC5713628 DOI: 10.7717/peerj.4088] [Citation(s) in RCA: 121] [Impact Index Per Article: 17.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2017] [Accepted: 11/03/2017] [Indexed: 12/11/2022] Open
Abstract
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.
Affiliation(s)
- Malia A. Gehan
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Noah Fahlgren
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Arash Abbasi
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Jeffrey C. Berry
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Steven T. Callen
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Current affiliation: Monsanto Company, St. Louis, MO, United States of America
- Leonardo Chavez
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Andrew N. Doust
- Department of Plant Biology, Ecology, and Evolution, Oklahoma State University, Stillwater, OK, United States of America
- Max J. Feldman
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Kerrigan B. Gilbert
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- John G. Hodge
- Department of Plant Biology, Ecology, and Evolution, Oklahoma State University, Stillwater, OK, United States of America
- J. Steen Hoyer
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Computational and Systems Biology Program, Washington University in St. Louis, St. Louis, MO, United States of America
- Andy Lin
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Current affiliation: Unidev, St. Louis, MO, United States of America
- Suxing Liu
- Arkansas Biosciences Institute, Arkansas State University, Jonesboro, AR, United States of America
- Current affiliation: Department of Plant Biology, University of Georgia, Athens, GA, United States of America
- César Lizárraga
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Current affiliation: CiBO Technologies, Cambridge, MA, United States of America
- Argelia Lorence
- Arkansas Biosciences Institute, Department of Chemistry and Physics, Arkansas State University, Jonesboro, AR, United States of America
- Michael Miller
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Current affiliation: Department of Agronomy and Horticulture, Center for Plant Science Innovation, Beadle Center for Biotechnology, University of Nebraska - Lincoln, Lincoln, NE, United States of America
- Monica Tessman
- Donald Danforth Plant Science Center, St. Louis, MO, United States of America
- Department of Plant Biology, Ecology, and Evolution, Oklahoma State University, Stillwater, OK, United States of America
- Tony Sax
- Missouri University of Science and Technology, Rolla, MO, United States of America

30
Leaf Disease Segmentation From Agricultural Images via Hybridization of Active Contour Model and OFA. JOURNAL OF INTELLIGENT SYSTEMS 2017. [DOI: 10.1515/jisys-2017-0415] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
In this paper, an alternative active contour model (ACM) driven by an oppositional fruit fly algorithm (OFA) is presented. Unlike the traditional ACM variants, which are frequently caught in a local minimum, this methodology helps the control points converge toward the global minimum of the energy function. In the proposed system, energy minimization is performed with a fruit fly algorithm, and every control point is confined to a local search window; a rectangular search window is considered. The results demonstrated that the fruit fly strategy using polar coordinates is, for the most part, preferable to the fruit fly search performed over rectangular shapes. Three performance metrics, the Jaccard index, the Dice index, and the Hausdorff distance, were used to validate the proposed strategy on real agricultural and synthetic images. The results make clear that the proposed OFA technique is a strong option for agricultural plant image segmentation, whatever the kind of disease affecting the plant leaves.
31
Valle B, Simonneau T, Boulord R, Sourd F, Frisson T, Ryckewaert M, Hamard P, Brichet N, Dauzat M, Christophe A. PYM: a new, affordable, image-based method using a Raspberry Pi to phenotype plant leaf area in a wide diversity of environments. PLANT METHODS 2017; 13:98. [PMID: 29151844 PMCID: PMC5678554 DOI: 10.1186/s13007-017-0248-5] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/13/2017] [Accepted: 10/26/2017] [Indexed: 05/24/2023]
Abstract
BACKGROUND Plant science uses increasing amounts of phenotypic data to unravel the complex interactions between biological systems and their variable environments. Originally, phenotyping approaches were limited by manual, often destructive operations, causing large errors. Plant imaging emerged as a viable alternative allowing non-invasive and automated data acquisition. Several procedures based on image analysis were developed to monitor leaf growth as a major phenotyping target. However, in most proposals, a time-consuming parameterization of the analysis pipeline is required to handle variable conditions between images, particularly in the field due to unstable light and interference from the soil surface or weeds. To cope with these difficulties, we developed a low-cost, 2D imaging method, hereafter called PYM. The method is based on the ability of plant leaves to absorb blue light while reflecting infrared wavelengths. PYM consists of a Raspberry Pi computer equipped with an infrared camera and a blue filter, and is associated with scripts that compute the projected leaf area. This new method was tested on diverse species placed in contrasting conditions. Application to field conditions was evaluated on lettuces grown under photovoltaic panels. The objective was to look for possible acclimation of leaf expansion under photovoltaic panels in order to optimise the use of solar radiation per unit soil area. RESULTS The new PYM device proved to be efficient and accurate for screening the leaf area of various species in a wide range of environments. In the most challenging conditions we tested, error on plant leaf area was reduced to 5% using PYM, compared to 100% with a recently published method. A high-throughput phenotyping cart, holding 6 chained PYM devices, was designed to capture up to 2,000 pictures of field-grown lettuce plants in less than 2 h. Automated analysis of image stacks of individual plants over their growth cycles revealed unexpected differences in leaf expansion rate between lettuce rows depending on their position below or between the photovoltaic panels. CONCLUSIONS The imaging device described here has several benefits, such as affordability, reliability, and flexibility for online analysis and storage. It should be easily appropriated and customized to meet the needs of various users.
Affiliation(s)
- Benoît Valle
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France
- Sun’R SAS, 7 rue de Clichy, 75009 Paris, France
- Thierry Simonneau
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France
- Romain Boulord
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France
- Philippe Hamard
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France
- Nicolas Brichet
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France
- Myriam Dauzat
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France
- Angélique Christophe
- UMR759 Laboratoire d’Ecophysiologie des Plantes sous Stress Environnementaux (LEPSE), INRA, Montpellier SupAgro, 2 Place Pierre Viala, 34060 Montpellier Cedex 2, France

32
Perez-Sanz F, Navarro PJ, Egea-Cortines M. Plant phenomics: an overview of image acquisition technologies and image data analysis algorithms. Gigascience 2017; 6:1-18. [PMID: 29048559 PMCID: PMC5737281 DOI: 10.1093/gigascience/gix092] [Citation(s) in RCA: 64] [Impact Index Per Article: 9.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/27/2017] [Revised: 06/20/2017] [Accepted: 09/17/2017] [Indexed: 11/25/2022] Open
Abstract
The study of phenomes, or phenomics, has been a central part of biology. The field of automatic, image-based phenotype acquisition technologies has seen important advances in recent years. As with other high-throughput technologies, it faces a common set of problems, including data acquisition and analysis. In this review, we give an overview of the main systems developed to acquire images, and an in-depth analysis of image processing, its major issues, and the algorithms that are in use or emerging as useful for extracting data from images automatically.
Affiliation(s)
- Fernando Perez-Sanz
- Genetics, ETSIA, Instituto de Biotecnología Vegetal, Universidad Politécnica de Cartagena, 30202 Cartagena, Spain
- Pedro J Navarro
- Genetics, Instituto de Biotecnología Vegetal, Universidad Politécnica de Cartagena, Campus Muralla del Mar, s/n, Cartagena 30202, Spain
- Marcos Egea-Cortines
- Genetics, ETSIA, Instituto de Biotecnología Vegetal, Universidad Politécnica de Cartagena, 30202 Cartagena, Spain

33
Atkinson JA, Lobet G, Noll M, Meyer PE, Griffiths M, Wells DM. Combining semi-automated image analysis techniques with machine learning algorithms to accelerate large-scale genetic studies. Gigascience 2017; 6:1-7. [PMID: 29020748 PMCID: PMC5632292 DOI: 10.1093/gigascience/gix084] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2017] [Revised: 08/09/2017] [Accepted: 08/16/2017] [Indexed: 12/22/2022] Open
Abstract
Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time-consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping.
Affiliation(s)
- Jonathan A. Atkinson
- Centre for Plant Integrative Biology, School of Biosciences, University of Nottingham, Sutton Bonington, LE12 5RD, United Kingdom
- Guillaume Lobet
- Agrosphere, IBG3, Forschungszentrum Jülich, Jülich 52425, Germany
- Earth and Life Institute, Université Catholique de Louvain, B-1348 Louvain-la-Neuve, Belgium
- Manuel Noll
- InBios, Université de Liège, 4000 Liège, Belgium
- Marcus Griffiths
- Centre for Plant Integrative Biology, School of Biosciences, University of Nottingham, Sutton Bonington, LE12 5RD, United Kingdom
- Darren M. Wells
- Centre for Plant Integrative Biology, School of Biosciences, University of Nottingham, Sutton Bonington, LE12 5RD, United Kingdom

34
Minervini M, Giuffrida MV, Perata P, Tsaftaris SA. Phenotiki: an open software and hardware platform for affordable and easy image-based phenotyping of rosette-shaped plants. THE PLANT JOURNAL : FOR CELL AND MOLECULAR BIOLOGY 2017; 90:204-216. [PMID: 28066963 DOI: 10.1111/tpj.13472] [Citation(s) in RCA: 32] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/25/2016] [Revised: 11/21/2016] [Accepted: 12/22/2016] [Indexed: 05/21/2023]
Abstract
Phenotyping is important to understand plant biology, but current solutions are costly, not versatile, or difficult to deploy. To solve this problem, we present Phenotiki, an affordable system for plant phenotyping that, relying on off-the-shelf parts, provides an easy-to-install and easy-to-maintain platform, offering an out-of-the-box experience for a well-established phenotyping need: imaging rosette-shaped plants. The accompanying software (with available source code) processes data originating from our device seamlessly and automatically. Our software relies on machine learning to devise robust algorithms, and includes an automated leaf count obtained from 2D images without the need for depth (3D). Our affordable device (~€200) can be deployed in growth chambers or greenhouses to acquire optical 2D images of up to approximately 60 adult Arabidopsis rosettes concurrently. Data from the device are processed remotely on a workstation or via a cloud application (based on CyVerse). In this paper, we present a proof-of-concept validation experiment on top-view images of 24 Arabidopsis plants in a combination of genotypes that has not been compared previously. Phenotypic analysis with respect to morphology, growth, color, and leaf count has not been performed comprehensively before now. We confirm the findings of others on some of the extracted traits, showing that we can phenotype at reduced cost. We also perform extensive validations with external measurements and with higher fidelity equipment, and find no loss in statistical accuracy when we use the affordable setting that we propose. Device set-up instructions and analysis software are publicly available ( http://phenotiki.com).
Affiliation(s)
- Massimo Minervini
- IMT School for Advanced Studies, Piazza S. Francesco 19, 55100, Lucca, Italy
- Mario V Giuffrida
- IMT School for Advanced Studies, Piazza S. Francesco 19, 55100, Lucca, Italy
- Institute for Digital Communications, School of Engineering, University of Edinburgh, Thomas Bayes Road, EH9 3FG, Edinburgh, UK
- The Alan Turing Institute, 96 Euston Road, NW1 2DB, London, UK
- Pierdomenico Perata
- PlantLab, Institute of Life Sciences, Scuola Superiore Sant'Anna, Via Mariscoglio 34, 56124, Pisa, Italy
- Sotirios A Tsaftaris
- Institute for Digital Communications, School of Engineering, University of Edinburgh, Thomas Bayes Road, EH9 3FG, Edinburgh, UK

35
Ubbens JR, Stavness I. Deep Plant Phenomics: A Deep Learning Platform for Complex Plant Phenotyping Tasks. FRONTIERS IN PLANT SCIENCE 2017; 8:1190. [PMID: 28736569 PMCID: PMC5500639 DOI: 10.3389/fpls.2017.01190] [Citation(s) in RCA: 130] [Impact Index Per Article: 18.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/03/2017] [Accepted: 06/22/2017] [Indexed: 05/19/2023]
Abstract
Plant phenomics has received increasing interest in recent years in an attempt to bridge the genotype-to-phenotype knowledge gap. There is a need for expanded high-throughput phenotyping capabilities to keep up with an increasing amount of data from high-dimensional imaging sensors and the desire to measure more complex phenotypic traits (Knecht et al., 2016). In this paper, we introduce an open-source deep learning tool called Deep Plant Phenomics. This tool provides pre-trained neural networks for several common plant phenotyping tasks, as well as an easy-to-use platform that plant scientists can use to train models for their own phenotyping applications. We report performance results on three plant phenotyping benchmarks from the literature, including state-of-the-art performance on leaf counting, as well as the first published results for the mutant classification and age regression tasks for Arabidopsis thaliana.
36
Bhugra S, Agarwal N, Yadav S, Banerjee S, Chaudhury S, Lall B. Extraction of Phenotypic Traits for Drought Stress Study Using Hyperspectral Images. LECTURE NOTES IN COMPUTER SCIENCE 2017:608-614. [DOI: 10.1007/978-3-319-69900-4_77] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 07/19/2023]
37
38
Roussel J, Geiger F, Fischbach A, Jahnke S, Scharr H. 3D Surface Reconstruction of Plant Seeds by Volume Carving: Performance and Accuracies. FRONTIERS IN PLANT SCIENCE 2016; 7:745. [PMID: 27375628 PMCID: PMC4895124 DOI: 10.3389/fpls.2016.00745] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/18/2016] [Accepted: 05/17/2016] [Indexed: 05/18/2023]
Abstract
We describe a method for 3D reconstruction of plant seed surfaces, focusing on small seeds with diameters as small as 200 μm. The method considers robotized systems allowing single-seed handling in order to rotate a single seed in front of a camera. Even though such systems feature high position repeatability, at sub-millimeter object scales camera pose variations have to be compensated. We do this by robustly estimating the tool center point from each acquired image. 3D reconstruction can then be performed by a simple shape-from-silhouette approach. In experiments we investigate runtimes, theoretically achievable accuracy, and experimentally achieved accuracy, and show as a proof of principle that the proposed method is sufficient for 3D seed phenotyping purposes.
Affiliation(s)
- Hanno Scharr
- Institute of Bio- and Geo-sciences, IBG-2: Plant Sciences, Forschungszentrum Jülich GmbH, Jülich, Germany

39
40
41
Roussel J, Geiger F, Fischbach A, Jahnke S, Scharr H. 3D Surface Reconstruction of Plant Seeds by Volume Carving: Performance and Accuracies. FRONTIERS IN PLANT SCIENCE 2016. [PMID: 27375628 DOI: 10.3389/fpls.2016] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Subscribe] [Scholar Register] [Indexed: 05/08/2023]
Affiliation(s)
- Johanna Roussel
- Institute of Bio- and Geo-sciences, IBG-2: Plant Sciences, Forschungszentrum Jülich GmbH, Jülich, Germany
- Felix Geiger
- Institute of Bio- and Geo-sciences, IBG-2: Plant Sciences, Forschungszentrum Jülich GmbH, Jülich, Germany
- Andreas Fischbach
- Institute of Bio- and Geo-sciences, IBG-2: Plant Sciences, Forschungszentrum Jülich GmbH, Jülich, Germany
- Siegfried Jahnke
- Institute of Bio- and Geo-sciences, IBG-2: Plant Sciences, Forschungszentrum Jülich GmbH, Jülich, Germany
- Hanno Scharr
- Institute of Bio- and Geo-sciences, IBG-2: Plant Sciences, Forschungszentrum Jülich GmbH, Jülich, Germany

42
Minervini M, Scharr H, Tsaftaris SA. The significance of image compression in plant phenotyping applications. FUNCTIONAL PLANT BIOLOGY : FPB 2015; 42:971-988. [PMID: 32480737 DOI: 10.1071/fp15033] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/11/2015] [Accepted: 07/01/2015] [Indexed: 06/11/2023]
Abstract
We are currently witnessing increasingly high throughput in image-based plant phenotyping experiments. The majority of imaging data are collected using complex automated procedures and are then post-processed to extract phenotyping-related information. In this article, we show that the image compression used in such procedures may compromise phenotyping results, and this needs to be taken into account. We use three illuminating proof-of-concept experiments to demonstrate that compression (especially in the most common lossy JPEG form) affects measurements of plant traits and that the errors introduced can be high. We also systematically explore how compression affects measurement fidelity, quantified as effects on image quality, as well as errors in extracted plant visual traits. To do so, we evaluate a variety of image-based phenotyping scenarios, including size and colour of shoots, leaf growth, and root growth. To show that even visual impressions can be used to assess compression effects, we use root system images as examples. Overall, we find that compression has a considerable effect on several types of analyses (whether visual or quantitative) and that proper care is necessary to ensure that this choice does not affect biological findings. In order to avoid or at least minimise introduced measurement errors, for each scenario we derive recommendations and provide guidelines on how to identify suitable compression options in practice. We also find that certain compression choices can offer beneficial returns in terms of reducing the amount of data storage without compromising phenotyping results. This may enable even higher-throughput experiments in the future.
Affiliation(s)
- Massimo Minervini
- Pattern Recognition and Image Analysis, IMT Institute for Advanced Studies, Piazza S. Francesco 19, 55100 Lucca, Italy
- Hanno Scharr
- Institute of Bio- and Geosciences: Plant Sciences, Forschungszentrum Jülich GmbH, Wilhelm-Johnen-Straße, 52428 Jülich, Germany
- Sotirios A Tsaftaris
- Pattern Recognition and Image Analysis, IMT Institute for Advanced Studies, Piazza S. Francesco 19, 55100 Lucca, Italy

43
Fahlgren N, Gehan MA, Baxter I. Lights, camera, action: high-throughput plant phenotyping is ready for a close-up. CURRENT OPINION IN PLANT BIOLOGY 2015; 24:93-9. [PMID: 25733069 DOI: 10.1016/j.pbi.2015.02.006] [Citation(s) in RCA: 310] [Impact Index Per Article: 34.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/04/2015] [Revised: 02/13/2015] [Accepted: 02/13/2015] [Indexed: 05/18/2023]
Abstract
Anticipated population growth, shifting demographics, and environmental variability over the next century are expected to threaten global food security. In the face of these challenges, crop yield for food and fuel must be maintained and improved using fewer input resources. In recent years, genetic tools for profiling crop germplasm have benefited from rapid advances in DNA sequencing, and similar advances are now needed to improve the throughput of plant phenotyping. We highlight recent developments in high-throughput plant phenotyping using robotic-assisted imaging platforms and computer vision-assisted analysis tools.
Affiliation(s)
- Noah Fahlgren
- Donald Danforth Plant Science Center, United States
- Malia A Gehan
- Donald Danforth Plant Science Center, United States
- Ivan Baxter
- USDA-ARS, Donald Danforth Plant Science Center, United States

44
3-D Histogram-Based Segmentation and Leaf Detection for Rosette Plants. COMPUTER VISION - ECCV 2014 WORKSHOPS 2015. [DOI: 10.1007/978-3-319-16220-1_5] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/29/2022]
45
A review of imaging techniques for plant phenotyping. SENSORS 2014; 14:20078-111. [PMID: 25347588 PMCID: PMC4279472 DOI: 10.3390/s141120078] [Citation(s) in RCA: 360] [Impact Index Per Article: 36.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/08/2014] [Revised: 10/09/2014] [Accepted: 10/10/2014] [Indexed: 11/29/2022]
Abstract
Given the rapid development of plant genomic technologies, a lack of access to plant phenotyping capabilities limits our ability to dissect the genetics of quantitative traits. Effective, high-throughput phenotyping platforms have recently been developed to solve this problem. In high-throughput phenotyping platforms, a variety of imaging methodologies are being used to collect data for quantitative studies of complex traits related to growth, yield, and adaptation to biotic or abiotic stress (disease, insects, drought, and salinity). These imaging techniques include visible imaging (machine vision), imaging spectroscopy (multispectral and hyperspectral remote sensing), thermal infrared imaging, fluorescence imaging, 3D imaging, and tomographic imaging (MRT, PET, and CT). This paper presents a brief review of these imaging techniques and their applications in plant phenotyping. The features relevant to applying these imaging techniques to plant phenotyping are described and discussed.