1. Rodene E, Fernando GD, Piyush V, Ge Y, Schnable JC, Ghosh S, Yang J. Image Filtering to Improve Maize Tassel Detection Accuracy Using Machine Learning Algorithms. Sensors (Basel) 2024; 24:2172. [PMID: 38610383; PMCID: PMC11013961; DOI: 10.3390/s24072172]
Abstract
Unmanned aerial vehicle (UAV)-based imagery has become widely used to collect time-series agronomic data, which are then incorporated into plant breeding programs to enhance crop improvement. In this study, we leveraged an aerial photography dataset from a field trial of 233 inbred lines from the maize diversity panel to develop machine learning methods for automated tassel counting at the plot level. We employed both an object-based counting-by-detection (CBD) approach and a density-based counting-by-regression (CBR) approach. Applying an image segmentation method that removes most pixels not associated with plant tassels dramatically improved the accuracy of object-based (CBD) detection, with cross-validation prediction accuracy (r²) peaking at 0.7033 for a detector trained on images filtered at a threshold of 90. The CBR approach was most accurate on unfiltered images, with a mean absolute error (MAE) of 7.99. Under bootstrapping, however, images filtered at a threshold of 90 yielded a slightly better MAE (8.65) than unfiltered images (8.90). These methods will enable accurate estimation of flowering-related traits and help inform breeding decisions for crop improvement.
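The filtering step this abstract describes lends itself to a short illustration. Below is a minimal sketch of threshold-based pixel filtering, assuming a plain grayscale-intensity cutoff; the study's actual segmentation filter and the channel it thresholds may differ.

```python
import numpy as np

def filter_image(img: np.ndarray, threshold: int = 90) -> np.ndarray:
    """Zero out pixels whose grayscale intensity falls below `threshold`.

    `img` is an H x W x 3 uint8 RGB array; the luminance proxy and the
    per-pixel cutoff are illustrative assumptions, not the paper's filter.
    """
    gray = img.mean(axis=2)             # simple per-pixel luminance proxy
    mask = gray >= threshold            # keep bright (tassel-like) pixels
    return img * mask[..., None].astype(img.dtype)

# Toy 2x2 image: bright pixels survive the cutoff, dark pixels are removed.
img = np.array([[[200, 210, 190], [10, 20, 15]],
                [[90, 90, 90],    [50, 60, 70]]], dtype=np.uint8)
filtered = filter_image(img, threshold=90)
```

In the study, detectors trained on images filtered at a threshold of 90 performed best; the cutoff here plays the same role of suppressing background pixels before detection.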
Affiliation(s)
- Eric Rodene
- Department of Agronomy and Horticulture, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Center for Plant Science Innovation, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Ved Piyush
- Department of Statistics, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Yufeng Ge
- Center for Plant Science Innovation, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- James C. Schnable
- Department of Agronomy and Horticulture, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Center for Plant Science Innovation, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Souparno Ghosh
- Department of Statistics, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Jinliang Yang
- Department of Agronomy and Horticulture, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
- Center for Plant Science Innovation, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
2. Mostafa S, Mondal D, Panjvani K, Kochian L, Stavness I. Explainable deep learning in plant phenotyping. Front Artif Intell 2023; 6:1203546. [PMID: 37795496; PMCID: PMC10546035; DOI: 10.3389/frai.2023.1203546]
Abstract
The increasing human population and variable weather conditions, due to climate change, pose a threat to the world's food security. To improve global food security, we need to provide breeders with tools to develop crop cultivars that are more resilient to extreme weather conditions and provide growers with tools to more effectively manage biotic and abiotic stresses in their crops. Plant phenotyping, the measurement of a plant's structural and functional characteristics, has the potential to inform, improve and accelerate both breeders' selections and growers' management decisions. To improve the speed, reliability and scale of plant phenotyping procedures, many researchers have adopted deep learning methods to estimate phenotypic information from images of plants and crops. Despite the successful results of these image-based phenotyping studies, the representations learned by deep learning models remain difficult to interpret, understand, and explain; for this reason, deep learning models are still considered black boxes. Explainable AI (XAI) is a promising approach for opening the deep learning black box and providing plant scientists with image-based phenotypic information that is interpretable and trustworthy. Although various fields of study have adopted XAI to advance their understanding of deep learning models, it has yet to be well studied in the context of plant phenotyping research. In this article, we review existing XAI studies in plant shoot phenotyping, as well as related domains, to help plant researchers understand the benefits of XAI and make it easier to integrate XAI into future studies. Elucidating the representations within a deep learning model can help researchers explain the model's decisions, relate the features detected by the model to the underlying plant physiology, and enhance the trustworthiness of image-based phenotypic information used in food production systems.
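As a concrete taste of XAI for image-based models, here is a minimal occlusion-sensitivity sketch: hide one patch of the input at a time and record how much the model's score drops, producing a heatmap of which regions the model relies on. The `model` below is a stand-in scoring function, not any model from the reviewed studies.

```python
import numpy as np

def occlusion_map(img, model, patch=2):
    """Occlusion sensitivity: slide a zeroed patch over the image and
    record the drop in the model's score when each region is hidden."""
    base = model(img)
    h, w = img.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = img.copy()
            occluded[i:i+patch, j:j+patch] = 0.0   # hide one patch
            heat[i // patch, j // patch] = base - model(occluded)
    return heat

# Stand-in "model": its score is the mean of the top-left quadrant, so
# hiding that quadrant should dominate the sensitivity map.
model = lambda im: im[:2, :2].mean()
img = np.ones((4, 4))
heat = occlusion_map(img, model, patch=2)
```

Large values in `heat` mark regions whose removal hurts the score most; for a phenotyping model, one would check that these align with the plant organ of interest rather than the background.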
Affiliation(s)
- Sakib Mostafa
- Department of Computer Science, University of Saskatchewan, Saskatoon, SK, Canada
- Debajyoti Mondal
- Department of Computer Science, University of Saskatchewan, Saskatoon, SK, Canada
- Karim Panjvani
- Global Institute for Food Security, University of Saskatchewan, Saskatoon, SK, Canada
- Leon Kochian
- Global Institute for Food Security, University of Saskatchewan, Saskatoon, SK, Canada
- Ian Stavness
- Department of Computer Science, University of Saskatchewan, Saskatoon, SK, Canada
3. Harandi N, Vandenberghe B, Vankerschaver J, Depuydt S, Van Messem A. How to make sense of 3D representations for plant phenotyping: a compendium of processing and analysis techniques. Plant Methods 2023; 19:60. [PMID: 37353846; DOI: 10.1186/s13007-023-01031-z]
Abstract
Computer vision technology is increasingly moving towards three-dimensional approaches, and plant phenotyping is following this trend. However, despite its potential, the complexity of analyzing 3D representations has been the main bottleneck hindering wider deployment of 3D plant phenotyping. In this review, we provide an overview of the typical steps for processing and analyzing 3D representations of plants, to offer potential users of 3D phenotyping a first gateway into its application and to stimulate its further development. We focus on applications that measure characteristics of single plants or crop canopies at a small scale in research settings, as opposed to large-scale crop monitoring in the field.
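As one example of the processing steps such a pipeline typically includes, here is a minimal voxel-grid downsampling sketch for a raw point cloud; this is illustrative only, and the review surveys many alternative processing steps.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel: float) -> np.ndarray:
    """Collapse an N x 3 point cloud onto a voxel grid, keeping one
    centroid per occupied voxel (a common first step to reduce density)."""
    idx = np.floor(points / voxel).astype(int)           # voxel index per point
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    counts = np.bincount(inverse)
    out = np.zeros((inverse.max() + 1, 3))
    for d in range(3):                                   # centroid per voxel
        out[:, d] = np.bincount(inverse, weights=points[:, d]) / counts
    return out

# Two points fall in the same 1-cm voxel and are merged; one stays alone.
pts = np.array([[0.1, 0.1, 0.1], [0.2, 0.2, 0.2], [1.5, 1.5, 1.5]])
down = voxel_downsample(pts, voxel=1.0)
```

Downsampling like this makes later steps (segmentation, surface fitting, trait extraction) tractable on dense scans at the cost of some fine detail.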
Affiliation(s)
- Negin Harandi
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Joris Vankerschaver
- Center for Biosystems and Biotech Data Science, Ghent University Global Campus, 119 Songdomunhwa-ro, Yeonsu-gu, Incheon, South Korea
- Department of Applied Mathematics, Computer Science and Statistics, Ghent University, Krijgslaan 281, S9, Ghent, Belgium
- Stephen Depuydt
- Erasmus Applied University of Sciences and Arts, Campus Kaai, Nijverheidskaai 170, Anderlecht, Belgium
- Arnout Van Messem
- Department of Mathematics, Université de Liège, Allée de la Découverte 12, Liège, Belgium
4. Kattenborn T, Richter R, Guimarães-Steinicke C, Feilhauer H, Wirth C. AngleCam: Predicting the temporal variation of leaf angle distributions from image series with deep learning. Methods Ecol Evol 2022. [DOI: 10.1111/2041-210x.13968]
Affiliation(s)
- Teja Kattenborn
- Remote Sensing Centre for Earth System Research (RSC4Earth), Leipzig University, Leipzig, Germany
- German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Germany
- Ronny Richter
- German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Germany
- Systematic Botany and Functional Biodiversity, Institute of Biology, Leipzig University, Leipzig, Germany
- Claudia Guimarães-Steinicke
- Remote Sensing Centre for Earth System Research (RSC4Earth), Leipzig University, Leipzig, Germany
- German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Germany
- Hannes Feilhauer
- Remote Sensing Centre for Earth System Research (RSC4Earth), Leipzig University, Leipzig, Germany
- German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Germany
- Christian Wirth
- German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig, Germany
- Systematic Botany and Functional Biodiversity, Institute of Biology, Leipzig University, Leipzig, Germany
5. Senger E, Osorio S, Olbricht K, Shaw P, Denoyes B, Davik J, Predieri S, Karhu S, Raubach S, Lippi N, Höfer M, Cockerton H, Pradal C, Kafkas E, Litthauer S, Amaya I, Usadel B, Mezzetti B. Towards smart and sustainable development of modern berry cultivars in Europe. Plant J 2022; 111:1238-1251. [PMID: 35751152; DOI: 10.1111/tpj.15876]
Abstract
Fresh berries are a popular and important component of the human diet. The demand for high-quality berries and sustainable production methods is increasing globally, challenging breeders to develop modern berry cultivars that fulfill all desired characteristics. Since 1994, research projects have characterized genetic resources, developed modern tools for high-throughput screening, and published data in publicly available repositories. However, the key findings of different disciplines are rarely linked together, and only a limited range of traits and genotypes has been investigated. The Horizon2020 project BreedingValue will address these challenges by studying a broader panel of strawberry, raspberry and blueberry genotypes in detail, in order to recover the lost genetic diversity that has limited the aroma and flavor intensity of recent cultivars. We will combine metabolic analysis with sensory panel tests and surveys to identify the key components of taste, flavor and aroma in berries across Europe, leading to a high-resolution map of quality requirements for future berry cultivars. Traits linked to berry yields and the effect of environmental stress will be investigated using modern image analysis methods and modeling. We will also use genetic analysis to determine the genetic basis of complex traits for the development and optimization of modern breeding technologies, such as molecular marker arrays, genomic selection and genome-wide association studies. Finally, the results, raw data and metadata will be made publicly available on the open platform Germinate in order to meet FAIR data principles and provide the basis for sustainable research in the future.
Affiliation(s)
- Elisa Senger
- Institute of Bio- and Geosciences, IBG-4 Bioinformatics, BioSC, CEPLAS, Forschungszentrum Jülich, Jülich, Germany
- Sonia Osorio
- Departamento de Biología Molecular y Bioquímica, Instituto de Hortofruticultura Subtropical y Mediterránea 'La Mayora', Universidad de Málaga-Consejo Superior de Investigaciones Científicas, Campus de Teatinos, Málaga, Spain
- Paul Shaw
- Department of Information and Computational Sciences, The James Hutton Institute, Invergowrie, Scotland, UK
- Béatrice Denoyes
- Université de Bordeaux, UMR BFP, INRAE, Villenave d'Ornon, France
- Jahn Davik
- Department of Molecular Plant Biology, Norwegian Institute of Bioeconomy Research (NIBIO), Ås, Norway
- Stefano Predieri
- Bio-Agrofood Department, Institute for Bioeconomy, IBE-CNR, Italian National Research Council, Bologna, Italy
- Saila Karhu
- Natural Resources Institute Finland (Luke), Turku, Finland
- Sebastian Raubach
- Department of Information and Computational Sciences, The James Hutton Institute, Invergowrie, Scotland, UK
- Nico Lippi
- Bio-Agrofood Department, Institute for Bioeconomy, IBE-CNR, Italian National Research Council, Bologna, Italy
- Monika Höfer
- Institute of Breeding Research on Fruit Crops, Federal Research Centre for Cultivated Plants (JKI), Dresden, Germany
- Helen Cockerton
- Genetics, Genomics and Breeding Department, NIAB, East Malling, UK
- Christophe Pradal
- CIRAD and UMR AGAP Institute, Montpellier, France
- INRIA and LIRMM, University Montpellier, CNRS, Montpellier, France
- Ebru Kafkas
- Department of Horticulture, Faculty of Agriculture, Çukurova University, Balcalı, Adana, Turkey
- Iraida Amaya
- Unidad Asociada de I+D+i IFAPA-CSIC Biotecnología y Mejora en Fresa, Málaga, Spain
- Laboratorio de Genómica y Biotecnología, Centro IFAPA de Málaga, Instituto Andaluz de Investigación y Formación Agraria y Pesquera, Málaga, Spain
- Björn Usadel
- Institute of Bio- and Geosciences, IBG-4 Bioinformatics, BioSC, CEPLAS, Forschungszentrum Jülich, Jülich, Germany
- Institute for Biological Data Science, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany
- Bruno Mezzetti
- Department of Agricultural, Food and Environmental Sciences, Università Politecnica delle Marche, Ancona, Italy
6. Okura F. 3D modeling and reconstruction of plants and trees: A cross-cutting review across computer graphics, vision, and plant phenotyping. Breed Sci 2022; 72:31-47. [PMID: 36045890; PMCID: PMC8987840; DOI: 10.1270/jsbbs.21074]
Abstract
This paper reviews past and current trends in three-dimensional (3D) modeling and reconstruction of plants and trees. These topics have been studied in multiple research fields, including computer vision, graphics, plant phenotyping, and forestry; this paper therefore provides a cross-cutting review. Representations of plant shape and structure are summarized first, since every method for plant modeling and reconstruction builds on such a representation. The methods are then categorized into (1) creating non-existent plants (modeling) and (2) creating models from real-world plants (reconstruction). The paper also discusses the limitations of current methods and possible future directions.
Affiliation(s)
- Fumio Okura
- Graduate School of Information Science and Technology, Osaka University, 1-5 Yamadaoka, Suita, Osaka 565-0871, Japan
7. Wu S, Wen W, Gou W, Lu X, Zhang W, Zheng C, Xiang Z, Chen L, Guo X. A miniaturized phenotyping platform for individual plants using multi-view stereo 3D reconstruction. Front Plant Sci 2022; 13:897746. [PMID: 36003825; PMCID: PMC9393617; DOI: 10.3389/fpls.2022.897746]
Abstract
Plant phenotyping is essential in plant breeding and management, and high-throughput data acquisition and automatic phenotype extraction are common concerns. Despite the development of phenotyping platforms and the realization of high-throughput three-dimensional (3D) data acquisition for tall plants such as maize, handling small plants with complex structural features remains a challenge. This study developed MVS-Pheno V2, a miniaturized shoot phenotyping platform focusing on low plant shoots. The platform improves on MVS-Pheno V1 and is based on multi-view stereo 3D reconstruction. It has four components: hardware, wireless communication and control, a data acquisition system, and a data processing system. The hardware places the rotating unit at the top of the platform, keeping the plant static during rotation. A novel local wireless network realizes communication and control, preventing cable twining. The data processing system calibrates point clouds and extracts phenotypes, including plant height, leaf area, projected area, shoot volume, and compactness. Three wheat cultivars at four growth stages were used to test the platform. The mean absolute percentage error of point cloud calibration was 0.585%. The squared correlation coefficient R² was 0.9991, 0.9949, and 0.9693 for plant height, leaf length, and leaf width, respectively, with corresponding root mean squared errors (RMSE) of 0.6996, 0.4531, and 0.1174 cm. The MVS-Pheno V2 platform provides an alternative solution for high-throughput phenotyping of low individual plants and is especially suitable for breeding and management studies related to shoot architecture.
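Two of the traits listed above can be sketched directly from a calibrated point cloud. The definitions below (Z extent for plant height, an occupancy-grid approximation for projected area) are illustrative assumptions, not necessarily those implemented in MVS-Pheno V2.

```python
import numpy as np

def shoot_traits(points: np.ndarray, cell: float = 0.5) -> dict:
    """Derive two simple traits from an N x 3 shoot point cloud (units: cm).

    Height is the Z extent; projected area counts occupied cells in a
    top-down occupancy grid. Both definitions are illustrative only."""
    height = points[:, 2].max() - points[:, 2].min()
    cells = np.floor(points[:, :2] / cell).astype(int)   # top-down grid index
    projected_area = len(np.unique(cells, axis=0)) * cell ** 2
    return {"height": height, "projected_area": projected_area}

# Three hypothetical points: two share a grid cell seen from above.
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.1, 5.0], [1.0, 1.0, 10.0]])
traits = shoot_traits(pts, cell=0.5)
```

The grid cell size trades off noise suppression against resolution, much as voxel size does in point cloud calibration.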
Affiliation(s)
- Sheng Wu
- Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Weiliang Wen
- Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Wenbo Gou
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Xianju Lu
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Wenqi Zhang
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Chenxi Zheng
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Zhiwei Xiang
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Liping Chen
- Intelligent Equipment Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Xinyu Guo
- Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- College of Agricultural Engineering, Jiangsu University, Zhenjiang, China
8. Serre NBC, Fendrych M. ACORBA: Automated workflow to measure Arabidopsis thaliana root tip angle dynamics. Quant Plant Biol 2022; 3:e9. [PMID: 37077987; PMCID: PMC10095971; DOI: 10.1017/qpb.2022.4]
Abstract
The ability of plants to sense gravity and orient their root growth accordingly is studied in many laboratories. Manual analysis of image data is subject to human bias, and while several semi-automated tools are available for analysing images from flatbed scanners, no solution automatically measures root bending angle over time in vertical-stage microscopy images. To address this, we developed ACORBA, automated software that measures root bending angle over time from vertical-stage microscope and flatbed scanner images; it also offers a semi-automated mode for camera or stereomicroscope images. ACORBA combines traditional image processing with deep learning segmentation to measure root angle progression over time. Because the software is automated, it limits human interaction and is reproducible. ACORBA will support the plant biology community by reducing labour and increasing the reproducibility of image analysis of root gravitropism.
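The central quantity here, root bending angle over time, reduces to the angle between the root tip's displacement vector and the gravity vector. Below is a minimal sketch with hypothetical tip coordinates; it is not ACORBA's actual segmentation-based pipeline.

```python
import numpy as np

def bending_angle(tip_prev, tip_now, gravity=(0.0, -1.0)):
    """Angle (degrees) between the tip displacement vector and gravity;
    0 means the root is growing straight down."""
    d = np.asarray(tip_now, float) - np.asarray(tip_prev, float)
    g = np.asarray(gravity, float)
    cos = d @ g / (np.linalg.norm(d) * np.linalg.norm(g))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Hypothetical tip positions (with Y pointing up; image coordinates with
# Y pointing down would flip the gravity vector).
straight = bending_angle((0, 0), (0, -10))   # growth straight down
bent = bending_angle((0, 0), (10, -10))      # 45-degree bend
```

Tracking this angle frame by frame over a time series gives the bending-angle dynamics the workflow reports.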
Affiliation(s)
- Nelson B C Serre
- Department of Experimental Plant Biology, Faculty of Sciences, Charles University, Prague, Czech Republic
- Matyáš Fendrych
- Department of Experimental Plant Biology, Faculty of Sciences, Charles University, Prague, Czech Republic
9. Salter WT, Shrestha A, Barbour MM. Open source 3D phenotyping of chickpea plant architecture across plant development. Plant Methods 2021; 17:95. [PMID: 34530876; PMCID: PMC8444385; DOI: 10.1186/s13007-021-00795-6]
Abstract
BACKGROUND Accurately assessing the 3D architecture of plant canopies allows us to better estimate plant productivity and improves our understanding of underlying plant processes, especially if these traits can be monitored across plant development. Photogrammetry techniques, such as structure from motion, have been shown to provide accurate 3D reconstructions of monocot crop species such as wheat and rice, yet there has been little success reconstructing crop species with smaller leaves and more complex branching architectures, such as chickpea. RESULTS In this work, we developed a low-cost 3D scanner and used an open-source data processing pipeline to assess the 3D structure of individual chickpea plants. The imaging system consists of a user-programmable turntable and three cameras that automatically capture 120 images of each plant and offload them to a computer for processing. Capture takes 5-10 min per plant, and the majority of the reconstruction process on a Windows PC is automated. Plant height and total plant surface area were validated against ground-truth measurements, producing R² > 0.99 and a mean absolute percentage error < 10%. We demonstrate the ability to assess several important architectural traits, including canopy volume and projected area, and to estimate relative growth rate in commercial chickpea cultivars and lines from local and international breeding collections. Detailed analysis of individual reconstructions also allowed us to investigate the partitioning of plant surface area and, by proxy, plant biomass. CONCLUSIONS Our results show that low-cost photogrammetry techniques can accurately reconstruct individual chickpea plants, a crop with a complex architecture of many small leaves and a highly branching structure. We hope that our use of open-source software and low-cost hardware will encourage others to adopt this promising technique for more architecturally complex species.
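The validation statistics quoted above (R² > 0.99, MAPE < 10%) can be computed from paired ground-truth and reconstructed measurements as follows; the height values here are hypothetical.

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination of predictions against ground truth."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = ((y_true - y_pred) ** 2).sum()
    ss_tot = ((y_true - y_true.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.abs((y_true - y_pred) / y_true).mean()

# Hypothetical plant heights (cm): ruler measurement vs. 3D reconstruction.
truth = [20.0, 30.0, 40.0, 50.0]
recon = [21.0, 29.0, 41.0, 49.0]
r2 = r_squared(truth, recon)   # 1 - 4/500 = 0.992
err = mape(truth, recon)       # about 3.21 %
```

Both metrics compare a reconstructed trait to a destructively or manually measured reference, which is the standard way such scanners are validated.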
Affiliation(s)
- William T. Salter
- School of Life and Environmental Sciences, Sydney Institute of Agriculture, The University of Sydney, Brownlow Hill, NSW 2570, Australia
- Arjina Shrestha
- School of Life and Environmental Sciences, Sydney Institute of Agriculture, The University of Sydney, Brownlow Hill, NSW 2570, Australia
- Margaret M. Barbour
- School of Life and Environmental Sciences, Sydney Institute of Agriculture, The University of Sydney, Brownlow Hill, NSW 2570, Australia
- School of Science, University of Waikato, Hillcrest, Hamilton 3216, New Zealand
10. Jin S, Su Y, Zhang Y, Song S, Li Q, Liu Z, Ma Q, Ge Y, Liu L, Ding Y, Baret F, Guo Q. Exploring Seasonal and Circadian Rhythms in Structural Traits of Field Maize from LiDAR Time Series. Plant Phenomics 2021; 2021:9895241. [PMID: 34557676; PMCID: PMC8441379; DOI: 10.34133/2021/9895241]
Abstract
Plant growth rhythms in structural traits are important for better understanding plant responses to an ever-changing environment. Terrestrial laser scanning (TLS) is well suited to studying structural rhythms under field conditions. Recent studies have used TLS to describe the structural rhythms of trees, but no consistent patterns have emerged, and whether TLS can capture structural rhythms in crops is unclear. Here, we explore the seasonal and circadian rhythms of maize structural traits at both the plant and leaf levels from time-series TLS. Seasonal rhythm was studied using TLS data collected at four key growth periods: jointing, bell-mouthed, heading, and maturity. Circadian rhythms were explored using TLS data acquired roughly every 2 hours over a whole day under standard and cold-stress conditions. Results showed that TLS can quantify seasonal and circadian rhythms in structural traits at both levels: (1) leaf inclination angle decreased significantly between the jointing and bell-mouthed stages, while leaf azimuth was stable after the jointing stage; (2) some individual-level structural rhythms (e.g., azimuth and projected leaf area, PLA) were consistent with leaf-level rhythms; (3) the circadian rhythms of some traits (e.g., PLA) were not consistent between standard and cold-stress conditions; and (4) environmental factors correlated better with leaf traits under cold stress than under standard conditions, with temperature the most important factor, significantly correlated with all leaf traits except leaf azimuth. This study highlights the potential of time-series TLS for studying outdoor agricultural chronobiology.
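Leaf inclination angle, one of the leaf-level traits above, can be estimated from TLS points by fitting a plane to a leaf's points and measuring the angle between the plane's normal and the vertical. Below is a least-squares (SVD) sketch, not necessarily the algorithm used in the paper.

```python
import numpy as np

def leaf_inclination(points: np.ndarray) -> float:
    """Inclination angle (degrees) of the best-fit plane through an N x 3
    leaf point cloud: the angle between the plane normal and vertical.
    0 = horizontal leaf, 90 = vertical leaf."""
    centered = points - points.mean(axis=0)
    # The right singular vector for the smallest singular value of the
    # centered points is the least-squares plane normal.
    normal = np.linalg.svd(centered)[2][-1]
    cos = abs(normal[2]) / np.linalg.norm(normal)
    return float(np.degrees(np.arccos(np.clip(cos, 0.0, 1.0))))

# A horizontal square of points, and the same square tilted 45° about X.
flat = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], float)
c = np.cos(np.pi / 4)
tilt = flat @ np.array([[1, 0, 0], [0, c, c], [0, -c, c]])
```

Repeating this fit per leaf across a scan time series yields the inclination-angle rhythms the study tracks.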
Affiliation(s)
- Shichao Jin
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored by Province and Ministry, Jiangsu Key Laboratory for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, China
- Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, International Institute for Earth System Sciences, Nanjing University, Nanjing, Jiangsu 210023, China
- State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Yanjun Su
- State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Yongguang Zhang
- Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, International Institute for Earth System Sciences, Nanjing University, Nanjing, Jiangsu 210023, China
- Shilin Song
- State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Qing Li
- National Technique Innovation Center for Regional Wheat Production/Key Laboratory of Crop Ecophysiology, Ministry of Agriculture, Nanjing Agricultural University, Nanjing 210095, Jiangsu, China
- Zhonghua Liu
- State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Qin Ma
- Department of Forestry, Mississippi State University, Mississippi State, MS 39759, USA
- Yan Ge
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored by Province and Ministry, Jiangsu Key Laboratory for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, China
- LingLi Liu
- State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Yanfeng Ding
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored by Province and Ministry, Jiangsu Key Laboratory for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, China
- Frédéric Baret
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Collaborative Innovation Centre for Modern Crop Production Co-Sponsored by Province and Ministry, Jiangsu Key Laboratory for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, China
- Environnement Méditerranéen et Modélisation des Agro-Hydrosystèmes (EMMAH), Institut National de la Recherche Agronomique, Unité Mixte de Recherche 1114, Domaine Saint-Paul, Avignon Cedex 84914, France
- Qinghua Guo
- Department of Ecology, College of Environmental Sciences, and Key Laboratory of Earth Surface Processes of the Ministry of Education, Peking University, Beijing 100871, China
11. Xiang L, Nolan TM, Bao Y, Elmore M, Tuel T, Gai J, Shah D, Wang P, Huser NM, Hurd AM, McLaughlin SA, Howell SH, Walley JW, Yin Y, Tang L. Robotic Assay for Drought (RoAD): an automated phenotyping system for brassinosteroid and drought responses. Plant J 2021; 107:1837-1853. [PMID: 34216161; DOI: 10.1111/tpj.15401]
Abstract
Brassinosteroids (BRs) are a group of plant steroid hormones involved in regulating growth, development, and stress responses. Many components of the BR pathway have previously been identified and characterized. However, BR phenotyping experiments are typically performed in a low-throughput manner, such as on Petri plates. Additionally, the BR pathway affects drought responses, but drought experiments are time consuming and difficult to control. To mitigate these issues and increase throughput, we developed the Robotic Assay for Drought (RoAD) system to perform BR and drought response experiments in soil-grown Arabidopsis plants. RoAD is equipped with a robotic arm, a rover, a bench scale, a precisely controlled watering system, an RGB camera, and a laser profilometer. It performs daily weighing, watering, and imaging tasks and is capable of administering BR response assays by watering plants with Propiconazole (PCZ), a BR biosynthesis inhibitor. We developed image processing algorithms for both plant segmentation and phenotypic trait extraction to accurately measure traits including plant area, plant volume, leaf length, and leaf width. We then applied machine learning algorithms that utilize the extracted phenotypic parameters to identify image-derived traits that can distinguish control, drought-treated, and PCZ-treated plants. We carried out PCZ and drought experiments on a set of BR mutants and Arabidopsis accessions with altered BR responses. Finally, we extended the RoAD assays to perform BR response assays using PCZ in Zea mays (maize) plants. This study establishes an automated and non-invasive robotic imaging system as a tool to accurately measure morphological and growth-related traits of Arabidopsis and maize plants in 3D, providing insights into the BR-mediated control of plant growth and stress responses.
Affiliation(s)
- Lirong Xiang
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Trevor M Nolan
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Yin Bao
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Mitch Elmore
- Department of Plant Pathology and Microbiology, Iowa State University, Ames, IA, 50011, USA
- Taylor Tuel
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Jingyao Gai
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Dylan Shah
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Ping Wang
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Nicole M Huser
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Ashley M Hurd
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Sean A McLaughlin
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Stephen H Howell
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Justin W Walley
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Department of Plant Pathology and Microbiology, Iowa State University, Ames, IA, 50011, USA
- Yanhai Yin
- Department of Genetics, Development and Cell Biology, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
- Lie Tang
- Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, 50011, USA
- Plant Sciences Institutes, Iowa State University, Ames, IA, 50011, USA
12
Fundamental Understanding of Tea Growth and Modeling of Precise Tea Shoot Picking Based on 3-D Coordinate Instrument. Processes (Basel) 2021. [DOI: 10.3390/pr9061059]
Abstract
Tea is a popular beverage worldwide and also has great medical value. A fundamental understanding of tea shoot growth and a precision picking model should be established to realize mechanized picking of tea shoots with a small product loss. Accordingly, the terminal bud length (Lbud), tea stem length (Lstem), terminal bud angle (αbud), tea stem angle (αstem), and growth time (t) were considered as the key growth parameters; the sum of the vertical lengths of the terminal bud and stem (ξ), the picking radius (r), and the vertical length of the stem (Zstem) were considered as the picking indexes of the tea shoots. The variations in growth parameters with time were investigated using a 3-D coordinate instrument, and the relationships between the growth parameters and the picking indexes were established using an artificial neural network (ANN). The results indicated that the tea growth cycles for periods P1, P2, P3, P4, P5, and P6 were 14, 7, 6, 4, 4, and 6 d, respectively. A growth cycle diagram of the tea growth was established. Moreover, a 5-2-12-3 ANN model was developed. The best prediction of ξ, r, and Zstem was found with 16 training epochs. The MSE value was 0.0923 × 10−4, and the R values for the training, test, and validation data were 0.99976, 0.99871, and 0.99857, respectively, indicating that the established ANN model demonstrates excellent performance in predicting the picking indexes of tea shoots.
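The 5-2-12-3 topology described above can be sketched as a plain feed-forward pass. This is an illustrative reconstruction, not the authors' trained model: the layer sizes follow the abstract (5 growth parameters in, 3 picking indexes out), while the tanh activations and the random placeholder weights are assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # Placeholder weights; the paper's trained parameters are not published here.
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

# 5-2-12-3 topology: inputs (Lbud, Lstem, alpha_bud, alpha_stem, t),
# two hidden layers (2 and 12 units), outputs (xi, r, Zstem).
layers = [init_layer(5, 2), init_layer(2, 12), init_layer(12, 3)]

def predict(x):
    h = np.asarray(x, dtype=float)
    for i, (w, b) in enumerate(layers):
        h = h @ w + b
        if i < len(layers) - 1:   # hidden layers use tanh,
            h = np.tanh(h)        # the output layer stays linear
    return h

# One forward pass for a hypothetical shoot measurement.
xi, r, z_stem = predict([12.0, 30.0, 15.0, 40.0, 7.0])
```

With trained weights in place of the random ones, the same forward pass would map measured growth parameters to the three picking indexes.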
13
Miao T, Wen W, Li Y, Wu S, Zhu C, Guo X. Label3DMaize: toolkit for 3D point cloud data annotation of maize shoots. Gigascience 2021; 10:6272094. [PMID: 33963385] [PMCID: PMC8105162] [DOI: 10.1093/gigascience/giab031] [Received: 11/14/2020] [Revised: 03/10/2021] [Accepted: 04/12/2021]
Abstract
Background: The 3D point cloud is the most direct and effective data form for studying plant structure and morphology. In point cloud studies, the segmentation of individual plants into organs directly determines the accuracy of organ-level phenotype estimation and the reliability of 3D plant reconstruction. However, highly accurate, automatic, and robust point cloud segmentation approaches for plants are unavailable, so high-throughput segmentation of many shoots remains challenging. Although deep learning could feasibly solve this issue, software tools for annotating 3D point clouds to construct training datasets are lacking.
Results: We propose a top-down point cloud segmentation algorithm for maize shoots based on optimal transportation distance. Our point cloud annotation toolkit for maize shoots, Label3DMaize, achieves semi-automatic point cloud segmentation and annotation of maize shoots at different growth stages through a series of operations: stem segmentation, coarse segmentation, fine segmentation, and sample-based segmentation. The toolkit takes ∼4–10 minutes to segment a maize shoot, and only 10–20% of that time if coarse segmentation alone is required. Fine segmentation is more detailed than coarse segmentation, especially at organ connection regions, and the accuracy of coarse segmentation reaches 97.2% of that of fine segmentation.
Conclusion: Label3DMaize integrates point cloud segmentation algorithms with manual interactive operations, realizing semi-automatic point cloud segmentation of maize shoots at different growth stages. The toolkit provides a practical data annotation tool for further segmentation research based on deep learning and is expected to promote automatic point cloud processing of various plants.
Affiliation(s)
- Teng Miao
- College of Information and Electrical Engineering, Shenyang Agricultural University, Dongling Road, Shenhe District, Liaoning Province, Shenyang 110161, China
- Weiliang Wen
- Beijing Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Beijing Key Lab of Digital Plant, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Yinglun Li
- National Engineering Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Beijing Key Lab of Digital Plant, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Sheng Wu
- Beijing Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Beijing Key Lab of Digital Plant, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Chao Zhu
- College of Information and Electrical Engineering, Shenyang Agricultural University, Dongling Road, Shenhe District, Liaoning Province, Shenyang 110161, China
- Xinyu Guo
- Beijing Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
- Beijing Key Lab of Digital Plant, 11#Shuguang Huayuan Middle Road, Haidian District, Beijing 100097, China
14
Ghahremani M, Williams K, Corke FMK, Tiddeman B, Liu Y, Doonan JH. Deep Segmentation of Point Clouds of Wheat. FRONTIERS IN PLANT SCIENCE 2021; 12:608732. [PMID: 33841454] [PMCID: PMC8025700] [DOI: 10.3389/fpls.2021.608732] [Received: 09/21/2020] [Accepted: 02/24/2021]
Abstract
The 3D analysis of plants has become increasingly effective in modeling the relative structure of organs and other traits of interest. In this paper, we introduce a novel pattern-based deep neural network, Pattern-Net, for segmentation of point clouds of wheat. This study is the first to segment the point clouds of wheat into defined organs and to analyse their traits directly in 3D space. Point clouds have no regular grid and thus their segmentation is challenging. Pattern-Net creates a dynamic link among neighbors to seek stable patterns from a 3D point set across several levels of abstraction using the K-nearest neighbor algorithm. To this end, different layers are connected to each other to create complex patterns from the simple ones, strengthen dynamic link propagation, alleviate the vanishing-gradient problem, encourage link reuse and substantially reduce the number of parameters. The proposed deep network is capable of analysing and decomposing unstructured complex point clouds into semantically meaningful parts. Experiments on a wheat dataset verify the effectiveness of our approach for segmentation of wheat in 3D space.
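The K-nearest-neighbour linking that Pattern-Net builds on can be illustrated with a brute-force neighbour search over a small point cloud. This is a generic sketch of the KNN step only, not the network itself, and the function name `knn_indices` is introduced here for illustration.

```python
import numpy as np

def knn_indices(points, k):
    """Indices of the k nearest neighbours of every 3D point (brute force).

    points: (N, 3) array; returns an (N, k) integer array of neighbour indices.
    Real point clouds would use a spatial index (e.g., a k-d tree) instead of
    the O(N^2) distance matrix computed here.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbour
    return np.argsort(d, axis=1)[:, :k]
```

Stacking such neighbourhood links across layers is what lets a network aggregate progressively more complex local patterns from an unstructured point set.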
Affiliation(s)
- Morteza Ghahremani
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
- Department of Computer Science, Aberystwyth University, Aberystwyth, United Kingdom
- Kevin Williams
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
- Fiona M. K. Corke
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
- Bernard Tiddeman
- Department of Computer Science, Aberystwyth University, Aberystwyth, United Kingdom
- Yonghuai Liu
- Department of Computer Science, Edge Hill University, Ormskirk, United Kingdom
- John H. Doonan
- National Plant Phenomics Centre, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth, United Kingdom
15
High-throughput image segmentation and machine learning approaches in the plant sciences across multiple scales. Emerg Top Life Sci 2021; 5:239-248. [DOI: 10.1042/etls20200273] [Received: 11/30/2020] [Revised: 02/09/2021] [Accepted: 02/11/2021]
Abstract
Agriculture has benefited greatly from the rise of big data and high-performance computing. The acquisition and analysis of data across biological scales have resulted in strategies modeling interactions between plant genotype and environment, models of root architecture that provide insight into resource utilization, and the elucidation of cell-to-cell communication mechanisms that are instrumental in plant development. Image segmentation and machine learning approaches for interpreting plant image data are among the many computational methodologies that have evolved to address challenging agricultural and biological problems. These approaches have led to contributions such as the accelerated identification of genes that modulate stress responses in plants and automated high-throughput phenotyping for early detection of plant diseases. The continued acquisition of high-throughput imaging across multiple biological scales provides opportunities to push the boundaries of our understanding faster than ever before. In this review, we explore the current state-of-the-art methodologies in plant image segmentation and machine learning at the agricultural, organ, and cellular scales. We show how the methodologies for segmentation and classification differ because of the diversity of physical characteristics found at these different scales. We also discuss the hardware technologies most commonly used at each scale, the types of quantitative metrics that can be extracted from these images, and how the biological mechanisms by which plants respond to abiotic/biotic stresses or genotypic modifications can be probed with these approaches.
16
Singh R, Gehlot A, Vaseem Akram S, Kumar Thakur A, Buddhi D, Kumar Das P. Forest 4.0: Digitalization of forest using the Internet of Things (IoT). JOURNAL OF KING SAUD UNIVERSITY - COMPUTER AND INFORMATION SCIENCES 2021. [DOI: 10.1016/j.jksuci.2021.02.009]
17
Jiang C, Ryu Y, Wang H, Keenan TF. An optimality-based model explains seasonal variation in C3 plant photosynthetic capacity. GLOBAL CHANGE BIOLOGY 2020; 26:6493-6510. [PMID: 32654330] [DOI: 10.1111/gcb.15276] [Received: 03/05/2020] [Accepted: 05/18/2020]
Abstract
The maximum rate of carboxylation (Vcmax) is an essential leaf trait determining the photosynthetic capacity of plants. Existing approaches for estimating Vcmax at large scale mainly rely on empirical relationships with proxies such as leaf nitrogen/chlorophyll content or hyperspectral reflectance, or on complicated inverse models from gross primary production or solar-induced fluorescence. A novel mechanistic approach, based on the assumption that plants optimize resource investment in coordination with their environment and growth, has been shown to accurately predict C3 plant Vcmax from mean growing-season environmental conditions. However, the ability of optimality theory to explain seasonal variation in Vcmax has not been fully investigated. Here, we adapt an optimality-based model to simulate daily Vcmax,25C (Vcmax at a standardized temperature of 25°C) by incorporating the effects of the antecedent environment, which affects current plant functioning, and dynamic light absorption, which coordinates with plant functioning. We then use seasonal Vcmax,25C field measurements from 10 sites across diverse ecosystems to evaluate model performance. Overall, the model explains about 83% of the seasonal variation in C3 plant Vcmax,25C across the 10 sites, with a median root mean square error of 12.3 μmol m−2 s−1, which suggests that seasonal changes in Vcmax,25C are consistent with optimal plant function. We show that failing to account for acclimation to the antecedent environment or coordination with dynamic light absorption dramatically decreases estimation accuracy. Our results show that the optimality-based approach can accurately reproduce seasonal variation in canopy photosynthetic potential, and suggest that incorporating such theory into next-generation trait-based terrestrial biosphere models would improve predictions of global photosynthesis.
Affiliation(s)
- Chongya Jiang
- Department of Landscape Architecture and Rural Systems Engineering, Seoul National University, Seoul, Korea
- Youngryel Ryu
- Department of Landscape Architecture and Rural Systems Engineering, Seoul National University, Seoul, Korea
- Han Wang
- Department of Earth System Science, Tsinghua University, Beijing, China
- Trevor F Keenan
- Climate and Ecosystem Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, CA, USA
- Department of Environmental Science, Policy and Management, UC Berkeley, Berkeley, CA, USA
18
Ko DK, Brandizzi F. Network-based approaches for understanding gene regulation and function in plants. THE PLANT JOURNAL : FOR CELL AND MOLECULAR BIOLOGY 2020; 104:302-317. [PMID: 32717108] [PMCID: PMC8922287] [DOI: 10.1111/tpj.14940] [Received: 04/12/2020] [Accepted: 07/14/2020]
Abstract
Expression reprogramming directed by transcription factors is a primary mode of gene regulation underlying most aspects of the biology of any organism. Our view of how gene regulation is coordinated is changing dramatically thanks to the advent and constant improvement of high-throughput profiling and transcriptional network inference methods: from the activities of individual genes to functional interactions across genes. These technical and analytical advances can reveal the topology of transcriptional networks in which hundreds of genes are hierarchically regulated by multiple transcription factors at the systems level. Here we review the state of the art of experimental and computational methods used in plant biology research to obtain large-scale datasets and model transcriptional networks. Examples of direct use of these network models and perspectives on their limitations and future directions are also discussed.
Affiliation(s)
- Dae Kwan Ko
- MSU-DOE Plant Research Lab, Michigan State University, East Lansing, MI 48824, USA
- Great Lakes Bioenergy Research Center, Michigan State University, East Lansing, MI 48824, USA
- Federica Brandizzi
- MSU-DOE Plant Research Lab, Michigan State University, East Lansing, MI 48824, USA
- Great Lakes Bioenergy Research Center, Michigan State University, East Lansing, MI 48824, USA
- Department of Plant Biology, Michigan State University, East Lansing, MI 48824, USA
19
Performances Evaluation of a Low-Cost Platform for High-Resolution Plant Phenotyping. SENSORS 2020; 20:s20113150. [PMID: 32498361] [PMCID: PMC7308841] [DOI: 10.3390/s20113150] [Received: 05/07/2020] [Revised: 05/28/2020] [Accepted: 05/30/2020]
Abstract
This study aims to test the performances of a low-cost and automatic phenotyping platform, consisting of a Red-Green-Blue (RGB) commercial camera scanning objects on rotating plates and the reconstruction of main plant phenotypic traits via the structure-from-motion (SfM) approach. The precision of this platform was tested in relation to three-dimensional (3D) models generated from images of potted maize, tomato and olive tree, acquired at a different frequency (steps of 4°, 8° and 12°) and quality (4.88, 6.52 and 9.77 µm/pixel). Plant and organ heights, angles and areas were extracted from the 3D models generated for each combination of these factors. Coefficient of determination (R2), relative Root Mean Square Error (rRMSE) and Akaike Information Criterion (AIC) were used as goodness-of-fit indexes to compare the simulated to the observed data. The results indicated that while the best performances in reproducing plant traits were obtained using 90 images at 4.88 µm/pixel (R2 = 0.81, rRMSE = 9.49% and AIC = 35.78), this corresponded to an unviable processing time (from 2.46 h to 28.25 h for herbaceous plants and olive trees, respectively). Conversely, 30 images at 4.88 µm/pixel resulted in a good compromise between a reliable reconstruction of the considered traits (R2 = 0.72, rRMSE = 11.92% and AIC = 42.59) and processing time (from 0.50 h to 2.05 h for herbaceous plants and olive trees, respectively). In any case, the results pointed out that this input combination may vary based on the trait under analysis, which can be more or less demanding in terms of input images and time according to the complexity of its shape (R2 = 0.83, rRMSE = 10.15% and AIC = 38.78). These findings highlight the reliability of the developed low-cost platform for plant phenotyping, further indicating the best combination of factors to speed up the acquisition and elaboration process, at the same time minimizing the bias between observed and simulated data.
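The three goodness-of-fit indexes used above (R2, rRMSE, AIC) are standard and can be computed as in the sketch below. Note that the AIC form shown, n·ln(SSE/n) + 2k for least-squares fits, is one common formulation and may differ from the exact variant the authors used; `goodness_of_fit` and its `n_params` argument are names introduced here.

```python
import numpy as np

def goodness_of_fit(obs, sim, n_params=1):
    """R2, relative RMSE (% of the observed mean), and a least-squares AIC."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    n = obs.size
    sse = np.sum((obs - sim) ** 2)
    r2 = 1.0 - sse / np.sum((obs - obs.mean()) ** 2)   # coefficient of determination
    rrmse = np.sqrt(sse / n) / obs.mean() * 100.0      # RMSE relative to observed mean, %
    aic = n * np.log(sse / n) + 2 * n_params           # assumed least-squares AIC variant
    return r2, rrmse, aic
```

Lower rRMSE and AIC and higher R2 all indicate a closer match between the 3D-model-derived and manually measured traits.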
20
Van den Broeck L, Gordon M, Inzé D, Williams C, Sozzani R. Gene Regulatory Network Inference: Connecting Plant Biology and Mathematical Modeling. Front Genet 2020; 11:457. [PMID: 32547596] [PMCID: PMC7270862] [DOI: 10.3389/fgene.2020.00457] [Received: 11/05/2019] [Accepted: 04/14/2020]
Abstract
Plant responses to environmental and intrinsic signals are tightly controlled by multiple transcription factors (TFs). These TFs and their regulatory connections form gene regulatory networks (GRNs), which provide a blueprint of the transcriptional regulations underlying plant development and environmental responses. This review provides examples of experimental methodologies commonly used to identify regulatory interactions and generate GRNs. Additionally, this review describes network inference techniques that leverage gene expression data to predict regulatory interactions. These computational and experimental methodologies yield complex networks that can identify new regulatory interactions, driving novel hypotheses. Biological properties that contribute to the complexity of GRNs are also described in this review. These include network topology, network size, transient binding of TFs to DNA, and competition between multiple upstream regulators. Finally, this review highlights the potential of machine learning approaches to leverage gene expression data to predict phenotypic outputs.
Affiliation(s)
- Lisa Van den Broeck
- Department of Plant and Microbial Biology, North Carolina State University, Raleigh, NC, United States
- Max Gordon
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC, United States
- Dirk Inzé
- Department of Plant Biotechnology and Bioinformatics, Ghent University, Ghent, Belgium
- VIB Center for Plant Systems Biology, Ghent, Belgium
- Cranos Williams
- Department of Electrical and Computer Engineering, North Carolina State University, Raleigh, NC, United States
- Rosangela Sozzani
- Department of Plant and Microbial Biology, North Carolina State University, Raleigh, NC, United States
21
Jiang Y, Li C. Convolutional Neural Networks for Image-Based High-Throughput Plant Phenotyping: A Review. PLANT PHENOMICS (WASHINGTON, D.C.) 2020; 2020:4152816. [PMID: 33313554] [PMCID: PMC7706326] [DOI: 10.34133/2020/4152816] [Received: 10/30/2019] [Accepted: 03/12/2020]
Abstract
Plant phenotyping has been recognized as a bottleneck for improving the efficiency of breeding programs, understanding plant-environment interactions, and managing agricultural systems. In the past five years, imaging approaches have shown great potential for high-throughput plant phenotyping, resulting in more attention paid to imaging-based plant phenotyping. With this increased amount of image data, it has become urgent to develop robust analytical tools that can extract phenotypic traits accurately and rapidly. The goal of this review is to provide a comprehensive overview of the latest studies using deep convolutional neural networks (CNNs) in plant phenotyping applications. We specifically review the use of various CNN architectures for plant stress evaluation, plant development, and postharvest quality assessment. We systematically organize the studies based on technical developments in image classification, object detection, and image segmentation, thereby identifying state-of-the-art solutions for specific phenotyping applications. Finally, we provide several directions for future research in the use of CNN architectures for plant phenotyping purposes.
Affiliation(s)
- Yu Jiang
- Horticulture Section, School of Integrative Plant Science, Cornell AgriTech, Cornell University, USA
- School of Electrical and Computer Engineering, College of Engineering, The University of Georgia, USA
- Phenomics and Plant Robotics Center, The University of Georgia, USA
- Changying Li
- School of Electrical and Computer Engineering, College of Engineering, The University of Georgia, USA
- Phenomics and Plant Robotics Center, The University of Georgia, USA
22
Wu S, Wen W, Wang Y, Fan J, Wang C, Gou W, Guo X. MVS-Pheno: A Portable and Low-Cost Phenotyping Platform for Maize Shoots Using Multiview Stereo 3D Reconstruction. PLANT PHENOMICS (WASHINGTON, D.C.) 2020; 2020:1848437. [PMID: 33313542] [PMCID: PMC7706320] [DOI: 10.34133/2020/1848437] [Received: 11/06/2019] [Accepted: 02/19/2020]
Abstract
Plant phenotyping technologies play important roles in plant research and agriculture. Detailed phenotypes of individual plants can guide the optimization of shoot architecture for plant breeding and are useful for analyzing morphological differences in response to environments for crop cultivation. Accordingly, high-throughput phenotyping technologies for individual plants grown in field conditions are urgently needed, and MVS-Pheno, a portable and low-cost phenotyping platform for individual plants, was developed. The platform is composed of four major components: a semiautomatic multiview stereo (MVS) image acquisition device, a data acquisition console, data processing and phenotype extraction software for maize shoots, and a data management system. The platform's device is detachable and adjustable according to the size of the target shoot. Image sequences for each maize shoot can be captured within 60-120 seconds; 3D point clouds of the shoots are then reconstructed using MVS-based commercial software, and phenotypic traits at the organ and individual-plant levels are extracted by the platform's software. The correlation coefficients (R2) between the extracted and manually measured plant height, leaf width, and leaf area values are 0.99, 0.87, and 0.93, respectively. A data management system has also been developed to store and manage the acquired raw data, reconstructed point clouds, agronomic information, and resulting phenotypic traits. The platform offers an optional solution for high-throughput phenotyping of field-grown plants, which is especially useful for large populations or experiments across many different ecological regions.
Affiliation(s)
- Sheng Wu
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
- Weiliang Wen
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
- Yongjian Wang
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
- Jiangchuan Fan
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
- Chuanyu Wang
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
- Wenbo Gou
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
- Xinyu Guo
- Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
- National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
- Beijing Key Lab of Digital Plant, Beijing 100097, China
23
Dutagaci H, Rasti P, Galopin G, Rousseau D. ROSE-X: an annotated data set for evaluation of 3D plant organ segmentation methods. PLANT METHODS 2020; 16:28. [PMID: 32158494] [PMCID: PMC7057657] [DOI: 10.1186/s13007-020-00573-w] [Received: 08/02/2019] [Accepted: 02/21/2020]
Abstract
BACKGROUND: The production and availability of annotated data sets are indispensable for training and evaluating automatic phenotyping methods. The need for complete 3D models of real plants with organ-level labeling is even more pronounced due to advances in 3D vision-based phenotyping techniques and the difficulty of fully annotating the intricate 3D plant structure. RESULTS: We introduce the ROSE-X data set of 11 annotated 3D models of real rosebush plants acquired through X-ray tomography and presented both in volumetric form and as point clouds. The annotation was performed manually to provide ground truth data in the form of organ labels for the voxels corresponding to the plant shoot. This data set is constructed to serve both as training data for supervised learning methods performing organ-level segmentation and as a benchmark to evaluate their performance. The rosebush models in the data set are of high quality and complex architecture, with organs frequently touching each other, posing a challenge for current plant organ segmentation methods. We report leaf/stem segmentation results obtained using four baseline methods. The best performance is achieved by the volumetric approach, where local features are trained with a random forest classifier, giving Intersection over Union (IoU) values of 97.93% and 86.23% for the leaf and stem classes, respectively. CONCLUSION: We provide an annotated 3D data set of 11 rosebush plants for training and evaluation of organ segmentation methods, together with leaf/stem segmentation results of baseline methods, which are open to improvement. The data set and baseline results have the potential to become a significant resource for future studies on automatic plant phenotyping.
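The per-class Intersection over Union score reported above is computed from the overlap of predicted and ground-truth label masks. A minimal sketch follows; the labels and arrays here are toy data, not the ROSE-X set, and `per_class_iou` is a name introduced for illustration.

```python
import numpy as np

def per_class_iou(truth, pred, label):
    """IoU for one class: |truth ∩ pred| / |truth ∪ pred| over that label."""
    truth, pred = np.asarray(truth), np.asarray(pred)
    inter = np.sum((truth == label) & (pred == label))
    union = np.sum((truth == label) | (pred == label))
    return inter / union if union else float("nan")

# Toy per-voxel (or per-point) organ labels.
truth = np.array(["leaf", "leaf", "stem", "stem", "leaf"])
pred  = np.array(["leaf", "stem", "stem", "stem", "leaf"])
iou_leaf = per_class_iou(truth, pred, "leaf")
iou_stem = per_class_iou(truth, pred, "stem")
```

An IoU of 1.0 means the predicted organ mask matches the ground truth exactly; the 97.93%/86.23% figures above are class-wise IoUs of this kind.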
Affiliation(s)
- Helin Dutagaci
- LARIS, UMR INRA IRHS, Université d’Angers, 62 Avenue Notre Dame du Lac, 49000 Angers, France
| | - Pejman Rasti
- LARIS, UMR INRA IRHS, Université d’Angers, 62 Avenue Notre Dame du Lac, 49000 Angers, France
- INRA, UMR1345 Institut de Recherche en Horticulture et Semences, 42 Georges Morel CS 60057, 49071 Beaucouze, France
- ESAIP, école d’ingénieur informatique et environnement, Saint Barthélemy d’Anjou, France
- Gilles Galopin
- INRA, UMR1345 Institut de Recherche en Horticulture et Semences, 42 Georges Morel CS 60057, 49071 Beaucouze, France
- David Rousseau
- LARIS, UMR INRA IRHS, Université d’Angers, 62 Avenue Notre Dame du Lac, 49000 Angers, France
- INRA, UMR1345 Institut de Recherche en Horticulture et Semences, 42 Georges Morel CS 60057, 49071 Beaucouze, France
24
Olas JJ, Fichtner F, Apelt F. All roads lead to growth: imaging-based and biochemical methods to measure plant growth. JOURNAL OF EXPERIMENTAL BOTANY 2020; 71:11-21. [PMID: 31613967 PMCID: PMC6913701 DOI: 10.1093/jxb/erz406] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/28/2019] [Accepted: 08/28/2019] [Indexed: 05/31/2023]
Abstract
Plant growth is a highly complex biological process that involves innumerable interconnected biochemical and signalling pathways. Many different techniques have been developed to measure growth, unravel the various processes that contribute to plant growth, and understand how a complex interaction between genotype and environment determines the growth phenotype. Despite this complexity, the term 'growth' is often simplified by researchers; depending on the method used for quantification, growth is viewed as an increase in plant or organ size, a change in cell architecture, or an increase in structural biomass. In this review, we summarise the cellular and molecular mechanisms underlying plant growth, highlight state-of-the-art imaging and non-imaging-based techniques to quantitatively measure growth, including a discussion of their advantages and drawbacks, and suggest a terminology for growth rates depending on the type of technique used.
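One standard quantitative definition in this space is the classical relative growth rate, RGR = (ln W2 − ln W1) / (t2 − t1), for biomass W measured at two times. A minimal sketch of this textbook formula, offered as an illustration and not as code from the review:

```python
import math

def relative_growth_rate(w1, w2, t1, t2):
    """Classical RGR: (ln W2 - ln W1) / (t2 - t1), in units of 1/time."""
    return (math.log(w2) - math.log(w1)) / (t2 - t1)

# Biomass doubling from 1.0 g to 2.0 g over 5 days
print(relative_growth_rate(1.0, 2.0, 0.0, 5.0))  # ~0.1386 per day
```

Because RGR is computed on a log scale, it normalises for plant size, which is one reason different measurement techniques (size-based vs. biomass-based) can yield growth rates that are not directly comparable.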
Affiliation(s)
- Justyna Jadwiga Olas
- University of Potsdam, Institute of Biochemistry and Biology, Karl-Liebknecht-Straße, Haus, Potsdam, Germany
- Franziska Fichtner
- Max Planck Institute of Molecular Plant Physiology, Am Mühlenberg, Potsdam, Germany
- Federico Apelt
- Max Planck Institute of Molecular Plant Physiology, Am Mühlenberg, Potsdam, Germany
25
Chaudhury A, Boudon F, Godin C. 3D Plant Phenotyping: All You Need is Labelled Point Cloud Data. COMPUTER VISION – ECCV 2020 WORKSHOPS 2020:244-260. [DOI: 10.1007/978-3-030-65414-6_18] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 07/19/2023]
26
Bernotas G, Scorza LCT, Hansen MF, Hales IJ, Halliday KJ, Smith LN, Smith ML, McCormick AJ. A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth. Gigascience 2019; 8:giz056. [PMID: 31127811 PMCID: PMC6534809 DOI: 10.1093/gigascience/giz056] [Citation(s) in RCA: 39] [Impact Index Per Article: 7.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/17/2018] [Revised: 03/25/2019] [Accepted: 04/21/2019] [Indexed: 11/13/2022] Open
Abstract
BACKGROUND Tracking and predicting the growth performance of plants in different environments is critical for predicting the impact of global climate change. Automated approaches to image capture and analysis have allowed substantial increases in the throughput of quantitative growth trait measurements compared with manual assessments. Recent work has focused on adopting computer vision and machine learning approaches to improve the accuracy of automated plant phenotyping. Here we present PS-Plant, a low-cost and portable 3D plant phenotyping platform based on photometric stereo (PS), an imaging technique novel to plant phenotyping. RESULTS We calibrated PS-Plant to track the model plant Arabidopsis thaliana throughout the day-night (diel) cycle and investigated growth architecture under a variety of conditions to illustrate the dramatic effect of the environment on plant phenotype. We developed bespoke computer vision algorithms and assessed available deep neural network architectures to automate the segmentation of rosettes and individual leaves, and to extract basic and more advanced traits from PS-derived data, including the tracking of 3D plant growth and diel leaf hyponastic movement. Furthermore, we have produced the first PS training data set, which includes 221 manually annotated Arabidopsis rosettes used for training and data analysis (1,768 images in total). A full protocol is provided, including all software components and an additional test data set. CONCLUSIONS PS-Plant is a powerful new phenotyping tool for plant research that provides robust data at high temporal and spatial resolutions. The system is well suited to small- and large-scale research and will help to accelerate the bridging of the phenotype-to-genotype gap.
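Photometric stereo, the technique underlying PS-Plant, recovers per-pixel surface normals from images taken under several known light directions by assuming a Lambertian surface. A minimal single-pixel sketch of the classical least-squares formulation; this is an illustration of the general technique, not the PS-Plant implementation, and the light directions are made up:

```python
import numpy as np

# Known unit light directions for three images, one per row (made-up values)
L = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, np.sqrt(3) / 2],
              [0.0, 0.5, np.sqrt(3) / 2]])

def surface_normal(intensities):
    """Lambertian model I = L @ (albedo * n): solve least squares for
    g = albedo * n, then split into a unit normal and the albedo."""
    g, *_ = np.linalg.lstsq(L, np.asarray(intensities, dtype=float), rcond=None)
    albedo = np.linalg.norm(g)
    return g / albedo, albedo

# A pixel whose true normal points straight at the camera (albedo 1)
n, albedo = surface_normal(L @ np.array([0.0, 0.0, 1.0]))
print(n, albedo)  # n ~ [0, 0, 1], albedo ~ 1
```

With three or more non-coplanar lights the system is (over)determined, which is what lets a per-pixel normal map, and from it 3D growth and leaf-angle movement, be recovered from ordinary 2D images.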
Affiliation(s)
- Gytis Bernotas
- Centre for Machine Vision, Bristol Robotics Laboratory, University of the West of England, T block, Frenchay Campus, Coldharbour Lane, Bristol BS16 1QY, UK
- Livia C T Scorza
- SynthSys & Institute of Molecular Plant Sciences, School of Biological Sciences, University of Edinburgh, The King's Buildings, Edinburgh EH9 3BF, UK
- Mark F Hansen
- Centre for Machine Vision, Bristol Robotics Laboratory, University of the West of England, T block, Frenchay Campus, Coldharbour Lane, Bristol BS16 1QY, UK
- Ian J Hales
- Centre for Machine Vision, Bristol Robotics Laboratory, University of the West of England, T block, Frenchay Campus, Coldharbour Lane, Bristol BS16 1QY, UK
- Karen J Halliday
- SynthSys & Institute of Molecular Plant Sciences, School of Biological Sciences, University of Edinburgh, The King's Buildings, Edinburgh EH9 3BF, UK
- Lyndon N Smith
- Centre for Machine Vision, Bristol Robotics Laboratory, University of the West of England, T block, Frenchay Campus, Coldharbour Lane, Bristol BS16 1QY, UK
- Melvyn L Smith
- Centre for Machine Vision, Bristol Robotics Laboratory, University of the West of England, T block, Frenchay Campus, Coldharbour Lane, Bristol BS16 1QY, UK
- Alistair J McCormick
- SynthSys & Institute of Molecular Plant Sciences, School of Biological Sciences, University of Edinburgh, The King's Buildings, Edinburgh EH9 3BF, UK