1
Bouillon P, Fanciullino AL, Belin E, Bréard D, Boisard S, Bonnet B, Hanteville S, Bernard F, Celton JM. Image analysis and polyphenol profiling unveil red-flesh apple phenotype complexity. Plant Methods 2024; 20:71. [PMID: 38755652] [PMCID: PMC11100172] [DOI: 10.1186/s13007-024-01196-1] [Received: 07/13/2023] [Accepted: 04/28/2024] [Indexed: 05/18/2024]
Abstract
BACKGROUND The genetic basis of colour development in red-flesh apples (Malus domestica Borkh.) has been widely characterised; however, current models do not explain the observed variations in red pigmentation intensity and distribution. Available methods to evaluate the red-flesh trait rely on the estimation of an average overall colour using a discrete class notation index. However, colour variations among red-flesh cultivars are continuous, while development of red colour is non-homogeneous and genotype-dependent. A robust estimation of red-flesh colour intensity and distribution is essential to fully capture the diversity among genotypes and provide a basis for identifying loci influencing the red-flesh trait. RESULTS In this study, we developed a multivariable approach to evaluate the red-flesh trait in apple. This method was implemented to study the phenotypic diversity in a segregating hybrid F1 family (91 genotypes). We developed a Python pipeline based on image and colour analysis to quantitatively dissect the red-flesh pigmentation from RGB (Red Green Blue) images and compared the efficiency of the RGB and CIEL*a*b* colour spaces in discriminating genotypes previously classified by visual notation. Destructive chemical methods, including targeted-metabolite analysis using ultra-high performance liquid chromatography with ultraviolet detection (UPLC-UV), were performed to quantify the major phenolic compounds in fruit flesh, as well as pH and water content. Multivariate analyses were performed to study covariations of biochemical factors in relation to colour expression in the CIEL*a*b* colour space. Our results indicate that anthocyanin, flavonol and flavanol concentrations, as well as pH, are closely related to flesh pigmentation in apple. CONCLUSION Extraction of colour descriptors combined with chemical analyses helped discriminate genotypes in relation to their flesh colour. These results suggest that the red-flesh trait in apple is a complex trait associated with several biochemical factors.
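The RGB-versus-CIEL*a*b* comparison above rests on the standard colour-space conversion. As a hedged illustration, not the authors' pipeline (a real pipeline would more likely call a library such as scikit-image's rgb2lab), a minimal stdlib sketch of the sRGB to CIE L*a*b* conversion under a D65 white point looks like:

```python
def srgb_to_lab(r, g, b):
    """Convert one 8-bit sRGB pixel to CIE L*a*b* (D65 illuminant, 2-degree observer)."""
    # 1. Undo the sRGB gamma: 8-bit channel -> linear reflectance in [0, 1]
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # 2. Linear RGB -> CIE XYZ (standard sRGB matrix, D65 white)
    x = 0.4124564 * rl + 0.3575761 * gl + 0.1804375 * bl
    y = 0.2126729 * rl + 0.7151522 * gl + 0.0721750 * bl
    z = 0.0193339 * rl + 0.1191920 * gl + 0.9503041 * bl
    # 3. XYZ -> L*a*b*, normalised by the D65 reference white
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16          # lightness
    a = 500 * (fx - fy)        # green (-) to red (+) axis
    b_star = 200 * (fy - fz)   # blue (-) to yellow (+) axis
    return L, a, b_star
```

Applied per pixel of a flesh image, the a* channel (the green-red axis) yields a continuous red-pigmentation descriptor that can be averaged or mapped across the flesh, rather than a discrete class note.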
Affiliation(s)
- Pierre Bouillon
- Univ Angers, Institut Agro, INRAE, IRHS, SFR QUASAV, F-49000 Angers, France
- IFO, 49140 Seiches sur le Loir, France
- Etienne Belin
- Univ Angers, Institut Agro, INRAE, IRHS, SFR QUASAV, F-49000 Angers, France
- Dimitri Bréard
- Univ Angers, SONAS, SFR QUASAV, F-49000 Angers, France
- Séverine Boisard
- Univ Angers, SONAS, SFR QUASAV, F-49000 Angers, France
- Béatrice Bonnet
- Univ Angers, Institut Agro, INRAE, IRHS, SFR QUASAV, F-49000 Angers, France
- Sylvain Hanteville
- Univ Angers, Institut Agro, INRAE, IRHS, SFR QUASAV, F-49000 Angers, France
- Jean-Marc Celton
- Univ Angers, Institut Agro, INRAE, IRHS, SFR QUASAV, F-49000 Angers, France
2
Mathieu L, Reder M, Siah A, Ducasse A, Langlands-Perry C, Marcel TC, Morel JB, Saintenac C, Ballini E. SeptoSympto: a precise image analysis of Septoria tritici blotch disease symptoms using deep learning methods on scanned images. Plant Methods 2024; 20:18. [PMID: 38297386] [PMCID: PMC10832182] [DOI: 10.1186/s13007-024-01136-z] [Received: 06/26/2023] [Accepted: 01/07/2024] [Indexed: 02/02/2024]
Abstract
BACKGROUND Investigations of plant-pathogen interactions require quantitative, accurate, and rapid phenotyping of crop diseases. However, visual assessment of disease symptoms is preferred over available numerical tools due to transferability challenges. These assessments are laborious, time-consuming, require expertise, and are rater-dependent. More recently, deep learning has produced interesting results for evaluating plant diseases. Nevertheless, it has yet to be used to quantify the severity of Septoria tritici blotch (STB) caused by Zymoseptoria tritici, a frequently occurring and damaging disease of wheat crops. RESULTS We developed an image analysis script in Python, called SeptoSympto. This script uses deep learning models based on the U-Net and YOLO architectures to quantify necrosis and pycnidia on detached, flattened and scanned leaves of wheat seedlings. Datasets of different sizes (containing 50, 100, 200, and 300 leaves) were annotated to train convolutional neural network models. Five different datasets were tested to develop a robust tool for the accurate analysis of STB symptoms and to facilitate its transferability. The results show that (i) the amount of annotated data does not influence the performance of the models, (ii) the outputs of SeptoSympto are highly correlated with those of the experts, with a magnitude similar to the correlations between experts, and (iii) the accuracy of SeptoSympto allows precise and rapid quantification of necrosis and pycnidia on both durum and bread wheat leaves inoculated with different strains of the pathogen, scanned with different scanners and grown under different conditions. CONCLUSIONS SeptoSympto takes the same amount of time as a visual assessment to evaluate STB symptoms. However, unlike visual assessments, it allows data to be stored and evaluated by experts and non-experts alike in a more accurate and unbiased manner.
The methods used in SeptoSympto make it a transferable, highly accurate, computationally inexpensive, easy-to-use, and adaptable tool. This study demonstrates the potential of using deep learning to assess complex plant disease symptoms such as STB.
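Once the U-Net and YOLO models have produced a per-pixel necrosis mask and a pycnidia count, the severity summary reduces to mask arithmetic. The sketch below is an illustrative stand-in, not the published SeptoSympto code; the function name, output keys, and the 300 dpi default are assumptions:

```python
def stb_severity(leaf_mask, necrosis_mask, pycnidia_count, scan_dpi=300):
    """Summarise STB symptoms on one scanned leaf from per-pixel masks.

    leaf_mask / necrosis_mask: 2-D lists of 0/1 with the same shape, as a
    segmentation model might output after thresholding.
    """
    leaf_px = sum(sum(row) for row in leaf_mask)
    # Count necrotic pixels that fall inside the leaf only
    necro_px = sum(sum(n for n, l in zip(nrow, lrow) if l)
                   for nrow, lrow in zip(necrosis_mask, leaf_mask))
    if leaf_px == 0:
        return {"necrosis_pct": 0.0, "pycnidia_per_cm2": 0.0}
    px_per_cm = scan_dpi / 2.54            # pixels per cm at the scan resolution
    leaf_cm2 = leaf_px / px_per_cm ** 2    # leaf area in cm^2
    return {
        "necrosis_pct": 100.0 * necro_px / leaf_px,
        "pycnidia_per_cm2": pycnidia_count / leaf_cm2,
    }
```

Expressing both symptoms per unit leaf area is what makes scores comparable across leaves of different sizes and scanners of different resolutions.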
Affiliation(s)
- Laura Mathieu
- PHIM Plant Health Institute, Univ Montpellier, INRAE, CIRAD, Institut Agro, IRD, Montpellier, France
- Maxime Reder
- PHIM Plant Health Institute, Univ Montpellier, INRAE, CIRAD, Institut Agro, IRD, Montpellier, France
- Ali Siah
- BioEcoAgro, Junia, Lille University, Liège University, UPJV, Artois University, ULCO, INRAE, Lille, France
- Aurélie Ducasse
- PHIM Plant Health Institute, Univ Montpellier, INRAE, CIRAD, Institut Agro, IRD, Montpellier, France
- Jean-Benoît Morel
- PHIM Plant Health Institute, Univ Montpellier, INRAE, CIRAD, Institut Agro, IRD, Montpellier, France
- Elsa Ballini
- PHIM Plant Health Institute, Univ Montpellier, INRAE, CIRAD, Institut Agro, IRD, Montpellier, France
3
Carlier A, Dandrifosse S, Dumont B, Mercatoris B. Comparing CNNs and PLSr for estimating wheat organs biophysical variables using proximal sensing. Frontiers in Plant Science 2023; 14:1204791. [PMID: 38053768] [PMCID: PMC10694231] [DOI: 10.3389/fpls.2023.1204791] [Received: 04/14/2023] [Accepted: 10/30/2023] [Indexed: 12/07/2023]
Abstract
Estimation of biophysical vegetation variables is of interest for diverse applications, such as monitoring crop growth and health or predicting yield. However, remote estimation of these variables remains challenging due to the inherent complexity of plant architecture, biology and the surrounding environment, and the need for feature engineering. Recent advancements in deep learning, particularly convolutional neural networks (CNN), offer promising solutions to address this challenge. Unfortunately, the limited availability of labeled data has hindered the exploration of CNNs for regression tasks, especially in the context of crop phenotyping. In this study, the effectiveness of various CNN models in predicting wheat dry matter, nitrogen uptake, and nitrogen concentration from RGB and multispectral images taken from tillering to maturity was examined. To overcome the scarcity of labeled data, a training pipeline was devised. This pipeline involves transfer learning, pseudo-labeling of unlabeled data and temporal relationship correction. The results demonstrated that CNN models significantly benefit from the pseudo-labeling method, while the machine learning approach employing a PLSr did not show comparable performance. Among the models evaluated, EfficientNetB4 achieved the highest accuracy for predicting above-ground biomass, with an R² value of 0.92. In contrast, Resnet50 demonstrated superior performance in predicting LAI, nitrogen uptake, and nitrogen concentration, with R² values of 0.82, 0.73, and 0.80, respectively. Moreover, the study explored multi-output models to predict the distribution of dry matter and nitrogen uptake between the stem, inferior leaves, flag leaf, and ear. The findings indicate that CNNs hold promise as accessible tools for phenotyping quantitative biophysical variables of crops. However, further research is required to harness their full potential.
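The pseudo-labeling step described above (train on labeled data, predict the unlabeled pool, then retrain on the union) can be sketched independently of any CNN. The toy below substitutes a one-variable least-squares fit for the network purely to show the shape of the loop; all names are illustrative, and a real pipeline would add confidence filtering and the temporal correction the study mentions:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (1-D stand-in for a CNN regressor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def pseudo_label_round(x_lab, y_lab, x_unlab):
    """One pseudo-labeling round: teacher fit on labeled data, pseudo-label
    the unlabeled pool, then refit a student on real + pseudo labels."""
    a, b = fit_linear(x_lab, y_lab)                           # teacher
    y_pseudo = [a * x + b for x in x_unlab]                   # pseudo-labels
    a2, b2 = fit_linear(x_lab + x_unlab, y_lab + y_pseudo)    # student refit
    return a2, b2
```

The benefit in practice comes from the student seeing the unlabeled inputs' distribution; on data the teacher already fits exactly, the round is a no-op, which makes the loop easy to sanity-check.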
Affiliation(s)
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
- Benoit Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, Gembloux, Belgium
4
Carlier A, Dandrifosse S, Dumont B, Mercatoris B. To What Extent Does Yellow Rust Infestation Affect Remotely Sensed Nitrogen Status? Plant Phenomics (Washington, D.C.) 2023; 5:0083. [PMID: 37681000] [PMCID: PMC10482323] [DOI: 10.34133/plantphenomics.0083] [Received: 11/22/2022] [Accepted: 08/03/2023] [Indexed: 09/09/2023]
Abstract
The utilization of high-throughput in-field phenotyping systems presents new opportunities for evaluating crop stress. However, existing studies have primarily focused on individual stresses, overlooking the fact that crops in field conditions frequently encounter multiple stresses, which can display similar symptoms or interfere with the detection of other stress factors. Therefore, this study aimed to investigate the impact of wheat yellow rust on reflectance measurements and nitrogen status assessment. A multi-sensor mobile platform was utilized to capture RGB and multispectral images throughout a 2-year fertilization-fungicide trial. To identify disease-induced damage, the SegVeg approach, which combines a U-Net architecture and a pixel-wise classifier, was applied to the RGB images, generating a mask capable of distinguishing between healthy and damaged areas of the leaves. The observed proportion of damage in the images was as effective as visual scoring methods in explaining grain yield. Furthermore, the study discovered that the disease not only affected reflectance through leaf damage but also influenced the reflectance of healthy areas by disrupting the overall nitrogen status of the plants. This emphasizes the importance of incorporating disease impact into reflectance-based decision support tools to account for its effects on spectral data. This effect was successfully mitigated by employing the NDRE vegetation index calculated exclusively from the healthy portions of the leaves, or by incorporating the proportion of damage into the model. However, these findings also highlight the necessity for further research specifically addressing the challenges presented by multiple stresses in crop phenotyping.
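Computing NDRE only from the healthy portions of the leaves, as the study does to mitigate the disease effect, amounts to masked averaging of the band ratio (NIR - RedEdge) / (NIR + RedEdge). A minimal sketch, with illustrative names and flat per-pixel lists standing in for co-registered images:

```python
def ndre_healthy(nir, red_edge, damage_mask):
    """Mean NDRE over canopy pixels flagged healthy (damage_mask == 0).

    nir / red_edge: flat lists of reflectance values per canopy pixel;
    damage_mask: 1 where a segmentation mask marks the pixel as damaged.
    """
    vals = [(n - r) / (n + r)                      # per-pixel NDRE
            for n, r, d in zip(nir, red_edge, damage_mask)
            if not d]                              # skip disease-damaged pixels
    return sum(vals) / len(vals) if vals else float("nan")
```

Excluding damaged pixels keeps the index sensitive to nitrogen status rather than to lesion area, which is exactly the confounding the study set out to separate.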
Affiliation(s)
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Sebastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoît Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
5
Gao Y, Li Y, Jiang R, Zhan X, Lu H, Guo W, Yang W, Ding Y, Liu S. Enhancing Green Fraction Estimation in Rice and Wheat Crops: A Self-Supervised Deep Learning Semantic Segmentation Approach. Plant Phenomics (Washington, D.C.) 2023; 5:0064. [PMID: 37469555] [PMCID: PMC10353659] [DOI: 10.34133/plantphenomics.0064] [Received: 04/07/2023] [Accepted: 06/12/2023] [Indexed: 07/21/2023]
Abstract
The green fraction (GF), which is the fraction of green vegetation in a given viewing direction, is closely related to the light interception ability of the crop canopy. Monitoring the dynamics of GF is therefore of great interest for breeders to identify genotypes with high radiation use efficiency. The accuracy of GF estimation depends heavily on the quality of the segmentation dataset and the accuracy of the image segmentation method. To enhance segmentation accuracy while reducing annotation costs, we developed a self-supervised strategy for deep learning semantic segmentation of rice and wheat field images with very contrasting field backgrounds. First, the Digital Plant Phenotyping Platform was used to generate large, perfectly labeled simulated field images for wheat and rice crops, considering diverse canopy structures and a wide range of environmental conditions (sim dataset). We then used the domain adaptation model cycle-consistent generative adversarial network (CycleGAN) to bridge the reality gap between the simulated and real images (real dataset), producing simulation-to-reality images (sim2real dataset). Finally, 3 different semantic segmentation models (U-Net, DeepLabV3+, and SegFormer) were trained using 3 datasets (real, sim, and sim2real datasets). The performance of the 9 training strategies was assessed using real images captured from various sites. The results showed that SegFormer trained using the sim2real dataset achieved the best segmentation performance for both rice and wheat crops (rice: Accuracy = 0.940, F1-score = 0.937; wheat: Accuracy = 0.952, F1-score = 0.935). Likewise, favorable GF estimation results were obtained using the above strategy (rice: R2 = 0.967, RMSE = 0.048; wheat: R2 = 0.984, RMSE = 0.028). Compared with SegFormer trained using a real dataset, the optimal strategy demonstrated greater superiority for wheat images than for rice images. 
This discrepancy can be partially attributed to the differences in the backgrounds of the rice and wheat fields. The uncertainty analysis indicated that our strategy could be disrupted by inhomogeneous pixel brightness and the presence of senescent elements in the images. In summary, our self-supervised strategy addresses the issues of high cost and uncertain annotation accuracy during dataset creation, ultimately enhancing GF estimation accuracy for rice and wheat field images. The best model weights trained on wheat and rice are available at: https://github.com/PheniX-Lab/sim2real-seg.
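Both quantities the study reports, green fraction and pixel-wise F1, follow directly from binary masks. A stdlib sketch (illustrative, not the authors' evaluation code):

```python
def green_fraction(mask):
    """Green fraction: share of image pixels classified as vegetation."""
    flat = [p for row in mask for p in row]
    return sum(flat) / len(flat)

def f1_score(pred, ref):
    """Pixel-wise F1 between a predicted and a reference vegetation mask."""
    tp = fp = fn = 0
    for prow, rrow in zip(pred, ref):
        for p, r in zip(prow, rrow):
            tp += int(p and r)          # vegetation in both masks
            fp += int(p and not r)      # predicted vegetation, actually background
            fn += int(not p and r)      # missed vegetation
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 1.0  # empty masks agree perfectly
```

Note that GF error and F1 are not interchangeable: a model can balance false positives against false negatives to get GF nearly right while its F1 stays mediocre, which is why the study reports both.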
Affiliation(s)
- Yangmingrui Gao
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, Nanjing, China
- Yinglun Li
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, Nanjing, China
- Ruibo Jiang
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, Nanjing, China
- Xiaohai Zhan
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, Nanjing, China
- Hao Lu
- Key Laboratory of Image Processing and Intelligent Control, School of Artificial Intelligence and Automation, Huazhong University of Science and Technology, Wuhan, China
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Wanneng Yang
- National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research, and Hubei Key Laboratory of Agricultural Bioinformatics, Huazhong Agricultural University, Wuhan 430070, China
- Yanfeng Ding
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, Nanjing, China
- Shouyang Liu
- Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing Agricultural University, Nanjing, China
6
Madec S, Irfan K, Velumani K, Baret F, David E, Daubige G, Samatan LB, Serouart M, Smith D, James C, Camacho F, Guo W, De Solan B, Chapman SC, Weiss M. VegAnn, Vegetation Annotation of multi-crop RGB images acquired under diverse conditions for segmentation. Sci Data 2023; 10:302. [PMID: 37208401] [DOI: 10.1038/s41597-023-02098-y] [Received: 09/30/2022] [Accepted: 03/22/2023] [Indexed: 05/21/2023]
Abstract
Applying deep learning to images of cropping systems provides new knowledge and insights for research and commercial applications. Semantic segmentation, or pixel-wise classification, of RGB images acquired at ground level into vegetation and background is a critical step in the estimation of several canopy traits. Current state-of-the-art methodologies based on convolutional neural networks (CNNs) are trained on datasets acquired under controlled or indoor environments. These models are unable to generalize to real-world images and hence need to be fine-tuned using new labelled datasets. This motivated the creation of the VegAnn - Vegetation Annotation - dataset, a collection of 3775 multi-crop RGB images acquired at different phenological stages using different systems and platforms under diverse illumination conditions. We anticipate that VegAnn will help improve segmentation algorithm performance, facilitate benchmarking and promote large-scale crop vegetation segmentation research.
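A classical baseline that CNNs trained on datasets like VegAnn are meant to outperform is thresholding the Excess Green index, ExG = 2g - r - b on chromatic coordinates (r + g + b normalised to 1). A hedged sketch, with an illustrative threshold value:

```python
def exg_segment(rgb_pixels, threshold=0.1):
    """Excess-Green baseline: label a pixel as vegetation (1) when
    ExG = 2g - r - b, computed on chromatic coordinates, exceeds a threshold."""
    labels = []
    for r, g, b in rgb_pixels:
        s = r + g + b
        if s == 0:                      # pure black: treat as background
            labels.append(0)
            continue
        rn, gn, bn = r / s, g / s, b / s  # chromatic coordinates sum to 1
        exg = 2 * gn - rn - bn
        labels.append(1 if exg > threshold else 0)
    return labels
```

Thresholds like this break down under the shadows, specular highlights and senescent tissue that field images contain, which is precisely the generalization gap an annotated multi-condition dataset addresses.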
Affiliation(s)
- Simon Madec
- UMR TETIS, CIRAD, Montpellier, France
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
- Arvalis, 228 route de l'Aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Kamran Irfan
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
- HIPHEN SAS, 120 Rue Jean Dausset, Agroparc-Batiment Technicité, 84140 Avignon, France
- Kaaviya Velumani
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
- Frederic Baret
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
- Etienne David
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
- Arvalis, 228 route de l'Aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- HIPHEN SAS, 120 Rue Jean Dausset, Agroparc-Batiment Technicité, 84140 Avignon, France
- Gaetan Daubige
- Arvalis, 228 route de l'Aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Mario Serouart
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
- Arvalis, 228 route de l'Aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Daniel Smith
- The University of Queensland, School of Agriculture and Food Sciences, Gatton, QLD 4343, Australia
- Chrisbin James
- The University of Queensland, School of Agriculture and Food Sciences, Gatton, QLD 4343, Australia
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo 188-0002, Japan
- Benoit De Solan
- Arvalis, 228 route de l'Aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Scott C Chapman
- The University of Queensland, School of Agriculture and Food Sciences, Gatton, QLD 4343, Australia
- Marie Weiss
- INRAE, Avignon Université, UMR EMMAH 1114, 84000 Avignon, France
7
Serouart M, Lopez-Lozano R, Daubige G, Baumont M, Escale B, De Solan B, Baret F. Analyzing Changes in Maize Leaves Orientation due to GxExM Using an Automatic Method from RGB Images. Plant Phenomics (Washington, D.C.) 2023; 5:0046. [PMID: 37228515] [PMCID: PMC10204743] [DOI: 10.34133/plantphenomics.0046] [Received: 01/17/2023] [Accepted: 04/08/2023] [Indexed: 05/27/2023]
Abstract
The sowing pattern has an important impact on light interception efficiency in maize by determining the spatial distribution of leaves within the canopy. Leaf orientation is an important architectural trait determining light interception by maize canopies. Previous studies have indicated how maize genotypes may adapt leaf orientation to avoid mutual shading with neighboring plants as a plastic response to intraspecific competition. The goal of the present study is 2-fold: firstly, to propose and validate an automatic algorithm (Automatic Leaf Azimuth Estimation from Midrib detection [ALAEM]) based on leaf midrib detection in vertical red green blue (RGB) images to describe leaf orientation at the canopy level; and secondly, to describe genotypic and environmental differences in leaf orientation in a panel of 5 maize hybrids sown at 2 densities (6 and 12 plants.m-2) and 2 row spacings (0.4 and 0.8 m) across 2 different sites in southern France. The ALAEM algorithm was validated against in situ annotations of leaf orientation, showing a satisfactory agreement (root mean square error [RMSE] = 0.1, R2 = 0.35) in the proportion of leaves oriented perpendicular to the row direction across sowing patterns, genotypes, and sites. The results from ALAEM allowed the identification of significant differences in leaf orientation associated with intraspecific competition. In both experiments, a progressive increase in the proportion of leaves oriented perpendicular to the row is observed when the rectangularity of the sowing pattern increases from 1 (6 plants.m-2, 0.4 m row spacing) towards 8 (12 plants.m-2, 0.8 m row spacing). Significant differences among the 5 cultivars were found, with 2 hybrids systematically exhibiting a more plastic behavior, with a significantly higher proportion of leaves oriented perpendicularly to avoid overlapping with neighboring plants at high rectangularity. Differences in leaf orientation were also found between experiments in a square sowing pattern (6 plants.m-2, 0.4 m row spacing), indicating a possible contribution of illumination conditions, inducing a preferential orientation toward the east-west direction when intraspecific competition is low.
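The perpendicular-leaf proportion that ALAEM reports can be derived from midrib azimuths with simple folded-angle arithmetic. The sketch below is illustrative, not the published algorithm; in particular, the 45° split between "parallel" and "perpendicular" is an assumption:

```python
def perpendicular_fraction(leaf_azimuths_deg, row_azimuth_deg, tol=45.0):
    """Fraction of leaves oriented closer to perpendicular than parallel
    to the row. Midrib azimuths are undirected (a leaf at 10 deg is the
    same orientation as one at 190 deg), so differences are folded into
    [0, 90] before comparing against the split angle."""
    def acute(a, b):
        d = abs(a - b) % 180.0          # undirected difference in [0, 180)
        return min(d, 180.0 - d)        # fold to [0, 90]
    perp = sum(1 for az in leaf_azimuths_deg
               if acute(az, row_azimuth_deg) > tol)
    return perp / len(leaf_azimuths_deg)
```

Folding to [0, 90] is the key step: without it, a leaf pointing "east" and one pointing "west" along the same midrib line would be counted as different orientations.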
Affiliation(s)
- Mario Serouart
- Arvalis, Institut du végétal, 228, route de l’aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- INRAE, Avignon Université, UMR EMMAH, UMT CAPTE, 228, route de l’aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Raul Lopez-Lozano
- INRAE, Avignon Université, UMR EMMAH, UMT CAPTE, 228, route de l’aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Gaëtan Daubige
- Arvalis, Institut du végétal, 228, route de l’aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Maëva Baumont
- Arvalis, Ecophysiology, 21 Chemin de Pau, 64121 Montardon, France
- Brigitte Escale
- Arvalis, Ecophysiology, 21 Chemin de Pau, 64121 Montardon, France
- Benoit De Solan
- Arvalis, Institut du végétal, 228, route de l’aérodrome - CS 40509, 84914 Avignon Cedex 9, France
- Frédéric Baret
- INRAE, Avignon Université, UMR EMMAH, UMT CAPTE, 228, route de l’aérodrome - CS 40509, 84914 Avignon Cedex 9, France
8
Sharma N, Banerjee BP, Hayden M, Kant S. An Open-Source Package for Thermal and Multispectral Image Analysis for Plants in Glasshouse. Plants (Basel, Switzerland) 2023; 12:317. [PMID: 36679030] [PMCID: PMC9866171] [DOI: 10.3390/plants12020317] [Received: 11/13/2022] [Revised: 01/03/2023] [Accepted: 01/06/2023] [Indexed: 06/17/2023]
Abstract
Advanced plant phenotyping techniques to measure biophysical traits of crops are helping to deliver improved crop varieties faster. Phenotyping of plants using different sensors for image acquisition, combined with novel computational algorithms for analysis, is increasingly being adopted to measure plant traits. Thermal and multispectral imagery provides novel opportunities to reliably phenotype crop genotypes tested for biotic and abiotic stresses under glasshouse conditions. However, optimization of image acquisition, pre-processing, and analysis is required to correct for optical distortion, image co-registration, radiometric rescaling, and illumination. This study provides a computational pipeline that addresses these issues and synchronizes image acquisition from thermal and multispectral sensors. The image processing pipeline produces a processed stacked image comprising RGB, green, red, NIR, red edge, and thermal bands, containing only the pixels belonging to the object of interest, e.g., the plant canopy. These multimodal outputs from thermal and multispectral imaging of the plants can be compared and analysed jointly to provide complementary insights and to develop vegetation indices effectively. This study offers a digital platform and analytics to monitor early symptoms of biotic and abiotic stresses and to screen large numbers of genotypes for improved growth and productivity. The pipeline is packaged as open source and hosted online so that it can be utilized by researchers working with similar sensors for crop phenotyping.
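The final step described above, a stacked multimodal image keeping only object-of-interest pixels, boils down to aligning bands on one grid and filtering by a canopy mask. An illustrative stdlib sketch (the function name and record layout are assumptions, not the package's API; real co-registration and distortion correction happen upstream):

```python
def stack_canopy_pixels(bands, canopy_mask):
    """Stack co-registered single-band images into per-pixel records,
    keeping only pixels inside the canopy mask.

    bands: dict of band name -> 2-D list, all pre-aligned to the grid
           that canopy_mask is defined on (e.g. after co-registration).
    """
    names = sorted(bands)               # deterministic band order
    h, w = len(canopy_mask), len(canopy_mask[0])
    records = []
    for i in range(h):
        for j in range(w):
            if canopy_mask[i][j]:       # drop background/pot/bench pixels
                records.append({name: bands[name][i][j] for name in names})
    return records
```

With every band of a canopy pixel in one record, vegetation indices (e.g. an NDVI from the NIR and red entries) and thermal statistics can be computed jointly over exactly the same set of pixels.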
Affiliation(s)
- Neelesh Sharma
- Agriculture Victoria, Grains Innovation Park, 110 Natimuk Rd, Horsham, VIC 3400, Australia
- Bikram Pratap Banerjee
- Agriculture Victoria, Grains Innovation Park, 110 Natimuk Rd, Horsham, VIC 3400, Australia
- Matthew Hayden
- AgriBio, Centre for AgriBioscience, Agriculture Victoria, 5 Ring Road, Melbourne, VIC 3083, Australia
- School of Applied Systems Biology, La Trobe University, Melbourne, VIC 3083, Australia
- Surya Kant
- Agriculture Victoria, Grains Innovation Park, 110 Natimuk Rd, Horsham, VIC 3400, Australia
- AgriBio, Centre for AgriBioscience, Agriculture Victoria, 5 Ring Road, Melbourne, VIC 3083, Australia
- School of Applied Systems Biology, La Trobe University, Melbourne, VIC 3083, Australia