1. Leiva F, Dhakal R, Himanen K, Ortiz R, Chawade A. The Combination of Low-Cost, Red-Green-Blue (RGB) Image Analysis and Machine Learning to Screen for Barley Plant Resistance to Net Blotch. Plants (Basel) 2024; 13:1039. [PMID: 38611568] [PMCID: PMC11013667] [DOI: 10.3390/plants13071039]
Abstract
The challenges of climate change and population growth are exacerbated by noticeable environmental changes, which can expand the range of plant diseases such as net blotch (NB), a foliar disease that significantly decreases barley (Hordeum vulgare L.) grain yield and quality. Resistant germplasm is usually identified through visual observation and scoring of disease symptoms; however, this approach is subjective and time-consuming. Thus, automated, non-destructive, and low-cost disease-scoring approaches are highly relevant to barley breeding. This study presents a novel screening method for evaluating NB severity in barley. The proposed method uses an automated RGB imaging system together with machine learning to evaluate different symptoms and the severity of NB. The study was performed on three barley cultivars with distinct levels of resistance to NB (resistant, moderately resistant, and susceptible). The tested approach showed a mean precision of 99% across various categories of NB severity (chlorotic, necrotic, and fungal lesions, along with leaf tip necrosis). The results demonstrate that the proposed method can effectively assess NB on barley leaves and specify the level of NB severity; this type of information could be pivotal to precise selection for NB resistance in barley breeding.
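The per-class precision reported above is the diagonal of a confusion matrix divided by its column sums. A minimal NumPy sketch with an invented 4 x 4 matrix over the four symptom categories (the counts are illustrative, not the paper's data):

```python
import numpy as np

def per_class_precision(cm: np.ndarray) -> np.ndarray:
    """cm[i, j] = number of samples of true class i predicted as class j."""
    predicted_totals = cm.sum(axis=0)          # column sums
    correct = np.diag(cm).astype(float)
    # Avoid division by zero for classes that are never predicted.
    return np.divide(correct, predicted_totals,
                     out=np.zeros_like(correct), where=predicted_totals > 0)

# Hypothetical counts for: chlorotic, necrotic, fungal lesion, tip necrosis.
cm = np.array([[98, 1, 1, 0],
               [2, 95, 2, 1],
               [0, 1, 99, 0],
               [0, 1, 0, 99]])
prec = per_class_precision(cm)
mean_precision = prec.mean()
```

Averaging the per-class values, as done here, is the "macro" mean precision commonly reported for multi-class classifiers.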
Affiliation(s)
- Fernanda Leiva
- Department of Plant Breeding, Swedish University of Agricultural Sciences (SLU), P.O. Box 190, SE-23422 Lomma, Sweden
- Rishap Dhakal
- Department of Plant and Agroecosystem Sciences, University of Wisconsin-Madison, 1575 Linden Dr, Madison, WI 53706, USA
- Kristiina Himanen
- National Plant Phenotyping Infrastructure, Helsinki Institute of Life Science, Biocenter Finland, University of Helsinki, Latokartanonkaari 7, 00790 Helsinki, Finland
- Rodomiro Ortiz
- Department of Plant Breeding, Swedish University of Agricultural Sciences (SLU), P.O. Box 190, SE-23422 Lomma, Sweden
- Aakash Chawade
- Department of Plant Breeding, Swedish University of Agricultural Sciences (SLU), P.O. Box 190, SE-23422 Lomma, Sweden
2. Caredda C, Van Reeth E, Mahieu-Williame L, Sablong R, Sdika M, Schneider FC, Picart T, Guyotat J, Montcel B. Intraoperative identification of functional brain areas with RGB imaging using statistical parametric mapping: Simulation and clinical studies. Neuroimage 2023; 278:120286. [PMID: 37487945] [DOI: 10.1016/j.neuroimage.2023.120286]
Abstract
A technique complementary to preoperative fMRI and electrical brain stimulation (EBS) for glioma resection could dramatically improve the surgical procedure and patient care. Intraoperative RGB optical imaging is a technique for localizing functional areas of the human cerebral cortex that can be used during neurosurgical procedures. However, it still lacks the robustness required for use with neurosurgical microscopes as a clinical standard. In particular, robust quantification of biomarkers of brain functionality is needed to assist neurosurgeons. We propose a methodology to evaluate and optimize the intraoperative identification of functional brain areas by RGB imaging. It consists of a numerical 3D brain model, based on Monte Carlo simulations, used to evaluate intraoperative optical setups for identifying functional brain areas. We also adapted the fMRI Statistical Parametric Mapping technique to identify functional brain areas in RGB videos acquired from 12 patients. Simulation and experimental results were consistent and showed that the intraoperative identification of functional brain areas is possible with RGB imaging using deoxygenated hemoglobin contrast. Optical functional identifications were consistent with those provided by EBS and preoperative fMRI. We also demonstrated that halogen lighting may be particularly well suited for functional optical imaging. Finally, we showed that an RGB camera combined with quantitative modeling of brain hemodynamic biomarkers can robustly evaluate functional areas during neurosurgery and serve as a tool of choice to complement EBS and fMRI.
Affiliation(s)
- Charly Caredda
- Univ Lyon, INSA-Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS, Inserm, CREATIS UMR 5220, U1294, F69100, Lyon, France
- Eric Van Reeth
- Univ Lyon, INSA-Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS, Inserm, CREATIS UMR 5220, U1294, F69100, Lyon, France
- Laurent Mahieu-Williame
- Univ Lyon, INSA-Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS, Inserm, CREATIS UMR 5220, U1294, F69100, Lyon, France
- Raphaël Sablong
- Univ Lyon, INSA-Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS, Inserm, CREATIS UMR 5220, U1294, F69100, Lyon, France
- Michaël Sdika
- Univ Lyon, INSA-Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS, Inserm, CREATIS UMR 5220, U1294, F69100, Lyon, France
- Fabien C Schneider
- Service de Radiologie, Centre Hospitalier Universitaire de Saint Etienne, TAPE EA7423, Université de Lyon, UJM Saint Etienne, F42023, France
- Thiébaud Picart
- Service de Neurochirurgie D, Hospices Civils de Lyon, Bron, France
- Jacques Guyotat
- Service de Neurochirurgie D, Hospices Civils de Lyon, Bron, France
- Bruno Montcel
- Univ Lyon, INSA-Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS, Inserm, CREATIS UMR 5220, U1294, F69100, Lyon, France
3. Ye Z, Xu H, Huang Y, Yang M. Design of a Dual-Mode Multispectral Filter Array. Sensors (Basel) 2023; 23:6856. [PMID: 37571639] [PMCID: PMC10422536] [DOI: 10.3390/s23156856]
Abstract
Multispectral imaging is valuable in many vision-related fields as it provides an additional modality to observe the world. Cameras equipped with multispectral filter arrays (MSFAs) are typically impractical for everyday use due to their intractable demosaicking and chromatic reproduction processes, which restrict their applicability beyond academic research. In this work, a novel MSFA design is proposed to enable dual-mode imaging for multispectral cameras. In addition to a conventional multispectral image, the camera is also able to produce a Bayer-formed RGB image from a single shot by grouping and merging adjacent pixels in the proposed MSFA, making it suitable for scenarios where display-ready RGB images are required. Furthermore, a two-stage optimization scheme is implemented to jointly optimize objective functions for both imaging modes. The evaluation results on multiple datasets suggest that the proposed MSFA design is able to simultaneously achieve competitive spectral reconstruction accuracy compared to elaborate multispectral cameras and chromatic accuracy compared to commercial RGB cameras.
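The single-shot Bayer mode described above comes down to grouping and averaging adjacent MSFA pixels. A minimal NumPy sketch of 2 x 2 pixel binning (the block size and plain averaging rule are illustrative assumptions; the paper's actual MSFA layout is jointly optimized and not reproduced here):

```python
import numpy as np

def bin_to_bayer(raw: np.ndarray, block: int = 2) -> np.ndarray:
    """Average each non-overlapping block x block group of MSFA pixels,
    producing a half-resolution mosaic that a standard Bayer demosaicking
    pipeline could consume."""
    h, w = raw.shape
    assert h % block == 0 and w % block == 0
    return raw.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

# Toy 4x4 raw frame: each 2x2 block collapses into one superpixel.
raw = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_to_bayer(raw)   # shape (2, 2)
```

The reshape trick avoids explicit loops: axis 1 and axis 3 index within-block rows and columns, so averaging over them merges each block in one vectorized step.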
Affiliation(s)
- Haisong Xu
- State Key Laboratory of Extreme Photonics and Instrumentation, College of Optical Science and Engineering, Zhejiang University, Hangzhou 310027, China
4. Bethge H, Winkelmann T, Lüdeke P, Rath T. Low-cost and automated phenotyping system "Phenomenon" for multi-sensor in situ monitoring in plant in vitro culture. Plant Methods 2023; 19:42. [PMID: 37131210] [PMCID: PMC10152611] [DOI: 10.1186/s13007-023-01018-w]
Abstract
BACKGROUND The current development of sensor technologies towards ever more cost-effective and powerful systems is steadily increasing the application of low-cost sensors in different horticultural sectors. In plant in vitro culture, a fundamental technique for plant breeding and plant propagation, the majority of evaluation methods used to describe the performance of these cultures are based on destructive approaches, limiting data to single endpoint measurements. Therefore, a non-destructive phenotyping system capable of automated, continuous, and objective quantification of in vitro plant traits is desirable.
RESULTS An automated low-cost multi-sensor system acquiring phenotypic data of plant in vitro cultures was developed and evaluated. Unique hardware and software components were selected to construct an xyz-scanning system with adequate accuracy for consistent data acquisition. Relevant plant growth predictors, such as the projected area of explants and the average canopy height, were determined employing multi-sensory imaging, and various developmental processes could be monitored and documented. Validation of the RGB image segmentation pipeline using a random forest classifier revealed a very strong correlation with manual pixel annotation. Depth imaging of plant in vitro cultures by a laser distance sensor enabled description of the dynamic behavior of the average canopy height and the maximum plant height, as well as the culture medium height and volume. The projected plant area derived from depth data via a RANSAC (random sample consensus) segmentation approach matched the projected plant area from the RGB image-processing pipeline well. In addition, a successful proof of concept for in situ spectral fluorescence monitoring was achieved, and the challenges of thermal imaging were documented. Potential use cases for the digital quantification of key performance parameters in research and commercial applications are discussed.
CONCLUSION The technical realization of "Phenomenon" allows phenotyping of plant in vitro cultures under highly challenging conditions and enables multi-sensory monitoring through closed vessels, ensuring the aseptic status of the cultures. Automated sensor application in plant tissue culture holds great potential for non-destructive growth analysis, enhancing commercial propagation as well as enabling research with novel digital parameters recorded over time.
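The RANSAC idea used above for segmenting the flat culture-medium surface in depth data can be illustrated with the simplest possible model, a constant height level. This toy version (synthetic depth readings, invented parameters) sketches only the consensus principle, not the paper's actual pipeline:

```python
import numpy as np

def ransac_constant(values, iters=200, tol=0.5, seed=0):
    """RANSAC for a constant-level model: repeatedly pick one sample as a
    candidate level, count inliers within tol, and keep the largest
    consensus set. Stands in for fitting the flat medium surface."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    best_inliers = np.zeros(len(values), dtype=bool)
    for _ in range(iters):
        candidate = values[rng.integers(len(values))]
        inliers = np.abs(values - candidate) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return values[best_inliers].mean(), best_inliers

# 70% of readings near the medium surface at ~20 mm, the rest from canopy.
depths = np.concatenate([np.full(70, 20.0) + np.linspace(-0.2, 0.2, 70),
                         np.linspace(30, 80, 30)])
height, inliers = ransac_constant(depths)
```

The same loop generalizes to a plane model by sampling three points per iteration instead of one; the consensus step is unchanged.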
Affiliation(s)
- Hans Bethge
- Laboratory for Biosystems Engineering, Faculty of Agricultural Sciences and Landscape Architecture, Osnabrück University of Applied Sciences, Oldenburger Landstraße 24, 49090, Osnabrück, Germany
- Institute of Horticultural Production Systems, Section of Woody Plant and Propagation Physiology, Leibniz Universität Hannover, Herrenhäuser Str. 2, 30419, Hannover, Germany
- Traud Winkelmann
- Institute of Horticultural Production Systems, Section of Woody Plant and Propagation Physiology, Leibniz Universität Hannover, Herrenhäuser Str. 2, 30419, Hannover, Germany
- Thomas Rath
- Laboratory for Biosystems Engineering, Faculty of Agricultural Sciences and Landscape Architecture, Osnabrück University of Applied Sciences, Oldenburger Landstraße 24, 49090, Osnabrück, Germany
5. Sha W, Hu K, Weng S. Statistic and Network Features of RGB and Hyperspectral Imaging for Determination of Black Root Mold Infection in Apples. Foods 2023; 12:1608. [PMID: 37107403] [PMCID: PMC10137991] [DOI: 10.3390/foods12081608]
Abstract
Apples damaged by black root mold (BRM) lose moisture, vitamins, and minerals, and can also carry dangerous toxins. Determining the degree of infection allows for customized use of apples, reduces financial losses, and ensures food safety. In this study, red-green-blue (RGB) imaging and hyperspectral imaging (HSI) are combined to detect the degree of BRM infection in apple fruits. First, RGB and HSI images of healthy, mildly, moderately, and severely infected fruits are measured, and effective wavelengths (EWs) are screened from the HSI data by the random frog algorithm. Second, statistical and network features of the images are extracted using color moments and a convolutional neural network. Then, random forest (RF), K-nearest neighbor, and support vector machine classifiers are used to construct classification models with the above two feature types from the RGB images and the HSI images at the EWs. The best results, with 100% accuracy on the training set and 96% accuracy on the prediction set, are obtained by RF using the statistical and network features of both image types, outperforming the other cases. The proposed method furnishes an accurate and effective solution for determining the degree of BRM infection in apples.
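Color moments, one of the two feature types used here, are conventionally the mean, standard deviation, and skewness of each color channel, giving a nine-dimensional vector for an RGB image. A minimal NumPy sketch (the signed cube root keeps the skewness term in the original pixel units; the image is random test data, not from the study):

```python
import numpy as np

def color_moments(img: np.ndarray) -> np.ndarray:
    """First three color moments (mean, std, skewness) per channel of an
    H x W x 3 image, concatenated into a 9-dimensional feature vector."""
    feats = []
    for c in range(img.shape[2]):
        ch = img[..., c].astype(float).ravel()
        mu = ch.mean()
        sigma = ch.std()
        third = ((ch - mu) ** 3).mean()            # third central moment
        skew = np.sign(third) * abs(third) ** (1 / 3)
        feats.extend([mu, sigma, skew])
    return np.array(feats)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32, 3))
f = color_moments(img)   # shape (9,)
```

Such a vector would then be fed to a classifier such as random forest, alongside the CNN-derived network features described in the abstract.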
Affiliation(s)
- Wen Sha
- School of Electrical Engineering and Automation, Anhui University, 111 Jiulong Road, Hefei 230601, China
- Engineering Research Center of Autonomous Unmanned System Technology, Ministry of Education, Anhui University, 111 Jiulong Road, Hefei 230601, China
- Kang Hu
- School of Electrical Engineering and Automation, Anhui University, 111 Jiulong Road, Hefei 230601, China
- Shizhuang Weng
- National Engineering Research Center for Agro-Ecological Big Data Analysis & Application, Anhui University, 111 Jiulong Road, Hefei 230601, China
6. Thoday-Kennedy E, Dimech AM, Joshi S, Daetwyler HD, Hudson D, Spangenberg G, Hayden M, Kant S. An image dataset of diverse safflower (Carthamus tinctorius L.) genotypes for salt response phenotyping. Data Brief 2023; 46:108787. [PMID: 36506801] [DOI: 10.1016/j.dib.2022.108787]
Abstract
This article describes a dataset of high-resolution visible-spectrum images of safflower (Carthamus tinctorius L.) plants obtained from a LemnaTec Scanalyser automated phenomics platform, along with the associated image analysis output and manually acquired biomass data. The series contains 1832 images of 200 diverse safflower genotypes, acquired at Plant Phenomics Victoria, Horsham, Victoria, Australia. Two Prosilica GT RGB (red-green-blue) cameras were used to generate 6576 × 4384 pixel portable network graphic (PNG) images. Safflower genotypes were either subjected to a salt treatment (250 mM NaCl) or grown as a control (0 mM NaCl) and imaged daily from 15 to 36 days after sowing. Each snapshot consists of four images collected at a single time point: one taken from above (top view) and the rest from the side at 0°, 120°, or 240°. The dataset also includes analysis output quantifying traits and describing phenotypes, as well as manually collected biomass and leaf ion content data. Usage of the dataset has already been demonstrated in Thoday-Kennedy et al. (2021) [1]. The dataset describes the early growth differences of diverse safflower genotypes and identifies genotypes tolerant or susceptible to salinity stress. It provides detailed image analysis parameters for phenotyping a large safflower population that can be used to train image-based trait identification pipelines for a wide range of crop species.
7. Fu X, Bai Y, Zhou J, Zhang H, Xian J. A method for obtaining field wheat freezing injury phenotype based on RGB camera and software control. Plant Methods 2021; 17:120. [PMID: 34836556] [PMCID: PMC8620711] [DOI: 10.1186/s13007-021-00821-7]
Abstract
BACKGROUND Low-temperature freezing stress has adverse effects on wheat seedling growth and final yield. The traditional method of evaluating wheat injury caused by freezing stress is visual observation, which is time-consuming and laborious. Therefore, a more efficient and accurate method for freezing damage identification is urgently needed. RESULTS A high-throughput phenotyping system, namely the RGB freezing injury system, was developed in this paper to effectively and efficiently quantify wheat freezing injury in field environments. The system is able to automatically collect, process, and analyze wheat images captured by a mobile phenotyping cabin under field conditions. A data management system was also developed to store and manage the original images and the calculated phenotypic data. In this experiment, a total of 128 wheat varieties were planted, three nitrogen concentrations were applied, and two biological and technical replicates were performed. Wheat canopy images were collected at the seedling stage, and three image features, ExG, ExR, and ExV, were extracted for each wheat sample. We compared different test parameters and found that coverage had a greater impact on the freezing injury assessment. Based on the test results, we preliminarily defined four grades of freezing injury to evaluate different wheat varieties at the seedling stage. CONCLUSIONS The automatic phenotypic analysis method for freezing injury provides an alternative solution for high-throughput freezing damage analysis of field crops; it can be used to quantify freezing stress and can help accelerate the selection of wheat genotypes with excellent frost resistance.
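The ExG and ExR features mentioned above have standard definitions over chromatic (sum-normalized) RGB coordinates; the abstract does not define ExV, so it is omitted from this sketch:

```python
import numpy as np

def excess_indices(rgb: np.ndarray):
    """Excess-green (ExG) and excess-red (ExR) vegetation indices computed
    from chromatic (sum-normalized) RGB coordinates."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0                  # guard against all-zero pixels
    r, g, b = np.moveaxis(rgb / total, 2, 0)
    exg = 2 * g - r - b                      # ExG (Woebbecke et al.)
    exr = 1.4 * r - g                        # ExR (Meyer et al.)
    return exg, exr

# A pure-green pixel scores the maximum ExG of 2 and an ExR of -1.
pixel = np.array([[[0, 255, 0]]], dtype=np.uint8)
exg, exr = excess_indices(pixel)
```

Thresholding ExG (or ExG - ExR) is a common way to derive the canopy coverage fraction that the study found most informative for freezing injury.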
Affiliation(s)
- Xiuqing Fu
- College of Engineering, Nanjing Agricultural University, Nanjing, 210031, China
- Key Laboratory of Intelligence Agricultural Equipment of Jiangsu Province, Nanjing, 210031, China
- Yang Bai
- College of Engineering, Nanjing Agricultural University, Nanjing, 210031, China
- Jing Zhou
- Division of Food Systems and Bioengineering, University of Missouri, Columbia, MO, 65211, USA
- Hongwen Zhang
- School of Mechanical and Electrical Engineering, Shihezi University, Shihezi, 832003, China
- Jieyu Xian
- College of Engineering, Nanjing Agricultural University, Nanjing, 210031, China
8. Caredda C, Mahieu-Williame L, Sablong R, Sdika M, Schneider FC, Guyotat J, Montcel B. Intraoperative Resting-State Functional Connectivity Based on RGB Imaging. Diagnostics (Basel) 2021; 11:2067. [PMID: 34829414] [DOI: 10.3390/diagnostics11112067]
Abstract
RGB optical imaging is a marker-free, contactless, and non-invasive technique that is able to monitor the hemodynamic brain response following neuronal activation using task-based and resting-state procedures. Functional magnetic resonance imaging (fMRI) and functional near-infrared spectroscopy (fNIRS) resting-state procedures cannot be used intraoperatively, but RGB imaging provides an ideal solution for identifying resting-state networks during a neurosurgical operation. We applied resting-state methodologies to intraoperative RGB imaging and evaluated their ability to identify resting-state networks. We adapted two resting-state methodologies from fMRI for the identification of resting-state networks using intraoperative RGB imaging. Measurements were performed in 3 patients who underwent resection of lesions adjacent to motor sites. The resting-state networks were compared to the identifications provided by RGB task-based imaging and electrical brain stimulation. Intraoperative RGB resting-state networks corresponded to RGB task-based imaging (DICE: 0.55 ± 0.29). The two resting-state procedures showed a strong correspondence with each other (DICE: 0.66 ± 0.11) and with electrical brain stimulation. RGB imaging is thus a relevant technique for intraoperative resting-state network identification. Intraoperative resting-state imaging has several advantages over functional task-based analyses: data acquisition is shorter, less complex, and less demanding for the patients, especially for those unable to perform the tasks.
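The DICE coefficient used above to compare the resting-state maps is the standard overlap measure 2|A∩B| / (|A| + |B|) between two binary masks. A minimal sketch on toy binary maps (the arrays are invented, not patient data):

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """DICE similarity between two binary activation maps."""
    a = a.astype(bool)
    b = b.astype(bool)
    denom = a.sum() + b.sum()
    # Two empty maps are conventionally treated as a perfect match.
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

m1 = np.array([[1, 1, 0], [0, 1, 0]])
m2 = np.array([[1, 0, 0], [0, 1, 1]])
score = dice(m1, m2)   # 2*2 / (3+3) = 0.667
```

A score of 1 means identical maps and 0 means no overlap; values such as 0.55 and 0.66 indicate substantial but partial agreement between identification methods.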
9. Thoday-Kennedy E, Joshi S, Daetwyler HD, Hayden M, Hudson D, Spangenberg G, Kant S. Digital Phenotyping to Delineate Salinity Response in Safflower Genotypes. Front Plant Sci 2021; 12:662498. [PMID: 34220887] [PMCID: PMC8242588] [DOI: 10.3389/fpls.2021.662498]
Abstract
Salinity is a major contributing factor to the degradation of arable land and to reductions in crop growth and yield. To overcome these limitations, the breeding of crop varieties with improved salt tolerance is needed. This requires effective and high-throughput phenotyping to optimize germplasm enhancement. Safflower (Carthamus tinctorius L.) is an underappreciated but highly versatile oilseed crop, capable of growing in saline and arid environments. To develop an effective and rapid phenotyping protocol to differentiate salt responses in safflower genotypes, experiments were conducted in the automated imaging facility at Plant Phenomics Victoria, Horsham, focussing on digital phenotyping at early vegetative growth. The initial experiment, at 0, 125, 250, and 350 mM sodium chloride (NaCl), showed that 250 mM NaCl was optimal for differentiating salt-sensitive and salt-tolerant genotypes. Phenotyping of a diverse set of 200 safflower genotypes using the developed protocol defined four classes of salt tolerance or sensitivity, based on biomass and ion accumulation. Salt tolerance in safflower depended on the exclusion of Na+ from shoot tissue and the maintenance of K+ uptake. The salinity responses identified in glasshouse experiments showed some consistency with the performance of representatively selected genotypes tested under sodic field conditions. Overall, our results suggest that digital phenotyping can be an effective high-throughput approach for identifying candidate genotypes for salt tolerance in safflower.
Affiliation(s)
- Sameer Joshi
- Agriculture Victoria, Grains Innovation Park, Horsham, VIC, Australia
- Hans D. Daetwyler
- Agriculture Victoria, AgriBio, Centre for AgriBioscience, Bundoora, VIC, Australia
- School of Applied Systems Biology, La Trobe University, Bundoora, VIC, Australia
- Matthew Hayden
- Agriculture Victoria, AgriBio, Centre for AgriBioscience, Bundoora, VIC, Australia
- School of Applied Systems Biology, La Trobe University, Bundoora, VIC, Australia
- David Hudson
- GO Resources Pty Ltd., Brunswick, VIC, Australia
- German Spangenberg
- Agriculture Victoria, AgriBio, Centre for AgriBioscience, Bundoora, VIC, Australia
- School of Applied Systems Biology, La Trobe University, Bundoora, VIC, Australia
- Surya Kant
- Agriculture Victoria, Grains Innovation Park, Horsham, VIC, Australia
- Centre for Agricultural Innovation, School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, The University of Melbourne, Melbourne, VIC, Australia
10. Francesconi S, Harfouche A, Maesano M, Balestra GM. UAV-Based Thermal, RGB Imaging and Gene Expression Analysis Allowed Detection of Fusarium Head Blight and Gave New Insights Into the Physiological Responses to the Disease in Durum Wheat. Front Plant Sci 2021; 12:628575. [PMID: 33868331] [PMCID: PMC8047627] [DOI: 10.3389/fpls.2021.628575]
Abstract
Wheat is one of the world's most economically important cereal crops, grown on 220 million hectares. Fusarium head blight (FHB) disease is considered a major threat to durum (Triticum turgidum subsp. durum (Desfontaines) Husnache) and bread wheat (T. aestivum L.) cultivars and is mainly managed by the application of fungicides at anthesis. However, fungicides are typically applied when FHB symptoms are clearly visible and the spikes are almost entirely bleached (> 80% diseased spikelets), by which time it is too late to control FHB disease. For this reason, farmers often react by performing repeated fungicide treatments that, given the advanced state of the infection, waste money and pose significant risks to the environment and non-target organisms. In the present study, we used unmanned aerial vehicle (UAV)-based thermal infrared (TIR) and red-green-blue (RGB) imaging for FHB detection in T. turgidum (cv. Marco Aurelio) under natural field conditions. TIR and RGB data, coupled with ground-based measurements such as spike temperature, photosynthetic efficiency, and molecular identification of FHB pathogens, detected FHB at mid-anthesis (Zadoks stage 65, ZS 65), when the percentage of diseased spikelets ranged between 20% and 60%. Moreover, in greenhouse experiments, the transcripts of key genes involved in stomatal closure were mostly up-regulated in F. graminearum-inoculated plants, suggesting that the physiological mechanism behind the increase in spike temperature and the decrease in photosynthetic efficiency could be attributed to the closure of the guard cells in response to F. graminearum. In addition, preliminary analysis revealed differential regulation of genes between drought-stressed and F. graminearum-inoculated plants, suggesting that it may be possible to discriminate between water stress and FHB infection.
This study shows the potential of UAV-based TIR and RGB imaging for field phenotyping of wheat and other cereal crop species in response to environmental stresses. This approach holds enormous promise for the detection of FHB disease and has major implications for optimizing the application of fungicides, since global food crop demand must be met with minimal environmental impact.
Affiliation(s)
- Sara Francesconi
- Department of Agriculture and Forest Sciences (DAFNE), University of Tuscia, Viterbo, Italy
- Antoine Harfouche
- Department for Innovation in Biological, Agro-Food and Forest Systems (DIBAF), University of Tuscia, Viterbo, Italy
- Mauro Maesano
- Department for Innovation in Biological, Agro-Food and Forest Systems (DIBAF), University of Tuscia, Viterbo, Italy
11. Zubler AV, Yoon JY. Proximal Methods for Plant Stress Detection Using Optical Sensors and Machine Learning. Biosensors (Basel) 2020; 10:193. [PMID: 33260412] [PMCID: PMC7760370] [DOI: 10.3390/bios10120193]
Abstract
Plant stresses have been monitored using the imaging or spectrometry of plant leaves in the visible (red-green-blue or RGB), near-infrared (NIR), infrared (IR), and ultraviolet (UV) wavebands, often augmented by fluorescence imaging or fluorescence spectrometry. Imaging at multiple specific wavelengths (multi-spectral imaging) or across a wide range of wavelengths (hyperspectral imaging) can provide exceptional information on plant stress and subsequent diseases. Digital cameras, thermal cameras, and optical filters have become available at a low cost in recent years, while hyperspectral cameras have become increasingly more compact and portable. Furthermore, smartphone cameras have dramatically improved in quality, making them a viable option for rapid, on-site stress detection. Due to these developments in imaging technology, plant stresses can be monitored more easily using handheld and field-deployable methods. Recent advances in machine learning algorithms have allowed for images and spectra to be analyzed and classified in a fully automated and reproducible manner, without the need for complicated image or spectrum analysis methods. This review will highlight recent advances in portable (including smartphone-based) detection methods for biotic and abiotic stresses, discuss data processing and machine learning techniques that can produce results for stress identification and classification, and suggest future directions towards the successful translation of these methods into practical use.
Affiliation(s)
- Jeong-Yeol Yoon
- Department of Biosystems Engineering, The University of Arizona, Tucson, AZ 85721, USA
12. Fernandez-Gallego JA, Lootens P, Borra-Serrano I, Derycke V, Haesaert G, Roldán-Ruiz I, Araus JL, Kefauver SC. Automatic wheat ear counting using machine learning based on RGB UAV imagery. Plant J 2020; 103:1603-1613. [PMID: 32369641] [DOI: 10.1111/tpj.14799]
Abstract
In wheat (Triticum aestivum L.) and other cereals, the number of ears per unit area is one of the main yield-determining components. Automatic evaluation of this parameter may contribute to the advance of wheat phenotyping and monitoring. There is no standard protocol for wheat ear counting in the field, and manual counting is time-consuming. An automatic ear-counting system is proposed, using machine learning techniques based on RGB (red, green, blue) images acquired from an unmanned aerial vehicle (UAV). Evaluation was performed on a set of 12 winter wheat cultivars with three nitrogen treatments during the 2017-2018 crop season. The automatic system uses a frequency filter, segmentation, and feature extraction, with different classification techniques, to discriminate wheat ears in micro-plot images. The relationship between image-based manual counting and algorithm counting exhibited high levels of accuracy and efficiency. In addition, manual ear counting was conducted in the field for secondary validation. The correlations of the automatic and the manual in situ ear counts with grain yield were also compared. Correlations between the automatic ear counting and grain yield were stronger than those between manual in situ counting and grain yield, particularly for the lower nitrogen treatment. Methodological requirements and limitations are discussed.
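Once ears have been segmented into a binary mask, counting reduces to labeling connected components. The paper's pipeline (frequency filtering plus feature-based classification) is more elaborate, so the following is only a simplified stand-in on a toy mask:

```python
import numpy as np
from collections import deque

def count_blobs(mask: np.ndarray) -> int:
    """Count 4-connected foreground components in a binary ear mask
    using breadth-first flood fill."""
    mask = mask.astype(bool)
    seen = np.zeros_like(mask)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1                      # new, unvisited component
                q = deque([(i, j)])
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
    return count

mask = np.array([[1, 1, 0, 0],
                 [0, 0, 0, 1],
                 [1, 0, 0, 1]])
n = count_blobs(mask)   # 3 separate components
```

In practice a library routine such as `scipy.ndimage.label` would replace this loop; the sketch just makes the counting step explicit.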
Affiliation(s)
- Jose A Fernandez-Gallego
  - Plant Physiology Section, Department of Evolutionary Biology, Ecology and Environmental Sciences, Faculty of Biology, University of Barcelona, Diagonal 643, Barcelona, 08028, Spain
  - AGROTECNIO (Center for Research in Agrotechnology), Av. Rovira Roure 191, Lleida, 25198, Spain
  - Programa de Ingeniería Electrónica, Facultad de Ingeniería, Universidad de Ibagué, Carrera 22 Calle 67, Ibagué, 730001, Colombia
- Peter Lootens
  - Plant Sciences Unit, Flanders Research Institute for Agriculture, Fisheries and Food (ILVO), Caritasstraat 39, Melle, 9090, Belgium
- Irene Borra-Serrano
  - Plant Sciences Unit, Flanders Research Institute for Agriculture, Fisheries and Food (ILVO), Caritasstraat 39, Melle, 9090, Belgium
  - Division of Forest, Nature and Landscape, KU Leuven, Celestijnenlaan 200E, Leuven, 3001, Belgium
- Veerle Derycke
  - Department Plants and Crops, Faculty of Bioscience Engineering, Ghent University, Valentin Vaerwyckweg 1, Ghent, 9000, Belgium
- Geert Haesaert
  - Department Plants and Crops, Faculty of Bioscience Engineering, Ghent University, Valentin Vaerwyckweg 1, Ghent, 9000, Belgium
- Isabel Roldán-Ruiz
  - Plant Sciences Unit, Flanders Research Institute for Agriculture, Fisheries and Food (ILVO), Caritasstraat 39, Melle, 9090, Belgium
  - Department of Plant Biotechnology and Bioinformatics, Ghent University, Technologiepark 71, Ghent, 9052, Belgium
- Jose L Araus
  - Plant Physiology Section, Department of Evolutionary Biology, Ecology and Environmental Sciences, Faculty of Biology, University of Barcelona, Diagonal 643, Barcelona, 08028, Spain
  - AGROTECNIO (Center for Research in Agrotechnology), Av. Rovira Roure 191, Lleida, 25198, Spain
- Shawn C Kefauver
  - Plant Physiology Section, Department of Evolutionary Biology, Ecology and Environmental Sciences, Faculty of Biology, University of Barcelona, Diagonal 643, Barcelona, 08028, Spain
  - AGROTECNIO (Center for Research in Agrotechnology), Av. Rovira Roure 191, Lleida, 25198, Spain
13
Özkan K, Işık Ş, Yavuz BT. Identification of wheat kernels by fusion of RGB, SWIR, and VNIR samples. J Sci Food Agric 2019; 99:4977-4984. [PMID: 30977132] [DOI: 10.1002/jsfa.9732]
Abstract
BACKGROUND: The sustainable management of agricultural resources requires the integration of cutting-edge science with the observation and identification of crops, helping experts make correct decisions. The aim of this study is to assess the robustness of a commonly used deep learning tool, VGG16, in improving the categorization of wheat kernels. Two fusion methodologies were considered simultaneously. We performed experiments on visible light (RGB), short-wave infrared (SWIR), and visible-near infrared (VNIR) datasets comprising 40 classes, with 200 samples in each class, giving 8000 samples in total. RESULTS: After running simulations with 6400 training and 1600 testing samples, we achieved excellent performance scores, with accuracy rates of 98.19% and 100%, respectively. CONCLUSION: The wheat identification system developed here serves as an effective identification framework and supports the view that deep learning tools can adequately discriminate between different types of wheat kernels. The proposed automated system would be useful for improving economic growth and reducing labor requirements, leading to greater efficiency and higher productivity in the wheat industry. © 2019 Society of Chemical Industry.
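The abstract does not spell out the fusion step, but a common way to combine classifiers trained on RGB, SWIR, and VNIR samples is score-level (late) fusion: average the per-modality class probabilities and take the argmax. A minimal sketch under that assumption (the function name and array shapes are illustrative, not the paper's API):

```python
import numpy as np

def late_fusion_predict(prob_maps):
    """Score-level fusion of per-modality classifier outputs.

    prob_maps: list of (n_samples, n_classes) probability arrays, one per
    modality (e.g. RGB, SWIR, VNIR). Probabilities are averaged across
    modalities and the highest-scoring class is returned per sample.
    """
    fused = np.mean(np.stack(prob_maps), axis=0)
    return np.argmax(fused, axis=1)
```

Feature-level fusion (concatenating deep features from each modality before a single classifier) is the other common methodology; the paper evaluates two fusion strategies but does not detail them in the abstract.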
Affiliation(s)
- Kemal Özkan
  - Department of Computer Engineering, Eskişehir Osmangazi University, Eskişehir, Turkey
- Şahin Işık
  - Department of Computer Engineering, Eskişehir Osmangazi University, Eskişehir, Turkey
- Büşra Topsakal Yavuz
  - Department of Computer Engineering, Eskişehir Osmangazi University, Eskişehir, Turkey
14
Marzougui A, Ma Y, Zhang C, McGee RJ, Coyne CJ, Main D, Sankaran S. Advanced Imaging for Quantitative Evaluation of Aphanomyces Root Rot Resistance in Lentil. Front Plant Sci 2019; 10:383. [PMID: 31057562] [PMCID: PMC6477098] [DOI: 10.3389/fpls.2019.00383]
Abstract
Aphanomyces root rot (ARR) is a soil-borne disease that causes severe yield losses in lentil. The development of resistant cultivars is a key strategy for controlling this pathogen. However, the evaluation of disease severity is limited to visual scores, which can be subjective. This study used image-based phenotyping approaches to evaluate Aphanomyces euteiches resistance in lentil genotypes under greenhouse conditions (351 genotypes from a lentil single plant (LSP)-derived collection and 191 genotypes from recombinant inbred lines (RILs), using digital red-green-blue (RGB) and hyperspectral imaging) and field conditions (173 RIL genotypes, using unmanned aerial system-based multispectral imaging). Moderate to strong correlations were observed between RGB-, multispectral-, and hyperspectral-derived features extracted from lentil shoots/roots and visual scores. In general, root features extracted from RGB imaging were strongly associated with disease severity. With only three root traits, an elastic net regression model was able to predict disease severity across and within multiple datasets (R² = 0.45-0.73; RMSE = 0.66-1.00), so the selected features could serve as proxies for visual disease scores. Moreover, we developed twelve normalized difference spectral indices (NDSIs) that were significantly correlated with disease scores: two NDSIs for the lentil shoot section, computed from wavelengths of 1170, 1160, 1270, and 1280 nm (0.12 ≤ |r| ≤ 0.24, P < 0.05), and ten NDSIs for the lentil root sections, computed from wavelengths in the ranges of 630-670, 700-840, and 1320-1530 nm (0.10 ≤ |r| ≤ 0.50, P < 0.05). Root-derived NDSIs were more accurate in predicting disease scores, with an R² of 0.54 (RMSE = 0.86), especially when the model was trained and tested on LSP accessions, compared to an R² of 0.25 (RMSE = 1.64) when LSP and RIL genotypes were used as the training and test datasets, respectively. Importantly, NDSIs computed from wavelengths of 700, 710, 730, and 790 nm had strong positive correlations with disease scores (0.35 ≤ r ≤ 0.50, P < 0.0001), which was confirmed in field phenotyping with similar correlations using a vegetation index with a red-edge wavelength (normalized difference red edge; 0.36 ≤ |r| ≤ 0.57, P < 0.0001). The adopted image-based phenotyping approaches can help plant breeders objectively quantify ARR resistance and reduce subjectivity in selecting potential genotypes.
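A normalized difference spectral index (NDSI) between two band reflectances R1 and R2 is the standard ratio NDSI = (R1 − R2)/(R1 + R2), bounded in [−1, 1]; the normalized difference red edge (NDRE) used in the field validation above is the same ratio applied to near-infrared and red-edge bands. A minimal sketch (the specific band pairing in `ndre` is an illustrative assumption):

```python
def ndsi(r1, r2):
    """Normalized difference spectral index between two band reflectances.

    NDSI = (R1 - R2) / (R1 + R2), bounded in [-1, 1].
    """
    return (r1 - r2) / (r1 + r2)

def ndre(nir, red_edge):
    """Normalized difference red edge: the same ratio applied to a
    near-infrared band and a red-edge band (e.g. ~790 nm and ~720 nm;
    the exact wavelengths depend on the sensor)."""
    return ndsi(nir, red_edge)
```

In the study, candidate NDSIs were formed from many wavelength pairs and those most correlated with visual disease scores were retained.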
Affiliation(s)
- Afef Marzougui
  - Department of Biological Systems Engineering, Washington State University, Pullman, WA, United States
- Yu Ma
  - Department of Horticulture, Washington State University, Pullman, WA, United States
- Chongyuan Zhang
  - Department of Biological Systems Engineering, Washington State University, Pullman, WA, United States
- Rebecca J. McGee
  - United States Department of Agriculture-Agricultural Research Service, Grain Legume Genetics and Physiology Research Unit, Washington State University, Pullman, WA, United States
- Clarice J. Coyne
  - United States Department of Agriculture-Agricultural Research Service, Plant Germplasm Introduction and Testing Unit, Washington State University, Pullman, WA, United States
- Dorrie Main
  - Department of Horticulture, Washington State University, Pullman, WA, United States
- Sindhuja Sankaran
  - Department of Biological Systems Engineering, Washington State University, Pullman, WA, United States
15
Carlson ML, McClatchy DM, Gunn JR, Elliott JT, Paulsen KD, Kanick SC, Pogue BW. Wide-field color imaging of scatter-based tissue contrast using both high spatial frequency illumination and cross-polarization gating. J Biophotonics 2018; 11:e201700104. [PMID: 28800205] [DOI: 10.1002/jbio.201700104]
Abstract
This study characterizes the scatter-specific tissue contrast that can be obtained by high spatial frequency (HSF) domain imaging and cross-polarization (CP) imaging using a standard color imaging system, and how combining them may be beneficial. Both HSF and CP approaches are known to modulate the sensitivity of epi-illumination reflectance images between diffuse multiply scattered and superficially backscattered photons, providing greater contrast from microstructure and composition than is achieved by standard wide-field imaging. Measurements in tissue-simulating optical phantoms show that CP imaging returns localized assessments of both scattering and absorption effects, while HSF has uniquely specific sensitivity to scatter-only contrast, with strong suppression of visible contrast from blood. The combination of CP and HSF imaging provided expanded sensitivity to scatter compared with CP imaging alone, while rejecting the specular reflections detected by HSF imaging. Ex vivo imaging of an atlas of dissected rodent organs/tissues demonstrated the scatter-based contrast achieved with HSF, CP, and HSF-CP imaging, with the white-light spectral signal returned by each approach translated to a color image for intuitive encoding of scatter-based contrast within images of tissue. The results suggest that visible CP-HSF imaging could aid diagnostic imaging of lesions in skin or mucosal tissues and organs, where CP alone is currently the standard imaging modality.
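The abstract does not give the demodulation details, but spatial-frequency-domain techniques of this kind typically project a sinusoidal pattern at three phase offsets (0°, 120°, 240°) and recover the high-spatial-frequency (AC) amplitude per pixel with the standard three-phase formula. The sketch below assumes that approach rather than reproducing the authors' exact pipeline:

```python
import numpy as np

def demodulate_ac(i1, i2, i3):
    """Three-phase amplitude demodulation used in spatial-frequency-domain
    imaging: given images i1, i2, i3 acquired with the sinusoidal
    illumination phase shifted by 0, 2*pi/3 and 4*pi/3, recover the AC
    (modulated, high-spatial-frequency) envelope at each pixel:

        AC = (sqrt(2)/3) * sqrt((i1-i2)^2 + (i2-i3)^2 + (i3-i1)^2)

    The DC term cancels in the differences, so the result isolates the
    scatter-sensitive high-frequency component.
    """
    return (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2
    )
```

Applying this per color channel, and gating with crossed polarizers during acquisition, would combine the two contrast mechanisms the paper compares.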
Affiliation(s)
- David M McClatchy
  - Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire
- Jason R Gunn
  - Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire
- Jonathan T Elliott
  - Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire
  - Norris Cotton Cancer Center, Dartmouth Hitchcock Medical Center, Lebanon, New Hampshire
- Keith D Paulsen
  - Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire
  - Norris Cotton Cancer Center, Dartmouth Hitchcock Medical Center, Lebanon, New Hampshire
- Stephen C Kanick
  - Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire
  - Norris Cotton Cancer Center, Dartmouth Hitchcock Medical Center, Lebanon, New Hampshire
  - Profusa, Inc., South San Francisco, California
- Brian W Pogue
  - Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire
  - Norris Cotton Cancer Center, Dartmouth Hitchcock Medical Center, Lebanon, New Hampshire