1. Wu X, Deng H, Wang Q, Lei L, Gao Y, Hao G. Meta-learning shows great potential in plant disease recognition under few available samples. The Plant Journal 2023; 114:767-782. [PMID: 36883481] [DOI: 10.1111/tpj.16176]
Abstract
Plant diseases worsen the threat of food shortages as the global population grows, and disease recognition is the basis for effective prevention and control of plant diseases. Deep learning has made significant breakthroughs in the field of plant disease recognition. Compared with traditional deep learning, meta-learning can maintain more than 90% recognition accuracy even with small samples. However, there is no comprehensive review of the application of meta-learning to plant disease recognition. Here, we summarize the functions, advantages, and limitations of meta-learning research methods and their applications to plant disease recognition in few-data scenarios. Finally, we outline several research avenues for utilizing current and future meta-learning in plant science. This review may help plant science researchers obtain faster, more accurate, and more credible solutions through deep learning with fewer labeled samples.
Affiliation(s)
- Xue Wu
- National Key Laboratory of Green Pesticide, Key Laboratory of Green Pesticide and Agricultural Bioengineering, Ministry of Education, Center for Research and Development of Fine Chemicals, State Key Laboratory of Public Big Data, Guizhou University, Guiyang, 550025, Guizhou, China
- Hongyu Deng
- National Key Laboratory of Green Pesticide, Key Laboratory of Green Pesticide and Agricultural Bioengineering, Ministry of Education, Center for Research and Development of Fine Chemicals, State Key Laboratory of Public Big Data, Guizhou University, Guiyang, 550025, Guizhou, China
- Qi Wang
- National Key Laboratory of Green Pesticide, Key Laboratory of Green Pesticide and Agricultural Bioengineering, Ministry of Education, Center for Research and Development of Fine Chemicals, State Key Laboratory of Public Big Data, Guizhou University, Guiyang, 550025, Guizhou, China
- Liang Lei
- School of Physics & Optoelectronic Engineering, Guangdong University of Technology, Guangzhou, 550000, Guangdong, China
- Yangyang Gao
- National Key Laboratory of Green Pesticide, Key Laboratory of Green Pesticide and Agricultural Bioengineering, Ministry of Education, Center for Research and Development of Fine Chemicals, State Key Laboratory of Public Big Data, Guizhou University, Guiyang, 550025, Guizhou, China
- Gefei Hao
- National Key Laboratory of Green Pesticide, Key Laboratory of Green Pesticide and Agricultural Bioengineering, Ministry of Education, Center for Research and Development of Fine Chemicals, State Key Laboratory of Public Big Data, Guizhou University, Guiyang, 550025, Guizhou, China
2. Taniguchi S, Sakamoto T, Imase R, Nonoue Y, Tsunematsu H, Goto A, Matsushita K, Ohmori S, Maeda H, Takeuchi Y, Ishii T, Yonemaru JI, Ogawa D. Prediction of heading date, culm length, and biomass from canopy-height-related parameters derived from time-series UAV observations of rice. Frontiers in Plant Science 2022; 13:998803. [PMID: 36582650] [PMCID: PMC9792801] [DOI: 10.3389/fpls.2022.998803]
Abstract
Unmanned aerial vehicles (UAVs) are powerful tools for monitoring crops in high-throughput phenotyping. Time-series aerial photography of fields can record the whole process of crop growth. Canopy height (CH), a measure of vertical plant growth, has been used as an indicator for evaluating lodging tolerance and predicting biomass and yield. However, there have been few attempts to use UAV-derived time-series CH data for field testing of crop lines. Here we provide a novel framework for trait prediction using CH data in rice. We generated UAV-based digital surface models of crops to extract CH data for 30 Japanese rice cultivars in 2019, 2020, and 2021. CH-related parameters were calculated with a non-linear time-series model as an S-shaped plant growth curve. The maximum saturation CH value was the most important predictor of culm length. The time point of maximum CH contributed to the prediction of days to heading, and also predicted stem and leaf weight and aboveground weight, possibly reflecting the association of biomass with the duration of vegetative growth. These results indicate that CH-related parameters acquired by UAV can be useful as predictors of traits typically measured by hand.
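The S-shaped growth-curve step described in this abstract can be sketched as a logistic fit to a CH time series: the saturation height relates to culm length and the inflection time to heading date. This is an illustrative reconstruction, not the authors' code; the logistic form `CH(t) = K / (1 + exp(-r*(t - t0)))`, the parameter names, and the coarse grid-search fit are all assumptions.

```python
import math

def logistic(t, K, r, t0):
    """Logistic growth curve: height saturates at K, inflects at time t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def fit_logistic(times, heights):
    """Coarse grid search for (K, r, t0) minimising squared error.
    Illustrative only; a real pipeline would use a proper optimiser."""
    best, best_err = None, float("inf")
    hmax = max(heights)
    for K in [hmax * s / 100.0 for s in range(90, 131, 2)]:      # 90%-130% of max
        for r in [x / 100.0 for x in range(2, 51, 2)]:           # growth rates
            for t0 in range(int(min(times)), int(max(times)) + 1):
                err = sum((logistic(t, K, r, t0) - h) ** 2
                          for t, h in zip(times, heights))
                if err < best_err:
                    best, best_err = (K, r, t0), err
    return best
```

The fitted `K` would then stand in for the "maximum saturation CH value" and `t0` for the "time point of maximum CH" used as trait predictors.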
Affiliation(s)
- Shoji Taniguchi
- Research Center for Agricultural Information Technology, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Toshihiro Sakamoto
- Institute for Agro-Environmental Sciences, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Ryoji Imase
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Yasunori Nonoue
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Hiroshi Tsunematsu
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Akitoshi Goto
- Research Center for Agricultural Information Technology, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Kei Matsushita
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Sinnosuke Ohmori
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Hideo Maeda
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Yoshinobu Takeuchi
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Takuro Ishii
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Jun-ichi Yonemaru
- Research Center for Agricultural Information Technology, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
- Daisuke Ogawa
- Institute of Crop Science, National Agricultural and Food Research Organization (NARO), Tsukuba, Japan
3. Monitoring Light Pollution with an Unmanned Aerial Vehicle: A Case Study Comparing RGB Images and Night Ground Brightness. Remote Sensing 2022. [DOI: 10.3390/rs14092052]
Abstract
There are several tools and methods to quantify light pollution due to direct or reflected light emitted towards the sky. Unmanned aerial vehicles (UAVs) are still rarely used in light pollution studies. In this study, a digital camera and a sky quality meter mounted on a UAV were used to study the relationship between indices computed on night images and night ground brightness (NGB) measured by an optical device pointed downward towards the ground. Both measurements were taken simultaneously during flights at altitudes of 70 and 100 m, with varying exposure time. NGB correlated significantly both with the brightness index (−0.49 to −0.56) and with red (−0.52 to −0.58) and green band indices (−0.42 to −0.58). A linear regression model based on the luminous intensity index was able to estimate observed NGB with an RMSE between 0.21 and 0.46 mpsas. Multispectral analysis applied to images taken at 70 m showed that increasing exposure time may saturate the colors of the image, especially in the red band, which worsens the correlation between image indices and NGB. Our study suggests that the combined use of low-cost devices such as a UAV and a sky quality meter can help identify hotspot areas of light pollution originating from the surface.
4.
Abstract
The determination of bunch features that are relevant for bunch weight estimation is an important step in automatic vineyard yield estimation using image analysis. The conversion of 2D image features into mass can be highly dependent on grapevine cultivar, as bunch morphology varies greatly. This paper aims to explore the relationships between bunch weight and bunch features obtained from image analysis using a multicultivar approach. A set of 192 bunches from four cultivars, collected at sites located in Portugal and South Africa, was imaged using a conventional digital RGB camera, followed by image analysis, where several bunch features were extracted, along with physical measurements performed in laboratory conditions. Image features were explored as predictors of bunch weight, individually and in a multiple stepwise regression analysis, and the resulting models were tested on 37% of the data. The results show that the variables bunch area and visible berry number are good predictors of bunch weight (R2 ranging from 0.72 to 0.90); however, the simple regression lines fitted between these predictors and the response variable had significantly different slopes among cultivars, indicating cultivar dependency. The selected multiple regression model used a combination of four variables: bunch area, bunch perimeter, visible berry number, and average berry area. The regression analysis between actual and estimated bunch weight yielded an R2 = 0.91 on the test set. Our results are an important step towards automatic yield estimation in the vineyard, as they increase the possibility of applying image-based approaches using a generalized model, independent of cultivar.
5. Citizen Science for Marine Litter Detection and Classification on Unmanned Aerial Vehicle Images. Water 2021. [DOI: 10.3390/w13233349]
Abstract
Unmanned aerial vehicles (UAVs, also known as drones) are being used for mapping macro-litter in the environment. As drone images require manual processing to detect marine litter, it is of interest to evaluate the accuracy of non-expert citizen science operators (CSO) in performing this task. Students from Italian secondary schools (the CSO in this work) were invited to identify, mark, and classify stranded litter items on a UAV orthophoto collected on an Italian beach. A specific training program and working tools were developed for this aim. Comparison with a standard in situ visual census survey showed a general underestimation (50%) of items. However, marine litter bulk categorisation agreed fairly well with the in situ survey, especially for source classification. The concordance level among CSO ranged between 60% and 91%, depending on the item properties considered (type, material, and colour). As the assessment accuracy was in line with previous work by experts, remote detection of marine litter on UAV images can be improved through citizen science programs, given an appropriate training plan and specific tools.
6. A Hybrid Vegetation Detection Framework: Integrating Vegetation Indices and Convolutional Neural Network. Symmetry 2021. [DOI: 10.3390/sym13112190]
Abstract
Vegetation inspection and monitoring is a time-consuming task. In the era of Industrial Revolution 4.0 (IR 4.0), unmanned aerial vehicles (UAVs), commercially known as drones, are in demand and being adopted for vegetation inspection and monitoring activities. However, most off-the-shelf drones are poorly suited for on-site vegetation inspection because their cameras offer limited spectral bands, restricting advanced vegetation analysis: they are normally equipped with a standard red, green, and blue (RGB) camera. Additional spectral bands produce more accurate analysis during vegetation inspection, but require more advanced camera hardware, such as a multispectral camera. Vegetation indices (VIs) are a technique to maximize detection sensitivity to vegetation characteristics while minimizing other, non-vegetation factors. The emergence of machine learning has gradually influenced existing vegetation analysis techniques, improving detection accuracy. This study focuses on exploring VI techniques for identifying vegetation objects. The VIs investigated are the Visible Atmospherically Resistant Index (VARI), the Green Leaf Index (GLI), and Vegetation Index green (VIgreen). The chosen machine learning technique is You Only Look Once (YOLO), a convolutional neural network (CNN) that offers object detection in real time; the CNN model has a symmetrical structure along the direction of the tensor flow. Several series of data collection were conducted at identified locations to obtain aerial images, and the proposed hybrid methods were tested on the captured images to observe vegetation detection performance. In image analysis, segmentation is the process of partitioning the targeted pixels for further detection testing. Based on our findings, more than 70% of the vegetation objects in the images were accurately detected, which reduces the misdetection issue faced by previous VI techniques. The hybrid segmentation methods perform best with the combination of VARI and YOLO, at 84% detection accuracy.
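For reference, the three visible-band indices named in this abstract have standard forms in the remote sensing literature; a minimal sketch follows. The formulas are the commonly published definitions, not taken from this paper's text, and the per-pixel thresholding helper and its threshold value are illustrative assumptions.

```python
def vari(r, g, b):
    """Visible Atmospherically Resistant Index: (G - R) / (G + R - B)."""
    denom = g + r - b
    return (g - r) / denom if denom != 0 else 0.0

def gli(r, g, b):
    """Green Leaf Index: (2G - R - B) / (2G + R + B)."""
    denom = 2 * g + r + b
    return (2 * g - r - b) / denom if denom != 0 else 0.0

def vigreen(r, g, b):
    """Vegetation Index green: (G - R) / (G + R)."""
    denom = g + r
    return (g - r) / denom if denom != 0 else 0.0

def is_vegetation(pixel, threshold=0.0):
    """Hypothetical per-pixel mask: green-dominated pixels score above 0."""
    r, g, b = pixel
    return vari(r, g, b) > threshold
```

All three indices rise when the green band dominates red, which is why they can separate vegetation pixels in plain RGB imagery without extra spectral bands.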
7. Carvalho LC, Gonçalves EF, Marques da Silva J, Costa JM. Potential Phenotyping Methodologies to Assess Inter- and Intravarietal Variability and to Select Grapevine Genotypes Tolerant to Abiotic Stress. Frontiers in Plant Science 2021; 12:718202. [PMID: 34764964] [PMCID: PMC8575754] [DOI: 10.3389/fpls.2021.718202]
Abstract
Plant phenotyping is an emerging science that combines multiple methodologies and protocols to measure plant traits (e.g., growth, morphology, architecture, function, and composition) at multiple scales of organization. Manual phenotyping remains a major bottleneck to the advance of plant and crop breeding. This constraint fostered the development of high-throughput plant phenotyping (HTPP), which is largely based on imaging approaches and automated data retrieval and processing. Field phenotyping still poses major challenges, and progress in HTPP under field conditions is relevant to support selection and breeding of grapevine. The aim of this review is to discuss potential and current methods to improve field phenotyping of grapevine to support characterization of inter- and intravarietal diversity. Vitis vinifera has a large genetic diversity that needs characterization, and the availability of methods to support selection of plant material (polyclonal or clonal) able to withstand abiotic stress is paramount. Besides being time-consuming, complex, and expensive, field experiments are also affected by heterogeneous and uncontrolled climate and soil conditions, mostly due to the large areas of the trials and the high number of traits to be observed in hundreds to thousands of individuals. Therefore, adequate field experimental design and data gathering methodologies are crucial to obtain reliable data. Some of the major challenges posed to grapevine selection programs for tolerance to water and heat stress are described herein. Useful traits for selection and related field phenotyping methodologies are described, and their adequacy for large-scale screening is discussed.
Affiliation(s)
- Luísa C. Carvalho
- LEAF – Linking Landscape, Environment, Agriculture and Food – Research Center, Associated Laboratory TERRA, Instituto Superior de Agronomia, Universidade de Lisboa, Lisboa, Portugal
- Elsa F. Gonçalves
- LEAF – Linking Landscape, Environment, Agriculture and Food – Research Center, Associated Laboratory TERRA, Instituto Superior de Agronomia, Universidade de Lisboa, Lisboa, Portugal
- Jorge Marques da Silva
- BioISI – Biosystems and Integrative Sciences Institute, Faculty of Sciences, Universidade de Lisboa, Lisboa, Portugal
- J. Miguel Costa
- LEAF – Linking Landscape, Environment, Agriculture and Food – Research Center, Associated Laboratory TERRA, Instituto Superior de Agronomia, Universidade de Lisboa, Lisboa, Portugal
8. Torres-Sánchez J, Mesas-Carrascosa FJ, Santesteban LG, Jiménez-Brenes FM, Oneka O, Villa-Llop A, Loidi M, López-Granados F. Grape Cluster Detection Using UAV Photogrammetric Point Clouds as a Low-Cost Tool for Yield Forecasting in Vineyards. Sensors 2021; 21:3083. [PMID: 33925169] [PMCID: PMC8125571] [DOI: 10.3390/s21093083]
Abstract
Yield prediction is crucial for harvest management and the scheduling of wine production operations. Traditional yield prediction methods rely on manual sampling and are time-consuming, making it difficult to handle the intrinsic spatial variability of vineyards. There have been significant advances in automatic yield estimation in vineyards from on-ground imagery, but terrestrial platforms have limitations, since they can cause soil compaction and have problems on sloping and ploughed land. The analysis of photogrammetric point clouds generated from unmanned aerial vehicle (UAV) imagery has shown its potential in the characterization of woody crops, and point color analysis has been used for the detection of flowers in almond trees. For these reasons, the main objective of this work was to develop an unsupervised and automated workflow for the detection of grape clusters in red grapevine varieties using UAV photogrammetric point clouds and color indices. As leaf occlusion is recognized as a major challenge in fruit detection, the influence of partial leaf removal on the accuracy of the workflow was assessed. UAV flights were performed over two commercial vineyards with different grape varieties in 2019 and 2020, and the photogrammetric point clouds generated from these flights were analyzed using an automatic and unsupervised algorithm developed with free software. The proposed methodology achieved R2 values higher than 0.75 between harvest weight and the projected area of the points classified as grapes under the partial two-sided leaf removal treatment, and an R2 of 0.82 was achieved in one of the datasets for vines with an untouched full canopy. The accuracy achieved in grape detection opens the door to yield prediction in red grape vineyards, which would allow the creation of yield estimation maps easing the implementation of precision viticulture practices. To the authors' knowledge, this is the first time that UAV photogrammetric point clouds have been used for grape cluster detection.
Affiliation(s)
- Jorge Torres-Sánchez
- Grupo Imaping, Instituto de Agricultura Sostenible-CSIC, 14004 Córdoba, Spain; (F.M.J.-B.); (F.L.-G.)
- Correspondence:
- Luis-Gonzaga Santesteban
- Departamento de Agronomía, Biotecnología y Alimentación, Universidad Pública de Navarra, 31006 Pamplona, Spain; (L.-G.S.); (O.O.); (A.V.-L.); (M.L.)
- Oihane Oneka
- Departamento de Agronomía, Biotecnología y Alimentación, Universidad Pública de Navarra, 31006 Pamplona, Spain; (L.-G.S.); (O.O.); (A.V.-L.); (M.L.)
- Ana Villa-Llop
- Departamento de Agronomía, Biotecnología y Alimentación, Universidad Pública de Navarra, 31006 Pamplona, Spain; (L.-G.S.); (O.O.); (A.V.-L.); (M.L.)
- Maite Loidi
- Departamento de Agronomía, Biotecnología y Alimentación, Universidad Pública de Navarra, 31006 Pamplona, Spain; (L.-G.S.); (O.O.); (A.V.-L.); (M.L.)
- Francisca López-Granados
- Grupo Imaping, Instituto de Agricultura Sostenible-CSIC, 14004 Córdoba, Spain; (F.M.J.-B.); (F.L.-G.)
9. Estimation of Apple Flowering Frost Loss for Fruit Yield Based on Gridded Meteorological and Remote Sensing Data in Luochuan, Shaanxi Province, China. Remote Sensing 2021. [DOI: 10.3390/rs13091630]
Abstract
With the increase in the frequency of extreme weather events in recent years, apple-growing areas on the Loess Plateau frequently encounter frost during flowering. Accurately assessing frost loss in orchards during the flowering period is of great significance for optimizing disaster prevention measures, apple market price regulation, agricultural insurance, and government subsidy programs. Previous research on orchard frost disasters has mainly focused on early risk warning. Therefore, to effectively quantify orchard frost loss, this paper proposes a frost loss assessment model constructed from meteorological and remote sensing information and applies it to the regional-scale assessment of orchard fruit loss after frost. As an example, this article examines a frost event that occurred during the apple flowering period in Luochuan County, Northwestern China, on 17 April 2020. A multivariable linear regression (MLR) model was constructed based on orchard planting years, the number of flowering days, and the chill accumulation before frost, as well as the minimum temperature and daily temperature range on the day of frost. The model's simulation accuracy was verified using the leave-one-out cross-validation (LOOCV) method; the coefficient of determination (R2), root mean square error (RMSE), and normalized root mean square error (NRMSE) were 0.69, 18.76%, and 18.76%, respectively. Additionally, the extended Fourier amplitude sensitivity test (EFAST) method was used for sensitivity analysis of the model parameters. The results show that the simulated reduction ratio of apple orchard fruit number is highly sensitive to the minimum temperature on the day of frost, and to the chill accumulation and planting years before the frost, with sensitivity values of ≥0.74, ≥0.25, and ≥0.15, respectively. This research can assist governments in optimizing traditional orchard frost prevention measures and market price regulation, and can also provide a reference for agricultural insurance companies formulating post-frost compensation plans.
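The LOOCV procedure used in this abstract is generic: refit the model with one sample held out, predict that sample, and pool the errors. A minimal sketch follows; the study's actual model is multivariable, so this single-predictor simplification and the function names are illustrative assumptions, not the authors' implementation.

```python
def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def loocv_rmse(xs, ys):
    """Leave-one-out CV: for each sample i, fit on the rest,
    predict sample i, then return the pooled RMSE."""
    sq_errs = []
    for i in range(len(xs)):
        a, b = ols_fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        sq_errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return (sum(sq_errs) / len(sq_errs)) ** 0.5
```

With n samples, LOOCV fits the model n times, which is affordable for a regression model of this size and gives a nearly unbiased estimate of out-of-sample error.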
10. Matese A, Di Gennaro SF. Beyond the traditional NDVI index as a key factor to mainstream the use of UAV in precision viticulture. Scientific Reports 2021; 11:2721. [PMID: 33526834] [PMCID: PMC7851140] [DOI: 10.1038/s41598-021-81652-3]
Abstract
In the last decade there has been exponential growth in research on correlations between vegetation indices derived from UAV imagery and productive and vegetative parameters of the vine. However, the acquisition and analysis of spectral data require costs and skills that are often not readily available. In this context, it becomes extremely interesting to identify geometric indices that allow the monitoring of spatial variability with low-cost instruments and without spectral-analysis know-how, based instead on photogrammetry with high-resolution RGB cameras. The aim of this work was to evaluate the potential of new canopy-geometry-based indices for characterizing vegetative and productive agronomic parameters, compared with traditional NDVI based on the spectral response of the canopy top. Furthermore, considering grape production as a key parameter directly linked to farmers' economic profit, this study provides a deeper analysis focused on developing a rapid yield forecast methodology based on UAV data, evaluating both traditional linear and machine learning regressions. Among the yield assessment models, one of the best results was obtained with canopy thickness, which showed high performance with Gaussian process regression models (R2 = 0.80), while the average yield prediction accuracy of the best ML models reached 85.95%. The final results confirm the feasibility of this approach as a global yield model, which provided good performance through an accurate validation step realized in different years and different vineyards.
Affiliation(s)
- Alessandro Matese
- Institute of BioEconomy, National Research Council (CNR-IBE), Via G. Caproni, 8, 50145, Florence, Italy.
11. Ogawa D, Sakamoto T, Tsunematsu H, Kanno N, Nonoue Y, Yonemaru JI. Remote-Sensing-Combined Haplotype Analysis Using Multi-Parental Advanced Generation Inter-Cross Lines Reveals Phenology QTLs for Canopy Height in Rice. Frontiers in Plant Science 2021; 12:715184. [PMID: 34721450] [PMCID: PMC8553969] [DOI: 10.3389/fpls.2021.715184]
Abstract
High-throughput phenotyping systems with unmanned aerial vehicles (UAVs) enable observation of crop lines in the field. In this study, we show the ability of time-course monitoring of canopy height (CH) to identify quantitative trait loci (QTLs) and to characterise their pleiotropic effects on various traits. We generated a digital surface model from low-altitude UAV-captured colour digital images and investigated CH data of rice multi-parental advanced generation inter-cross (MAGIC) lines from tillering and heading to maturation. Genome-wide association studies (GWASs) using the CH data and haplotype information of the MAGIC lines revealed 11 QTLs for CH. Each QTL showed haplotype effects on different features of CH, such as stage specificity and constancy. Haplotype analysis revealed relationships at the QTL level between CH and vegetation fraction and leaf colour [derived from UAV red-green-blue (RGB) data], and between CH and yield-related traits. Notably, haplotypes with canopy-lowering effects at qCH1-4, qCH2, and qCH10-2 increased the ratio of panicle weight to leaf and stem weight, suggesting biomass allocation to grain yield or other sinks through growth regulation of CH. Allele mining using gene information from the eight founders of the MAGIC lines revealed that qCH1-4 may contain multiple alleles of semi-dwarf 1 (sd1), the IR-8 allele of which contributed significantly to the "green revolution" in rice. Integrating remote-sensing-derived phenotyping data with genetics using the MAGIC lines gives insight into how rice plants grow, develop, and produce grain, and provides information on effective haplotypes for breeding with ideal plant architecture and grain yield.
Affiliation(s)
- Daisuke Ogawa
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
- *Correspondence: Daisuke Ogawa
- Toshihiro Sakamoto
- Institute for Agro-Environmental Sciences, National Agriculture and Food Research Organization, Tsukuba, Japan
- Hiroshi Tsunematsu
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
- Noriko Kanno
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
- Yasunori Nonoue
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
- Jun-ichi Yonemaru
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
12. GBCNet: In-Field Grape Berries Counting for Yield Estimation by Dilated CNNs. Applied Sciences 2020. [DOI: 10.3390/app10144870]
Abstract
We introduce the Grape Berries Counting Net (GBCNet), a tool for accurate fruit yield estimation from smartphone camera images, built by adapting deep learning algorithms originally developed for crowd counting. We test GBCNet using a cross-validation procedure on two original datasets, CR1 and CR2, of grape pictures taken in the field before veraison. A total of 35,668 berries were manually annotated for the task. GBCNet achieves good performance both on the seven-variety dataset CR1, although with accuracy varying by variety, and on the single-variety dataset CR2: the mean average error (MAE) ranges from 0.85% for Pinot Gris to 11.73% for Marzemino on CR1 and reaches 7.24% on the Teroldego CR2 dataset.
13. Bendel N, Kicherer A, Backhaus A, Klück HC, Seiffert U, Fischer M, Voegele RT, Töpfer R. Evaluating the suitability of hyper- and multispectral imaging to detect foliar symptoms of the grapevine trunk disease Esca in vineyards. Plant Methods 2020; 16:142. [PMID: 33101451] [PMCID: PMC7579826] [DOI: 10.1186/s13007-020-00685-3]
Abstract
BACKGROUND Grapevine trunk diseases (GTDs) such as Esca are among the most devastating threats to viticulture. Due to the lack of efficient preventive and curative treatments, Esca causes severe economic losses worldwide. Since symptoms do not develop consecutively, the true incidence of the disease in a vineyard is difficult to assess. Therefore, an annual monitoring is required. In this context, automatic detection of symptoms could be a great relief for winegrowers. Spectral sensors have proven to be successful in disease detection, allowing a non-destructive, objective, and fast data acquisition. The aim of this study is to evaluate the feasibility of the in-field detection of foliar Esca symptoms over three consecutive years using ground-based hyperspectral and airborne multispectral imaging. RESULTS Hyperspectral disease detection models have been successfully developed using either original field data or manually annotated data. In a next step, these models were applied on plant scale. While the model using annotated data performed better during development, the model using original data showed higher classification accuracies when applied in practical work. Moreover, the transferability of disease detection models to unknown data was tested. Although the visible and near-infrared (VNIR) range showed promising results, the transfer of such models is challenging. Initial results indicate that external symptoms could be detected pre-symptomatically, but this needs further evaluation. Furthermore, an application specific multispectral approach was simulated by identifying the most important wavelengths for the differentiation tasks, which was then compared to real multispectral data. Even though the ground-based multispectral disease detection was successful, airborne detection remains difficult. CONCLUSIONS In this study, ground-based hyperspectral and airborne multispectral approaches for the detection of foliar Esca symptoms are presented. 
Both sensor systems seem to be suitable for the in-field detection of the disease, even though airborne data acquisition has to be further optimized. Our disease detection approaches could facilitate monitoring plant phenotypes in a vineyard.
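The per-pixel classification idea underlying such hyperspectral disease detection can be sketched as follows. Note this is a minimal illustration with synthetic spectra: the band count, the simulated symptom signature, and the nearest-centroid classifier are all assumptions for the sketch, not the model actually used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands = 50  # hypothetical VNIR band count

# Synthetic per-pixel spectra: "diseased" pixels get an assumed
# reflectance shift in a block of red-edge bands.
healthy = rng.normal(0.5, 0.05, size=(100, n_bands))
diseased = rng.normal(0.5, 0.05, size=(100, n_bands))
diseased[:, 30:40] += 0.15  # hypothetical symptom signature

# Nearest-centroid classifier: label a pixel by the closer class-mean spectrum.
c_healthy, c_diseased = healthy.mean(axis=0), diseased.mean(axis=0)

def classify(spectrum):
    d_h = np.linalg.norm(spectrum - c_healthy)
    d_d = np.linalg.norm(spectrum - c_diseased)
    return "diseased" if d_d < d_h else "healthy"

test_pixels = np.vstack([healthy[:10], diseased[:10]])
labels = [classify(p) for p in test_pixels]
acc = (labels[:10].count("healthy") + labels[10:].count("diseased")) / 20
print(f"accuracy={acc:.2f}")
```

The sketch illustrates why transfer to unknown data is hard: the class centroids are tied to the acquisition conditions of the training spectra, so a sensor or illumination shift moves every distance.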
Collapse
Affiliation(s)
- Nele Bendel
- Institute for Grapevine Breeding, Julius Kühn-Institut, Federal Research Centre for Cultivated Plants, Geilweilerhof, 76833 Siebeldingen, Germany
- Institute of Phytomedicine, University of Hohenheim, Otto-Sander-Straße 5, 70599 Stuttgart, Germany
| | - Anna Kicherer
- Institute for Grapevine Breeding, Julius Kühn-Institut, Federal Research Centre for Cultivated Plants, Geilweilerhof, 76833 Siebeldingen, Germany
| | - Andreas Backhaus
- Biosystems Engineering, Fraunhofer Institute for Factory Operation and Automation (IFF), Sandtorstr. 22, 39106 Magdeburg, Germany
| | - Hans-Christian Klück
- Biosystems Engineering, Fraunhofer Institute for Factory Operation and Automation (IFF), Sandtorstr. 22, 39106 Magdeburg, Germany
| | - Udo Seiffert
- Biosystems Engineering, Fraunhofer Institute for Factory Operation and Automation (IFF), Sandtorstr. 22, 39106 Magdeburg, Germany
| | - Michael Fischer
- Institute for Plant Protection in Fruit Crops and Viticulture, Julius Kühn-Institut, Federal Research Centre for Cultivated Plants, Geilweilerhof, 76833 Siebeldingen, Germany
| | - Ralf T. Voegele
- Institute of Phytomedicine, University of Hohenheim, Otto-Sander-Straße 5, 70599 Stuttgart, Germany
| | - Reinhard Töpfer
- Institute for Grapevine Breeding, Julius Kühn-Institut, Federal Research Centre for Cultivated Plants, Geilweilerhof, 76833 Siebeldingen, Germany
| |
Collapse
|
14
|
Combination of an Automated 3D Field Phenotyping Workflow and Predictive Modelling for High-Throughput and Non-Invasive Phenotyping of Grape Bunches. REMOTE SENSING 2019. [DOI: 10.3390/rs11242953] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/28/2022]
Abstract
In grapevine breeding, loose grape bunch architecture is one of the most important selection traits, contributing to an increased resilience towards Botrytis bunch rot. Grape bunch architecture is mainly influenced by the berry number, berry size, the total berry volume, and bunch width and length. For an objective, precise, and high-throughput assessment of these architectural traits, the 3D imaging sensor Artec® Spider was applied to gather dense point clouds of the visible side of grape bunches directly in the field. Data acquisition in the field is much faster and non-destructive in comparison to lab applications but results in incomplete point clouds and, thus, mostly incomplete phenotypic values. Therefore, lab scans of whole bunches (360°) were used as ground truth. We observed strong correlations between field and lab data but also shifts in mean and max values, especially for the berry number and total berry volume. For this reason, the present study focuses on the training and validation of different predictive regression models using 3D data from approximately 2000 different grape bunches in order to predict incomplete bunch traits from field data. Modeling concepts included simple linear regression and machine learning-based approaches. The support vector machine was the best and most robust regression model, predicting the phenotypic traits with an R2 of 0.70–0.91. As a breeding-oriented proof-of-concept, we additionally performed a quantitative trait loci (QTL) analysis with both the modeled field data and the lab data. All types of data resulted in joint QTL regions, indicating that this innovative, fast, and non-destructive phenotyping method is also applicable for molecular marker development and grapevine breeding research.
Collapse
|
15
|
Sentinel-2 Validation for Spatial Variability Assessment in Overhead Trellis System Viticulture Versus UAV and Agronomic Data. REMOTE SENSING 2019. [DOI: 10.3390/rs11212573] [Citation(s) in RCA: 29] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
Abstract
Several remote sensing technologies have been tested in precision viticulture to characterize vineyard spatial variability, from traditional aircraft and satellite platforms to recent unmanned aerial vehicles (UAVs). Imagery processing is still a challenge due to the traditional row-based architecture, where the inter-row soil produces a high to full proportion of mixed pixels. In this case, UAV images combined with filtering techniques represent the solution to analyze pure canopy pixels and were used to benchmark the effectiveness of Sentinel-2 (S2) performance in overhead training systems. At harvest time, UAV filtered and unfiltered images and ground sampling data were used to validate the correlation between the S2 normalized difference vegetation indices (NDVIs) and vegetative and productive parameters in two vineyards (V1 and V2). Regarding the UAV vs. S2 NDVI comparison, in both vineyards, satellite data showed a high correlation with both UAV unfiltered and filtered images (V1 R2 = 0.80 and V2 R2 = 0.60 mean values). Correlations between the remote sensing platform NDVIs and ground data were strong for yield and biomass in both vineyards (R2 from 0.60 to 0.95). These results demonstrate the effectiveness of the spatial resolution provided by S2 for overhead trellis system viticulture, extending precision viticulture to areas that are currently managed without the support of innovative technologies.
Collapse
|
16
|
Ogawa D, Sakamoto T, Tsunematsu H, Yamamoto T, Kanno N, Nonoue Y, Yonemaru JI. Surveillance of panicle positions by unmanned aerial vehicle to reveal morphological features of rice. PLoS One 2019; 14:e0224386. [PMID: 31671163 PMCID: PMC6822732 DOI: 10.1371/journal.pone.0224386] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2019] [Accepted: 10/13/2019] [Indexed: 02/06/2023] Open
Abstract
Rice plant architecture affects biomass and grain yield. Thus, it is important to select rice genotypes with ideal plant architecture. High-throughput phenotyping by use of an unmanned aerial vehicle (UAV) allows all lines in a field to be observed in less time than with traditional procedures. However, discrimination of plants in dense plantings is difficult, especially during the reproductive stage, because leaves and panicles overlap. Here, we developed an original method that uses a UAV to identify panicle positions for dissecting plant architecture and to distinguish rice lines by detecting red flags attached to panicle bases. The plant architecture of recombinant inbred lines derived from Japanese cultivars ‘Hokuriku 193’ and ‘Mizuhochikara’, which differ in plant architecture, was assessed using a commercial camera-UAV system. Orthomosaics were made from UAV digital images. The center of each plant was plotted on the image during the vegetative stage. The horizontal distance from the center to the red flag during the reproductive stage was used as the panicle position (PP). The red flags enabled us to recognize the positions of the panicles at a rate of 92%. The PP phenotype was related to, but not identical with, the phenotypes of the panicle base angle, leaf sheath angle, and score of spreading habit. These results indicate that PP on orthomosaics could be used as an index of plant architecture under field conditions.
Collapse
Affiliation(s)
- Daisuke Ogawa
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
- * E-mail: (DO); (TS)
| | - Toshihiro Sakamoto
- Institute for Agro-Environmental Sciences, National Agriculture and Food Research Organization, Tsukuba, Japan
- * E-mail: (DO); (TS)
| | - Hiroshi Tsunematsu
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
| | - Toshio Yamamoto
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
| | - Noriko Kanno
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
| | - Yasunori Nonoue
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
| | - Jun-ichi Yonemaru
- Institute of Crop Science, National Agricultural and Food Research Organization, Tsukuba, Japan
| |
Collapse
|