1
Taguchi K, Guo W, Burridge J, Ito A, Njehia NS, Matsuhira H, Usui Y, Hirafuji M. High-Throughput Yield Prediction of Diallele Crossed Sugar Beet in a Breeding Field Using UAV-Derived Growth Dynamics. Plant Phenomics 2024; 6:0209. PMID: 39077118; PMCID: PMC11283879; DOI: 10.34133/plantphenomics.0209.
Abstract
Data-driven techniques could enhance the decision-making capacity of breeders and farmers. We used an RGB camera on an unmanned aerial vehicle (UAV) to collect time-series data on sugar beet canopy coverage (CC) and canopy height (CH) from small-plot breeding fields comprising 20 genotypes per season over 3 seasons. A digital orthomosaic and a digital surface model were created from each flight, and CC and CH were calculated on a per-plot basis. A multiple regression model was fitted that predicts root weight (RW) (r = 0.89, 0.89, and 0.92 in the 3 seasons, respectively) and sugar content (SC) (r = 0.79, 0.83, and 0.77, respectively) from individual time-point CC and CH data. CC and CH values in late June tended to be strong predictors of RW and SC, suggesting that early season growth is critical for obtaining high RW and SC. Coefficient of parentage was not a strong factor influencing SC. Integrals of the CC and CH time series were calculated for genetic analysis purposes, since they are more stable across growing seasons. Calculations of general combining ability and specific combining ability in F1 offspring demonstrate how growth curve quantification can be used in diallel cross analysis and yield prediction. Our simple yet robust solution demonstrates how state-of-the-art remote sensing tools and basic analysis methods can be applied to small-plot breeding fields for selection purposes.
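The season-wise correlations above come from fitting root weight against single-time-point CC and CH by multiple regression. A minimal sketch of that kind of fit on synthetic plot-level data (the coefficient values, noise level, and 20-plot layout are illustrative assumptions, not the paper's data):

```python
import numpy as np

def fit_yield_model(cc, ch, rw):
    """Fit root weight (RW) as a linear function of canopy coverage (CC)
    and canopy height (CH) at one time point; return the coefficients and
    the correlation r between fitted and observed RW (the statistic
    reported per season)."""
    X = np.column_stack([np.ones_like(cc), cc, ch])   # intercept, CC, CH
    beta, *_ = np.linalg.lstsq(X, rw, rcond=None)
    pred = X @ beta
    r = np.corrcoef(pred, rw)[0, 1]
    return beta, r

# Synthetic plot-level data standing in for one season of 20 genotypes
rng = np.random.default_rng(0)
cc = rng.uniform(0.3, 0.9, 20)                  # canopy coverage fraction
ch = rng.uniform(0.2, 0.6, 20)                  # canopy height, m
rw = 40 * cc + 60 * ch + rng.normal(0, 2, 20)   # toy root weight response
beta, r = fit_yield_model(cc, ch, rw)
print(f"r = {r:.2f}")
```

With the small noise term, the recovered correlation is high, mirroring the strong per-season r values the study reports.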
Affiliation(s)
- Kazunori Taguchi
- National Agriculture and Food Research Organization, Hokkaido Agricultural Research Center, Memuro Research Station, 9-4 Shinseiminami, Memuro, Kasai, Hokkaido 082-0081, Japan
- National Agriculture and Food Research Organization, Central Region Agricultural Research Center, 3-1-3 Kannondai, Tsukuba, Ibaraki 305-8604, Japan
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Nishi-Tokyo city, Tokyo 188-0002, Japan
- James Burridge
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Nishi-Tokyo city, Tokyo 188-0002, Japan
- Atsushi Ito
- National Agriculture and Food Research Organization, Hokkaido Agricultural Research Center, Memuro Research Station, 9-4 Shinseiminami, Memuro, Kasai, Hokkaido 082-0081, Japan
- Njane Stephen Njehia
- National Agriculture and Food Research Organization, Hokkaido Agricultural Research Center, Memuro Research Station, 9-4 Shinseiminami, Memuro, Kasai, Hokkaido 082-0081, Japan
- Hiroaki Matsuhira
- National Agriculture and Food Research Organization, Hokkaido Agricultural Research Center, Memuro Research Station, 9-4 Shinseiminami, Memuro, Kasai, Hokkaido 082-0081, Japan
- Yasuhiro Usui
- National Agriculture and Food Research Organization, Hokkaido Agricultural Research Center, Memuro Research Station, 9-4 Shinseiminami, Memuro, Kasai, Hokkaido 082-0081, Japan
- National Agriculture and Food Research Organization, Central Region Agricultural Research Center, 3-1-3 Kannondai, Tsukuba, Ibaraki 305-8604, Japan
- Masayuki Hirafuji
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Nishi-Tokyo city, Tokyo 188-0002, Japan
2
Lei T, Graefe J, Mayanja IK, Earles M, Bailey BN. Simulation of Automatically Annotated Visible and Multi-/Hyperspectral Images Using the Helios 3D Plant and Radiative Transfer Modeling Framework. Plant Phenomics 2024; 6:0189. PMID: 38817960; PMCID: PMC11136674; DOI: 10.34133/plantphenomics.0189.
Abstract
Deep learning and multimodal remote and proximal sensing are widely used for analyzing plant and crop traits, but many of these deep learning models are supervised and necessitate reference datasets with image annotations. Acquiring these datasets often demands experiments that are both labor-intensive and time-consuming. Furthermore, extracting traits from remote sensing data beyond simple geometric features remains a challenge. To address these challenges, we proposed a radiative transfer modeling framework based on the Helios 3-dimensional (3D) plant modeling software designed for plant remote and proximal sensing image simulation. The framework has the capability to simulate RGB, multi-/hyperspectral, thermal, and depth cameras, and produce associated plant images with fully resolved reference labels such as plant physical traits, leaf chemical concentrations, and leaf physiological traits. Helios offers a simulated environment that enables generation of 3D geometric models of plants and soil with random variation, and specification or simulation of their properties and function. This approach differs from traditional computer graphics rendering by explicitly modeling radiation transfer physics, which provides a critical link to underlying plant biophysical processes. Results indicate that the framework is capable of generating high-quality, labeled synthetic plant images under given lighting scenarios, which can lessen or remove the need for manually collected and annotated data. Two example applications are presented that demonstrate the feasibility of using the model to enable unsupervised learning by training deep learning models exclusively with simulated images and performing prediction tasks using real images.
Affiliation(s)
- Tong Lei
- Department of Plant Sciences, University of California, Davis, CA, USA
- Jan Graefe
- Leibniz Institute of Vegetable and Ornamental Crops e.V. (IGZ), Großbeeren, Germany
- Ismael K. Mayanja
- Department of Biological and Agricultural Engineering, University of California, Davis, CA, USA
- Mason Earles
- Department of Biological and Agricultural Engineering, University of California, Davis, CA, USA
- Department of Viticulture and Enology, University of California, Davis, CA, USA
- Brian N. Bailey
- Department of Plant Sciences, University of California, Davis, CA, USA
3
Aguilar-Ariza A, Ishii M, Miyazaki T, Saito A, Khaing HP, Phoo HW, Kondo T, Fujiwara T, Guo W, Kamiya T. UAV-based individual Chinese cabbage weight prediction using multi-temporal data. Sci Rep 2023; 13:20122. PMID: 37978327; PMCID: PMC10656565; DOI: 10.1038/s41598-023-47431-y.
Abstract
The use of unmanned aerial vehicles (UAVs) has facilitated crop canopy monitoring, enabling yield prediction by integrating regression models. However, the application of UAV-based data to individual-level harvest weight prediction is limited by the effectiveness of obtaining individual features. In this study, we propose a method that automatically detects and extracts multitemporal individual plant features derived from UAV-based data to predict harvest weight. We acquired data from an experimental field sown with 1196 Chinese cabbage plants, using two cameras (RGB and multi-spectral) mounted on UAVs. First, we used three RGB orthomosaic images and an object detection algorithm to detect more than 95% of the individual plants. Next, we used feature selection methods and five different multi-temporal resolutions to predict individual plant weights, achieving a coefficient of determination (R2) of 0.86 and a root mean square error (RMSE) of 436 g/plant. Furthermore, we achieved predictions with an R2 greater than 0.72 and an RMSE less than 560 g/plant up to 53 days prior to harvest. These results demonstrate the feasibility of accurately predicting individual Chinese cabbage harvest weight using UAV-based data and the efficacy of utilizing multi-temporal features to predict plant weight more than one month prior to harvest.
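The R² and RMSE scores reported above follow the standard definitions; a small self-contained sketch on toy harvest weights (the numbers are illustrative, not the study's data):

```python
import numpy as np

def r2_rmse(y_true, y_pred):
    """Coefficient of determination (R^2) and root mean square error,
    the two metrics used to score individual-plant weight predictions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)           # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)    # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return r2, rmse

# Toy observed vs predicted harvest weights (g/plant)
obs = np.array([1800., 2200., 2600., 3000., 3400.])
pred = np.array([1900., 2100., 2700., 2950., 3300.])
r2, rmse = r2_rmse(obs, pred)
print(f"R2 = {r2:.3f}, RMSE = {rmse:.0f} g/plant")
```

The same two numbers computed per prediction date would reproduce the kind of "R² > 0.72 up to 53 days before harvest" comparison described above.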
Affiliation(s)
- Andrés Aguilar-Ariza
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Yayoi, Bunkyo-ku, Tokyo, 113-8657, Japan
- Masanori Ishii
- Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Midoricho, Nishitokyo-shi, Tokyo, 188-0002, Japan
- Toshio Miyazaki
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Aika Saito
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Hnin Wint Phoo
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Tomohiro Kondo
- Nippon Norin Seed Co., 6-6-5 Takinogawa, Kita-ku, Tokyo, 114-0023, Japan
- Toru Fujiwara
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Yayoi, Bunkyo-ku, Tokyo, 113-8657, Japan
- Wei Guo
- Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Midoricho, Nishitokyo-shi, Tokyo, 188-0002, Japan
- Takehiro Kamiya
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1, Yayoi, Bunkyo-ku, Tokyo, 113-8657, Japan
4
Wang H, Li T, Nishida E, Kato Y, Fukano Y, Guo W. Drone-Based Harvest Data Prediction Can Reduce On-Farm Food Loss and Improve Farmer Income. Plant Phenomics 2023; 5:0086. PMID: 37692103; PMCID: PMC10484300; DOI: 10.34133/plantphenomics.0086.
Abstract
On-farm food loss (i.e., grade-out vegetables) is a difficult challenge in sustainable agricultural systems. The simplest way to reduce the number of grade-out vegetables is to monitor and predict the size of every individual in the vegetable field and determine the optimal harvest date with the smallest grade-out number and highest profit, which is not cost-effective with conventional methods. Here, we developed a full pipeline to accurately estimate and predict every broccoli head size (n > 3,000) automatically and nondestructively using drone remote sensing and image analysis. The individual sizes were fed into a temperature-based growth model to predict the optimal harvesting date. Two years of field experiments revealed that our pipeline estimated and predicted the head size of all broccoli plants with high accuracy. We also found that a deviation of only 1 to 2 days from the optimal date can considerably increase grade-outs and reduce farmers' profits. This is an unequivocal demonstration of the utility of these approaches for economic crop optimization and minimization of food losses.
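A minimal sketch of the idea of feeding individual sizes into a temperature-based growth model and scanning for the harvest day that keeps heads inside a marketable grade window. The base temperature, logistic growth parameters, and grade limits below are assumptions for illustration, not values from the paper:

```python
import numpy as np

def growing_degree_days(daily_mean_temp, t_base=4.0):
    """Accumulated thermal time; t_base is an assumed base temperature."""
    t = np.asarray(daily_mean_temp, dtype=float)
    return np.cumsum(np.maximum(t - t_base, 0.0))

def head_diameter(gdd, d_max=200.0, k=0.01, gdd_mid=400.0):
    """Hypothetical logistic head-size-vs-thermal-time curve (mm)."""
    return d_max / (1.0 + np.exp(-k * (gdd - gdd_mid)))

def optimal_harvest_day(diam, lo=110.0, hi=170.0):
    """First day index on which the head falls inside the marketable
    grade window [lo, hi] mm; heads outside the window are grade-outs."""
    ok = np.nonzero((diam >= lo) & (diam <= hi))[0]
    return int(ok[0]) if ok.size else None

temps = np.full(120, 15.0)        # toy season: constant 15 degC daily mean
gdd = growing_degree_days(temps)
diam = head_diameter(gdd)
print(optimal_harvest_day(diam))
```

Scanning the grade window per head, rather than per field, is what lets a pipeline like this trade off grade-out counts against harvest date for every individual plant.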
Affiliation(s)
- Haozhou Wang
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- Tang Li
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- Erika Nishida
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- Yoichiro Kato
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- Yuya Fukano
- Graduate School of Horticulture, Chiba University, Chiba, Japan
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
5
Varga Z, Vörös F, Pál M, Kovács B, Jung A, Elek I. Performance and Accuracy Comparisons of Classification Methods and Perspective Solutions for UAV-Based Near-Real-Time "Out of the Lab" Data Processing. Sensors 2022; 22:8629. PMID: 36433226; PMCID: PMC9696863; DOI: 10.3390/s22228629.
Abstract
Today, integration into automated systems has become a priority in the development of remote sensing sensors carried on drones. For this purpose, the primary task is to achieve real-time data processing. Increasing sensor resolution, fast data capture, and the simultaneous use of multiple sensors is one direction of development. However, this poses challenges on the data processing side because of the increasing amount of data. Our study investigates how the running time and accuracy of commonly used image classification algorithms evolve using MicaSense Altum multispectral and thermal acquisition data with GSD = 2 cm spatial resolution. Running times were examined on two PC configurations, with 4 GB and 8 GB of DRAM, respectively, as these parameters are close to the memory of near-real-time (NRT) microcomputers and laptops that can be applied "out of the lab". During the accuracy assessment, we compared the percentage accuracy, the Kappa index, and the area ratio of correctly classified pixels. According to our results, for plant cover, the Spectral Angle Mapper (SAM) method achieved the best accuracy among the validated classification solutions, whereas the Minimum Distance (MD) method achieved the best accuracy on water surfaces. In terms of running time, the best results were obtained with an individually constructed decision tree classification. Thus, it is worth developing these two directions into real-time data processing solutions.
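The Kappa index used in the accuracy assessment is the standard Cohen's Kappa computed from a classification confusion matrix; a short sketch on a toy three-class matrix (the pixel counts are illustrative):

```python
import numpy as np

def kappa(confusion):
    """Cohen's Kappa from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                    # observed agreement
    pe = (cm.sum(0) @ cm.sum(1)) / n ** 2    # chance agreement
    return (po - pe) / (1.0 - pe)

# Toy 3-class confusion matrix (e.g. vegetation / water / soil pixels)
cm = [[50, 2, 3],
      [4, 40, 1],
      [6, 2, 42]]
print(f"kappa = {kappa(cm):.3f}")
```

Unlike raw percentage accuracy, Kappa discounts agreement that would occur by chance given the class proportions, which is why the two metrics are reported side by side.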
6
Kuroki K, Yan K, Iwata H, Shimizu KK, Tameshige T, Nasuda S, Guo W. Development of a high-throughput field phenotyping rover optimized for size-limited breeding fields as open-source hardware. Breeding Science 2022; 72:66-74. PMID: 36045888; PMCID: PMC8987849; DOI: 10.1270/jsbbs.21059.
Abstract
Phenotyping is a critical process in plant breeding, especially given the increasing demand for streamlining selection in breeding programs. Because manual phenotyping has limited efficiency, high-throughput phenotyping methods have recently been popularized owing to progress in sensor and image processing technologies. However, in a size-limited breeding field, which is common in Japan and other Asian countries, it is challenging to introduce large machinery or to fly unmanned aerial vehicles over the field. In this study, we developed a ground-based high-throughput field phenotyping rover that can be easily introduced to a field regardless of its scale and location, even without special facilities. We also released the field rover as open-source hardware, making its system publicly available for easy modification so that anyone can build one for their own use at low cost. A trial run of the field rover showed that it allowed the collection of detailed remote-sensing images of plants and quantitative analyses based on those images. The results suggest that the field rover developed in this study could enable efficient phenotyping of plants, especially in small breeding fields.
Affiliation(s)
- Ken Kuroki
- Graduate School of Agriculture, Kyoto University, Kitashirakawaoiwake-cho, Sakyo, Kyoto 606-8502, Japan
- Graduate School of Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-0033, Japan
- Kai Yan
- LabRomance Inc, 1-3-29-2F Ureshino, Fujimino, Saitama 356-0056, Japan
- Hiroyoshi Iwata
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Yayoi, Bunkyo, Tokyo 113-8657, Japan
- Kentaro K. Shimizu
- Department of Evolutionary Biology and Environmental Studies, University of Zurich, Zurich 8057, Switzerland
- Kihara Institute for Biological Research, Yokohama City University, 641-12 Maioka, Totsuka, Yokohama, Kanagawa 244-0813, Japan
- Toshiaki Tameshige
- Kihara Institute for Biological Research, Yokohama City University, 641-12 Maioka, Totsuka, Yokohama, Kanagawa 244-0813, Japan
- Department of Biology, Faculty of Science, Niigata University, 8050 Ikarashi 2-no-cho, Nishi, Niigata 950-2181, Japan
- Shuhei Nasuda
- Graduate School of Agriculture, Kyoto University, Kitashirakawaoiwake-cho, Sakyo, Kyoto 606-8502, Japan
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori, Nishitokyo, Tokyo 188-0002, Japan
7
Ninomiya S. High-throughput field crop phenotyping: current status and challenges. Breeding Science 2022; 72:3-18. PMID: 36045897; PMCID: PMC8987842; DOI: 10.1270/jsbbs.21069.
Abstract
In contrast to the rapid advances made in plant genotyping, plant phenotyping is considered a bottleneck in plant science. This has promoted high-throughput plant phenotyping (HTP) studies, resulting in an exponential increase in phenotyping-related publications. HTP was originally developed as an indoor technology for model plant species under controlled environments, but the focus has since shifted to crops in the field. Although field HTP is much more difficult to conduct than HTP in controlled environments because of unstable environmental conditions, recent advances in HTP technology have allowed these difficulties to be overcome, enabling rapid, efficient, non-destructive, non-invasive, quantitative, repeatable, and objective phenotyping. Recent HTP developments have been accelerated by advances in data analysis, sensors, and robot technologies, including machine learning, image analysis, three-dimensional (3D) reconstruction, image sensors, laser sensors, environmental sensors, and drones, along with high-speed computational resources. This article provides an overview of recent HTP technologies, focusing mainly on canopy-based phenotypes of major crops, such as canopy height, canopy coverage, canopy biomass, and canopy stressed appearance, in addition to crop organ detection and counting in the field. Current topics in field HTP are also presented, followed by a discussion of the low rates of adoption of HTP in practical breeding programs.
Affiliation(s)
- Seishi Ninomiya
- Graduate School of Agriculture and Life Sciences, The University of Tokyo, Nishitokyo, Tokyo 188-0002, Japan
- Plant Phenomics Research Center, Nanjing Agricultural University, Nanjing, China
- Corresponding author (e-mail: )
8
Wu S, Wen W, Gou W, Lu X, Zhang W, Zheng C, Xiang Z, Chen L, Guo X. A miniaturized phenotyping platform for individual plants using multi-view stereo 3D reconstruction. Frontiers in Plant Science 2022; 13:897746. PMID: 36003825; PMCID: PMC9393617; DOI: 10.3389/fpls.2022.897746.
Abstract
Plant phenotyping is essential in plant breeding and management. High-throughput data acquisition and automatic phenotype extraction are common concerns in plant phenotyping. Despite the development of phenotyping platforms and the realization of high-throughput three-dimensional (3D) data acquisition in tall plants, such as maize, handling small plants with complex structural features remains a challenge. This study developed a miniaturized shoot phenotyping platform, MVS-Pheno V2, focusing on low plant shoots. The platform is an improvement of MVS-Pheno V1 and was developed based on multi-view stereo 3D reconstruction. It has four components: hardware, wireless communication and control, a data acquisition system, and a data processing system. The hardware mounts the rotating unit at the top of the platform so that plants remain static while the sensors rotate. A novel local network was established for wireless communication and control, preventing cable twining. The data processing system was developed to calibrate point clouds and extract phenotypes, including plant height, leaf area, projected area, shoot volume, and compactness. Three cultivars of wheat shoots at four growth stages were used to test the performance of the platform. The mean absolute percentage error of point cloud calibration was 0.585%. The squared correlation coefficient R² was 0.9991, 0.9949, and 0.9693 for plant height, leaf length, and leaf width, respectively, and the corresponding root mean squared errors (RMSE) were 0.6996, 0.4531, and 0.1174 cm. The MVS-Pheno V2 platform provides an alternative solution for high-throughput phenotyping of small individual plants and is especially suitable for shoot-architecture-related plant breeding and management studies.
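The 0.585% calibration figure above is a mean absolute percentage error (MAPE) of point-cloud-derived measurements against manual reference measurements; the metric itself is straightforward (the toy heights below are illustrative, not the study's data):

```python
import numpy as np

def mape(measured, estimated):
    """Mean absolute percentage error, used to score point-cloud
    calibration against manual measurements."""
    m = np.asarray(measured, dtype=float)
    e = np.asarray(estimated, dtype=float)
    return 100.0 * np.mean(np.abs(e - m) / m)

# Manually measured vs point-cloud-derived plant heights (cm, toy data)
manual = np.array([21.0, 35.5, 48.2, 60.1])
cloud = np.array([20.9, 35.7, 48.0, 60.4])
print(f"MAPE = {mape(manual, cloud):.3f}%")
```

Because each error is normalized by the manual value, MAPE stays comparable across plants of very different sizes, which suits a platform spanning four growth stages.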
Affiliation(s)
- Sheng Wu
- Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Weiliang Wen
- Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Wenbo Gou
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Xianju Lu
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Wenqi Zhang
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Chenxi Zheng
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Zhiwei Xiang
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- Liping Chen
- Intelligent Equipment Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Correspondence: Liping Chen
- Xinyu Guo
- Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
- Beijing Key Lab of Digital Plant, National Engineering Research Center for Information Technology in Agriculture, Beijing, China
- College of Agricultural Engineering, Jiangsu University, Zhenjiang, China
9
An Efficient Method for Estimating Wheat Heading Dates Using UAV Images. Remote Sensing 2021. DOI: 10.3390/rs13163067.
Abstract
Convenient, efficient, and high-throughput estimation of wheat heading dates is of great significance in plant science and agricultural research. However, documenting heading dates in large-scale fields is time-consuming, labor-intensive, and subjective. To overcome these challenges, model- and image-based approaches are used to estimate heading dates. Phenology models usually require complicated parameter calibration, making them difficult to transfer to other varieties and locations, while in situ field-image recognition usually requires deploying a large amount of observational equipment, which is expensive. Therefore, in this study, we propose a growth-curve-based method for estimating wheat heading dates. The method first generates a height-based continuous growth curve from five time-series unmanned aerial vehicle (UAV) images captured over the entire wheat growth cycle (>200 d), and then estimates the heading date from the generated curve. The proposed method had a mean absolute error of 2.81 d and a root mean square error of 3.49 d for 72 wheat plots composed of different varieties and densities sown on different dates. Thus, the proposed method is straightforward, efficient, and affordable and meets the high-throughput estimation requirements of large-scale fields and underdeveloped areas.
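A minimal sketch of the growth-curve idea: interpolate a handful of UAV height observations into a daily curve and read off a heading-date proxy. The piecewise-linear interpolation and the 80%-of-final-height threshold below are assumptions for illustration; the paper fits its own continuous curve and calibration:

```python
import numpy as np

def heading_date_from_curve(days, heights, threshold_frac=0.8):
    """Interpolate sparse UAV canopy-height observations into a daily
    growth curve and return the day on which the canopy first reaches a
    given fraction of its final height (an assumed heading proxy)."""
    days = np.asarray(days, dtype=float)
    heights = np.asarray(heights, dtype=float)
    grid = np.arange(days[0], days[-1] + 1)     # daily time grid
    curve = np.interp(grid, days, heights)      # piecewise-linear curve
    target = threshold_frac * heights.max()
    return float(grid[np.argmax(curve >= target)])

# Five UAV flights (days after sowing) and plot-mean canopy heights (cm)
days = [60, 100, 140, 180, 220]
heights = [5.0, 20.0, 60.0, 85.0, 90.0]
print(heading_date_from_curve(days, heights))
```

The appeal of the curve-based route is that only a few flights are needed per season, yet the estimate is produced at daily resolution for every plot.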
10
David E, Serouart M, Smith D, Madec S, Velumani K, Liu S, Wang X, Pinto F, Shafiee S, Tahir ISA, Tsujimoto H, Nasuda S, Zheng B, Kirchgessner N, Aasen H, Hund A, Sadhegi-Tehran P, Nagasawa K, Ishikawa G, Dandrifosse S, Carlier A, Dumont B, Mercatoris B, Evers B, Kuroki K, Wang H, Ishii M, Badhon MA, Pozniak C, LeBauer DS, Lillemo M, Poland J, Chapman S, de Solan B, Baret F, Stavness I, Guo W. Global Wheat Head Detection 2021: An Improved Dataset for Benchmarking Wheat Head Detection Methods. Plant Phenomics 2021; 2021:9846158. PMID: 34778804; PMCID: PMC8548052; DOI: 10.34133/2021/9846158.
Abstract
The Global Wheat Head Detection (GWHD) dataset was created in 2020 and assembled 193,634 labelled wheat heads from 4,700 RGB images acquired from various acquisition platforms across 7 countries/institutions. With an associated competition hosted on Kaggle, GWHD_2020 successfully attracted attention from both the computer vision and agricultural science communities. From this first experience, a few avenues for improvement were identified regarding data size, head diversity, and label reliability. To address these issues, the 2020 dataset has been reexamined, relabeled, and complemented with 1,722 images from 5 additional countries, contributing 81,553 additional wheat heads. We now release in 2021 a new version of the Global Wheat Head Detection dataset, which is bigger, more diverse, and less noisy than GWHD_2020.
Affiliation(s)
- Etienne David
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Mario Serouart
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Daniel Smith
- School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- Simon Madec
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- Kaaviya Velumani
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Hiphen SAS, 120 Rue Jean Dausset, Agroparc, Bâtiment Technicité, 84140 Avignon, France
- Shouyang Liu
- Plant Phenomics Research Center, Nanjing Agricultural University, Nanjing, China
- Xu Wang
- Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Francisco Pinto
- Global Wheat Program, International Maize and Wheat Improvement Centre (CIMMYT), Mexico, D.F., Mexico
- Shahameh Shafiee
- Faculty of Biosciences, Norwegian University of Life Sciences, P.O. Box 5003, NO-1432 Ås, Norway
- Izzat S. A. Tahir
- Agricultural Research Corporation, Wheat Research Program, P.O. Box 126, Wad Medani, Sudan
- Hisashi Tsujimoto
- Arid Land Research Center, Tottori University, Tottori 680-0001, Japan
- Shuhei Nasuda
- Laboratories of Plant Genetics and Plant Breeding, Graduate School of Agriculture, Kyoto University, Japan
- Bangyou Zheng
- CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia, 4067 QLD, Australia
- Norbert Kirchgessner
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Helge Aasen
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Andreas Hund
- Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Koichi Nagasawa
- Institute of Crop Science, National Agriculture and Food Research Organization, Japan
- Goro Ishikawa
- Hokkaido Agricultural Research Center, National Agriculture and Food Research Organization, Japan
- Sébastien Dandrifosse
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Alexis Carlier
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
- Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoit Mercatoris
- Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Byron Evers
- Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Ken Kuroki
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Haozhou Wang
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Masanori Ishii
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Curtis Pozniak
- Department of Plant Sciences, University of Saskatchewan, Canada
- David Shaner LeBauer
- College of Agriculture and Life Sciences, University of Arizona, Tucson, Arizona, USA
- Morten Lillemo
- Faculty of Biosciences, Norwegian University of Life Sciences, P.O. Box 5003, NO-1432 Ås, Norway
- Jesse Poland
- Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Scott Chapman
- School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia, 4067 QLD, Australia
- Benoit de Solan
- Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- Frédéric Baret
- UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Ian Stavness
- Department of Computer Science, University of Saskatchewan, Canada
- Wei Guo
- Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan