53
David E, Serouart M, Smith D, Madec S, Velumani K, Liu S, Wang X, Pinto F, Shafiee S, Tahir ISA, Tsujimoto H, Nasuda S, Zheng B, Kirchgessner N, Aasen H, Hund A, Sadeghi-Tehran P, Nagasawa K, Ishikawa G, Dandrifosse S, Carlier A, Dumont B, Mercatoris B, Evers B, Kuroki K, Wang H, Ishii M, Badhon MA, Pozniak C, LeBauer DS, Lillemo M, Poland J, Chapman S, de Solan B, Baret F, Stavness I, Guo W. Global Wheat Head Detection 2021: An Improved Dataset for Benchmarking Wheat Head Detection Methods. Plant Phenomics 2021; 2021:9846158. PMID: 34778804; PMCID: PMC8548052; DOI: 10.34133/2021/9846158. Received 31 May 2021; accepted 11 Aug 2021.
Abstract
The Global Wheat Head Detection (GWHD) dataset was created in 2020 and assembled 193,634 labelled wheat heads from 4700 RGB images acquired from various acquisition platforms across 7 countries/institutions. With an associated competition hosted on Kaggle, GWHD_2020 successfully attracted attention from both the computer vision and agricultural science communities. From this first experience, a few avenues for improvement were identified regarding data size, head diversity, and label reliability. To address these issues, the 2020 dataset was reexamined and relabelled, and complemented with 1722 images from 5 additional countries, contributing 81,553 additional wheat heads. We now release in 2021 a new version of the Global Wheat Head Detection dataset, which is bigger, more diverse, and less noisy than GWHD_2020.
Affiliation(s)
- Etienne David
  - Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
  - UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Mario Serouart
  - Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
  - UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Daniel Smith
  - School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- Simon Madec
  - Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
  - School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
- Kaaviya Velumani
  - UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
  - Hiphen SAS, 120 Rue Jean Dausset, Agroparc, Bâtiment Technicité, 84140 Avignon, France
- Shouyang Liu
  - Plant Phenomics Research Center, Nanjing Agricultural University, Nanjing, China
- Xu Wang
  - Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Francisco Pinto
  - Global Wheat Program, International Maize and Wheat Improvement Centre (CIMMYT), Mexico, D.F., Mexico
- Shahameh Shafiee
  - Faculty of Biosciences, Norwegian University of Life Sciences, P.O. Box 5003, NO-1432 Ås, Norway
- Izzat S. A. Tahir
  - Agricultural Research Corporation, Wheat Research Program, P.O. Box 126, Wad Medani, Sudan
- Hisashi Tsujimoto
  - Arid Land Research Center, Tottori University, Tottori 680-0001, Japan
- Shuhei Nasuda
  - Laboratories of Plant Genetics and Plant Breeding, Graduate School of Agriculture, Kyoto University, Japan
- Bangyou Zheng
  - CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia, 4067 QLD, Australia
- Norbert Kirchgessner
  - Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Helge Aasen
  - Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Andreas Hund
  - Institute of Agricultural Sciences, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
- Koichi Nagasawa
  - Institute of Crop Science, National Agriculture and Food Research Organization, Japan
- Goro Ishikawa
  - Hokkaido Agricultural Research Center, National Agriculture and Food Research Organization, Japan
- Sébastien Dandrifosse
  - Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Alexis Carlier
  - Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benjamin Dumont
  - Plant Sciences, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Benoit Mercatoris
  - Biosystems Dynamics and Exchanges, TERRA Teaching and Research Center, Gembloux Agro-Bio Tech, University of Liège, 5030 Gembloux, Belgium
- Byron Evers
  - Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Ken Kuroki
  - Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Haozhou Wang
  - Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Masanori Ishii
  - Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
- Curtis Pozniak
  - Department of Plant Sciences, University of Saskatchewan, Canada
- David Shaner LeBauer
  - College of Agriculture and Life Sciences, University of Arizona, Tucson, Arizona, USA
- Morten Lillemo
  - Faculty of Biosciences, Norwegian University of Life Sciences, P.O. Box 5003, NO-1432 Ås, Norway
- Jesse Poland
  - Wheat Genetics Resource Center, Dep. of Plant Pathology, Kansas State Univ., 4024 Throckmorton Plant Sciences Center, Manhattan, Kansas, USA
- Scott Chapman
  - School of Food and Agricultural Sciences, The University of Queensland, Gatton, 4343 QLD, Australia
  - CSIRO Agriculture and Food, Queensland Biosciences Precinct, 306 Carmody Road, St Lucia, 4067 QLD, Australia
- Benoit de Solan
  - Arvalis, Institut du Végétal, 3 Rue Joseph et Marie Hackin, 75116 Paris, France
- Frédéric Baret
  - UMR1114 EMMAH, INRAE, Centre PACA, Bâtiment Climat, Domaine Saint-Paul, 228 Route de l'Aérodrome, CS 40509, 84914 Avignon Cedex, France
- Ian Stavness
  - Department of Computer Science, University of Saskatchewan, Canada
- Wei Guo
  - Graduate School of Agricultural and Life Sciences, The University of Tokyo, 1-1-1 Midori-cho, Nishitokyo City, Tokyo, Japan
55
Velumani K, Lopez-Lozano R, Madec S, Guo W, Gillet J, Comar A, Baret F. Estimates of Maize Plant Density from UAV RGB Images Using Faster-RCNN Detection Model: Impact of the Spatial Resolution. Plant Phenomics 2021; 2021:9824843. PMID: 34549193; PMCID: PMC8404552; DOI: 10.34133/2021/9824843. Received 16 Apr 2021; accepted 2 Jul 2021.
Abstract
Early-stage plant density is an essential trait that determines the fate of a genotype under given environmental conditions and management practices. RGB images taken from UAVs may replace traditional visual counting in fields, with improved throughput, accuracy, and access to plant localization. However, high-resolution images are required to detect the small plants present at early stages. This study explores the impact of image ground sampling distance (GSD) on the performance of maize plant detection at the three-to-five-leaf stage using the Faster-RCNN object detection algorithm. Data collected at high resolution (GSD ≈ 0.3 cm) over six contrasting sites were used for model training. Two additional sites, with images acquired at both high and low (GSD ≈ 0.6 cm) resolution, were used to evaluate model performance. Results show that Faster-RCNN achieved very good plant detection and counting performance (rRMSE = 0.08) when native high-resolution images were used for both training and validation. Similarly, good performance was observed (rRMSE = 0.11) when the model was trained on synthetic low-resolution images, obtained by downsampling the native high-resolution training images, and applied to synthetic low-resolution validation images. Conversely, poor performance was obtained when the model was trained at one spatial resolution and applied to another. Training on a mix of high- and low-resolution images yielded very good performance on the native high-resolution (rRMSE = 0.06) and synthetic low-resolution (rRMSE = 0.10) images. However, very low performance was still observed on the native low-resolution images (rRMSE = 0.48), mainly due to their poor image quality.
Finally, an advanced super-resolution method based on a GAN (generative adversarial network), which introduces additional textural information derived from the native high-resolution images, was applied to the native low-resolution validation images. Results show a significant improvement (rRMSE = 0.22) over the bicubic upsampling approach, though performance remains far below that achieved on the native high-resolution images.
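The rRMSE values quoted in this abstract compare predicted plant counts against observed counts. As a minimal sketch of the metric, assuming the common formulation (RMSE normalized by the mean observed count; the paper's exact normalization is not stated here):

```python
import math

def rrmse(observed, predicted):
    """Relative RMSE: root-mean-square error of predicted vs. observed
    counts, normalized by the mean observed count."""
    n = len(observed)
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    return rmse / (sum(observed) / n)

# e.g. per-plot maize counts: perfect predictions give rRMSE = 0
print(rrmse([10, 10], [11, 9]))  # modest count errors -> rRMSE = 0.1
```

Under this definition, an rRMSE of 0.08 corresponds to a typical counting error of about 8% of the mean plot density.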
Affiliation(s)
- K. Velumani
  - Hiphen SAS, 120 Rue Jean Dausset, Agroparc, Bâtiment Technicité, 84140 Avignon, France
  - INRAE, UMR EMMAH, UMT CAPTE, 228 Route de l'Aérodrome, Domaine Saint Paul-Site Agroparc CS 40509, 84914 Avignon Cedex 9, France
- R. Lopez-Lozano
  - INRAE, UMR EMMAH, UMT CAPTE, 228 Route de l'Aérodrome, Domaine Saint Paul-Site Agroparc CS 40509, 84914 Avignon Cedex 9, France
- S. Madec
  - Arvalis, 228, Route de l'Aérodrome-CS 40509, 84914 Avignon Cedex 9, France
- W. Guo
  - International Field Phenomics Research Laboratory, Institute for Sustainable Agro-Ecosystem Services, Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo, Japan
- J. Gillet
  - Hiphen SAS, 120 Rue Jean Dausset, Agroparc, Bâtiment Technicité, 84140 Avignon, France
- A. Comar
  - Hiphen SAS, 120 Rue Jean Dausset, Agroparc, Bâtiment Technicité, 84140 Avignon, France
- F. Baret
  - INRAE, UMR EMMAH, UMT CAPTE, 228 Route de l'Aérodrome, Domaine Saint Paul-Site Agroparc CS 40509, 84914 Avignon Cedex 9, France
57
Jiang Y, Li C, Xu R, Sun S, Robertson JS, Paterson AH. DeepFlower: a deep learning-based approach to characterize flowering patterns of cotton plants in the field. Plant Methods 2020; 16:156. PMID: 33372635; PMCID: PMC7720604; DOI: 10.1186/s13007-020-00698-y. Received 11 Nov 2019; accepted 24 Nov 2020.
Abstract
BACKGROUND: Flowering is one of the most important processes for flowering plants such as cotton, reflecting the transition from vegetative to reproductive growth, and is of central importance to crop yield and adaptability. Conventionally, categorical scoring systems have been widely used to study flowering patterns, but they are laborious and subjective to apply. The goal of this study was to develop a deep learning-based approach to characterize flowering patterns of cotton plants, which flower progressively over several weeks with flowers distributed across much of the plant.

RESULTS: A ground mobile system (GPhenoVision) was modified with a multi-view color imaging module to acquire images of a plant from four viewing angles at a time. A total of 116 plants from 23 genotypes were imaged over an approximately 2-month period at an average scanning interval of 2-3 days, yielding a dataset of 8666 images. A subset (475) of the images was randomly selected and manually annotated to form datasets for training and for selecting the best object detection model. With the best model, a deep learning-based approach (DeepFlower) was developed to detect and count individual emerging blooms on a plant on a given date. DeepFlower was used to process all images to obtain bloom counts for individual plants over the flowering period, and the resulting counts were used to derive flowering curves (and thus flowering characteristics). Regression analyses showed that DeepFlower could accurately (R2 = 0.88 and RMSE = 0.79) detect and count emerging blooms on cotton plants, and statistical analyses showed that imaging-derived flowering characteristics were as effective as manual assessment for identifying differences among genetic categories or genotypes.

CONCLUSIONS: The developed approach could thus be an effective and efficient tool to characterize flowering patterns of flowering plants (such as cotton) with complex canopy architecture.
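The step from per-date bloom counts to a flowering curve described above can be sketched as a simple cumulative accumulation; `flowering_curve` is a hypothetical helper for illustration, not code from the paper:

```python
def flowering_curve(daily_counts):
    """Turn per-scan-date counts of newly emerged blooms into a cumulative
    flowering progress curve.

    daily_counts: list of (day_index, new_bloom_count), sorted by day.
    Returns a list of (day_index, cumulative_bloom_count)."""
    total = 0
    curve = []
    for day, count in daily_counts:
        total += count
        curve.append((day, total))
    return curve

# e.g. scans every ~3 days with 2, 5, then 1 new blooms detected
print(flowering_curve([(0, 2), (3, 5), (6, 1)]))  # [(0, 2), (3, 7), (6, 8)]
```

Flowering characteristics (e.g. onset date or peak flowering rate) can then be read off this curve per plant or genotype.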
Affiliation(s)
- Yu Jiang
  - Horticulture Section, School of Integrative Plant Science, Cornell AgriTech, Cornell University, Geneva, NY, 14456, USA
  - Phenomics and Plant Robotics Center/College of Engineering, The University of Georgia, Athens, GA, 30602, USA
- Changying Li
  - Phenomics and Plant Robotics Center/College of Engineering, The University of Georgia, Athens, GA, 30602, USA
  - College of Agricultural & Environmental Sciences, The University of Georgia, Athens, GA, 30602, USA
- Rui Xu
  - Phenomics and Plant Robotics Center/College of Engineering, The University of Georgia, Athens, GA, 30602, USA
- Shangpeng Sun
  - Phenomics and Plant Robotics Center/College of Engineering, The University of Georgia, Athens, GA, 30602, USA
- Jon S Robertson
  - College of Agricultural & Environmental Sciences, The University of Georgia, Athens, GA, 30602, USA
- Andrew H Paterson
  - College of Agricultural & Environmental Sciences, The University of Georgia, Athens, GA, 30602, USA
  - Franklin College of Arts and Sciences, The University of Georgia, Athens, GA, 30602, USA